Nvidia Turing Product Reviews and Previews: (Super, Ti, 2080, 2070, 2060, 1660, etc.)

You do realize devs had only 2 weeks to work on the implementation, don't you?
Nope

https://www.pcgamesn.com/how-dice-made-nvidias-ray-tracing-dreams-a-reality

"DICE made Nvidia’s ray traced gaming dreams a reality in just eight months"


Given the scale of the challenge that getting a new, never-before-seen game technology into a title the size of Battlefield 5 represents, it’s not surprising DICE would’ve wanted more time. But in the end its PC developers have only had a little over eight months.

“We started working on very early drivers,” says Holmquist, “and not on real hardware, at the end of last year.”

And to be able to run such an intensive game engine, with such a demanding new feature, has meant the hardware requirements outside of just the GPU have had to change.

“What we have done with our DXR implementation is we go very wide on a lot of cores to offload that work, so we’re likely going to require a higher minimum or recommended spec for producing RT. And very wide is the best way for the consumer in that regard, with a four-core or six-core machine.

“We haven’t communicated any of the specs yet so they might change, but I think that a six-core machine – it doesn’t have to be aggressively clocked – but 12 hardware threads is what we kind of designed it for. But it might also work well on a higher clocked eight thread machine.”
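As a rough illustration of what "going very wide" can mean in practice (a hypothetical sketch, not DICE's actual code), the CPU-side ray tracing prep work, such as refitting bounding volumes for animated meshes, gets striped across every available hardware thread:

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

struct MeshChunk { /* animated vertices, BVH node range, ... */ };

void RefitChunk(MeshChunk&) { /* recompute bounds for this chunk's geometry */ }

void RefitAllChunks(std::vector<MeshChunk>& chunks) {
    // One worker per hardware thread, e.g. 12 on a 6-core/12-thread CPU.
    const std::size_t workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (std::size_t w = 0; w < workers; ++w)
        pool.emplace_back([&chunks, w, workers] {
            // Each worker strides through the chunk list for an even spread.
            for (std::size_t i = w; i < chunks.size(); i += workers)
                RefitChunk(chunks[i]);
        });
    for (auto& t : pool) t.join();
}

A workload like that favours core count over clock speed, which fits Holmquist's "it doesn't have to be aggressively clocked" remark.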

:runaway:

 
Seriously... just... no. You posted something wrong... let's move on.
Digital Foundry said:
Also, expect huge gains. This was less than two weeks of work by DICE. They barely had access to these cards prior to this event.
https://wccftech.com/resident-evil-2-support-nvidia-rtx/

Your article only talks about DICE reworking the engine from the ground up to support the notion of Ray Tracing:
“There is a big amount of work to express the Frostbite rendering system in the world of ray tracing, all that had to be done,” says Holmquist. “Getting the material colour resolved, getting the ability to ray trace with the terrain – which has a very special way of rendering – character skin meshes, which are very different as well in the way the rendering pipeline works. So all these things took quite a lot of effort to get it to a point where it works consistently.”

But getting access to the hardware to implement RTX happened only very recently:
“We started working on very early drivers,” says Holmquist, “and not on real hardware, at the end of last year.”

So again, only two weeks for DICE to make the demo work on the RTX 2080 Ti. I would say they have done a decent job in such a short time.
 
Reworking the engine for ray tracing and actually implementing RTX/DXR with hardware acceleration on specific GPUs are two completely different things.
The latter was done in just 3 days apparently. So with all that groundwork done by DICE and they likely got hardware very very early compared to others, what we see in BFV really should be considered close to as perfect as we can get currently. It looks fantastic and considering what they're doing with reflections, it probably performs really well in context.

Will it be playable at 4k or even 1440p at consistent 60+? Probably not. Is the hardware good enough for our first jaunt into hybrid rendering in games? Certainly looks that way and it's very exciting.

Will the cut down 2070 be enough to experience these games at a decent frame rate? That's my question and one I hope will be answered in reviews, even just by seeing how the 2080 performs.
 
I would expect frustum culling to be a huge issue with ray tracing/rasterisation hybrid engines. Unless I missed something, I haven't seen anyone talking about that?
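For what it's worth, the basic problem is that rays don't respect the view frustum: a reflection ray can point straight back at geometry the rasteriser would have culled, so the ray traced scene representation has to keep off-screen objects around. A toy sketch (hypothetical vector types) of a ray escaping the frustum:

#include <cstdio>

struct Vec3 { float x, y, z; };
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(float s, Vec3 v) { return {s * v.x, s * v.y, s * v.z}; }
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Standard reflection formula: r = d - 2(d.n)n
Vec3 reflect(Vec3 d, Vec3 n) { return d - 2.0f * dot(d, n) * n; }

int main() {
    Vec3 viewDir = {0, 0, 1};   // camera looks down +Z
    Vec3 mirrorN = {0, 0, -1};  // a mirror facing the camera
    Vec3 r = reflect(viewDir, mirrorN);
    // r = (0, 0, -1): the bounced ray heads back past the camera, into
    // space a frustum cull would have discarded entirely.
    std::printf("reflected dir: %g %g %g\n", r.x, r.y, r.z);
}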
 
The latter was done in just 3 days apparently. So with all that groundwork done by DICE and they likely got hardware very very early compared to others, what we see in BFV really should be considered close to as perfect as we can get currently. It looks fantastic and considering what they're doing with reflections, it probably performs really well in context.

Will it be playable at 4k or even 1440p at consistent 60+? Probably not. Is the hardware good enough for our first jaunt into hybrid rendering in games? Certainly looks that way and it's very exciting.

Will the cut down 2070 be enough to experience these games at a decent frame rate? That's my question and one I hope will be answered in reviews, even just by seeing how the 2080 performs.

Yup, as someone mentioned previously, this is kind of like the Geforce 256 launch. Except instead of hardware T&L being hyped it's RT being hyped. And just like Geforce 256, it'll struggle to give the experience that gamers and enthusiasts expect. But it does get the ball rolling for RT which is the important bit.

1 or 2 hardware generations from now is when I expect hardware accelerated RT to be both usable and impressive in games.

Also, unlike the Geforce 256, I won't be buying into it with first gen hardware. :p

Regards,
SB
 
Do we know if the RT cores, Tensor cores, and shader cores have the same clock speed? Or do they have independent clock domains they can adjust depending on the load?
 
Given how long it usually takes for new techniques to make it into games, even if they shipped software a year after getting access to production hardware it would still be impressive. I really don’t understand what people are complaining about (except the prices lol). Developers seem to be really stoked at the prospect of having another powerful tool in the box and I’m optimistic we’ll see exciting stuff soon.
 
I don't think people are looking at Turing properly, in terms of gaming. Why does Jensen have all this extra "stuff" on the chip, when Vulcan doesn't need it?

An interesting video....



Also, there are rumors that Huang's comments (about the ray tracing Giga Rays figures) are fabricated.
 
I don't think people are looking at Turing properly, in terms of gaming. Why does Jensen have all this extra "stuff" on the chip, when Vulcan doesn't need it?
What is it about the Enlisted implementation that indicates it's not using the RT cores?
 
I don't think people are looking at Turing properly, in terms of gaming. Why does Jensen have all this extra "stuff" on the chip, when Vulcan doesn't need it?
No API "needs it"; you could do ray tracing on DirectX 9 with the hardware of the day, if not even earlier. DXR (and probably Vulkan sooner or later) just offers a standardized way to do it.
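To underline that: the core of ray tracing is just arithmetic that any programmable hardware (or plain CPU code) can evaluate; what DXR standardizes is the plumbing around it. A minimal ray/sphere intersection sketch:

#include <cmath>
#include <optional>

struct Vec3 { float x, y, z; };
float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }

// Ray origin o, unit direction d, sphere centre c, radius r.
// Returns the nearest positive hit distance t along the ray, if any.
std::optional<float> raySphere(Vec3 o, Vec3 d, Vec3 c, float r) {
    Vec3 oc = sub(o, c);
    float b = dot(oc, d);                     // half the quadratic's b term
    float disc = b * b - (dot(oc, oc) - r * r);
    if (disc < 0.0f) return std::nullopt;     // ray misses the sphere
    float t = -b - std::sqrt(disc);           // nearest root
    if (t < 0.0f) return std::nullopt;        // sphere is behind the origin
    return t;
}

The expensive part at scale is traversing an acceleration structure over millions of triangles, which is exactly the bit Turing's RT cores exist to speed up.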
 
Nvidia's layers, or Vulcan's... ?

You mean Vulkan?

The Nvidia driver translates the Vulkan raytracing instructions to native code to run on the hardware. Just because they aren't using a Nvidia-specific API, doesn't mean that the Nvidia-specific hardware isn't being used or isn't a benefit. It's no different to the "CUDA cores" (another term for Nvidia's streaming processors) still getting used when running OpenCL instead of CUDA.
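For example, under Vulkan the Turing hardware was exposed through the VK_NV_ray_tracing device extension, discovered and enabled via the completely standard extension machinery (a sketch, error handling omitted):

#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

bool hasNvRayTracing(VkPhysicalDevice gpu) {
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());
    for (const auto& e : exts)
        if (std::strcmp(e.extensionName, "VK_NV_ray_tracing") == 0)
            return true; // enable via VkDeviceCreateInfo::ppEnabledExtensionNames
    return false;
}

The application only ever issues Vulkan calls; mapping them onto the RT cores is entirely the driver's business.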
 
No API "needs it"; you could do ray tracing on DirectX 9 with the hardware of the day, if not even earlier. DXR (and probably Vulkan sooner or later) just offers a standardized way to do it.


We are talking about Nvidia's Turing chip. Microsoft's DXR & Vulkan are open standards. Nvidia's solution is proprietary and layered... as they try to market data center GPUs as gamer cards.
 
We are talking about Nvidia's Turing chip. Microsoft's DXR & Vulkan are open standards. Nvidia's solution is proprietary and layered... as they try to market data center GPUs as gamer cards.

Developers don't have to use Nvidia's proprietary API to utilize Turing's raytracing hardware so Vulkan and DXR applications/games will be able to benefit from the presence of the RT and Tensor cores. So, as long as the custom hardware does enable better performance, I'm not sure what the problem is with the specific hardware implementation. This seems like a weird angle to criticize Turing on.
 
We are talking about Nvidia's Turing chip. Microsoft's DXR & Vulkan are open standards. Nvidia's solution is proprietary and layered... as they try to market data center GPUs as gamer cards.
RTX operates on top of DXR/Vulkan, it doesn't replace them, and based on everything we've been told so far, everything they've shown should work regardless of your video card manufacturer, assuming it supports DXR. Vulkan might be trickier since there's no standardized way yet (Nvidia is offering their extensions for this, I think).
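And on the DXR side the capability check itself is vendor-neutral; any GPU whose driver reports a raytracing tier can take the same code path. A sketch using the standard D3D12 feature query:

#include <d3d12.h>

bool supportsDxr(ID3D12Device* device) {
    // Vendor-agnostic: any driver exposing DXR reports a raytracing tier.
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}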
 