Nvidia Turing Product Reviews and Previews: (Super, TI, 2080, 2070, 2060, 1660, etc)

So people really think that Turing is a different core architecture than Volta?

Just look at the basics:
Same concurrent floating-point and integer execution units.
Same L1 architecture, where shared memory and the L1 cache are now merged into one.

How many hints do you need?
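
For what it's worth, that unified design is something you can actually poke at from CUDA on both Volta and Turing: shared memory and L1 sit in one physical pool, and a kernel can hint how much of it to carve out for shared memory. A minimal sketch below, assuming nothing beyond the public runtime API; the kernel and the 50% carveout figure are purely illustrative.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Illustrative kernel: stages data through shared memory, which on
// Volta/Turing lives in the same physical SRAM pool as the L1 cache.
__global__ void stage_through_smem(const float* in, float* out, int n)
{
    extern __shared__ float tile[];
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    tile[threadIdx.x] = (i < n) ? in[i] * 2.0f : 0.0f;  // FP32 work
    __syncthreads();
    if (i < n)
        out[i] = tile[threadIdx.x] + 1.0f;
}

int main()
{
    const int n = 1 << 20;
    float *in, *out;
    cudaMallocManaged(&in,  n * sizeof(float));
    cudaMallocManaged(&out, n * sizeof(float));
    for (int i = 0; i < n; ++i) in[i] = float(i);

    // Hint that this kernel prefers ~50% of the unified L1/shared pool as
    // shared memory (a hint only; meaningful on Volta/Turing and newer).
    cudaFuncSetAttribute(stage_through_smem,
                         cudaFuncAttributePreferredSharedMemoryCarveout, 50);

    const int block = 256;
    const int grid  = (n + block - 1) / block;
    stage_through_smem<<<grid, block, block * sizeof(float)>>>(in, out, n);
    cudaDeviceSynchronize();

    printf("out[123] = %f\n", out[123]);  // expect 123*2 + 1 = 247
    cudaFree(in);
    cudaFree(out);
    return 0;
}
```

The carveout attribute is only a preference the driver may round or ignore, but the fact that the same knob exists on both architectures is exactly the kind of hint being pointed at here.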
 
Why would Vega be a dud? Turing is basically Vega + RT + Tensor so Vega rasterization performance with the same number of SMs should be very similar with a smaller die size.

If Vega would be a dud then so is Turing.

It does seem though that nvidia recalibrated its releases after seeing Pascal’s competition. This was probably the best time to gamble on RT transistors.

*Volta, surely.

I think what Rootax is saying is that nVidia never bothered to develop a full Volta stack because Vega was a "dud", being content to rely on Pascal instead while further developing the microarchitecture into what is now Turing.
 
*Volta, surely.

I think what Rootax is saying is that nVidia never bothered to develop a full Volta stack because Vega was a "dud", being content to rely on Pascal instead while further developing the microarchitecture into what is now Turing.

Yes, thanks for the translation :)
 
So people really think that Turing is a different core architecture than Volta?

Just look at the basics:
Same concurrent floating-point and integer execution units.
Same L1 architecture, where shared memory and the L1 cache are now merged into one.

How many hints do you need?


So if the GTX 2060 really lacks RT hardware, will nvidia say it uses Volta architecture?

Assuming there might be a new xx106 chip to replace GP106 of course, which isn't certain at the moment.
 
The RTX nomenclature is the indicator of the presence of RT hardware, not some codename only forum nerds care about. If it's called GTX then I don't care if they still call it a Turing or TU106 as long as it's presented to the public correctly. But who knows what they'll call the chip, or what architecture they'll label it as. Knowing Nvidia they'll probably brand it as Turing with an RTX name and have zero RT hardware on it.
 
It's fake.

:'(

Ah well, I should've twigged how unlikely a leak it was. Doesn't mean it won't happen; the game will be out in like a year.

Also, I've seen a lot about the dedicated raytracing hardware, but don't tell me RTX has dedicated inference cores too? I mean, I get that the local cache requirements versus execution speed for inferencing are much higher than for training and other normal GPU work. But separating out every possible software scenario into its own special hardware costs a lot of money. No wonder the 2080 has less performance per dollar than a 1080 Ti.
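
On the inference point: the RTX cards do keep the Tensor Cores carried over from Volta, and CUDA exposes them through the nvcuda::wmma intrinsics rather than any separate "inference" API. A rough sketch of what that looks like (needs sm_70 or newer to compile; the sizes and names here are just for illustration):

```cuda
#include <cstdio>
#include <cuda_fp16.h>
#include <mma.h>
using namespace nvcuda;

// One warp computes a 16x16x16 half-precision matrix multiply on the
// Tensor Cores, accumulating in FP32 (the usual mixed-precision path).
__global__ void wmma_16x16x16(const half* a, const half* b, float* c)
{
    wmma::fragment<wmma::matrix_a, 16, 16, 16, half, wmma::row_major> fa;
    wmma::fragment<wmma::matrix_b, 16, 16, 16, half, wmma::col_major> fb;
    wmma::fragment<wmma::accumulator, 16, 16, 16, float> fc;

    wmma::fill_fragment(fc, 0.0f);
    wmma::load_matrix_sync(fa, a, 16);   // leading dimension 16
    wmma::load_matrix_sync(fb, b, 16);
    wmma::mma_sync(fc, fa, fb, fc);      // issued to the Tensor Cores
    wmma::store_matrix_sync(c, fc, 16, wmma::mem_row_major);
}

int main()
{
    half  *a, *b;
    float *c;
    cudaMallocManaged(&a, 16 * 16 * sizeof(half));
    cudaMallocManaged(&b, 16 * 16 * sizeof(half));
    cudaMallocManaged(&c, 16 * 16 * sizeof(float));
    for (int i = 0; i < 16 * 16; ++i) {
        a[i] = __float2half(1.0f);
        b[i] = __float2half(1.0f);
    }

    wmma_16x16x16<<<1, 32>>>(a, b, c);   // exactly one warp
    cudaDeviceSynchronize();
    printf("c[0] = %f (expect 16)\n", c[0]);

    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}
```

That says nothing about whether games will actually use the units for denoising, of course; it just shows the "inference" hardware is programmable matrix-math, not a fixed-function block.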
 
According to Digital Foundry, who spoke to DICE engineers on site:

1-BFV was being developed on Titan V; as such, the whole RTX path lacked proper RT acceleration from the RT cores.
2-DICE only received Turing hardware two weeks before the demo.
3-DICE is also not currently using the Tensor Cores for denoising.
4-At this stage, 1440p performance is 40-50 fps and 4K is sub-30 fps; DICE is amazed the game worked at 4K at all.
5-DICE will make the ray tracing quality scalable across different GPUs, and will also decouple the RT resolution from the rendering resolution.
6-RT acceleration is currently underused, as the ray tracing only runs after the G-Buffer stage; DICE will change it to run asynchronously alongside rasterization, which should give a significant FPS boost (a rough sketch of that overlap idea follows this list).
7-DICE will also be merging the drawn object instances into the same acceleration structure; they expect this to give an almost 30% increase in ray tracing performance.
8-DICE is also looking into RT ambient occlusion and RT particles.
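
Point 6 is essentially about overlapping two independent chunks of GPU work instead of running them back to back. This isn't DICE's actual DX12 async-compute path, just a hedged CUDA streams analogy of the same idea, with made-up kernel and stream names:

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Stand-ins for two independent chunks of GPU work (think raster-era
// shading vs. ray queries); both just burn cycles for illustration.
__global__ void busy_kernel(float* data, int n, int iters)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float v = data[i];
    for (int k = 0; k < iters; ++k)
        v = v * 1.000001f + 0.5f;
    data[i] = v;
}

int main()
{
    const int n = 1 << 20;
    float *a, *b;
    cudaMalloc(&a, n * sizeof(float));
    cudaMalloc(&b, n * sizeof(float));
    cudaMemset(a, 0, n * sizeof(float));
    cudaMemset(b, 0, n * sizeof(float));

    cudaStream_t raster, rays;
    cudaStreamCreate(&raster);
    cudaStreamCreate(&rays);

    const int block = 256, grid = (n + block - 1) / block;
    // Launched into separate streams, the two kernels may execute
    // concurrently when the GPU has spare SMs, instead of serially.
    busy_kernel<<<grid, block, 0, raster>>>(a, n, 2000);
    busy_kernel<<<grid, block, 0, rays>>>(b, n, 2000);
    cudaDeviceSynchronize();

    printf("both workloads done\n");
    cudaStreamDestroy(raster);
    cudaStreamDestroy(rays);
    cudaFree(a);
    cudaFree(b);
    return 0;
}
```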
 
According to Digital Foundry, who spoke to DICE engineers on site:

1-BFV was being developed on Titan V; as such, the whole RTX path lacked proper RT acceleration from the RT cores.
2-DICE only received Turing hardware two weeks before the demo.
3-DICE is also not currently using the Tensor Cores for denoising.
NOW I am really excited for Turing and RT as a rendering technique :)

If that is really the case, the upcoming titles with RT/RTX will be a whole new ballgame. And BFV already looks amazing, as SofTR does. Go, RayTracing, Go! :)
 
NOW I am really excited for Turing and RT as a rendering technique :)

If that is really the case, the upcoming titles with RT/RTX will be a whole new ballgame. And BFV already looks amazing, as SofTR does. Go, RayTracing, Go! :)
While the raytracing itself might look "amazing", BFV is quickly becoming the poster child for "how not to do your RT effects" - every window is like a crystal-clear shining mirror and even the damn buildings themselves seem almost reflective
 
While the raytracing itself might look "amazing", BFV is quickly becoming the poster child for "how not to do your RT effects" - every window is like a crystal-clear shining mirror and even the damn buildings themselves seem almost reflective

Exaggerating the effect seems in line, though, with the fact that they had very little time with the actual hardware. It probably wasn't their intention.
 
I'm not sure if this is the best thread for this, or which is (mods, move if needed?), but apparently NVIDIA is keeping a tight leash on Turing.
According to [H]ardOCP, they require AIBs to let NVIDIA know who gets to review their custom cards, and they require everyone reviewing (AIB custom or FE, doesn't matter) to sign a 5-year NDA before getting drivers.
https://www.hardocp.com/article/2018/08/28/nvidia_controls_aib_launch_driver_distribution/
Since when did Nvidia and AMD not know who got what cards? AMD themselves tightly controlled which AIBs sent cards. And the reviewers have already signed NDAs, otherwise they wouldn't be getting anything.
 
Since when did Nvidia and AMD not know who got what cards? AMD themselves tightly controlled which AIBs sent cards. And the reviewers have already signed NDAs, otherwise they wouldn't be getting anything.
You never had to sign a 5 f''..ING year NDA just to review a card though...
 
It's probably best to start a new thread if you want to promote Kyle's self-interests. As Guru3D said:
That's just a big can of nonsense (and I initially wrote another word there). NVIDIA always has tracked what media gets what AIB samples, period. You know who does that as well? AMD, they even regulate what brand and sample end up at what reviewer. How conveniently he forgets to mention that.
I think Kyle is letting his feelings get the better of his judgment.
https://forums.guru3d.com/threads/n...stom-rtx-2080-ti-reviews.422723/#post-5579179
 