Direct3D feature levels discussion

I'm a bit confused about this. Has it actually been explicitly stated that Nvidia's new neural rendering technologies in the RTX Kit are Blackwell-only as a hard requirement? From what I can tell, the developer page mentions Blackwell only once, and it reads more like a marketing tie-in.

A performance requirement is a different matter; that would apply to simply supporting cooperative vectors in DirectX across all architectures as well.
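For context, the workload cooperative vectors accelerate is small matrix-vector products inside a shader, e.g. evaluating a tiny MLP per pixel for neural texture decompression. Here is a minimal NumPy sketch of that inference step; the layer sizes and names are hypothetical for illustration, not Nvidia's actual network:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny MLP: compressed latent vector -> decompressed texel (RGB).
# Real neural texture networks differ; sizes here are illustrative only.
W1 = rng.standard_normal((32, 16)).astype(np.float16)  # hidden-layer weights
b1 = np.zeros(32, dtype=np.float16)
W2 = rng.standard_normal((3, 32)).astype(np.float16)   # output-layer weights
b2 = np.zeros(3, dtype=np.float16)

def decode_texel(latent):
    """One per-pixel inference: two matrix-vector products plus a ReLU.
    Cooperative vectors expose exactly these mat-vec operations to shaders
    so tensor hardware (where present) can execute them."""
    h = np.maximum(W1 @ latent + b1, 0)  # hidden layer with ReLU
    return W2 @ h + b2                   # linear output: one RGB texel

latent = rng.standard_normal(16).astype(np.float16)
rgb = decode_texel(latent)
print(rgb.shape)  # (3,)
```

The point of the hardware question is only how fast those two mat-vec products run per pixel, not whether they run at all.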
 
Has it actually been explicitly stated that Nvidia's new neural rendering technologies in their RTX kit are Blackwell only as a hard requirement?
No. If anything, we have the opposite statements so far. But we don't know what impact the lack of Blackwell's hardware optimizations will have on the performance of any of these new features.
 
Turing is going to age like fine wine again if Neural Texture Decompression gets used, and RTX Geometry accelerating RT performance. Crazy architecture, just crazy.
Imagine buying a 2080 Ti in 2018, spending top dollar, and in 2025 you can still play everything. Its performance is on par with the upcoming 5060(?), the VRAM amount is still decent, and it's compatible with DLSS 4 and FSR 3 frame generation. You could keep playing on it almost until 2027.
 
Turing is going to age like fine wine again if Neural Texture Decompression gets used, and RTX Geometry accelerating RT performance. Crazy architecture, just crazy.

More like crazy developer relations and marketing. In an alternate universe none of this stuff gets used if Nvidia doesn’t push it aggressively. The idea that Alan Wake 2 is already being updated with RTX geometry is just nuts.
 
Turing is going to age like fine wine again if Neural Texture Decompression gets used, and RTX Geometry accelerating RT performance. Crazy architecture, just crazy.
The 1080 Ti gets all the praise, but the 2080 Super and 2080 Ti will last even longer. The 1080 Ti became obsolescent circa 2023, lasting six years. The 2080 Super and 2080 Ti will last until the PS6 launches. That's a whole decade. I expect the 4090 and 4080 Super to last until the PS7 too.
 
Turing is going to age like fine wine again if Neural Texture Decompression gets used, and RTX Geometry accelerating RT performance. Crazy architecture, just crazy.
When you compare Turing vs RDNA1 (2070 Super vs 5700 XT), you can find this:
  • In Alan Wake 2, the 2070 Super is 60% faster because of its mesh shader support.
  • In Avatar and Star Wars Outlaws, the 2070 Super is 20% faster because of ray tracing acceleration (hardware on 2070 Super vs software on 5700 XT).
  • In Indiana Jones and Metro Exodus EE, you can't even run those on the 5700 XT.
The 1080 Ti gets all the praise, but the 2080 Super and 2080 Ti will last even longer. The 1080 Ti became obsolescent circa 2023, lasting six years. The 2080 Super and 2080 Ti will last until the PS6 launches. That's a whole decade. I expect the 4090 and 4080 Super to last until the PS7 too.
By a similar comparison, the 2080 Super is 38% faster than the 1080 Ti in Avatar and Outlaws, and the 2080 Ti is 60% faster.
In Alan Wake 2, the 2070 Super (not the 2080 Super) is 40% faster than the 1080 Ti; the 2080 Super and 2080 Ti are probably 60% and 80% faster respectively. And you still can't run Indiana Jones and Metro Exodus EE on the 1080 Ti at all.
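These relative-performance figures compose multiplicatively, which is how the cross-vendor comparison falls out. A quick sketch of the arithmetic (the 60%/80% 2080-class numbers are the poster's estimates, not benchmarks):

```python
# Chain relative GPU performance figures multiplicatively.
# Alan Wake 2 figures from the thread; 2080-class values are the
# poster's estimates, not measured benchmarks.
baseline_1080ti = 1.00
s2070  = baseline_1080ti * 1.40  # "2070 Super is 40% faster than 1080 Ti"
s2080  = baseline_1080ti * 1.60  # estimated "60% faster"
ti2080 = baseline_1080ti * 1.80  # estimated "80% faster"

# Cross-check against the other quoted figure: the 2070 Super is 60%
# faster than the 5700 XT in the same game, which places the 5700 XT
# below the 1080 Ti here.
xt5700 = s2070 / 1.60
print(round(xt5700, 3))  # 0.875
```

So under these numbers the 5700 XT lands around 12% below the 1080 Ti in Alan Wake 2, which is the gap the mesh shader path explains.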
 