Nvidia GeForce RTX 50-series Blackwell reviews

I generally find something to enjoy in almost all of boy wonder Daniel Owen's videos.. but he's not known for holding back when it comes to clickbait.

This one veers into the parody of itself category:

 
The 5070 Ti is on the same level as a 4080 Super or 7900 XTX.

 
I guess you could call the results "interesting" in the sense that pretty much all of the benchmarks show where the card is math- or bandwidth-limited in comparison to the 4080/S.
In the first case it ends up slower, as you'd expect from the TFLOPS.
In the second, it's on par or even faster, again as you'd expect from the GB/s.
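To put rough numbers on that, here's a back-of-the-envelope sketch using the publicly listed reference specs (boost clocks and memory speeds vary by board, so treat the outputs as approximate):

```python
# Why the 5070 Ti trails the 4080 Super in compute-bound tests but can
# match or beat it in bandwidth-bound ones. Spec figures are the commonly
# quoted reference numbers; treat them as approximate.

def tflops(cuda_cores, boost_ghz):
    # FP32 throughput: 2 ops (one FMA) per core per clock.
    return 2 * cuda_cores * boost_ghz / 1000

def bandwidth_gb_s(bus_bits, mem_gbps):
    # Bytes/s = bus width in bytes * per-pin data rate.
    return bus_bits / 8 * mem_gbps

cards = {
    "RTX 5070 Ti":    dict(cores=8960,  ghz=2.45, bus=256, mem=28),  # GDDR7
    "RTX 4080 Super": dict(cores=10240, ghz=2.55, bus=256, mem=23),  # GDDR6X
}

for name, c in cards.items():
    print(f"{name}: {tflops(c['cores'], c['ghz']):.1f} TFLOPS, "
          f"{bandwidth_gb_s(c['bus'], c['mem']):.0f} GB/s")

# ~44 vs ~52 TFLOPS (about -16% compute) but 896 vs 736 GB/s (+22%
# bandwidth): which limit a game hits predicts which card wins.
```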
 
HUB managed to find a few games where the 5070 Ti was slower than the 4070 Ti Super. TPU had very different results. With the margins being so thin, these discrepancies can significantly influence people's perception depending on which reviews you consume.
 
HUB managed to find a few games where the 5070 Ti was slower than the 4070 Ti Super. TPU had very different results. With the margins being so thin, these discrepancies can significantly influence people's perception depending on which reviews you consume.
Have they reported clocks on their tests? There could be variations which throw the results off by a few percent. I think the smallest gap between the 4070 Ti Super and 5070 Ti we got was 9%.

Edit: we got 8% in Assetto Corsa and STALKER 2, both at 4K.
At 1080p it was just 3-4%.
 
HUB downclocked their 5070 Ti review sample to "stock" clocks, but that shouldn't cause an 8% swing. It's becoming increasingly important to focus on the games you actually play at the resolutions you care about; averages don't tell the full story.
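For a rough sense of scale: a typical factory OC sits only a few percent above reference, and games rarely scale 1:1 with core clock. A quick sketch (the OC clock below is hypothetical, not HUB's actual sample):

```python
# Rough sanity check on how much a factory overclock could plausibly move
# results. The factory OC figure is illustrative only.
stock_boost_mhz = 2452   # Nvidia's reference boost for the 5070 Ti
factory_oc_mhz  = 2588   # hypothetical AIB factory OC

clock_delta = factory_oc_mhz / stock_boost_mhz - 1
print(f"Clock delta: {clock_delta:.1%}")  # ~5.5%

# Memory and power limits mean the real-world swing is usually a fraction
# of the clock delta -- nowhere near enough to explain an 8% gap alone.
```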
 
One thing I always wonder in these days of upscaling: should I be paying attention to the 4K numbers, which usually show the biggest performance increase, or the 1080p numbers, which reflect the actual internal resolution (with DLSS Performance) I'll likely be running the most demanding games at with a GPU like the 5070 Ti?
 
Regarding PhysX being deprecated on the 50-series.. it sucks, but they've got to move on at some point, right? So long as all of these games can still be played without GPU PhysX acceleration.. then I don't see any big issue. We actually need IHVs and APIs to start being a bit more aggressive in deprecating some of this old stuff that's either not worth supporting anymore or is actively holding back progress in other areas. There are lots of tech demos from the past which no longer work on modern hardware.. it's the price we pay to move on.

Hardware-accelerated PhysX was in a small number of games to begin with, and yes, they are cool.. and yes, it's sad to see support fade away for some of this stuff... but 40-series GPUs still have many years of support left. There'll be GPUs out there which can play this stuff for a long time yet. If the extra effects are THAT important.. just keep one of these GPUs around lol.
 
Hardware-accelerated PhysX was in a small number of games to begin with, and yes, they are cool.. and yes, it's sad to see support fade away for some of this stuff... but 40-series GPUs still have many years of support left. There'll be GPUs out there which can play this stuff for a long time yet.

The problem is if you're looking to upgrade. A 40-series card is not an option right now, as Nvidia stopped making them, so any Nvidia upgrade path for my 3060 means losing these effects. I was actually looking forward to a 4K 60+ playthrough of Arkham Knight with all Nvidia effects enabled when I get a new card, welp.

Secondly, while I don't think Nvidia exactly needed to call a press conference for this or anything, the way it wasn't mentioned until people started filing bug reports is really not the best way they could have handled this. I gotta think some kind of better compromise was possible when Nvidia saw the writing on the wall for 32-bit CUDA years ago, instead of this sudden stealth deprecation.

Chances are this is not the last you'll hear of it. :)

 
I was actually looking forward to a 4K 60+ playthrough of Arkham Knight with all Nvidia effects enabled

Me too. I saw somewhere that Arkham Knight is on 64-bit PhysX libraries, so it should be safe. Glad I finished the earlier Batman games already.

Secondly, while I don't think Nvidia exactly needed to call a press conference for this or anything, the way it wasn't mentioned until people started filing bug reports is really not the best way they could have handled this. I gotta think some kind of better compromise was possible when Nvidia saw the writing on the wall for 32-bit CUDA instead of this stealth deprecation.

Yeah, all they had to do was drop a sad-face emoji and say they had no choice but to move on from legacy 32-bit CUDA. The silent sabotage is just asking for pitchforks.
 
The problem is if you're looking to upgrade. A 40-series card is not an option right now, as Nvidia stopped making them, so any Nvidia upgrade path for my 3060 means losing these effects. I was actually looking forward to a 4K 60+ playthrough of Arkham Knight with all Nvidia effects enabled when I get a new card, welp.
I think Arkham Knight works; it's the older PhysX titles, from the 360/PS3 era, that don't.
 
Me too. I saw somewhere that Arkham Knight is on 64-bit PhysX libraries, so it should be safe. Glad I finished the earlier Batman games already.



Yeah, all they had to do was drop a sad-face emoji and say they had no choice but to move on from legacy 32-bit CUDA. The silent sabotage is just asking for pitchforks.
I think Arkham Knight works; it's the older PhysX titles, from the 360/PS3 era, that don't.

Ah yeah, that tracks, it's not 32-bit. Arkham City, however. :(
 
One thing I always wonder in these days of upscaling: should I be paying attention to the 4K numbers, which usually show the biggest performance increase, or the 1080p numbers, which reflect the actual internal resolution (with DLSS Performance) I'll likely be running the most demanding games at with a GPU like the 5070 Ti?
1080p would be closer to the performance scaling you'll get, as long as ray reconstruction isn't used anyway.
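For reference, here's a small sketch of the commonly cited per-axis DLSS scale factors, which is where the "4K Performance = 1080p internal" intuition comes from:

```python
# Internal render resolution under DLSS, using the commonly cited
# per-axis scale factors for each mode.
DLSS_SCALE = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080) -- native 1080p
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440) -- native 1440p
```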
 
Ah yeah, that tracks, it's not 32-bit. Arkham City, however. :(
Yeah, Arkham Knight works; any 64-bit game will still work fine. However, 75% of PhysX games are 32-bit, so they are locked out.
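If you're unsure about a specific game, one way to check is to read the executable's PE header, since it's the binary's bitness that decides whether the 32-bit CUDA/PhysX path is involved. A minimal sketch (Windows PE files only; the path below is hypothetical):

```python
# Check whether a game exe is 32- or 64-bit by reading its PE header.
import struct

def pe_machine(path):
    with open(path, "rb") as f:
        f.seek(0x3C)                        # e_lfanew: offset of PE header
        (pe_offset,) = struct.unpack("<I", f.read(4))
        f.seek(pe_offset)
        assert f.read(4) == b"PE\x00\x00"   # PE signature
        (machine,) = struct.unpack("<H", f.read(2))  # COFF Machine field
    return {0x014C: "32-bit (x86)", 0x8664: "64-bit (x64)"}.get(machine, hex(machine))

# Hypothetical install path, just for illustration:
print(pe_machine(r"C:\Games\BatmanAK\Binaries\Win64\BatmanAK.exe"))
```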

I also remembered a very funny thing: there were two games that straight up relied on CUDA (not PhysX) for some visual features. Just Cause 2 used CUDA to render more realistic water and better depth of field, and NASCAR 14 used CUDA to render more realistic smoke and particles.

Both of these games are 32-bit, which means they won't be able to use these effects (they can't run on the CPU), which is hilarious and disgusting at the same time!
 