Nvidia GeForce RTX 50-series Blackwell reviews

32-bit PhysX support is deprecated on the RTX 50 series, which means 32-bit games with PhysX (the majority of PhysX games) will not be hardware accelerated on RTX 50 GPUs.

This is hugely unacceptable, and a needless move as well. A 4090 will work better than a 5090 in these games. I mean, what the hell!


How many 32-bit games out there (they must all be very old) still benefit from PhysX support? I suppose a modern CPU must be more than enough to run them well. If that's the case, it could very well be that running PhysX on the GPU would degrade performance.
 
How many 32-bit games out there (they must all be very old) still benefit from PhysX support? I suppose a modern CPU must be more than enough to run them well. If that's the case, it could very well be that running PhysX on the GPU would degrade performance.
It's not about whether running on the GPU makes sense performance-wise; it's about the fact that you don't get those fancy PhysX effects if you don't.
 
How many 32-bit games out there (they must all be very old) still benefit from PhysX support? I suppose a modern CPU must be more than enough to run them well. If that's the case, it could very well be that running PhysX on the GPU would degrade performance.
There are people with low-end CPUs who will buy 5060s and 5050s, and those CPUs will almost certainly not be able to handle the load.

Also there is a high chance the game code won't allow these advanced PhysX effects to work on the CPU to begin with. Either way this looks bad for NVIDIA and their supposedly formidable software support.
 
It's not about whether running on the GPU makes sense performance-wise; it's about the fact that you don't get those fancy PhysX effects if you don't.

Are there any fancy PhysX effects requiring GPU? I mean, it must be relatively easy to do them on the CPU now, if there's interest in them.
I'm personally not aware of any game I'm currently playing even being 32-bit. Even World of Warcraft (I'm not playing it now, though) dropped 32-bit support back in 2018. I can only imagine most of these games must be very old.
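If you want to check whether a particular game is 32-bit, a minimal sketch like the following works: the Windows PE header stores a "Machine" field (0x014C for x86/32-bit, 0x8664 for x64) right after the PE signature, whose offset is stored at 0x3C. Point it at a game's main `.exe`:

```python
import struct

def exe_bitness(path):
    """Return '32-bit', '64-bit', or 'unknown' for a Windows PE executable."""
    with open(path, "rb") as f:
        if f.read(2) != b"MZ":                   # DOS header magic
            return "unknown"
        f.seek(0x3C)                             # offset of PE signature
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        if f.read(4) != b"PE\x00\x00":           # PE signature
            return "unknown"
        machine = struct.unpack("<H", f.read(2))[0]  # COFF Machine field
        return {0x014C: "32-bit", 0x8664: "64-bit"}.get(machine, "unknown")
```

Run it on, say, `Borderlands2.exe` and you'll see it's a 32-bit binary, which is exactly why it's hit by this deprecation.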
 
Are there any fancy PhysX effects requiring GPU?
Yeah, fluid simulations in the Borderlands games and Cryostasis, plus smoke/fog simulation in the Arkham games, the Metro games, and Assassin’s Creed 4.

Here is the Cryostasis PhysX demo running at 13fps on a 7950X3D and 5090, vs 100fps on the 4090.

Utterly shameful and disgraceful.

 
This means that the RTX 5090 and the RTX 5080 (and all other RTX 50 series GPUs) cannot run games like Cryostasis, Batman: Arkham City, Borderlands 2, GRAW 2, Mirror’s Edge, Assassin’s Creed IV: Black Flag, or BioShock Infinite with GPU-accelerated PhysX. Instead, you’ll have to rely on the CPU PhysX solution, which is similar to what AMD GPUs have been offering all these years.
I've tried running about half of these on my 4090 at various times over the last 2 years, and I can certainly say that enabling GPU PhysX in them has led to some serious performance issues which basically made them unplayable.
So I'm not sure this is such a big deal. The tech seems to have been abandoned for quite some time now in terms of support; deprecating it completely in an official capacity looks to me like just an acceptance of what has already happened anyway.
It would be cool if they'd go back and patch all these games to work properly on modern OSes/hardware, of course, but that isn't entirely an Nvidia task to perform here.
 
Hey I got an idea for a HUB video.. 'Nvidia's cynical ruse requires GAMERZ BUY A SECOND GPU'
Hell yeah let's bring back those dedicated Physx GPU good times!

Actually this also sounds like something GN might love, too. I'll start a bidding war between the Steves.
Cause who wouldn't want to snark about cleaning the pipeline of unwanted 4060s?
 
32-bit PhysX support is deprecated on the RTX 50 series, which means 32-bit games with PhysX (the majority of PhysX games) will not be hardware accelerated on RTX 50 GPUs.

This is hugely unacceptable, and a needless move as well. A 4090 will work better than a 5090 in these games. I mean, what the hell!


Damn this sucks. The Physx effects are still impressive in Arkham City and Borderlands 2.

Who really cares about losing proprietary feature support in software that's been obsolete for over a decade now?

"Obsolete" isn't particularly relevant to the complaint here. PhysX ultimately moving to be wholly CPU-based does nothing for existing games, where you lose a significant number of effects without GPU acceleration support (or, at the least, the ability to run them at acceptable framerates).

Basically, a bunch of older titles just got a graphical downgrade on the latest Nvidia GPUs.

 
I do.

I’m currently playing Borderlands: The Pre-Sequel, and I haven’t played Batman: Arkham Knight yet.
And here I thought, from my impressions, that the PC tech enthusiast community at large wasn't so 'principled' about, or interested in, maximalist preservation of games...
 
"Obsolete" isn't particularly relevant to the complaint here. PhysX ultimately moving to be wholly CPU-based does nothing for existing games, where you lose a significant number of effects without GPU acceleration support (or, at the least, the ability to run them at acceptable framerates).

Basically, a bunch of older titles just got a graphical downgrade on the latest Nvidia GPUs.
Well, that's called "legacy cruft", especially with non-standard proprietary technical environments like CUDA!

Regressions such as these (including past examples involving 3D Vision and CSAA) are going to be part of the deal in bringing about a "better future"...
 
Are there any fancy PhysX effects requiring GPU? I mean, it must be relatively easy to do them on the CPU now, if there's interest in them.
I'm sure it would be, if NVIDIA allowed it (and made it multithreaded)
 
I've tried running about half of these on my 4090 at various times over the last 2 years and I can certainly say that enabling GPU PhysX in them has lead to some serious performance issues which basically made them unplayable.
The GPU PhysX code is usually single-threaded and is prone to frame pacing issues; however, the average frame rate is so much higher than the CPU path. This is a separate issue.
Basically, a bunch of older titles just got a graphical downgrade on the latest Nvidia GPUs.

Completely agree with Alex's take on this.

Who really cares about losing proprietary feature support in software that's been obsolete for over a decade now?
All enthusiasts care, lots of people on Reddit and on Resetera care. The backlash from this is going to be so severe.
 
The GPU PhysX code is usually single-threaded and is prone to frame pacing issues; however, the average frame rate is so much higher than the CPU path. This is a separate issue.
Dunno what issue that is, but Cryostasis was unplayable for me with GPU PhysX, as well as AC Black Flag.
Deprecation of old 32-bit drivers is to be expected in general, and is very much required to maintain OS security and compatibility.
I'm not sure what can be done here aside from games being patched to remove said options, like what Mafia 2 DE did, which is hardly saving anything.
 
All enthusiasts care, lots of people on Reddit and on Resetera care. The backlash from this is going to be so severe.
If the protests ultimately fail, the only thing left that might make them bat an eye is their financial statements; and should there be no material impact there either, what else can people do aside from conceding?

Doesn't it feel good for the industry to converge on standardized technologies, because that way we can avoid more shortsighted tragedies from happening again?
 
Dunno what issue that is, but Cryostasis was unplayable for me with GPU PhysX, as well as AC Black Flag.
I had similar issues; they are mostly frame pacing issues that happen in several old games (so not necessarily PhysX-related). Using vertical sync/fast sync or a frame cap solved them for me.
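For context on why a frame cap helps pacing: a limiter forces each frame to start on a fixed time grid, so occasional fast frames can't bunch up against slow ones. Here's a minimal illustrative sketch of a software frame limiter (not tied to any specific game or the actual driver-level implementation):

```python
import time

def frame_limiter(target_fps=60):
    """Generator: each next() blocks until the next frame deadline,
    spacing frames ~1/target_fps apart regardless of render-time jitter."""
    frame_time = 1.0 / target_fps
    next_deadline = time.perf_counter()
    while True:
        next_deadline += frame_time
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)           # wait out the leftover budget
        else:
            next_deadline = time.perf_counter()  # fell behind; re-anchor
        yield

# usage: call next(limiter) once per frame in the game loop
limiter = frame_limiter(60)
for _ in range(3):
    next(limiter)
```

Vsync achieves a similar effect by anchoring frame delivery to the display's refresh interval instead of a software timer.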

Deprecation of old 32-bit drivers is to be expected in general, and is very much required to maintain OS security and compatibility.
I don't care about the reason. If security is the reason, then NVIDIA can provide a beta or experimental branch for users to run at their own risk; leaving users hung out to dry like this is completely unacceptable.

Doesn't it feel good for the industry to converge on standardized technologies because that way we can avoid more shortsighted tragedies from happening again?
Let's not go there, please. The industry is a big piece of slow-moving shit: they converged on the shitty DX12 API, and after so many years they have still failed to converge on a physics standard. That's why we are so pissed off about this PhysX thing; PhysX games are gems among games in general, as they provide visual effects not seen in any other game to this day.
 