Digital Foundry Article Technical Discussion [2024]

Are objects missing in the frames where PhysX is off? Timestamped below: the PhysX frames have papers on the floor, fire extinguisher effects, etc...
Performance at the expense of visual fidelity.
 
The only PhysX game where I have a stability issue is Black Flag and its smoke effects.
It's Ubisoft; they needed to update the game to handle more powerful GPUs. The PhysX implementation in that game works fine on Kepler, but on anything newer it stutters (it's a CPU-limited implementation).
I think Fallout 4 also crashes with GPU PhysX effects enabled.
Yeah, it's the only PhysX implementation that crashes as of right now. It's a Bethesda problem, as it works fine on any GPU older than Turing. The developer needed to patch the game, but of course it's Bethesda, so they didn't.

And of course there is a mod that fixes the issue.
That drop from 357fps to 92fps is brutal!!
Yup, I guess advanced GPU physics are still an fps hog to this day.

Are objects missing in the frames where PhysX is off? Timestamped below: the PhysX frames have papers on the floor, fire extinguisher effects, etc...
Yeah, that's always been the case with PhysX games: turning PhysX off completely gets rid of the objects with advanced physics.
 
Yup, I guess advanced GPU physics are still an fps hog to this day.

Yeah, but the numbers are also being skewed by GPUs getting better at rendering old games faster than they're getting better at running tough workloads. The "drop" from 400fps to 200fps isn't that interesting. Better to look at the absolute cost in frametimes.
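
To put numbers on that, here's a quick sketch (plain Python, using the fps figures quoted in this thread) of what those drops cost in actual frametime:

```python
# Convert fps pairs into an absolute per-frame cost in milliseconds.
# The (before, after) pairs are just the numbers quoted in this thread.
def frametime_cost_ms(fps_before: float, fps_after: float) -> float:
    """Extra milliseconds each frame takes once the feature is enabled."""
    return 1000.0 / fps_after - 1000.0 / fps_before

for before, after in [(400, 200), (357, 92)]:
    print(f"{before}fps -> {after}fps: "
          f"+{frametime_cost_ms(before, after):.2f} ms per frame")

# 400fps -> 200fps: +2.50 ms per frame (a scary-looking "50% drop")
# 357fps -> 92fps:  +8.07 ms per frame (roughly half a 60fps frame budget)
```

A 2.5ms cost is basically invisible at a 60fps target; an 8ms cost is not.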

I wonder how a proper async-compute-based GPU physics solver would fare today. You could likely overlap a lot of the compute work with other 3D stuff.
 
I don't want to see or hear any bitching from anyone again about the performance drop from ray tracing after seeing this...

That drop from 357fps to 92fps is brutal!!

Yeah, but people bitched about PhysX performance all the time during its heyday, even on high-end Nvidia GPUs. Considering that scene in particular shows GPU PhysX running 'only' ~2x faster than the same effects run on the CPU, when the rest of the benchmarks are closer to 3-5x, I'd say there's something particularly off about that section.

Considering how long it's been since any (prominent) game shipped with GPU PhysX, I'd wager Nvidia hasn't exactly spent much time optimizing it for newer architectures over the years. You've got scenes in that Compubase video showing a near-3x performance hit when the PhysX side is just a few newspaper clippings blowing around. Hell, I remember playing Arkham City on my 1660, and PhysX on High just meant I had to drop the res from 4K to 1800p to maintain 60. It's probably extremely inefficient on Ada.

It'd be another matter if, like RT, Nvidia were promoting GPU PhysX as the new frontier. Now it's basically in legacy support mode.
 
Considering how long it's been since any (prominent) game shipped with GPU PhysX, I'd wager Nvidia hasn't exactly spent much time optimizing it for newer architectures over the years.

They actually still use it in their "suite of tools" for products like Omniverse.
Physical representations
Omniverse supports various specifications and standards that simplify the exchange of 3D-related data across multiple tools. Nvidia is working with the Universal Scene Description (USD) community to extend the specification to support Material Definition Language (MDL) and PhysX capabilities.

PhysX describes objects' mechanical and fluid properties and interactions.
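
For what it's worth, the physics side of that USD work is a real schema you can poke at today. A minimal sketch, assuming a Pixar USD build whose Python bindings (pxr) include the UsdPhysics schema:

```python
# Minimal sketch: marking USD prims up with physics properties via the
# UsdPhysics schema (the one the quoted Omniverse/PhysX work feeds into).
from pxr import Usd, UsdGeom, UsdPhysics

stage = Usd.Stage.CreateNew("physics_demo.usda")

# A cube that should behave as a dynamic rigid body under gravity...
cube = UsdGeom.Cube.Define(stage, "/World/Cube")
UsdPhysics.RigidBodyAPI.Apply(cube.GetPrim())
UsdPhysics.CollisionAPI.Apply(cube.GetPrim())

# ...and a static ground it can collide with (collision only, no rigid body).
ground = UsdGeom.Cube.Define(stage, "/World/Ground")
UsdPhysics.CollisionAPI.Apply(ground.GetPrim())

stage.Save()
```

Any engine that understands UsdPhysics (Omniverse's PhysX backend being the obvious one) can then simulate the scene from that file.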
 
Just a theory here, but my guess is PhysX has some limitations in terms of how parallel it is, and I would guess physics in general does. Which means if you want faster physics, as opposed to more complex physics (in terms of both scope and sample rate), modern GPUs might be too wide.
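
That intuition is basically Amdahl's law: if part of a solver step is inherently serial (constraint dependency chains, etc.), extra width stops helping very quickly. A back-of-the-envelope sketch, with serial fractions invented purely for illustration:

```python
# Amdahl's law: speedup = 1 / (serial + (1 - serial) / n_units).
# The serial fractions below are made up for illustration only.
def amdahl_speedup(serial_fraction: float, n_units: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_units)

for serial in (0.05, 0.20):
    for n in (512, 4096, 16384):  # roughly "old GPU" to "modern GPU" widths
        print(f"serial={serial:.0%}, {n:>5} units: "
              f"{amdahl_speedup(serial, n):5.1f}x")

# Even a 20% serial portion caps the speedup near 5x no matter how wide
# the GPU gets, so extra width buys you *more* physics, not *faster* physics.
```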
 
Proprietary features available on only one hardware vendor are a curse. They solve issues in ways that can't be applied homogeneously in software production, because why bother adding a feature if it can't be turned on for half or more of your potential customers?
 
Good to know. I haven't played Arkham Knight yet. And yeah, PhysX was a mess in Black Flag.
I tested this recently, btw, out of curiosity. PhysX causes huge stutters on the Medium and High settings, but not on Low. The Medium and High settings must have some sort of awful bug with them.
 
Proprietary features available on only one hardware vendor are a curse. They solve issues in ways that can't be applied homogeneously in software production, because why bother adding a feature if it can't be turned on for half or more of your potential customers?
I don't mind a hardware vendor trying to gain some market advantage by innovating and improving in areas they spent their own time and money researching and developing. That seems entirely fair to me. If these features are perceived as desirable enough, it usually gets the ball rolling on alternative/open solutions, so we do ultimately all benefit in the end, with a little patience.

It'd be different if they had features/tech that were so locked down by patents or whatever that nobody else could ever come up with comparable alternatives, but that's usually not the case, so I think the situation is fine.
 
The Medium and High settings must have some sort of awful bug with them.
They activate interactive smoke particles from smoke bombs, guns, and fireplaces (camps, chimneys, etc.). These particles receive and cast shadows, and once they are in view of the player they cause stutters.

Something in their behavior causes the stutters; maybe it's the shadowing, or maybe the number of particles is too high.
 
I don't mind a hardware vendor trying to gain some market advantage by innovating and improving in areas they spent their own time and money researching and developing. That seems entirely fair to me. If these features are perceived as desirable enough, it usually gets the ball rolling on alternative/open solutions, so we do ultimately all benefit in the end, with a little patience.

It'd be different if they had features/tech that were so locked down by patents or whatever that nobody else could ever come up with comparable alternatives, but that's usually not the case, so I think the situation is fine.
I didn't say that my problem is a vendor enjoying the fruits of their innovation, and I didn't bring up the subject of fairness either.

It's the vanishing of that innovation altogether that I am talking about.
 
Trapped in a plain white room with only two plants for company, an on-location Rich dials into the next DF Direct Weekly for salvation. Joining him this week are John and Alex, discussing the shutdown of the Yuzu and Citra emulators, the latest Sony PC ports from Nixxes, the Xbox partner showcase and hints of machine learning-based upscaling from AMD. Meanwhile, John's beard attracts an extraordinary and possibly unhealthy level of attention.

0:00:00 Introduction
0:01:30 News 01: Nintendo forces Yuzu, Citra shutdown
0:28:16 News 02: Horizon PC specs released, Ghost of Tsushima PC announced
0:39:29 News 03: Xbox partner showcase drops!
0:58:53 News 04: Dragon’s Dogma 2 has unlocked frame-rate on consoles
1:09:03 News 05: Footage emerges from cancelled TimeSplitters game
1:15:30 News 06: AMD teases AI upscaling… but what is it?
1:26:42 Supporter Q1: Should I buy an OLED Steam Deck now, or wait for new portable Windows handhelds?
1:33:56 Supporter Q2: If cost were no issue, what VR headset would be best for PC gaming?
1:37:23 Supporter Q3: How does DLSS quality at 1440p compare to DLSS performance at 4K?
1:42:28 Supporter Q4: Do you think Sony will release their own dedicated gaming handheld?
1:48:20 Supporter Q5: How does John feel about being labeled as a television producer?
1:49:00 Supporter Q6: Did John and Marc get up to any shenanigans in Germany?
1:52:16 Supporter Q7: What’s the longest John has gone without shaving his beard?
 



Nice to see that Rich is also a big Trek fan!
 
Your GPU can be used for AI acceleration (NPU = neural processing unit), but whether this setting can use GPUs for it, who knows.
There is this news today mentioning an NPU (or TPU, depending on the device) that basically doubles the performance of your computer without buying new components or changing anything on your computer (hardware-wise).

I guess this could be used on consoles, so you don't have to buy a PS5 Pro (perhaps it's just me, but the PS5 seems powerful enough).
 
There is this news today mentioning an NPU (or TPU, depending on the device) that basically doubles the performance of your computer without buying new components or changing anything on your computer (hardware-wise).

I guess this could be used on consoles, so you don't have to buy a PS5 Pro (perhaps it's just me, but the PS5 seems powerful enough).

That's not what I got from it; it seems they're just running the code on the logic that's fastest at running it.

Which consoles do anyway, because they're consoles.
 
Something always worth keeping in mind is that the term "more performance" doesn't inherently differentiate between latency and throughput, or the specifics of the workload. From what is being said, my guess is that it helps with the throughput of "larger" workloads that can more easily be queued. I'm not sure it would be as applicable to something like real-time gaming, just going by what is being said and the theory behind it.

The downside is that bottlenecks can happen as data is shuffled between different units, affecting the speed and efficiency with which tasks can be completed. By running more subtasks simultaneously in parallel, across multiple processors, the researchers hope to regain lost time and energy.
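
To make the latency-vs-throughput distinction concrete, here's a toy sketch (plain Python; the workload is invented) of why spreading independent subtasks across processors raises throughput without making any single task finish sooner:

```python
# Toy illustration: parallelism across processors improves throughput,
# but the latency of each individual subtask is unchanged.
import time
from concurrent.futures import ProcessPoolExecutor

def subtask(n: int) -> int:
    """One independent unit of work (a busy loop stands in for real compute)."""
    acc = 0
    for i in range(2_000_000):
        acc += (i * n) % 7
    return acc

if __name__ == "__main__":
    tasks = list(range(32))

    # Serial: total time is ~32x the single-task latency.
    t0 = time.perf_counter()
    serial_results = [subtask(n) for n in tasks]
    t_serial = time.perf_counter() - t0

    # Parallel: tasks/second scales with worker count, but each subtask
    # still takes as long as before, so a lone task gains nothing.
    t0 = time.perf_counter()
    with ProcessPoolExecutor(max_workers=8) as pool:
        parallel_results = list(pool.map(subtask, tasks))
    t_parallel = time.perf_counter() - t0

    print(f"serial:   {t_serial:.2f}s ({len(tasks) / t_serial:.1f} tasks/s)")
    print(f"parallel: {t_parallel:.2f}s ({len(tasks) / t_parallel:.1f} tasks/s)")
```

That's why a scheme like this sounds more useful for queued batch work than for shaving milliseconds off a single game frame.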
 