Water! PhysX Position Based Dynamics

Damn! Different devs have been showing similar demos for bloody ages, but not a single game has shown anything like that in real time.

Just try to imagine creating a complete lake, or even just a river, with that plus PhysX interaction (like logs floating on the surface..) - it would need a huge amount of compute time..

Same for the smoke... In addition, it needs to fit with the gameplay / the graphics / the situation.
If it's only added to a game scene because you want to show it off, it's better not to do it at all.

Let's take the example of someone smoking a cigarette in a game: yes, you can put in incredibly realistic smoke, but if just the smoke from that cigarette costs you 60% of your compute time.. is it really needed?
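To put very rough numbers on the lake/river point above (every figure below is an assumption for illustration, not anything measured in PhysX or FleX):

```cpp
// Back-of-envelope cost estimate for a particle-based river section.
// All numbers are assumptions for illustration, not PhysX/FleX figures.
#include <cstdio>

int main() {
    const double section_m3      = 50.0 * 10.0 * 2.0; // 50 m x 10 m x 2 m of water
    const double particle_m      = 0.05;              // assumed particle spacing: 5 cm
    const double particles       = section_m3 / (particle_m * particle_m * particle_m);

    const double neighbors       = 30;   // assumed neighbor count per particle
    const double solver_iters    = 4;    // assumed constraint iterations per step
    const double steps_per_frame = 2;    // assumed substeps
    const double fps             = 60;

    const double pair_ops_per_s  = particles * neighbors * solver_iters
                                 * steps_per_frame * fps;

    std::printf("particles           : %.0f\n", particles);
    std::printf("pair interactions/s : %.2e\n", pair_ops_per_s);
    // Roughly 8 million particles and ~10^11 pair interactions per second for
    // one river section -- before rendering, shadows, or the rigid-body
    // coupling for the floating logs are even considered.
    return 0;
}
```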
 
A separate low-end GPU just for PhysX? I mean, you can get a 192-core Kepler GT 720 for cheap. You don't even need good bandwidth, so 64-bit memory is okay.

I know that will never take off, but Nvidia could promote some of their low-end GPUs purely for physics. Obviously you'd need a motherboard with two PCIe slots.
 
I looked at the thread's second page and PCIe bandwidth seems to be an issue (latency probably doesn't help either). "Bad news": the GT 720 is a PCIe 2.0 x8 card.
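A quick sanity check on what PCIe 2.0 x8 actually gives you, with an assumed particle count and payload size purely for illustration:

```cpp
// Rough check of PCIe 2.0 x8 throughput vs. a per-frame physics transfer.
// Particle count and payload size are assumptions for illustration only.
#include <cstdio>

int main() {
    const double lane_gb_s   = 0.5;                  // PCIe 2.0: ~500 MB/s per lane per direction
    const double lanes       = 8.0;                  // x8 link, as noted above for the GT 720
    const double link_gb_s   = lane_gb_s * lanes;    // ~4 GB/s each direction

    const double particles   = 100000.0;             // assumed effect size
    const double bytes_each  = 32.0;                  // e.g. position + velocity as two float4s
    const double fps         = 60.0;

    const double traffic_gb_s = particles * bytes_each * fps / 1e9;

    std::printf("link bandwidth : %.1f GB/s\n", link_gb_s);
    std::printf("physics traffic: %.3f GB/s (%.1f%% of link)\n",
                traffic_gb_s, 100.0 * traffic_gb_s / link_gb_s);
    // Raw bandwidth looks like plenty; the likelier problem is the extra
    // round-trip latency of syncing a second card every frame.
    return 0;
}
```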


Let's take the example of someone smoking a cigarette in a game: yes, you can put in incredibly realistic smoke, but if just the smoke from that cigarette costs you 60% of your compute time.. is it really needed?

That's about the same figure it costs to have good shadows. Perhaps most niceties and novelties will be like that from now on (of course, the awesome smoke must cast smooth shadows from multiple light sources).
 
How many games that launched in 2014 support hardware-accelerated PhysX?
I think the last game I played with that was Arkham Origins, and that was a year ago.
 
I looked at the thread's second page and PCIe bandwidth seems to be an issue (latency probably doesn't help either). "Bad news": the GT 720 is a PCIe 2.0 x8 card.


That's about the same figure it costs to have good shadows. Perhaps most niceties and novelties will be like that from now on (of course, the awesome smoke must cast smooth shadows from multiple light sources).

Yes, but the shadows are for the whole scene / frame, not just one small part of it. (You don't cast shadows from just one small object in a scene; you cast them for the entire scene.)

A separate low-end GPU just for PhysX? I mean, you can get a 192-core Kepler GT 720 for cheap. You don't even need good bandwidth, so 64-bit memory is okay.

I know that will never take off, but Nvidia could promote some of their low-end GPUs purely for physics. Obviously you'd need a motherboard with two PCIe slots.

What will you do? Ask people who have put $700 to $1000 into a high-end GPU to buy a second one just for PhysX? And even on the developer side, will they really want to spend so much time on effects that only a niche of people can run? (This said, I know many people who have done it.)

Those effects are nothing really new; they have been running on CUDA (PhysX) and OpenCL (Bullet, Maya etc.) for years in software like 3ds Max, Blender etc. I wonder if Nvidia haven't just included them in the GameWorks "PhysX FleX" package for "marketing" purposes.
 
Well, Metro 2033 seems to be the most extensive example of hardware-accelerated physics, so in that case a higher-end graphics card makes a difference, but for a few peripheral effects a low-end card is fine.
 
lanek: I agree. That's why I don't think the idea of dedicated physics cards will take off, and it's probably not that easy to separate graphics and physics neatly anyway.
 
Here's a benchmark on YouTube showing that a GeForce GT 430 paired with a GTX 680 doesn't bottleneck the main card in Mafia II and gives improved FPS.

 
Well, Havok were developing GPU physics, but when Intel bought them out they nixed that, presumably because they want physics to run on the CPU, or perhaps on Larrabee before it was canned.
 
Is there an alternative to these PhysX effects that also runs on the GPU using an open standard?

Developers are free to write their own OpenCL or DirectCompute based effects.

GPU physics has potential. The Batman series and Borderlands have used it to decent effect but fidelity is lacking. It's just a matter of time though until we get fluid dynamics in games as good as we see in those demos.
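For context on what those demos are doing under the hood: position based dynamics is basically a three-step loop per frame - predict positions from velocities, iteratively project constraints on the predicted positions (a per-particle density constraint in the fluid case), then recover velocities from how far the positions actually moved. A stripped-down sketch, with a single distance constraint standing in for the fluid density constraint and no neighbor grid, viscosity, or vorticity confinement:

```cpp
// Minimal position based dynamics loop, heavily simplified: one distance
// constraint stands in for the fluid density constraint of the real thing.
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  add(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3  mul(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }
static float len(Vec3 a) { return std::sqrt(a.x * a.x + a.y * a.y + a.z * a.z); }

int main() {
    const float dt = 1.0f / 60.0f, rest = 1.0f;
    const int   iters = 4;
    std::vector<Vec3> pos  = {{0, 0, 0}, {0, 2, 0}};  // two particles, one constraint
    std::vector<Vec3> vel  = {{0, 0, 0}, {0, 0, 0}};
    std::vector<Vec3> pred = pos;

    for (int frame = 0; frame < 120; ++frame) {
        // 1. predict positions from external forces (gravity only here)
        for (std::size_t i = 0; i < pos.size(); ++i) {
            vel[i]  = add(vel[i], mul({0, -9.81f, 0}, dt));
            pred[i] = add(pos[i], mul(vel[i], dt));
        }
        // 2. iteratively project constraints on the *predicted* positions
        for (int it = 0; it < iters; ++it) {
            Vec3  d = sub(pred[1], pred[0]);
            float l = len(d), c = l - rest;        // constraint violation
            Vec3  corr = mul(d, 0.5f * c / l);     // equal masses: split correction
            pred[0] = add(pred[0], corr);
            pred[1] = sub(pred[1], corr);
        }
        // 3. velocities come from how far positions actually moved
        for (std::size_t i = 0; i < pos.size(); ++i) {
            vel[i] = mul(sub(pred[i], pos[i]), 1.0f / dt);
            pos[i] = pred[i];
        }
    }
    std::printf("distance after 2 s: %f (rest length %f)\n",
                len(sub(pos[1], pos[0])), rest);
    return 0;
}
```

The same loop scales to the fluid demos by swapping the distance constraint for a per-particle density constraint and adding a spatial hash for neighbor lookups, which is where the particle counts from earlier in the thread start to hurt.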
 
The reason I'm asking is that I have a Kaveri and I'll probably need to buy a dGPU to play next-gen games (it can play PS360-era games at very respectable settings), but that means the iGPU will be useless even though it's a very good iGPU, plus Kaveri isn't really fast, so it needs every bit of help it can get (offloading physics calculations to the GPU would be a start). Anyway, I'm getting OT here, so one last question... Since most review sites always focus on high-end GPUs, how big is the hit from using PhysX on a low/mid-end GPU? Is it feasible on a 740? 750? 760?
 
Since most review sites always focus on high-end GPUs, how big is the hit from using PhysX on a low/mid-end GPU? Is it feasible on a 740? 750? 760?

People tend to forget that PhysX is adjustable, so the default settings aren't necessarily the best. And in some games PhysX effects are independently adjustable, depending on how it's implemented in the game.

  • Auto-Select - This allows the driver to automatically determine whether to select your GPU or CPU for processing PhysX effects. This is the recommended setting, as in most cases the drivers should be able to determine based on your GPU model(s) and CPU specifications which hardware to use for processing PhysX for optimal performance.

  • [GPU Name] - If selected, this option allows you to force PhysX processing onto a specific Nvidia GPU on your system. Use this if you want to experiment to see if shifting PhysX load to a particular GPU on a multi-GPU system can improve your performance.

  • CPU - If selected, this option forces all PhysX processing to occur on your CPU, which is the default for systems which don't have an Nvidia GPU, and essentially turns GPU-based PhysX off. This may help performance if you have a low-end GPU and a high-end CPU for example, or for particular games which are so strenuous on your GPU that offloading the PhysX processing to the CPU can improve overall performance.

The bottom line is that for a system with a single PhysX-capable GPU, in some cases you will see a noticeable FPS rise, in other cases you may see a noticeable FPS drop. It all depends on whether the additional PhysX effects are set to be always on in the game, or whether they can be enabled or disabled. Furthermore the degree to which your FPS increases or falls also depends on how powerful your GPU is compared to your CPU. If you have a high-end CPU and a low-end GPU for example, then shifting the PhysX load to the GPU may have a negative impact overall.

On systems with multiple PhysX-capable Nvidia GPUs the story will be different - particularly on a non-SLI setup where you can set your most powerful Nvidia GPU as the primary graphics card, and add a second weaker Nvidia GPU and set it to just process the PhysX effects. In such a scenario you should usually get the benefit of both optimal FPS and additional PhysX effects. So for example if you have an unused GeForce 8 PCI-E card, you can slot it into a spare PCI-E port and utilize it for faster GPU-based PhysX in Multi-GPU (not SLI) mode.
http://www.tweakguides.com/NVFORCE_7.html
 
Is there an alternative to these PhysX effects that also runs on the GPU using an open standard?

For games, outside of Bullet, not really... That said, as trinibwoy said, developers could use OpenCL or DirectCompute to include it in their own engine. Bullet is a bit more complex to use than PhysX (which is nearly out-of-the-box for developers) and is based on real material physics (atomic level). That's why it is used more for movies and CGI (because in a game you don't necessarily want to assign atomic-level specifications to objects and materials).
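For what it's worth, here is roughly what plain rigid bodies look like through the Bullet 2.x C++ API - a minimal hello-world sketch only; the soft-body / fluid side that matters for movie work is considerably more involved:

```cpp
// Minimal Bullet 2.x rigid-body setup: a box dropped onto a static ground plane.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // World setup: collision configuration, dispatcher, broadphase, constraint solver.
    btDefaultCollisionConfiguration collisionConfig;
    btCollisionDispatcher dispatcher(&collisionConfig);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &collisionConfig);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Static ground plane at y = 0 (mass 0 means static).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundMotion(
        btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 0, 0)));
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(
        0, &groundMotion, &groundShape, btVector3(0, 0, 0)));
    world.addRigidBody(&ground);

    // Dynamic 1 m box dropped from 5 m.
    btBoxShape boxShape(btVector3(0.5f, 0.5f, 0.5f));
    btScalar mass = 1.0f;
    btVector3 inertia(0, 0, 0);
    boxShape.calculateLocalInertia(mass, inertia);
    btDefaultMotionState boxMotion(
        btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 5, 0)));
    btRigidBody box(btRigidBody::btRigidBodyConstructionInfo(
        mass, &boxMotion, &boxShape, inertia));
    world.addRigidBody(&box);

    // Step two seconds of simulation and print where the box ends up.
    for (int i = 0; i < 120; ++i)
        world.stepSimulation(1.0f / 60.0f, 10);

    btTransform t;
    box.getMotionState()->getWorldTransform(t);
    std::printf("box height after 2 s: %f\n", t.getOrigin().getY());

    world.removeRigidBody(&box);
    world.removeRigidBody(&ground);
    return 0;
}
```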

For professional 3D and animation software, the vendors now tend to develop their own simulation / physics engines based on their own APIs. That said, they need complex ones, not just "visual" effects.

That said, a good question is why OpenGL graphics engine and game developers haven't used OpenCL to develop their own physics engines so far and instead rely on proprietary ones. Khronos = OpenGL and OpenCL. Take Unreal Engine, which now seems to include Nvidia CUDA, APEX, PhysX and GameWorks directly. (Well, it's a bit complicated: Nvidia promotes it that way, and it's on the front page of Unreal Engine, but the Unreal Engine developers say this is not the case.)
 