AMD: "[Developers use PhysX only] because they’re paid to do it"

Saw an article on Blue's with some choice quotes from Huddy regarding PhysX in games and the reasons those implementations exist. Controversial for sure; nVidia's response should be along soon. For now, let's try to keep the discussion civil.

Richard Huddy via article said:
“I’m not aware of any GPU-accelerated PhysX code which is there because the games developer wanted it with the exception of the Unreal stuff,” he says. “I don’t know of any games company that’s actually said ‘you know what, I really want GPU-accelerated PhysX, I’d like to tie myself to Nvidia and that sounds like a great plan.’”

Apologies for the quoting overload.

So, is he right? Is this just AMD stalling for time? Is PhysX just a retelling of the Glide story? Do you prefer paper or plastic? So many questions.
 
I'm not sure, but maybe AMD should open up their pockets a bit and get some games to use OpenCL for physics.

It's a great marketing point for nVidia, and until there is a viable alternative devs will take it, especially if there is money involved.

AMD is in a unique situation because they can work hard to create synergy (god, I hate that word) between their CPUs and GPUs. And unlike nVidia, they can make things that use all of their CPU cores and then the GPU. It should allow for better performance, and with 6-, 8-, 12- and 16-core chips coming from AMD in the next year or two, they might want to have something on the game side to show off that power.
 
Sounds about right. We often hear the fact that nVidia engineers actually did all the work to add nVidia specific features like proprietary MSAA or hardware PhysX to games as a justification for why they should be, in some cases artificially, disabled on ATI cards. We know money is changing hands to create these marketing deals. When was the last time you saw a GPU PhysX enabled PC game that wasn't accompanied by a big marketing push from nVidia?
 
I'm not sure what to say about this, but moneyhatting is alive and well in the PC gaming space as well, it seems.
 
Maybe he's mostly right, maybe he's mostly wrong, or maybe it's somewhere in the middle.

Whichever, I like that he put this "ball" into play. The topic needs discussion by the relevant parties.
 
Maybe he should just stick to talking about ATI.
 
We still don't have an AMD physics demo of any kind. If there is going to be any buzz about physics, we at least need something in our hands that works. Obviously it's been possible since bloody 2006. I'm getting sick of ATI talking down nVidia's PhysX when all we have from ATI are some vintage videos.

 
I heard a rumor that there will be some demos demoed somewhere sometime very soon, and I really wish I could make it there! :(
 
Dean Sharpe didn't get the memo from Huddy concerning this.
http://www.pcgameshardware.com/aid,706182/Exclusive-tech-interview-on-Metro-2033/News/

"PCGH: It could be read that your game offers an advanced physics simulation as well as a support for Nvidia's PhysX (GPU calculated physics) can you tell us more details here?

Does regular by CPU calculated physics affect visuals only or is it used for gameplay terms like enemies getting hit by shattered bits of blown-away walls and the like?
Dean Sharpe: Yes, the physics is tightly integrated into game-play. And your example applies as well.

PCGH: Besides PhysX support why did you decide to use Nvidia's physics middleware instead of other physics libraries like Havok or ODE? What makes Nvidia's SDK so suitable for your title?

Dean Sharpe: We've chosen the SDK back when it was Novodex SDK (that's even before they became AGEIA). It was high performance and feature reach solution. Some of the reasons why we did this - they had a complete and customizable content pipeline back then, and it was important when you are writing a new engine by a relatively small team"

Wonder how much Nvidia paid them to use PhysX considering they started using it before Nvidia ever bought Ageia.
 
That would go a long way toward explaining why the major GPU PhysX titles are generally subpar games from low-budget devs, i.e. devs willing to throw something into their game in order to secure either additional funds or co-marketing agreements.

Other than Batman AA and Unreal Tournament, most titles just seem to have it tacked on for no other reason than marketing.

Sacred 2 was overdone, for example. It could have been implemented much more tastefully without such a large performance hit, but since Nvidia needed to show how awesome GPU-accelerated physics were compared to CPU physics, it HAD to be overblown and in your face such that you couldn't help but notice it.

And that's the feeling I get most of the time when seeing GPU PhysX acceleration in games. Rather than just using it in such a way as to enhance the game, they have to take it up a couple of notches into "in your face, you MUST notice me" territory.

This was something Ageia suffered from as well, where they were so desperate to market PhysX that the effects were often overblown and ended up dragging FPS down, often into unplayable territory (GRAW). They could instead have used it without being so in your face, and thus without such a large performance hit.

Regards,
SB
 
digi> It's not really surprising that AMD will demo physics at GDC; the question is whether it will be with a library/SDK that will be available to develop with soon or not. They had a lovely Havok demo running on OpenCL, without any sign of any Havok running on the GPU any time soon. So any such demo really should be followed by a release, preferably with the release date announced alongside the demo.

CNCAddict> Well, you did have the Havok on OpenCL demo at last GDC, which isn't that long ago. Though if AMD were to release a closed physics library only taking advantage of AMD GPUs, I think the reception would be "meh". I'm much more interested in an "open" physics library that will run on nVidia and AMD GPUs as well as CPUs, so it will run on any/all platforms. That'll be the threshold where physics will make a much larger impact on games.
 
Dean Sharpe: "We've chosen the SDK back when it was Novodex SDK (that's even before they became AGEIA) ... it was important when you are writing a new engine by a relatively small team"

Wonder how much Nvidia paid them to use PhysX considering they started using it before Nvidia ever bought Ageia.

They said it themselves: they are a small group of devs. I bet they'd love some TWIMTBP advert money.

Metro is going to be a big GTX 480 title; nvidia is going to push it for all it's worth.
 
I'm much more interested in an "open" physics library that will run on nVidia and AMD GPUs as well as CPUs, so it will run on any/all platforms. That'll be the threshold where physics will make a much larger impact on games.

Yes, that's when I think devs will finally embrace, or at least be more open to, planning for GPU physics during pre-planning, engine development and budgeting, rather than having it tacked on at a later date by one of the IHVs.

Effects should then not only be more impressive, but also be relatively seamless and not so much "Look at ME!" type effects. Well, one can hope.

Regards,
SB
 
AMD did it even more blatantly with 3DNow!, but sure, he's right. NVIDIA is just a little smarter at it than AMD was at the time ... rather than giving money they give away a library at subsidized cost, provide free manpower to implement it and tweak your engine, and co-operate on marketing.

They save the developers money rather than giving them money, but it works out to the same thing in the end.

That said, 3DNow! was more blatant but also less insidious. Supporting multiple sets of very small SIMD routines is easily doable and standard practice (except, apparently, in the PhysX CPU implementation), but putting two physics engines in a game isn't really an option. Paying for 3DNow! support helped AMD's customers ... paying for PhysX support helps NVIDIA's customers and hurts ATI customers. Hell, it hurts a lot of NVIDIA customers too, the ones who don't have the money for a top-end GPU or a dedicated PhysX one. When the most popular physics library doesn't use SIMD instructions on the CPU, and has code paths for fine-grained multithreading which it only uses on the consoles but not the PC, you know there is something very wrong.
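
To make the "multiple sets of very small SIMD routines" point concrete, here's a minimal sketch of the usual dispatch pattern in plain C using the GCC/Clang CPU-detection builtins. This has nothing to do with actual PhysX source and the function names are made up: write the hot routine twice, once scalar and once with SSE intrinsics, then pick one through a function pointer at startup.

[code]
/* Illustrative only - not actual PhysX code. Build: gcc -O2 -msse demo.c
   (SSE is baseline on x86-64, so -msse only matters for 32-bit builds.) */
#include <stdio.h>
#include <xmmintrin.h>   /* SSE intrinsics */

/* Scalar fallback: element-wise add of two float arrays. */
static void add_scalar(float *dst, const float *a, const float *b, int n)
{
    for (int i = 0; i < n; ++i)
        dst[i] = a[i] + b[i];
}

/* SSE routine: same operation, four floats per iteration. */
static void add_sse(float *dst, const float *a, const float *b, int n)
{
    int i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 va = _mm_loadu_ps(a + i);
        __m128 vb = _mm_loadu_ps(b + i);
        _mm_storeu_ps(dst + i, _mm_add_ps(va, vb));
    }
    for (; i < n; ++i)            /* scalar tail for leftover elements */
        dst[i] = a[i] + b[i];
}

typedef void (*add_fn)(float *, const float *, const float *, int);

int main(void)
{
    /* Dispatch once at startup, based on what the CPU actually supports. */
    __builtin_cpu_init();
    add_fn add = __builtin_cpu_supports("sse") ? add_sse : add_scalar;

    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float c[8];
    add(c, a, b, 8);

    for (int i = 0; i < 8; ++i)
        printf("%g ", c[i]);      /* all nines if either path works */
    printf("\n");
    return 0;
}
[/code]

The per-routine cost is a few extra lines and one indirect call, which is why a physics library that skips this on the PC reads as a choice rather than a limitation.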
 
Whatever happened to letting your own products speak for themselves?

I can only imagine the lynch mob that would be out here today if an Nvidia rep said something similar about an ATI technology.
 
Wonder how much Nvidia paid them to use PhysX considering they started using it before Nvidia ever bought Ageia.
The fact that they used it before Nvidia bought Ageia also makes quoting that as some sort of proof that Nvidia haven't paid them irrelevant.
 
A lot of developers are using *CPU* PhysX on their own, so there's nothing special about Metro 2033 choosing PhysX as its (CPU) physics implementation. Huddy's point is that every significant GPU PhysX usage is done/"paid for" by nvidia, and mostly in second-grade titles - I guess it's only Batman and Mirror's Edge that can qualify as triple-A.
Btw, is Metro 2033 really a top game, or just yet another heavily nvidia-promoted/improved PhysX launch title, ending up in the mid-70s? (Just a question - I haven't seen any indication either way.)
 
I can only imagine the lynch mob that would be out here today if an Nvidia rep said something similar about an ATI technology.

What do you expect? Nvidia haven't exactly endeared themselves to consumers, have they? Wait, let me re-phrase: they haven't endeared themselves to non-Nvidia users. AMD are just trying to push for more open solutions, which is great for all consumers, and are throwing a bit of dirt Nvidia's way for pushing a proprietary product by means of $$.
 