NVIDIA shows signs ... [2008 - 2017]

Yeah, but you don't need an SLI motherboard to plug in an NVidia GPU, whereas you do need an Intel-socket motherboard to plug in an Intel CPU ;) And along those lines, the people using SLI are, at least in my opinion, a minority (and a relatively small one at that), so it should be in NVidia's interest to stimulate SLI adoption, since as it stands the majority of people won't need the SLI feature, let alone shell out a price premium to cover the SLI fee. BUT if it did come as a feature of the board at the same price, then someone with a GeForce could add a second card some time in the future, resulting in another sale for NV.

The motherboard makers and Nvidia both know that SLI stickers sell motherboards. Whether the customer base actually uses SLI or not isn't really the question here. If a mobo maker sells a non-SLI motherboard and it's being compared to 10 other competing SLI boards that cost $10 or even $20 more, I would say 4 out of 5 buyers would opt for an SLI one, regardless of whether they buy two Nvidia GPUs or no Nvidia GPU at all.
Hell, I bought and made sure all my motherboards were multi-GPU capable with the thought of going CrossFire, yet not once have I done so. I guess it's an investment/security thing.
 
Well, what does that tell you about SLI's perceived value among OEMs? If it weren't worth the $5, they would have more models without SLI support. It's a simple value proposition. What really surprises me about SLI is the number of low-end offerings like dual 9500 GTs out there. Obviously these OEMs are successfully peddling a double helping of crap to the masses.
 
My bad ;) My memory went a little too far back, to the days of the nForce chipsets when the price difference between an SLI and a non-SLI mobo was more substantial; I didn't know it had come down to only $5. As for the dual 9500 GTs, that was exactly my point ;) Computer enthusiasts who know the drawbacks of SLI/Crossfire often stay away from them, but the average customer is more easily convinced that with a second 9500 his games will really be flying :)
 
Was it already mentioned?

John Carmack says no to PhysX

http://www.guru3d.com/news/john-carmack-says-no-to-physx-/

It's not news though. John Carmack already disliked PhysX (the company) back before it was bought by NVIDIA. He thought Ageia was selling something (a so-called "physics accelerator") that no general consumer would buy, just to attract a potential buyer for the company. And apparently they succeeded, as NVIDIA later bought them. He thought it wasn't an honest way of doing business.
 
PhysX was an idiotic idea when you had to buy proprietary special purpose hardware from a no-name start up with no presence in real systems.

PhysX is still an idiotic idea from a SW development perspective, since it is tied to a specific IHV. It looks a little more promising since at least NV has a pretty big presence, but you still cannot integrate any fundamental features that rely on physx as they wouldn't run on a lot of systems.

David
 
It looks a little more promising since at least NV has a pretty big presence, but you still cannot integrate any fundamental features that rely on physx as they wouldn't run on a lot of systems.

If I were a game dev, it would look even less promising to me with NV backing it, as it would be disabled on many systems that are/were working fine.
 
PhysX is still an idiotic idea from a SW development perspective, since it is tied to a specific IHV. It looks a little more promising since at least NV has a pretty big presence, but you still cannot integrate any fundamental features that rely on physx as they wouldn't run on a lot of systems.

Of course it runs on the CPU as well. You don't need a PhysX add-in card or a GeForce display card for PhysX to work; you just need them for it to run faster. So in a sense you can use PhysX like, say, Havok, because it runs fine on the CPU alone.

The problem here is that NVIDIA disabled hardware-based PhysX when the main display card is not an NVIDIA one. There could be technical reasons behind this decision, or there may not be.

It could be that PhysX is not as well optimized for the CPU as other physics engines such as Havok. I don't know, because I haven't used both, so I can't compare. There was some evidence from people who profiled an older PhysX library and found that a significant part of the time was spent on slow x87 instructions such as floating point division. It could be that Ageia didn't bother to optimize PhysX for the CPU, to make their hardware accelerator look even better in comparison. I don't know whether this is still true right now though.
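For illustration, here's a minimal, hypothetical C++ sketch (not PhysX code, just the general idea) contrasting one-at-a-time scalar division, roughly what an x87-heavy code path boils down to, with the same four divisions issued as a single packed SSE instruction:

```cpp
// Hypothetical illustration only, not PhysX code.
#include <immintrin.h>  // SSE intrinsics
#include <cstdio>

int main() {
    float num[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float den[4] = {4.0f, 4.0f, 4.0f, 4.0f};
    float scalar_out[4], sse_out[4];

    // Scalar path: one division per iteration, similar in spirit to the
    // x87 FDIV-heavy code people reported seeing in the profiles.
    for (int i = 0; i < 4; ++i)
        scalar_out[i] = num[i] / den[i];

    // SSE path: all four divisions issued as one packed DIVPS instruction.
    __m128 n = _mm_loadu_ps(num);
    __m128 d = _mm_loadu_ps(den);
    _mm_storeu_ps(sse_out, _mm_div_ps(n, d));

    for (int i = 0; i < 4; ++i)
        printf("%f %f\n", scalar_out[i], sse_out[i]);
    return 0;
}
```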

Anyway, as I said before, with compute shaders it's very likely that the next generation of physics engines will be based on them. Of course, they still have to support the CPU, because not all graphics chips support compute shaders.
 
Of course it runs on the CPU as well. You don't need a PhysX add-in card or a GeForce display card for PhysX to work; you just need them for it to run faster. So in a sense you can use PhysX like, say, Havok, because it runs fine on the CPU alone.
There is a difference between a game that uses SW PhysX only and a game that uses HW PhysX which ends up being executed on the CPU. I think in the latter case the CPU gets highly unoptimized code, which isn't even multi-threaded (only one core is used regardless of the CPU configuration).
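For illustration, here's a minimal C++ sketch (hypothetical, not PhysX internals) of what spreading such an update step over all available cores could look like; the complaint is that the CPU path reportedly does nothing like this and stays on one core:

```cpp
// Hypothetical illustration only, not PhysX internals: a trivial
// integration step split across all hardware threads.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

struct Body { float x, vx; };

// Integrate positions for bodies[begin, end) over one timestep.
void integrate_range(std::vector<Body>& bodies, size_t begin, size_t end, float dt) {
    for (size_t i = begin; i < end; ++i)
        bodies[i].x += bodies[i].vx * dt;
}

int main() {
    std::vector<Body> bodies(100000, Body{0.0f, 1.0f});
    const float dt = 1.0f / 60.0f;
    unsigned n = std::max(1u, std::thread::hardware_concurrency());

    // One worker per hardware thread, each handling a contiguous chunk.
    std::vector<std::thread> workers;
    size_t chunk = bodies.size() / n;
    for (unsigned t = 0; t < n; ++t) {
        size_t begin = t * chunk;
        size_t end = (t + 1 == n) ? bodies.size() : begin + chunk;
        workers.emplace_back(integrate_range, std::ref(bodies), begin, end, dt);
    }
    for (auto& w : workers) w.join();

    printf("body 0 position after one step: %f\n", bodies[0].x);
    return 0;
}
```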
 
I'd argue they have. Havok's CPU based physics seem quicker and more robust than PhysX's CPU implementation.

Didn't someone disassemble some PhysX CPU code before? The code used when HW-accelerated PhysX runs on the CPU, that is. And found that it only uses up to MMX and doesn't take advantage of any form of SSE? I could swear I read that on these forums somewhere about a year or so ago.

Regards,
SB
 
I'd argue they have. Havok's CPU based physics seem quicker and more robust than PhysX's CPU implementation.

Didn't someone disassemble some PhysX CPU code before? The code used when HW-accelerated PhysX runs on the CPU, that is. And found that it only uses up to MMX and doesn't take advantage of any form of SSE? I could swear I read that on these forums somewhere about a year or so ago.

Regards,
SB

Yeah, that's what I talked about in my post. Although that was before NVIDIA acquired Ageia, and I don't know whether that's still the case. Basically, we need someone who has evaluated both solutions to give a more informed opinion on this matter.

Anyway, my point is that if PhysX is reasonably optimized for the CPU, then it should be comparable to other physics engines even if its hardware support is limited to NVIDIA only. I mean, there are no GPU-accelerated physics engines other than PhysX right now, although this will change very soon.
 
I'd argue they have. Havok's CPU based physics seem quicker and more robust than PhysX's CPU implementation.

How can you tell? Havok games span the gamut from horrible to good physics. You sure you aren't just comparing developer talent? Stuff like HL2 and Red Faction actually used heavily modified versions of Havok.
 
Was it already mentioned?

John Carmack says no to PhysX

http://www.guru3d.com/news/john-carmack-says-no-to-physx-/

Actually, that's not what he says. He says no to dedicated hardware for physics acceleration. He says that it was clear to him that this would be replaced by a more CPU/GPU-integrated form of acceleration pretty soon, and that he thought even Ageia probably knew that at the time and only released hardware to get purchased. He didn't think that was fair to people who bought such a card (though interestingly it seems the card still does its job pretty well, judging by some of the tests we've seen online recently).

He doesn't say a whole lot else on the matter. Not even that he thinks PhysX is a bad idea. Maybe I missed something, but I doubt it.
 
nVidia lost 4 PhysX "AAA" titles: GRAW, GRAW 2, Terminator: Salvation, Bionic Commando
How did nVidia "loose" those titles just because the studio went bankrupt? The games are still there. Also, several of their developers have already set up a new studio and their devrel connections surely won't disappear overnight when they get new projects. Even if future Ghost Recon games on the PC were to be farmed out to different developers there could still be PhysX depending on what Ubisoft wants.
 