Anti-competitive Actions (PhysX) by Nvidia - class action?

What if Intel decides that Nvidia GPUs conflict with their IGPs' 'stability', so they make a driver update which blocks Nvidia's 'Optimus' GPUs from even functioning? It's a similar hack situation: the Nvidia GPU isn't meant to be there according to Intel's priorities, just as the ATI GPU isn't meant to be there according to Nvidia's priorities.

All is fair in love and business, and Intel can just turn around and say 'Look, Nvidia did the exact same thing for the exact same reasons', so it's not like Nvidia would be in a good position to complain.
 
Has Creative ever said you can use a Creative card for sound together with a graphics card from another vendor? I doubt they have; it's just assumed.

Has Intel ever said you can use an i5 for computation together with a graphics card from another vendor? I doubt they have; it's just assumed.
 
So your contention is that there is no way ever that an ATI driver update could break the PhysX driver. While I'd agree with that, you can never say never. Remember, people always thought 3DFX would never go out of business, let alone get bought up by Nvidia.

His contention is that it shouldn't factor into Nvidia's decision, just as a sound card maker doesn't disable its card whenever a network card is present, just because they are both plugged into PCI slots. Creative doesn't care whether a network card driver or even a driver for onboard Realtek audio is present; it's not their place to worry about that. It's not Nvidia's place either, they just chose to make it so.
 
Exactly.

I'm not asking NV to modify their PhysX driver to work with ATI hardware. I have NV hardware and I'm willing to buy NV hardware for PhysX (or whatever else). I'm simply asking them not to disable their hardware if I choose another vendor's hardware for a completely different function (rendering).

This is what they are doing.

PhysX on GPU is not rendering. Their box doesn't state it will do PhysX only so long as I don't use someone else's sound card or someone else's video card, etc.
 
But then, their boxes do state it will do PhysX, don't they? They do not state, however, under which circumstances. Correct? So basically you can still enjoy (or not) GPU PhysX, can't you?

Plus, AFAICT, GPU PhysX is not as independent from the graphics card as you might think. What if they don't even have a function to transfer the calculated data back once it's been worked through on the PhysX GPU? They just wouldn't need it on their own cards, after all.

I do not know of any PhysX application where the results of retired kernels get reused by the CPU. Maybe that's the reason we're not seeing gameplay-affecting GPU PhysX but only eye-candy PhysX; who knows?

So there would be additional potential for conflicts with the rest of the system, wouldn't there?
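
For what it's worth, in a vendor-neutral compute API the read-back itself is a single call; whether NV's PhysX path has an equivalent internally is exactly the open question. A minimal OpenCL sketch, with made-up buffer and queue names:

#include <CL/cl.h>
#include <cstddef>
#include <vector>

// Hypothetical read-back of simulated particle positions to the host.
// 'queue' and 'd_positions' are assumed to be set up elsewhere.
void read_back(cl_command_queue queue, cl_mem d_positions, size_t n_particles)
{
    std::vector<float> host(n_particles * 3);  // x, y, z per particle
    // Blocking read: the pipeline stalls until the copy completes,
    // which is the performance worry raised above.
    clEnqueueReadBuffer(queue, d_positions, CL_TRUE, 0,
                        host.size() * sizeof(float), host.data(),
                        0, nullptr, nullptr);
}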
 
That's bizarrely contrived. This is the whole reason device drivers exist. AFAIK, the game calls on the PhysX driver, not the video driver. Are you trying to tell me the game tells the video driver to render a scene and then the video driver calls the PhysX driver? I seriously doubt that.
 
Wouldn't a more interesting question be: since you will be able to use both AMD and Nvidia cards in one system for OpenCL, why can't you use AMD for rendering and Nvidia for physics? And what is the special difference between PhysX on CUDA compared to OpenCL on CUDA?
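
I can't say what NV's driver actually permits, but the OpenCL host API is at least designed so that every installed vendor's platform is enumerated side by side. A minimal sketch (error handling omitted):

#include <CL/cl.h>
#include <cstdio>

int main()
{
    // Each installed vendor driver (AMD, Nvidia, Intel, ...) appears
    // as its own platform; nothing in the API stops them coexisting.
    cl_platform_id platforms[8];
    cl_uint n = 0;
    clGetPlatformIDs(8, platforms, &n);
    for (cl_uint i = 0; i < n && i < 8; ++i) {
        char name[256];
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, nullptr);
        printf("platform %u: %s\n", i, name);
    }
    return 0;
}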

EDIT>
CarstenS, interesting point about the transfer back. What about Batman: AA? Doesn't the PhysX part there actually influence gameplay in some way, or is it purely eye candy?
 
What if Intel decides that Nvidia GPUs conflict with their IGPs' 'stability', so they make a driver update which blocks Nvidia's 'Optimus' GPUs from even functioning? It's a similar hack situation: the Nvidia GPU isn't meant to be there according to Intel's priorities, just as the ATI GPU isn't meant to be there according to Nvidia's priorities.

All is fair in love and business, and Intel can just turn around and say 'Look, Nvidia did the exact same thing for the exact same reasons', so it's not like Nvidia would be in a good position to complain.

That would be completely fair and warranted, in the same way Nvidia blocks PhysX when an ATI card is in the system, as any additional GPU over and above the integrated one could cause system instability that would not exist with the integrated GPU alone.

That could also be extended to discrete GPUs in a system containing an integrated GPU.

Hopefully Intel and AMD won't follow in Nvidia's footsteps with this however.

Regards,
SB
 
That would be completely fair and warranted, in the same way Nvidia blocks PhysX when an ATI card is in the system, as any additional GPU over and above the integrated one could cause system instability that would not exist with the integrated GPU alone.

Hopefully Intel and AMD won't follow in Nvidia's footsteps with this however.

Regards,
SB

So Intel should disable their GPU? I hope they don't advertise their GPU as a feature. :LOL:

Intel can't block another GPU; they would have to hack the OS to do it.
 
So far, it's all been purely eye candy. A random new member posted a reply in another related thread that mentioned this, so I went googling. While I can't confirm what he said yet, he still seems to be correct in that GPU PhysX is only (currently) doing eye-candy physics and not gameplay-altering physics.
 
EDIT>
CarstenS, interesting point about the transfer back. What about Batman: AA? Doesn't the PhysX part there actually influence gameplay in some way, or is it purely eye candy?
Haven't played it much besides doing some benchmarks early on, so I cannot be sure. But from what I remember it's pure eye candy: some cobwebs, some newspapers whirled around, some (admittedly very nice) fog, and some banners hanging from the roof.

I think only tech demos and the UT3 PhysX levels have had anything resembling gameplay physics. But having to transmit altered level geometry back to the CPU to account for aiming, shooting, and movement possibilities is a whole lot different in my book than transmitting the vertex coordinates of some 10k particles (plus, I obviously don't know if they can be grouped like DX batches or if they instantly saturate buses and chipsets).
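
Back-of-the-envelope on the particle case, just to put a number on it (plain arithmetic, not measured figures):

#include <cstdio>

int main()
{
    // 10k particles, 3 floats (x, y, z) each, transferred every frame at 60 fps.
    const double bytes_per_frame = 10000.0 * 3 * sizeof(float); // ~117 KB
    const double bytes_per_sec   = bytes_per_frame * 60;        // ~6.9 MB/s
    printf("%.0f KB/frame, %.1f MB/s\n",
           bytes_per_frame / 1024, bytes_per_sec / (1024 * 1024));
    return 0;
}

That volume is tiny next to PCIe bandwidth, which fits the point above: the hard part of a round trip would be the synchronization stalls, not the raw byte count.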
 
This notion that it's perfectly fair to disable something is insane.
These are independent components piping data to one another: Program -> PhysX -> Program -> GPU.
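
(Roughly that flow in code; the function names are made-up stand-ins for the middleware and driver entry points, not the real SDK:)

struct Scene;
struct ObjectList;

// Hypothetical entry points: physics and rendering are separate components
// that the application drives in turn; neither driver calls the other.
void physx_simulate(Scene* scene, float dt);
void physx_fetch_results(Scene* scene, ObjectList* objs);
void renderer_draw(const ObjectList* objs);

void game_frame(Scene* scene, ObjectList* objs, float dt)
{
    physx_simulate(scene, dt);         // physics runtime (CPU or GPU)
    physx_fetch_results(scene, objs);  // the program gets updated transforms
    renderer_draw(objs);               // video driver, any vendor's
}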

We're not talking about forcing NV to open PhysX, but only about allowing an independent piece of hardware to perform its job even if they don't like all the other independent pieces of hardware in your chassis.

I think a strong case could be made, particularly in the EU, that this is highly anti-competition. They're trying to use a monopolistic position in one market (PhysX) to increase share in another market (GPUs).
 
http://physxinfo.com/articles/?page_id=154

That sure looks like a severe physics (remember, it's not spelled with an "x") monopoly. Except you want to tell us something like how unfair it is that Volkswagen has a monopoly on Volkswagens, because there's no one else you can buy them from.


BTW: Care to take actual technical issues like the one mentioned above into consideration? There sure is a reason why Microsoft, Nvidia and ATI are urging game devs to do batching with their draw calls. :)
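
For reference, the batching advice amounts to this; draw_mesh and friends are hypothetical renderer functions, the call count is the point:

#include <cstddef>
#include <vector>

struct Mesh;
struct Transform { float m[16]; };
struct Object { Mesh* mesh; Transform transform; };

// Hypothetical stand-ins for a vendor draw API.
void draw_mesh(Mesh* mesh, const Transform& t);
void upload_instance_data(const std::vector<Transform>& xforms);
void draw_mesh_instanced(Mesh* mesh, std::size_t count);

// N draw calls: N times the fixed per-call driver overhead.
void render_unbatched(const std::vector<Object>& objects)
{
    for (const Object& o : objects)
        draw_mesh(o.mesh, o.transform);
}

// One upload plus one instanced call covering every object.
void render_batched(Mesh* shared_mesh, const std::vector<Transform>& xforms)
{
    upload_instance_data(xforms);
    draw_mesh_instanced(shared_mesh, xforms.size());
}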
 
OK, I haven't played it at all myself, but I remember people talking about the amazing physics in it, so I thought it was perhaps actually doing something more.
 
Nvidia should just explicitly state that PhysX is only supported in specific configurations. Sort of like how game boxes specify that multiplayer is contingent on you having an internet connection.
 
Carsten, are you saying there is direct data passing GPU -> PhysX -> GPU?
I am saying there's a whole lot of data involved, which has not yet been shown to be transferable back from the GPU into the host system with decent performance.
 
I think a strong case could be made, particularly in the EU, that this is highly anti-competition. They're trying to use a monopolistic position in one market (PhysX) to increase share in another market (GPUs).

Good luck trying to assert that Nvidia has a 'monopolistic position' in 'PhysX'. :LOL:

As if developers have no choice but to use PhysX because their games won't sell without it. As if there aren't competing middleware packages out there that have more than 20% market share.

Heck, I have even heard pundits dream about the possibilities of the eventual appearance of the chance of rumours about the plausibility of some tentative future demo of a mythical competing GPU based physics technology standard (which will instantly grab all of the market share from day 1, because it is open).

I don't think your proposal stands a chance, at least not in the EU. It is hard enough just to get convictions in the blatant cases of real violators of antitrust laws, like Microsoft and Intel. A class action suit in the US? Sure, almost anything will fly there.
 
I'm not saying the monopoly on PhysX is anti-competitive. Sorry if that wasn't clear. I'm saying that using that monopoly to establish another in a different market is anti-competitive.

Windows came with IE. You could load any other browser without Windows disabling features, but this was still found to be anti-competitive.
 