How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of the HD5xxx series?

  • GT300 Performance Preview Articles: 29 votes (19.7%)
  • New card based on the previous architecture: 18 votes (12.2%)
  • New and Faster Drivers: 6 votes (4.1%)
  • Something PhysX related: 11 votes (7.5%)
  • PowerPoint slides: 61 votes (41.5%)
  • They'll just sit back and watch: 12 votes (8.2%)
  • Other (please specify): 10 votes (6.8%)

  Total voters: 147
Yeah, but as of now GPU-accelerated Havok isn't available on any products, so I still don't get it... am I the only one living in the present here? :D

But it seems that Havok being ported to OpenCL won't be too far away; this is the point. Of course, until then PhysX will be the only GPU-accelerated middleware (on some GPUs), but I think nobody will dispute that. Maybe the point is that there's a need for a GPU-accelerated middleware that will work on all GPUs.


That's very subjective. Just check out the GameTrailers comments section for their Batman: Arkham Asylum videos, for example. You can try your best to make it seem that unless PhysX rocks your world it's useless, but that's not how it works. Regular people appreciate little crap like smoke and blowing paper, especially when Batman is spinning around kicking ass in the middle of it. Think of all the little things developers add to games: did the working faucets in Prey or Bioshock change your game experience?

Honestly, I played Batman on a not-too-old PC with a Core 2 Quad Q6600, and I had to deactivate PhysX GPU support on a GTS 250, otherwise the game was unplayable even at 1024x768. So, what is the "better experience"? I also saw Mirror's Edge, and there is some small eye candy here and there; not really something that changes my opinion about a game. Ironically, games using physics as the base of the gameplay (mostly puzzles) use a CPU physics simulation. Also, the point Tchock made some posts ago still hasn't been answered.
 
Yeah, and when PhysX gets acceleration via OpenCL, that advantage will be nullified again.
Currently we have no idea if or when Havok with OpenCL will ever see the light of day, let alone any games actually using it... Nor do we know what nVidia has planned for PhysX if Havok finally gets OpenCL support. It's quite possible that nVidia has already prepared for this and will have an OpenCL version, so as not to lose the market share it has worked so hard to gain in the past months.

If PhysX goes the OpenCL path, then it's good for everyone (except that it will run two times slower on ATI hardware by default, of course :D)
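To make the "works on all GPUs" argument concrete, here is a minimal sketch (purely illustrative, not NVIDIA's or Havok's actual code) of the kind of vendor-neutral work an OpenCL physics port would do: one Euler integration step over a batch of particles. The kernel and buffer names are assumptions for the example, and error checking is omitted for brevity; the point is that the same code path runs on any vendor's GPU with an OpenCL driver.

```cpp
// Hypothetical sketch: a vendor-neutral GPU physics step via OpenCL.
#include <CL/cl.h>
#include <cstdio>
#include <vector>

// OpenCL C kernel: gravity plus an explicit Euler step per particle.
static const char* kSrc = R"(
__kernel void integrate(__global float4* pos,
                        __global float4* vel,
                        const float dt)
{
    size_t i = get_global_id(0);
    vel[i].y -= 9.81f * dt;   // apply gravity
    pos[i]   += vel[i] * dt;  // advance position
}
)";

int main() {
    const size_t n = 4096;                                    // particle count
    std::vector<float> pos(4 * n, 0.0f), vel(4 * n, 0.0f);
    for (size_t i = 0; i < n; ++i) pos[4 * i + 1] = 100.0f;   // start 100 m up

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, nullptr);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &dev, nullptr, nullptr, nullptr);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, nullptr);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, nullptr);
    clBuildProgram(prog, 1, &dev, nullptr, nullptr, nullptr);
    cl_kernel k = clCreateKernel(prog, "integrate", nullptr);

    cl_mem dPos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 4 * n * sizeof(float), pos.data(), nullptr);
    cl_mem dVel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 4 * n * sizeof(float), vel.data(), nullptr);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(k, 0, sizeof(dPos), &dPos);
    clSetKernelArg(k, 1, sizeof(dVel), &dVel);
    clSetKernelArg(k, 2, sizeof(dt), &dt);

    for (int frame = 0; frame < 60; ++frame)                  // one simulated second
        clEnqueueNDRangeKernel(q, k, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);

    clEnqueueReadBuffer(q, dPos, CL_TRUE, 0, 4 * n * sizeof(float),
                        pos.data(), 0, nullptr, nullptr);
    std::printf("particle 0 height after 1 s: %.2f m\n", pos[1]);
    return 0;
}
```

Whether the same kernel really runs "two times slower" on one vendor's hardware then comes down to driver and compiler quality, not the API itself.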
 
Which makes me scratch my head over you saying RV870 would be barking up the wrong tree.

You already answered it yourself, partly...
The first generation of DX11 games won't make full use of its features. So for the most part, you'll be running DX9/10-class code, but faster.
AMD doesn't have a physics solution, so that ability of the architecture isn't going to be leveraged either.

nVidia may not yet have a DX11 part, but they do have DX10 parts, with a physics solution, and are actively promoting this to developers.

So the net result is ~DX10-class graphics on AMD, vs DX10-class graphics with physics on nVidia.
AMD will either get poor framerates with CPU physics, or will get less detailed graphics.
Which is why I think AMD is barking up the wrong tree. Judging from the hype regarding Batman, I think the average gamer may go for nVidia because they prefer the physics effects over higher framerates and more AA.
 
But it seems that Havok being ported to OpenCL won't be too far away; this is the point.

Based on what?

Honestly, I played Batman on a not-too-old PC with a Core 2 Quad Q6600, and I had to deactivate PhysX GPU support on a GTS 250, otherwise the game was unplayable even at 1024x768.

I'm not exactly sure why your hardware shortcomings have any bearing on the utility of PhysX. The game would be unplayable on your setup on a 30" monitor too. Does that mean 30" monitors are useless? :)
 
Based on what?

Declarations made by AMD and Havok representatives; I'll have to look for the links.

I'm not exactly sure why your hardware shortcomings have any bearing on the utility of PhysX. The game would be unplayable on your setup on a 30" monitor too. Does that mean 30" monitors are useless? :)

Sorry, the problem is that on a not-too-old card that Nvidia claims beats the new ATI card in PhysX benchmarks, the Batman game is simply unplayable. This is not a problem of hardware shortcomings; most games run well with that card, and if Nvidia wants everybody to get a GTX 295 for PhysX games, I think they have a "little problem".
 
Which is simply not true. PhysX supports all the major consoles, and also CPUs, PPUs and GPUs.
Havok currently doesn't support any PPU or GPU. So PhysX is the one available on a broader range of products.

So is Bullet -- and Bullet already has CUDA support. This makes me wonder why AMD didn't implement OpenCL support for Bullet: Havok is under Intel's control, while Bullet is free. They could even have made a Bullet port on CTM and shipped it, since Bullet is BSD-licensed. And as soon as Bullet supports OpenCL/DX compute shaders, I guess both Havok and PhysX will have a hard time on the PC market anyway, as it can easily contain nVidia/Intel/AMD-specific paths (which is unlikely for Havok/PhysX).
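For anyone who hasn't looked at it, Bullet's core API is small enough to show in a few lines. This is a minimal, illustrative sketch (exact headers and constructors may vary between Bullet releases): set up a dynamics world, drop a rigid body, and step the simulation, all on the CPU; the CUDA path mentioned above accelerates parts of this same pipeline.

```cpp
// Minimal Bullet rigid-body sketch: a sphere falling under gravity.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main() {
    // Boilerplate: collision config, dispatcher, broadphase, solver, world.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // A 1 kg sphere of radius 0.5, starting 10 units above the origin.
    btSphereShape sphere(0.5f);
    btVector3 inertia(0, 0, 0);
    sphere.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState state(
        btTransform(btQuaternion(0, 0, 0, 1), btVector3(0, 10, 0)));
    btRigidBody::btRigidBodyConstructionInfo info(1.0f, &state, &sphere, inertia);
    btRigidBody body(info);
    world.addRigidBody(&body);

    // Step one simulated second at 60 Hz.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f, 1);

    btTransform t;
    state.getWorldTransform(t);
    std::printf("sphere height after 1 s: %.2f\n", t.getOrigin().getY());

    world.removeRigidBody(&body);
    return 0;
}
```

Swapping the solver or broadphase for a GPU-accelerated one is exactly the kind of vendor-specific path the BSD license lets anyone ship without asking permission.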

Another observation: PhysX also benefits from the UE3 games at the moment, but things may easily change once a new engine pops up, or UE4 integrates physics or provides support for other physics engines out of the box. Of course, being free even for commercial usage helps as well, but on that front they compete against Bullet, and that is a fight they can only lose.

I don't believe nVidia is betting much on PhysX, as it's a stop-gap measure at best; I guess the main idea behind PhysX is to show developers that you can do GPGPU while running a game... well, sort of.
 
Declarations made by AMD and Havok representatives; I'll have to look for the links.

Declarations huh? :)

Sorry, the problem is that on a not-too-old card that Nvidia claims beats the new ATI card in PhysX benchmarks, the Batman game is simply unplayable. This is not a problem of hardware shortcomings; most games run well with that card, and if Nvidia wants everybody to get a GTX 295 for PhysX games, I think they have a "little problem".

Ah that, well yeah I think we've firmly established that piece of PR was horseshit.
 
So is Bullet -- and Bullet already has CUDA support. This makes me wonder why AMD didn't implement OpenCL support for Bullet: Havok is under Intel's control, while Bullet is free. They could even have made a Bullet port on CTM and shipped it, since Bullet is BSD-licensed. And as soon as Bullet supports OpenCL/DX compute shaders, I guess both Havok and PhysX will have a hard time on the PC market anyway, as it can easily contain nVidia/Intel/AMD-specific paths (which is unlikely for Havok/PhysX).

Another observation: PhysX also benefits from the UE3 games at the moment, but things may easily change once a new engine pops up, or UE4 integrates physics or provides support for other physics engines out of the box. Of course, being free even for commercial usage helps as well, but on that front they compete against Bullet, and that is a fight they can only lose.

I don't believe nVidia is betting much on PhysX, as it's a stop-gap measure at best; I guess the main idea behind PhysX is to show developers that you can do GPGPU while running a game... well, sort of.

Well, Bullet sounds great on paper, but not too many big titles have used it so far. I think Bullet's main problem is that Intel and nVidia have much deeper pockets and are very successful in marketing their solutions to developers and offering support.
This may also be the reason why AMD decided to team up with Havok rather than Bullet.
As far as I know, nVidia had no hand in the CUDA support for Bullet, though. Perhaps it is also partly AMD's fault that Bullet doesn't have AMD support: the CUDA SDK is a nice product for developers to use and experiment with GPGPU, while AMD's solution isn't quite there yet.
 
ATi had some GPU physics demos back in the X1800/1900 days... they'll come sooner or later ;)

I don't remember GPU physics from those days, but I do remember GPU computing. In this case, though, they are working with Havok, which has every interest in having a GPU-accelerated API.
 
Maybe, but personally I think that something is moving.
There were also demos back in April; it will take time, but I think it will come sooner than later.

Oh come on!!! :LOL: How can you have all this faith in a fantasy and at the same time not want to give an existing product a chance? Where's your faith that PhysX will improve "sooner than later"?
 
Oh come on!!! :LOL: How can you have all this faith in a fantasy and at the same time not want to give an existing product a chance? Where's your faith that PhysX will improve "sooner than later"?

Maybe because nVidia doesn't want to give us a chance anymore?
They'll have a hard time explaining why it might completely crash your computer pre-OpenCL while everything is fine and dandy after OpenCL (if PhysX gets ported to OpenCL, that is).
 
The first generation of DX11 games won't make full use of its features. So for the most part, you'll be running DX9/10-class code, but faster.
I very much disagree with this. I think DX11 titles will come out this year that will utilize the new DX11 features, and I expect many more next year.

Oh come on!!! :LOL: How can you have all this faith in a fantasy and at the same time not want to give an existing product a chance? Where's your faith that PhysX will improve "sooner than later"?
Nothing to do with faith in it improving, everything to do with faith that nVidia is going to keep it proprietary and it won't be widely adopted. :yep2:
 
Funny how Volition can manage so much out of a "crippled" CPU-based physics engine, while PhysX games seem to have the... err... same... library... effects.

Only Cryostasis and the UT3 map pack show at least some developer effort.

Mmm?

BTW Sacred 2 had leaves. LEAVES flying around FFS for...
[attached image: sacred2.gif]
 
Oh I see what you've done there. So PhysX must provide physics nirvana else it's a failure. Hmmmm curious that other middleware isn't held to the same lofty standard.

Well, either something is meaningful or it's not. A flag that is physics-driven, or a room you can go into, unrelated to the game you're playing, just to knock down a stack of barrels, doesn't really add anything meaningful.

Fluttering paper as a graphics effect looks just as good in a game when it's driven artistically instead of by physics, and without a massive frame-rate drop as your graphics card loses power to physics calculations. Sometimes you even get problems such as rag-dolls looking utterly unnatural.
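To illustrate the distinction (a hypothetical toy example, not from any shipping game): an artist-driven flutter is just a hand-tuned function of time, while a physics-driven one integrates forces every frame. The scripted version costs almost nothing and always looks right; the simulated one reacts to the world but eats per-frame compute for every particle.

```cpp
#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Artist-driven flutter: position is a hand-tuned pure function of time.
// No per-frame state, no simulation cost, deterministic look.
Vec2 scriptedFlutter(float t) {
    return { 0.5f * std::sin(1.7f * t) + 0.2f * std::sin(5.3f * t),
             -0.3f * t + 0.1f * std::cos(3.1f * t) };
}

// Physics-driven flutter: semi-implicit Euler with light gravity, a
// gusting wind force, and linear drag. Reacts to the environment, but
// costs an update per particle per frame.
struct Particle { Vec2 pos{0.0f, 0.0f}, vel{0.0f, 0.0f}; };

void physicsStep(Particle& p, float t, float dt) {
    const float drag = 0.8f;
    Vec2 force{ 2.0f * std::sin(0.9f * t) - drag * p.vel.x,  // wind gust minus drag
                -9.8f * 0.05f             - drag * p.vel.y };// "paper" gravity
    p.vel.x += force.x * dt;
    p.vel.y += force.y * dt;
    p.pos.x += p.vel.x * dt;   // semi-implicit: use the updated velocity
    p.pos.y += p.vel.y * dt;
}

int main() {
    Particle p;
    for (int frame = 0; frame <= 180; ++frame) {   // three seconds at 60 Hz
        float t = frame / 60.0f;
        Vec2 a = scriptedFlutter(t);
        physicsStep(p, t, 1.0f / 60.0f);
        if (frame % 60 == 0)
            std::printf("t=%.0fs scripted=(%+.2f,%+.2f) simulated=(%+.2f,%+.2f)\n",
                        t, a.x, a.y, p.pos.x, p.pos.y);
    }
    return 0;
}
```

On screen both read as "paper fluttering"; the difference the player actually notices is the frame rate.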

Whether it's Havok or PhysX is pretty immaterial if it doesn't bring anything meaningful and is just used (for instance) as a marketing point to sell one IHV's hardware, instead of adding anything of worth to a game.

You don't get a pat on the back from the customer by saying "gee, but we're just as pointless as all the other middleware."
 

So it is just the right time to come out :)
Anyway, at that time OpenCL did not exist.

You mean with Intel, who will have an interest in a GPU-accelerated API... when they release their GPU, but no sooner.

They also need to stop Nvidia from gaining ground with their middleware.

Oh come on!!! :LOL: How can you have all this faith in a fantasy and at the same time not want to give an existing product a chance? Where's your faith that PhysX will improve "sooner than later"?

Huh? I'm not refusing to give PhysX a chance. I already said that if they port it to OpenCL, all customers will gain. But I see difficulties in the present implementation, and in its future if it stays a product tied to one vendor only.
 
I very much disagree with this. I think DX11 titles will come out this year that will utilize the new DX11 features, and I expect many more next year.

I'd concur, considering games coming out THIS year have already been shown with DX11-specific features, and games that are already out will be patched with DX11-specific features.

Additionally, Havok was already shown running physics interactions on rather immature OpenCL ports, i.e. things that could affect gameplay and had nothing to do with graphics.

ATI and, I believe, Havok have also demonstrated AI behavior through (again) immature OpenCL ports.

And I'm not sure how people get from Havok running on every single gaming computer to it being less prevalent than PhysX, which is limited to a subset of Nvidia hardware; not even all Nvidia hardware at that. This is, BTW, a response to the person who said CPU-based Havok is less prevalent than PhysX.

And to the other person, who said PhysX is only a year old: HUH??? PhysX has been around since the early 2000s, and even Nvidia has owned it for over a year and a half now.

Regards,
SB
 