How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of HD5xxx-series?

  • GT300 Performance Preview Articles: 29 votes (19.7%)
  • New card based on the previous architecture: 18 votes (12.2%)
  • New and Faster Drivers: 6 votes (4.1%)
  • Something PhysX related: 11 votes (7.5%)
  • Powerpoint slides: 61 votes (41.5%)
  • They'll just sit back and watch: 12 votes (8.2%)
  • Other (please specify): 10 votes (6.8%)

  Total voters: 147
Status
Not open for further replies.
Nvidia's support of 120 Hz monitors is for stereo 3D. It has nothing to do with the framerate delta of a 60 Hz limitation.

Is that in reply to me?

Were those two monitors (or one in the UK, because the non-Samsung model fails British standards and can't be sold here) created to work with 3D Vision, or is there going to be a move to 120 Hz monitors anyway?

If so, I'm not sure what you're trying to tell me. Are you saying the glasses were made to support the 120 Hz monitors, and monitors are moving to 120 Hz anyway, rather than the monitors being made 120 Hz to be compatible with the glasses?
 
It's not a very demanding game as far as GPU workload is concerned, so a single card struggling at 1600p with PhysX on isn't all that surprising. At lower resolutions, the game is entirely GPU PhysX limited.

Could it be that the developers have overdone it a bit with the amount of physics effects when you set them to high? To get a picture of what I mean, imagine I'm driving in a racing sim along a hillside and rocks are bouncing down from the top of the hill. Are there, say, 50 rocks or 500 rocks in that scene? In the first case it affects gameplay just fine, because I'd have to try to avoid those rocks and not bump into them; in the second case it's completely silly, since in reality I'd never be able to avoid anything. What I'm asking is whether there's a sensible measure for the amount of effects, because while I realise that physics costs computing resources, something isn't adding up if you end up with half the framerate at maximised settings.
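One reason "10x the rocks" can cost far more than 10x the performance: naive pairwise collision testing grows quadratically with the object count. The sketch below is purely illustrative arithmetic (real physics engines use broad-phase culling to soften this), but it shows why 500 bouncing rocks is a very different workload from 50.

```python
# Illustrative only: cost of naive pairwise collision checks.
# Real engines (PhysX included) cull pairs with a broad phase,
# but debris-heavy scenes still scale superlinearly.

def pair_checks(n):
    """Number of object pairs a naive narrow phase would test."""
    return n * (n - 1) // 2

rocks_few, rocks_many = 50, 500
print(pair_checks(rocks_few))    # 1225 pairs
print(pair_checks(rocks_many))   # 124750 pairs

# 10x the rocks -> roughly 100x the pair checks
ratio = pair_checks(rocks_many) / pair_checks(rocks_few)
print(round(ratio, 1))
```

So halved framerates at maximum effect density are not necessarily a sign of sloppy tuning; the cost curve is simply steep.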


Due to the way standard SLI rendering with PhysX is set up, at lower resolutions you're actually limited by the GPU compute performance of the primary GPU, because with AFR only the primary GPU handles PhysX operations. Obviously, as resolution scales up, AFR achieves a better balance because there's more traditional GPU workload to spread around.

Shutting off SLI and just running the second card as a PhysX device at lower resolutions actually does more than SLI. Alternatively, keeping SLI and running a third GPU as an independent PhysX device helps alleviate this bottleneck. It's just important to understand that this is a GPU compute bottleneck that can be dealt with in a few ways.
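The crossover described above can be sketched with a toy throughput model. All timings are made-up assumptions for illustration, not measurements; the model only captures the claim that under AFR the primary GPU does PhysX for every frame plus renders every other frame, while a dedicated PhysX GPU splits the two jobs cleanly.

```python
# Toy model of SLI AFR vs. a dedicated PhysX GPU.
# physx_ms / render_ms are hypothetical per-frame costs in milliseconds.

def afr_fps(physx_ms, render_ms):
    """Two-GPU SLI AFR with PhysX pinned to the primary GPU.

    Over two frames: the primary runs PhysX for both frames and renders
    one of them; the secondary only renders its frame. The busier GPU
    limits throughput.
    """
    primary = 2 * physx_ms + render_ms
    secondary = render_ms
    return 2 * 1000.0 / max(primary, secondary)

def dedicated_fps(physx_ms, render_ms):
    """One render GPU plus one dedicated PhysX GPU, pipelined.

    Each GPU handles one job per frame; the slower job sets the pace.
    """
    return 1000.0 / max(physx_ms, render_ms)

# Low resolution: PhysX dominates (8 ms PhysX, 2 ms rendering).
# Dedicating the second card wins, since the primary no longer does
# double duty.
print(afr_fps(8, 2), dedicated_fps(8, 2))

# High resolution: rendering dominates (2 ms PhysX, 8 ms rendering).
# Now AFR wins, because splitting the render load matters more.
print(afr_fps(2, 8), dedicated_fps(2, 8))
```

Under these assumed numbers the dedicated setup is faster at low resolution and AFR is faster at high resolution, matching the behaviour described in the post.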

If someone is going to add more hardware to a system to better balance resources, then the effects have to make a huge difference to gameplay. Give me a kickass racing sim with physics effects as realistic as possible and I'll be sold.
 
Have you ever come across a game that does NOT drop in fps when you go from low to higher detail settings?
There's no such thing as a free lunch.

I think what he was trying to say is: what's the point of adding PhysX if it kills the framerate badly enough to kill the gameplay? If the gamer has to turn down the graphics level just to turn on PhysX, or worse, turn PhysX off so he/she can play the game, then what's the point of putting it there in the first place? If it negates performance, you might as well not use it.
 
TGDaily is saying that Nvidia is countering the HD5xxx (in addition to PhysX > all) with............ press releases!

"We don't comment on unreleased products.... Oh wait... we're at Defcon 1 - Smithers, release the FUD!" Sounds good if they ever get above that two percent yield.

How's that for predicting the future instead of relying on what we have in our hands?

When was the last time they did this kind of forward looking spoiler/advertising...? Was it for NV30 during the 9700 launch...? It can't have been G80 as Nvidia were first to market with that generation.
 
If I remember correctly, ATI gained a lot of market share with the HD4xxx series in the discrete segment. That suggests the above-mentioned argument ("Wow, I love those effects, but why can't I get them on my AMD GPU or CPU?") wasn't really that convincing.

That was then; now we have games like Batman, and who knows what else they're going to release. You can't compare the two. In the 4xxx era, PhysX was mostly just a promise; now we have actual big titles coming out with PhysX support.
 
If it's going to help visuals to be more realistic then yes. What's wrong with that?

Nothing wrong with it. I'm just saying that it's probably easier to make PhysX look cool/new/flashy/dynamic etc to Joe Schmoe than some heavily tessellated scenes.
 
No, it is a performance card.

Not really getting your point here. Your complaint is equally applicable to other IQ enhancing effects as has been mentioned previously. So what if a performance card can't max out the IQ, is this a new phenomenon or something?
 
But that is not the case, so that's a purely theoretical "what if".

I turned off PhysX in Mirror's Edge because, after meeting the sister for the first time, the game's framerate completely tanked to like 2-5 fps in the room full of glass cases. It crashed as well, and I did have the new patch. I have an 8800GT, though. I admit I think PhysX, or any other accelerated physics, is extremely cool. If, by the time the Nvidia DX11 cards come out, ATI hasn't got any physics acceleration going on, I would definitely pay extra for the same performance level if I had the option of accelerated physics in games only on Nvidia cards.
 
Ail, anyone who plays Batman will understand the way the PhysX effects are done. No, they won't alter gameplay, but they will alter how the game feels, and ultimately its immersion factor.

PhysX is used in such a way as to demonstrate how the Scarecrow is tearing down Batman's grasp on reality and driving him insane. It's a total immersion factor for the level, and probably the best use of PhysX I have seen. If you're familiar with the Batman comics and franchise, this makes total sense.

When playing without PhysX, the gameplay is the same, but the game is not. Can other games leverage PhysX the same way? I don't know for sure, but I don't think the way it's done in Batman is at all tacky, and it's clear the developers thought its usage out pretty well.

As far as PhysX levels of performance: Batman offers three levels of PhysX. The settings I used were "high", but the medium settings are a little more forgiving to mainstream hardware.

I.e., I have little doubt that someone with a 9800GTX can play it at 1680x1050 with 4xAA and medium PhysX and get nearly the same experience.
 
I turned off PhysX in Mirror's Edge because, after meeting the sister for the first time, the game's framerate completely tanked to like 2-5 fps in the room full of glass cases.

That's odd; I played it on my 8800GTS 320 and it ran pretty well with PhysX, at 1280x1024 with 4xAA.
 
I was leaning that way for sure, and then I looked at the poll options and, alas, the author (ahem!) opened the door to the relevance of this part of the discussion with option #4.

Well, PhysX was the obvious choice :smile: It's interesting, though; I really think that the last 20 pages or so have answered the question.
 
Sorry to burst your bubble, but as stated before, the exact config you mention runs at 12fps (http://www.revioo.com/articles/a13159_3.html).

I said the medium setting, not high.


*edit* Anyway, it looks to me like the 9800GTX ran out of memory, as no other piece of hardware shows that level of drop-off. So it's very unlikely that, say, a GTS 250 1 GB would show the same drop-off. I'd chalk it up to either A) being memory related or B) a bug.

The GTX 260 I used was a 192 SP version, which isn't that much more powerful than a 9800GTX other than its memory and ROPs.
 
Having a 9800GTX doing PhysX increases fps by 50%.
I wonder, if PhysX titles became really common, whether NV would re-release a PPU: a graphics chip with all the graphics hardware removed.
 
Having a 9800GTX doing PhysX increases fps by 50%.
I wonder, if PhysX titles became really common, whether NV would re-release a PPU: a graphics chip with all the graphics hardware removed.

Why bother? What does a 9800GTX cost nowadays?
 
Yeah, after closer examination of the graph, the 9800GTX just ran out of memory at those settings. Trying 2xAA would yield very different results. Since I haven't tested the 9800GTX specifically by itself yet, I can only take the website's word for it. If the 9800GTX is offering slower performance with 4xAA than the CPU running PhysX, it's a question of memory allocation.

*Edit* When you look at a card like a 9800GTX+, you have to be aware it has certain limitations, largely its memory amount, and the performance drop-off you see in this instance is clearly related to memory. If someone wanted to keep the game at nearly the same quality with 4xAA and medium PhysX, all they'd have to do is lower texture detail slightly, or settle for 2xAA. 512 MB of memory by today's standards (especially with the Unreal Engine 3) is really, really low.
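A rough back-of-the-envelope calculation shows why 4xAA at 1680x1050 eats into a 512 MB card. The byte counts below are typical assumptions (RGBA8 color, D24S8 depth, one single-sample resolve target); actual UE3 allocations differ, so treat this as a sketch of the magnitude, not the game's real memory budget.

```python
# Rough render-target memory arithmetic for the out-of-memory argument.
# Assumes 4 bytes/pixel for both color (RGBA8) and depth (D24S8).

def msaa_target_mib(width, height, samples, bytes_per_pixel=4):
    """Size in MiB of one multisampled surface."""
    return width * height * samples * bytes_per_pixel / 2**20

W, H = 1680, 1050

def backbuffer_mib(samples):
    color = msaa_target_mib(W, H, samples)   # multisampled color buffer
    depth = msaa_target_mib(W, H, samples)   # multisampled depth/stencil
    resolve = msaa_target_mib(W, H, 1)       # single-sample resolve target
    return color + depth + resolve

print(round(backbuffer_mib(4), 1))   # ~60.6 MiB at 4xAA
print(round(backbuffer_mib(2), 1))   # ~33.6 MiB at 2xAA
```

Dropping from 4xAA to 2xAA frees roughly 27 MiB of render-target memory alone, before counting the extra intermediate buffers the engine multisamples, which is exactly the kind of headroom a 512 MB card needs for textures plus PhysX buffers.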

Here's a list of the settings Batman uses. Though I'm sure the "12 fps" number is great for sensationalism, any practical gamer will adjust their settings slightly to use PhysX.

batmansettings.jpg
 