How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of the HD5xxx series?

• GT300 Performance Preview Articles: 29 votes (19.7%)
• New card based on the previous architecture: 18 votes (12.2%)
• New and Faster Drivers: 6 votes (4.1%)
• Something PhysX related: 11 votes (7.5%)
• Powerpoint slides: 61 votes (41.5%)
• They'll just sit back and watch: 12 votes (8.2%)
• Other (please specify): 10 votes (6.8%)

Total voters: 147
That was then, now we have games like Batman, and who knows what else they're going to release. You can't compare it... In the 4xxx era, PhysX was mostly just a promise, now we have actual big titles coming out with PhysX support.

Yes I can compare it, and I'm doing it right now. We had UT3, Mirror's Edge and about 7 other titles, yet it didn't do much for Nvidia sales. Now we have one more game: Batman.
And besides, who knows what else DX11 will offer? That's the same level of argument really.
I tried a few of the advertised free PhysX titles on an 8800GT (I don't even remember their names now) and the experience was not really good, and the performance even worse. I'm sure Batman is much better than that (since it seems to be a good game in itself), but it's way, way too little to drive many sales once the new ATI hardware is out. That TGDaily article linked earlier may have a bigger impact and give Nvidia some time to release DX11 hardware, IMO.
 
I don't think anybody is arguing that Batman is enough to counter AMD's new hardware. That would be silly. What it does is give some substance to Nvidia's marketing. Before BAA, all they had was crap like Darkest of Days, Terminator Salvation and Cryostasis. When people see reviewers saying you need Nvidia hardware to get the best out of a high profile AAA title, they aren't going to come to forums and argue about it for a week. It's going to affect their purchasing decision.
 
Yeah, after closer examination of the graph. The 9800GTX just ran out of memory at those settings. Trying 2xAA would yield very different results. Since I haven't tested the 9800GTX specifically by itself yet. I can only take the website's word for it. If the 9800GTX is offering slower performance with 4xAA than the CPU running PhysX. It's a question of memory allocation.

*Edit* When you look at a card like a 9800GTX+. You have to be aware it has certain limitations. Largely its memory amount. But the performance drop-off you witness in this instance is clearly related to memory. If someone wanted to keep the game at nearly the same quality with 4xAA and medium PhysX. All they'd have to do is adjust the settings slightly to lower texture detail. Or settle for 2xAA. 512 megs of memory by today's standards ((especially with the Unreal 3 Engine)) is really, really low.
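For anyone who wants a rough sanity check on the memory argument, here's a back-of-envelope sketch. The framebuffer math is just resolution x bytes per pixel x sample count; the texture/geometry and PhysX figures are illustrative guesses, not measured values:

```python
# Rough VRAM estimate for a 512 MB card at 1680x1050 with 4x MSAA.
# Simplified: ignores compression, driver overhead and engine-specific buffers.
WIDTH, HEIGHT = 1680, 1050
BYTES_PER_PIXEL = 4            # 32-bit colour / 32-bit depth-stencil
MSAA = 4

def mb(num_bytes):
    return num_bytes / (1024 * 1024)

pixels      = WIDTH * HEIGHT
color_msaa  = pixels * BYTES_PER_PIXEL * MSAA   # multisampled colour buffer
depth_msaa  = pixels * BYTES_PER_PIXEL * MSAA   # multisampled depth/stencil buffer
resolve     = pixels * BYTES_PER_PIXEL          # resolved back buffer
framebuffer = color_msaa + depth_msaa + resolve

# Illustrative guesses for the rest of the budget on a UE3 title with PhysX:
textures_and_geometry = 400 * 1024 * 1024
physx_buffers         = 50 * 1024 * 1024

total = framebuffer + textures_and_geometry + physx_buffers
print(f"Framebuffer at 4xAA: ~{mb(framebuffer):.0f} MB")
print(f"Estimated total:     ~{mb(total):.0f} MB vs. 512 MB on the card")
```

Even with generous rounding, the point is that 4xAA alone adds tens of megabytes of render targets, which is exactly the kind of margin a 512 MB card doesn't have once textures and PhysX buffers are resident.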

Here's a list of the settings Batman uses. Though I'm sure the "12" number is great for sensationalism. Any practical gamer will adjust their settings slightly to use PhysX.

Forgive the OT, and I mean no offense, but why do you use periods instead of commas?
 
36 with PhysX high, 12 with 4xAA at medium (light green).

Again. This is memory related. Great for arousing people. But not very practical in nature. In the case of Batman. You are far less hurt by turning down the settings a notch on a 512 meg card than you are by turning PhysX off.

1) Slightly lower texture detail or medium level shadow volumes or lower AA by a notch?
2) Or turn PhysX off completely?

What would any sensible gamer do?

This is not something new either with the way the G92 reacts to the Unreal Engine. When it runs out of memory its performance topples off in a frightening way. And yes PhysX requires a certain amount of memory to be available.

I don't think that's right. Even on my 8800GTS320 I can play it with normal PhysX and 4xAA... but not with high PhysX, no matter what.

To be honest Scali. I wouldn't be surprised if these results were true. Given my experience with the G92 and it running out of memory. Again it really doesn't change the fact that PhysX and the math component are not the primary bottleneck here. I am actually gonna jam in a 9800GTX real quick and give it a whirl. Since someone asked for it anyway.
 
TGDaily is saying that Nvidia is countering the HD5xxx (in addition to PhysX > all) with ............ Press releases!
The GT300 is rumored to be in the shops in late November, which places it just after the 5xxx series launch. Of course all this could be a spoiler too and we will not know for sure until the reviews come in around Christmas.
:LOL:

Ok, the card is coming out in November but the reviews won't be out until Christmas? :LOL:

Sorry, it just got me.
 
That's odd, I played it on my 8800GTS320 and it ran pretty well with PhysX, at 1280x1024 with 4xAA.

Yeah I thought it odd as well, and I was disappointed. I was really excited to finally see my PhysX in action. I did not troubleshoot much (besides the patch) b/c I don't have the energy/time. It was 1680x1050 or whatever and I turned AA down. I would far rather have cool physics than any AA at all, honestly.
 
So anyway. I just ran the 9800GTX through Batman with my earlier benches with PhysX set to medium. ((As well as a 9600GT PhysX card)) And I simply can't reproduce that one website's results. The results are pretty much as I said they would be. I am sure somehow they ran out of memory. Batman is a little buggy at times with its memory but the 9800GTX+ by itself seems to run it acceptably at 4xAA.


[Attached graph: mainstream.png]


*edit* corrected an error in the graph.
 
So anyway. I just ran the 9800GTX through Batman with my earlier benches with PhysX set to medium. ((As well as a 9600GT PhysX card)) And I simply can't reproduce that one website's results. The results are pretty much as I said they would be. I am sure somehow they ran out of memory. Batman is a little buggy at times with its memory but the 9800GTX+ by itself seems to run it acceptably at 4xAA.

Looks like what I suspected.. that other site probably switched the 4xAA medium and no AA high results?
Could you also test with PhysX high? If that gets ~12 fps, then we know for sure.
 
I was teasing Chris, I was thinking of doing it just to see what the fuss is about first hand. Thanks for all your work on the graphs, gives me a good idea on what to expect.

With two in I should be able to run it ok.
 
PhysX "High" Ran at 28 FPS with 4xAA/16xAF and max settings for me. Not anywhere close to 12 FPS. My guess is they encountered the "FPS bug" where occasionally FPS plummets randomly ((even at intro screen)) to single digits. But thats just a guess. My system is also different than theirs.

If you're actually interested in Batman PhysX and its performance. I did a performance and subjective analysis of the game. I mostly focused on high-end systems though.

http://forums.slizone.com/index.php?showtopic=39205

Putting my other cards back in now..
 
I am a terrible writer when it comes to punctuation :)

OK :)

Thanks for the 9800 GTX benchmarks. It seems the game remains playable with PhysX, though I'm still surprised by the performance hit. I admit I have only seen the game in YouTube videos, but I get the feeling that the framerate shouldn't drop that much. Could there perhaps be efficiency issues with the drivers, or something related to the architecture of the GPU itself?

OK, they said that the existing $129 Nvidia card was faster, and I didn't believe that. Now the GT300 is faster, so why should I believe them now?

To be fair, they were talking about Batman, and they said that with PhysX enabled, their card was faster, which is probably true since the game essentially ends up being CPU-limited with Radeons. Granted, it's not so much that the GeForce is faster, but it avoids CPU-limitation.

...And it's only one game :D
 
Regarding the market traction of PhysX mentioned in this thread, I'd like to mention my view. I have also been a fan of Novodex since before the takeover, and I have been dreaming of physics in games. But the manner in which Nvidia promotes it leaves me cold. If I wanted more realistically fluttering capes in a game, I'd most likely be playing Sims 3: The cloth-shop extravaganza. This isn't what Novodex was promising. They promoted actual physics affecting and incorporated into the gameplay. Right now I'm angry at Nvidia for this (love spurned and all that :oops:). There is also the thing to note: PhysX MAY become something valid, I KNOW DX11 is going to be valid. Futureproofing etc. As for the easy-sell part, this is a tech forum, presumably filled with people willing to jump on the next cool tech. If it is this hard selling to them, how hard will it be to sell to the less tech-savvy?
Lastly, I think it is time to get our collective heads out of our butts. It is pretty clear to me that the next BIG tech for PCs has come in the unsexy hard-drive department. SSDs and all that. I know I'll be getting one and will probably pay more for it than for a GPU, and it will most likely have a way bigger effect on my PC use than any of the presently sold GPUs.
 
But the performance drop-off you witness in this instance is clearly related to memory. If someone wanted to keep the game at nearly the same quality with 4xAA and medium PhysX. All they'd have to do is adjust the settings slightly to lower texture detail.

Hang on, if the 9800GTX referred to above is dedicated to PhysX it wouldn't be loading any textures, would it?
 
Err, if a 9800GTX is just doing PhysX processing combined with another GPU, it wouldn't have to worry about that. But the 9800GTX in my tests was running solo, other than in the 9800GTX + 9600GT results. Using a separate PhysX card will always free up memory on the primary rendering device though. This is not something discussed much. I'm not quite sure of the exact amount, but GPU compute does require some memory allocation.
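If anyone wants to put a number on that, one hypothetical way today is to poll per-GPU memory through NVML. This is just a sketch using the pynvml bindings (an assumption on my part, not something referenced in this thread): run it with the game loaded, once with the second card set as dedicated PhysX and once without, and compare the primary card's "used" figure.

```python
# Hypothetical sketch: report per-GPU memory usage via NVML (pynvml bindings).
# Assumes an NVIDIA driver with NVML support; pynvml itself is an assumption here.
import pynvml

pynvml.nvmlInit()
try:
    for i in range(pynvml.nvmlDeviceGetCount()):
        handle = pynvml.nvmlDeviceGetHandleByIndex(i)
        name = pynvml.nvmlDeviceGetName(handle)
        if isinstance(name, bytes):          # older pynvml versions return bytes
            name = name.decode()
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"GPU {i} ({name}): "
              f"{mem.used / 2**20:.0f} MB used / {mem.total / 2**20:.0f} MB total")
finally:
    pynvml.nvmlShutdown()
```

Comparing the primary card's usage with and without the dedicated PhysX card would show roughly how much memory the PhysX buffers take off the rendering GPU.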
 