How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of the HD5xxx series?

  • GT300 Performance Preview Articles

    Votes: 29 19.7%
  • New card based on the previous architecture

    Votes: 18 12.2%
  • New and Faster Drivers

    Votes: 6 4.1%
  • Something PhysX related

    Votes: 11 7.5%
  • Powerpoint slides

    Votes: 61 41.5%
  • They'll just sit back and watch

    Votes: 12 8.2%
  • Other (please specify)

    Votes: 10 6.8%

  • Total voters
    147
Is this meant to be some sort of joke? Is NV that desperate that they come up with this kind of BS?

It's true - Nvidia PR put that out a few days back. It was reported all over.

:LOL: Yes it's a joke. It's a parody of the "Consumers don't need DX11 (because we don't have it)" PR release they put out a few days back. It's a "Consumers don't want to buy new stuff at Christmas (because we don't have any new stuff to sell at Christmas)" joke.

Thought I'd better clear that up before we get quoted on some Chinese website and then see it coming back on Western tech sites.
 
But PhysX is doing just that, right? More visual detail.

That's the point: what PhysX is doing is showing more visual detail, in a way that is very obvious to the average consumer.
It's going to be very hard to top that visual impact with DX11 features.
People will just be wondering: "Okay, so what's so good about this DX11 stuff? Why don't I see waving banners, smoke and paper or leaves blowing around? Why does breaking glass look so 2004?"
I mean, we all know that DX11 is the better technology, and in theory you could even accelerate physics through DX11 as well. But no game is actually going to be doing that in the short term.
 
That's the point: what PhysX is doing is showing more visual detail, in a way that is very obvious to the average consumer.
It's going to be very hard to top that visual impact with DX11 features.

The details weren't added by PhysX, the details were animated by PhysX (badly). Animated flags and breaking glass don't cut my framerate in half in other games.

I'll be very happy with my DX11 smoke and dirt in DiRT2.
 
But PhysX is doing just that, right? More visual detail. It has no effect on gameplay, and certainly, for hanging a few banners around, a simpler version of the details added could've been done on the CPU without halving the framerate.

Why? The level of accuracy with physics can most definitely spoil or improve gameplay in a racing sim.

As for some developers overdoing it with effect X or feature Y, that's not something new either. It started with the first shaders in games, where some went so far that about every surface had to shine like a monkey's butt just so even the blindest person would notice that shaders were being used.

The trick for any developer out there is to find as healthy a balance as possible between all factors. They just have to find the right proportion for everything.
 
Apparently, we decided upstream that Youtube is all you need to promote stuff nowadays.

Who needs Youtube when you've got one of these and one of these :LOL:

The details weren't added by PhysX, the details were animated by PhysX (badly). Animated flags and breaking glass don't cut my framerate in half in other games.

Have you even played the game? The people that have disagree with you :)
 
Well, I didn't run the tests earlier to be unfair; those are the settings I've been playing the game under. I'm sure there are people with 17" CRTs buying GTX 285s, though I'm not certain how common that combo is, but I bought it since I own a 30" LCD and it's the current fastest single-GPU board available. Anyways, running it at 1920x1200 with 4xAA and high physics, I got 45fps. Getting ready for work this morning, so I don't have time for more testing.
 
Well, I didn't run the tests earlier to be unfair; those are the settings I've been playing the game under. I'm sure there are people with 17" CRTs buying GTX 285s, though I'm not certain how common that combo is, but I bought it since I own a 30" LCD and it's the current fastest single-GPU board available.

No need to exaggerate like that.
I think we can all agree that 30" is not that common, and that most people probably have a screen ranging between 20" and 26", which generally is 1920x1080 res or lower.
 
No need to exaggerate like that.
I think we can all agree that 30" is not that common, and that most people probably have a screen ranging between 20" and 26", which generally is 1920x1080 res or lower.

It's a ridiculous argument no matter how you look at it. People are saying you can't run PhysX at X resolution and Y settings. But this has always been the case for lots of things. Replace PhysX with "AA", "Very High", "Soft shadows" etc, etc.

Reading this thread you would think that before PhysX people were playing every game in all its glory at 2560x1600 on 6600GTs. Btw, the demo runs fine for me (45fps+) at 1680x1050 4xAA, max PhysX on a single stock GTX 285. Things aren't quite so smooth at 2560x1600 though....
 
I think we can all agree that 30" is not that common, and that most people probably have a screen ranging between 20" and 26", which generally is 1920x1080 res or lower.

And I think we can agree that while there are certainly people with 20" screens buying GTX 285s, it's probably not that common either. That was my point.

Anyways, I never imagined when I ran those tests real quick last night and posted the results that it would turn into a "nyah, nyah, scores don't count." I'm freakin' done with this thread.
 
And I think we can agree that while there are certainly people with 20" screens buying GTX 285s, it's probably not that common either. That was my point.

The actual size of the screen isn't all that relevant, it's the resolution. I just mentioned the 20"-26" range because they all generally only do 1920x1080 or lower. You'd need something like 30" or larger to get to 2560x1600.
And that's the point. Monitors that can't do more than 1920x1080 are far more common than ones that can do 2560x1600, even among GTX 285 owners.

Heck, the TV in my living room isn't even 30"; there's no way I'd use such a monster display for my PC. It doesn't fit on a desk, and it doesn't really work unless your living room is fairly large. I'd be too close to the screen to comfortably game on a 30" display in my living room.
 
It's a ridiculous argument no matter how you look at it. People are saying you can't run PhysX at X resolution and Y settings. But this has always been the case for lots of things. Replace PhysX with "AA", "Very High", "Soft shadows" etc, etc.

Reading this thread you would think that before PhysX people were playing every game in all its glory at 2560x1600 on 6600GTs. Btw, the demo runs fine for me (45fps+) at 1680x1050 4xAA, max PhysX on a single stock GTX 285. Things aren't quite so smooth at 2560x1600 though....

Yeah but AA doesn't halve your framerate! And of course not everyone has a 30" LCD, but then again not everyone has a GTX 285 either.

Most gamers probably have a 22" screen with an 8800/9800 GT/GTS 240 or something like that. The whole question is: is it worth it? Is 4xAA worth a 20% drop in FPS? Is PhysX worth a 50% drop?

To the former I would say yes, unless your framerate drops too low in absolute terms; to the latter, I'd say it depends on exactly what PhysX brings you in each specific game.
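To put rough numbers on that (purely made-up baselines for illustration, not figures from any actual benchmark), here's the arithmetic of what a 20% and a 50% hit leave you with:

```python
# Illustrative sketch only: the baseline framerates below are assumed, not
# measured, and the 20%/50% hits are just the round numbers from the post above.
def fps_after_drop(baseline_fps: float, drop_fraction: float) -> float:
    """Framerate left after losing drop_fraction of the baseline."""
    return baseline_fps * (1.0 - drop_fraction)

for baseline in (120, 60, 40):  # hypothetical fps without AA or PhysX
    with_aa = fps_after_drop(baseline, 0.20)     # ~20% hit for 4xAA
    with_physx = fps_after_drop(baseline, 0.50)  # ~50% hit for GPU PhysX
    print(f"{baseline:>3} fps baseline -> {with_aa:.0f} fps with 4xAA, "
          f"{with_physx:.0f} fps with PhysX")
```

At a 40 fps baseline it's the 50% hit that pushes you under a playable framerate, which is really what the "is it worth it" question comes down to.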
 
Yeah but AA doesn't halve your framerate!

Not anymore...

To the first I would say yes, unless your framerate drops too low in absolute terms; to the second, I'd say it depends on exactly what PhysX brings you in each specific game.

I think the answer to the second is the same as the answer to the first.
Namely, as long as your framerate isn't too low, why not keep piling on the eyecandy? That's what we've been doing ever since PC gaming started, right? At least, I've always pushed my hardware as far as it can go while still allowing good playability.

The amount of framedrop isn't important. If I set a game to the lowest possible settings, I can get the highest framerates, and going to the highest settings I may get a few hundred fps less, which sounds like an enormous percentage drop... Who cares, as long as I still get good framerates?
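A quick sketch of that argument with made-up numbers (nothing here comes from a real benchmark): the percentage drop by itself tells you almost nothing, the framerate you're left with is what decides playability.

```python
# Made-up numbers to illustrate the point: the same kind of "settings cost" can
# be a huge percentage drop yet leave a perfectly playable framerate, or a more
# modest percentage drop that pushes you under a playability cutoff.
PLAYABLE_FPS = 30.0  # assumed cutoff, pick whatever you consider smooth

def drop_percent(before_fps: float, after_fps: float) -> float:
    return 100.0 * (1.0 - after_fps / before_fps)

cases = [
    (300.0, 60.0),  # 80% drop, still very playable
    (50.0, 25.0),   # "only" a 50% drop, but now below the cutoff
]
for before, after in cases:
    verdict = "playable" if after >= PLAYABLE_FPS else "not playable"
    print(f"{before:.0f} -> {after:.0f} fps: "
          f"{drop_percent(before, after):.0f}% drop, {verdict}")
```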
 
Yeah but AA doesn't halve your framerate! And of course not everyone has a 30" LCD, but then again not everyone has a GTX 285 either.

Most gamers probably have a 22" screen with an 8800/9800 GT/GTS 240 or something like that. The whole question is: is it worth it? Is 4xAA worth a 20% drop in FPS? Is PhysX worth a 50% drop?

To the former I would say yes, unless your framerate drops too low in absolute terms; to the latter, I'd say it depends on exactly what PhysX brings you in each specific game.


No, AA today doesn't reduce your frame rate by half, but at one point in history it did, and look where we are today. Now 4x and even 8x AA come with a 10-40% hit (depending on card/setup); 5 years ago you wouldn't think of doing 4xAA unless you were getting 60fps or more at high res, because the fps would tank. PhysX is the new AA. It's great when it doesn't hurt you, but when it does, you're better off playing without it.
 
Yea, that's the point.
DX10 may be less efficient, but even current cards can run 60+ fps with maxed out settings in the latest games. So it's 'efficient enough'.
I think we're past the point of being able to milk the shaders for more per-pixel detail, more polygons and all. We already have per-pixel lighting, shadowing, realtime reflections, refractions, ambient occlusion and all that.
You could try to add more eyecandy and more detail, but are people even going to notice? With Crysis DX9 vs DX10, people mostly noticed the drop in framerate; the visual differences weren't apparent to most people until you took screenshots and compared them side-by-side. I take that as a clear sign that the road of more visual detail is a dead end at this point.

Till games look like movies, there is no end to visual details.
 
Till games look like movies, there is no end to visual details.

There's more to looking like movies than just increasing the polycount ad infinitum.
I think at this point things like better AI and physics are far more important for better perceived realism than more visual detail.
Besides, it's not like DX11 is going to be THAT big of a deal; it's not going to close the gap with movies :)
 
Well, I didn't run the tests earlier to be unfair; those are the settings I've been playing the game under. I'm sure there are people with 17" CRTs buying GTX 285s, though I'm not certain how common that combo is, but I bought it since I own a 30" LCD and it's the current fastest single-GPU board available. Anyways, running it at 1920x1200 with 4xAA and high physics, I got 45fps. Getting ready for work this morning, so I don't have time for more testing.

No one said that you ran them at that resolution to be unfair; thanks for the above in any case.

There's more to looking like movies than just increasing the polycount ad infinitum.
I think at this point things like better AI and physics are far more important for better perceived realism than more visual detail.
Besides, it's not like DX11 is going to be THAT big of a deal; it's not going to close the gap with movies :)

Movies aren't interactive; that's why I don't cope well with that sort of comparison. How about some Pixar-style motion blur in a game? I guess right now we'd need more TMUs than SPs on a GPU, even with an adaptive algorithm like that, heh...
 
There's more to looking like movies than just increasing the polycount ad infinitum.
I think at this point things like better AI and physics are far more important for better perceived realism than more visual detail.
Besides, it's not like DX11 is going to be THAT big of a deal; it's not going to close the gap with movies :)

Well, each new generation gets you closer.
So you agree your comment about the end of visuals is BS?
 
Well, each new generation gets you closer.
So you agree your comment about the end of visuals is BS?

No I don't. I stand by what I said, which was of course about the extra visuals in the context of DX11 and the upcoming hardware generation.
This particular step isn't going to lead to all that much of a visual impact, so it's a dead end trying to go that route. As I said, what are you going to do, tessellate the heck out of everything?
Accelerated physics gives you more bang for the buck. People see more happening on the screen, and that makes it easier to impress them.
 
Movies aren't interactive; that's why I don't cope well with that sort of comparison.

I agree; then again, I didn't make the comparison, I just responded to it.

How about some Pixar-style motion blur in a game?

Nice, but we already have pretty good motion blur in DX10. Heck, with Crysis most people didn't even see the difference between DX9 and DX10 when it came to motion blur.
The visual impact is just very minimal at this point.
 