How will NVidia counter the release of HD5xxx?

What will NVidia do to counter the release of the HD5xxx series?

  • GT300 Performance Preview Articles: 29 votes (19.7%)
  • New card based on the previous architecture: 18 votes (12.2%)
  • New and Faster Drivers: 6 votes (4.1%)
  • Something PhysX related: 11 votes (7.5%)
  • Powerpoint slides: 61 votes (41.5%)
  • They'll just sit back and watch: 12 votes (8.2%)
  • Other (please specify): 10 votes (6.8%)

  Total voters: 147
It's funny, there seems to be less interest at this point in what Nvidia's high-end GPU is going to be. The interest seems to revolve around the question of when, relative to AMD's high-end GPUs.
 
It's funny, there seems to be less interest at this point in what Nvidia's high-end GPU is going to be. The interest seems to revolve around the question of when, relative to AMD's high-end GPUs.

To be fair, this thread was never about the GT300 specifically. It's been a gloating thread from the start ;)
 
To be fair, this thread was never about the GT300 specifically. It's been a gloating thread from the start ;)

I guess you're right. :oops:

There's also the DX11 vs. PhysX debate, which deserves to be spun off since it's off topic, but there isn't much of relevance to spin off. :D
 
I dunno, if it's not out until January I think that would hurt them plenty, just missing the holiday season. :???:
Pfft, the holiday season isn't important. If it does matter even remotely, then Nvidia will have launched G300 in some form or another. ;)
sarcasm re holiday season :p


I'm now addicted to using the spoiler tag. Damn you, Richard, for getting me hooked on this.
 
As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so just ran the test three times at 2560x1600.

PhysX off = 58 average fps
PhysX Normal = 35 fps
PhysX High = 28 fps

So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).
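For anyone who wants to sanity-check the "cuts the frame rate in half" claim, here's a quick back-of-the-envelope sketch (not part of the original benchmark post; the only inputs are the three averages quoted above) that turns the averages into frame times and relative drops:

```python
# Back-of-the-envelope sketch using only the three averages quoted above.
results = {"PhysX off": 58, "PhysX Normal": 35, "PhysX High": 28}

baseline = results["PhysX off"]
for setting, fps in results.items():
    frame_time_ms = 1000 / fps             # average time per frame
    drop_pct = (1 - fps / baseline) * 100  # drop relative to PhysX off
    print(f"{setting}: {fps} fps, {frame_time_ms:.1f} ms/frame, "
          f"{drop_pct:.0f}% below the PhysX-off baseline")
```

PhysX High works out to a 52% drop (28 vs. 58), so "roughly half" checks out for that run at least.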

8xMSAA at 2560? Doesn't sound like a very demanding game to me. Two questions for when you have time: how does it fare with 4xAA instead, and what's roughly the average framerate that still guarantees playability?
 
8xMSAA at 2560? Doesn't sound like a very demanding game to me. Two questions for when you have time: how does it fare with 4xAA instead, and what's roughly the average framerate that still guarantees playability?

The problem is that these very low FPS figures also show up at lower resolutions, as if there were a "CPU limited" issue (I don't know whether it's a driver or implementation problem, or whether the calculations are simply too much for present GPUs at this point), and this is on the current high-end single card. I have a GTS 250; imagine the frame rate JR showed being more than halved again, even at 1024x768, and you get the picture.
 
I think the moral of this thread is that nVidia will spread whatever FUD they feel they must in order to make up for their lack of a competing product. :yep2:

They are working with what they've got, but it's never been the case before that they've had so little to work with.

Unfortunately, whenever Nvidia is in that position, they tend to throw out what little honesty they have and seem happy to say whatever they need to say, even if it's lies and nonsense. It's one of the reasons I boycott their products. As a consumer, I just don't trust them. I feel like I'm being treated like an idiot when I see things like that "PhysX > All" PR statement, so I don't want to give them my money.
 
@neliz No, but if they give an answer like "I'm not allowed to comment", we've got them. After all, NV wouldn't get them to sign something saying they can't tell us about something they don't have :D
 
Pfft, the holiday season isn't important. If it does matter even remotely, then Nvidia will have launched G300 in some form or another. ;)

Of course, Nvidia will soon be telling us that Christmas isn't important and Easter is what consumers actually want:


Nvidia's Sr. Vice President, Investor Relations, Mike Hara, has played down the significance of Christmas at the Deutsche Bank Securities Technology Conference. Instead, Mr. Hara insists holidays like Easter, the 4th of July, and Thanksgiving are the future.

“Christmas by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build holidays, which is always good, and the new features at Christmas are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in Christmas,” explains Mr. Hara.
 
Of course, Nvidia will soon be telling us that Christmas isn't important and Easter is what consumers actually want:


Nvidia's Sr. Vice President, Investor Relations, Mike Hara, has played down the significance of Christmas at the Deutsche Bank Securities Technology Conference. Instead, Mr. Hara insists holidays like Easter, the 4th of July, and Thanksgiving are the future.

“Christmas by itself is not going to be the defining reason to buy a new GPU. It will be one of the reasons. This is why Microsoft is working with the industry to allow more freedom and more creativity in how you build holidays, which is always good, and the new features at Christmas are going to allow people to do that. But that no longer is the only reason, we believe, consumers would want to invest in Christmas,” explains Mr. Hara.

Someone should make a generator for this, where you pick 2 words (e.g. GPGPU and DX11, Easter and Christmas, etc.) and it generates an Nvidia statement. :D
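Not hard to knock together, actually. Here's a toy sketch of such a generator (entirely my own; the template just riffs on the Hara quote above and the topic list is made up for illustration):

```python
import random

# Toy "Nvidia statement generator": pick two topics and slot them into a
# PR-style template. Template riffs on the quote above; topics are invented.
TOPICS = ["GPGPU", "DX11", "PhysX", "Easter", "Christmas"]

TEMPLATE = ("{a} by itself is not going to be the defining reason to buy a "
            "new GPU. It will be one of the reasons. But we believe {b} is "
            "what consumers actually want.")

def generate_statement(a=None, b=None):
    # If no topics are given, pick two distinct ones at random.
    if a is None or b is None:
        a, b = random.sample(TOPICS, 2)
    return TEMPLATE.format(a=a, b=b)

print(generate_statement("DX11", "GPGPU"))
print(generate_statement())  # random pairing
```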
 
8xMSAA at 2560? Doesn't sound like a very demanding game to me. Two questions for when you have time: how does it fare with 4xAA instead, and what's roughly the average framerate that still guarantees playability?

Proves the point I made earlier though... Even current cards can get 60+ fps at very high resolutions.
The game is capped at 60 fps anyway, so what point is there in a faster AMD card?
It may be faster than the GTX285, but the GTX285 can already do 60+ fps at 2560 res with 8xMSAA and 16xAF. Where do you go from there?
 
Proves the point I made earlier though... Even current cards can get 60+ fps at very high resolutions.
The game is capped at 60 fps anyway, so what point is there in a faster AMD card?
It may be faster than the GTX285, but the GTX285 can already do 60+ fps at 2560 res with 8xMSAA and 16xAF. Where do you go from there?

eyefinity?
 
eyefinity?

I don't see myself splashing out tons of cash on a bunch of monitors... heck, I don't even have the room for them on my desk. I think that goes for most people: they neither want to spend the money nor give up the room required for such a setup.
As I said before, I think it's a niche feature.
 
Someone should make a generator for this, where you pick 2 words (e.g. GPGPU and DX11, Easter and Christmas, etc.) and it generates an Nvidia statement. :D

Maybe ask Charlie if you can borrow his article generator?
 
As promised, here are some quick Batman: AA numbers using the in-game benchmark. Test system is a Core i7 965, 6GB, GTX 285, VelociRaptor/Intel X-25M SSD, 190.38 driver set. All in-game settings maxed out, 16x AF set via the control panel (8x AA via the app). I don't have much free time tonight, so just ran the test three times at 2560x1600.

PhysX off = 58 average fps
PhysX Normal = 35 fps
PhysX High = 28 fps

So at that res, with those settings, PhysX literally cuts the frame rate in half (at least in that benchmark, which specifically focuses on scenes using it). The difference between high and normal is cloth physics (no fluttering flags or cobwebs at normal).

What am I missing here? I thought that by buying an nVidia card (and one of the best), you get a lot of visual extras and the card takes care of the math involved? Then why does the fps tank? The 285 is quite a capable card, no?

Looks like strange marketing to me: "Buy a PhysX card with lots of visual extras in games and get your fps halved at the same time." With fps nearly halved, why couldn't these extras be run on the CPU and an AMD GPU instead?
Please explain, as I feel very confused at the moment.

With DX10.1, weren't games (like Assassin's Creed, before a patch took it away) both looking better AND running faster?
 
This pertains to all video cards

Actually, I'm more interested in minimum fps than in maximum or average frames, because the limiting factor in a game is always the minimum frame rate: they could render something with an average fps of over 100, but if the minimum fps is something like 6, the gamer will absolutely see the difference.
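To illustrate the point with a made-up frame-time log (the numbers are purely hypothetical, just to show how a high average can hide visible stutter):

```python
# Hypothetical frame-time log: mostly 8 ms frames (125 fps) with a few
# 160 ms hitches (6.25 fps). Numbers are invented purely for illustration.
frame_times_ms = [8] * 495 + [160] * 5

total_seconds = sum(frame_times_ms) / 1000
avg_fps = len(frame_times_ms) / total_seconds  # average over the whole run
min_fps = 1000 / max(frame_times_ms)           # worst single frame

print(f"average: {avg_fps:.0f} fps, minimum: {min_fps:.0f} fps")
# -> roughly "average: 105 fps, minimum: 6 fps": the average looks great,
#    but the hitches are exactly what the player notices.
```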
 
What am I missing here? I thought that by buying an nVidia card (and one of the best), you get a lot of visual extras and the card takes care of the math involved? Then why does the fps tank? The 285 is quite a capable card, no?

Looks like strange marketing to me: "Buy a PhysX card with lots of visual extras in games and get your fps halved at the same time." With fps nearly halved, why couldn't these extras be run on the CPU and an AMD GPU instead?
Please explain, as I feel very confused at the moment.

Have you ever come across a game that does NOT drop in fps when you go from low to higher detail settings?
There's no such thing as a free lunch.
 