NVIDIA G92: Pre-review bits and pieces

Who in their right mind would buy a 256MB card these days? The idea of a good midrange card is to be able to play at lower resolutions with ALL the eye candy. 256MB is already obsolete for several games and many more soon to be released. It is simply a hard cap that will never allow the user to play current and future games at the settings he desires. Absurd.
 
The clear cost advantage doesn't matter when your product margins are higher than your corporate margins anyway, which they are with G92 (see Q3 CC, 'well above corporate margins', presumably 55% or higher). When you're at that point and your corporate goal is to maximize margins, the goal should be to sell as many chips as you possibly can produce.

As for AIBs, remember that this will be on a cheaper PCB and, AFAIK, with 700MHz GDDR3... The latter is ridiculously cheap nowadays, so I'm not sure why anyone is surprised. Now, if it was 900MHz, that would be even more impressive, I guess.

As for 256MiB, I agree it's not a very good purchase decision BUT this is for the 150 euros market. That's 1280x800 to 1280x1024, which is only going to take ~10MiB for a non-HDR framebuffer (~15MiB including the backbuffer). So textures are the primary problem here, but lower texture settings aren't going to be as noticeable at lower resolutions. Same for smaller shadowmaps etc.
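
Quick back-of-the-envelope in Python, for anyone who wants to check that framebuffer math (the RGBA8 color plus 32-bit depth/stencil assumption is mine, but it's the common non-HDR case):

```python
# Rough framebuffer memory estimate for a non-HDR render target, no AA.
# Assumes a 32-bit (RGBA8) color surface and a 32-bit depth/stencil surface.

def surface_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

w, h = 1280, 1024
color = surface_mib(w, h)   # front buffer
depth = surface_mib(w, h)   # depth/stencil
back  = surface_mib(w, h)   # back buffer

print(f"color + depth:      {color + depth:.1f} MiB")          # ~10 MiB
print(f"plus a back buffer: {color + depth + back:.1f} MiB")   # ~15 MiB
```

So the render targets themselves barely dent 256MiB at these resolutions; textures are what eat the rest.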

So yes, it's not ideal, but you shouldn't be expecting too much for that price point either.
 
Arun said:
So yes, it's not ideal, but you shouldn't be expecting too much for that price point either.
I don't think 512MB of 700MHz GDDR3 is too much to expect. Even if Nvidia or ATI thought otherwise, they could've cut back on the core; I'd much rather have a 96sp or 240sp part with 512MB of slower GDDR3 than a blazing fast core that is going to waste because it is choked by only having 256MB to work with.
 
I don't think 512MB of 700MHz GDDR3 is too much to expect. Even if Nvidia or ATI thought otherwise, they could've cut back on the core; I'd much rather have a 96sp or 240sp part with 512MB of slower GDDR3 than a blazing fast core that is going to waste because it is choked by only having 256MB to work with.

If there's not much redundancy in the core, this is not a very cost-effective solution. Lowering the amount of memory has immediate and obvious benefits to price. For 1280x1024 and 1440x900 resolutions it could do quite well.
 
ChrisRay said:
immediate and obvious benefits to price

And an immediate and obvious negative effect on performance/dollar.
ChrisRay said:
For 1280x1024 and 1440x900 resolutions it could do quite well.
Maybe if you don't use AA...

From here: Link
(without AA)
Stalker Results said:
Although the maximum preset does not push all of the detail sliders to their highest values, it's still enough to demand over 400MB of onboard RAM at 1600 x 1200.
Bioshock Results said:
Even so, as one can see, at the maximum possible DX9 settings, the game can easily use 300MB of local memory.
World in Conflict Results said:
The same issue with AA and memory recorded in STALKER seems to be present here, but note that for any resolution, at maximum settings, 256MB of RAM isn't enough to avoid constant data swapping. Obviously one can scale the detail down to fit into that amount of local memory, but it's nowhere near as pretty that way!
COD4 Results said:
Permission to say "ouch", Sergeant? Lowest resolution, no AA, on max details = not even for 256MB or 320MB VRAM, thank you very much. Call of Duty 4 is not only about modern warfare but modern graphics cards too.
Crysis Results said:
On the High DX9 detail setting, the usage is still more than 256MB, but overall it's actually less than we've seen in Call of Duty 4 above.
 
For 150? Do you really expect to use maximum settings? I'm well aware of the memory usage in modern games, but the card would likely not exist at said price point if it were a 512MB card.

If you want the extra performance, pay a small premium. Consider the current 100-150 market and tell me that a 256 meg G92 will not be 100x better than what is currently being offered.
 
So why not bump up the price a small amount?

ChrisRay said:
tell me that a 256 meg G92 will not be 100x better than what is currently being offered
:rolleyes: Ok, it won't be.

ChrisRay said:
Consider the current 100-150 market
So you are saying Nvidia is going to leave a $100 gap in their lineup: a $150 MSRP for the 256MB 8800GT and a $250+ MSRP for the 512MB 8800GT? The key price point is sub-$200 anyway; 512MB cards have been done at this level many times in the past... I don't see how I'm being unreasonable.
 
The clear cost advantage doesn't matter when your product margins are higher than your corporate margins anyway, which they are with G92 (see Q3 CC, 'well above corporate margins', presumably 55% or higher).

We're still thinking G92 is going into more than the 8800GT, aren't we? I'd assume the margins are a blended average if so. Otherwise, I think it's pretty unimaginable by historical standards that they could do that at the GT-only price points with a chip that big.
 
We're still thinking G92 is going into more than the 8800GT, aren't we? I'd assume the margins are a blended average if so. Otherwise, I think it's pretty unimaginable by historical standards that they could do that at the GT-only price points with a chip that big.
Actually my guess is it's NOT a blended average, and that's why the claimed margins are so high.

Say only 60% of the chips on a wafer could work as 8800 GTs, but 80% could work either as 8800 GTs or as something else with more redundancy. Then one simple way to calculate margins is to assume 80% yields for *all* chip sales. This results in higher margins for the 8800 GT, and lower margins for lower-end SKUs.
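
To illustrate what I mean, here's a toy sketch; every number in it (wafer cost, dice per wafer, yields, ASP) is invented purely for illustration, not anything from NVIDIA:

```python
# Toy illustration of how costing every sold chip at the blended yield
# inflates the top SKU's reported margin. All numbers are invented.

wafer_cost = 5000.0   # USD per wafer (hypothetical)
dice       = 100      # candidate dice per wafer (hypothetical)
gt_yield   = 0.60     # fraction good enough for an 8800 GT
any_yield  = 0.80     # fraction usable as a GT *or* a cut-down SKU
gt_asp     = 120.0    # hypothetical ASP per GT chip

cost_true    = wafer_cost / (dice * gt_yield)   # charge the GT its true yield
cost_blended = wafer_cost / (dice * any_yield)  # charge everyone the 80% yield

for name, cost in (("true-yield", cost_true), ("blended-yield", cost_blended)):
    print(f"{name}: chip cost ${cost:.2f}, GT margin {(gt_asp - cost) / gt_asp:.0%}")
```

Same chip, same ASP, but the blended accounting shifts wafer cost onto the salvage SKUs and makes the GT's margin look much fatter.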

And yes, I'm still betting on my 6C/192-bit SKU! ;) Plus, for those above me bitching about 256MiB not being enough... would 384MiB be enough? (for less than the 256MiB 8800 GT, presumably)

P.S.: ninelven, do you realize that the long-term 8800GT MSRP is $199? (that's presumably post-holiday though, but the gap should never be anywhere near $150-250. Do you realize 150 euros is pre-VAT, anyway?)
 
It's not about margins at all for DAAMIT. They need the cost advantage in order to sell cards. If they had exactly matching prices (assuming roughly similar performance), the game would already be lost. Their only chance to increase their market share this generation is with lower prices.

Talking average Joes here, of course. All these people know at the moment is "OMFG, NV is t3h king" from every benchmark in every magazine everywhere for more than a year now.
 
Arun said:
would 384MiB be enough?
Well, it would certainly be a better balance than 256MB.... Whether it would be enough or not, I don't know. It would probably be sufficient with no AA; but 4x would probably see it choking again in some titles. Would be interesting to see some testing on this matter though.
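
To put rough numbers on the AA concern, here's a sketch under my own assumptions (plain 4x MSAA with multisampled color and depth surfaces plus a resolve target, RGBA8 color, 32-bit depth, no framebuffer compression):

```python
# Rough render-target footprint with and without 4x MSAA, ignoring
# framebuffer compression and driver overhead. RGBA8 color + 32-bit depth.

def mib(w, h, bytes_per_pixel=4, samples=1):
    return w * h * bytes_per_pixel * samples / 2**20

for w, h in ((1280, 1024), (1600, 1200)):
    no_aa = 2 * mib(w, h)                         # color + depth, 1 sample each
    aa4x = 2 * mib(w, h, samples=4) + mib(w, h)   # 4x color + 4x depth + resolve
    print(f"{w}x{h}: no AA ~{no_aa:.0f} MiB, 4x MSAA ~{aa4x:.0f} MiB")
```

That extra ~35-50MiB of render-target footprint comes straight out of whatever is left for textures on a 256MiB card.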

Arun said:
ninelven, do you realize that the long-term 8800GT MSRP is $199
No, I was not aware of this, and it would be great if true. Still, it doesn't address the uselessness of 256MB cards in upcoming games. Mind if I ask where this comes from though?

Arun said:
Do you realize 150 euros is pre-VAT, anyway?
Eh, what do I care? I thought people outside the U.S. were used to getting fleeced for hardware... :devilish: j/k guys.
 
No, I was not aware of this, and it would be great if true. Still, it doesn't address the uselessness of 256MB cards in upcoming games. Mind if I ask where this comes from though?
http://anandtech.com/video/showdoc.aspx?i=3140&p=5

Eh, what do I care? I thought people outside the U.S. were used to getting fleeced for hardware... :devilish: j/k guys.
The point is that 150 euros pre-VAT is a fair bit more than $150 USD ;) Yes, even after taking into account the EU tends to get shafted. So the *gap* between the SKUs isn't as big as you seem to be thinking it is. In the end, between 900MHz GDDR3 256MiB and 512MiB 8800 GT SKUs, I wouldn't expect a price difference of more than $25-30 or so, but we'll see.
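
To make the gap concrete (a quick sketch; the ~1.45 USD/EUR rate and the 19% VAT figure are my assumptions, and VAT varies by country):

```python
# Rough check of the 150-euro-pre-VAT vs. $150 comparison.
# Exchange rate (~late 2007) and VAT rate are assumptions.

eur_pre_vat = 150.0
usd_per_eur = 1.45   # assumed exchange rate
vat_rate    = 0.19   # assumed VAT rate

print(f"150 EUR pre-VAT ~= ${eur_pre_vat * usd_per_eur:.0f} USD")         # ~$218
print(f"EU shelf price  ~= {eur_pre_vat * (1 + vat_rate):.0f} EUR inc. VAT")
```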
 
:rolleyes: Ok, it won't be.

So, do you think the current 8600 GT/GTS or the HD2600 Pro/XT at that price-point are better cards than this 8800 GT 256MB or even the HD3850? :p
I'd say that the last two are an excellent value to anyone who doesn't go above 1280 x 1024 and is not interested in going over the top with AA and AF, at least compared to the former occupants of this price segment.

edit
Geo doesn't like "rolleyes". Ban the smiley, I say. ;)
 
Some day when I run wild banning users left and right, when Rys finally catches me to ask wtf is going on, I'll almost certainly answer "One rolleyes smiley too many, man!".
 
So, do you think the current 8600 GT/GTS or the HD2600 Pro/XT at that price-point are better cards than this 8800 GT 256MB or even the HD3850? :p
I'd say that the last two are an excellent value to anyone who doesn't go above 1280 x 1024 and is not interested in going over the top with AA and AF, at least compared to the former occupants of this price segment.

edit
Geo doesn't like "rolleyes". Ban the smiley, I say. ;)

That's exactly my point, inkster, and thank you for pointing it out. The G92 @ 256 megs is loads better than an 8600 or HD2600 card, especially at the 150 price segment.

Chris
 
That's exactly my point, inkster, and thank you for pointing it out. The G92 @ 256 megs is loads better than an 8600 or HD2600 card, especially at the 150 price segment.

Chris

But this raises an interesting question.
Nvidia states that a 256MB model can have GDDR3 anywhere between 1400 and 1800MHz effective.

What are the odds of a 1400MHz 512MB 8800 GT ever seeing the light of day? I know there are HD3850s with 512MB, so why not a "light" 8800 GT 512MB? ;)
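
For context, here's the bandwidth spread that clock range implies (a quick sketch; it assumes the 256MB card keeps the full 256-bit bus):

```python
# Bandwidth implied by the quoted GDDR3 range on a 256-bit bus.
# "MHz effective" is the DDR data rate, so GB/s = rate * bus_width / 8.

BUS_BITS = 256  # assumed bus width

for effective_mhz in (1400, 1800):
    gb_per_s = effective_mhz * 1e6 * BUS_BITS / 8 / 1e9
    print(f"{effective_mhz} MHz effective -> {gb_per_s:.1f} GB/s")
```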
 
But this raises an interesting question.
Nvidia states that a 256MB model can have GDDR3 anywhere between 1400 and 1800MHz effective.

What are the odds of a 1400MHz 512MB 8800 GT ever seeing the light of day? I know there are HD3850s with 512MB, so why not a "light" 8800 GT 512MB? ;)

Honestly, I don't know. I've been briefed about some more upcoming products recently, but I haven't heard anything about a "light" 8800GT 512. I guess it depends on how much Nvidia wants to enforce their reference design.
 