NVIDIA G92 : Pre-review bits and pieces

Arun said:
In the end, between 900MHz GDDR3 256MiB and 512MiB 8800 GT SKUs, I wouldn't expect a price difference of more than $25-30 or so, but we'll see.
Well, if this ends up being the case, I would have a very hard time recommending anyone buy the 256MB version. If the 512MB GT gets to $200 promptly, then the 256MB version would appear to me to be a pointless SKU. Bottom line: there is no way I would recommend a 256MB card to a gamer these days, regardless of the price point.

INKster, the rolleyes was for the "100x better" hyperbole. Well warranted.

INKster said:
and is not interested in going over the top with AA
I'd hardly call 4x AA over the top...
 
Well, if this ends up being the case, I would have a very hard time recommending anyone buy the 256MB version. If the 512MB GT gets to $200 promptly, then the 256MB version would appear to me to be a pointless SKU. Bottom line: there is no way I would recommend a 256MB card to a gamer these days, regardless of the price point.

I'd hardly call 4x AA over the top...
Your comments seem to indicate that you want everything for free. You want high-end gaming with high-end features for a mid-range price. Why do I say this? You keep talking about all the modern games, with features cranked up, and now 4x AA. That doesn't sound like midrange to me.

There are plenty of reasons why 256MB would be fine and you haven't even seen benchmarks yet and you are already complaining!

Lastly, having more memory doesn't always mean faster performance. Anyone remember the GeForce 5200 boards with 256 MB?

-FUDie
 
FUDie said:
Your comments seem to indicate that you want everything for free.
Except they don't. Playing at a much lower resolution = free, how? I've already said that something like Arun's suggestion would be much better balanced. I believe I mentioned earlier they could reduce the SP count. 1440x900 has about 56% of the pixels of 1920x1200 (a resolution the 8800GT still performs well at). What I am saying is that, if the goal was to reduce cost while giving the consumer the best experience possible, then sacrificing some SPs and bandwidth seems a much better trade-off than going to 256MB boards. Current and upcoming games are already using more than 256MB without AA, and it is only going to get worse.

FUDie said:
That doesn't sound like midrange to me.
So you are saying that a $200 price point is not midrange (the price the 512MB 8800GT will supposedly drop to)?
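As a quick sanity check on that pixel-count comparison, here is a minimal sketch, reading the thread's "1900x1200" as the standard 1920x1200 (WUXGA) panel resolution:

```python
# Compare total pixel counts of the two resolutions discussed above.
def pixels(width, height):
    return width * height

ratio = pixels(1440, 900) / pixels(1920, 1200)
print(f"1440x900 has {ratio:.2%} of the pixels of 1920x1200")  # 56.25%
```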
 
Except they don't. Playing at a much lower resolution = free, how?
I didn't say "they", I said "you". You want AA, you want highest quality features, you want latest games, you want it all, for $200. Sure, lots of games use more than 256 MB with all features cranked up, that's why most games allow you to dial down features.
I've already said that something like Arun's suggestion would be much better balanced. I believe I mentioned earlier they could reduce the SP count.
And lose money? If they don't yield many parts with fewer SPs, then either nvidia sells working 112 SP parts with SPs disabled (lost profits) or they don't have enough parts to satisfy the market (lost sales).
1440x900 has about 56% of the pixels of 1920x1200 (a resolution the 8800GT still performs well at). What I am saying is that, if the goal was to reduce cost while giving the consumer the best experience possible, then sacrificing some SPs and bandwidth seems a much better trade-off than going to 256MB boards. Current and upcoming games are already using more than 256MB without AA, and it is only going to get worse.
You can't sacrifice SPs unless they are already broken. Every working chip you sell as non-working is lost money. Remember the 9500 Pro? I imagine that most of those chips would have worked at least as 9700 non-Pros, which is why 9500 Pros were never around in abundance.

-FUDie
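FUDie's binning argument can be illustrated with toy numbers (all prices here are hypothetical; the only point is that down-binning a fully working die forfeits the price gap between SKUs):

```python
# Hypothetical average selling prices; real figures were never public.
FULL_SKU_PRICE = 250  # full 112-SP part
CUT_SKU_PRICE = 180   # reduced-SP part

def forfeited_revenue(good_dies_downbinned):
    """Revenue given up by selling fully working dies as the cut-down SKU."""
    return good_dies_downbinned * (FULL_SKU_PRICE - CUT_SKU_PRICE)

print(forfeited_revenue(100_000))  # 7000000
```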
 
That's exactly my point, INKster, and thank you for pointing it out. The G92 @ 256MB is loads better than an 8600 or HD2600 card, especially at the $150 price segment.

Chris
Except that the 8600/2600 series are hovering at $100 level now .. :p
 
FUDie said:
I didn't say "they", I said "you"
"They" is a pronoun; it was used in reference to the word "comments", which is plural. "Your comments seem to indicate..."

FUDie said:
If they don't yield many parts with fewer SPs, then either nvidia sells working 112 SP parts with SPs disabled (lost profits) or they don't have enough parts to satisfy the market (lost sales).
You are confusing the issue. I never said such a part had to come from G92 (or solely from it, for that matter). If G98 turns out half decent, I might prefer a 512MB version of it to a 256MB version of G92, given the resolution I play at. Furthermore, Arun made the suggestion of a 192-bit/384MB part. I tend to give him the benefit of the doubt, and I would submit Nvidia wouldn't produce such a part if it was losing them money.

FUDie said:
You can't sacrifice SPs unless they are already broken.
I wasn't suggesting otherwise (regarding G92; a different core is another matter). All I am saying is that an 8800GT core is going to be severely crippled (and thus an inefficient design in both absolute performance and performance/dollar) by only having 256MB of memory. Is that so hard to accept?

FUDie said:
You want AA, you want highest quality features, you want latest games, you want it all, for $200.
Ignoring the logical fallacy All - Resolution = All... It looks like I will get it with the 512MB 8800 GT. Your point?
 
Except that the 8600/2600 series are hovering at $100 level now .. :p

When a successor comes out there are usually two things that can happen:

- The old part disappears from the market.
- The old part gets a huge discount and sells for a much lower price.

I think that it's obvious what happened with the 8600's...


BTW, the same thing will probably happen to the HD2600 line once the HD38xx is in place.
 
When a successor comes out there are usually two things that can happen:

- The old part disappears from the market.
- The old part gets a huge discount and sells for a much lower price.

I think that it's obvious what happened with the 8600's...


BTW, the same thing will probably happen to the HD2600 line once the HD38xx is in place.
They hit the $100 mark months ago, not recently.
 
They hit the $100 mark months ago, not recently.

Yeah, but that was because the HD 2600 Pro/HD 2600 XT obviously couldn't compete against the 8600 GT/8600 GTS in the performance department, similar, to some extent, to how the HD 2900 XT couldn't touch the 8800 GTX/Ultra.
If the same happens this time between the HD 38xx and the 8800 GT's then you'll know why.
 
Yeah, but that was because the HD 2600 Pro/HD 2600 XT obviously couldn't compete against the 8600 GT/8600 GTS in the performance department, similar, to some extent, to how the HD 2900 XT couldn't touch the 8800 GTX/Ultra.
If the same happens this time between the HD 38xx and the 8800 GT's then you'll know why.
Err? Weren't you trying to justify the $100 mark with the 8800 GT's introduction? :LOL:

I don't think a 256MB 8800GT would even come close to cannibalizing the 8600 series, given the price disparity. The 512MB 8800GT is most probably cannibalizing the 8800GTS/GTX series even when it's not in stock. :smile:
 
You are confusing the issue. I never said such a part had to come from G92 (or solely from it, for that matter). If G98 turns out half decent, I might prefer a 512MB version of it to a 256MB version of G92, given the resolution I play at. Furthermore, Arun made the suggestion of a 192-bit/384MB part. I tend to give him the benefit of the doubt, and I would submit Nvidia wouldn't produce such a part if it was losing them money.
Idle speculation by someone becomes fact, gotta love it.

In any event, how many G92s would yield only a 192-bit bus? Not too many I bet. Just like ATI's R300 didn't yield many 9500 Pros. Also, I didn't say it would be a money losing product, I said it would lower margins. Any good part sold as a crippled part is money you didn't make.
I wasn't suggesting otherwise (regarding G92; a different core is another matter). All I am saying is that an 8800GT core is going to be severely crippled (and thus an inefficient design in both absolute performance and performance/dollar) by only having 256MB of memory. Is that so hard to accept?
Have you seen benchmarks?
Ignoring the logical fallacy All - Resolution = All... It looks like I will get it with the 512MB 8800 GT. Your point?
Scaling resolution doesn't mean all resources in a game will scale. Your point? Hence my comment about not using max quality.
 
GeForce 9 soon? ;)
FW156.80 said:
[DEV_0405&SUBSYS_15D21043] NVIDIA GeForce 9500M GS
[DEV_0405&SUBSYS_16341043] NVIDIA GeForce 9500M GS
[DEV_0405] NVIDIA NB9P-GE1
[DEV_0406] NVIDIA NB9P-GE

[DEV_042E&SUBSYS_17C21043] NVIDIA GeForce 9300M G
[DEV_042E] NVIDIA NB9M-GE1

I bet we will see a G92-based GeForce 9 in 2008, maybe with D3D10.1 support...
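The bracketed strings in that driver INF follow the standard Windows PCI hardware-ID layout: DEV_xxxx is the device ID, and SUBSYS_SSSSVVVV packs the subsystem ID ahead of the subsystem vendor ID (0x1043 is ASUSTeK, which fits notebook SKUs). A small sketch of pulling the fields apart:

```python
import re

# DEV_xxxx = device ID; SUBSYS_SSSSVVVV = subsystem ID + subsystem vendor ID.
LINE_RE = re.compile(
    r"\[DEV_(?P<dev>[0-9A-F]{4})"
    r"(?:&SUBSYS_(?P<subsys_id>[0-9A-F]{4})(?P<subsys_vendor>[0-9A-F]{4}))?\]"
    r"\s*(?P<name>.+)"
)

def parse_hwid(line):
    """Return the ID fields of one INF hardware-ID line, or None."""
    m = LINE_RE.match(line)
    return m.groupdict() if m else None

entry = parse_hwid("[DEV_0405&SUBSYS_15D21043] NVIDIA GeForce 9500M GS")
print(entry["dev"], entry["subsys_vendor"], entry["name"])
# 0405 1043 NVIDIA GeForce 9500M GS
```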
 
Idle speculation by someone becomes fact, gotta love it.
I'm not responsible for how others interpret what I say, damnit! ;) However, I've pretty much got the existence of a 192-bit SKU confirmed, fwiw, I'm just not 300% sure it's G92-based yet.

In any event, how many G92s would yield only a 192-bit bus? Not too many I bet.
That depends on the finer details of your redundancy mechanism but yes, given the demand at those lower price segments, you'd expect that you'd also have to sell fully usable chips for such a redundant SKU.

However, it's worth pointing out that a 192-bit PCB ought to be cheaper than a 256-bit one, and that 6 full-density memory chips are *afaik* slightly cheaper than 8 half-density chips (i.e. 384MiB vs 256MiB). So even if it wasn't justified in terms of yields, you'd still have savings elsewhere. Ideally, you wouldn't want too much demand for such a SKU though, I guess.
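The chip-count arithmetic behind that comparison, as a sketch (assuming the usual 32-bit-wide GDDR3 parts):

```python
CHIP_IO_WIDTH = 32  # data bits per GDDR3 chip (typical for these boards)

def board_config(bus_width_bits, chip_capacity_mib):
    """Chips needed to fill the bus, and the resulting total memory."""
    chips = bus_width_bits // CHIP_IO_WIDTH
    return chips, chips * chip_capacity_mib

print(board_config(192, 64))  # (6, 384): 6 full-density chips -> 384 MiB
print(board_config(256, 32))  # (8, 256): 8 half-density chips -> 256 MiB
```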

AnarchX said:
I bet we will see 2008 G92-based GeForce 9, maybe with D3D10.1 support...
I'm way ahead of you! :p http://forum.beyond3d.com/showpost.php?p=1091738&postcount=3
D9M looks spookily like G96, too...
 
me said:
Mid-November:
"Today nVidia announced that all GeForce 8 Series graphics cards will support D3D10.1 through a later driver release, which will come contemporaneously with Vista SP1.
They decided to announce this relatively late because, in their opinion, they have more to offer than just a filled-out feature list."
http://forum.beyond3d.com/showpost.php?p=1083116&postcount=1054

Interestingly, Nvidia also notes that the card anticipates DirectX 10.1, but also notes that the upgrade will not be used by most developers.
http://www.extremetech.com/article2/0,1697,2216589,00.asp

:LOL:
 
Haha, not bad! Although all this says is that the 8800GT supports 10.1, it says nothing about the rest of the G8x family. And it could be misquoting someone, but I am inclined to believe it.

Also, if the GeForce 9300M is basically a G84, is it just me or is that a massively aggressive numbering scheme, since the 9300 would be competing against the 3600? :| This doesn't make any sense! Or is NVIDIA thinking that customers compare performance based on model number, but not price? Errr!
 
DEV_042E is G86
[DEV_042E] NVIDIA G86GL-850
So 9300 would be the right name if faster SKUs like G96 come along, which will probably be the new 9600/9700 series (or NB9P-GS/GT).

But it would be a nice PR trick if NV really announces soon that all GF8 cards are capable of D3D10.1 (like the Chinese slide stated...) while the HD 2000 series is not. :LOL:
 
FUDie said:
Idle speculation by someone becomes fact, gotta love it.
Where did I say it was fact? All I said was that I believe him; I didn't say you had to or should for that matter. Your lame attempts at putting words in my mouth are getting old.

FUDie said:
Have you seen benchmarks?
Have you? /Oh Snap.

FUDie said:
Scaling resolution doesn't mean all resources in a game will scale.
I never said it did (again trying to put words in my mouth and failing miserably). In fact, I have been arguing quite the opposite; haven't you been paying attention?


P.S. Love the non-responses (guess I will have to repeat the question):
ninelven said:
So you are saying that a $200 price point is not midrange?
 