NVIDIA G92 : Pre-review bits and pieces

Now I think this does indeed hint at the G92's potential to clock much higher than what we currently see in the GT part.
Arun has been alluding to that for a long time now; he must be counting his bet money...
 
Can we expect the G92 128SP 512MB GTS to outclass the GTX? Or will its lower bandwidth, pixel fillrate, and memory capacity mean the GTX stays on top?

It depends on the number of ROPs and their final clockspeed. If that part still has 16 ROPs at 600MHz, it will remain a fair margin behind the 8800GTX at ultra-high resolutions with AA. The real question then is whether such a model even makes sense at a higher price than the 8800GT.

I'd figure that for NV to be able to ask for X amount more than for the GTs, there should be also a healthy frequency increase compared to the latter.
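To put rough numbers on the ROP argument: peak pixel fillrate is simply ROP count × core clock, and peak memory bandwidth is bus width × effective memory clock. A quick back-of-the-envelope sketch; the 8800GTX figures are the known reference specs, while the G92 GTS numbers are speculative placeholders for the rumoured part:

```python
# Back-of-the-envelope fillrate/bandwidth comparison.
# 8800GTX figures are reference specs; the G92 GTS numbers
# are speculative placeholders, not confirmed.

def pixel_fillrate_gpix(rops, core_mhz):
    """Peak pixel fillrate in Gpixels/s: ROPs * core clock."""
    return rops * core_mhz / 1000.0

def bandwidth_gbs(bus_bits, mem_mhz_effective):
    """Peak memory bandwidth in GB/s: bus width (bytes) * effective clock."""
    return bus_bits / 8 * mem_mhz_effective * 1e6 / 1e9

# 8800GTX: 24 ROPs @ 575MHz, 384-bit bus @ 1800MHz effective GDDR3
gtx_fill = pixel_fillrate_gpix(24, 575)   # 13.8 Gpix/s
gtx_bw   = bandwidth_gbs(384, 1800)       # 86.4 GB/s

# Speculative G92 GTS: 16 ROPs @ 600MHz, 256-bit bus @ 1940MHz effective
gts_fill = pixel_fillrate_gpix(16, 600)   # 9.6 Gpix/s
gts_bw   = bandwidth_gbs(256, 1940)       # ~62 GB/s

print(f"GTX: {gtx_fill:.1f} Gpix/s, {gtx_bw:.1f} GB/s")
print(f"GTS: {gts_fill:.1f} Gpix/s, {gts_bw:.1f} GB/s")
```

Even with a modest core-clock advantage, a 16-ROP/256-bit part would still trail the GTX by roughly 30% on both metrics, which is exactly where high-resolution AA performance tends to be decided.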
 
So, there are going to be 4 "8800GTS" cards? WTF? Did Nvidia sign some sort of treaty that precludes them from using anything but 8800 as the name for their cards?

The naming scheme truly sucks if all of the above is correct. I can understand that G80 boards will eventually stop being produced, but for crying out loud, name the G92 boards 8850 or something along those lines so that both partners and end consumers can tell them apart.

Or does anyone think that if I waltz into a store tomorrow and ask for an 8800GTS, the average salesperson will know which is which?
 
SLI should be considered when thinking about model names/hardware configurations/differences. My 8800GT cards don't work with my 8800GTX ones in SLI, obviously. :P
 
Sounds wonderful for someone who would like to pair two 8800GTS cards in the near future. Something isn't adding up here; is it really definite that the G92-based GTS boards are going to be called 8800GTS? If so, it'll cause one hell of a lot of confusion on all sides.
 
My 8800GT cards don't work with my 8800GTX ones in SLI, obviously. :P
So the HybridSLI slides talk about pairing lowly on-board integrated graphics with everything from an 8400 up to an 8800GTX. When (if?) HybridSLI becomes available, can we expect to be able to do HybridSLI between GT and GTX cards (or, more interestingly, between an 8600 and an 8800)?
 
I thought Hybrid SLI was targeted at the low/mid-end, where the performance benefits are worthwhile, rather than at being paired with a high-end solution?
 
It isn't a high-end solution. It just so happens to work with high-end cards.
 
My point is that as presented (IGP + low-end) it's not a high-end solution, but if you can get that side of the spectrum working together, then shouldn't it be possible to get the high end working together as well? Basically: if you have a GTX now and in a year the only thing you can find for sale is GTs, you'd still be able to SLI them together.
 
That would lead to lost sales, as fewer people would buy high-end (and thus high-margin) parts. They'd just buy the new mainstream performance card (like the 8800 GT) every time one is released.
 
Well, many do, like me: GF5900, GF6800 non-Ultra, X1800XT when the 1900 was released, later the GTO for my HTPC, and now the 8800GT. Always one step below the top dog, because of the better price/perf ratio.
 
The Inquirer is having another Nvidia-bash today.

http://www.theinquirer.net/gb/inquirer/news/2007/11/11/nvidia-mystery-thermal-analysis

The gist of it is that Nvidia supposedly requested lots of different PC chassis to be sent in for thermal analysis prior to the release of G92. The Inq reckons that the reason for this is that RV670 turned out to be unexpectedly good/early, and if Nvidia had released 8800GT as it was originally intended to be, it (8800GT) would have been 10% behind RV670 in performance.

To avoid this, Nvidia supposedly did a lot of testing to determine what was the maximum clock speed that wouldn't dangerously overheat in any chassis (using all the thermal analysis data) and then up-clocked 8800GT to only just below that level (thus pulling 10% ahead of RV670). So 8800GT with the standard single-slot cooler is supposedly operating very close to its thermal limit unless you have good case-cooling (and it is much hotter and louder than a 3850). All of the factory-overclocked 8800GT cards are presumably using non-standard dual-slot coolers.

Does anyone believe this story? :) If it's true then it might suggest that RV670 has more scope for good-quality cooling and overclocking than G92 has. It would also explain why 8800GT runs at such a high temperature in its normal configuration.
 
No.
For the simple reason that there are indeed factory "OC'ed" versions of the 8800 GT still using the default cooler.
Look no further than Zotac, for instance, where there is a "standard" 8800 GT (which actually runs 60MHz above the Nvidia reference clock of 600MHz), and then an "AMP!" edition clocked at 700MHz.

As we can see on their website, both use the very same single-slot cooler:
http://www.zotac.com/news-071029.htm

[Images: Zotac 8800GT 512MB standard and AMP! edition cards with boxes, both showing the same single-slot cooler]
 
It occurs to me: this alleged up-clocking of the 8800GT at the last minute would also explain why it is called "8800GT"; the pre-upclocked version would have been at a speed appropriate to the name.
 
You're saying it's harder to change the name than to change the clocks shipped by all the board vendors, while also giving them enough time to adjust their OC SKUs?
 
It could be that SKU/promo/marketing material had already gone to the channel or been printed. It's certainly cheaper to reflash the boards than to reprint boxes, etc. I'm not sure I entirely subscribe to this theory, mind you.
 
I know I make quips about it from time to time, but it might be a good time to ask outright: what is it with Charlie and Nvidia? I mean, all jokes about Jen stealing his girlfriend aside, when did this all start? Because it has been going on for a while now.
 