nvidia "D8E" High End solution, what can we expect in 2008?

If that was really the case, a 0.8ns GDDR3-based G92 card would have surfaced long ago...

I disagree. There's been no need for such a card. ATi still hasn't released a single GPU SKU that can compete with either the 8800 GTX or the Ultra. Nvidia should just focus on delivering that level of performance in a cheaper package with a G9x-based single-GPU SKU, and leave the highest level of performance for the dual-GPU SKU.
 
..or for next gen single-core GPU :)

I wish. I'm in the market for a new graphics card and after my experience with Crossfire X1900's in 2006, I've decided I'll hold out until the IHVs can come up with a solution that is more elegant than AFR. I want to play video games, not the waiting game. Always waiting for a working driver to come out that enables CF/SLI in the latest game is no fun.
 
GF9800GT specs :(

9800GT: 112SPs, 650/1625/1000MHz.

Well, if true, then I ask: what is going on with NVIDIA? It seems there will be NO performance boost over the GF8800GTX. Moreover, it seems the GF9800GTX will be SLOWER than the GF8800 Ultra/GTX at FullHD or 1680x1050 with AA enabled, due to its 256-bit memory bus and only 16 ROPs.
The GF9800GT is believed to be slower than the current new GF8800GTS too, according to the specs :(

It's the first time in GPU history that a forthcoming high-end GPU ISN'T visibly faster than its predecessor :( Can we at least hope that G100/GT200 will make up for this whole "GF9" situation?

Well, if true, it might mean only relatively small performance increases compared to the predecessors.

By the way, this singled-out 256-bit/16-ROP explanation has to stop at some point; read the full review and pay attention to where performance falls off in applications like this one:

http://www.computerbase.de/artikel/...ce_8800_gts_512/9/#abschnitt_colin_mcrae_dirt

Guess what: the G92s are running out of VRAM before anything else. That doesn't mean there aren't detectable differences from the lower bandwidth/ROP count, but the framebuffer that is 256MB smaller hurts a G92 far more often than the other factors.

It goes without saying, of course, that if the above is true, the naming scheme is utterly ridiculous.
 
Yeah, it seems to be.

http://bbs.chiphell.com/viewthread.php?tid=16763&extra=page=1

Which basically says:

9800GTX - 128 SPs, 256-bit, 675/1688/2200MHz
9800GT - 112 SPs, 256-bit, 650/1650/1800MHz

At least you'll finally get your Hynix .8ns GDDR3 G92 part with the GTX... 2400MHz effective GDDR3 should do quite a bit for G92, as it closes the gap with the 8800GTX considerably, and along with the improvements of G92 over G80 it should finally make for a superior product. The GT will most likely keep the same 1.0ns parts, so we do in fact have a rebranding of the 8800GT.

The move to 2400MHz GDDR3 is indeed squeezing every last damn drop out of the chip, given that it apparently can't support GDDR4. The article mentions the card was supposed to launch at the end of last year, as others on B3D have mentioned, but G80 would lose its place in the market with its release, and they need(ed) to clear G80 inventory first; hence the staggered release with the 8800GT first. That's a nice PR reason, but I imagine the real reason they rushed the 8800GT to market, instead of waiting for G80 to clear and releasing all the G92 chips simultaneously as the 9800 series, was the 3800 series and its price/performance ratio: they needed at least one SKU to compete with it.


So assuming it does use Hynix's .8ns chips, which it must since nothing else AFAIK is rated to run at 2200MHz effective (and that leaves room for "Super Mega Fragmaster Overclock Delight Editions" with faster stock memory speeds), you're paying for the ~20% bandwidth increase over an 8800GTS 512MB, 8800GT, or 9600GT.
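The ~20% figure follows from simple arithmetic: bandwidth is just bus width (in bytes) times effective data rate. A quick sketch, using the rumored 2200MHz figure above and the commonly cited retail clocks for the other cards:

```python
# Memory bandwidth in GB/s: (bus width in bits / 8 bytes per transfer)
# * effective data rate in MHz / 1000
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    return bus_bits / 8 * effective_mhz / 1000

cards = {
    "8800GT / 9600GT (256-bit, 1800MHz)": bandwidth_gbs(256, 1800),
    "8800GTS 512MB   (256-bit, 1940MHz)": bandwidth_gbs(256, 1940),
    "9800GTX rumored (256-bit, 2200MHz)": bandwidth_gbs(256, 2200),
    "8800GTX         (384-bit, 1800MHz)": bandwidth_gbs(384, 1800),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

That works out to roughly 22% more than an 8800GT/9600GT and about 13% more than an 8800GTS 512MB, while still falling well short of the 8800GTX's 384-bit bus.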

It's a cheap shot to differentiate products not by chip or capabilities so much as by simple binning of RAM chips, especially when the architecture depends on bandwidth more than most we've seen (e.g. 8800GT vs. 9600GT), but I guess that's what we'll get. It's also worth mentioning that you shouldn't hold your breath for a 1GB part; I doubt they make, or will make, 1Gb densities at that speed.
 
It would be very cynical, misleading, and pathetic of NVIDIA if those rumors turned out to be true. Such specs, in my opinion, do not live up to the "9800" name. If true, it looks like preying on the uninformed, for whom a 9800GT(X) is better than an 8800GT(X) simply because 9800 > 8800 :cry:.
 
Well, since ATi has nothing to compete with, NV can do anything they want, and maybe save the real GTX for the fall release...
 
Because it is an extraordinary product:

[attached image: resizephppk7.jpg]


:LOL:

This graphic is much older than G92 and was leaked by a trainee or someone similar on NV's FTP in summer 2007.

Seems like a self-fulfilling prophecy, in a bad way... :LOL:
 
Well, since ATi has nothing to compete with, NV can do anything they want, and maybe save the real GTX for the fall release...

Of course it's completely out of the question that they simply don't have anything better ready? :rolleyes:
 
http://www.xbitlabs.com/news/video/...erformance_Mainstream_Graphics_Processor.html

Unfortunately, things may not be very positive for pricing and availability of the GeForce 9600 GT-based graphics cards. Graphics boards based on Nvidia GeForce 9600 GT sent to X-bit labs were all based on A1 revision of the code-named G94 chip. Typically Nvidia releases its chips commercially in A2 revision. Nvidia’s code-named G92 graphics processor, which powers the latest GeForce 8800-series products, was launched as A2 with certain capabilities disabled. Those features are expected to be re-enabled only in A3 version of G92 set to be out in February or March.
I found this interesting. Certain capabilities/features not enabled in current G92s. Could be why the G94 seems to do much better than expected against a 112SP G92?

If this is true, I can see why the 9800 series is being released around the late Feb/March timeframe, since those cards would be based on the A3 revision. Maybe this 9800GTX is more than just an "OCed 8800GTS 512MB". Actually, this all makes sense: IMO, NVIDIA had to rush the 8800GT (G92 A2) because of the unexpectedly early launch of the HD38x0 series, which probably screwed up their lineup/roadmap. Wasn't the RV670 supposed to hit the market in Q1'08 so it would originally go up against the 9600GT? They just didn't want to give up that market, and since the current G80/G84 parts can't cut it in performance/price, they had to release at least one SKU, and that was based on the G92 A2. But the question is, just what are these missing features/capabilities? Any ideas?

Also, VR-Zone seems to mention that the memory clock hasn't been decided. What is the highest GDDR3 clock available today? 2400MHz (effective)?
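For reference on how the ns ratings tossed around in this thread map to those MHz figures: a chip's cycle-time rating gives the maximum I/O clock, and DDR signaling doubles it into the "effective" rate. A quick sketch:

```python
# GDDR3 chips are rated by cycle time in ns; the I/O clock is 1000/ns MHz,
# and DDR transfers data on both edges, so the effective rate is double that.
def effective_mhz(rated_ns: float) -> float:
    clock_mhz = 1000.0 / rated_ns  # e.g. 0.8ns -> 1250MHz I/O clock
    return 2 * clock_mhz           # 0.8ns -> 2500MHz effective

for ns in (1.0, 0.83, 0.8):
    print(f"{ns}ns -> {effective_mhz(ns):.0f}MHz effective")
```

So 1.0ns parts top out at 2000MHz effective, ~0.83ns parts at roughly 2400MHz, and 0.8ns parts at 2500MHz, which is why the rumored 2200MHz stock clock on 0.8ns chips would leave overclocking headroom.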

edit - I think that picture is fake. Only one SLI connector.
 