NVIDIA: Beyond G80...

Lol I wouldn't get too excited over 2W even with the higher clocks :) That could be due to a better PCB or power circuitry too.
 
I bet this Ultra would be considered a decent value if priced in the $650 range, although I don't think NV is particularly interested in high volumes for this SKU, and will probably just go for higher margins and limited supply. I just can't help but think NV is holding something else back, simply because they don't need it yet. It will also be interesting to see how much more performance they can squeeze out of their drivers going forward with the 8-series architecture. One thing that still impresses me about the 8-series architecture is that NV can create so many different cards by varying the memory bus width, the number of stream processors, the shader clock domain, etc.
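Those SKU knobs translate directly into theoretical numbers. As a rough sketch (using the commonly reported stock bus widths and memory clocks, which may differ on factory-overclocked retail parts), peak memory bandwidth is just bus width times effective (DDR) memory clock:

```python
# Rough theoretical peak memory bandwidth for G80-based SKUs.
# Bus widths and memory clocks are the commonly reported stock values (assumption).
skus = {
    # name: (bus width in bits, memory clock in MHz; GDDR3 is double data rate)
    "8800 GTS": (320, 800),
    "8800 GTX": (384, 900),
    "8800 Ultra": (384, 1080),
}

def bandwidth_gb_s(bus_bits, mem_mhz):
    # bytes transferred per effective clock * effective (DDR) clock, in GB/s
    return (bus_bits / 8) * (mem_mhz * 2) / 1000

for name, (bus, clk) in skus.items():
    print(f"{name}: {bandwidth_gb_s(bus, clk):.1f} GB/s")
# 8800 GTS: 64.0 GB/s, 8800 GTX: 86.4 GB/s, 8800 Ultra: 103.7 GB/s
```

So on paper the Ultra's faster memory alone buys it roughly 20% more bandwidth than the GTX, before any core clock difference.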
 
I don't think cherry-picked cores would do that, lol; we would have seen it by now. The board design could, though. Actually, when I saw the pic of the first Ultra with the two six-pin connectors, I was thinking it couldn't be......
 
Cherry-picked cores? Add to that the efficient board design.

VR-Zone mentioned that there is no problem with availability.
That would somewhat contradict such a theory, unless they've been cherry-picking dies for the past six months or so, just waiting for an "unknown moving target" (the R600 XTX's "supposed" level of performance and "supposed" date of introduction).

Besides, cherry-picked cores can already be seen in the many "OC" and "Extreme" editions of the G80 GTX.
 
That's contrary to the reports from the Inq, Fuad, etc. With an MSRP of $850, a mere handful of them could be enough to avoid availability problems. :smile:


Besides, knowing Nvidia, I think they learned their lesson with the 7800 GTX 512, so they could have been cherry-picking right from the start. They've had the market to themselves since launch, just like with the 78xx series. :|
 

Well, perhaps the cooler can be saved for the next one, just like with the 7800 GTX 512MB. ;)
 
From the 8800 Ultra preview @ Beareyes (translated; factory-overclocked version at 650/1650/1130 MHz):

[Attached image: image005.jpg]


Have we seen G80 A3 before? I only remember A2's.

If it's a new spin it might make the Ultra a little more interesting. If they improved yields there might be reasonable availability (have there been any statements from people in the know claiming availability would be low? or is that just a near-universal assumption based on price?). Or there might be more overclocking headroom than on an A2 G80. Or they might have fixed some bug that was reducing perf/clock.

Or it could be something entirely different with no apparent effect :(.
 
Well, I guess they haven't been cherry-picking cores for the past six months... ;)
If I'm reading it correctly, this sample was made in the second week of March.
And, as armchair_architect spotted, since it bears a different revision (A3), it's safe to say it probably wasn't made just to be included in a very low-volume part at the top end.
Rev. A3 GTX and GTS G80s might indeed make an appearance, if they haven't already.


edit
I am curious though.
Why would they name this core the "G80-450-A3", when the GTX was the "G80-300-A2"? (Incidentally, the GTS was the "G80-100-A2".)
It doesn't seem that different from the GTX, and it has no extra/disabled functionality like the GTS.
 
It's interesting that the results from the different sources above used 650 MHz for the core but stock memory speed, and one then overclocked that. Are we sure it isn't 650 rather than 612 MHz on the core? It seems strange, or a coincidence, otherwise.

It still seems a lot of money, however; perhaps the extra memory speed will come more into its own against ATi at higher resolutions and IQ (where Nvidia has always been a bit weak), or in SLI mode.
 
It's hard for me to believe that, this time around, ATI could possibly be noticeably better in any aspect of image quality.
 
Hmm, on further investigation, if you look at the nTune GPU overclocking control panel here

http://fabu.beareyes.com.cn/2/lib/200705/02/20070502011_3.htm

I think that top radio button is the default factory option (no overclocking), and therefore the GPU is going out at 650, which is a more sensible number than 612!

As I wrote, this is a factory-overclocked version with 650/1650/1130 MHz. There may be higher-clocked ones from other AIBs, maybe at the speculated 675 MHz. ;)

Interestingly, with an overclock to 720 MHz, the shaders should go up to an amazing 1.8 GHz.
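That follows if the shader domain scales with the core clock at a fixed ratio (an assumption; the ratio here is taken from this card's 650/1650 MHz factory clocks):

```python
# If the shader clock tracks the core clock at a fixed ratio (assumption,
# based on this factory-overclocked card's 650 MHz core / 1650 MHz shaders):
ratio = 1650 / 650            # ~2.54 shader cycles per core cycle
oc_core = 720                 # overclocked core clock in MHz
oc_shader = oc_core * ratio   # projected shader clock in MHz
print(f"projected shader clock: {oc_shader:.0f} MHz")
# projected shader clock: 1828 MHz
```

So a 720 MHz core would put the shaders at roughly 1.83 GHz, i.e. the "amazing 1.8 GHz" figure.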
 