NVIDIA: Beyond G80...

The question is... do the SLI issues in Vista come from two separate cards using two PCI-e slots, or from SLI in general? If I'm remembering the interviews correctly, Vista required a separate driver for each card or something. Perhaps a GX2 card occupying only one PCI-e slot avoids those issues, and the driver could be written more easily, the way the XP driver was.
 
Vista requires that the same WDDM driver be loaded for all graphics adapters in a system, so you can't mix IHVs or run inter-IHV configurations where the drivers would have to be different.

Changes to the driver model in Vista might make SLI more of a challenge, but since you're still loading a driver instance per GPU with a GX2, I don't see how a GX2 makes it easier in either case.
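
To make the "driver instance per GPU" point concrete, here's a minimal Win32 sketch (plain EnumDisplayDevices, nothing NVIDIA- or WDDM-specific) that lists the adapters the OS enumerates; a hypothetical GX2-style board would still show up as two adapters here, each needing the same driver loaded:

```c
#include <windows.h>
#include <stdio.h>

/* Sketch: list every display adapter Windows enumerates. On an SLI pair --
   or a hypothetical GX2-style board -- each GPU appears as its own adapter,
   and under Vista every one of them must be driven by the same WDDM driver. */
int main(void)
{
    DISPLAY_DEVICEA dd;
    dd.cb = sizeof(dd);

    for (DWORD i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); ++i) {
        printf("Adapter %lu: %s%s\n",
               (unsigned long)i,
               dd.DeviceString,
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? " (primary)" : "");
        dd.cb = sizeof(dd);
    }
    return 0;
}
```

That only illustrates the enumeration side, of course; it says nothing about how the GPUs get linked for actual SLI rendering.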
 
If the naming convention holds, you'd think it's not a multi-GPU board... of course, naming strategies change all the time, so who knows. It had also better be as fast as GTX SLI, since with prices of those hovering around ~$560, anything slower would be a rather dubious price/performance proposition.

BTW, what other computer component has had the price of "owning the best" double in the last 7 years? I know that the price/performance ratio of the 8800 Ultra will be several orders of magnitude better than that of the GeForce 2 Ultra, but still, top-of-the-line CPUs have been around $1K for a decade now, while high-end GPU prices keep marching higher.
 
True, but CPU power can be used for many things out of the box; we don't see that with GPUs yet. I guess with GPGPU coming it will get there (and of course more people are folding every year too).
 
They seem to indicate that this 8800 Ultra is some sort of multi-GPU configuration. Although changing the "GX2" moniker to "Ultra" makes no sense, the $999 price tag might make sense if the 8800 Ultra is two G80s (GTX) on one PCI-e card (performance of GTX SLI?).

I'd be very surprised too, and it seems NVIDIA is doing an Intel: where Intel has its Extreme Editions, NVIDIA has its Ultra editions... all for $999.
 
Unless they're making it single-slot by using watercooling, I'm not really sure how that'd work anyway.
 
In order to be released at 1000 dollars a pop, this can't simply be a direct R600 XTX opponent the way the 7950 GX2 was to the X1950 XTX, since that was released at a comparable price point.

I wonder if there will be another SKU fitting between the GTX and the Ultra by May 1st?
 
Unless they're making it single-slot by using watercooling, I'm not really sure how that'd work anyway.

Yeah, and that's the problem with the whole "GX2" theory. G80 already consumes a hefty amount of power, and two of them don't make things any better.

Not to mention the heat these GPUs produce, which rules out any possible GX2-style SKU until NVIDIA reduces the current G80's die size, heat and power consumption.

Or maybe these "Ultra" cores are simply cherry-picked G80s on steroids, just like the 7800 GTX 512MB? So a core clock of ~700MHz or more (possibly paired with GDDR4), with limited supply, i.e. the $999 price tag.
 
Why do it though? Is being the absolute fastest with a single card really that important to Nvidia? Even if it means releasing a $999 card that is likely to be in extremely low supply? That doesn't say much for confidence in your current lineup, nor does it hint at a true refresh of the product any time soon. If this is a GX2-type product then there are other issues that would need to be addressed, such as how you make a reasonably sized product out of two GTX configurations. The heat output alone would be insane and a major hindrance. Not only that, but to me it would be far less sensible than a user just buying GTX SLI. To be honest, if this is true then I have to say it... it comes across as Nvidia being afraid of R600.
 

If they were afraid, they would have released it in the usual $599~$649 bracket to compete better.
When you have something that is 30% more expensive than the competition, you are clearly aiming at low volume right from the start, and at this point the halo effect is pretty thin anyway (not to mention it would be somewhat lost in the been-there-done-that department, given the two earlier GX2 experiments).
That wouldn't work against the huge R600 marketing push coming ahead, and they know it.
The anticipation craze is too high, and has lasted too long, to be answered with such a simplistic (and far more expensive) approach.


Complete technical side-question:
Does the G80 memory controller support Rambus XDR RAM? Or are they stuck with GDDR3/GDDR4? Is 2.8GHz GDDR4 the maximum someone can get out of Samsung now?
 
I'm pretty darned certain it doesn't. Why would it?
 

The "double GPU" theory is not the only one.
Nvidia could hypothetically use additional memory bandwidth, in order to compensate for the 384bit bus.
And in terms of high speed IC's, right now i only see a 2.4GHz+ GDDR4/XDR solution as a possible fit for this scenario.
Plus, top speed memory is still a huge added cost.
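
For a rough sense of what higher memory clocks would buy on the existing 384-bit bus, here's a quick back-of-the-envelope calculation; the 2.4GHz and 2.8GHz figures are just the data rates being speculated about in this thread, not confirmed specs:

```c
#include <stdio.h>

/* Peak memory bandwidth in GB/s = (bus width in bits / 8) * effective data rate in GHz.
   The 2.4GHz / 2.8GHz GDDR4 numbers are thread speculation, not confirmed specs. */
static double bandwidth_gbs(int bus_bits, double data_rate_ghz)
{
    return (bus_bits / 8.0) * data_rate_ghz;
}

int main(void)
{
    printf("8800 GTX, 384-bit @ 1.8GHz GDDR3:          %.1f GB/s\n", bandwidth_gbs(384, 1.8));
    printf("Speculated Ultra, 384-bit @ 2.4GHz GDDR4:  %.1f GB/s\n", bandwidth_gbs(384, 2.4));
    printf("Speculated Ultra, 384-bit @ 2.8GHz GDDR4:  %.1f GB/s\n", bandwidth_gbs(384, 2.8));
    return 0;
}
```

That's 86.4 GB/s today versus roughly 115-134 GB/s, so faster memory alone could make up a fair amount of ground against a wider bus without a new board layout.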

I just remembered that the string in the last few .inf files from the Forceware sets explicitly says "Geforce 8800 Ultra", not "8850", "8900", "8850 GX2", etc, etc...
 
Given posts 417/418, allow me to place an innocent bet :D:

- 650~670MHz core, 1600~1800MHz in the shader clock department (rough throughput math below).
- 768MB/1.5GB of 2.4GHz GDDR4 memory.
- Dual-slot height self-contained water cooling solution.
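
Just to put those guessed shader clocks in context, assuming the Ultra keeps G80's 128 stream processors (my assumption, nothing confirmed), a quick throughput estimate looks like this:

```c
#include <stdio.h>

int main(void)
{
    /* Assumption: a G80-style part with 128 stream processors. G80's SPs are
       usually rated at a MADD (2 flops) plus a co-issued MUL (1 flop) per
       clock, so both ways of counting are shown. Clocks: today's GTX (1.35GHz)
       and the two guessed Ultra shader clocks from the bet above. */
    const int sps = 128;
    const double clocks_ghz[] = { 1.35, 1.6, 1.8 };

    for (int i = 0; i < 3; ++i) {
        double madd_only     = sps * clocks_ghz[i] * 2.0;
        double madd_plus_mul = sps * clocks_ghz[i] * 3.0;
        printf("%.2f GHz shader clock: %.1f GFLOPS (MADD) / %.1f GFLOPS (MADD+MUL)\n",
               clocks_ghz[i], madd_only, madd_plus_mul);
    }
    return 0;
}
```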
 