Nvidia "D8E" high-end solution, what can we expect in 2008?

Just how exactly do you expect them to use this "spare" time to "improve" the profitability of the chip? I assume the chip is taped out or very near to it, and I don't know what you can do at this point to improve margins other than disabling portions.

There's word of the 65nm D9E coming on February 7. Compared to the original target of late 2007, maybe that buys some time to get a slightly better stepping out. Obviously the high-end Christmas sales are left to G80... and it doesn't feel unusual to see the new stuff coming in January or February.
 

Yeah, I'd posted it here before.

G80 High-End sales this Christmas will only make sense if the buyer plans to do Tri-SLI.
Otherwise, one or two "G92" 8800 GTS 512MB cards are much more reasonable purchases, since they'll come very close to, or perhaps even surpass, the GTX/Ultra.
 
Doing a more substantial redesign of a GPU architecture costs money. A lot of it.
So if you can skip that part once in a while then you're already improving the company's bottom line.
ATI did largely that with R420/R480, collecting the benefits of R300, R360, etc.

In the case of the Geforce 8, the mid-life upgrade from the 8800 GTX to the 8800 Ultra consisted of little more than a slightly revised cooler and higher clock speeds on the same core; not even the PCB was changed in any significant way.
That didn't happen with the top Geforce 6's (there were 130nm and 110nm versions) or with the Geforce 7's (there were 110nm and 90nm cores, as well as two GX2 models and 7x50 speed-bump variants).

Not quite true. Nvidia did TRY a 6800 Ultra "refresh", but it was so horribly overclocked that they couldn't cherry-pick enough chips to make it a retail reality. However, they WERE able to rain on ATI's launch by making sure the 6800 Ultra was available to reviewers.

Likewise with the 7800 GTX 512 "refresh": again they had a difficult time hand-picking chips in sufficient numbers at the required clock speeds to make more than a token appearance at retail. I still remember multiple friends who waited months camping Newegg, hoping one would come in stock, for a chance to pay over 1000 USD for a card that was only marginally faster than an R520.

Regards,
SB
 

The 6800 Ultra 512MB "refresh" consisted of a different PCB, double the memory of its predecessor and very slight clock increases. I don't think anyone believed it was meant to become a market commodity (especially since they are, to this day, extremely rare).
It's as you said, a reviewer's card, much like the X700 XT, for instance.

As for the 7800 GTX 512MB, it was a different story.
Yes, the cooler was different and the PCB was mostly the same, but there were very substantial clock increases. The core was raised a full 100MHz above the standard GTX -something that was difficult to accomplish on the cheaper model even with overclocking-, and the memory consisted of the -then rare and expensive- 1800MHz GDDR3 chips from Samsung.
And, despite the bad reputation, there were indeed quite a few more GTX 512 cards in the wild than when Nvidia silently announced the 6800 Ultra "Extreme".
The outrageous price-tag led to it being very unpopular, and it did have a mere 4 months at the top before the 7900 GTX came out in March 2006.


In the current "halo product segment", the 8800 Ultra faces nothing quite like that situation. Even the 8800 GTX couldn't be beaten by the HD 2900 XT (512MB or 1GB), so the Ultra could have been a rare bird if Nvidia had wanted it to be.
I don't think that's the case, however, since you've been able to easily buy an Ultra -if you have the money for it, anyway- ever since it was announced.
 
Hmm, there are two versions of the NV next-gen GPU rumours: about two weeks ago DigiTimes reported that the GF9 series will be launched in February, but today Fudzilla is writing that D8E (a 2xG92 card with a dual PCB) is ready to launch in January too... So I doubt that NVIDIA will push two GPUs in two months (D8E in Jan and D9E in Feb)...
So which source is telling the truth?? :)
 

2xG92 is not a next-gen GPU; it's just another SKU. They will launch an SKU with a fully enabled G92 just a few days from now. A January launch of a 2xG92 board seems highly probable to me.

And I don't think we will see D9E as early as February :cry:.
 
D9E and D8E are perfectly compatible - 7800 GX2, anyone?

Whether due to driver immaturity or some other reason, the 7800 GX2 was far from exciting, and I sure hope we won't see another thingy like that.
 
Sure, I agree - what I meant by that is NVIDIA worked on the 7800 GX2 even though they knew there'd be a 7900 GTX very shortly afterwards, and then a 7950 GX2. The same could very well be true here; I'm not trying to imply anything about the quality of the different products, since I have no idea whatsoever on that.
 
The 7800GX2 wasn't an official product, AFAIR.
But I agree that NV can release a G92 GX2 SKU in January and a new top-end chip in February -- I don't see anything that makes this impossible.
 

The 7800GX2 was a retail product. Actually, I've seen one recently on the second-hand market; it had some very fancy cooler. I guess you mean the 7900GX2, which was afaik for OEMs only.
 
FWIW, my current theory is that D9E is an A11, which explains the ridiculously narrow time gap between D9E and D8E. Theoretically, that might also imply D9P/D9M aren't back yet, and June 2008 would be the target for an A12.
 

Maybe not so ridiculous if Nvidia decides to position a dual-G92 only slightly above the new 8800 GTS 512MB.
That would still leave plenty of space for an "Ultra-like" Geforce 9 to show up at the very top-end of the market and have the two in the market simultaneously until early Summer.
 
Well, that's possible. It'd certainly help to have an idea of the die size for D9E...
 
Well, I think that if they go for a GX2-like card, it's not the right choice... Don't you think that solution is more expensive than one bigger single GPU (D9E), even if it had a 512-bit memory bus?? OK, NVIDIA did this with G71, but as we remember, that chip is much smaller, cooler and cheaper than G92 (196mm^2 vs 330mm^2)...
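The die-size argument can be put into rough numbers. A minimal back-of-envelope sketch (my own illustration, not anything from NVIDIA), using the common dies-per-wafer approximation for 300mm wafers and a purely illustrative Poisson yield model -- the defect density here is an assumption, not a real fab figure:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Common approximation: wafer area / die area, minus an edge-loss term."""
    radius = wafer_diameter_mm / 2.0
    wafer_area = math.pi * radius * radius
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def yielded_dies(die_area_mm2: float, defect_density_per_cm2: float = 0.4) -> int:
    """Toy Poisson yield model: Y = exp(-D * A), with A converted to cm^2."""
    y = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    return int(dies_per_wafer(die_area_mm2) * y)

# G71-class die (~196 mm^2) vs G92-class die (~330 mm^2)
print(dies_per_wafer(196), dies_per_wafer(330))  # candidate dies per wafer
print(yielded_dies(196), yielded_dies(330))      # after the toy yield model
```

Roughly 310+ candidate dies per wafer for a 196mm^2 chip versus under 180 for a 330mm^2 one, and larger dies also catch more defects, so two G92s really do cost considerably more silicon than two G71s ever did.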

I hope G92 is something like NV42, aka the GF 6800 GS, which was nothing more than NV40 on the 110nm process; a few months after it, NVIDIA released G70. I believe it will be the same with G92 and D9E...
Another possibility is that NVIDIA wants to launch a 2xG92 dual-PCB card in Jan/Feb because they want to push D9E to 55nm later next year??
 
I do not see the point of a G92 X2 product and a D9E released so closely together, and I doubt this very much. Why spend all that money creating and testing a new PCB, etc., when you have a better product coming out within 30 days?

On the other hand, you could have AIB partners creating them, like the Asus and Gigabyte dual-GPU boards of yore; I can see that happening.

Perhaps G92 X2 is just another red herring from nvidia to muddy the waters for onlookers?
 
As I said, if D9E is indeed an A11, then presumably their roadmap was based on it needing a respin. D8E and D9E about 1.5 months apart might not make sense, but 3.5-4 months apart presumably would have. Now, once you realize that, the PCB might be pretty much done, and it makes more sense to keep D8E anyway. However, I don't know if D9E is an A11, but it'd certainly explain quite a few things.
 