nvidia "D8E" High End solution, what can we expect in 2008?

G92 sits at roughly 750M transistors, and I'm afraid we still haven't seen its peak frequencies if there's going to be a high-end single-chip variant of it.

If you take the purely hypothetical case of 1800M vs. 750M, that's a 2.4x increase in transistor count. If the 1.8B figure is anywhere close to the truth and the result is barely 50-60% faster than a single G92, then there's something horribly wrong with that chip.
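Just to put numbers on that back-of-envelope comparison (both transistor counts are rumoured figures, not confirmed specs):

```python
# Hypothetical high-end part vs. G92 -- both counts are rumours.
g92_transistors = 750e6   # rumoured G92 transistor count
new_transistors = 1800e6  # rumoured 1.8B high-end chip

ratio = new_transistors / g92_transistors
print(f"{ratio:.1f}x the transistors of G92")  # 2.4x
```

So a chip with 2.4x the transistor budget delivering only a 50-60% gain would indeed be a poor return.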

No no, I mean it should be about 50-60% faster than a single G92 if it has about 1 billion transistors :) With 1.8B it should give a 100% or greater performance boost imo ;)
 
I really hope there is such a chip due to be released imminently, but I somewhat doubt it. 192SPs w/384-bit memory controller would be a helluva chip, and much preferred over 2x G92.
 
What is going on with the rumours and information regarding nVidia? They're very sparse and/or contradictory.

The old rumours and information on the web for other launches were much more consistent. It seems like there is a lot more misinformation this time around.
 
I really hope there is such a chip due to be released imminently, but I somewhat doubt it. 192SPs w/384-bit memory controller would be a helluva chip, and much preferred over 2x G92.

Hmm, but rumours were that the GF9800GTX/GT are based on a very highly clocked G92, and G92 has 128SP, not 192, so if there really will be 192SP, that means it's not G92 :)
 
Hmm, but rumours were that the GF9800GTX/GT are based on a very highly clocked G92, and G92 has 128SP, not 192, so if there really will be 192SP, that means it's not G92 :)

I don't completely understand why there can't be more than 128 shaders when there is already a 112SP variant of the G92.
 
Hmm, but rumours were that the GF9800GTX/GT are based on a very highly clocked G92, and G92 has 128SP, not 192, so if there really will be 192SP, that means it's not G92 :)

The key phrase there being "rumours were" ;) There's too much FUD floating around right now, but the rumour du jour happens to suggest a 192SP/384-bit MC part.
 
I don't completely understand why there can't be more than 128 shaders when there is already a 112SP variant of the G92.

Do you think it's possible to do a 192SP GPU with "only" ~750M transistors? G80 has 128SP with ~680M, and that's with NVIO outside the chip. Maybe I'm wrong, but I think if it really is a 192SP/384-bit GPU, it couldn't be G92. There must be another chip with more SPs and a wider memory interface.
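A rough sanity check on that, naively scaling G80's die by SP count (a crude assumption, since ROPs, memory controllers, etc. don't scale with the shader array, so this is only ballpark):

```python
# Crude scaling estimate: assume transistor count grows linearly with SPs.
# G80: ~680M transistors for 128 SPs (NVIO excluded), per the post above.
g80_transistors = 680e6
g80_sps = 128
target_sps = 192

estimate = g80_transistors * target_sps / g80_sps
print(f"~{estimate / 1e6:.0f}M transistors for a 192SP part")  # ~1020M
```

Even this rough estimate lands well above G92's rumoured ~750M, which supports the argument that a 192SP/384-bit part would have to be a different chip.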
 
So in your opinion all these rumours about 192SP/384-bit are BS? :)
I was talking about G90 =) Rumours of its existence are false. There is no such chip now.
As for the chip with 192 SPs -- it could be what G90 was supposed to be (before it was cancelled -- if it ever existed at all), or it will be part of the G10x line (or should I say G1xx? I'm not sure that the next NV generation after G10x will be G11x; it might well be G2xx instead).
 
According to Fud

Hacks from TechConnect Magazine have managed to score a screenshot of the upcoming 9800 GX2. There is nothing new on the design side, as the card features the same reference design, with a partner's sticker covering half of the top cooler cover. The card's clocks are far higher than the 600/1500/2000 MHz GPU/shader/memory speeds that were reported earlier.

Anyway, this card comes with 1GB of GDDR3 memory that works at 2400MHz. We doubt this figure, as it sounds a tad too high.

News Source: http://www.fudzilla.com/index.php?option=com_content&task=view&id=5818&Itemid=1
Screenshots: http://www.tcmagazine.com/comments.php?id=18247&catid=2
 
The rumours on the GX2's frequency are funny, to say the least: first they started at 550MHz, then it grew to 600MHz, and today it's supposed to be 660MHz. Can they finally make up their minds? LOL :D
 
With the sudden rumours of a higher-clocked GX2 versus the previously speculated 600/1500/1000 specs, could this be a new revision of G92? Just like the 8800 Ultra with its A3-revision G80 core, which allowed much higher clocks compared to the A2.

I thought the biggest issue this card faced was thermals/power, yet they up the clocks? This doesn't really make any sense. We know the GX2 was delayed, but maybe they decided to tweak it further to allow higher clocks while maintaining an acceptable TDP?
 