The G92 Architecture Rumours & Speculation Thread

If you go back to early last year and the rumours then circulating from the power-supply people, it seems to me you wouldn't be shocked that the refreshes of the first Vista-era flagships weren't going to be performance barn burners. But then, in the history of the industry, refreshes have typically fit that profile anyway.

Except hasn't Nvidia committed to just a yearly release of a high-end product? Basically holiday season each year?

If they are sticking with a "high end, high-end refresh... next high end, refresh, etc." schedule, would that mean we'd only see a genuinely new product once every 2 years?

So is it your opinion then that we'll only see major improvements in speed/IQ every two years with just a minor refresh in between?

This is something I've been wondering about ever since I heard about the change. Granted, it's only a few months' difference from the more traditional 9 month cycle for major product upgrades.

Regards,
SB
 

7900 GTX -> 7950 GX2 -> 8800 GTX
All that in 10 months.

This year they've launched a somewhat minor update with the 8800 Ultra, but there's no reason (yet) to suspect that they won't deliver a high-end part in November.
 

And I strongly suspect it will be a G92x2.
 

The G7x2 was something of a rush to beat the 580.
This is not the way to make a high-performance card, in my opinion, if you have the luxury of a year to make a better part.

Duplicating your memory chips, halving your PCIe bandwidth per GPU, doubling the PCB, and making the card hard to cool (hence lower clocks): it doesn't sound right to me. Think about it.
 

There is a much simpler explanation that no one has written about yet: manufacturing process problems (high leakage, low yields). ATI has had plenty of trouble with TSMC's process technologies over the last ~2 years, and it's possible that this time NV is the one having trouble. The difference is that NV has the time to fix everything, and doesn't need to paper-launch a high-end card with 180 W power consumption.
 
What are the chances of Nvidia deliberately spreading a FUD campaign to hide their true plans... again?
Remember that it was not until a few short weeks before launch that we knew they would be doing the Unified Shader dance, despite Dave Kirk's public antics... ;)
 

The wise man never rules out the possibility in this industry. Though I think it's somewhat less likely with a refresh than with a new generation. But then, depending on how you count G70, there was definitely some sleight of hand there too.
 
Nvidia probably doesn't need a new high end, since ATI doesn't have one. Looks like things are slowing down, which really they have been for a couple of years.

$199/$229 or whatever for this new part sounds awesome, though, if it can play Crysis decently.
 

A FUD campaign a second time in a row? That would deserve an Oscar. :smile:
 
So Nvidia's upper midrange will likely be:

8700 GTS = 750 MHz, 64 shaders, 512 MB 256-bit 900 MHz GDDR3 memory

8800 GTS = 500 MHz, 96 shaders, 320 MB 320-bit 800 MHz GDDR3 memory

Which is better? Do you plump for memory amount or memory bandwidth?

I assume the 8700 GTS will be cheaper; if so, do you think the 8800 GTS will be phased out?
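
For what it's worth, the bandwidth half of that question is easy to put numbers on: GDDR3 is double data rate, so bandwidth is bus width times twice the memory clock. A quick sketch, taking the rumoured specs above at face value:

```python
# Back-of-envelope memory bandwidth from the rumoured specs above.
# GDDR3 is double data rate, so effective rate = 2 x memory clock.
def bandwidth_gb_s(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * 2 * mem_clock_mhz * 1e6 / 1e9

print(bandwidth_gb_s(256, 900))  # rumoured 8700 GTS: 57.6 GB/s (512 MB)
print(bandwidth_gb_s(320, 800))  # 8800 GTS 320:      64.0 GB/s (320 MB)
```

So the 8800 GTS would keep a ~11% bandwidth edge, and the trade really is capacity (plus shader count and clocks) versus bandwidth.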
 
But it's impossible to rule out Nvidia doing a 2xG92 to reach 1 teraflop, assuming the rumour that G92 is only capable of 64 SPs is true.
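
For scale, here's my own back-of-envelope, counting flops the way Nvidia rates G80 (518 GFLOPS = 128 SPs x 1.35 GHz x 3 flops for MADD + MUL):

```python
# What shader clock would a 2 x 64 SP part need to reach 1 TFLOPS,
# counting MADD + MUL = 3 flops per SP per clock as Nvidia does for G80?
sps = 2 * 64          # two chips, rumoured 64 SPs each
flops_per_clock = 3   # MADD (2 flops) + MUL (1 flop)
clock_ghz = 1e12 / (sps * flops_per_clock) / 1e9
print(f"{clock_ghz:.2f} GHz")  # ~2.60 GHz shader clock needed
```

That's a very aggressive shader clock, so 1 teraflop from 2x64 SPs only works if the shader domain runs far above G80's 1.35 GHz.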
 
Don't forget the ~50 gigatexel/s rumour, which would require the same texturing ability as G80, albeit at 750-800 MHz.

It seems that regardless of the shaders, the ROP structure would still need to be there (if the rumour is true).
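
A quick sanity check on that figure, assuming the G80-style texturing setup (64 bilinear filtering units) carries over unchanged:

```python
# Bilinear texel rate = filtering units x core clock.
# G80 has 64 filtering units, giving 36.8 GT/s at its 575 MHz core.
for core_mhz in (750, 800):
    print(core_mhz, "MHz ->", 64 * core_mhz / 1000, "GT/s")  # 48.0, 51.2
```

64 units at 750-800 MHz lands right on the ~50 GT/s rumour, which is why the full texturing structure would need to stay.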
 
Hi,

This was posted on another forum talking about the next high-end part...

"This November ( not in the movies ), new top dawg...single GPU ( no 7950GX2 look-alike )...768MB ( 1GB edition...highly unlikely for the first 3 months or so ) Samsung GDDR4-3200, 384bit BUS ( like G80 )...G80/G80+ SP's ( 128 or 128+ )...24xx MHz S.D. Frequency..."

That's all I can say"

What do you think?
:D
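
If that memory spec is real, the bandwidth works out like this (taking GDDR4-3200 to mean 3200 MT/s effective, with the 8800 GTX for scale):

```python
# Bandwidth implied by the quoted rumour: GDDR4-3200 on a 384-bit bus.
def bw_gb_s(bus_bits, mts):
    return bus_bits / 8 * mts * 1e6 / 1e9

print(bw_gb_s(384, 3200))  # rumoured part:            153.6 GB/s
print(bw_gb_s(384, 1800))  # 8800 GTX (900 MHz GDDR3):  86.4 GB/s
```

Nearly 80% more bandwidth than an 8800 GTX, which would certainly fit a "new top dawg".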
 