NVIDIA: Beyond G80...

G84 as an 8800?!? ;)
Could this be the rumored 256-bit variant of the 8600 GTS showing its little face, disguised as an innocent mobile variant of the "8800" family?

I think not, because G84 has more than enough bandwidth relative to its GPU power.

It's probably a higher-clocked part, since there's plenty of headroom above the 475/950/700 MHz of the 8600M GT.

Maybe 8800M GTX is dual-G84?

Actually, all the GeForce 8600 GTS reviews I've seen so far tell pretty much the same tale:
If it had a 256-bit bus it would be a killer price/performance mainstream graphics card, but the current 128-bit bus severely limits its potential, and even the high-speed 2.0 GHz GDDR3 memory chips can't make up for the narrow path.

Read again what I wrote and think about it. ;)

The memory bandwidth is not G84's problem!
(The main problem at the moment is the 256 MB framebuffer in combination with the GF8 VRAM bug.)

VRAM bug? :???:
Also, there are 512 MB 8600 GTS cards out there, and their performance is no higher than a regular 256 MB model until you bring AA/AF/higher resolutions into the equation, but that isn't any different from the 320 MB/640 MB 8800 GTS' problem. It's not a framebuffer limitation per se.

The problem with the 8600 GTS is that it has to compete against even the old GeForce 7900 GS ~ GT, all while still clinging to 6800 GT levels of memory bandwidth (which, as we know, is well below that of a 7800 GT/7900 GS ~ 7900 GT).
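For reference, peak memory bandwidth is just the bus width in bytes times the effective data rate, which makes the comparison above easy to sanity-check. A quick sketch (the clocks below are the commonly cited reference specs, so treat them as approximate):

```python
def mem_bandwidth_gbps(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s: bytes per transfer * transfers per second."""
    return bus_width_bits / 8 * effective_clock_ghz

# 8600 GTS: 128-bit bus, 2.0 GHz effective GDDR3
print(mem_bandwidth_gbps(128, 2.0))   # 32.0 GB/s
# 6800 GT: 256-bit bus, 1.0 GHz effective -> the same 32 GB/s
print(mem_bandwidth_gbps(256, 1.0))   # 32.0 GB/s
# 7900 GT: 256-bit bus, ~1.32 GHz effective
print(mem_bandwidth_gbps(256, 1.32))  # ~42.2 GB/s
```

So the 8600 GTS really does sit at 6800 GT bandwidth despite its much faster memory chips, because the narrower bus cancels the clock advantage.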
 
22W+? Sounds quite fishy to me... unless they meant it will draw more than 22W, which could point to a lot of different power consumption numbers far beyond 22W.

It's weird how it's named G84, yet it's called 8800M (potentially a beefed-up G84? Or similar specs to what G84 is now, hence 22W?). It would be interesting to see just how much stuff is cut down from the full-fledged G80.

This is interesting because while nVIDIA has managed to pull off a mobile version of G80, ATi, on the other hand, has the R600, which isn't exactly renowned for its power efficiency.

What's this VRAM bug?
 
The question that remains in my mind is to what degree GPU manufacturers are willing to differentiate their products specifically for the scientific computing customer. Attributes such as double precision floating point support, enhanced IEEE 754 compliance and low power consumption are not big concerns to NVIDIA's traditional gaming and visualization customers, but are important in technical computing environments. In addition, if GPUs are to be applied across a typical HPC system, like a cluster, they will need to be incorporated into individual server nodes. This requires relationships with a different set of hardware manufacturers, software vendors, and channels than NVIDIA has traditionally dealt with.

Not surprisingly, NVIDIA has been thinking about these issues as well and has apparently come to the conclusion that a separate HPC product line is required. Keane told me that the company is developing a "computing" product alongside its current Quadro and GeForce CUDA-compatible lines. The NVIDIA computing line -- as yet unnamed -- will be designed specifically for high performance computing applications, and will be targeted to both workstations and servers. The new devices will support double precision math, a basic requirement for many technical computing applications. Double precision support will make its first NVIDIA appearance at the end of Q4. At this point, it's not clear if NVIDIA's first double precision processor will be in a Quadro product or the new HPC offering.

http://www.hpcwire.com/hpc/1582455.html
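As a quick illustration of why double precision is "a basic requirement" in technical computing: single precision carries only ~7 decimal digits, and the representation error it introduces compounds over long numerical runs. A stdlib-only sketch (Python floats are IEEE 754 doubles, and `struct` can round-trip them through single precision):

```python
import struct

def to_float32(x):
    """Round-trip a Python float (an IEEE 754 double) through single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

x = 0.1
err32 = abs(to_float32(x) - x)  # extra error introduced by truncating to float32
print(err32)  # on the order of 1e-9: fine for pixel colors, painful for long simulations
```

Values exactly representable in single precision (like 0.5) survive the round trip unchanged; most decimal fractions don't, which is exactly what separates the gaming workload from the HPC one.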
 
At the end of the day, Intel is the one who gains more profit out of this.

Sure, it doesn't come at a low price (I'm surprised nVIDIA actually made a deal with Intel after all these years of being stubborn about the SLi feature), but coupling "SLi" with Intel chipsets could create quite a stir in the enthusiast/performance sector.

They (probably) need to fill the void of demand that will appear as a consequence of the AMD/ATi wedding. Intel sells a lot more Intel-platform mobos than nV does, so giving those people the option to get at least two nV video cards even if they skipped the nV mobo is good business.
 
http://www.vr-zone.com/?i=5049

According to the graphics card makers, NVIDIA may be cutting prices across the board soon and advising some of their partners to keep their inventory low. GeForce 8800 Ultra should be receiving the biggest cut and price may fall to as low as US$599. 8800GTX and 8800 GTS prices may be lowered as well to better compete against the Radeon HD 2900XT. 8600GTS and 8600GT should receive some price cuts as well when HD 2600 and 2400 cards start to flood the market. It seems that NVIDIA is adopting the price cutting strategy to put down their competitor before they have any chance to recoup any market share.

This is seriously good news for us. Price cuts!

So the 8800 Ultra will be dropped to $599. How would this affect the rest of the lineup?

8800ultra $799 --> $599
8800GTX $599 --> $449?
8800GTS 640 $399 --> $299?
8800GTS 320 $299 --> $229~?
8600GTS $199~230--> $149?
8600GT $149 --> $99~?

Or am I expecting too much from stubborn ol' nVIDIA? :LOL:
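For what it's worth, the guesses in the list above work out to roughly the same relative discount across the stack (my own back-of-envelope arithmetic on the listed prices, nothing official):

```python
def pct_cut(old, new):
    """Price cut as a rounded percentage of the old price."""
    return round((old - new) / old * 100)

print(pct_cut(799, 599))  # 25 -> 8800 Ultra
print(pct_cut(599, 449))  # 25 -> 8800 GTX
print(pct_cut(399, 299))  # 25 -> 8800 GTS 640
print(pct_cut(299, 229))  # 23 -> 8800 GTS 320
```

In other words, the speculation amounts to a flat ~25% cut down the whole lineup.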

However, if they want to keep inventory low, could this be an indication that something is on the way?

Something is always on the way, it's just how long it will take to get to us. :)

And I wouldn't be surprised by NVIDIA dropping prices on 6-month-old high-end models. In the midrange, the prices will probably depend a lot more on how exactly the new Radeons stack up against NVIDIA's offerings.
 