Micron ships first GDDR3 to ATI and Nvidia.

BoardBonobo said:
What are the odds on something high-powered arriving before the end of the year that uses GDDR3?

Source
NV40 should use that, I guess (though honestly I don't expect it to really arrive this year, but what do I know). Possibly the R9900?
Though I'm still trying to figure out what the difference between DDRII and GDDR3 really is.
The device runs at twice the speed at half the power of current solutions, the company said. The technology can provide an aggregate bandwidth of 6.4GB/s per device, achieved with a 1.6GB/s per pin data rate.
That should be 1.6Gbit/s per pin AFAIK - which would correspond nicely with all those announcements from last year that mentioned 500-800MHz GDDR3. It would also mean the 6.4GB/s per-device figure refers to a 32bit-wide chip. Still, 800MHz DDR would be nice, quite a bit more than the current record (500MHz on the NV30 Ultra, so it's not quite twice the speed), especially if the "half the power" part is true.
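The arithmetic behind those figures can be sketched out quickly (the 800MHz clock and 32bit bus width are the assumptions discussed above, not something Micron's announcement spells out):

```python
# Sketch of the bandwidth arithmetic above (illustrative only):
# a DDR interface transfers data on both clock edges, so an
# 800 MHz clock gives 1.6 Gbit/s per pin; across a 32-bit-wide
# device that works out to 6.4 GB/s.

clock_mhz = 800           # assumed effective clock of the chip
transfers_per_clock = 2   # double data rate: both clock edges
bus_width_bits = 32       # assumed per-device bus width

per_pin_gbit = clock_mhz * transfers_per_clock / 1000   # Gbit/s per pin
device_gbyte = per_pin_gbit * bus_width_bits / 8        # GB/s per device

print(per_pin_gbit)    # 1.6 Gbit/s per pin
print(device_gbyte)    # 6.4 GB/s per device
```

Which matches the "6.4GB/s per device" claim only if the chip really is 32 bits wide.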
 
How funny!

Some time ago, when ATi announced GDDR3, nVIDIA and its fanboys complained that the standard was just a simple derivation of GDDR2. While I don't doubt that they sure are using those things called "transistors" when they make it, and that the K9 benefits from the experience gained in making the K5, I really think nVIDIA should stop claiming that every trend in the video card industry was developed by them.

Not even the T&L concept that was the big PR point of the GF1 launch was nVIDIA's thing - it was S3's.

GDDR3 evolves from DDR-II, but still has some pretty important differences. Firstly, GDDR3 makes use of a single-ended, unidirectional strobe that separates the reads and writes. DDR-II, by contrast, uses differential bi-directional strobes. Secondly, GDDR3 utilises a "pseudo-open drain" interface technique that is based on voltage rather than current. This was done so that graphics chips can be compatible with DDR, DDR-II and GDDR3. Like DDR-II, the GDDR3 interface uses 1.8-Volt SSTL. Such memory is better suited to the point-to-point links used on graphics cards and allows GPU developers to reach new performance and feature heights with their products.
 
David G. said:
Not even the T&L concept that was the big PR point of the GF1 launch was nVIDIA's thing - it was S3's.

What? Are you denying the existence of SGI? ;)
We love you, Infinite Reality! :) (or was it another SGI part that introduced hardware T&L?)


Uttar
 
David G. said:
Not even the T&L concept that was the big PR point of the GF1 launch was nVIDIA's thing - it was S3's.

Er ... the Savage 2000 and the GeForce 256 were both announced at the end of August 1999.

The Savage 2000 was available at retail in December 1999.
The GeForce 256 was available at retail in October 1999.

And finally ... even with the latest drivers I'm not sure the T&L feature was ever activated on the Savage 2000 (as of February 2000 it was not).

So yes ... T&L is an S3 thing :LOL:
 
Wonder if Nv will use it :/
Or if they'll stand by their statement and wave off GDDR3 as a lame copy of DDRII and only use DDRII :LOL:
 
Unit01 said:
Wonder if Nv will use it :/
Or if they'll stand by their statement and wave off GDDR3 as a lame copy of DDRII and only use DDRII :LOL:
I don't think they ever stated that. I'm sure they'll use it.
 