Samsung launches 256Mbit GDDR2

Their production part runs at 1.6 Gbit/pin? (Which I believe would imply an actual clock of 800 MHz, 1600 MHz "effective"?)

Or is that the top end they plan for this part?

Their product page currently tops out at 300 MHz for 256 Mbit parts...
 
These chips are in an 8M×32 configuration. That means a 256 MB video card using 8 of them will have a 256-bit bus. Running at 800 MHz (1.25 ns), they'd provide more than 50 GB/s of bandwidth. Quite crazy, I'd say.
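Quick back-of-the-envelope check of that figure (assuming 8 of these 8M×32 chips at 800 MHz with double data rate, i.e. 1.6 Gbit/s per pin):

```python
# Back-of-the-envelope bandwidth check; numbers are the ones quoted above, not official specs.
chips = 8
bus_width_bits = chips * 32              # 8 chips x 32-bit interface = 256-bit bus
bits_per_pin_per_sec = 2 * 800e6         # DDR: two transfers per 800 MHz clock = 1.6 Gbit/s/pin
bandwidth_gb_s = bus_width_bits * bits_per_pin_per_sec / 8 / 1e9
print(f"{bandwidth_gb_s:.1f} GB/s")      # -> 51.2 GB/s, i.e. "more than 50 GB/s"
```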

Product Page
 
Can't remember exactly where, but that 800 MHz ("1600 MHz effective") RAM was discussed earlier on that forum. What's new is that Samsung has announced production. Though, according to their product page, the production status is "Engineering Sample" with no date for mass availability.
Will we see this in the R420? Rumours say things like "2 times as fast as an R9800 Pro" and "basically R420 is 3×RV350 at higher clocks", which would mean ATI would also need roughly 2 times the memory bandwidth if the card is to be 2 times as fast (unless BIG improvements in the bandwidth-saving features are incorporated into R420). That would mean ~700 MHz (×2) on a 256-bit bus (I just don't see a 512-bit bus as a viable solution), and even if the RAM speed were a bit conservative, it still looks like the currently available RAM won't cut it (it only goes up to 500 MHz at best).
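Rough numbers behind that ~700 MHz figure, taking the commonly quoted R9800 Pro setup (340 MHz DDR on a 256-bit bus) as the baseline - these are my assumptions, not confirmed specs:

```python
# Assumed baseline: R9800 Pro with 340 MHz (DDR) memory on a 256-bit bus.
bus_bytes = 256 // 8                          # 32 bytes per transfer
baseline_gb_s = bus_bytes * 2 * 340e6 / 1e9   # ~21.8 GB/s
target_gb_s = 2 * baseline_gb_s               # "2 times as fast" rumour -> ~43.5 GB/s
needed_clock_mhz = target_gb_s * 1e9 / (bus_bytes * 2) / 1e6
print(needed_clock_mhz)                       # ~680 MHz on the same 256-bit bus
```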

Edit: if you click on the product guide on the product page linked above, you can see that Engineering Samples were slated for 3Q03 back in June, with "CS" (whatever that is) and "MP" (I guess that's Mass Production) scheduled for 4Q03.

Edit 2: I think I had this discussion in mind - http://www.beyond3d.com/forum/viewtopic.php?t=6376 - however, it was talking about Micron, not Samsung...
And I still don't know what exactly the difference between GDDR2 and GDDR3 is, but it can't be that much, judging from this Samsung quote:
samsung said:
Samsung’s 256Mb GDDR2 SDRAM also supports GDDR3, which awaits JEDEC standard approval next month.
 
mczak said:
That would mean ~700 MHz (×2) on a 256-bit bus (I just don't see a 512-bit bus as a viable solution), and even if the RAM speed were a bit conservative, it still looks like the currently available RAM won't cut it (it only goes up to 500 MHz at best).

Agreed. I'd have to assume that this RAM will be making its way into the high-end ATI and nVidia products next spring. If history is any guide, nVidia will go for the highest speed available, and ATI will be a bit more conservative. 700 MHz GDDR2 for R420 sounds like reasonable speculation to me.

This also bodes well for the mainstream ($150-$200) market. I imagine board makers will start to transition these 128-bit cards to the new memory densities to get 128 MB using 4 chips.

So we'd be looking at closing in on 20 GB/sec of memory bandwidth here, without moving to the "expensive" 256-bit bus.
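Rough numbers for that mainstream case (4 chips × 32-bit = 128-bit bus; the 600-700 MHz clock range is my assumption):

```python
# Assumed mainstream card: 4 chips x 32-bit = 128-bit bus, DDR signalling.
bus_bytes = 4 * 32 // 8
for clock_mhz in (600, 700):
    bw_gb_s = bus_bytes * 2 * clock_mhz * 1e6 / 1e9
    print(clock_mhz, "MHz ->", bw_gb_s, "GB/s")   # 19.2 and 22.4 GB/s - around 20 GB/s
```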
 
CS = Customer Samples, normally of production quality, used for making demo hardware.
 
One question - how much heat will those puppies give off? The DDR2 memory currently around emits plenty of heat, and if this stuff is running at a 60% higher clock I'd expect similar problems.

Any ideas?
 
Mariner said:
One question - how much heat will those puppies give off? The DDR2 memory currently around emits plenty of heat, and if this stuff is running at a 60% higher clock I'd expect similar problems.

Any ideas?


I think these chips will be produced on different process technologies. I remember reading somewhere that Samsung will start to produce memory chips on a 0.10 µm process.
AFAIK the currently available DDR2 chips use 0.13 or 0.15 µm processes. That's why they may be running hot.
I may be wrong, of course. :?
 
Mariner said:
One question - how much heat will those puppies give off? The DDR2 memory currently around emits plenty of heat, and if this stuff is running at a 60% higher clock I'd expect similar problems.

Any ideas?

Heat dissipation heavily depends on the wattage -> depends on clock... ;)
 
Mariner said:
One question - how much heat will those puppies give off? The DDR2 memory currently around emits plenty of heat, and if this stuff is running at a 60% higher clock I'd expect similar problems.

Any ideas?
The quote for the Micron GDDR3 was "The device runs at twice the speed at half the power of current solutions", and I'd expect similar numbers for the Samsung RAM chips. Even keeping in mind that these are marketing wattages, not real ones, I tend to believe it shouldn't produce more heat than current graphics card RAM chips (which is still quite a lot).
 
T2k said:
Mariner said:
One question - how much heat will those puppies give off? The DDR2 memory currently around emits plenty of heat, and if this stuff is running at a 60% higher clock I'd expect similar problems.

Any ideas?

Heat dissipation heavily depends on the wattage -> depends on clock... ;)
It depends more on the voltage.
 
I don't think you can give a real answer on this one. If you increase the frequency a little bit (say 5%), the heat production won't increase much (relative to the 5%). When you increase the voltage, there is an immediate gain in heat production (relative to the increase).
When running at much higher frequencies (say 60% higher), the heat production would increase by more than 60%.
That's my opinion; I can't give the exact reason for it.
 
So does anyone know for sure which process these chips will be produced on, i.e. 0.11, 0.10, 0.09 micron, etc.?

If we knew what process the current chips are produced on, I'm sure some of the clever folk on this board could guesstimate the probable increase or even decrease in power consumption between these and the new chips (and therefore heat output).
 
AAlcHemY said:
I don't think you can give a real answer on this one. If you increase the frequency a little bit (say 5%), the heat production won't increase much (relative to the 5%). When you increase the voltage, there is an immediate gain in heat production (relative to the increase).
When running at much higher frequencies (say 60% higher), the heat production would increase by more than 60%.
That's my opinion; I can't give the exact reason for it.

No, basically heat production works like this:

Heat = frequency × X + leakage.

In the good old days you could ignore leakage, but with smaller processes leakage has become an increasingly large part of the power consumption.

If you increase the voltage by 5%, the power will increase by 1.05², or 10.25%.
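To put rough numbers on that (a sketch only - the factor X corresponds roughly to C·V², and the capacitance and leakage values below are made up for illustration, not real chip data):

```python
# Illustrative only: capacitance and leakage are made-up values, not real chip data.
def power_watts(freq_hz, voltage, capacitance=1e-9, leakage=0.2):
    return capacitance * voltage**2 * freq_hz + leakage   # dynamic part + static leakage

base = power_watts(500e6, 2.5)
bumped = power_watts(500e6, 2.5 * 1.05)                    # +5% voltage, same clock
print((bumped - 0.2) / (base - 0.2))                       # dynamic part grows by 1.05^2 ~ 1.1025
```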
 
A little off topic, but I remember reading on these forums some months ago that ATI bought from Samsung a lot of GDDR2 that nVidia did not use for the 5800.

What happened to that memory? Lost, or on R360/RV360 cards?
 
According to a recent paper I attended, E = C·V²·F.

Part of the problem with achieving an increase in F is that I suspect you might have to increase V to get it... unless you can reduce C, I suppose.

Mind you, it's been a long time since I studied electronics.
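A worked example of that trade-off under E = C·V²·F (the 60% frequency bump and the extra 10% of voltage are assumptions, just to show how the two interact):

```python
# Assumed: raising F by 60% also needs ~10% more V; C held constant.
freq_scale = 1.60
volt_scale = 1.10
power_scale = volt_scale**2 * freq_scale   # E = C * V^2 * F with C unchanged
print(power_scale)                          # ~1.94x, i.e. nearly double the heat
```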
 