GDDR-3 will appear when? Balancing GPU vs memory capability

g__day

It seems Micron may be the slowest to market, with general availability in volume expected in Q2 2004. Hynix is thought to be ahead, but I am unsure whether they have sufficient volume for an NV40 or, more likely, R420 version to appear with GDDR3.

I understand GDDR3 offers about twice the bandwidth of GDDR2 and consumes much less power - possibly meaning you don't need an external power connector for these cards. So if you're designing a high-end card, you are probably extremely interested, during your design phase, in whether you will have a memory system that can reliably deliver 40GB/sec or closer to 80GB/sec. Not knowing which one you have to play with could make the design choices needed to hit a balanced design a lot tougher. Does anyone have more insights on this matter?
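To put very rough numbers on those two figures (my own back-of-envelope, assuming a 256-bit bus and DDR signalling; the clock speeds are purely illustrative, not announced parts):

```python
# Rough peak-bandwidth arithmetic behind the 40GB/sec vs 80GB/sec figures.
# Assumes a 256-bit memory bus and DDR signalling (2 transfers per clock);
# the clock speeds are illustrative guesses, not announced parts.

def peak_bandwidth_gb_s(clock_mhz, bus_bits=256, transfers_per_clock=2):
    """Peak memory bandwidth in GB/sec (1 GB = 10^9 bytes)."""
    bytes_per_transfer = bus_bits / 8
    return clock_mhz * 1e6 * transfers_per_clock * bytes_per_transfer / 1e9

print(peak_bandwidth_gb_s(500))    # 500MHz DDR2-style part -> 32.0 GB/sec
print(peak_bandwidth_gb_s(625))    # roughly what 40GB/sec implies -> 40.0 GB/sec
print(peak_bandwidth_gb_s(1250))   # roughly what 80GB/sec implies -> 80.0 GB/sec
```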

Secondly, does anyone have news/insights on whether we can expect to see GDDR3 appearing on the next round of top-end video cards, or will it come out 6 months later?
 
All I seem to remember about GDDR3 is that Samsung was poo-pooing it for some reason.
 
Ratchet said:
All I seem to remember about GDDR3 is that Samsung was poo-pooing it for some reason.

I thought that they were unhappy that ATi developed it because they had just finished working with nvidia on the development of DDR2. They were expecting DDR2 to be the next standard in graphics memory and were in the stages of implementing it when ATi announced the GDDR3 standard, which really circumvented the whole roadmap of putting DDR2 into the mainstream for graphics memory. I think Samsung had big plans for DDR2 up until that point. The development of GDDR3 also put ATi on the map in terms of being able to collaborate in the creation of a memory standard, and it is regarded as a driving force in the development of memory standards. Samsung, I suspect, was forced to redraw their memory supply roadmap, possibly even left with a large quantity of DDR2 in inventory that had a significantly reduced market. GDDR3 said to the PC industry that there was more than one graphics chip designer in the world, especially when you consider that DDR2 was the baby of the collaboration between Samsung and nvidia IIRC. GDDR3 made that whole implementation a wasted effort.
 
Sabastian said:
I thought that they were unhappy that ATi developed it because they had just finished working with nvidia on the development of DDR2. ...
yeah that sounds about right.
 
I understand GDDR3 offers about twice the bandwidth of GDDR2 ...

I doubt that's technically correct, and before I shoot my own foot, can someone put this minor tidbit into perspective?
 
maybe because GDDR3 uses less voltage, you can go relatively KRAZY with the voltage and get some absurd bandwidth?
 
We've had 500MHz DDR-2 for quite some time now, but 1GHz DDR-3 isn't even close. Of course, if you take memory ICs that offer a data path twice as wide ...
 
Ailuros said:
I understand GDDR3 offers about twice the bandwidth of GDDR2 ...

I doubt that's technically correct, and before I shoot my own foot, can someone put this minor tidbit into perspective?

It has been some time since I read about it. From what I can remember it was something in the range of 20-30% faster overall. I might be wrong but …

EDIT: IIRC GDDR 3 is DDR 2 with some sort of enhancement that reduces graphics memory latency.
 
Pardon me, but that doesn't mean in my book that it doubles memory bandwidth when referring to paper specs.

Assume you have 500MHz GDDR2 on a 256-bit wide bus, which equals 32GB/sec peak memory bandwidth; while decreased latency logically increases efficiency overall for GDDR3, I don't think that claiming it technically has 64GB/sec peak memory bandwidth under the same conditions would be correct.
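A quick sketch of the distinction I'm making (the efficiency percentages are invented purely to illustrate the point, not measured figures):

```python
# Peak bandwidth is set by clock and bus width alone; lower latency only
# improves how much of that peak you actually sustain. The efficiency
# percentages are invented to illustrate the point, not measurements.

BUS_BYTES = 256 // 8        # 256-bit bus
CLOCK_HZ = 500e6            # 500MHz
TRANSFERS_PER_CLOCK = 2     # DDR signalling

peak = CLOCK_HZ * TRANSFERS_PER_CLOCK * BUS_BYTES / 1e9
print(f"Peak: {peak:.0f} GB/sec")   # 32 GB/sec for GDDR2 and GDDR3 alike

for name, efficiency in [("GDDR2, hypothetical 70% efficiency", 0.70),
                         ("GDDR3, hypothetical 80% efficiency", 0.80)]:
    print(f"{name}: ~{peak * efficiency:.0f} GB/sec sustained")
```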
 
"Micron’s Executive Director of Advanced Technology and Strategic Marketing for Micron’s Computing and Consumer Group said today that GDDR3 operates at 50% higher data rates while consuming approximately half the power of graphics DDR-II. The company believes the first applications to adopt GDDR3 will be high-end graphics, gaming markets and high-speed networking."

http://www.xbitlabs.com/news/memory/display/20030611140159.html
 
Less power to achieve 50%-100% more speed than GDDR2, depending on who you read, such as:

http://www.micron.com/news/product/2003-06-11_27457.html

Boise, Idaho, June 11, 2003 -- Micron Technology, Inc., has delivered the industry’s first graphics double data rate SDRAM (GDDR3) components to ATI Technologies, Inc. and NVIDIA Corporation. GDDR3 targets high-speed point-to-point applications such as high-end PC graphics and gaming platforms requiring ultra fast data rates. This breakthrough technology can provide an aggregate bandwidth of 6.4GB per second per device, achieved with a 1.6Gb per second per pin data rate. GDDR3 is the fastest memory device available today. Fabricated on Micron's leading-edge 0.11µm process technology, GDDR3 provides the highest performance in both per-pin bandwidth and aggregate bandwidth.

"Speed and power are two critical elements that can make or break high-speed graphics designs," said Terry Lee, Executive Director of Advanced Technology and Strategic Marketing for Micron's Computing and Consumer Group. "GDDR3 operates at 50 percent higher data rates while consuming approximately half the power of graphics DDR2. With these speed and power advantages, we believe the first applications to adopt GDDR3 will be high-end graphics, gaming markets and high-speed networking."

"ATI worked closely with Micron to define and develop GDDR3 as part of an industry-wide initiative, and we commend Micron for being first to sample GDDR3 devices," said Joe Macri, Director of Technology, ATI Technologies Inc, and Chairman, JC42.3 DRAM Committee, JEDEC.

"We are pleased to have the first samples of GDDR3 devices to test with our latest generation graphics processor," said Bryn Young, Director of Memory Marketing, NVIDIA Corporation. "GDDR3 provides tremendous performance gains that will raise the performance bar of next-generation PC graphics."

see also

http://www.ati.com/companyinfo/press/2002/4544.html

http://www.eetimes.com/semi/news/OEG20030321S0037

GDDR3 will consume half the power of GDDR2 and operate up to 50 percent faster, according to Terry Lee, executive director of advanced technology and strategic marketing at Micron, Boise, Idaho.

http://www.google.com.au/search?q=c...com/pdf/flyers/GDDR3.pdf+GDDR3&hl=en&ie=UTF-8


http://www.theinquirer.net/?article=5734
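For what it's worth, the per-device figure in the Micron release checks out: 1.6Gb/sec per pin on a 32-bit wide device gives the quoted 6.4GB/sec. The 256-bit board total below is my own extrapolation, not from Micron:

```python
# Sanity check of the Micron figures: 1.6Gb/sec per pin and 6.4GB/sec per device.
# The board total assumes eight x32 devices on a 256-bit bus - my own extrapolation.

PIN_RATE_GBIT = 1.6       # Gb/sec per data pin (from the press release)
PINS_PER_DEVICE = 32      # an x32 GDDR3 device

per_device_gb_s = PIN_RATE_GBIT * PINS_PER_DEVICE / 8
print(f"Per device: {per_device_gb_s:.1f} GB/sec")    # 6.4 GB/sec, matches Micron

devices_on_256bit_bus = 256 // PINS_PER_DEVICE        # 8 devices
print(f"256-bit board: {per_device_gb_s * devices_on_256bit_bus:.1f} GB/sec")  # 51.2 GB/sec
```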
 
g__day said:
"GDDR3 operates at 50 percent higher data rates while consuming approximately half the power of graphics DDR2."

Does that mean that GDDR3 at 150% speed consumes half the power of DDR2 at 100% speed, or half the power of DDR2 at 150% speed?
 
I take it as saying GDDR3 is both a lot faster and uses less current than DDR2 - so it's a double win with no apparent downsides.
 
g__day said:
I take it as saying GDDR3 is both a lot faster and uses less current than DDR2 - so it's a double win with no apparent downsides.

Ahhh grasshopper, there is one downside... I don't have a card that has it yet :) hehehehe

Thanks for the info. Looks good!
 
GDDR3 simply increases the clock speed the memory is *capable* of over GDDR2 by about 40%. Not much different, other than the little "wiz-bang" engineering changes needed to make that happen.

(The above explanation is the non-techno version, minus all the frills.)
 
I seem to remember reading that GDDR3 draws so much less current that a board based on it - say if it was R420 - wouldn't need any external power connectors. So if you see a picture of NV40 or R420 and it doesn't have a molex connector - you can guess it's using GDDR3.

Half of my original question was: if you're designing your next-generation GPU and you're not sure whether your memory will be DDR2 or, say, a 50% faster GDDR3, how do you strike the right balance so your GPU is neither flooded with data nor starved for it most of the time?

Having an unknown of that size would be troublesome, I would expect. Yet we have heard little speculation on the readiness of GDDR3 up to now. Could R420 maybe be DDR2 and R423 be GDDR3, even?

Wondering, wondering... BTW Hellbinder's explanation is what I consider to be basically correct. I am unsure whether GDDR3 has slightly higher latencies - I'll look through that technical link - but I'd guess it would have to have much improved capacitance over DDR2 to switch that fast.
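For the balance question, the usual back-of-envelope is to compare what the pipelines could ask for per second against what the memory bus can deliver. Every number below is an illustrative guess, not a figure for any real chip:

```python
# Very rough demand-vs-supply check for a hypothetical next-generation part.
# Pipeline count, core clock and bytes-per-pixel are illustrative guesses only.

PIPELINES = 8
CORE_CLOCK_HZ = 500e6
BYTES_PER_PIXEL = 12      # e.g. colour write plus Z read/write, ignoring texture traffic

demand = PIPELINES * CORE_CLOCK_HZ * BYTES_PER_PIXEL / 1e9   # GB/sec the chip could ask for

for label, supply in [("DDR2-class memory", 32.0), ("GDDR3-class memory (+50%)", 48.0)]:
    verdict = "bandwidth-starved" if demand > supply else "roughly balanced"
    print(f"{label}: demand ~{demand:.0f} vs supply {supply:.0f} GB/sec -> {verdict}")
```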
 
g__day said:
I seem to remember reading that GDDR3 draws so much less current that a board based on it - say if it was R420 - wouldn't need any external power connectors.

That's unlikely (even with the 75W PCI-Ex allows), considering how much power the ASIC alone will draw.

MuFu.
 
Impressive - and now that ATi have demoed R420 with GDDR3 I think we get to see an inkling of a very rosy and fast future.

These past few weeks leading up to R420 and NV40 appearing for review, within 5 weeks at least, have been an agonizingly long wait!
 