DDR2 expected to be late for the party

Mulciber said:
martrox said:
I guess the point I'm trying to make here is that there are some problems with using DDR II, it's not a bandwidth cure-all, as nVidia has found out. GDDR III seems to deal with all of the problems with DDR II, as it is designed from the ground up for video memory, and it looks like its availability will be in time for R400. The question I have is whether nVidia will swallow its pride & use it.

Just because it's a "2" rather than a "1" doesn't mean it's better. In fact, I believe it's proving to be ill conceived, at least as video memory. Look at the hoops nVidia had to jump through just to use it.... 12 layer PCB, & I'm sure the crazy FX Flow is there for the memory, too. And look at all here that thought ATI was crazy to go to a 10 layer PCB. (Yes, I know it's an 8 layer now, but it was rumored to be 10 originally.)

once again, since you weren't able to answer any of these questions in a related thread:

do you know the actual difference in latency between 400mhz ddr and 400mhz ddrII? do you know how it will affect a deeply pipelined parallel vpu? do you know the thermal specifications of ddr vs ddrII at 400mhz respectively?

and lastly, why do you feel that ddrII at 400mhz would require more pcb layers than regular ddr at 400mhz? (keep in mind, ati has an 8 layer pcb using ddr @ ~300mhz and nvidia a 12 layer pcb @ 500mhz. neither of these is going to support your argument on a 1:1 basis, because in the event that ati uses ddrII, it could very well be 400mhz, which has much lower thermal characteristics than 500mhz ddrII. and the converse would probably be true for 400mhz ddr, which would probably have slightly higher thermal characteristics than ddr @ 300mhz.) not only this, but it took ATI around 4 or 5 months to get a quality 8 layer pcb that would not have signaling problems for 320mhz ddr. so it's very possible that in a few months, 500mhz ddrII will be able to fit on a revamped 10 layer pcb for nVidia.
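(just for rough context on the raw bandwidth side of this, a quick back-of-envelope comparison. the bus widths and clocks below are the commonly quoted figures for R300 and NV30, so take the numbers as ballpark only:)

```python
# Rough peak-bandwidth comparison. DDR vs DDR-II is about clock and bus width,
# not the "II" itself. Figures are the commonly quoted R300/NV30 specs
# (approximate, not official).

def peak_bandwidth_gb_s(bus_width_bits, clock_mhz):
    # DDR and DDR-II both transfer data on both clock edges: 2 transfers per clock.
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * 2 * clock_mhz * 1e6 / 1e9

print(peak_bandwidth_gb_s(256, 310))  # R300: 256-bit DDR @ ~310 MHz   -> ~19.8 GB/s
print(peak_bandwidth_gb_s(128, 500))  # NV30: 128-bit DDR-II @ 500 MHz -> ~16.0 GB/s
```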

I'm guessing you won't be able to answer any of these, but I could very well be wrong. You certainly speak as if you know the answers, though.

Mulciber, Jeez, you like bustin my cahonies!

First of all, cost. Here is a quote from our own CMKRNL (in a thread I started!!), who has proven in the past to know what he is talking about:

Price for a similar configuration at the same clockspeeds (e.g. 350 MHz) is approximately 35% more for DDR-2 over DDR-1. This delta goes up by a fair bit as the rating of the memory goes up. I don't have an exact number, but my guess is that 500 MHz DDR-2 is roughly 50% more than 350 MHz DDR-1.

The memory represents a very significant portion of the BOM costs for any given board, particularly with DDR-2. After that comes the cost of the ASIC and in the case of NV30-500, the cooling mechanism.
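Just to put those percentages into perspective, here's a throwaway calculation. The per-chip price and the chip count are made-up placeholders; only the 35% / 50% premiums come from CMKRNL's quote.

```python
# Illustration of CMKRNL's price deltas. The $8 baseline price per DDR-1 chip
# and the 8-chip board are invented placeholders; only the 35% / 50% premiums
# come from the quote above.

chips_per_board = 8
ddr1_chip_price = 8.00                   # hypothetical 350 MHz DDR-1 chip price

ddr2_350_price = ddr1_chip_price * 1.35  # ~35% premium at the same clock
ddr2_500_price = ddr1_chip_price * 1.50  # ~50% premium for 500 MHz DDR-2

for label, price in [("DDR-1 350", ddr1_chip_price),
                     ("DDR-2 350", ddr2_350_price),
                     ("DDR-2 500", ddr2_500_price)]:
    print(f"{label}: ${price * chips_per_board:.2f} of memory on the board")
```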

On the latency issue, check out this article.
http://www.lostcircuits.com/memory/eddr/
While it is talking about memory for PCs, I'm pretty sure that most of what it says will also apply to DDR II on video cards. If not, please explain the differences.
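To make the latency point concrete (the CAS latency values here are just typical published figures for the PC parts, not anything specific to video-card memory): DDR-II needs more clock cycles for the same access, so at the same clock its latency in nanoseconds comes out worse.

```python
# CAS latency in nanoseconds is cycles divided by the memory clock.
# CL values below are typical published figures, not video-card-specific.

def cas_latency_ns(cl_cycles, clock_mhz):
    return cl_cycles / clock_mhz * 1000

print(cas_latency_ns(3, 200))  # DDR-400  (200 MHz clock), CL3 -> 15 ns
print(cas_latency_ns(4, 200))  # DDR2-400 (200 MHz clock), CL4 -> 20 ns
```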

On the PCB issue, check this article out:
http://news.zdnet.co.uk/cgi-bin/uk/printer_friendly.cgi?id=2118972
Here is a direct quote from it:

Line voltages are also much lower in DDR-II systems than is necessary in current memory standards -- down from 2.5v to 1.8v -- meaning that signals are much more susceptible to interference and timing issues.

Lower line voltages are necessary to make the memory interface better with logic chipsets, according to Lee.
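The signal-integrity point in that quote is basically arithmetic: the same amount of electrical noise eats up a bigger slice of a 1.8 V swing than of a 2.5 V one. Rough illustration below, and the 0.2 V noise figure is just an arbitrary example, not a measured value.

```python
# Why a lower line voltage makes signals more sensitive to interference:
# the same absolute noise is a larger fraction of the smaller voltage swing.
# The 0.2 V noise figure is an arbitrary example.

noise_v = 0.2
for swing_v in (2.5, 1.8):
    print(f"{swing_v} V swing: noise is {noise_v / swing_v:.0%} of the signal")
# 2.5 V swing: noise is 8% of the signal
# 1.8 V swing: noise is 11% of the signal
```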

On the heating issue, I'm sorry, but I can't find the source, and I could be wrong here. But I'm sure I read it somewhere ;) … hehe! And if you look at the incredible heatsinks (front & back) of the GFFX, just why would you cover all of the memory if it wasn't hot? It would seem to me that all you would be doing is ADDING heat from the GPU, and why would you want to do that?

Now, I'm not saying things might not change in the future. But banking on leading-edge technology right now is what got the GFFX where it is today. And remember, while the GFFX is a high-end card, the NV31/34 are not. And ATI has a track record of NOT taking the bleeding-edge technology road. DDR II will come down in price, and will hopefully run cooler thanks to the lower voltage. But it looks like both the latency & PCB issues are here to stay.
 
Samsung does make system memory, and if they are trying to make memory for graphics cards, the memory that doesn't make the cut could be used as system memory for one, and for two, it would still give them practice making DDR II and getting their yields up anyway. So I think Mulciber had a valid point (that practice makes perfect).
 
I won't deny that's a valid point. But it's a point that has no bearing on my post, at least right now. There's a huge difference between what you want and what is......
 