WTF is DDR-II doing as system RAM?

ricercar

Newcomer
So DDR-II is coming for system memory ... and I simply do not understand.

I know that DDR-II has greater bandwidth potential at the cost of higher latency. So, DDR-II has frequency headroom for next-gen gigahertz CPUs ... when they eventually arrive. AMD says they will not support DDR-II until 2006, and Intel will remain stagnant in CPU clock speed until they can implement the IP they traded NVIDIA for the P4 FSB license--anywhere from 6 to 18 months from the day they inked that deal, depending on how much Intel had already infringed on the NVIDIA patents.
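
Here's the back-of-the-envelope math as I understand it--assuming generic JEDEC-ish numbers (DDR400 CL2.5 vs DDR2-533 CL4), not any particular module, so correct me if I'm off:

Code:
# Peak bandwidth and absolute CAS latency on a 64-bit channel.
# DDR and DDR2 both transfer twice per bus clock.
def peak_bw_gbs(bus_mhz, bus_bits=64):
    return bus_mhz * 2 * (bus_bits // 8) / 1000.0   # GB/s

def cas_ns(bus_mhz, cas_cycles):
    return cas_cycles / bus_mhz * 1000.0            # nanoseconds

print(peak_bw_gbs(200), cas_ns(200, 2.5))  # DDR400 CL2.5:  3.2 GB/s, 12.5 ns
print(peak_bw_gbs(266), cas_ns(266, 4))    # DDR2-533 CL4: ~4.3 GB/s, ~15 ns

More peak bandwidth, but the first word arrives later. That's the trade-off as I see it.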

Yet DDR-II on graphics cards was a complete bust: expensive, and way too hot for the performance. NVIDIA and ATI moved directly to DDR3 without much of a stop at DDR-II. Anyone remember the FX 5700/5800 or 9800 Pro models with DDR-II? You can't replace the factory RAM cooler with anything except a waterblock, because DDR-II RAM gets too hot. Consumer aftermarket air-cooled RAMsinks don't have enough radiating surface to keep DDR-II memory within spec--at least none I tried.

So WTF is this with system RAM going DDR-II? With economy of scale the expense differential will of course disappear, but it still seems like using DDR-II as main memory (instead of DDR3) benefits only the aftermarket cooling scene, because system memory module coolers will actually become necessary.
 
I'm not sure if the currently used DDR2 has anything to do with the memory NVIDIA used on the 5800s.
Actually ATI never used GDDRII on the 9800 PRO afaik.
NVIDIA used a sort of GDDR2 that was not JEDEC-approved at all. It was some unique memory just for that card.

I don't think the DDR2 used on Intel platforms gets that hot.
 
ricercar said:
Yet DDR-II on graphics cards was a complete bust: expensive, and way too hot for the performance.

There's no reason DDR2 should be more expensive in the long run; DDR2 is not more expensive to produce than DDR(1) (actually, DDR2 should be slightly cheaper). And if DDR2 is way too hot, then DDR(1) is way, way too hot--DDR2 runs quite a bit cooler than DDR(1).
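
Quick math on why, using only the nominal supply rails (DDR at 2.5V, DDR2 at 1.8V) and the rough rule that CMOS switching power scales with voltage squared:

Code:
# Switching power ~ C * V^2 * f, so at equal clock and equal load:
v_ddr, v_ddr2 = 2.5, 1.8       # nominal JEDEC supply voltages
print((v_ddr2 / v_ddr) ** 2)   # ~0.52, i.e. roughly half the power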

NVIDIA and ATI moved directly to DDR3 without much of a stop at DDR-II.

GDDR3 != DDR3. GDDR3 is based on DDR2; no one has moved to DDR3 yet.

Anyone remember the FX 5700/5800 or 9800 Pro models with DDR-II? You can't replace the factory RAM cooler with anything except a waterblock, because DDR-II RAM gets too hot. Consumer aftermarket air-cooled RAMsinks don't have enough radiating surface to keep DDR-II memory within spec--at least none I tried.

Does not change the fact DDR2 runs cooler than DDR.

So WTF is this with system RAM going DDR-II? With economy of scale the expense differential will of course disappear, but it still seems like using DDR-II as main memory (instead of DDR3) benefits only the aftermarket cooling scene, because system memory module coolers will actually become necessary.

There is no DDR3 RAM; the standard does not exist. And heat is not a problem with DDR2 at the speeds system RAM runs at.
 
Thanks for the input, all.

mustrum said:
Actually ATI never used GDDRII on the 9800 PRO afaik.
FYI, they did. I have first-hand experience with a 256MB DDR-II version of the Radeon 9800 Pro.
 
ricercar said:
So DDR-II is coming for system memory ... and I simply do not understand.

I know that DDR-II has greater bandwidth potential at the cost of higher latency. So, DDR-II has frequency headroom for next-gen gigahertz CPUs ... when they eventually arrive. AMD says they will not support DDR-II until 2006, and Intel will remain stagnant in CPU clock speed until they can implement the IP they traded NVIDIA for the P4 FSB license--anywhere from 6 to 18 months from the day they inked that deal, depending on how much Intel had already infringed on the NVIDIA patents.

Yet DDR-II on graphics cards was a complete bust: expensive, and way too hot for the performance. NVIDIA and ATI moved directly to DDR3 without much of a stop at DDR-II. Anyone remember the FX 5700/5800 or 9800 Pro models with DDR-II? You can't replace the factory RAM cooler with anything except a waterblock, because DDR-II RAM gets too hot. Consumer aftermarket air-cooled RAMsinks don't have enough radiating surface to keep DDR-II memory within spec--at least none I tried.

So WTF is this with system RAM going DDR-II? With economy of scale the expense differential will of course disappear, but it still seems like using DDR-II as main memory (instead of DDR3) benefits only the aftermarket cooling scene, because system memory module coolers will actually become necessary.

I can categorically state that you don't have a fing clue. But thanks for playing; maybe next week you can find out what happens when you try to blow-dry your hair while taking a shower.
 
kyleb said:
Damn man, did he shit in your Cheerios or something?

He doesn't understand that DDRx and GDDRx have little to no relation to each other.

I know that DDR-II has greater bandwidth potential at the cost of higher latency.

This is incorrect.

Intel will remain stagnant in CPU clock speed until they can implement the IP they traded NVIDIA for the P4 FSB license
Don't know whose ass he pulled this BS out of.

NVIDIA and ATI moved directly to DDR3 without much of a stop at DDR-II.
Neither NVIDIA nor ATI ever used DDR-II. DDR3 doesn't even exist yet as a complete specification.

In short, he doesn't know his ass from his mouth and is spouting pure BS. And I'm sick and not in a cheery mood; besides, I don't generally stand for BS that is correctable with the bare minimum of knowledge. For his own bloody benefit, he should at least have a basic understanding of the crap he's spewing.

Out of four paragraphs, the only one that is correct is the first one. He obviously doesn't understand just about any of it.

Aaron Spink
speaking for myself inc.
 
ricercar said:
So WTF is this with system RAMs going DDR-II? With economy of scale the expense differential of course will disappear, but it still seems like using DDR-II as main memory (instead of DDR3) is beneficial only to the aftermarket cooling scene, because system memory module coolers will actually become necessary.

For the average desktop user, DDR2 isn't compelling right now.

If one is working with servers or tasks that require high memory capacity, however, DDR2 makes a lot more sense.

The electrical characteristics of DDR2 signalling are much better than DDR--which is straining to maintain high clock speeds with sane power numbers or decent capacities.

DDR400 at JEDEC spec can only be used at up to two DIMMs per memory channel. DDR2, at even higher clocks, can double the number of devices per channel, and do it at much lower power.
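
To put hypothetical numbers on it (assuming 1GB DIMMs on a two-channel board--purely an illustration, not a specific platform):

Code:
# DIMMs per channel: DDR400 at JEDEC spec = 2, DDR2 = 4 (double).
dimm_gb, channels = 1, 2
print("DDR400 ceiling:", channels * 2 * dimm_gb, "GB")  # 4 GB
print("DDR2 ceiling:  ", channels * 4 * dimm_gb, "GB")  # 8 GB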

Most of these advantages don't matter to the desktop crowd that's happy with a gig or less of memory.
 
The Baron said:
The 9800XT did not. Only the 256 meg 9800 Pro did.
And even then only the initial batches shipped with GDDR2 (supposedly left over from nV's FX 5800 run). AFAIK, current 256MB 9800Ps use plain old (G?)DDR.
 
aaronspink said:
Neither NVIDIA nor ATI ever used DDR-II. DDR3 doesn't even exist yet as a complete specification.

Huh, I was under the impression that NVIDIA used DDR-II on the 5800U and ATI on the 9800 Pro with 256MB; at least that is what I gathered from Dave's charts. Am I missing something?
 
aaronspink said:
Out of four paragraphs, the only one that is correct is the first one. He obviously doesn't understand just about any of it.
That's still no reason to go off on an obnoxious-asshole rant on the guy. Most people don't know all that much about stuff anyway - and that certainly includes you. If everybody started ripping into each other as soon as they stumbled upon someone with less knowledge than themselves in some field, we'd all be in a foul mood before long.

*edit: On a side note, the 200MHz DDR2 in my box is completely cool to the touch; it doesn't even have a (fairly useless) heatspreader on it. One must remember that the DDR2 used on NV30 ran at 500MHz - 2.5x the clock. Even at 266MHz, the current fastest official spec, heat should be minimal.
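
Rough math, clock ratio only (ignoring whatever extra voltage NV30's memory ran at):

Code:
# NV30 GDDR2 clock vs. my system DDR2 clock; switching power scales
# roughly linearly with clock, all else being equal.
print(500 / 200)  # 2.5x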
 
Guden Oden said:
aaronspink said:
Out of four paragraphs, the only one that is correct is the first one. He obviously doesn't understand just about any of it.
That's still no reason to go off on an obnoxious-asshole rant on the guy. Most people don't know all that much about stuff anyway - and that certainly includes you. If everybody started ripping into each other as soon as they stumbled upon someone with less knowledge than themselves in some field, we'd all be in a foul mood before long.


I'm sorry, but I HAD to laugh when I read that.... Coming from you (see: the moodiest, crankiest, most constantly nagging bitch around here) it's totally hilarious!!! :devilish:

;)
 
Well, people who don't know anything shouldn't just come onto a board, state things that just aren't true, and start bitching about said untruths, especially without doing some research.
 
PC-Engine said:
Well people who don't know anything shouldn't just come onto a board and state things that just aren't true and start bitching about said untruths especially without doing some research.

You mean like most people (esp. in the Politics forum)... :p

And to most people, "some research" is 30 minutes with Google.
 