Rambus XDR in desktop graphics?

Geo

http://www.xbitlabs.com/news/mmedia/display/20041228125957.html

Yes, I know it is in the Console section too, but it isn't the NV PS3 stuff I'm interested in here. Some pretty serious heavy-breathing by unnamed Rambus sources flogging XDR for desktop graphics parts. I suppose this isn't surprising. . .but might it actually happen? Does the pitifully slow development/availability of high-speed GDDR3 make this more likely?
 
Well, there are rumors of an "advanced memory interface" for R520 that might be a hint it could support XDR, though Wavey seems set on this being some type of internal bus scheme rather than an external interface to the memory. . .
 
RejZoR said:
RAMBUS was always expensive like hell... certainly won't be for mainstream...

How expensive can it be if Sony is going to put it in a console? And NV will have to do the work anyway to make such an interface to their new core, so it might make sense for them to leverage that R&D investment. . .
 
RDRAM, or is it DRDRAM (well, whatever goes with the i820 and i850 chipsets), is still very expensive even though it's used in 75 million PS2s.
 
RAMBUS is too expensive for the video card market. Micron dumped them because of the problems Rambus caused. Would you like to pay $1000 an hour to speak with engineers from Rambus and get nothing you need from them? Micron showed them changes that would improve the yield and speed it up so it would be profitable for Micron to fab it, but they just shoved it back at Micron and threatened a lawsuit for even looking at changes to improve the IC. Rambus is just a company with patents and no fabs. Rambus produces nothing.
 
It all comes down to a combination of price and performance; if XDR manages to compete with GDDR3/GDDR4 in both those areas, then I don't see why its use would be out of the question. It's pretty doubtful that will happen, though.
 
sir doris said:
RDRAM, or is it DRDRAM (well, whatever goes with the i820 and i850 chipsets), is still very expensive even though it's used in 75 million PS2s.

You're talking apples and oranges here. DRDRAM in and of itself certainly is not particularly expensive; Sony even uses the PS2 chipset in flatscreen TVs now. RIMMs aren't cheap, though, because nobody's using them anymore; they're obsolete and hard to come by. That drives prices up.

The memory chips themselves are just chips, not really any different than any other semiconductor device.
 
IIRC, the main problems with RAMBUS memory chips that have been driving costs up have been:
  • The need to have extremely high-speed I/O logic on the same die as high-density DRAM cells - this combination requires both a larger die and more processing steps than is the case for, say, a DDR RAM memory module.
  • Limited testability of individual memory chips - for RIMM modules, you could not, for a long time, do any testing of the RDRAM memory chips before they were placed on the RIMM, meaning that one bad chip would require you to discard the entire RIMM. This factor alone caused a 3-5x price penalty on RIMM-packaged RDRAMs over other RAM types. Unless this problem has been solved in XDR memory chips, no sane IHV is going to touch it.
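
A rough way to see where a multiplier like that could come from: if the dies can only be tested after module assembly, module yield is per-chip yield compounded over every chip on the RIMM. Here's a minimal sketch in Python; the per-chip yield and chips-per-RIMM numbers are my own illustrative assumptions, not figures from this thread.

```python
# Untestable-chip yield penalty; both inputs are illustrative assumptions.
per_chip_yield = 0.93   # assumed probability a single RDRAM die is good
chips_per_rimm = 16     # assumed number of dies on one RIMM

# One bad die scraps the whole module, so yield compounds per chip.
module_yield = per_chip_yield ** chips_per_rimm
cost_multiplier = 1 / module_yield  # good modules absorb the scrapped ones

print(f"module yield: {module_yield:.1%}")                   # ~31%
print(f"effective cost multiplier: {cost_multiplier:.1f}x")  # ~3.2x
```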
 
arjan de lumens said:
. . .you could not, for a long time, do any testing of the RDRAM memory chips before they were placed on the RIMM, meaning that one bad chip would require you to discard the entire RIMM. This factor alone caused a 3-5x price penalty on RIMM-packaged RDRAMs over other RAM types.

:oops: And here I had bought into the whole "insanely greedy" theory of Rambus pricing. . .
 
Well, sure, the Rambus RAM might be expensive, but so is GDDR3...

But is it that much more expensive, and how much faster is it? Do we need that extra bandwidth? Do we get extra latency?
 
As far as ATI is concerned I really doubt they will step away from GDDR, especially since they are actively involved in its progress and evolution.
 
arjan de lumens said:
Unless this problem has been solved in XDR memory chips, no sane IHV is going to touch it.
That's because DRDRAM memory devices are surface-mounted straight onto the RIMM PCB; there's no packaging around the die to my knowledge. If they'd instead stuck the dies to a small substrate first, it would have made testing much easier.

I suspect, though, that testing equipment has evolved a lot in the years since, seeing as flip-chip tech is so prevalent these days.
 
The narrow memory bus that Rambus would offer would be excellent for 3D graphics, as it would make all this crossbar memory architecture jazz much less of a requirement. As for latency, a lot of the latency seen in the i820-era Rambus RIMMs was caused by the implementation rather than any flaws in the technology itself.

On a GPU the memory controller is going to be integrated, and I suspect any latency is going to be comparable to, and possibly lower than, standard DDR (dunno about GDDR). Plus, latency isn't a huge issue for GPUs anyway due to the level of pipelining in there.

I for one would like to see it in a graphics card based on my first point alone. The reason fancy 4-way split memory buses exist (a 256-bit bus is instead 4x 64-bit buses) is memory granularity issues across a 256-bit bus. If you write 64 bits of information across that bus you waste 3/4 of your bandwidth, unless it's split 4 ways of course. IIRC Rambus is a 32-bit wide interface running at a much higher clockspeed. Suddenly you don't need the massively complicated memory controller and cache queueing systems on your GPU. Also, far, far fewer pins on your chip.
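
To put numbers on that waste, here's a minimal sketch (a simplified model where every access ties up a whole channel-width transfer; the function name is just for illustration):

```python
import math

def efficiency(access_bits: int, channel_bits: int) -> float:
    """Fraction of transferred bits actually requested, assuming each
    access occupies a whole channel-width transfer (simplified model)."""
    transferred = math.ceil(access_bits / channel_bits) * channel_bits
    return access_bits / transferred

print(efficiency(64, 256))  # monolithic 256-bit bus -> 0.25 (3/4 wasted)
print(efficiency(64, 64))   # one of four independent 64-bit channels -> 1.0
```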

If that's not going to contribute to a reduced production cost and more real estate for better GPU design, then I don't know what will.
 
DRDRAM is actually not good at all for memory access granularity; it has a minimum burst length of 8 transfers (= 256 bits for a 32-bit wide interface), whereas e.g. DDR1 has a minimum of only 2. I suspect that XDR is even worse.
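
For concreteness, minimum access granularity is just interface width times minimum burst length. A quick sketch; the burst lengths come from the post above, while the 64-bit DDR1 channel width is an assumption for illustration:

```python
# Minimum access granularity = interface width x minimum burst length.
configs = {
    "DRDRAM, 32-bit channel, burst of 8": 32 * 8,  # 256 bits per access
    "DDR1, 64-bit channel, burst of 2": 64 * 2,    # 128 bits per access
}
for name, bits in configs.items():
    print(f"{name}: minimum {bits} bits ({bits // 8} bytes)")
```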

You might require fewer I/O pins for an RDRAM interface than a DDR interface of similar bandwidth, but you need a great deal more shielding and power/ground pins, so the pin count/packaging cost benefit isn't really that big at all.

OTOH, the latency for which RDRAM is so maligned, while important for CPUs, should be pretty much a non-issue with modern GPUs and their latency masking mechanisms. I would expect RDRAM to be in the same ballpark as DDR2 as far as latency is concerned. (For both DDR2 and RDRAM, there is substantial latency overhead in crossing clock domains between core and I/O, and neither supports critical-word-first bursts - DDR1 has neither of these problems)
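
One way to see why latency masking makes this a non-issue for GPUs: by Little's law, the number of memory requests that must be kept in flight to saturate the bus is bandwidth times latency. A toy calculation, with all three numbers invented for illustration:

```python
# Little's law sketch: in-flight requests needed to hide memory latency.
# All three inputs are illustrative assumptions, not from this thread.
bandwidth_bytes_per_ns = 32   # assumed ~32 GB/s of peak memory bandwidth
latency_ns = 200              # assumed round-trip DRAM latency
access_bytes = 32             # assumed size of one memory request

in_flight = bandwidth_bytes_per_ns * latency_ns / access_bytes
print(f"~{in_flight:.0f} requests in flight keep the bus saturated")
```

A deeply pipelined GPU can keep that many accesses outstanding; a CPU stalling on a dependent load cannot, which is why the same latency hurts one and not the other.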
 
IMO, Rambus is a company which should not be given money at all. I don't care how fabulous their latest tech is.
 