RDRAM inside the N64

msxyz

Following the purchase of an almost immaculate N64, I've renewed my interest in this little console. We know a lot about the CPU and the RCP; the Rambus memories in it, however, remain a rather undeveloped subject. Even in the retro-modding forums, all the info I could gather is that they are not tolerant of overclocking and that, apparently, the N64 can 'recognize' up to 16 MB automatically at boot.

Apparently, Nintendo used an early version of Rambus RDRAM made by NEC in the N64: the uPD488170L. It's a 16+2 Mbit chip with a 9-bit datapath on a 250MHz double data rate bus. There were two of these chips in an N64 and they were daisy-chained. (On a side note, I found out that there was an even earlier version of this chip with a 5V supply voltage, produced in 1992!) 16/18Mbit RDRAM was all you could buy in 1996/97, and these chips were also used in some VGA cards of the period (Cirrus Logic Laguna, a version of the Chromatic Research MPACT!, SGI workstation cards...).
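To put those figures in perspective, here's the back-of-the-envelope math (my own split between the 8 data bits and the 9th bit, not from any datasheet):

```c
/* Rough bandwidth calculation for the N64's Rambus channel.
 * Assumptions (mine): 250 MHz clock, data on both edges (DDR),
 * 9-bit datapath of which 8 bits are counted as "data". */
#include <stdio.h>

int main(void)
{
    const double clock_mhz    = 250.0;                     /* channel clock          */
    const double transfers_mt = clock_mhz * 2.0;           /* DDR -> 500 MT/s        */
    const double raw_mb_s     = transfers_mt * 9.0 / 8.0;  /* all 9 bits: 562.5 MB/s */
    const double data_mb_s    = transfers_mt * 8.0 / 8.0;  /* 8 data bits: 500 MB/s  */

    printf("raw channel bandwidth:    %.1f MB/s\n", raw_mb_s);
    printf("usable (8-bit) bandwidth: %.1f MB/s\n", data_mb_s);
    return 0;
}
```

That 500 MB/s figure is the one that comes up again later in this thread.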

Some N64 units had these chips marked as "Nintendo RAMBUS18-NUS", but there are also motherboards with chips bearing the true manufacturer's name (NEC) and product code.

Sometime in 1998, Nintendo switched to a single 32/36 Mbit chip (the extra 'Mbits' are available for parity). One of these chips was also included in the 'Expansion Pak', the little plastic cartridge that was required to use 'hi-res' graphics (640x480 vs 320x240) in certain games. Unlike the early 16/18Mbit memories, there is almost no info about this chip. What I could gather is that these appear to be tolerant of overclocking, possibly being PC600 class. All the major RDRAM manufacturers started to supply PC600 64/72Mbit chips made on a 0.25u process in 1998, but I couldn't find a single source or datasheet on a 32/36Mbit variety.

These chips still show up from time to time from Chinese sources with Nintendo markings. Are they leftovers or removed from boards? The fact that some surplus stock still exists after all these years is fascinating... Anyway, some people have successfully replaced the two 16/18Mbit chips inside the early units with a pair of these to get an 8MB console. Adding an Expansion Pak brings the total amount of memory to 12MB, and the memory controller is still able to recognize all of it (some say up to 16MB is recognized by using certain Expansion Paks with room for two chips).

Of course no game uses more than 8MB at a time, but it's a nice discovery that the N64 could have been developed into an even more powerful console. Imagine if it had launched with 16MB of RAM! Cartridge space would still have been a problem, but the extra memory could have been used for procedurally generated content (e.g. pre-computed animations... the geometry processor inside the RCP sometimes was not able to cope with everything being thrown at it).
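For reference, the capacity math behind the configurations mentioned above (treating the 9th bit as not user-visible, which is my assumption):

```c
/* Memory totals for the N64 configurations discussed in this thread. */
#include <stdio.h>

int main(void)
{
    const int mb_per_18mbit_chip = 2;   /* 16+2 Mbit (2M x 9) -> 2 MB usable */
    const int mb_per_36mbit_chip = 4;   /* 32+4 Mbit (4M x 9) -> 4 MB usable */

    printf("stock N64 (2 x 16/18Mbit):       %d MB\n", 2 * mb_per_18mbit_chip);
    printf("stock + Expansion Pak:           %d MB\n", 2 * mb_per_18mbit_chip + mb_per_36mbit_chip);
    printf("modded (2 x 32/36Mbit on board): %d MB\n", 2 * mb_per_36mbit_chip);
    printf("modded + Expansion Pak:          %d MB\n", 2 * mb_per_36mbit_chip + mb_per_36mbit_chip);
    return 0;
}
```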

By the way, if somebody finds a datasheet for this chip, it would be nice if they could share it with me! They are marked RDRAM36-NUS 9937KU653, but this is just a Nintendo internal code, so it's not possible to determine the true manufacturer.
 
These chips still show up from time to time from Chinese sources with Nintendo markings. Are they leftovers or removed from boards?
It's probably surplus inventory from the manufacture of the iQue Player, a - IIRC - China-only console using the guts of the N64 together with flash carts for games and - I believe - hooking up to a TV for playing games.

Interesting sidenote: I had an MPact II-based VGA card with 8MB of 600MHz DRDRAM (two chips.) It was pretty abysmal... I bought it to watch DVDs with since it featured hardware decoding (well, software really, running on the onboard processor), but the DVD player would instantly bluescreen when the card was used in my PC, a socket-7/AMD K6-3 AGP-equipped board. VIA chipsets were so unbelievably shit... :p

3D acceleration was also very hit and miss. Some games ran okay, most did not, and those that did run generally performed very poorly. I did play through the PC version of Final Fantasy 7 on this board. World map chugged at times and some battle animations were jerky, but as framerate wasn't important to winning it did not matter. ;)
 
MPACT is an almost general-purpose VLIW processor that they wanted to do everything with, and of course it wasn't great at everything. Writing all of that support software was an impossible burden, too. It's a 3D, GUI-accelerating, DVD-processing VGA sound chip. Heh. I think it is great at MPEG2 though.

Cirrus Logic's Laguna 3D was even more useless.

N64's RDRAM had gobs of bandwidth for the time but the memory controller was pretty bad and access latency crippling.
 
It's probably surplus inventory from the manufacture of the iQue Player, a - IIRC - China-only console using the guts of the N64 together with flash carts for games and - I believe - hooking up to a TV for playing games.

Interesting sidenote: I had an MPact II-based VGA card with 8MB of 600MHz DRDRAM (two chips.) It was pretty abysmal... I bought it to watch DVDs with since it featured hardware decoding (well, software really, running on the onboard processor), but the DVD player would instantly bluescreen when the card was used in my PC, a socket-7/AMD K6-3 AGP-equipped board. VIA chipsets were so unbelievably shit... :p

3D acceleration was also very hit and miss. Some games ran okay, most did not, and those that did run generally performed very poorly. I did play through the PC version of Final Fantasy 7 on this board. World map chugged at times and some battle animations were jerky, but as framerate wasn't important to winning it did not matter. ;)
Do you still have it? A VGA card with MPACT! must be pretty rare nowadays...

MPACT is an almost general-purpose VLIW processor that they wanted to do everything with, and of course it wasn't great at everything. Writing all of that support software was an impossible burden, too. It's a 3D, GUI-accelerating, DVD-processing VGA sound chip. Heh. I think it is great at MPEG2 though.

Cirrus Logic's Laguna 3D was even more useless.

N64's RDRAM had gobs of bandwidth for the time but the memory controller was pretty bad and access latency crippling.
Rambus also designed a memory controller using standard cell libraries to facilitate adoption across a broad range of devices and different fabs. I don't know if SGI, to save time, used this so-called 'Rambus cell' in the RCP, resulting in a less than optimal arrangement. On top of that, the VR4300 was a low-cost design with a shared address and data bus and had to fight with the graphics processor for bandwidth. Rambus's design, at least for the time, was elegant: 13 signal lines were all that was required for a 500MB/s bus. Owing to that, and to the integration of all functions into the RCP, the N64 motherboard had only two layers. RDRAM was a competitive design in the second half of the '90s, and it could have played a bigger role in applications where high bandwidth and low complexity are a plus.
 
Do you still have it? A VGA card with MPACT! must be pretty rare nowadays...
Nawp. Threw it away years ago. Well, sent it to recycling actually... I have no room for old crud like that, and no motherboard with an AGP slot to house it in anyhow. Its driver only works with Windows 98, possibly ME IIRC, and like I said, it was slow as hell in 3D. A 64-bit Rage128 Pro board wiped the floor with it, and I was actually dumb enough to buy a card with a crippled, half-width bus... :)

RDRAM was a competitive design in the second half of the '90s, and it could have played a bigger role in applications where high bandwidth and low complexity are a plus.
It had high bandwidth on paper, but its ludicrously high latency destroyed any chance of actually using it in a real-world scenario, especially in a unified memory architecture like the N64's, where you have a large number of memory accesses in flight at any one time. I don't think I've seen any comprehensive technical explanation of why that is, but I doubt the memory controller was the real reason. IIRC, RDRAM had a substantial burst length, and - I believe - also multiplexed data, addresses and commands over the same pins, both of which probably contributed to this problem.

In fact, memory performance was so bad on N64 that some games would not use Z-buffering to keep up framerates... :p
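To make the "high bandwidth on paper" point concrete, here's a crude model of how a fixed per-access latency eats into peak bandwidth. The latency and burst figures are illustrative assumptions on my part, not measurements of the real N64 memory system:

```c
/* Crude illustration of per-access latency eroding peak bandwidth.
 * All numbers are illustrative assumptions, not measured N64 values. */
#include <stdio.h>

int main(void)
{
    const double peak_mb_s   = 500.0;   /* paper bandwidth                 */
    const double latency_ns  = 640.0;   /* assumed fixed cost per access   */
    const double burst_bytes = 64.0;    /* assumed burst length            */

    const double burst_ns = burst_bytes / peak_mb_s * 1000.0;  /* time to move the burst */
    const double eff_mb_s = burst_bytes / (latency_ns + burst_ns) * 1000.0;

    printf("burst transfer time: %.1f ns\n", burst_ns);
    printf("effective bandwidth: %.1f MB/s (%.0f%% of peak)\n",
           eff_mb_s, 100.0 * eff_mb_s / peak_mb_s);
    return 0;
}
```

With those assumed numbers you end up well under a fifth of the peak figure, which is the general shape of the problem being described here.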
 
It had high bandwidth on paper, but its ludicrously high latency destroyed any chance of actually using it in a real-world scenario, especially in a unified memory architecture like the N64's, where you have a large number of memory accesses in flight at any one time. I don't think I've seen any comprehensive technical explanation of why that is, but I doubt the memory controller was the real reason. IIRC, RDRAM had a substantial burst length, and - I believe - also multiplexed data, addresses and commands over the same pins, both of which probably contributed to this problem.

In fact, memory performance was so bad on N64 that some games would not use Z-buffering to keep up framerates... :p
I've read several Rambus datasheets and there's nothing on paper that points to above-average latency. In fact, Direct Rambus (the first type of RDRAM commercialized, used in SGI workstations and in the N64) includes separate RAS and CAS resources. Theoretically, a framebuffer write and a z-bank activation for the next fragment (located somewhere outside the currently active page) could be performed almost simultaneously, reducing the typical DRAM turn-around time. Of course, this kind of overlapped operation requires having your I/O logic designed around Rambus's unique features. If you simply reuse a circuit designed to operate with SDRAMs or SGRAMs and slap a Rambus interface on top of it (most likely what Intel did with its 820 chipset :devilish: ), you're not going to use it to its fullest, and you're possibly making things worse by adding an extra layer of logic.
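A toy sketch of the overlap described above, with made-up cycle counts purely to show the shape of the saving (these are not real Rambus timings):

```c
/* Serialized vs. overlapped bank operations, with placeholder cycle
 * counts. Illustrative only; not taken from any Rambus datasheet. */
#include <stdio.h>

int main(void)
{
    const int write_cycles    = 8;  /* framebuffer write burst (assumed)        */
    const int activate_cycles = 6;  /* row activation on another bank (assumed) */

    /* Controller that serializes everything: */
    int serialized = write_cycles + activate_cycles;

    /* Controller that starts the activation while the write drains,
     * as separate RAS/CAS resources would allow: */
    int overlapped = (write_cycles > activate_cycles) ? write_cycles
                                                      : activate_cycles;

    printf("serialized: %d cycles\n", serialized);
    printf("overlapped: %d cycles\n", overlapped);
    return 0;
}
```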

I don't have direct experience with Rambus but, at least on paper, it looked like a sound solution... Find me another bus in 1992 that could move data at up to 500MB/s using 13 signal lines (20 if you count shared voltage references and grounds). Of course it's a byte-serial interface running at 250MHz double data rate (I believe the first chips were made on a 0.65u node while those manufactured by NEC for the N64 are 0.45u), so it might have been more complex to implement at the time. Not that designing a wide SDRAM interface is any easier. Unlike classic page-mode DRAM, which is a 'dumb' array of cells that can be used in many 'creative' ways (e.g. the Atari ST, the Amiga and all those early shared-memory, interleaved designs), SDRAM has a lot of control logic onboard. The real advantages of SDRAM are probably more relaxed timing and a less critical operating frequency (when did the first 4ns, 250MHz DDR chips arrive on the market? I think a whole 8 years after Toshiba and NEC sampled the first 250MHz RDRAM), a separate address bus (though the pincount and number of board layers go up...) and of course a more industry-friendly approach :devilish:.
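For what it's worth, a rough per-pin comparison. The Rambus numbers are the ones quoted above; the SDRAM side is my own assumption of a typical 64-bit, 100MHz interface with a handful of address and control lines, so take it as a sketch rather than a precise pin count:

```c
/* Per-signal-line bandwidth: Rambus channel vs. an assumed wide SDRAM bus. */
#include <stdio.h>

int main(void)
{
    /* Rambus channel, figures from this thread */
    const double rdram_mb_s  = 500.0;
    const double rdram_lines = 13.0;

    /* Assumed 64-bit, 100 MHz SDRAM bus: 64 data + ~14 address + ~10 control */
    const double sdram_mb_s  = 100.0 * 64.0 / 8.0;   /* 800 MB/s */
    const double sdram_lines = 64.0 + 14.0 + 10.0;

    printf("RDRAM: %.1f MB/s per signal line\n", rdram_mb_s / rdram_lines);  /* ~38 */
    printf("SDRAM: %.1f MB/s per signal line\n", sdram_mb_s / sdram_lines);  /* ~9  */
    return 0;
}
```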
 
Direct Rambus isn't what you have in the N64. It's what the PS2 and PCs used, tho.

Can't really recall when DDR was first introduced, but it was some time after DRDRAM had already crashed and burned on the PC, thus after the first-generation Pentium 4 came out. I suspect Wikipedia could help out here, but I'm too uninterested to look it up TBH. I'm not much of a hardware nostalgia guy - well, not about PCs anyway. I mostly see old PCs as obsolete crap. :p Now, classic 80s home computers or arcade games tho - now THAT is a different matter entirely! :D
 
Direct Rambus isn't what you have in the N64. It's what the PS2 and PCs used, tho.

Can't really recall when DDR was first introduced, but it was some time after DRDRAM had already crashed and burned on the PC, thus after the first-generation Pentium 4 came out. I suspect Wikipedia could help out here, but I'm too uninterested to look it up TBH. I'm not much of a hardware nostalgia guy - well, not about PCs anyway. I mostly see old PCs as obsolete crap. :p Now, classic 80s home computers or arcade games tho - now THAT is a different matter entirely! :D
You're right. Direct Rambus was the third iteration after Rambus (no other suffix) and Concurrent Rambus :oops:

I have a copy of an NEC datasheet dated October 1992 where the memory is simply called Rambus. I believe the third generation (Direct) widened the bus to 16/18 bits, while the first standard (used in the N64) stayed on an 8/9-bit bus.
 
After their bad experience with Rambus RAM, Nintendo later went with 1T-SRAM, which was a great improvement. They should keep it for the NX.
 
1T-SRAM is obsolete tech today. The company behind it went bankrupt ages ago IIRC. DDR4, or GDDR5 in particular, is vastly faster than 1T-SRAM ever was.
 
1T-SRAM is obsolete tech today. The company behind it went bankrupt ages ago IIRC. DDR4, or GDDR5 in particular, is vastly faster than 1T-SRAM ever was.
MoSys are still in business and are still licensing their technology out. And I'm sure the modern implementation of 1T-SRAM is still more efficient than GDDR5. The Xbox One uses a 6T-SRAM variant as super-fast embedded SRAM to make up for its slower general-purpose RAM compared to the PS4. So the technology is not quite obsolete. SRAM still has its advantages.
 
And I'm sure the modern implementation of 1T-SRAM is still more efficient than GDDR5.
Quite possibly. It doesn't matter if it's more efficient, though, if it can't deliver the data rates needed for modern high-power GPUs at a price that is affordable. 1T-SRAM isn't SRAM of course, but DRAM with some SRAM buffers and some clever prefetching and whatnot attached to it, which makes the die larger and more expensive than regular DRAM. The tech used in the Wii is hopelessly behind the times at this stage (the DRAM ran at a few hundred MHz data rate). Something that can compete with the 5.5GHz+ GDDR5 used in the PS4 might not even exist.
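For scale, here's what that GDDR5 figure works out to. The 256-bit bus width is my addition (not stated above), so treat this as a sketch:

```c
/* Bandwidth of 5.5 GT/s GDDR5 on an assumed 256-bit bus. */
#include <stdio.h>

int main(void)
{
    const double data_rate_gt_s = 5.5;     /* GT/s per pin              */
    const double bus_width_bits = 256.0;   /* assumed PS4-style bus     */

    const double gb_s = data_rate_gt_s * bus_width_bits / 8.0;  /* ~176 GB/s */
    printf("GDDR5 bandwidth: %.0f GB/s\n", gb_s);
    return 0;
}
```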
 