Will 256-bit Rambus XDR interface be viable this autumn?

EasyRaider

Longer, more math-heavy shaders will reduce the need for bandwidth, but I expect the move to FP16 back buffers will more or less cancel that out in the short term.

The general consensus among the more educated people seems to be that we won't see a 512-bit bus for a long time. But what about 256-bit XDR? That would give >100 GB/s of external bandwidth... A nice surprise from NVidia, perhaps?

On a related note, I'm thinking a tile-based deferred renderer would make more sense now than ever. Soon we will want to play games with an FP16 back buffer and 4x AA or higher. On-chip depth buffer, on-chip FP16 blending, bandwidth-free AA... the bandwidth savings are gigantic, and any improvement in overdraw efficiency or depth/stencil fillrate is just a bonus. Or is bandwidth usage for textures considerably larger than for the frame buffer? I don't think that's the case (assuming most textures are compressed).
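
A quick back-of-envelope sketch in Python to get a feel for the magnitudes (resolution, frame rate, overdraw and the read/write counting below are all assumptions picked for illustration, not measured figures):

# Rough frame-buffer bandwidth estimate: immediate-mode renderer vs. a
# tile-based deferred renderer (TBDR). All inputs are assumptions.
width, height = 1600, 1200   # assumed resolution
fps           = 60           # assumed frame rate
overdraw      = 3.0          # assumed average overdraw
aa_samples    = 4            # 4x multisampling
fp16_colour   = 8            # bytes per FP16 RGBA sample
depth         = 4            # bytes per depth/stencil sample

pixels = width * height

# Immediate-mode: colour and depth traffic goes to external memory for
# every sample of every overdrawn pixel (one read + one write each).
imr_bytes = pixels * aa_samples * overdraw * (2 * fp16_colour + 2 * depth)

# TBDR: depth, blending and AA resolve stay on chip; only the final
# resolved FP16 frame buffer is written out once per frame.
tbdr_bytes = pixels * fp16_colour

print(f"IMR  frame buffer traffic: {imr_bytes * fps / 1e9:6.1f} GB/s")
print(f"TBDR frame buffer traffic: {tbdr_bytes * fps / 1e9:6.1f} GB/s")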
 
The more interesting question to me is why *wouldn't* we see an XDR board from NV? Both IHVs (hell, companies in general) love to leverage their existing work as much as possible, and as far as I can tell, if 512-bit really is out of the question then XDR is all that's left for NV anyway. And if you already have it...
 
Since XDR uses differential signalling, the data signals alone for a 256-bit memory interface would need over five hundred balls on the chip package. Then add the command bus, power, ground and other miscellaneous stuff... it might be over 2000 pins in total for the whole GPU. Not impossible, certainly, but probably uncomfortable from a manufacturing point of view.
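
Rough pin-count arithmetic in Python (the command/address, misc I/O and power/ground figures are my guesses, not Rambus numbers):

# Back-of-envelope package ball count for a hypothetical 256-bit XDR
# interface. XDR data lines are differential, so each data bit needs a
# pair of balls. Everything except the data-pair count is an assumption.
data_bits       = 256
data_balls      = data_bits * 2      # differential pairs -> 512 balls
cmd_addr_balls  = 100                # assumed request/address bus, clocks
misc_io_balls   = 400                # assumed PCIe, display outputs, straps...
power_gnd_ratio = 1.0                # assume ~1 power/ground ball per signal ball

signal_balls = data_balls + cmd_addr_balls + misc_io_balls
total_balls  = int(signal_balls * (1 + power_gnd_ratio))

print(f"data balls:   {data_balls}")     # 512
print(f"signal balls: {signal_balls}")   # 1012
print(f"total balls:  ~{total_balls}")   # ~2024, i.e. over 2000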
 
I see.
(edit)
I suspected wide XDR interfaces were impractical; otherwise I'm sure Sony would have fed Cell with a 128-bit interface.
 
Seems to me a 128-bit XDR interface is much more feasible; at least it allows building cards with less than 512MiB. The bandwidth advantage over GDDR3 of course wouldn't be as big, but still significant. Maybe even a nice lead over GDDR4.
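
Rough peak-bandwidth comparison in Python (the per-pin data rates are assumptions: 3.2 Gbps for XDR, the low end of what Rambus quotes, and 1.2 Gbps for fast GDDR3 of this generation):

# Peak bandwidth for a few bus width / per-pin data rate combinations.
def peak_bw_gbs(bus_bits, gbps_per_pin):
    """Peak bandwidth in GB/s for the given bus width and per-pin rate."""
    return bus_bits * gbps_per_pin / 8

print(f"128-bit XDR   @ 3.2 Gbps/pin: {peak_bw_gbs(128, 3.2):5.1f} GB/s")  # 51.2
print(f"256-bit GDDR3 @ 1.2 Gbps/pin: {peak_bw_gbs(256, 1.2):5.1f} GB/s")  # 38.4
print(f"256-bit XDR   @ 3.2 Gbps/pin: {peak_bw_gbs(256, 3.2):5.1f} GB/s")  # 102.4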
 
Xmas said:
Seems to me a 128-bit XDR interface is much more feasible; at least it allows building cards with less than 512MiB. The bandwidth advantage over GDDR3 of course wouldn't be as big, but still significant. Maybe even a nice lead over GDDR4.

I'd like to see how they'd market that.
 
What's the width of a single XDR channel, or is this configurable?

Also, as far as RAM technology goes, we should be seeing FBDIMMs in the not-so-distant future. Does anyone know how these compare to the XDR solutions in terms of controller bit width, bandwidth, and price?

What I'm really waiting for is this:
http://research.sun.com/sunlabsday/docs/talks/1.02_Drost.pdf
using overlapped GPU/memory chip sandwiches, and fancy new Floating Body RAM.
 
afaik FBDIMMs are more of a server technology, and not terribly suitable for the ultra-high bandwidth needs of a graphics card.
 
elroy said:
Xmas said:
Seems to me a 128-bit XDR interface is much more feasible; at least it allows building cards with less than 512MiB. The bandwidth advantage over GDDR3 of course wouldn't be as big, but still significant. Maybe even a nice lead over GDDR4.
I'd like to see how they'd market that.
The same way they marketed the 6600GT vs the 9800Pro? Put out some benchmarks where their card with a 128-bit bus demolished the competition's 256-bit one? I don't see any problem with this.
 
I thought that FBDIMMs replaced the wide bus to the chipset/CPU with a narrow but high-speed bus (fairly similar to RDRAM, except that it uses commodity RAM in conjunction with a buffer chip and is therefore cheaper)?
 
PS3 is rumoured to be using a 256-bit XDR Rambus interface along with the NVIDIA GPU, so.. You never know what NVIDIA has up their sleeve.
 
tahrikmili said:
PS3 is rumoured to be using a 256-bit XDR Rambus interface along with the NVIDIA GPU, so.. You never know what NVIDIA has up their sleeve.
I highly doubt PS3 will have 512MiB of video memory. Neither Elpida, Samsung nor Toshiba seems to be producing 8Mx32 XDR DRAMs, so anything less than 512MiB on a 256-bit interface is not possible.
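
The capacity arithmetic in Python (assuming x32 devices and that 512 Mbit is the smallest XDR density actually shipping; treat both as assumptions):

# Minimum memory on a 256-bit XDR interface if every device contributes
# 32 data bits. Densities listed are assumptions based on what seems to
# be in production.
bus_bits        = 256
bits_per_device = 32
devices         = bus_bits // bits_per_device     # 8 devices on the bus

for name, megabits in [("8Mx32 (256 Mbit, apparently not produced)", 256),
                       ("16Mx32 (512 Mbit)", 512)]:
    total_mib = devices * megabits // 8           # Mbit per device -> total MiB
    print(f"{name}: {devices} devices -> {total_mib} MiB")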
 
tahrikmili said:
PS3 is rumoured to be using a 256-bit XDR Rambus interface along with the NVIDIA GPU, so.. You never know what NVIDIA has up their sleeve.

I thought the PS3 would have 512MB of memory (a miserable 256MB is still possible) connected only to the Cell's XDR controller, with ridiculously huge bandwidth between the Cell and the GPU via the FlexIO link (27GB/s or whatever).
 
Xmas said:
DegustatoR said:
Xmas said:
I highly doubt PS3 will have 512MiB of video memory.
Why? I think 512MB of RAM is a must for a console that will live until 2010...
Main memory probably, but video memory?

Unified memory with an eDRAM framebuffer cache, as on the GameCube and Xbox 2 (and the Xbox 1 without the eDRAM part; it has what became the nForce, good enough for sharing all the bandwidth with a lowly Celeron 733 :) )

Notice how consoles had, and will have, unified memory and an FSB/FlexIO/whatever with bandwidth on the same order as the high-end PC GPUs of their time. PCs can only do integrated graphics that suck, with their slow RAM and AGP/PCIe. (That's normal; a console is a closed, embedded system, and a PC is the opposite.)
 
Hm, a Cell chip is reported to have a 64-bit XDR interface and a FlexIO interface with much higher bandwidth. What kind of topology is Sony going to use?
 
Rambus says the XDR data rate will range from 3.2GHz to 8GHz, which gives 25.6 to 64 GB/s on a 64-bit controller, so that should be enough for the CPU + GPU.
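
Spelling that out in Python (the intermediate speed grades are just illustrative points in the quoted range, not announced parts):

# GB/s on a 64-bit XDR controller across the data-rate range Rambus quotes.
bus_bits = 64
for gbps_per_pin in (3.2, 4.0, 4.8, 6.4, 8.0):
    print(f"{gbps_per_pin:3.1f} Gbps/pin -> {bus_bits * gbps_per_pin / 8:4.1f} GB/s")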

It's a bit like cheap dual-Opteron boards with 128-bit DDR connected to CPU 0, and CPU 1 accessing memory through the HT link to the first CPU.
 
Of course, "3.2 to 8.0" actually means 3.2GHz :D

http://www.realworldtech.com/page.cfm?ArticleID=RWT021005084318

FlexIO is 44.8GB/s outbound and 32GB/s inbound, but I guess we shouldn't care about the inbound part (I doubt any game will use the GPU for general-purpose tasks), and some of this bandwidth is meant for coherent links to other Cells, more likely for big workstations and supercomputing than for the PS3.

For a console, 25.2GB/s ought to be enough for everybody? :)


/edit : but that's for the 90nm Cell CPU of today
 