Will XDR/2 ever be used?

ninelven

Just something I've been wondering about for a while...
Some of the advantages I see are bandwidth per bus width (pin), raw bandwidth, latency, and easier board layout / cheaper boards to produce.

I'm guessing the main disadvantage is cost and/or supply. Any other disadvantages, or reasons why one wouldn't use XDR with a GPU?
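To put rough numbers on the bandwidth-per-pin point, here's a back-of-the-envelope sketch in Python. The per-pin rates are assumptions: roughly 3.2 Gbps for first-generation XDR (octal data rate on a 400 MHz clock, as in the PS3) versus roughly 1.6 Gbps for GDDR3 of the same era.

# Peak bandwidth = per-pin data rate x bus width, divided by 8 bits/byte.
def peak_bandwidth_gbs(per_pin_gbps, bus_width_bits):
    return per_pin_gbps * bus_width_bits / 8.0

# Assumed per-pin rates: ~3.2 Gbps XDR vs ~1.6 Gbps GDDR3 of the era.
print(peak_bandwidth_gbs(3.2, 128))  # XDR,   128-bit bus -> 51.2 GB/s
print(peak_bandwidth_gbs(1.6, 256))  # GDDR3, 256-bit bus -> 51.2 GB/s

In other words, XDR could hit the same peak bandwidth from half the bus width, which is where the easier-layout / cheaper-board argument comes from.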
 
NEVAH!!!!!!!!

No, wait... maybe.

But since both nVidia & ATi already have large shipments of GDDR3 on order, XDR probably won't be used until late in the R580 / R600 time period.
 
The question still remains, though: is there a possibility that XDR will ever be used in current or next-gen GPUs? Although I have yet to see any information pointing towards its use (other than some XDR in the PS3), I do wonder why it hasn't been used or considered a possibility in PC GPUs. I know that cost and availability are among the main reasons companies stick with GDDR3 and the upcoming GDDR4.

Maybe it's just too cost-prohibitive, or fear of dealing with Rambus? (I believe they are the patent holders?) I'm not sure, but wouldn't there also be a need to make changes (maybe minor) to the memory controller to make full use of it? I thought I read somewhere about XDR latency issues being a problem for use with current GPUs as well.

Guess time will tell :)
 
The days of Rambus-phobia are over. But today's memory is good enough, so why bother implementing its special controller? Of all the IHVs, I know only of XGI working closely with Rambus; they "outsourced" their PCIe bridge to them.
 
I don't know, especially since XDR2, if it reaches its 8 GHz clock, would deliver upwards of 200 GB/s on a 256-bit bus, and people are always complaining that they don't make wider buses because they don't shrink well on newer processes.
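For what it's worth, the arithmetic behind that claim checks out, taking the 8 GHz (8 Gbps per pin) figure as the assumption:

# Hypothetical XDR2 at 8 Gbps per pin on a 256-bit bus.
per_pin_gbps = 8.0
bus_width_bits = 256
print(per_pin_gbps * bus_width_bits / 8.0)  # -> 256.0 GB/s

So "upwards of 200 GB/s" is, if anything, conservative under those assumptions.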
 