I've been reading all the reviews of the 7800 GTX, and while its performance is very good, it seems unable to handle HDR+AA, just like the 6800 couldn't, not even in SLI mode.
I've been hearing that this has mostly to do with the massive memory bandwidth requirements and the need for compression formats for all data types. So far, the only graphics chip that seems able to pull it off to some basic degree (FP16 HDR + MSAA) is the Xenos in the Xbox 360, since the link between the GPU and its eDRAM is so fast, at 256 GB/s of memory bandwidth.
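To get a rough sense of why the bandwidth matters so much, here's a back-of-envelope calculation. The resolution, overdraw, and read-modify-write figures are my own illustrative assumptions, not numbers from any review:

```python
# Rough estimate of framebuffer traffic for FP16 HDR + 4x MSAA
# without any color/Z compression. All inputs are assumptions.
width, height = 1280, 720      # assumed render resolution
fp16_color_bytes = 8           # FP16 RGBA = 64 bits per sample
depth_bytes = 4                # 32-bit depth/stencil per sample
msaa_samples = 4               # 4x multisampling
overdraw = 3                   # assumed average overdraw
rmw = 2                        # blending reads then writes the buffer
fps = 60

bytes_per_frame = (width * height * msaa_samples
                   * (fp16_color_bytes + depth_bytes)
                   * overdraw * rmw)
gb_per_sec = bytes_per_frame * fps / 1e9
print(f"~{gb_per_sec:.0f} GB/s of framebuffer traffic")  # ~16 GB/s
```

Even with these modest assumptions, that's a large chunk of the 7800 GTX's roughly 38.4 GB/s of total memory bandwidth eaten by the framebuffer alone, which is why compression for every data type (or Xenos-style eDRAM) seems to be needed.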
So the question is: since ATI did design the Xenos, could it have found a way to implement this in the R520 as well, without having to resort to eDRAM? I see that as a major advantage if it has that capability, even with the delays the R520 has suffered, and it wouldn't even matter if Nvidia released a faster-clocked 7800 Ultra right after the R520's release.