While we know very little about the RSX and Xenos chips (features and real-world performance), the few tantalizing details we have gleaned have me wondering: Why did Sony not partner with ATI? What technologies did Nvidia offer that made an offering from ATI less enticing? Was it a matter of technology, cost, strategy, or something else?
I ask this question because two features of Xenos seem, at least on the surface, like "perfect matches" for CELL.
1. Unified Shaders. The benefit is flexibility. The GPU could be treated as a traditional GPU, or an ambitious developer could offload more of the vertex work to the SPEs and use the GPU for pixel shading--without leaving any of the GPU's shader units idle. So games that use the SPEs for heavy physics, AI, and other game-related tasks would still have the power of a next-gen GPU, while developers looking for a lot of eye candy could dedicate the SPEs to vertex work and use the GPU's shader units for pixel shading.
2. eDRAM. The benefit is dedicated bandwidth for the backbuffer, leaving the main memory pool's bandwidth to be utilized by CELL and for the transfer of textures and large meshes. With the ambitious goal of 1080p, anti-aliasing, HDR, and other bandwidth-intensive framebuffer operations are going to consume a lot of system bandwidth. Trading some die area for logic that guarantees high levels of AA and HDR would pair very well with 1080p. (This may very well be possible with the RSX without eDRAM, but obviously eDRAM would not hurt... I am also assuming that the eDRAM implementation in Xenos will do what ATI set out to accomplish.)
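To put some rough numbers behind that bandwidth worry, here is a back-of-envelope sketch. All of the figures (FP16 HDR colour, 4x MSAA, overdraw of 3, 60 fps) are my own illustrative assumptions, not official specs for either console, but they show the scale of framebuffer traffic a 1080p target could generate:

```python
# Rough framebuffer-bandwidth estimate for a 1080p HDR + MSAA target.
# All figures below are assumptions for illustration, not RSX/Xenos specs.

width, height = 1920, 1080
bytes_color = 8        # assumed FP16 RGBA colour buffer
bytes_depth = 4        # assumed 24-bit depth + 8-bit stencil
msaa = 4               # assumed 4x multisampling
overdraw = 3           # assumed average overdraw per pixel
fps = 60               # assumed frame rate target

pixels = width * height
# Each covered sample costs a colour write plus a depth read and a depth write.
bytes_per_frame = pixels * msaa * overdraw * (bytes_color + 2 * bytes_depth)
bandwidth_gbs = bytes_per_frame * fps / 1e9

print(f"~{bandwidth_gbs:.0f} GB/s of framebuffer traffic")
```

With these assumptions that works out to roughly 24 GB/s of pure framebuffer traffic--on the order of an entire 128-bit GDDR3 bus of that era--before CELL, textures, or vertex data touch the memory pool at all, which is exactly the traffic the eDRAM keeps on-chip.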
So I ask: What did Nvidia bring to the table that Sony found more enticing than ATI? Any ideas?