This makes me wonder quite often how the PS3 could compete. Its GPU was less powerful, it didn't have unified memory, less memory was available (also because the OS had a larger footprint), and it didn't have the advantage of EDRAM. And memory is something you never have enough of. So the PS3 should clearly have been hitting memory bottlenecks, and its GPU couldn't compete.

NV2A was good, but the original Xbox CPU was a low-clocked 733 MHz P3-based Celeron. Intel had already released a 2.0 GHz P4 and AMD a 1.4 GHz Athlon Thunderbird, both over 2x faster than the Xbox CPU.
The Xbox 360 GPU (Xenos) was ahead of PC GPUs at launch, both in features and in performance. It was also ahead of the PS3 GPU that launched one year later. Its memory bandwidth wasn't that great compared to PC GPUs, but EDRAM meant that render target writes (and alpha blending) consumed no main memory bandwidth at all. In complex scenes (lots of overdraw or blending) it effectively had far more usable bandwidth than PC GPUs.

The Xbox 360 CPU (Xenon) wasn't that bad either: 3 cores / 6 threads at 3.2 GHz, with full-rate 4-wide SIMD multiply+add (FMA) among other goodies. But it was an in-order CPU, and PCs had out-of-order CPUs that were significantly better at running generic code.

Still, I would say the Xbox 360 was one of the only recent consoles that had an advantage (albeit a pretty small one) over PC hardware of the same time. But the GeForce 8800 GTX was released one year later and was significantly ahead of the Xbox 360 GPU, in both performance and features (it was the first GPU with compute shader support).
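To put those two claims in rough numbers, here is a back-of-envelope sketch. The 22.4 GB/s main memory figure and the Xenon specs are the commonly cited ones; the workload (720p, 4x MSAA, 4x average overdraw with blending) is an assumption picked purely for illustration:

```python
# Traffic a blended render target would generate WITHOUT EDRAM, i.e. the
# bandwidth the Xbox 360's EDRAM absorbs instead of the main memory bus.
width, height, fps = 1280, 720, 60   # assumed 720p @ 60 fps workload
msaa = 4                             # assumed 4x MSAA (per-sample traffic)
overdraw = 4                         # assumed average overdraw/blend layers
bytes_color = 4 + 4                  # 32-bit color: blend = read + write
bytes_depth = 4 + 4                  # 32-bit depth: test = read + write

traffic_gbs = (width * height * msaa * overdraw
               * (bytes_color + bytes_depth) * fps) / 1e9
main_mem_gbs = 22.4                  # commonly cited GDDR3 bus bandwidth

print(f"render-target traffic: {traffic_gbs:.1f} GB/s "
      f"({traffic_gbs / main_mem_gbs:.0%} of the 22.4 GB/s bus)")
# ≈ 14.2 GB/s, i.e. well over half the main memory bus for this workload

# Xenon peak SIMD throughput: 3 cores x 4-wide SIMD x 2 flops (mul+add)
# per lane per cycle x 3.2 GHz.
gflops = 3 * 4 * 2 * 3.2
print(f"Xenon peak SIMD: {gflops:.1f} GFLOPS")   # 76.8 GFLOPS
```

Even with these made-up scene numbers, the point stands: routing read-modify-write blending traffic to EDRAM frees a large fraction of the main bus for textures and CPU traffic.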
Could the Cell and the fact that the system memory was fast XDR really compensate for all those limitations? I kind of doubt it. Or was the 360 simply never fully pushed?
Because honestly, when I think about the above, the PS3 shouldn't have been able to perform so close to the 360. And initially the discrepancy in multiplatform games was quite large, with the PS3 versions often missing effects and running at worse framerates.