But the quote says it was in relation to unified memory.
"When we talked to the system team there were a lot of issues around the complexity of signal integrity and what-not. As you know, with the Xbox One X, we went with the 384[-bit interface] but at these incredible speeds - 14gbps with the GDDR6 - we've pushed as hard as we could and we felt that 320 was a good compromise in terms of achieving as high performance as we could while at the same time building the system that would actually work"
14gbps GDDR6 introduced signal integrity issues that warranted a more careful board design. PS5's GDDR6 is also 14gbps - is its narrower 256-bit bus what makes that speed manageable? And if anything above 320 bits at 14gbps is 'unstable', what do PC GPUs do to reach higher bandwidth than XBSX?
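For reference, the bandwidth numbers in play fall straight out of bus width times per-pin speed. This is just a sanity-check sketch (the helper name is mine, but the specs are the published ones):

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak GDDR bandwidth in GB/s: (bus width in bits / 8) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

# Xbox Series X: 320-bit @ 14gbps
print(peak_bandwidth_gbs(320, 14))  # 560.0 GB/s
# PS5: 256-bit @ 14gbps
print(peak_bandwidth_gbs(256, 14))  # 448.0 GB/s
```

So the wider bus is the whole difference between the two consoles' peak figures - same memory speed on both.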
It's a good question and I don't have the answer!
But ... whatever it is, it probably comes down to money. Perhaps a more expensive board with more metal layers or higher quality materials, perhaps extra components like filtering caps, perhaps more logic on the chip to distinguish interference or signal degradation, and of course PC GPUs have the nuclear option of disabling a memory controller (as the 320-bit 7900 XT does), regaining integrity and still selling the GPU at a hoofing great markup.
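The 7900 XT case shows how much bandwidth that nuclear option costs. A quick calc with the published specs (helper name is mine):

```python
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak GDDR bandwidth in GB/s: (bus width in bits / 8) * per-pin rate."""
    return bus_width_bits / 8 * gbps_per_pin

# 7900 XTX ships the full 384-bit bus; the 7900 XT disables one
# 64-bit controller, leaving 320 bits. Both run 20gbps GDDR6.
full = peak_bandwidth_gbs(384, 20)  # 960.0 GB/s
cut = peak_bandwidth_gbs(320, 20)   # 800.0 GB/s
print(full, cut)
```

That's a sixth of the bandwidth traded away, but the die still sells rather than going in the bin - a luxury a fixed-spec console doesn't have.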
We've seen consoles with disabled CUs, but we've never seen one with part of the memory bus intended to be redundant for yield issues...