Maybe IBM calculated it correctly (like the Intel burn test), the TDP is just never reached in practice, and it only draws around 75W in games. That could be why the power supply was rated well above 200W: they wanted to be safe in case some crazy dev actually got closer to the TDP in a real game. The risk always exists, since devs tend to find more and more clever ways to squeeze out performance later in a console's life. A big game coming out that breaks first-gen consoles would be a PR nightmare. No wonder the RSX was a bit gimped then.
Are we comparing apples to oranges here, average in-game power draw versus the manufacturer's TDP?