It's a noble sentiment, but one that's been lost on the howling wastes of the internet, where early adopters have raged over every pixel. It's not a problem Multerer expects to last; now that developers have had time to get used to the architecture, they'll be able to get more out of it.
"I fully expect that to happen," he says. "The GPUs are really complicated beasts this time around. In the Xbox 360 era, getting the most performance out of the GPU was all about ordering the instructions coming into your shader.
"It was all about hand-tweaking the order to get the maximum performance. In this era, that's important - but it's not nearly as important as getting all your data structures right so that you're getting maximum bandwidth usage across all the different buffers. It's relatively easy to get portions of the GPU to stall. You have to have it constantly being fed."