That's actually a notable problem, IMO. Basically, you have to spend more money and more silicon to achieve the same result. PC gaming used to mean having a powerful computer for home computing and playing games on it as well. But now that processing power has eclipsed the needs of the home computing environment, much of the push for bigger, faster, better PCs is driven by gaming alone. At that point, continuing to build on such archaic legacy ideas is... really wasteful!

Yes, there is a reason: it's the most efficient way to do it, given how cost-effective consoles have to be. On PC there are other ways of mitigating those differences, by actually programming games for the PC architecture's strengths: wide buses and high capacities. PCs will never have console-level efficiency in design; we know that.
The consoles, in reality, represent a next-generation take on the IBM PC architecture. They are flexible computers that could, with the right software, handle video editing, gaming, browsing, and Office work, but more efficiently. Without being tied to legacy software, the new microarchitectures could bring a breath of fresh air to the computing space. With the work having already been done, it would be nice to roll these designs back into the Windows space so it too could benefit from super-efficient file access and the like. What we have instead is the argument that the PC can approach every problem like the Industrial Revolution: bigger, faster, more, eating more electricity, and offering only clumsy, brute-force solutions.