First, you can certainly get some dramatic performance increases in PC land when changing CPUs, especially at the resolutions that matter for consoles. At 1280x720, even with 4x FSAA, even "last generation" top cards (R580, G71) were often CPU-bottlenecked in many games (and I'm not even taking SLI or the 8800 into account). If anything, the whole "HD era" paradigm of this generation of consoles is highly amusing to the seasoned PC gamer. I don't think I've ever gone below 1024x768 with some level of FSAA since I got my 9700 Pro.
In addition, games for consoles and PCs differ a lot in what they offer, and the CPU is often the weakest link for PC developers when targeting a certain configuration. GPU-wise, they can get away with disabling some effects and lowering texture resolution, and with letting gamers tweak IQ settings such as resolution, AA, AF... But CPU work doesn't scale that way: a certain amount of it has to be done regardless (AI, physics, general game code...). As such, PC devs tend to take a conservative approach to their CPU budget.
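Just to illustrate the asymmetry with a toy sketch (completely made-up numbers and function names, not taken from any real engine): the GPU-side cost scales with whatever the player dials down, while the simulation cost per frame stays the same no matter the settings.

```cpp
// Hypothetical illustration only: GPU cost scales with user-facing settings,
// while the per-frame simulation (AI, physics, game logic) does a fixed
// amount of work regardless. All names and cost numbers are invented.
#include <cstdio>

struct GraphicsSettings {
    int width;           // render resolution, player-adjustable
    int height;
    int msaaSamples;     // 1 = off, 2/4 = FSAA levels
    int textureQuality;  // 0 = low, 2 = high
};

// Rough, made-up GPU cost model: scales with pixel count and AA samples.
double estimateGpuCostMs(const GraphicsSettings& gs) {
    double pixels = static_cast<double>(gs.width) * gs.height;
    return pixels * gs.msaaSamples * (1.0 + 0.2 * gs.textureQuality) * 1e-6;
}

// CPU-side work is the same every frame: the AI still thinks and the physics
// still integrates, whatever the resolution or AA level.
double simulateFrameMs(int aiAgents, int rigidBodies) {
    return aiAgents * 0.02 + rigidBodies * 0.01;  // settings-independent
}

int main() {
    GraphicsSettings low  {1024, 768, 1, 0};
    GraphicsSettings high {1280, 720, 4, 2};

    std::printf("GPU cost (low settings):  %.2f ms\n", estimateGpuCostMs(low));
    std::printf("GPU cost (high settings): %.2f ms\n", estimateGpuCostMs(high));
    std::printf("CPU cost (any settings):  %.2f ms\n", simulateFrameMs(200, 500));
    return 0;
}
```

The point being: the only lever a PC developer has on that last number is to budget conservatively in the first place.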
Being fixed boxes, consoles don't have these problems. Consoles also tend to have less CPU overhead than PCs, since there are virtually no background tasks to speak of.