'The reason the consoles kept pace so well is simply resource allocation. If you open Task Manager, you'll often see hundreds of processes on your Windows (or Linux or Mac, for that matter) box. The Xbox and PS3 tend to have basically one. When these machines are running your game, it's all they're running. There's no memory manager or paging, no exception handling, no process-switching overhead. It's as if every app on your PC had 95% of the CPU, 480MB of dedicated memory (with no paging), and almost all of the GPU.
Combine that with developers coding to well-defined, static hardware, and that's why the consoles still look pretty good.
Why do these arguments always end with someone saying this?
Compared to my 2006 gaming PC (Core 2 Duo E6300 1.86GHz @ 3.2GHz, 2GB RAM, 8800 GTX 768MB), the PS3, which came out a month before I built that system, will NEVER equal it in raw power, even though my PC has to run Windows XP on top of everything.
I have 2.77GB of combined RAM and VRAM to play with, while the consoles' 512MB is only a fraction of that. This is a handicap you cannot optimize away, no matter how hard you try. Texture streaming as a way of saving RAM is a bad joke; you must really love texture and geometry pop-in to consider it a good way of alleviating the problem.
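To make the tradeoff concrete, here's a minimal sketch (all names and numbers invented, not any real engine's API) of budget-driven texture streaming: each texture has mip levels, level 0 being full resolution, and when the memory budget is tight the streamer keeps only coarse mips resident for distant objects. Detail then "pops" in as the camera approaches, which is exactly the artifact being complained about.

```python
def desired_mip(distance):
    """Pick a mip level by distance: closer objects want finer mips."""
    if distance < 10:
        return 0          # full resolution
    if distance < 50:
        return 1          # half resolution
    return 2              # quarter resolution

def mip_cost(base_mb, mip):
    """Each mip level quarters the memory cost of the one above it."""
    return base_mb / (4 ** mip)

def stream(textures, budget_mb):
    """Greedily assign mips, degrading to coarser levels when over budget.

    textures: list of (name, base_cost_mb, distance)
    Returns {name: resident_mip_level}.
    """
    resident = {}
    used = 0.0
    # Nearest textures get first pick of the budget.
    for name, base_mb, dist in sorted(textures, key=lambda t: t[2]):
        mip = desired_mip(dist)
        # Not enough memory: fall back to coarser and coarser mips.
        while used + mip_cost(base_mb, mip) > budget_mb and mip < 2:
            mip += 1
        used += mip_cost(base_mb, mip)
        resident[name] = mip
    return resident

scene = [("wall", 64, 5), ("statue", 64, 30), ("mountain", 64, 200)]
# Tight, console-like budget: the statue is forced down to mip 2
# even though it's close enough to deserve mip 1 -> visible pop-in
# the moment the budget frees up and the finer mip streams in.
print(stream(scene, budget_mb=70))
# Roomy, PC-like budget: everything gets its desired mip.
print(stream(scene, budget_mb=300))
```

The point of the sketch: the streamer can't conjure memory, it can only shuffle which detail is missing right now, and a 512MB ceiling guarantees something is always missing.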
Saying that consoles still look pretty good comes with its own bag of problems... Uncharted 2 looks great, no doubt, but it's also basically a linear tube you get funneled through. Why? Because the RAM constraints on consoles are so severe they interfere with proper level design. If everything the player can see has to look good, you have a tradeoff to make, and it's always the extra level size or "that other route" that gets the axe.
And big, open-world games like GTA IV look horrible and have depressingly aggressive LOD. Some people have demonstrated on YouTube that you can't even shoot properly with sniper rifles in GTA IV because of it.
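Why would LOD break a sniper rifle? A plausible mechanism, sketched below with invented names (this is not GTA IV's actual code): detail should track an object's *projected* size on screen, but an engine that keys LOD off raw distance alone keeps serving the simplified or culled model even when a scope magnifies the view tenfold.

```python
def lod_by_distance(distance):
    """Distance-only LOD selection: ignores zoom entirely."""
    if distance < 50:
        return "full"
    if distance < 150:
        return "medium"
    if distance < 400:
        return "billboard"
    return "culled"       # too far: not drawn at all

def lod_by_screen_size(distance, fov_zoom=1.0):
    """Screen-size-aware LOD: a 10x scope makes a target look 10x closer,
    so divide the distance by the zoom factor before selecting detail."""
    return lod_by_distance(distance / fov_zoom)

# A target 450m away, viewed through a 10x sniper scope:
print(lod_by_distance(450))                   # distance-only: culled
print(lod_by_screen_size(450, fov_zoom=10))   # zoom-aware: full detail
```

With the distance-only scheme, the zoomed-in player is aiming at an object the renderer has already dropped, which matches what those videos show.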