I'm sorry, but debating console HW is like debating wooden vs aluminum crutches. It's all crippleware, no matter how ya slice it.
That's just not something I can really agree with. Sure, the hardware stays static for multiple years, but that's the whole beauty of a console.
It's a static target to develop for that you know will be around for a long time (assuming the console does well).
So any investment you make into that console generation will last you quite a while.
Compare that to PC technology, where the hardware can change radically in the space of 2 years, and you end up with something like...
Crysis. Which I'm going to guess was coded with the assumption that computer technology would continue at the pace it was going when Far Cry was popular. Unfortunately, CPU gains since then have been mostly modest compared to the speed gains leading up to Far Cry. GPUs had a bit of a roller coaster, but they probably stayed roughly on track with past performance boosts (with the exception of R600).
And the end result is a game that was incredibly CPU-bound for a long time (and still is to an extent).
Compare that to planning a console game that is slated to be in development for 2-4 years, and it's nice not to have to take a stab in the dark as to how powerful and capable your target platform is going to be.
The flip side of that is that, well, you're stuck with the same hardware for multiple years, while PCs will continue (in general) to get more powerful and more capable. However, trying to plan a game 2-4 years out for a moving target is a huge gamble... Do you play it safe or do you try to be cutting edge?
It's no wonder game devs flock to consoles.
And as for the player experience: well, in general (not always) you can rely on a relatively smooth gaming experience with visuals at the max.
Well, considering you have no choice in the matter.
Regards,
SB