"That argument makes zero sense to you." <-- FTFY

That argument makes zero sense.
You obviously have no idea (or maybe knew once, but have completely forgotten by now?) about the process that brings us the final visuals of a game. It's not just polygons at SD vs. HD! Developers pre-bake a s**tton of visual enhancements into the scene and into the textures nowadays, a lot more than they did in 2005. They also use deferred rendering and other techniques, and 3D algorithms which simply did not exist at the time, not to mention the insane boom of post-process solutions (e.g. SSAO, FXAA, etc.) available nowadays.

People weren't exactly unfamiliar with 3D graphics by the time the 360 came out. We already had two previous generations of 3D polygonal graphics.
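To make the "post-process" part concrete, here is a minimal CPU-side sketch of the idea behind a screen-space pass like SSAO: darken a pixel based on how much nearby geometry sits in front of it. The buffer size, the four-tap kernel and the depth threshold are made up purely for illustration; a real implementation runs in a GPU shader with randomized hemisphere samples and surface normals.

```c
/* Minimal sketch of the idea behind SSAO (screen-space ambient occlusion):
 * a pixel is darkened when neighbouring depth samples are noticeably
 * closer to the camera than it is. Buffer size, kernel and threshold
 * are hypothetical, chosen only for illustration. */
#include <stdio.h>

#define W 8
#define H 8

static float occlusion(float depth[H][W], int x, int y)
{
    /* Four fixed neighbour taps; a real kernel uses many randomized
     * samples in a hemisphere oriented around the surface normal. */
    const int offs[4][2] = { {1, 0}, {-1, 0}, {0, 1}, {0, -1} };
    float occluded = 0.0f;
    int taps = 0;

    for (int i = 0; i < 4; i++) {
        int sx = x + offs[i][0], sy = y + offs[i][1];
        if (sx < 0 || sx >= W || sy < 0 || sy >= H)
            continue;
        taps++;
        /* A neighbour clearly closer to the camera occludes this pixel. */
        if (depth[y][x] - depth[sy][sx] > 0.05f)
            occluded += 1.0f;
    }
    return taps ? occluded / taps : 0.0f;
}

int main(void)
{
    /* Synthetic depth buffer: a near "step" in front of a far background. */
    float depth[H][W];
    for (int y = 0; y < H; y++)
        for (int x = 0; x < W; x++)
            depth[y][x] = (x < W / 2) ? 0.3f : 0.9f;

    /* Ambient term = 1 - occlusion; background pixels next to the step
     * get darkened, everything else stays fully lit. */
    for (int y = 0; y < H; y++) {
        for (int x = 0; x < W; x++)
            printf("%.2f ", 1.0f - occlusion(depth, x, y));
        printf("\n");
    }
    return 0;
}
```

FXAA is a pass in the same spirit: a cheap full-screen filter over the finished image, which is why these techniques only became common once GPUs had fill rate and bandwidth to spare for extra full-screen passes.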
You claim MS was building from scratch when transitioning from SD to HD. But Nintendo isn't doing the same with THEIR OWN titles? What have Nintendo themselves built up from the Wii that will allow them to seamlessly transition into HD? You think Nintendo would be licensing engines from 3rd-party developers?
Compilers, libraries and engines do indeed improve amazingly over time, and developers usually get much more out of a console as it gets older, but there are still a lot of parameters defining certain limits, limits that act as guidelines for the professionals working with the hardware. For example: the maximum speed of the rasterizer won't change; there's nothing you can do about it. The conversation in this thread is trying to figure out the limits and capabilities of the GPU in the Wii-U, not the limits of the developers.
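A quick back-of-the-envelope illustration of that kind of fixed ceiling; the ROP count, clock speed and overdraw factor below are hypothetical numbers picked for the sake of the arithmetic, not actual Wii-U specs.

```c
/* Back-of-the-envelope sketch of a fixed hardware limit (pixel fill rate)
 * versus the demand of a 720p/60 game. All figures are hypothetical. */
#include <stdio.h>

int main(void)
{
    const double rops      = 8.0;              /* pixels written per clock */
    const double clock_hz  = 550e6;            /* GPU core clock           */
    const double fill_rate = rops * clock_hz;  /* pixels per second        */

    const double width = 1280.0, height = 720.0, fps = 60.0;
    const double overdraw  = 4.0;              /* each pixel touched ~4x   */
    const double demand    = width * height * fps * overdraw;

    printf("fill rate : %.2f Gpixel/s\n", fill_rate / 1e9);
    printf("demand    : %.2f Gpixel/s\n", demand / 1e9);
    printf("headroom  : %.1fx\n", fill_rate / demand);

    /* Better compilers and smarter engines can shrink the demand side
     * (e.g. by cutting overdraw), but the fill-rate number on the supply
     * side is set in silicon and never moves -- that's the hard limit. */
    return 0;
}
```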
Rogue Squadron 2 was a launch title on the GameCube, and you can easily compare its visuals and engine speed with the very last games on the machine. Backing up your argument with examples instead of logical reasoning and facts never works, imho.

You act like the jump from N64/PSX to GC/PS2 didn't come with its own hurdles, or didn't have its own experience from the previous generation to draw upon. Batman isn't even made on final hardware.