Consoles have never been top-notch tech.
I guess it really depends how you classify top-notch tech. Do you mean the absolute best one can buy, the top 5% of PC consumers at the time of console launch, the top 10%, what? Taking the top 5-10% approach, I would say that current consoles (and maybe even the Xbox1) were top-notch tech in relation to what the top 10% of consumers had on the PC.
Take the 360. I cannot find the 2005 results from the Valve Survey, but I will use the current 2006 ones as a frame of reference (even though they don't represent the 360 launch window). This is a survey of gamers, so it will obviously lean toward that market, which lets us look at what the top 10% of gamers have.
The 360 CPU has some tradeoffs compared to what Intel/AMD offered. Valve doesn't track cache, but we do know that the very top-of-the-line PC CPUs had more cache (although it seems pretty certain that the bottom of the top 10% did not have more than 1MB) and are more efficient per clock; but Xenon has 3 cores, beefy VMX units, and cache locking with the ability for the GPU to read from the L2 cache. Even now less than 3% of gaming PCs have more than 1 CPU. You wouldn't want Xenon as your desktop CPU today, but would devs be better off in the long run with a single-core 2.2GHz Athlon64 (i.e. the bottom of the top 10% of AMD CPU users) or a 3.2GHz tri-core Xenon?
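To put rough numbers on that single-core vs. tri-core question, here is a back-of-the-envelope Amdahl's law sketch. The 70% parallel fraction is purely my assumption for illustration (real engines vary wildly), and this ignores that Xenon's in-order cores do less work per clock than an Athlon64:

```python
# Back-of-the-envelope: single-core 2.2GHz Athlon64 vs 3.2GHz tri-core
# Xenon, via Amdahl's law. The parallel fraction is an invented
# illustrative number, not a measured figure; per-clock efficiency
# differences between the two architectures are deliberately ignored.

def speedup(parallel_fraction, cores):
    """Amdahl's law: speedup over a single core of the same chip."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

athlon_ghz = 2.2          # single core
xenon_ghz = 3.2           # per core, 3 cores
p = 0.70                  # assumed parallelizable fraction of a frame

xenon_effective = xenon_ghz * speedup(p, 3)
print(f"Athlon64 effective: {athlon_ghz:.1f} 'GHz-equivalents'")
print(f"Xenon effective:    {xenon_effective:.1f} 'GHz-equivalents'")
```

Even with only 70% of the frame parallelized, the tri-core comes out well ahead on paper; the real question is how long it took engines to actually reach that parallel fraction.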
On the GPU side, looking at today, less than 5% of gaming machines have NV GeForce 7800 series (or better) or ATI Radeon X1800 series (or better) GPUs. Xenos is easily in the top 10% of 2006, let alone 2005. Looking at its featureset and other design features (like eDRAM), I don't think anyone could argue that in the 2005 launch time frame Xenos was not a "top-notch" design relative to what the market offered. Further, the majority of the market has GPUs very much below this performance envelope.
Memory is an interesting one. On the PC there is a lot of inefficient use of memory. Bloated CPU overhead and background tasks, plus a lot of games end up storing the same content twice (e.g. let's say you have 300MB of textures on your GPU; frequently those 300MB of textures are in your system memory as well). So it is difficult to make a straight-line comparison. Right now about 11% of users have 1GB or more of system memory. Interestingly, back in March more people had 128MB of system memory than had 1GB or more (now 128MB users are down to 7%, so there has been some movement on both ends). I don't have fall 2005 numbers, but it does seem the 360 was on the border of the bottom of the top 10% of gaming rigs. And that is not considering system differences like OS size and platform stability/target.
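To make the double-storage point concrete, here is a toy memory-budget calculation. All of the sizes (game data footprint, OS overhead on each platform) are invented for illustration; only the 300MB texture figure comes from the example above:

```python
# Toy illustration of why straight-line PC-vs-console memory comparisons
# are murky: a PC game often mirrors VRAM-resident resources in system
# RAM, while a console's unified pool holds one copy. Sizes are invented.

MB = 1  # work in whole megabytes

textures = 300 * MB            # resident in VRAM (example from the text)
game_code_and_data = 250 * MB  # assumed footprint of code + other assets
os_and_background = 200 * MB   # assumed PC OS + background task footprint

# PC: textures live in VRAM *and* are mirrored in system RAM
pc_system_ram_used = os_and_background + game_code_and_data + textures

# Console-style unified memory: one copy of everything, small resident OS
console_os = 32 * MB           # assumed small fixed OS reservation
console_ram_used = console_os + game_code_and_data + textures

print(f"PC system RAM touched:      {pc_system_ram_used} MB (plus {textures} MB VRAM)")
print(f"Unified console RAM needed: {console_ram_used} MB total")
```

Under these made-up numbers the same game presses much harder on a 512MB PC than on a 512MB unified console pool, which is why "1GB of PC RAM" and "512MB of console RAM" are not directly comparable.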
While there are some trade-offs, I don't think we can say that in Fall 2005 the Xbox 360 was not "top-notch" technology relative to the market at that time, if we are looking at what the top 10% of gamers had and what was available in volume.
True, in fall 2005 you could have got something with 2GB of system memory, SLI 7800s or Crossfire X1800s, and an AMD Athlon64 X2 -- if money was absolutely no object.
But if we are going to call such systems "top-notch" when they are a fraction of 1% of the market, where do we end? Do we start comparing supercomputing clusters with thousands of CPUs and servers with 16GB of memory? What qualifies as top-notch?
All I know is that the hardware in the PS3 and Xbox 360 in the 2005 timeframe was excellent compared to the consumer market -- not only what was used but also what was available.
I guess the fact that, looking at the GPUs alone in Fall 2005, someone would have to think hard about whether to swap the Xenos/RSX in a 360/PS3 for an X1800/7800 (the best on the market at the time) is a good indication they do have top-notch hardware... at least in some areas.
Anyhow, the market has changed. Consoles ARE getting better hardware relative to past generations; but likewise a much broader consumer market has opened up on the PC side, where consumers can now stick 4 GPUs in their PC. So where we draw the lines on "top-notch" is blurred in this regard, because outside of the VoodooII era, very rarely have we seen a situation where the top-end GPUs can almost immediately be multiplied in performance by a factor of 2x or 4x just by tossing more hardware in. And now this is applying to CPUs as well...