Entropy said:
The 8800, uncontested champion in performance and features for 9 months (!), has achieved a whopping 1% penetration among online gamers.
Uh, yeah. Considering that the entire population of online gamers didn't buy new hardware in the last 9 months, exactly how is that statistic relevant? No new product will displace a significant portion of the market. What percentage of new sales among online gamers do you think 8800s garnered in the last few months?
Well, considering that in its 9 months it gathered less than 1% of the Steam survey (frequent online gamers), the adoption rate is likely far lower across the larger demographic, which is dominated by games such as The Sims/Sims 2/WoW/HP/Civ and so on. MUCH lower. Additionally, even though half a million is a large sample, five thousand 8800GTS/GTX users are not. Are they even representative of consumers? How many of them are connected to the industry in one capacity or another: checking drivers and compatibility, producing reviews for on- and off-line publications, working at game companies that want to stay on top of developments, working in the channel, et cetera?
1% is probably a huge overestimation of the impact of the 8800 series. The actual penetration among regular game players must reasonably be very much lower.
And this is my point for these forums: it is OK to be a technology enthusiast, and it is OK to be in the industry and want to be among interested people. But if you get too cosy in your small group and start to believe that the values of that group are shared by other people, you are prone to make mistakes, in thought or in action. Sticky the damn market data. People can draw whatever conclusions they want from it, but at least a rudimentary reality check would be readily available.
IMO better mid-range hardware is a side effect of having a high-end segment. When engineering time and effort are spent squeezing every last drop of performance out of a process, those gains scale downward as well. If these companies didn't have to push the envelope, the result would not be better $300 cards.
This is really another issue, but I'll address it as well, even though the reply is more IMHO. When I compare the capabilities of, for instance, my old Radeon 8500 card with today's new DX10 offerings, I see almost the same pixel-pushing ability per Watt. The benefits of three generations of lithographic development went largely into expanding the feature set of the graphics ASICs. Performance advances have been bought with the coin of increased power draw and the increased parallelism allowed by lithographic advances. Who has pushed for the feature-set advances? Well, some new technologies have undeniably benefited large parts of the market. But no consumers ever stood on the barricades demanding IEEE-compliant rounding behavior, Crossfire, GPGPU control flow or a number of other largely industry-internal features. They cost transistors, and therefore power and money, and they cost engineering resources. Nvidia and ATI have had competitive reasons to push these features: against each other, to raise the barrier to entry into the market for any other interested party, and to try to open up alternative justifications for their products, asked for or not.
The consumers probably just wanted higher performance, as cheaply as possible, at low noise levels and low power draw.
There is a huge gap between these two positions that has opened up over the last five years or so. Nvidia and ATI had a duopoly, so they could get away with pushing the market in the direction they desired. Consumers had no real alternative other than integrated graphics, so they financed this development, even though they may not have been terribly interested in the benefits compared to simply getting more pixels pushed more cheaply. But it is interesting to speculate on where we would have been if the industry had spent its resources on enhancing solutions for portables rather than desktops. That would not have presented less competition or fewer challenges, only different ones from those that led us where we are today: a widely publicised high end that next to nobody is interested in actually buying, and mid-range cards with lackluster performance for their cost that are unable to show much benefit from their technological advances. As a thought experiment: if AMD had produced and sold the R580 chip on 65nm, achieving lower power draw and cost, and had offered it to compete with the 8600GT, would it serve people's needs in today's market?
Technology advances along the paths we choose. In this particular case I'd like those choices to be more consumer-oriented. After all, these are largely game-play accessories we are talking about.
PS. Oh, and $300 cards aren't mid-range by any stretch of the imagination. You make my Ivory Tower point perfectly. Good products in the $80-$180 bracket or so should be the design target.