Heh. Thought you guys would enjoy it. I was amazed at how straightforward and clear he was - very geek to geek, and with a strong sense of technology integrity shining through.
In the context of the B3D forum, for people involved in gaming (which includes all of 3D gaming), his comment that Intel is really attentive to that group must be gratifying. However, he immediately followed up with "but you can't base a 30 billion dollar company on them", which should be a warning. And the general gist of the presentation really told the story - the age of pushing performance forward at the cost of other parameters is coming to a close for general purpose computing. It's not over yet, and may never fully be, but other factors will get progressively more attention as soon as the marketeers figure out how to sell them. And they will find ways to sell those features, because apart from gradually losing its attraction value, performance has already ceased to improve at the accustomed brisk pace.
The Q&A session had a notable passage from 1:11:30 onward that made it very apparent that, in Colwell's opinion, x86 really carries a lot of baggage, and that it may not be able to compete quite as impressively going forward with clean-sheet designs. That's probably not much of an issue in PC space for compatibility reasons; for consoles, however, other rules apply. What will stagnating CPU speeds mean for the development of future PC graphics engines? And does this have any short term or long term implications for PC vs. console gaming?
He also remarked on how graphics processors are getting more programmable, and how "this hadn't gone unnoticed at Intel" - a remark that's quite intriguing, and a bit disturbing if you happen to be a graphics IHV. (And of course he let slip the amount of i-cache on a gfx-processor, unfortunately without saying which one.) What could he have meant by that remark?
Regarding power draw, in Dave Baumann's interview of Dave Orton, Orton said:
"DB: At the end of the day though, is there really the desire to continue with that – is the drive there to keep pushing that type of model?
DO: There’s always the debate of who steps down first. I think what’s going to happen is we’re going to hit a power limit, so through other innovations and technologies we have to manage efficiencies."
which implies (to me) that nVidia and ATI also feel that they are closing in on the end of an evolutionary branch. While gfx is very amenable to parallel processing, and thus can make very productive use of additional die area/transistors, power concerns won't let the current trends scale at the pace of the past. So what paths are going to be followed? Selling HDTV capabilities? Low power draw/silence? Video encoding features? Or will it be business as usual, only with more power management technology thrown at the problem?
Anyway, I felt that if someone of Bob Colwell's caliber speaks about the state of computing, in a way that was just recently backed up by Intel scrapping their entire P4 roadmap (!!!), then maybe even the graphics nerds will take notice, as his words weigh infinitely heavier than those of an anonymous "Entropy". Times they are a-changing, although to what degree remains to be seen. Maybe it would be smart to ask ourselves how this is likely to affect the graphics business?