JoshMST said:
I hate to say it, but the majority of your post kept reminding me of the classic Bill Gates line "Nobody needs more than 640K" (I'm just paraphrasing).
I don't know the exact context of that quote, but one way to read it is that if PCs had stayed limited to their basic functions, such as word processing without a user-friendly interface, we might never have needed more than 640K. So it could be categorized as a failure to predict the importance of advanced GUIs and new applications.
So what I'm saying is actually quite similar, when put in that context. The *kinds* of applications 90%+ of consumers out there use don't need more power than is available today. It's not even just the specific applications; it's a fundamental characteristic of those kinds of workloads. There is nothing a mainstream consumer is using today that is going to need a lot of CPU power in the future.
Also, notice that I say 'CPU power'. What I mean by that is that some workloads, such as video encoding and editing, might need more performance in the future. But those problems tend to map better to GPUs and exotic architectures (such as CELL and Larrabee) than to CPUs.
JoshMST said:
Considering that both ArcGIS and AutoCAD are putting out major updates every two years or so
Unless I'm missing something, those are actually massively parallel workloads, so they could eventually move to throughput-oriented processors (GPUs, CELL, Larrabee, etc.).
It's easy to find workloads that benefit from massive parallelism; it's harder to find workloads that benefit only from moderate parallelism (and still need the performance), and it can be even harder to extract that parallelism. There are very notable exceptions to this rule, of course, and with enough programming effort, miracles can be made.
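To make the distinction concrete, here's a minimal sketch (my own illustration, not from either post) of the two workload shapes. The function names and numbers are hypothetical:

```python
# Massively parallel: each output element depends only on its own input,
# so every iteration could run simultaneously (this maps well to GPUs
# and other throughput-oriented architectures).
def brighten(pixels):
    return [min(p + 10, 255) for p in pixels]

# Loop-carried dependency: each step needs the previous step's result,
# so extra cores don't help; only faster serial execution does.
def compound(principal, rate, years):
    for _ in range(years):
        principal *= 1.0 + rate
    return principal
```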
Also, it's probably worth pointing out that this is not the market I'm mostly thinking of; the vast majority of consumers don't use that kind of application. Clearly, some things are not mappable to throughput architectures, and as such there are still markets where more powerful traditional CPUs make sense. The economies of scale and potential profits diminish rapidly once the mainstream is no longer part of that market, however.
pelly said:
However, should something go wrong with the integrated DVD player you are now forced to be without the TV AND DVD player while it is being fixed....Should a new technology come out....that DVD player is basically useless.....
I don't think that really matters, because integration doesn't deliver sufficient cost savings in the mid-end and upper-end parts of the market. The intrinsic chip costs are the primary factor there, and integration is just going to reduce your yields unless you can also sell parts with redundancy (in which case it would arguably increase them!).
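To illustrate the yield point with some rough numbers, here's a back-of-the-envelope sketch of my own using the standard Poisson yield model; the defect density and die areas are made up:

```python
import math

# Poisson yield model: the fraction of defect-free dies is
# exp(-defect_density * die_area). All numbers are illustrative.
defect_density = 0.5   # defects per cm^2
cpu_area = 1.0         # cm^2
gpu_area = 1.0         # cm^2

# Separate dies: a defect only kills the small die it lands on, so the
# wafer area consumed per (good CPU + good GPU) pair is:
separate_cost = (cpu_area * math.exp(defect_density * cpu_area)
                 + gpu_area * math.exp(defect_density * gpu_area))

# Integrated die: one defect anywhere kills the whole, larger die:
integrated_cost = ((cpu_area + gpu_area)
                   * math.exp(defect_density * (cpu_area + gpu_area)))

print(f"separate:   {separate_cost:.2f} cm^2 of wafer per good system")
print(f"integrated: {integrated_cost:.2f} cm^2 of wafer per good system")
# separate:   3.30 cm^2 of wafer per good system
# integrated: 5.44 cm^2 of wafer per good system
```

Redundancy changes the picture because a defective block can be fused off and the die sold as a lower-end part, which is why the parenthetical above cuts both ways.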
So, if that kind of integration only matters in the <= $400 part of the market (that is, for the entire PC, not only the chip!), it doesn't really matter if it's fully integrated, because that's only targeting mainstream users, who won't want to swap individual parts anyway.
What I'm really predicting and arguing for is that a large part of the customers who have traditionally been buying mid-end stuff will migrate towards the low-end price points, and that new 'extremely-low-end' segments will be created at ridiculously low price points, with stunning levels of integration.
Geo said:
What I keep saying the potential threat to Nvidia will be is the possibility of the historical graphics IHV investment model reversing.
Indeed, 100% agreed... NVIDIA is not in a position where they can afford to lose volume. On the plus side, you would expect them to leverage some of their desktop GPU investments for handhelds and GPGPU, so those are also new areas where they can amortize their R&D.