Is it time for Nvidia to go to the next level?

Well, this is also an interesting idea, but I guess my question is: how does Nvidia dominate? It seems to me that a simple ARM (for example) mated with an advanced GPU is a very traditional idea. I think it is too timid a development when what is needed is an SoC, yes, but one that is a dominant solution superior in every way (fast CISC, fast GPU, parallelism, low power).
I just thought I'd still reply to that since I missed it in my earlier reply, given my sudden enthusiasm for what I had googled up mid-way through replying, heh... :)

Just including an off-the-shelf CPU on the same chip as the GPU is indeed a quite "traditional" development. Handheld SoCs have been doing it basically forever, and AMD is "about" to do it for the PC market with Fusion. Obviously, if you're aiming at the PC market, you should have an x86, unless you don't plan to make it run normal tasks (i.e. it would be "shielded" through an API)...

Anyway, my point of view on this was very different. My basic premise is this: NVIDIA doesn't have any intention to compete long-term in the high-end segments of the CPU market. They don't have the fabs or the capacity, and they don't have *enough* experienced CPU engineers to be able to beat Intel at their own game right now - and at least for the early years, it wouldn't be sufficiently profitable. At the same time, the odds of a breakthrough that revolutionizes computing are relatively slim.

There is IMO tons of value to be added in some markets by integrating everything into a single, ridiculously cheap chip, if the performance is sufficient for the target market. The point is that if you aim lower than your competitors do in terms of costs (think C61V for a recent example), then you'll have a market all to yourself, because they won't be willing to drop their prices and kill their margins.

In order not to dilute their brand, any company doing this would have to create a new name for such a product line, one that clearly implies it's a value offering. The advantages to such a strategy would be diverse:
- Don't get completely cut out of the "IGP" business.
- Huge mid-ASP business with most likely "OK" margins.
- Capability to deliver complete platforms for imaging/GPGPU.
- Keep CPU neutrality for the higher-end parts of the market, reducing risks.
- Complementary to NVIDIA's traditional GPU business, rather than destructive.

Furthermore, if the CPU performance becomes less important in the future even for high-end markets (if GPGPU takes off, for example, or just because modern CPUs become more and more overkill for traditional consumer tasks excluding gaming!) then such an investment would be even more justified. The way NVIDIA would get that x86 IP to integrate is fairly irrelevant. Personally, I think VIA makes sense, but it's hardly the only possibility...

I wouldn't exclude the possibility that NVIDIA just doesn't care and won't do anything CPU-oriented in the PC space within the next 5 years. Either way, it'll be interesting to see how things play out for them there: amazingly well, amazingly badly, or just stagnant. Hmm! :)


Uttar
 
geo - I will look at the AMD video re. stream computing.....thanks

Uttar,

In your last post you speak a lot of sense about some of the realities (obstacles) of getting into the x86 CPU space and how, lacking any technological breakthrough in architecture, it would be a dangerous bet overall.

I guess the original premise has evolved somewhat for me as some of the realities of the technologies have become a little clearer thanks to everyone's contributions.

Let me see if I can clear my head a little:
1. I feel Nvidia must articulate a clear business goal. Whether it has articulated one or not since its inception 15 years ago, it has been in the business of becoming the #1 graphics supplier. I feel Nvidia now must move on. Whether it wants to stand up next to Intel/AMD on their respective terms (e.g. as a CPU supplier) or on terms of its own making (e.g. as a GPCPU supplier) is an interesting debate. It is not clear to me that Intel/AMD would allow a GPCPU competitor to thrive.

Whether Nvidia wants to stand up to another player is also an interesting debate. I don't know who that would be, but to me Nvidia must have a goal in mind in order to meet that goal (whatever it is)... circular, I know.

2. The Engineering/Scientific/Visualisation (E/S/V) market is a clear target for Nvidia. I think one of the questions, as I mentioned earlier, is the practicality of a GPGPU without the benefit of engineering/scientific CPU horsepower. In other words, Intel/AMD working hard to close their buses to NVIDIA and other 3rd parties except in a handicapped way.

3. The Consumer Electronics (CE) market is a clear target for Nvidia. The VIA model of a low-power SoC solution is an architectural idea to address the market (per Uttar). Add in some Nvidia magic in terms of low power, software, GPU and customer-centric development, and you've got the makings of a possible home run.

4. Do items 2 and 3 add up to a $10B (revenue) company in 5 years? I haven't taken it apart fully. Here are AMD, TI and BRCM today ($8B, $14B and $4B respectively).
=================================

I think the answer comes in the form of markets and goals.

It would be interesting to see what someone thinks the market potentials are for E/S/V (item 2 above) or item 3.

It would also be interesting to understand how NVidia can dominate in E/S/V using the GPGPU without access to the necessary CPU horsepower, if it's going to be supplied by either Intel or AMD.
 
IMO, nvidia, AMD and VIA (exclude VIA if nvidia buys 'em) should unite for some time and fight against Intel; Intel's capabilities are too large, and only such a union can resist them. Afterwards, when Intel is no longer in the game, nvidia (or nvidiaVIA), VIA and AMD break up and start competing. It would be funny to ever see this happen.
 
It would also be interesting to understand how NVidia can dominate in E/S/V using the GPGPU without access to the necessary CPU horsepower, if it's going to be supplied by either Intel or AMD.
Surely they have to get a foothold in the market first...
 
I think the answer comes in the form of markets and goals.

It would be interesting to see what someone thinks the market potentials are for E/S/V (item 2 above) or item 3.

It would also be interesting to understand how NVidia can dominate in E/S/V using the GPGPU without access to the necessary CPU horsepower, if it's going to be supplied by either Intel or AMD.

You read a bit like a stock market analyst. Are you?
 
It's at times like this I'm glad I kept these slides from a Dan Vivoli investor presentation... :)

Slide20.JPG


Now, as far as I can see, that's just the total addressable market (TAM). So it's the absolute best-case scenario, and would imply near-maximal market share. I'm not sure how they're counting MCP, for example; I'd suspect they are counting the entire market there, including IGPs and the entire Intel chipset market. Here's their current revenue breakdown, for comparison's sake:

Slide7.JPG


I'd take the TAM with a grain of salt, but these slides are interesting reference points anyway, and they answer some of your questions quite nicely.

As for Consumer Electronics - I think it's a pretty good idea to differentiate handheld (power-oriented) and embedded (less power-oriented, but even more cost-oriented). An ARM is what you need for the former - as for the latter, mostly anything works, but x86 might be slightly preferable. On NVIDIA's chart, "Auto" and "Embedded Entertainment" would both fit in the embedded category, although auto is borderline handheld...

As for GPGPU, it doesn't really matter. PCI-E isn't going to die anytime soon, so this kind of thing isn't going to stop working anytime soon either: http://www.nvidia.com/docs/CP/44227/QPlex_Interface_card.pdf
If they wanted to completely replace the CPU-based systems, rather than complement them (thus getting rid of Intel/AMD CPUs, rather than 'just' greatly reducing the number of necessary CPUs), then that'd be another thing. That's not strictly necessary yet, though.
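
To make that "GPU as a coprocessor over PCI-E" model concrete, here's a minimal sketch of the offload pattern using NVIDIA's CUDA runtime (the SAXPY kernel and buffer names are purely illustrative): the CPU merely stages data across the bus and launches work, which is exactly why one host CPU can feed several GPUs.

```cuda
#include <cstdio>
#include <cuda_runtime.h>

// Trivial SAXPY kernel: each thread handles one element.
__global__ void saxpy(int n, float a, const float *x, float *y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    // Host-side buffers, filled by the CPU.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Device-side buffers; the data crosses PCI-E here and back.
    float *dx, *dy;
    cudaMalloc((void **)&dx, bytes);
    cudaMalloc((void **)&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // The CPU only orchestrates; the arithmetic happens on the GPU.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]); // expect 4.0

    cudaFree(dx); cudaFree(dy);
    free(hx); free(hy);
    return 0;
}
```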
 
IMO, nvidia, AMD and VIA (exclude VIA if nvidia buys 'em) should unite for some time and fight against Intel; Intel's capabilities are too large, and only such a union can resist them. Afterwards, when Intel is no longer in the game, nvidia (or nvidiaVIA), VIA and AMD break up and start competing. It would be funny to ever see this happen.

Two obstacles come to mind concerning this:

1) VIA and AMD both have an x86 license. AMD at least has its license after some serious litigation and cross-licensing agreements. There may be a proviso or two concerning the owner of an x86 license buying another.

2) AMD is also ATI now, so if AMD and Nvidia were to collude or join forces against Intel, especially once Intel enters the discrete graphics arena, that would be an anti-trust risk.

At the very least, either outcome would be a threat to AMD's standing in its anti-trust case against Intel.

Either AMD gets nailed on anti-trust grounds, or it could possibly allow Intel to void the agreement that gives AMD access to future extensions of the x86 ISA.

If that happens, it's game over for AMD.
 
Geo said:
You read a bit like a stock market analyst. Are you?

No, strictly a private investor. I'm an old marketing guy out of the graphics industry with a technical background but haven't paid attention to the details of the industry for quite some time.

This forum, which I only just discovered, is great. The subject of this thread has puzzled me since ATI got bought.
 
Small all-in-one, low-powered devices are, and will continue to be, a huge growth market over the next 5-10 years.

Nvidia would be wise to get in on the leading edge of this. I think the PC market is about as saturated as it will get. The desktop market will continue to shrink, being replaced by laptops. And for people who don't need a computer for more than internet browsing and email, handheld devices will replace the laptops.
 
This would indeed make a lot of sense to me. They'd get all the x86 licenses needed, as well as loads of great IP.

I could also see them coming out with a great audio solution then to compete against Creative, and subsuming VIA's Envy24 IP into what they already have with the Soundstorm legacy would even give them a nice edge.

VIA is a Taiwanese company, right? The Taiwanese government may be protective of its microprocessor industry, and might not allow any such deal to go through.

That doesn't mean I disagree that NVIDIA would want to use the GPU for all the things you listed. And you're most likely right that this is what Jen-Hsun is thinking of. There are TONS of applications where DSPs are used and in which GPUs would do amazingly well, and the ones you listed are very good examples. But the way I tend to think of GPUs in that context is that they are "parallel DSPs with FP32 units". That means if you don't need FP32, they're going to be less area/power-efficient than an optimal solution. And if your workload isn't massively parallel (even more so than standard parallel DSPs, I'd tend to believe), you're also going to have some problems.
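
As a concrete illustration of that "parallel DSPs with FP32 units" view, here's a minimal sketch of a classic DSP workload, an FIR filter, as a CUDA kernel (the TAPS value and names are just illustrative; host-side setup would look like the offload sketch earlier in the thread). Note that the accumulation runs through FP32 units whether the signal needs that precision or not, which is exactly the area/power overhead described above:

```cuda
#include <cuda_runtime.h>

#define TAPS 16 // filter length; illustrative only

// FP32 FIR filter: each thread computes one output sample, so the
// workload maps one-to-one onto thousands of GPU threads - massively
// parallel, just like the signal-processing loops DSPs are built for.
__global__ void fir(const float *in, float *out, const float *coeff, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n - TAPS)
    {
        float acc = 0.0f;
        for (int t = 0; t < TAPS; ++t)
            acc += coeff[t] * in[i + t]; // always FP32, even where 16-bit fixed point would do
        out[i] = acc;
    }
}
```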

Not to mention, Cell targets those same markets, though I believe Cell and GPUs have comparable die sizes. Something could still come out of left field though, anyone know the die size of the Stanford Stream Architecture?
 
VIA is a Taiwanese company, right? The Taiwanese government may be protective of its microprocessor industry, and might not allow any such deal to go through.
Correct, but the CPU design center is in the USA iirc, so I doubt such a scenario is very likely.
Not to mention, Cell targets those same markets, though I believe Cell and GPUs have comparable die sizes. Something could still come out of left field though, anyone know the die size of the Stanford Stream Architecture?
Yup, CELL will be some interesting competition there. It has some things going for it, but also a bunch going against it. G8x currently has roughly the same number of GFlops/mm2, but a majority of the die isn't even dedicated to the shader core. At the same time, you'd expect the ratio of SPEs to increase in the mid-term, and the clock rates to ramp up a lot in the short-term. I'd tend to believe CELL has an advantage, but we'll see.
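
For a rough sanity check on that GFlops/mm2 claim, here's some back-of-the-envelope arithmetic using the commonly cited launch-era peak figures (all approximate, so treat the results as ballpark only):

$$\text{Cell (90 nm): } 8\ \text{SPEs} \times 8\ \tfrac{\text{FP32 flops}}{\text{cycle}} \times 3.2\ \text{GHz} \approx 205\ \text{GFLOPS},\quad \frac{205\ \text{GFLOPS}}{\approx 235\ \text{mm}^2} \approx 0.87\ \tfrac{\text{GFLOPS}}{\text{mm}^2}$$

$$\text{G80 (90 nm): } 128\ \text{SPs} \times 2\ \tfrac{\text{flops (MAD)}}{\text{cycle}} \times 1.35\ \text{GHz} \approx 346\ \text{GFLOPS},\quad \frac{346\ \text{GFLOPS}}{\approx 480\ \text{mm}^2} \approx 0.72\ \tfrac{\text{GFLOPS}}{\text{mm}^2}$$

So the two really do land in the same ballpark, even counting only G80's MAD throughput, and despite much of its die going to texturing/ROPs rather than the shader core.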
 