Nvidia Tegra

Hm, I guess since it's a discrete/separate component (RSX I mean) I figured nVidia would be supplying the chip themselves. It does have "Sony Computer Entertainment Inc" on it.

I wonder where Tegra 2 really is deployed right now, in terms of things actually selling. Maybe some car entertainment systems or something...
 
Yeah, something like that should at least dual boot a standard Linux distro. I wonder if Toshiba took the easy route because nVidia has been offering Android distros for Tegra.
 
It's an IP deal, so it's pure profit and $5 is perfectly standard given that (and it probably doesn't go down much over time if at all). The manufacturing costs are also on Sony's income statement, not NVIDIA's.

In other words, about $3.4 million of the CPB revenue (or ~7.4%) comes from the PS3, all of it pure profit. And since basically every penny spent in that department goes to Tegra, essentially all of the $38.7 million loss (100%) is Tegra spending…
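To make the arithmetic explicit, here's a quick back-of-the-envelope in Python. The $5/unit royalty is the assumption discussed earlier in the thread, and the "implied" figures are just what these numbers work out to, not reported data:

```python
# Back-of-the-envelope on the figures above (illustrative only; the $5/unit
# royalty is this thread's assumption, not a confirmed number).
ps3_royalty_revenue = 3.4e6   # ~$3.4M of CPB revenue attributed to the PS3
ps3_share_of_cpb = 0.074      # ~7.4% of the segment's revenue
royalty_per_unit = 5.0        # assumed GPU IP royalty per console, in $

total_cpb_revenue = ps3_royalty_revenue / ps3_share_of_cpb   # ~$45.9M
implied_ps3_units = ps3_royalty_revenue / royalty_per_unit   # ~680k consoles

print(f"Implied total CPB revenue: ${total_cpb_revenue / 1e6:.1f}M")
print(f"Implied PS3 units covered: {implied_ps3_units / 1e6:.2f}M")
```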
 
In other words, about $3.4 million of the CPB revenue (or ~7.4%) comes from the PS3, all of it pure profit. And since basically every penny spent in that department goes to Tegra, essentially all of the $38.7 million loss (100%) is Tegra spending…

Well, at least that half of my original claim made sense ;)

I wonder if Zune HD is (still) nVidia's biggest source of Tegra sales - MS described Zune profit losses between 2007 and 2008 as "over $100 million", so clearly that market meant something; I'm just not sure how much of it the HD ever captured. Some indications suggest it was more popular than I realized, with it at least being listed near the top of some popularity lists and frequently being sold out. Not sure if being second best to Apple in this market says an awful lot, though.

To me, wins on Tegra 1 are not promising, since nVidia has long since needed to prove Tegra 2 in the market instead. I just don't see what big-name product Tegra 2 is going to land in. Unless they really do get the PSP2.
 
Is there any estimate of the market shares of these SoCs?

Or at least the profits and revenues of each SoC product line?

Or even the percentage share of a GPU line across all the ARM SoCs?
 
Not exactly along those lines, but IMG has claimed an 80% share of the mobile market for the past few years.

nVidia even tried licensing their graphics IP in the mobile market -- http://www.nvidia.com/object/IO_11430.html -- a long time ago, so they have made a serious effort at the conquest of phones. Their results are just what happens when they can't spend extra power and silicon to outweigh their disadvantage versus TBDR, the way they did in the desktop space and, for a while, even in the mobile co-processor chip space.

I can't imagine Intel putting a video decode core of their own with that kind of performance/efficiency into the CE4100. It should be VXD.
 
It's an IP deal, so it's pure profit and $5 is perfectly standard given that (and it probably doesn't go down much over time if at all). The manufacturing costs are also on Sony's income statement, not NVIDIA's.

I'd also personally second that a round $5 is perfectly reasonable for console GPU IP. I can't be sure, but I doubt AMD is getting more than that either.
 
Hm, I guess since it's a discrete/separate component (RSX I mean) I figured nVidia would be supplying the chip themselves. It does have "Sony Computer Entertainment Inc" on it.

I wonder where Tegra 2 really is deployed right now, in terms of things actually selling. Maybe some car entertainment systems or something...

Keep a lookout for Tegra SoCs in Audi's upcoming cars, I've seen some development boards for those ;)

I'd also personally second that a round $5 is perfectly reasonable for console GPU IP. I can't be sure, but I doubt AMD is getting more than that either.

The design of Xenos is more complex than RSX's, though, so maybe they're getting more. Or, given the last-minute decision to include RSX in the PS3, maybe Sony had to cough up more. We will never know.
 
Nvidia mentioned lower-than-expected RSX royalties in their financials, and ditto for ATI. Maybe it's larger for both than would be expected? If swings in console sales are worth enough to be mentioned in the financials, then it probably isn't a $20M/year sideline.
 
Their results are just what happens when they can't spend extra power and silicon to outweigh their disadvantage versus TBDR, the way they did in the desktop space and, for a while, even in the mobile co-processor chip space.
That's not quite right: the reason the GoForce business degenerated has more to do with integrating a 3D core *at all* than with the 3D core itself not being very good. Furthermore, it's not clear to me that power efficiency would be bad on these chips; the real problem would be cost, because they were using a truckload of on-chip SRAM for the framebuffer to reduce power consumption.

Here's what happened: between Q1 2003 and Q1 2004, MediaQ/NVIDIA released four chips: the MQ2100, GoForce 2150, GoForce 3000, and GoForce 4000 (all of them on 150nm, except perhaps the first on 180nm?).

Those were all mostly done by the time NVIDIA bought MediaQ, I suspect. All of those chips were quite successful, making the MediaQ acquisition a good one overall. Then NVIDIA started dictating the roadmap: they took the GoForce 4000's multimedia and added a very expensive 3D core (1280KB of framebuffer+textures SRAM!) on the same 150nm process, producing the GoForce 4500. The chip was very expensive, and besides the epic fail also known as the Gizmondo handheld console, it never shipped in anything.
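To put that 1280KB in perspective, here's a quick sketch of framebuffer sizes at typical handheld resolutions of the era; the 16bpp RGB565 format and the specific resolutions are my assumptions for illustration, not figures from the post:

```python
# Why an on-chip SRAM framebuffer is so costly: sizes in KiB for typical
# handheld resolutions, assuming 16bpp (RGB565).
def framebuffer_kib(width, height, bytes_per_pixel=2, buffers=1):
    """Framebuffer size in KiB."""
    return width * height * bytes_per_pixel * buffers / 1024

for name, (w, h) in {"QVGA": (320, 240), "VGA": (640, 480)}.items():
    print(f"{name}: {framebuffer_kib(w, h):.0f} KiB single-buffered, "
          f"{framebuffer_kib(w, h, buffers=2):.0f} KiB double-buffered")

# QVGA: 150/300 KiB, VGA: 600/1200 KiB -- so 1280KB of on-chip SRAM for
# framebuffer plus textures is plausible, but enormous for a 150nm die.
```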

Then came the GoForce 4800 in 1Q05: they removed half the TMUs, shrunk it to 130nm, and added VGA video decode/encode (MPEG-4, not yet H.264). It was still clearly a high-end solution. Then, in 1Q06, they released the GoForce 5500, with completely revamped (and frankly extremely good on paper) multimedia and a clock-bumped 3D core finally using stacked DRAM. But it was still a high-end solution with an ASP around $20 IIRC, so it didn't get much traction in a market that just didn't care much about 3D.

And finally, three long years after their last non-3D chip (the GoForce 4000), they released the GoForce 5300 on 65nm, using the same multimedia architecture as the 5500 but with on-chip eDRAM instead of stacked DRAM, no Image Signal Processor, and support limited to 3MP cameras. A good chip on paper, but it no longer made sense in a market that had suddenly started caring about 3D again (iPhone!) and was still engaged in a camera megapixel race.

With 20/20 hindsight, the same architecture, and similar investment, here's a roadmap that could have done substantially better:
- GoForce 4000 shrink on 130nm instead of GoForce 4500 on 150nm
- GoForce 4800 should have used an Imageon-like memory hierarchy (some SRAM + stacked DRAM)
- GoForce 5500 should have been released ~2 quarters later but on 65nm, no need for GoForce 5300.
- Remaining resources used for a lower-end 65nm Tegra (720p decode, VGA encode, 1xTMU, 8MP, etc...)

And regarding Tegra1, I think the main problem, besides not jumping on Android early enough, is the lack of CPU power. There were two possible solutions there: either switch to a Cortex-A8 (I always wondered whether the ARM11 MPCore license was Rayfield's first move or his predecessor's last...) or, and this would have been more interesting, use a Triple Gate Oxide process to achieve higher clock rates on the CPU (and, to a lesser extent, throughout the chip). Yes, it's not cheap, but neither is a Cortex-A8, or increasing your number of TMUs to get the same performance effect.
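As a rough sketch of that clock-versus-architecture tradeoff, using commonly quoted Dhrystone figures (~1.25 DMIPS/MHz for ARM11, ~2.0 for Cortex-A8; approximate numbers and my assumption, not anything from NVIDIA):

```python
# How much extra ARM11 clock it takes to match a Cortex-A8 on Dhrystone,
# using commonly quoted (approximate) per-MHz figures.
ARM11_DMIPS_PER_MHZ = 1.25
CORTEX_A8_DMIPS_PER_MHZ = 2.0

def arm11_mhz_to_match_a8(a8_mhz):
    """ARM11 clock (MHz) needed to match a Cortex-A8 running at a8_mhz."""
    return a8_mhz * CORTEX_A8_DMIPS_PER_MHZ / ARM11_DMIPS_PER_MHZ

for a8_mhz in (600, 800, 1000):
    print(f"Cortex-A8 @ {a8_mhz} MHz ~= ARM11 @ "
          f"{arm11_mhz_to_match_a8(a8_mhz):.0f} MHz")

# 600 -> 960, 800 -> 1280, 1000 -> 1600 MHz: hence the appeal of extra clock
# headroom (e.g. from a triple gate oxide process) as an alternative to a
# new CPU core.
```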

Anyhow, what's done is done, and now the only things that really matter are the level of success of their reasonable number of major Tegra2 design wins and, even more importantly, their execution on Tegra3. We'll see - I'm not sure I care enough to do anything but wait a year and judge how things stand then. Way too many factors to consider before that.
 
I'll believe it when I see it. GF isn't genuinely ahead of TSMC for the next one or two process nodes in any useful way; they're not providing SOI except on their high-performance CPU-centric process; and their relationship with ARM means nothing to NVIDIA, who do all that implementation work themselves. GF is a genuine alternative to TSMC in that market, but all of those stated reasons are basically bogus, which doesn't make the rest of the article sound very believable either.

The only reason I could see NVIDIA going down that route is that they should have gotten the first Tegra3 samples back from TSMC by now, and if those are massively underdelivering because TSMC screwed up with the SiON 28LP process, then I could see NV changing some plans there. But it's been very little time since NV could have known such a thing, and even less since they could have signed up with GF - what are the chances Charlie would find out about such a super-confidential deal (since he claims even TSMC doesn't know yet, which in itself seems very unlikely) so fast and leak it?

I just don't buy it. Sounds to me like an unreliable source that managed to make Charlie believe some random speculation. Yes, everyone with half a clue about what was going on knows NV did negotiate with GF at one point. And everyone also knows those negotiations ended with no deal, so those would have to be new negotiations (which it would be a bit early for, but not impossible).
 
I'm not sure what's funny - their hilarious failure to deliver on their design win claims? That's more of a general thing and not related to that article. Which is accurate btw, because Jen-Hsun is clearly talking about sampling the chip, which I assume they are still expecting to do this year. This is comparable to Tegra2 sampling in Q3 2009 and Tegra1 in Q1 2008.
 
I'm not sure what's funny - their hilarious failure to deliver on their design win claims? That's more of a general thing and not related to that article. Which is accurate btw, because Jen-Hsun is clearly talking about sampling the chip, which I assume they are still expecting to do this year. This is comparable to Tegra2 sampling in Q3 2009 and Tegra1 in Q1 2008.

I don't see anything weird in that article either. Most if not all SoC manufacturers have a rough 1 year cycle for new SoCs.

Design-win claims, however, are IMHO driven by their marketing department. Hopefully they'll eventually realize that claiming design wins before they're absolutely sure of anything can do more damage than good.
 
With this much of an investment in a mobile SoC market crowded by giant Semis, they need to be going forward with only the best IP. The time has come for nVidia to take out a license for Series 6 and the follow-on to VXE/VXD before one of their competitors drops that tech on their heads.
 
With this much of an investment in a mobile SoC market crowded by giant Semis, they need to be going forward with only the best IP. The time has come for nVidia to take out a license for Series 6 and the follow-on to VXE/VXD before one of their competitors drops that tech on their heads.

I keep repeating to myself that I should never say never again, but this is definitely one of the exceptions where it's easy to say NEVER. Imagine how ridiculous it would sound if one of the world's largest graphics IHVs were to license graphics IP from a 3rd party.

Albeit completely OT for the thread and the forum section: what NV could do in the long run is try to move closer to Intel, even if it would just mean IP licensing from NV's side.

If anyone at NV forecast in the past that the Tegra department could eventually cover any loss from the chipset market and/or the difficulty of entering the future PC/notebook SoC market, then either I don't have the full picture of future prospects, or that someone was simply smoking something extremely hallucinogenic (...where have I read that one before...) :p
 
With this much of an investment in a mobile SoC market crowded by giant Semis, they need to be going forward with only the best IP. The time has come for nVidia to take out a license for Series 6 and the follow-on to VXE/VXD before one of their competitors drops that tech on their heads.
And on top of what Ailuros said, imagine a world with only a single mobile GPU provider. Do you want Imagination to become an Intel-like monster? I don't :)
 
I'm not sure what's funny - their hilarious failure to deliver on their design win claims? That's more of a general thing and not related to that article. Which is accurate btw, because Jen-Hsun is clearly talking about sampling the chip, which I assume they are still expecting to do this year. This is comparable to Tegra2 sampling in Q3 2009 and Tegra1 in Q1 2008.

What's hilarious is that after delivering only three products running Tegra1 to market, and still none running Tegra2, they announce that Tegra3 is around the corner and Tegra4 right behind it.
Nothing especially funny, but it made me smile while reading it ;)
 