Nvidia Tegra

1) Nvidia has spent $600M on tegra (I'm frankly astounded at that figure)
In February, it was $500-700M depending on how you count. I assume it depends whether you count the PortalPlayer acquisition for example, and whether you count previous GoForce R&D (presumably if you didn't count it at all it'd be even a bit lower).

The fact of the matter is that the ARM11 MPCore used in the Tegra (and no other NV product before it) was licensed in May 2005. Also Rayfield joined NV in 2005. By that time, the GoForce 5500 was probably taped-out or nearly so; the only other two chips after that were the 5300 and 6100 - the former was just a cut-down 5500 with eDRAM on 65nm, the latter was taped-out by PortalPlayer before they bought them. So that's a ~500 people team for several years now...

2) 50 end-user products "in-flight". Media players 1st, then smartbooks, then phones.
This figure is misleading though: 35 of those are smartbooks, where NV has done a heck of a job reducing R&D expenses for ODMs through their module approach. That module they showed at Computex isn't just a devkit or a board support package; it's literally a production board that's used as-is in real products.

The total number of Tegra smartphones in development was around 12 at last count, IIRC. And that probably includes ODM devices like the ones at MWC09. At this point it seems Tegra1 design cycles took too long to become part of any OEM/ODM's "platform"; i.e. it's probably used for one or two devices, maybe three in the best case, at a given manufacturer and that's it. Then they'll switch to Tegra2, and there their hope must be to penetrate full platforms of devices rather than just flagships. Combining velocity with SW compatibility can be quite powerful.

3) One segment they have design wins for is "media pad"; when pressed he describes it as a 7-10" tablet-type device. When the interviewer makes a passing remark about an Apple tablet, there is the merest hint of a smile?
Don't count on it :) Mike's pretty damn good at making you think he answered or is hinting at something when he's really just dodging part of what you said.

Also, it's difficult for me to understand why Apple would do that when their iPhone 3G S SoC has a stronger CPU, a strong GPU that includes not just OGL ES but also DirectX9, *and* VXD (probably 370) for 40Mbps High Profile H.264 decoding. Meanwhile, Tegra1 is only capable of 20Mbps Baseline H.264 decoding, and they'd have to implement new drivers and so forth. Rather senseless, if you ask me. I'd be very confident both IMG and Infineon are still the big winners in that product. It's going to be pretty fun to see Apple surprise everyone who hasn't paid attention with 1080p playback on September 9th.

4) Somewhat disparaging of Intel's attempts to date: "dehydrating notebooks".
Can't say I disagree; he has had a tendency to underestimate Menlow a bit in our discussions, and it looks like Moorestown is on track to overdelivering on idle power compared to what both of us thought - but the fact of the matter is it's still a dehydrated PC architecture, and they simply aren't going to achieve traditional phone standby times before their single-chip 32nm SoC. And by then it might be too late in many ways. One problem is they'll have a significant active power *and* performance disadvantage; by the time that single-core in-order product is available, they'll be competing against quad-core A9s that will *still* beat them on price. Ouch! Does anyone really love x86 so much that they'd want to use it on anything other than Windows 7?

5) In answer to a question, he indicates that it is his view that Apple designed the iPhone processor and Samsung manufactures it.
'His view'?! It's pretty sad that so many years after Apple has started making its own RTL code, nearly everyone still has no clue. Oh well... ;)

6) Next year Tegras will hit x4 performance of current device, at same power (probably A9)
It's A9, but the key to understanding their performance claims is that there are at least two chips in the Tegra2 family: the handheld one and the netbook one. And then I'd guess there's also a lower-end handheld chip to be able to penetrate further down the tiers. I wonder what frequency they'll achieve; I guess that pretty much entirely depends on whether they chose 40LP or 40LPG though.

7) following iteration is a further x2.5 performance.
Yeah, that's on 28LPT and I'd guess it would be taping-out in 1Q10. I assume that's 4xA9 and higher clock speeds. I wonder what we'll have on the GPU side... Seems a bit early for a DX11 arch derivative, but who knows whether it was developed in parallel.
 
In February, it was $500-700M depending on how you count. I assume it depends whether you count the PortalPlayer acquisition for example, and whether you count previous GoForce R&D (presumably if you didn't count it at all it'd be even a bit lower).

The fact of the matter is that the ARM11 MPCore used in the Tegra (and no other NV product before it) was licensed in May 2005. Also Rayfield joined NV in 2005. By that time, the GoForce 5500 was probably taped-out or nearly so; the only other two chips after that were the 5300 and 6100 - the former was just a cut-down 5500 with eDRAM on 65nm, the latter was taped-out by PortalPlayer before they bought them. So that's a ~500 people team for several years now...

It just strikes me as an enormous monetary input to have no output until now. I'd be surprised if Intel spent that amount on all their Atom variants + chipsets to date, and of course that's been in the field for over a year now. I know they are not comparable products, but they probably are comparable in terms of their overall project size and complexity.

Also, it's difficult for me to understand why Apple would do that....
I agree; I can't see what Tegra brings to the Apple tablet that an upclocked version of the current 3GS processor doesn't do, except the fancy Tegra audio stuff, which I don't think is important. And I can see many reasons why they wouldn't want to use it. Just thought the smile was a little disconcerting (as an IMG fan).

it looks like Moorestown is on track to overdelivering on idle power compared to what both of us thought
If Intel's stated x50 reduction in idle is correct, isn't that starting to get into the same ballpark as a comparable ARM part? I guess it's usage power and physical size that are the limiting factors.

Yeah, that's on 28LPT and I'd guess it would be taping-out in 1Q10.
Obviously you mean 1Q11 here?
 
This figure is misleading though: 35 of those are smartbooks, where NV has done a heck of a job reducing R&D expenses for ODMs through their module approach. That module they showed at Computex isn't just a devkit or a board support package; it's literally a production board that's used as-is in real products.

a) If the module support helps so much, then why don't other vendors do it?

b) Can you provide more details on that module?
 
It just strikes me as an enormous monetary input to have no output until now. I'd be surprised if Intel spent that amount on all their Atom variants + chipsets to date, and of course that's been in the field for over a year now. I know they are not comparable products, but they probably are comparable in terms of their overall project size and complexity.
I'd be very surprised if Intel didn't reach $500M by now. Also, as I implied, that $600M figure is a bit on the high side, especially considering how much of the organization was still working on GoForce 5500/5300 on the *software* and support sides. Maybe $300M would be a more plausible figure for *all* the costs related to the APX 2500 & 2600.

Also, it's not true that they had no monetary reward for Tegra. A noticeable part of those 500 people worked on the 3D core, and that 3D core has a pretty much guaranteed revenue stream via Nintendo. They'd had that deal for a *long* time.

If Intel's stated x50 reduction in idle is correct, isn't that starting to get into the same ballpark as a comparable ARM part? I guess it's usage power and physical size that are the limiting factors.
Not at all. The lowest-power Menlow MID in the world, to my knowledge, uses between 3 and 4W. 1/50th of 3000mW is still 60mW full-system with the screen off and the user doing nothing whatsoever. By comparison, you could shut down Tegra and its RAM completely and keep a latest-gen 3G baseband in standby mode, and the full system would probably take less than 3mW. I'd argue that's a pretty big difference. Intel's argument will be that standby no longer matters because devices like iPhones get used so much that most people will want to recharge them every day or two anyway; this is partially true, but whether the OEMs will take them seriously with that kind of claim is very dubious.
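
To put those numbers in perspective, here's a rough back-of-the-envelope sketch of what they'd mean for standby time; the 60mW and 3mW figures are the ones above, and the ~5Wh battery capacity is just an assumed smartphone-class number for illustration:

```python
# Back-of-the-envelope standby-time comparison. The 60mW (Moorestown-class
# full system, i.e. 1/50th of a ~3W Menlow MID) and 3mW (ARM SoC + RAM off,
# 3G baseband in standby) figures come from the discussion above; the 5Wh
# battery capacity is an assumed, roughly smartphone-class value.

def standby_hours(battery_wh: float, idle_power_mw: float) -> float:
    """Hours of standby if the device only ever draws its idle power."""
    return battery_wh * 1000.0 / idle_power_mw

BATTERY_WH = 5.0  # assumed battery capacity

for label, idle_mw in [("Moorestown-class full system", 60.0),
                       ("ARM SoC off + 3G baseband standby", 3.0)]:
    hours = standby_hours(BATTERY_WH, idle_mw)
    print(f"{label}: ~{hours:.0f} h (~{hours / 24:.1f} days)")
```

Roughly three and a half days versus a couple of months of theoretical standby: that's the kind of gap I mean when I say they aren't going to achieve traditional phone standby times.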

Obviously you mention 1Q11 here ?
No no; I said 1Q10 and I mean it. The plan would be to sample in late 2010 and have smartbooks with it for the Back-to-School 2011 season. When NV says they've got velocity on their side, they aren't kidding ;) In the end though, I'd argue the most important factor remains architecture and sales strategy - but they're doing pretty good on those fronts too.
 
a) If the module support helps so much, then why don't other vendors do it?

b) Can you provide more details on that module?
http://www.crunchgear.com/wp-content/uploads/2009/06/tegra1.jpg
http://www.cdr.cz/picture/47391/large
http://www.cdr.cz/picture/47392/large
http://www.behardware.com/news/10278/computex-nvidia-relaunches-tegra.html

You've got memory, a PMU (can't make out what it is, maybe it integrates audio), an LVDS transmitter, and two other chips - I'd suspect a multi-port USB PHY (AP16 integrates a 1-port PHY and a multi-port MAC) and a WiFi chip.

I assume Freescale will eventually do this too. But Qualcomm can't do it quite in the same way for a very simple and very ironic reason: they've integrated the baseband. What NV does is they let ODMs combine their module with 3G modules from Sierra Wireless; the latter are useful because they are pre-certified, once again saving quite a bit of work (it does increase unit costs, but in moderate volumes it's probably mostly compensated by lower development costs). Qualcomm can't do that; they could make their own pre-certified modules integrating everything but I'd be surprised if they went that far.
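
Just to illustrate why the pre-certified module trade-off works at these volumes, here's a toy calculation; every figure in it (NRE, unit costs, volume) is a made-up assumption for illustration, not anything I know about actual pricing:

```python
# Toy comparison of "buy a pre-certified 3G module" vs "integrate the baseband
# yourself and certify the device". All figures are invented assumptions.

def total_cost(nre_usd: float, unit_cost_usd: float, units: int) -> float:
    # One-time engineering/certification cost plus per-unit BOM cost.
    return nre_usd + unit_cost_usd * units

UNITS = 50_000  # an assumed "moderate volume" production run

precertified = total_cost(nre_usd=200_000, unit_cost_usd=45.0, units=UNITS)
in_house = total_cost(nre_usd=2_000_000, unit_cost_usd=25.0, units=UNITS)

print(f"Pre-certified module: ${precertified:,.0f} total")
print(f"In-house integration: ${in_house:,.0f} total")
```

With those assumed numbers the module route wins at 50k units despite the $20 unit-cost penalty; crank the volume up to a few hundred thousand and the in-house route wins instead, which is exactly why this makes sense for moderate-volume smartbooks but much less for a high-volume phone platform.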
 
It just strikes me as an enormouse monetary input to have no output until now.
I don't think that this matters as much as we non-business types think it does.

Look at it from a Wall Street point of view: they expect a tech company to invest heavily in new product lines and are suspicious of companies that don't, but once products really enter the market and become successful, they don't look back at how much it initially cost to get them off the ground. If those $500M hadn't been spent, they could have increased profits in the past, but unless they were financed by debt (which they weren't), they don't impact future earnings. And that's all they care about.

It's not unreasonable (and healthy, for the general progress of technology) that companies plan their investments accordingly: it allows them to make bigger bets (in a positive sense) with, once in a while, a big payoff too. It's also more fun that way for the spectators. ;)

Look at it this way: Arun has stated a number of times that their chipset business has been net cash flow positive only in the last few years. (I assume he means: all profits ever made minus all investments ever made.) Think about the worth of the company if they had instead decided not to go into different fields and to stick only with GPUs. It's impossible to know, but it's very likely that, in the eyes of Wall Street, the company would be worth quite a bit less.
 
Making an SO-DIMM module would probably help with MID and netbook PCBs a lot, but wouldn't that be too big for a mobile phone PCB? I mean, a mobile phone will likely be a single PCB, not a PCB hosting an SO-DIMM, a baseband and other chips.

Is designing a working PCB a big enough deal to be called "NV did a lot of work for ODMs/OEMs"? I mean, for an ODM/OEM, is using this module as-is worth enough in savings to warrant not customizing their mobile phone?
 
Not at all. The lowest-power Menlow MID in the world, to my knowledge, uses between 3 and 4W. 1/50th of 3000mW is still 60mW full-system with the screen off and the user doing nothing whatsoever. By comparison, you could shut down Tegra and its RAM completely and keep a latest-gen 3G baseband in standby mode, and the full system would probably take less than 3mW. I'd argue that's a pretty big difference. Intel's argument will be that standby no longer matters because devices like iPhones get used so much that most people will want to recharge them every day or two anyway; this is partially true, but whether the OEMs will take them seriously with that kind of claim is very dubious.

Intel originally talked about getting Moorestown to a x10 improvement over Menlow's idle power; I assumed that when they increased the figure to >x10 and then to x50, it was still compared to Menlow at idle.

No no; I said 1Q10 and I mean it.

That's impressive.
 
silent_guy said:
Look at it this way: Arun has stated a number of times that their chipset business has been net cash flow positive only in the last few years. (I assume he means: all profits ever made minus all investments ever made.)
Yup, good idea to point that out again :) It's also an interesting example for another reason: it all depends on how you separate PC chipsets from the XBox1. If you assume they wouldn't have gotten either the MCP or the GPU design slot without that original MCP R&D, then that first return-on-investment would probably be the best in the company's history. Of course the further R&D that went on for several years after that had to wait until 2006/2007 to be amortized no matter how you account for things...

In the same way, how you count Tegra's costs depends on how you account for the Nintendo contract. This won't be quite as extreme though because it's an IP deal and they were probably aggressive on pricing; you could probably argue that's at least paying for the 3D core R&D though.

tangey said:
Intel originally talked about getting Moorestown to a x10 improvement over Menlow's idle power; I assumed that when they increased the figure to >x10 and then to x50, it was still compared to Menlow at idle.
I'm not sure what you mean; I was also talking about Menlow at idle, but for the *full system* because that's the only thing that really matters and Intel has also made clear 50x is for the full system. I'm sorry, I said 3-4W for the lowest-power Menlow MID; my memory failed me and the correct number is actually 2W. That's still 40mW+ at idle though, assuming Intel delivers.

rpg.314 said:
Making an SO-DIMM module would probably help with MID and netbook PCBs a lot, but wouldn't that be too big for a mobile phone PCB? I mean, a mobile phone will likely be a single PCB, not a PCB hosting an SO-DIMM, a baseband and other chips.

Is designing a working PCB a big enough deal to be called "NV did a lot of work for ODMs/OEMs"? I mean, for an ODM/OEM, is using this module as-is worth enough in savings to warrant not customizing their mobile phone?
Oh, there's a misunderstanding: this *is* only for netbook/MID-type devices. It wouldn't be the right kind of module for mobile phone PCBs, and I'm skeptical OEMs/ODMs would be interested anyway. I think they're trying to get this same module into IPTVs and car entertainment though, FWIW. Maybe one day we'll see something like this for handhelds but I wouldn't hold my breath.
 
I'm not sure what you mean; I was also talking about Menlow at idle, but for the *full system* because that's the only thing that really matters and Intel has also made clear 50x is for the full system. I'm sorry, I said 3-4W for the lowest-power Menlow MID; my memory failed me and the correct number is actually 2W. That's still 40mW+ at idle though, assuming Intel delivers.

OK, my bad interpretation; I thought Intel was talking about chip + chipset only. The 800MHz Menlow idles around 80mW, and I dunno what Poulsbo does at idle, but I was guessing that 1/50th of the total of both of those was putting it in the same ballpark as ARM solutions. Clearly that's not the basis of the comparison.
 
Apple will have already taken over much of the market Nintendo will target with their next handheld, shrinking the royalties nVidia expected.

The license fee for IP is also not too great for a single license, even a design win for a portable console.
 
Also, it's difficult for me to understand why Apple would do that when their iPhone 3G S SoC has a stronger CPU, a strong GPU that includes not just OGL ES but also DirectX9, *and* VXD (probably 370) for 40Mbps High Profile H.264 decoding. Meanwhile, Tegra1 is only capable of 20Mbps Baseline H.264 decoding, and they'd have to implement new drivers and so forth. Rather senseless, if you ask me. I'd be very confident both IMG and Infineon are still the big winners in that product. It's going to be pretty fun to see Apple surprise everyone who hasn't paid attention with 1080p playback on September 9th.

Would love to see 1080p playback. Would make more sense on a device with a larger footprint than the iPhone. But really, it needs HDMI out so you can put up that high-def content on a bigger screen.

Maybe that's what the new data center is for, HD video, because iPhone apps certainly don't require as much storage or bandwidth as HD video.

But there is no money in video for Apple yet, while it's the App Store which provides one of the competitive advantages for the iPhone/iPod Touch.

And 40 Mbps videos? Not over the Internet. I think the current 720p content on iTunes is 5-10 Mbps, if that.

What would be cheaper, put Blu-Ray drives in their products or have ongoing bandwidth costs from hosting 1080p content with bitrates comparable to Blu-Ray?
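
For a rough sense of scale, here's the arithmetic on what those bitrates mean for a two-hour movie; the per-GB delivery cost is just an assumed ballpark, not an actual CDN quote:

```python
# File size and one-time delivery cost for a 2-hour movie at various bitrates.
# The $0.10/GB CDN rate is an assumed illustrative figure.

def movie_size_gb(bitrate_mbps: float, minutes: int = 120) -> float:
    # Constant-bitrate stream: Mbit/s -> bytes -> GB over the movie length.
    return bitrate_mbps * 1e6 * minutes * 60 / 8 / 1e9

CDN_COST_PER_GB = 0.10  # assumed, USD

for label, mbps in [("~7.5 Mbps 720p (iTunes-class, per the post above)", 7.5),
                    ("10 Mbps 1080p web stream", 10.0),
                    ("40 Mbps Blu-ray-class 1080p", 40.0)]:
    size = movie_size_gb(mbps)
    print(f"{label}: ~{size:.1f} GB, ~${size * CDN_COST_PER_GB:.2f} per delivery")
```

So Blu-ray-class bitrates are roughly five times the storage and delivery cost of what iTunes ships today, which is why a 1080p download service would almost certainly run at much lower bitrates than the disc.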
 
Apple will have already taken over much of the market Nintendo will target with their next handheld, shrinking the royalties nVidia expected.

The license fee for IP is also not too great for a single license, even a design win for a portable console.

That's one damn risky prediction considering nothing is known about any next-generation handheld yet. I'd say it's up to Sony/Nintendo to market their future handhelds so as to differentiate their products from anything else; it's the software that could make that difference, more than the underlying hw, for either of them. Granted, I don't expect those to be just handheld gaming consoles in the strict sense either ;)
 
Nintendo can carve out a big enough market of gamers who will still pay $30-40 for the new Mario, Zelda, etc.

They could also have a more limited selection of dollar games for downloads.

But Sony may not be in a position to say the same, because they don't have the franchises Nintendo does, especially in the portable market.
 
Maybe that's what the new data center is for, HD video, because iPhone apps certainly don't require as much storage or bandwidth as HD video.

You don't use a large data center to serve a million copies of an HD movie. You need a lot of small ones: the CDN business model. I believe Apple already uses at least 2 CDNs for quite a bit of their current content.
 
tegra2 wishlist

- multicore ARM (at least dual, quad would be great; I'll certainly get a Tegra2 netbook then)
- hw multithreading.
- dx10 support at the minimum
- some android/(any damn linux) netbooks so that I can hack it :)

When do tegra2 devices ship btw?
 
Future Tegras will be GT300-based; I would expect Tegra2 to be so.

With GPGPU probably usable for more applications, that will be a small Cell-like asymmetric chip, in a way. Very interesting.

But I still have mixed, schizophrenic feelings about this: either forget about x86 and get a netbook mostly for getting work done, or insist on x86 to run PC games.
 
Read somewhere that Tegra is based on tech acquired when nVidia purchased Porta Player, which had made chips for iPods until Apple switched to another supplier.
 
"Porta Player" doesn't give me a relevant result in google. I never heard of them or that deal.

Tegra is said to be based on GeForce 6 (the GeForce 7600 and 7900 series were very efficient per area and per watt; GeForce 6100-based mobos are still sold en masse today); the ARM stuff is licensed (ARM's business model).

Nvidia has had experience in chipsets since the first Xbox (where they basically made their first nForce), and did a related acquisition in 2005 when buying Uli, which had some system-on-chip experience:
http://www.nvidia.com/page/uli_m6117c.html
Uli was the former Ali, known for the dreadful super socket 7 platform (I wonder if the name change had to do with that :LOL: )

By the way, that's an x86 SoC that has been sitting for years on an Nvidia page :). No idea if this could have any meaning.
 
Last edited by a moderator: