Tegra 2 Announcement

Arun

[Attachment: NV-40-T20.png - annotated Tegra 2 die shot]

(An annotated version of the die shot is attached - speculative, of course.)

  • ~49mm² die size (TSMC 40LP & 260M Transistors)
  • 2x1GHz Cortex-A9 (Top SKU) with 1MiB L2 Cache
  • 2-3x faster 3D (4xTMU and/or higher clocks?)
  • 1080p High Profile H.264 Decode (Mbps?)
  • 1080p H.264 Encode (Baseline?)
  • 12MP ISP (same arch as T1?)
  • 32-bit LPDDR2/DDR2
And in practice:
  • On a 2000mAh (3.7V -> 7400mWh) battery, 140 hours of music and 12 hours of 1080p HD playback *including* a 5" 400mW screen in the latter case.
  • 6 hours of HD web streaming via 3G (i.e. enough to make any truly-unlimited-bandwidth carrier cry). Rough math for both claims in the sketch after this post.
A few things I'd like to mention:
  • AFAIK, this is the highest-end Tegra2 chip. I'd expect a 1xA9/512KiB L2/1080p Baseline chip to be announced at Mobile World Congress.
  • I have good reason to believe the ready-to-use module for ODMs/OEMs uses the WM8320 for power management.
I'll edit this post later when I've got the time, for now feel free to just read my massive replies :p
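For those who want to sanity-check the battery and data claims above, here's a rough back-of-the-envelope sketch. The streaming bitrate is my own assumption for illustration, not an NV figure:

```python
# Rough sanity check of the battery-life and 3G-data claims (assumptions are mine, not NVIDIA's).
battery_wh = 2.0 * 3.7                  # 2000mAh at 3.7V ~= 7.4Wh of stored energy

music_hours, video_hours = 140, 12
print(f"Music playback : {battery_wh / music_hours * 1000:.0f} mW average system draw")   # ~53 mW
print(f"1080p playback : {battery_wh / video_hours * 1000:.0f} mW average, incl. ~400 mW screen")  # ~617 mW

# 6 hours of 'HD' web streaming over 3G, assuming a ~2 Mbit/s stream (my guess at 'HD' over 3G):
stream_mbps, hours = 2.0, 6
gb = stream_mbps / 8 * 3600 * hours / 1000     # Mbit/s -> MB/s -> MB total -> GB
print(f"{hours}h of {stream_mbps} Mbps streaming ~= {gb:.1f} GB of carrier data")
```

So the 1080p-playback claim implies a total system draw of roughly 600mW, i.e. around 200mW left for the SoC, memory and everything else once the 400mW screen is subtracted.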
 
Finally!
This is something I was waiting for :)
I hope that tablets and netbooks based on T2 will come sooner rather than later.
 
Why is Nvidia only marketing this for tablets (5"-15" according to the press release)? I mean, the specs are comparable to TI's 45nm OMAP4, which is expected to show up in high-end smartphones by the end of 2010 or early 2011. With e.g. a 600MHz smartphone SKU, Nvidia could have a six-month head start.
 
The new SoC purportedly allows for ten times the performance of a modern smart phone, and it draws only 500mW.
http://techreport.com/discussions.x/18254

500mW for the highest-end 1GHz dual-core Tegra2@40nm sounds ok.
 
I was expecting higher clock speeds (1.2-1.5GHz) and a bigger die (Tegra1 was 144mm²!!! What did they do now?) from this high-end, tablet- and smartbook-oriented Tegra2 chip. In the end, it is much like the smartphone-oriented OMAP4, and the CPU is probably slower than the 1.5GHz dual-core Snapdragon...
 
I was expecting higher clock speeds (1.2-1.5GHz) and a bigger die (Tegra1 was 144mm²!!! What did they do now?) from this high-end, tablet- and smartbook-oriented Tegra2 chip. In the end, it is much like the smartphone-oriented OMAP4, and the CPU is probably slower than the 1.5GHz dual-core Snapdragon...
Tegra1 was closer to 40mm²; 144mm² was the package area. As for speed, yeah, although Snapdragon2 devices won't be available before 2011 evidently...

As for phones, remember the lead times are longer. They do expect phones in H2.
 
  • AFAIK, this is the highest-end Tegra2 chip. I'd expect a 1xA9/512KiB L2/1080p Baseline chip to be announced at Mobile World Congress.

I take it this will be the chip used for the rumoured NDS2 then? Should be one major upgrade.

Also of note, Nvidia demonstrated UE3 running on Tegra2.
 
Tegra1 was closer to 40mm²; 144mm² was the package area. As for speed, yeah, although Snapdragon2 devices won't be available before 2011 evidently...
Oh, now that makes sense, and explains why Tegra 1 could be used in a small device like the Zune...

And I assumed that Snapdragon would arrive before any Cortex-A9 SoC. The single-core version is already in shipping phones, and we have had the specs for the dual-core one for quite a while. I thought it would arrive this year.

Also, Anandtech made an article about it: http://www.anandtech.com/gadgets/showdoc.aspx?i=3714

The 1080p decode works "at bitrates in the 10s of megabits per second". Hopefully this means Level 4 compliance (25Mbps), but hardly Level 4.1 (65Mbps). So you probably won't be able to watch your Blu-rays without transcoding them first. They are not marketing it, so there is little chance it will be supported...

Anyone have the Nvidia slides and/or a die shot without those ugly opaque boxes over it?
 
The UE3 demonstration for Tegra 2 is up on youtube, by the way. Certainly nice.

Yep, just watched it; this thing looks like a really nice little chip. Tim Sweeney compares it to high-end PC graphics of 3-4 years ago; surely he's being rather generous there? In terms of featureset it may be there, but not in terms of raw performance, surely?

Any idea what resolution that demo was running at? If it really was "HD" (i.e. 720p), then that'd be damn incredible.

Link here:

http://www.youtube.com/watch?v=PpGtu_ZkwqA
 

My questions:

  1. Does it have NEON?
  2. ANY details about the T2 GPU?
  3. Any tidbits about NV's arrogance that Anand mentioned in his article? :)
 
Tegra1 was closer to 40mm²; 144mm² was the package area. As for speed, yeah, although Snapdragon2 devices won't be available before 2011 evidently...

As for phones, remember the lead times are longer. They do expect phones in H2.
With the announced dual-core Tegra2 or the unannounced single-core SKU?
 
Anyone got any idea when the recorded version of the webcast will become available?
 
500mW for the highest-end 1GHz dual-core Tegra2@40nm sounds ok.
500mW is a marketing number; it doesn't mean anything whatsoever. It's not a TDP per se; there is such a number (all subsystems activated at once), but nobody really cares about it since you can just throttle down in that case, or prevent it from happening completely. The 1080p decode logic is around 100mW IIRC, which is a very nice improvement (although expected; TSMC 40LP is better than most people seem to realize).
I take it this will be the chip used for the rumoured NDS2 then? Should be one major upgrade.
I don't know, and anyone who claims to probably doesn't either. A lot less is known about the DS2 than the leaks would suggest, spec-wise. It might be a custom SoC, or it might not. It's very probably NV-based, but even that isn't certain.
And I assumed that Snapdragon would arrive before any Cortex-A9 SoC. The single-core version is already in shipping phones, and we have had the specs for the dual-core one for quite a while. I thought it would arrive this year.
Snapdragon1, yes, of course (it's been in phones since last year, after all!) - but look at when it was announced! (hint: 2006). The fact Snapdragon2 specs have been announced means absolutely nothing, and though I definitely expect the gap to be noticeably shorter, don't hope for miracles either.
The 1080p decode works "at bitrates in the 10s of megabits per second". Hopefully this means Level 4 compliance (25Mbps), but hardly Level 4.1 (65Mbps). So you probably won't be able to watch your Blu-rays without transcoding them first. They are not marketing it, so there is little chance it will be supported...
Tegra 650 supported 20Mbps Baseline H.264 - so I'd certainly expect this to mean at least 25Mbps High Profile, yes. As a matter of fact, nobody really cares about Level 4.1 compliance - NV PureVideo on the desktop was originally engineered for 40Mbps, IIRC. Most solutions are 40 or 50Mbps, although I think that might change with Blu-ray 2.0 requirements... Either way, you'd be storage-limited.

I'm much more curious about the encode side. It would be very impressive if it also supported High Profile, and it'd help explain the slightly-higher-than-I-expected die size too. I'll see if I can get more precise info in the coming weeks.
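To put the storage-limited point in numbers, here's a quick illustrative calc using the bitrates mentioned above:

```python
# Storage consumed per hour of H.264 video at various bitrates (illustrative).
for mbps in (20, 25, 40, 50):                 # Tegra 650 Baseline spec, Level 4, typical desktop targets
    gb_per_hour = mbps / 8 * 3600 / 1000      # Mbit/s -> MB/s -> MB/hour -> GB/hour
    print(f"{mbps:2d} Mbps ~= {gb_per_hour:4.1f} GB/hour")
# 25 Mbps is already ~11 GB/hour - a two-hour movie fills most of a tablet's flash
# long before the decoder's bitrate ceiling becomes the problem.
```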
With 12.1 megapixel cameras already in phones, this ISP seems lacking.
Yeah, although if I had to play Devil's Advocate I'd point out that a more expensive camera means a more expensive device for a given application processor ASP, which means less money for NV - so it certainly wouldn't make sense for them to actively encourage massive sensors. Most of the multimedia flagship devices today are still 5MP or 8MP (3MP in Apple's case!). It's also possible that the lower-end device might ironically support larger sensors; after all, who cares about even 12MP on tablets? Finally, remember that the camera sensor does not equal the ISP; the performance would have had to be improved anyway to support 1080p Encode (all the frames need to pass through the ISP).
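Quick numbers on that last point, just to show the scale (the frame-rate assumption is mine):

```python
# Pixel throughput the ISP needs for 1080p30 encode vs. a 12MP still (rough illustration).
still_mp       = 12.0                    # 12MP still capture
frame_mp       = 1920 * 1080 / 1e6       # ~2.07MP per 1080p frame
fps            = 30                      # assumed encode frame rate
video_mp_per_s = frame_mp * fps          # ~62MP/s sustained
print(f"1080p{fps} encode pushes ~{video_mp_per_s:.0f} MP/s through the ISP,")
print(f"the equivalent of ~{video_mp_per_s / still_mp:.1f} full 12MP stills every second.")
```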
Does it have NEON?
AFAIK NV hasn't even licensed it, but I could be wrong. If it does have NEON, it would probably only be on one core (i.e. heterogeneous), and I very much doubt that. As I said in the past, I genuinely believe it's a pretty dumb piece of silicon in the current market environment and NV believes the same AFAIK.
ANY details about the T2 gpu?
T1 GPU was 2xVS/2xTMU @ 120MHz, this is claimed to be 2x faster at least, sometimes up to 3x. So presumably it's 4xTMU/?xVS @ >120MHz?
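Purely speculative, but here's how the 2-3x claim could fall out of a 4xTMU config at various clocks (none of these numbers are confirmed):

```python
# Speculative fill-rate scaling vs. Tegra1's GPU (2 TMUs @ 120MHz). Clocks are guesses.
t1_fill = 2 * 120e6                        # Tegra1: texels/s
for tmus, mhz in ((4, 120), (4, 150), (4, 180)):
    t2_fill = tmus * mhz * 1e6
    print(f"{tmus} TMUs @ {mhz}MHz -> {t2_fill / t1_fill:.1f}x Tegra1 texel fill rate")
# Doubling the TMUs alone gives 2x; 150-180MHz clocks would land in the 2.5-3x range NV quotes.
```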
With the announced dual-core Tegra2 or the unannounced single-core SKU?
I actually don't know which one taped out first at this point; I thought for a long time the single-core did, in which case it would be that one. It's not impossible that one or a few OEMs will try to have models with both available, as they would certainly be very similar package/SW-wise, but I'm a bit skeptical after all the Tegra1 device delays. We'll see.
 
T1 GPU was 2xVS/2xTMU @ 120MHz, this is claimed to be 2x faster at least, sometimes up to 3x. So presumably it's 4xTMU/?xVS @ >120MHz?

Unified shaders? Is it a TBDR like SGX?

EDIT: Anand said that NV is being pretty quiet about the Tegra 2 GPU. I wonder why...
 
Anyone got any idea when the recorded version of the webcast will become available?
None, but if what you care about is a clean die shot, I just edited my first post ;)
rpg.314 said:
But a dual core 500 MHz A9 is neck and neck with a 1.6GHz Atom in web browsing. With probably lower area and power as well.
Assuming the 49mm² die size is correct (Anand isn't the best source, and it's suspiciously near 7x7, although it makes sense given the specs) then each individual Cortex-A9 core (including L1/FPU, excluding L2 I/F, PTM, NEON, etc.) takes only 1.3mm²! Let me repeat that again: 1.3mm². Including all that stuff (except NEON presumably ofc), the dual-core+L2 takes ~7.25mm² (also keep in mind the L2 is reused as a buffer for video etc.)
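To put those area numbers in perspective, a quick calc using the figures above (so it's only as good as the 49mm² rumour):

```python
# Share of the die taken by the CPU complex, using the quoted/estimated figures.
die_mm2         = 49.0     # rumoured total die size
core_mm2        = 1.3      # one A9 core incl. L1/FPU, excl. L2 I/F, PTM, NEON
cpu_complex_mm2 = 7.25     # dual-core + 1MiB L2 + glue
print(f"One A9 core    : {core_mm2 / die_mm2 * 100:.1f}% of the die")         # ~2.7%
print(f"Dual-core + L2 : {cpu_complex_mm2 / die_mm2 * 100:.0f}% of the die")  # ~15%
# Everything else - GPU, video encode/decode, ISP, memory controller, I/O - is the remaining ~85%.
```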
Unified shaders? Is it a TBDR like SGX?
Uhm, I don't know what time it is where you are, but you probably need more sleep :) I have no idea how you got that impression from what I said... It's neither unified nor a TBDR.
EDIT: Anand said that NV is being pretty quiet about the Tegra 2 GPU. I wonder why...
Because competitive analysis in the industry is awfully deficient so they figure they'll have a competitive advantage from not saying anything. Although now that Rys is at IMG/PowerVR, I might want to try and convince them that resistance is futile ;)
 
Yep, just watched it; this thing looks like a really nice little chip. Tim Sweeney compares it to high-end PC graphics of 3-4 years ago; surely he's being rather generous there? In terms of featureset it may be there, but not in terms of raw performance, surely?

My laptop has a 1.6GHz AMD dual core with an NV6 IGP. I bought it for ~$1000 in 2006, so it is comparable to a low-end to midrange laptop from 3 years ago. :smile:

Probably he was compensating for the low-res displays these things will be driving. IOW, they can deliver the same IQ at mobile/handheld resolutions as a 3-4 year old high-end PC did at then-prevalent resolutions.
 