500mW for the highest-end 1GHz dual-core Tegra2@40nm sounds ok.
500mW is a marketing number, it doesn't mean anything whatsoever. It's not a TDP per se; there is such a number (all subsystems active at once), but nobody really cares about it since you can just down-throttle in that case, or prevent it from happening entirely. 1080p decode logic is around 100mW IIRC, which is a very nice improvement (although expected, TSMC 40LP is better than most people seem to realize).
I take it this will be the chip used for the rumoured NDS2 then? Should be one major upgrade.
I don't know, and anyone who claims to know probably doesn't either. A lot less is known about the DS2 than the leaks would suggest, spec-wise. It might be a custom SoC, or it might not. It's very probably NV-based, but even that isn't certain.
And I assumed that Snapdragon would arrive before any Cortex-A9 SoC. The single-core version is already in shipping phones, and we've had the specs for the dual-core one for quite a while. I thought it would arrive this year.
Snapdragon1, yes, of course (it's been in phones since last year, after all!) - but look at when it was announced! (hint: 2006). The fact that Snapdragon2's specs have been announced means absolutely nothing, and though I definitely expect the gap to be noticeably shorter, don't hope for miracles either.
The 1080p decode works "at bitrates in the 10s of megabits per second". Hopefully this means Level 4 compliance (25Mbps for High Profile), but hardly Level 4.1 (62.5Mbps). So you probably won't be able to play your Blu-rays without transcoding them first. They're not marketing it, so there is little chance it's supported...
Tegra 650 supported 20Mbps Baseline H.264 - so I'd certainly expect this to mean at least 25Mbps High Profile, yes. As a matter of fact, nobody really cares about Level 4.1 compliance - NV PureVideo on the desktop was originally engineered for 40Mbps, iirc. Most solutions are 40 or 50Mbps, although I think that might change with Bluray 2.0 requirements... Either way, you'd be storage-limited.
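For reference, the level numbers above are easy to sanity-check: H.264 Annex A specifies a MaxBR per level, and High Profile gets a 1.25x multiplier on top of the Baseline/Main figure. A quick sketch (the table values are from the spec; the 32GB/40Mbps storage example is just illustrative of the "storage-limited" point):

```python
# H.264 Annex A MaxBR limits per level, in kbit/s (Baseline/Main VCL values)
MAX_BR_KBPS = {"3.1": 14000, "3.2": 20000, "4": 20000, "4.1": 50000}
HIGH_PROFILE_FACTOR = 1.25  # High Profile cpbBrVclFactor is 1250 vs 1000

def max_bitrate_mbps(level, high_profile=True):
    """Maximum video bitrate for an H.264 level, in Mbit/s."""
    kbps = MAX_BR_KBPS[level]
    if high_profile:
        kbps *= HIGH_PROFILE_FACTOR
    return kbps / 1000

def hours_of_video(storage_gb, bitrate_mbps):
    """Playback hours that fit on the given storage at a constant bitrate."""
    total_bits = storage_gb * 8e9  # decimal gigabytes
    return total_bits / (bitrate_mbps * 1e6) / 3600

print(max_bitrate_mbps("4"))    # 25.0 Mbit/s High Profile
print(max_bitrate_mbps("4.1"))  # 62.5 Mbit/s High Profile
# A 32GB card holds under two hours of a 40Mbps Blu-ray-class stream:
print(round(hours_of_video(32, 40), 1))  # 1.8
```

Which is the point: even if the decoder handled 40-50Mbps, a phone or tablet wouldn't carry much content at those rates anyway.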
I'm much more curious about the encode side. It would be very impressive if it also supported High Profile, and it'd help explain the slightly-higher-than-I-expected die size too. I'll see if I can get more precise info in the coming weeks.
With 12.1 megapixel cameras already in phones, this ISP seems lacking.
Yeah, although if I had to play Devil's Advocate I'd point out that a more expensive camera means a more expensive device for a given application processor ASP, which means less money for NV - so it certainly wouldn't make sense for them to actively encourage massive sensors. Most of the multimedia flagship devices today are still 5MP or 8MP (3MP in Apple's case!). It's also possible that the lower-end device might ironically support larger sensors; after all, who cares about even 12MP on tablets? Finally, remember that the camera sensor does not equal the ISP; the ISP's performance would have had to be improved anyway to support 1080p encode (all the frames need to pass through the ISP).
AFAIK NV hasn't even licensed NEON, but I could be wrong. If it does have NEON, it would probably only be on one core (i.e. heterogeneous), and I very much doubt that. As I said in the past, I genuinely believe it's a pretty dumb piece of silicon in the current market environment, and AFAIK NV believes the same.
ANY details about the T2 GPU?
The T1 GPU was 2xVS/2xTMU @ 120MHz; this one is claimed to be at least 2x faster, sometimes up to 3x. So presumably it's 4xTMU/?xVS @ >120MHz?
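Back-of-the-envelope on that guess - if peak texel fillrate scales roughly with TMU count times clock (a simplification, and all of these configurations are speculative, not confirmed specs):

```python
# Rough texel-fillrate estimate: peak Mtexels/s = TMUs * clock (MHz).
# Tegra1 figures are from the thread; the Tegra2 config is a guess.
def fillrate_mtex(tmus, mhz):
    """Peak texel fillrate in Mtexels/s (one texel per TMU per clock)."""
    return tmus * mhz

tegra1 = fillrate_mtex(2, 120)   # 240 Mtex/s

# With 4 TMUs, the clock needed to hit the claimed 3x speedup:
clock_for_3x = 3 * tegra1 / 4
print(clock_for_3x)  # 180.0 MHz
```

So a 4-TMU design only needs a modest clock bump over 120MHz to cover the whole claimed 2-3x range, which makes the guess at least plausible.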
With the announced dual-core Tegra2 or the unannounced single-core SKU?
I actually don't know which one taped out first at this point; I thought for a long time the single-core did, in which case it would be that one. It's not impossible that one or a few OEMs will try to have models with both available, as they would certainly be very similar package- and software-wise, but I'm a bit skeptical after all the Tegra1 device delays. We'll see.