NVIDIA Tegra Architecture

Arguably, the reason nVidia has to ask/help developers to make custom enhanced versions of mobile games for their SoCs is that Apple doesn't need to ask: developers already do it on their own for Apple's PowerVR GPUs.

That's a good point. I actually wasn't aware that devs were already targeting higher end iOS devices.
 
What can we do on tomorrow's SoCs that we can't do today?

Play games with better graphics? It's a chicken-or-the-egg question, no?

I'm not sure what games you play, but there are already games that are pushing the envelope (look at Gameloft). Right now everyone is targeting the lowest common denominator, which is the SGX 535, but fairly soon they will move on to the SGX543MP2 and the level of graphics and detail will go up.

1080p screens weren't introduced to sell better GPUs; most people couldn't care less about that. But now that they are out there, we will need more powerful hardware to drive those screens.

The S4 Pro at 720p is probably more than enough for the next few years, but move up to 1080p and a lot of lazy developers (again, looking at you, Gameloft) might make it struggle next year.
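
Rough numbers behind that point, nothing SoC-specific, just pixel counts (the snippet below is only an illustrative sketch):

Code:
# Plain pixel-count arithmetic: how much extra fill/shading work a 1080p panel
# implies versus the 720p-class screens most current phones ship with.
resolutions = {
    "WVGA (800x480)":    800 * 480,
    "720p (1280x720)":   1280 * 720,
    "1080p (1920x1080)": 1920 * 1080,
}

base = resolutions["720p (1280x720)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x the work of 720p)")
# 1080p pushes ~2.25x the pixels of 720p, so the same scene needs roughly that
# much more fillrate and fragment shading just to hold the same frame rate.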
 
Because watching videos and whatever else they do on their mobile device stresses current generation GPUs?

Depends on the content; if you always just stayed at, for instance, 240p or 360p, you wouldn't need an upgrade for many years to come.

I'm fine with GPU perf increasing, of course. But I totally get why Nvidia decided to favor CPU over GPU for Tegra 3. Being known as a PC GPU company doesn't mean that you shouldn't first get the performance fundamentals right. And for the vast majority of users, things are still CPU bound. This is no different than Intel with PCs: focus on CPU first, and once that levels off, start increasing the GPU.

What's so great about Atoms so far, exactly? I don't see any "Intel effect" in any of those CPUs so far, so I clearly must be missing that focus.

As for NV specifically, the above is the explanation that makes sense when watching from the outside. How many engineers did the ULP GF team have for development up to Tegra 3, for example?
 
That's a good point. I actually wasn't aware that devs were already targeting higher end iOS devices.
Well, the A5 is considered mid-range on iOS now. I think this is just where fragmentation rears its head. Fragmentation isn't really an impediment for developers to "support" Android, but when it comes to going beyond merely supporting it and releasing optimized versions for specific SoCs or devices, it's certainly easier to justify and do on iOS, where there are only a limited number of combinations and you know each represents tens of millions of devices, whereas on Android you're going to have to generalize.

Many/most games aren't pushing graphics enough to need to tweak for every device to achieve the effect they want, but for those that do, The Bard's Tale's explicit description of how they handle it is probably pretty accurate for how graphics are balanced on each iOS device (a rough sketch of that kind of tiering follows the device list below). The only exception is that most games, if they bother to support it at all, probably try to keep the iPad 1 at native resolution and just cut back effects further, rather than displaying such a low resolution on such a big screen.

https://itunes.apple.com/ca/app/the-bards-tale/id480375355?mt=8

• iPad4: 2048x1536 + enhanced rendering**
• iPad3: 1536x1152 + enhanced rendering**
• iPad2/Mini: 1024x768 + enhanced rendering**
• iPhone5/iPod Touch 5G: 1136x640 + enhanced rendering**
• iPhone4S: 960x640 + enhanced rendering**
• iPad: 512x384
• iPhone*: 480x320
• iPhone*/ iPod Touch*: 480x320

* Compatible with iOS 4.0 and higher on iPhone 3GS, iPhone 4, iPhone 4S, iPod Touch 3rd generation, iPod Touch 4th generation, iPad, and iPad 2 & 3.
** HD on iPhone 4S/5, iPad 2, Mini, 3 & 4, and iPod Touch 5G. Enhanced rendering includes: environment mapping, dynamic water surfaces, full shadow cast, and 60Hz display refresh (with the in-game “Battery Savings” option turned off).
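
For illustration only, here's a minimal sketch of that kind of per-device tiering. The resolutions and flags are lifted straight from the listing above; the lookup itself is a purely hypothetical stand-in for whatever the engine actually queries.

Code:
# Purely hypothetical sketch of per-device quality tiering, mirroring the
# listing above; settings_for() and its device names are stand-ins, not the
# game's actual code.
TIERS = {
    "iPad4":       ((2048, 1536), True),
    "iPad3":       ((1536, 1152), True),
    "iPad2":       ((1024, 768),  True),
    "iPadMini":    ((1024, 768),  True),
    "iPhone5":     ((1136, 640),  True),
    "iPodTouch5G": ((1136, 640),  True),
    "iPhone4S":    ((960, 640),   True),
    "iPad1":       ((512, 384),   False),  # iPad 1: cut resolution, no enhanced effects
}
DEFAULT = ((480, 320), False)              # older iPhones / iPod Touches

def settings_for(device: str):
    """Return ((render width, height), enhanced-rendering flag) for a device."""
    return TIERS.get(device, DEFAULT)

resolution, enhanced = settings_for("iPad3")
print(resolution, enhanced)  # (1536, 1152) True -- sub-native render target on the iPad 3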
What's so great about Atoms so far, exactly? I don't see any "Intel effect" in any of those CPUs so far, so I clearly must be missing that focus.
I think silent_guy was referring to Intel on desktop where the need for more CPU performance for most users is leveling off and now Intel is dedicating more effort and die area to the IGP. Mobile being a few years behind the desktop market, we are presumably in the CPU bound era if mobile follows the same user requirement progression as desktops did, which may or may not be true.
 
Depends on the content; if you always just stayed at, for instance, 240p or 360p, you wouldn't need an upgrade for many years to come.

I could be mistaken but I thought current gen hardware already encodes and decodes 1080p faster than real time.

@Jubei, yes it's a bit of a chicken and egg situation and I guess I'm just not seeing the demand from the consumer and developer/software side.
 
http://www.theverge.com/2013/2/27/4035286/qualcomm-claims-snapdragon-800-supremacy

Qualcomm's Senior VP of Product Management, Raj Talluri, told us today that his team is "more focused on shipping products" than refuting competitors' claims, but he still believes that the latest iteration of the Snapdragon chip "easily" beats Nvidia's best. Tegra 4 may be impressive, but the Snapdragon 800 is "so much more integrated" — by virtue of having an LTE modem built right into its silicon die — and simply more powerful in general terms.
Qualcomm's response to nVidia's performance claims. I guess it can be characterized as dismissive.
 
Qualcomm's response to nVidia's performance claims. I guess it can be characterized as dismissive.

Definitely dismissive, but perhaps a bit ignorant too. After all, the relatively new S4 Pro and the upcoming S600 do not have an integrated LTE modem (not to mention many other quality SoCs without one, such as the A6, A6X, Exynos 4, Exynos 5, etc.).
 
I think silent_guy was referring to Intel on desktop where the need for more CPU performance for most users is leveling off and now Intel is dedicating more effort and die area to the IGP. Mobile being a few years behind the desktop market, we are presumably in the CPU bound era...
Yup.
 
And how about something like H.264 HP @ L5.2? Too power hungry and too expensive today, but how about the future?
First, I don't consider video encode/decode to be part of the GPU. It was only historically so in desktop PCs, but they're really independent.

Second, I don't even know what HP@L5.2 is. Probably something with better quality, but I couldn't care less. Video on even early smartphones was already good enough. And I honestly couldn't tell you if my Netflix streams on iPhone/iPad are 720p or 1080p, nor do I care.

(I do know that I always order $3 SD instead of $4 HD on Amazon Instant Video on my TV because the minor difference in quality is not something I'm willing to pay for. It's weird how bad text on low resolution seems to bother me much more than video...)
 
First, I don't consider video encode/decode to be part of the GPU. It was only historically so in desktop PCs, but they're really independent.

You could expect that I know at least that much. Are you confident yourself that tablet resolutions will stay at the current maximum of 2560*1600, or will we see another jump in the future?

Second, I don't even know what HP@L5.2 is. Probably something with better quality, but I couldn't care less. Video on even early smartphones was already good enough. And I honestly couldn't tell you if my Netflix streams on iPhone/iPad are 720p or 1080p, nor do I care.
It's for 4K streams (3840*2160) in the further future, of course, albeit decoders for those are already available.
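
For what it's worth, a rough sketch of why HP@L5.2 is essentially shorthand for 4K60, using the macroblock limits from the H.264 level tables (the numbers are my addition, not from this thread):

Code:
# Back-of-the-envelope check of the H.264 level tables: Level 5.2 allows
# ~2,073,600 macroblocks/s, Level 5.1 ~983,040, both with a 36,864-macroblock
# frame-size cap.
MB = 16  # macroblock edge in pixels

def macroblocks(width, height):
    # H.264 pads frame dimensions up to whole macroblocks
    return ((width + MB - 1) // MB) * ((height + MB - 1) // MB)

def fits(width, height, fps, max_fs, max_mbps):
    fs = macroblocks(width, height)
    return fs <= max_fs and fs * fps <= max_mbps

L5_1 = (36_864,   983_040)   # (max frame size in MBs, max MBs/s)
L5_2 = (36_864, 2_073_600)

print(fits(3840, 2160, 30, *L5_1))  # True  -> 4K30 already fits Level 5.1
print(fits(3840, 2160, 60, *L5_1))  # False -> 4K60 exceeds Level 5.1
print(fits(3840, 2160, 60, *L5_2))  # True  -> 4K60 is what Level 5.2 buys you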

(I do know that I always order $3 SD instead of $4 HD on Amazon Instant Video on my TV because the minor difference in quality is not something I'm willing to pay for. It's weird how bad text on low resolution seems to bother me much more than video...)
On a display of up to 10" SD is more than enough; unless tablets grow in the future to, say, 20" like that weird Panasonic sample I linked to in an earlier post, the differences are going to be hard to spot. However, if you stream a video to a TV, the bigger the latter's screen, the more apparent the difference will be.
 
I could be mistaken but I thought current gen hardware already encodes and decodes 1080p faster than real time.

@Jubei, yes it's a bit of a chicken and egg situation and I guess I'm just not seeing the demand from the consumer and developer/software side.

At 1080p it would be a nice luxury to have more GPU performance: faster web page loads, nicer games... OK, we are already at a stage where we don't exactly need more performance to play current content... well.

But as smartphones move closer and closer to being mobile personal computers (I already use my Galaxy S3 like this, and it works surprisingly well), we will need/appreciate more power. Perhaps a Cortex-A15 @ 2GHz / Krait 400 @ 2.3GHz is more than adequate for almost all computing tasks; if you think back six years, that would be high-end-PC-like performance (with laptop-like GPU performance).

But mobile GPUs will need to keep growing to close the gap on PC hardware and functionality.

If you read my posts, I was basically saying that nVidia doesn't seem to be ahead of the curve on mobile SoC graphics performance and API. That's not to say they are a million miles away (they are not), but the fact that Windows RT has to be chopped down to D3D 9.1, probably to accommodate Tegra 3, doesn't look good for nVidia in my opinion.

The reason for this has also been discussed; obviously Windows driver experience and maturity was the reason. But my point about MS spans the recent three mobile OS introductions: MS has gone with outdated hardware for the puny advantage of perhaps saving a few months on launch time, at the likely cost of losing hardware excitement and desirability, not to mention imposing a lower-performance common denominator on their new software, which has been a missed opportunity for development, in my opinion.

Edit: As an answer to the inevitable retort that no other SoC manufacturer had Windows drivers (so what could MS have gone with if not Tegra 3?), I have already answered this: although the API is still the same, Tegra 3+ would have made a better choice than vanilla Tegra 3, with higher performance due to clocks and more bandwidth thanks to the introduction of DDR3L.

There are rumours of certain Windows RT games/HD movies struggling for performance in certain situations, things like 1080p videos off YouTube and Hydro Thunder. I'm not sure whether this is down to unoptimised software or the hardware (perhaps both), but I feel Tegra 3+ would have been a better choice, considering it turned up inside the Asus Transformer Infinity at around the same time.
 
If you read my posts, I was basically saying that nVidia doesn't seem to be ahead of the curve on mobile SoC graphics performance and API. That's not to say they are a million miles away (they are not), but the fact that Windows RT has to be chopped down to D3D 9.1, probably to accommodate Tegra 3, doesn't look good for nVidia in my opinion.

The reason for this has also been discussed; obviously Windows driver experience and maturity was the reason. But my point about MS spans the recent three mobile OS introductions: MS has gone with outdated hardware for the puny advantage of perhaps saving a few months on launch time, at the likely cost of losing hardware excitement and desirability, not to mention imposing a lower-performance common denominator on their new software, which has been a missed opportunity for development, in my opinion.
Speculation on my part is that Microsoft shooting for lower-end hardware requirements is an accommodation for their OS fee. With the Android OS being "free", although perhaps there is some cost for Google services, Microsoft may have had to cut back on the hardware requirements to allow device makers to use cheaper parts, to give Microsoft their OS cut while yielding the same end-device price for Windows/Windows Phone devices as Android devices. nVidia was presumably offering Tegra 3 cheaply in order to gain market share and was a good option for Windows RT.
 
Speculation on my part is that Microsoft shooting for lower-end hardware requirements is an accommodation for their OS fee. With the Android OS being "free", although perhaps there is some cost for Google services, Microsoft may have had to cut back on the hardware requirements to allow device makers to use cheaper parts, to give Microsoft their OS cut while yielding the same end-device price for Windows/Windows Phone devices as Android devices. nVidia was presumably offering Tegra 3 cheaply in order to gain market share and was a good option for Windows RT.

Yes, an astute observation. Perhaps nVidia had slower orders and excess inventory of Tegra 3; perhaps Tegra 3+ was just higher-binned Tegra 3 and therefore wouldn't meet demand for Surface RT orders?

With other manufacturers unable to offer up reliable drivers and nVidia likely offering up the scenario you suggest, perhaps it just "made sense".
 
You could expect that I know at least that much. Are you confident yourself that tablet resolutions will stay at the current maximum of 2560*1600, or will we see another jump in the future?
Absolutely not.

But I'm saying that we're a long time (many years) away from a streaming video source getting anywhere close to that resolution. So I'm wondering why you brought up video in the first place?

However, if you stream a video to a TV, the bigger the latter's screen, the more apparent the difference will be.
I believe Tegra 3 demoed 2560x1440 video decode at its introduction? Don't know about Tegra 4, but it's probably already 4K capable.
Do you expect to stream 2560x1440 or 4K from your phone to a TV anytime soon?

I don't, but even if you do, I think those cases are already covered by current hardware, even if not via the latest annex of the spec. Which shouldn't be a problem, since Netflix already encodes in 120 different streaming formats anyway to ensure device compatibility.
 
With other manufacturers unable to offer up reliable drivers and nVidia likely offering up the scenario you suggest, perhaps it just "made sense".
You keep saying that, but there have been Windows RT tablets running on Snapdragon chips almost from the beginning. Just look up the Dell XPS 10 and the Samsung ATIV Tab (they get phenomenal battery life BTW, performance is only marginally better than Tegra 3).
 
You keep saying that, but there have been Windows RT tablets running on Snapdragon chips almost from the beginning. Just look up the Dell XPS 10 and the Samsung ATIV Tab (they get phenomenal battery life BTW, performance is only marginally better than Tegra 3).

Oh really? I've not heard that till now, how bizarre. Sweet, I'll have a look :) Still, it doesn't change the fact that Tegra 3 is the lowest common denominator when the S4 Pro and Tegra 3+ were available.

Edit.
https://www.youtube.com/watch?v=1e8RBEJBeaA&feature=youtube_gdata_player

Here we go... only dual-core Krait though. Interesting how much power Tegra sucks; a bit unfair though, as Tegra 3 was designed to use the shadow core and heterogeneous multi-processing isn't utilised in Windows RT yet.
 
Absolutely not.

But I'm saying that we're a long time (many years) away from a streaming video source getting anywhere close to that resolution. So I'm wondering why you brought up video in the first place?

Because hw is already moving in that direction, and because I mentioned video in my initial reply:

Just because you fail to see the point doesn't mean that some folks aren't killing time on the move watching videos, playing games or whatever else with their mobile device.

I believe Tegra 3 demoed 2560x1440 video decode at its introduction? Don't know about Tegra 4, but it's probably already 4K capable.
Do you expect to stream 2560x1440 or 4K from your phone to a TV anytime soon?
I don't yet have a 4K-capable capture device, nor a 4K TV either. However, I capture quite a few short videos of my little one's nonsense, store the funniest ones on my mobile, and connect it from time to time to TVs when I'm at friends' places so we can have a laugh or two.

Are we debating the typical chicken/egg dilemma here? It's not that you or the majority don't know how the story goes with hw capabilities; the question was what we really need higher-end hw for. Requirements won't remain idle, nor will technology stop advancing, and video is part of that answer.

I don't, but even if you do, I think those cases are already covered by current hardware, even if not via the latest annex of the spec. Which shouldn't be a problem, since Netflix already encodes in 120 different streaming formats anyway to ensure device compatibility.
If I were judging from my own use cases alone, the answers would be diametrically different. But if I had to project material, e.g. during a conference, onto a >=100 sqm video wall and had the choice between 1080p and 4K, what would be the more obvious choice?

Wasn't it just a couple of posts ago that you stated that high ppi is a eulogy for text displays? How non-annoying would text be on a big-ass video wall for those folks sitting in the front rows?
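
Just to put rough numbers on the video-wall example (my own assumptions, not from the thread: a 16:9 wall of about 100 sqm and a front-row viewer roughly 3 m away):

Code:
import math

# Assumed geometry: 16:9 video wall of ~100 m^2, front-row viewer ~3 m away,
# eye resolving ~1 arcminute of detail.
area_m2 = 100.0
width_m = math.sqrt(area_m2 * 16 / 9)          # ~13.3 m wide
viewing_distance_m = 3.0

# Smallest detail a typical eye resolves at that distance (1 arcminute)
acuity_mm = math.tan(math.radians(1 / 60)) * viewing_distance_m * 1000  # ~0.9 mm

for name, horizontal_px in [("1080p", 1920), ("4K", 3840)]:
    pitch_mm = width_m * 1000 / horizontal_px  # size of one pixel on the wall
    print(f"{name}: pixel pitch ~{pitch_mm:.1f} mm vs ~{acuity_mm:.1f} mm acuity limit")
# Even 4K pixels (~3.5 mm) are far coarser than what the front rows can resolve,
# so doubling the source resolution translates directly into sharper text there.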
 
If Tegra 3 is already 4K capable, why then is Qualcomm making such a big fuss over their Snapdragon 800 being capable of 4K decoding and encoding? At both CES and MWC they had a stand with a big 4K display being driven by a prototype Snapdragon 800 device.
 