NVIDIA Tegra Architecture

Yes, Tegra 4 is certainly one to look forward to. That said, the next GeForce ULP will have plenty of competition from the Mali T6xx series, Rogue Series 6, Adreno 3xx, etc. It will be very interesting to see things unfold next year.

For now, Tegra 3 seems to be doing well, and tablets built with it have gotten more and more polished and refined over the last few months. The Nexus 7 tablet has more than 8 hours of battery life, and the user interface looks buttery smooth. The upcoming Surface RT tablet will have more than 10 hours of battery life, and again the user interface looks buttery smooth.
 
What's with the anti-vent movement? I can understand fans, but if it's a nice grille that's hidden by the design, I'm all for it. The new iPad can get very hot to use. My phone also heats up. I wouldn't mind a vent.

I agree that if a phone/tablet gets hot to the touch, then a vent would be welcome. The only problem is that, depending on how one holds the phone/tablet, there may be some warm air blowing temporarily on one's hand :)

If I recall correctly, during the Surface RT unveil, Microsoft indicated that their tablet will be cool to the touch. So a tablet that is thin, has no internal fan (to be as quiet as possible), has no vents (to prevent warm air from blowing against anything), has good battery life (to last the whole day without needing to be recharged), has good performance (for a smooth user interface), and is cool to the touch is really the ideal tablet. I feel that Microsoft did a really nice job with the Surface RT, but they really need to price it as close to the iPad as possible so that price doesn't sway people into overlooking it.
 
In fact, if I were pricing the Surface RT tablet, I would price it no higher than $499, and then offer the Touch or Type Cover keyboard as a $99 accessory.
 
I'd hope that the successor to the Tegra 3 will offer a bit more graphical grunt than its predecessor.

I know that NVidia's plans for the Tegra series are to get them out onto the market quickly, so T2 was the first dual-core A9, and T3 was the first quad-core A9, but they are rather lacking in GPU power.

As an example of this, I've bought a cheap, yet well-made, Chinese tablet which features the budget Rockchip RK3066 chip and this kicks the arse of the T3 in gaming benchmarks. Perhaps not too surprising considering the RK3066 contains a quad-core Mali and a 1.6GHz dual-core A9.

I doubt that Tegra 3+ will offer too much of an improvement to the GPU so I wonder how much more we will see from Tegra 4?
 
You have to say it has worked for them... forcing their way into a crowded market... but yes, the GPU, an area where they would be expected to excel, has been weak since Tegra 1.

Tegra 4 on 28nm, I expect, is going to be a massive jump in all areas... it's the chip I'm going to want in my next phone... as they will likely follow the same idea of using one SoC for both tablets and phones... the GPU is likely to be OpenGL ES 3.0 (Halti) / DX 11.1 and OpenCL compliant... that means unified shaders, a tessellator and the lot.

I think they just got caught with their pants down on the early models... I think if it wasn't for their world-class drivers and them paying devs for special features it would be much worse, imo.
 
I know that NVidia's plans for the Tegra series are to get them out onto the market quickly, so T2 was the first dual-core A9, and T3 was the first quad-core A9, but they are rather lacking in GPU power.

True, but I think their "tight" relationships with game developers for exclusive features and optimized performance is proving to be a lot more rewarding than having a larger chip with a beefier GPU.
 
True, but I think their "tight" relationships with game developers for exclusive features and optimized performance is proving to be a lot more rewarding than having a larger chip with a beefier GPU.

That's a good way to look at it... they spend some of their GPU silicon budget on software optimization... clever.
 
True, but I think their "tight" relationships with game developers for exclusive features and optimized performance is proving to be a lot more rewarding than having a larger chip with a beefier GPU.

Developers will go for volume primarily above all, which is pretty much self-explanatory as to why.
 
I think they just got caught with their pants down on the early models... I think if it wasn't for their world-class drivers and them paying devs for special features it would be much worse, imo.

Grrrr. Don't talk to me about NVidia's "world class drivers". NVidia still haven't released ICS libs for the version of Tegra 2 used in my Motorola Atrix, so I can't get a really good and full-featured ICS just yet, even with several talented enthusiast devs working on the device.

I'm kind of with Torvalds on this issue. :devilish:
 
True, but I think their "tight" relationships with game developers for exclusive features and optimized performance is proving to be a lot more rewarding than having a larger chip with a beefier GPU.

Apparently it is possible to use various hacks and additional drivers to enable the extra Tegra 3 features on non-NVidia devices, though games are now being patched to stop this working...

It's pretty much an exact analogue of the situation in the PC games market!
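
For anyone wondering how that gating usually works: Tegra-enhanced titles typically decide whether to enable the extra effects based on the GPU identity the driver reports, which is exactly what the hacks above spoof. The Python sketch below is purely illustrative of that idea, assuming a simple renderer-string check; it is not code from any actual game or from NVidia.

```python
# Illustrative sketch of GPU-based feature gating (NOT real game code).
# Assumption: a game keys its "Tegra-only" effect path off the renderer
# string reported by the GL driver; spoofing that string is what the
# hacks mentioned above rely on, and patched titles add stricter checks.

def enable_enhanced_effects(gl_renderer: str) -> bool:
    """Return True if the Tegra-exclusive effect path should be used."""
    return "tegra" in gl_renderer.lower()

if __name__ == "__main__":
    print(enable_enhanced_effects("NVIDIA Tegra 3"))   # True
    print(enable_enhanced_effects("Mali-400 MP"))      # False
    # A spoofed renderer string slips past the naive check above, which
    # is why games are reportedly being patched with extra checks.
    print(enable_enhanced_effects("Mali-400 MP [Tegra spoof]"))  # True
```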
 
As an example of this, I've bought a cheap, yet well-made, Chinese tablet which features the budget Rockchip RK3066 chip and this kicks the arse of the T3 in gaming benchmarks. Perhaps not too surprising considering the RK3066 contains a quad-core Mali and a 1.6GHz dual-core A9.

The RK3066 uses what is essentially the same GPU as the Samsung Galaxy S III international version (although I'm not entirely sure about the GPU clock speed). So the GPU in Tegra 3+ should perform similarly to the RK3066's, and of course the quad-core CPU in Tegra 3+ should perform much better, all with better battery life too. So the RK3066 tablet is a good value, but so are the upcoming Tegra 3 (and Tegra 3+) low-cost tablets. Basically, the SoC itself only makes up a relatively small fraction of the overall cost of the tablet; the display is usually the most expensive part.

I'd hope that the successor to the Tegra 3 will offer a bit more graphical grunt than its predecessor.

I know that NVidia's plans for the Tegra series are to get them out onto the market quickly, so T2 was the first dual-core A9, and T3 was the first quad-core A9, but they are rather lacking in GPU power.

I know what you mean about wanting more GPU performance from Tegra. The thing is, NVIDIA has had to make some tradeoffs to build up momentum in the mobile space. In the smartphone/tablet market, they do not have the luxury at the moment to release high-end, mid-range, and low-end SKUs. They have had to focus primarily on one SKU, and create great designs based on that. With high-end video cards, the situation is different. NVIDIA can release GTX 690, GTX 680, GTX 670, GTX 660, etc. to fill a wide variety of price points.

So in light of that, Tegra 3 is a remarkable success given the limitations that NVIDIA had. If NVIDIA had the luxury of designing a mobile SKU with a much larger GPU die size (such as that used on A5 or A5X), then they would be competing for the performance crown in the mobile space. Alas, NVIDIA had to make a more modest choice of GPU to give themselves a better chance of achieving market share.

For NVIDIA to keep up with the likes of PowerVR and others in the future, they will need to expand the Tegra platform to target different price points. Wayne and Grey are certainly steps in the right direction towards achieving that goal.

Another thing I should point out is that people who use smartphones and tablet computers typically only do casual gaming on them. Most of the time, people use them for internet, texting, video, movies, music, GPS, social networking, etc. So unless one is running at a super high screen resolution, the Tegra 3 GPU is often more than adequate. And particularly with games optimized for the Tegra platform, the mobile gaming experience on Tegra 3 is arguably second to none. Last but not least, the quad-core CPU is advantageous with internet, gaming, and some other tasks, while the fifth CPU core provides some nice benefits in terms of longer battery life.
 
Developers will go for volume primarily above all, which is pretty much self-explanatory as to why.

I don't know if developers go for volume above all.
For the small teams from small companies that we see in many Android games, I think the free developer work-hours that nVidia provides may matter more than raw volume.

For example, I'm pretty sure there are more PowerVR GPUs in the mobile market than nVidia GPUs, and maybe even more Adreno and Mali than nVidia too (because of the Galaxy S line), yet you still see many games with "nVidia-only features" and "exclusive optimizations".

Head on to Google Play's best-selling titles and you'll see lots of games with that: SiegeCraft, Dead Trigger, Shadowgun, Riptide GP and others.
 
I get the feeling, especially from looking at the GPU clock speeds nVidia is already pushing, that existing Tegra designs might hit some power and heat obstacles if their silicon were scaled up to compete with A5/A5X performance.
 
The RK3066 uses what is essentially the same GPU as the Samsung Galaxy S III international version (although I'm not entirely sure about the GPU clock speed). So the GPU in Tegra 3+ should perform similarly to the RK3066's, and of course the quad-core CPU in Tegra 3+ should perform much better, all with better battery life too. So the RK3066 tablet is a good value, but so are the upcoming Tegra 3 (and Tegra 3+) low-cost tablets. Basically, the SoC itself only makes up a relatively small fraction of the overall cost of the tablet; the display is usually the most expensive part.

I'm not sure what benchmarks you are running when you say the Tegra 3 has similar GPU performance to the RK3066. In GLBenchmark, the 720p offscreen figures show the Rockchip providing around 50% more performance than the T3. I believe this is the best benchmark for measuring GPU power alone?

Incidentally, I've seen reports say that the quad-core Mali in the RK3066 clocks at around 266MHz.

Battery life of my M11 is pretty decent - around 8 hours in normal usage (though less if gaming, as with all devices) and I'm very impressed with the tablet considering it only cost me about 150 quid and has a screen almost as good as that of the iPad 2!

I can certainly see why NVidia have gone the way they have with their Tegra chips, pushing for greater CPU power, but it still seems odd that a company which has made its name in the GPU market has provided relatively low-performing graphics capabilities in their mobile stuff. Fingers-crossed, we'll see a lot more from T4 - something with the grunt to compete with PowerVR in the Apple devices would be nice!
 
I get the feeling, especially from looking at the GPU clock speeds nVidia is already pushing, that existing Tegra designs might hit some power and heat obstacles if their silicon were scaled up to compete with A5/A5X performance.

When working with a much larger GPU die size, one can increase execution units in order to achieve higher performance. So in such a scenario, higher GPU clock speed would not be required.

Just to give you an example, when going from the iPad 2 to the iPad 3, Apple was able to double GPU execution units (and double GPU performance, without needing to increase GPU clock frequencies) when increasing SoC die size from ~122 mm^2 to ~163 mm^2. The Tegra 3 SoC has a die size of ~80 mm^2. So what do you think NVIDIA could do if they doubled their SoC die size to ~160 mm^2 and dedicated all of that extra die area to the GPU? Of course, NVIDIA was not in a position to do so as explained above, but the difference in die size between Tegra 3 and A5/A5X is huge.
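
To put some rough numbers on that, here is a small back-of-the-envelope sketch in Python. The ~122, ~163 and ~80 mm^2 figures are the approximate die sizes mentioned above; the doubled-die scenario is just the thought experiment from this post, not an actual NVIDIA design.

```python
# Back-of-the-envelope die-size comparison (figures are approximate,
# taken from the discussion above; the doubled die is hypothetical).

A5_DIE_MM2 = 122.0      # ~ A5 SoC (iPad 2)
A5X_DIE_MM2 = 163.0     # ~ A5X SoC (iPad 3)
TEGRA3_DIE_MM2 = 80.0   # ~ Tegra 3 SoC

# Apple's A5 -> A5X jump: the extra area went largely into doubling
# the GPU execution units at the same clock frequency.
apple_extra = A5X_DIE_MM2 - A5_DIE_MM2
print(f"A5 -> A5X added roughly {apple_extra:.0f} mm^2 of die area")

# Thought experiment: double the Tegra 3 die and spend all of the
# increase on the GPU.
hypothetical_die = 2 * TEGRA3_DIE_MM2
extra_gpu_area = hypothetical_die - TEGRA3_DIE_MM2
print(f"A doubled Tegra 3 (~{hypothetical_die:.0f} mm^2) would free "
      f"~{extra_gpu_area:.0f} mm^2 for extra GPU units")

# Even then it would only be about the size of the A5X.
print(f"A5X is ~{A5X_DIE_MM2 / TEGRA3_DIE_MM2:.1f}x the Tegra 3 die size")
```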
 
I'm not sure what benchmarks you are running when you say the Tegra 3 has similar GPU performance to the RK3066.

Not Tegra 3, but Tegra 3+ (T37, which is a die-shrunk version of Tegra 3 fabricated on a 28nm process) that is supposed to have at least 20-30% higher CPU/GPU clocks compared to T30L.

The Samsung Galaxy S III international phone uses the same GPU that your tablet is using (although GPU clock speed on the S III is not specified). Take a look at a performance preview here: http://www.anandtech.com/show/5810/samsung-galaxy-s-iii-performance-preview

The HTC One X international phone uses Tegra 3. So if one adds 20-30% to approximate the performance of Tegra 3+, then one can see that Tegra 3+ would compare reasonably well with Mali-400/MP4. In most tests, the differences in GPU performance would be less than 10%.
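
As a rough sketch of that 20-30% extrapolation, here is a short Python snippet. The frame-rate values are hypothetical placeholders rather than numbers from the AnandTech preview, and linear scaling of performance with clock speed is an assumption.

```python
# Extrapolating a Tegra 3+ estimate from a measured Tegra 3 score.
# The fps values are HYPOTHETICAL placeholders -- substitute real
# GLBenchmark 720p offscreen results. Assumes performance scales
# roughly linearly with the 20-30% higher clocks discussed above.

tegra3_fps = 50.0        # placeholder: Tegra 3 score (e.g. HTC One X)
mali400mp4_fps = 60.0    # placeholder: Mali-400/MP4 score (e.g. Galaxy S III)

t3p_low, t3p_high = tegra3_fps * 1.2, tegra3_fps * 1.3
diff = abs(mali400mp4_fps - t3p_high) / mali400mp4_fps
print(f"Estimated Tegra 3+ range: {t3p_low:.0f}-{t3p_high:.0f} fps")
print(f"Difference from Mali-400/MP4 at the high end: {diff:.0%}")
```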


I can certainly see why NVidia have gone the way they have with their Tegra chips, pushing for greater CPU power, but it still seems odd that a company which has made its name in the GPU market has provided relatively low-performing graphics capabilities in their mobile stuff

Don't forget that the entire Tegra 3 SoC die size was only ~80 mm^2, while the A5X SoC die size was ~163 mm^2. NVIDIA is a resourceful company, but even they cannot perform miracles :D
 
No chance Tegra 3 is as powerful as the Mali-400.

I never said that. I said that Tegra 3+ (aka T37, which is supposed to have at least 20-30% faster CPU/GPU operating frequencies vs. T30L, and probably updated drivers too) should be reasonably competitive with the Mali-400/MP4 that is used in the Galaxy S III international phone.
 