NVIDIA Tegra Architecture

You're likely to suffer from wireless congestion, though that depends on where you live and the time of day. 2.4GHz is crowded; omitting 5GHz WiFi is a bad move if you want to push real-time interactive streaming.

Well, the tablet doesn't have any controls, so it doesn't have to be interactive; it just needs a 720p H.264 + stereo (AAC, MP3, whatever) stream delivered to it fast enough.
Interaction would have to happen through, e.g., a wireless gamepad connected to the "render server".
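As a rough sanity check on that claim (the bitrates below are assumed, not measured figures), a 720p H.264 game stream plus stereo audio needs only a modest fraction of what even a congested 2.4GHz link usually sustains:

```python
# Rough bandwidth budget for a 720p game stream (all figures are assumptions).
video_mbps = 10.0    # 720p60 H.264 at a generous game-streaming bitrate
audio_mbps = 0.32    # 320 kbps stereo AAC/MP3
overhead = 1.25      # ~25% extra for RTP/UDP/Wi-Fi framing and retransmits

required = (video_mbps + audio_mbps) * overhead
print(f"Required throughput: {required:.1f} Mbps")
```

Even a crowded 2.4GHz link often sustains 20-30 Mbps; the real risk for game streaming is latency spikes under congestion, not raw bandwidth.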
 
http://www.digitimes.com/news/a20131212PD211.html

TSMC's 16nm FinFET has entered risk production, with volume production scheduled to kick off within one year, according to the foundry chipmaker. Newly-appointed co-CEO Mark Liu said at TSMC's annual supply chain management forum that the foundry is looking to ramp up 16nm production ahead of schedule.

TSMC revealed previously that it would initiate volume production of 16nm FinFET chips in the first quarter of 2015.

Liu also indicated that TSMC is set to enter volume production of SoC chips built using 20nm process technology in January 2014.
 
Tegra 5 Speculations?

CES 2014 is coming in less than 2 weeks (Jan 7-10).

Nvidia usually announces their latest Tegra at that show.

So what do we know and what is speculated that Tegra 5 will be?
 
Will the Tegra 5 require active cooling? If so, how large a fan? What will its power draw be? How hot will it run?
 
From what I gather it's pretty much guaranteed to be a 4+1 Cortex-A15 design with Kepler-M graphics on 28nm. The only real unknown is the precise configuration of the GPU.
 
Will the Tegra 5 require active cooling? If so, how large a fan? What will its power draw be? How hot will it run?

nVidia has demonstrated the GPU reaching performance levels similar to the iPad 4's while using < 1W, so that part will have no problem scaling to well below the levels needing active cooling. I see no indication that they'll use a CPU with worse perf/W than the 4+1 Cortex-A15 arrangement in Tegra 4; if they use the A15 again, chances are it'll be the revision with further power-consumption optimizations, so it should be at least slightly better. And it should have no problem scaling to the same power levels Tegra 4 does.
 
nVidia has demonstrated the GPU reaching performance levels similar to the iPad 4's while using < 1W, so that part will have no problem scaling to well below the levels needing active cooling. I see no indication that they'll use a CPU with worse perf/W than the 4+1 Cortex-A15 arrangement in Tegra 4; if they use the A15 again, chances are it'll be the revision with further power-consumption optimizations, so it should be at least slightly better. And it should have no problem scaling to the same power levels Tegra 4 does.

Presumably, the next iteration of the Shield console (if there is one) might feature active cooling, but I agree that in phones and tablets there should be no need for it.
 
Latest guesses for Tegra 5 (at least the top-end version?) are quad-A15 + ninja core together with a GK208. That makes 384 CUDA cores, 32 TMUs and 8 ROPs.


EDIT: Make that 16 TMUs for GK208. I was misled by Wikipedia.
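For a ballpark on what a GK208-class config would deliver: Kepler counts an FMA as 2 FLOPs per CUDA core per clock. The clock below is a purely hypothetical mobile frequency, not anything announced:

```python
# GK208-class configuration from the post above; the clock is an assumption.
cuda_cores = 384
tmus = 16          # per the correction for GK208
clock_ghz = 0.6    # hypothetical mobile clock, purely illustrative

gflops = 2 * cuda_cores * clock_ghz   # FMA = 2 FLOPs per core per clock
gtexels = tmus * clock_ghz            # peak texture fill rate
print(f"~{gflops:.0f} GFLOPS, ~{gtexels:.1f} GTex/s at {clock_ghz} GHz")
```

Even at a conservative mobile clock, that would be well beyond any shipping mobile GPU of the time, which is why power, not peak throughput, is the interesting question.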
 
My money is on a more concrete Tegra 5 reveal, meaning CPU information, more exact GPU specs, and benchmark/power numbers.
 
Full disclosure on Tegra 5 before Tegra 4i devices see the light of day?
BTW, where the hell is Tegra 4i?
 
I hope they did can it; it really is too little, too late.
Overall, and while I know I might be part of the 0.1% here, I'm hoping for Nvidia to change its approach to this market.
I think they are trying too hard to make a dent in the high end; I would say it is in their DNA. But I don't think they are playing to their strengths properly, and that comes from somebody who seriously contemplated buying a Tegra Note 7 this Christmas.
I was going to post something along those lines a couple of weeks ago and decided to pass, but yesterday I watched the podcast Anand did with an ARM Fellow and changed my mind, especially the part about the thriving market for mid-range devices, for which the Moto G is a poster child.
My belief is that Nvidia focuses too hard on hardware excellence and proprietary solutions, and in turn their time to market suffers.
IMHO what they need, for now at least, is to focus on mid-range products and use off-the-shelf ARM solutions (on the CPU side, obviously).
Hardware is hugely important for Nvidia, but so is software; actually, software might be the main reason for Nvidia's position in the PC realm. My take is that in a market where they have no name and customers are clueless, software support, long-term software support, is going to do a lot more for the brand than attempts at performance leadership.
I don't think Nvidia needs crazy quad cores or an insane GPU to succeed and carve out a sustainable place for itself in the market; what they need is well-rounded hardware and support, the type of support you get on iOS or Nexus devices.
One thing they do right is TegraZone and the matching optimized games. They do it with their GPU cards; their strength should be providing the same level of support in the mobile world.
Nvidia should rhyme with "safety" in customers' perception; IMHO that is what they should aim for.
Back to hardware: Nvidia should refrain from spreading itself too thin. They are excellent, there is no disputing it, but they are not Qualcomm, Samsung or Apple, either in manpower or in brand recognition.
In that podcast, Anand asked the ARM Fellow about his dream setup for a mobile device on the upcoming node. His answer was 2 A57s and 4 A53s, both clusters clocked conservatively relative to PR standards. PR standards are not useless, but they are more relevant in the high end than in the mid-range.
Again, the Moto G is proof that you don't need crazy specs in that segment, just a consistent product and support. My belief is that Nvidia is in a great position to provide that type of product now that they have bought Icera.
Ultimately NV tries to sell pretty high-end SoCs, with matching power consumption, and those chips end up in pretty mid-range devices, if not lower. Luckily they are keeping die size under control. Yet my take is that they would be better off providing a proper mid-range SoC (what Tegra 4i could have been if it were not so late).
Keeping that ARM Fellow's dream setup in mind, I think a great product that could have launched in place of Tegra 4 as we know it might have consisted of:
1 A15 and 2 A7 cores, with a GPU more akin to the one in Tegra 4i. Not crazy on paper, like the Moto G, but with proper support (OS and games) it might very well have turned into a class-defining SoC.

On the other hand I think their move with the Tegra Note is great; they should stick to that approach and actually extend it to phones.
For this CES I have few hopes: they may have great things to showcase on the GPU front, but overall I expect them to pay too high a price in power consumption. I also expect them to be late to the ARM v8 party Apple got started.
Still, I see a lot of potential in Nvidia; I just think they are almost too ambitious.
 
I don't think T4i is canned, nor should it be. There should be a market for a relatively fast quad-core Cortex-A9-R4 with integrated LTE. It might not beat octo-core A7s from Mediatek et al. in Antutu and other synthetic benchmarks, but it should do better in most real applications, which seldom scale past 2 threads, let alone 4. Sure, when A53 and A12 show up, A9 will become obsolete, but until then (~6 months?) T4i could prove attractive to OEMs. It would have been better a year ago, but I guess figuring out LTE integration takes time.
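The thread-scaling point can be illustrated with Amdahl's law: if only part of an app's work parallelizes (60% below is an assumed figure for a typical mobile workload, not a measurement), adding cores past two buys very little:

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: overall speedup is limited by the serial fraction."""
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.6  # assumed parallel fraction; purely illustrative
for n in (2, 4, 8):
    print(f"{n} cores: {speedup(p, n):.2f}x")
```

With those numbers, 4 cores give about 1.82x over one core versus 1.43x for 2 cores, and 8 cores still under 2.2x, which is why per-core speed tends to matter more than core count in real applications.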

The main problem I see is the GPU: no OpenGL ES 3.0 support doesn't look very good in a 2014 SoC. I'm also curious as to what will replace T4i, and I hope NVIDIA will talk about that. My guess would be a quad-A53 with Kepler (Maxwell?) graphics and LTE. We'll see.
 
I'm not convinced that a quad A53 will be an improvement on a quad A9-R4. At least not if those clocks that NVIDIA is going for are an indication. AFAIK, those Cortex-A53 cores aren't really made for high clock speeds and that's why I don't expect to see them at anything higher than say 1.8 GHz (Just a guess). That should make NVIDIA's 2.3 GHz A9s still a bit faster. That said, those A53s will probably be quite a bit more energy efficient.
 
I'm not convinced that a quad A53 will be an improvement on a quad A9-R4. At least not if those clocks that NVIDIA is going for are an indication. AFAIK, those Cortex-A53 cores aren't really made for high clock speeds and that's why I don't expect to see them at anything higher than say 1.8 GHz (Just a guess). That should make NVIDIA's 2.3 GHz A9s still a bit faster. That said, those A53s will probably be quite a bit more energy efficient.

ARM apparently expects A53 to be used around 2GHz:
[Image: Cortex-A53 relative performance chart]

http://www.arm.com/products/processors/cortex-a50/cortex-a53-processor.php

I'm not sure it's any less capable of reaching high clock than A9. Besides, I imagine T4i will often have to throttle down.
 
I don't think T4i is canned, nor should it be. There should be a market for a relatively fast quad-core Cortex-A9-R4 with integrated LTE. It might not beat octo-core A7s from Mediatek et al. in Antutu and other synthetic benchmarks, but it should do better in most real applications, which seldom scale past 2 threads, let alone 4. Sure, when A53 and A12 show up, A9 will become obsolete, but until then (~6 months?) T4i could prove attractive to OEMs. It would have been better a year ago, but I guess figuring out LTE integration takes time.
My issue with the A9 at this point in time is that I think the A7 offers more bang for the buck: the A9 is indeed faster per cycle, but significantly bigger and more power-hungry. Peter Greenhalgh's views would serve Nvidia better: games scale reasonably with the number of cores, per-core performance is still not that relevant in the mobile realm, so slower, less power-hungry cores free up power for the GPU to use.
Then there is the investment in silicon: four A9 cores take a lot more room than four A7s, and I don't think it is worth it.
The main problem I see is the GPU: no OpenGL ES 3.0 support doesn't look very good in a 2014 SoC. I'm also curious as to what will replace T4i, and I hope NVIDIA will talk about that. My guess would be a quad-A53 with Kepler (Maxwell?) graphics and LTE. We'll see.
I see nothing wrong with the Tegra 4i GPU; it looks really die-space efficient, performance should be fine, and I don't think OpenGL ES support will prove to be an issue in real-world usage.

Looking forward, I expect Nvidia to work on a single SoC; Tegra 4i exists because of some Tegra 4 shortcomings: LTE support, power consumption, etc.

To put it shortly, Nvidia's focus should not be on winning benchmarks but on developing platforms (phones and tablets) and providing long software support for those platforms (both OS and games). How long, for example, the Tegra Note's OS will be supported is more relevant to me than benchmarks; then comes power consumption/battery life, and last the performance/price ratio.
I think a lot of people feel the same; a lot of people in the mobile realm are done with half-assed support (when there is any...) for the products they bought.
 