NVIDIA Tegra Architecture

Chip companies in general are pretty good at resisting the gross margin race to the bottom, especially the ones with high added value.

True. The automotive space is quite a bit different from the consumer space in that it requires much more effort than simply being a high-volume chipset supplier with the fastest time to market. The hardware and software stack need to be extremely robust. And considering that NVIDIA's clients include many luxury and sports car companies such as Audi, BMW, Tesla, Bentley, Porsche, and Lamborghini, these clients are not necessarily interested in a race to the bottom dollar for their in-car electronics. In fact, an SoC such as the Tegra K1 has quite a lot of compute horsepower (relatively speaking) and will save these companies quite a lot of money on advanced computing technology compared to what they used in the past. To make a long story short, electronics chipsets are not a commodity in the automotive space.
 
It started at ~9 fps and is now at ~11 fps in the Manhattan GFXBench 3.0 test, which coincidentally happen to be the two performance values for which NVIDIA released TK1 power consumption measurements.

The ~9 fps is just the mean, or if you prefer the average, of the two test runs so far: the first was at 7.7 fps and the second at 11.5 fps http://gfxbench.com/subtest_results_of_device.jsp?D=Lenovo+K1+HD+%282014%29&id=545&benchmark=gfx30

There's about a 60% difference in offscreen fillrate between the two.

Judging by the older results here: http://www.tomshardware.com/news/lenovo-thinkvision-28-nvidia-tegra-k1-android,25733.html

There's at least 2x more performance lurking in there if they clock it higher.
 
HP Slatebook 14

It is powered by an NVIDIA SoC, either the K1 or the Tegra 4.

http://www.tomshardware.com/news/hp-slatebook-android-laptop,26648.html

The unannounced laptop seems to be your typical clamshell model: no transforming form factors will be found here. The screen measures 14-inches, has touch-based input and a Full HD resolution, aka 1920 x 1080. Backing this screen is an Nvidia Tegra SoC – likely the Tegra K1 or the Tegra 4 – and 2 GB or 4 GB of RAM. The version of Android running on this device is unknown although it’s probably v4.2 or v4.3 offering Google’s portfolio of services.
 
Yeah, clearly the iPhone 5S failed because of its lack of RAM.
This is getting off-topic, but anyway: the iPhone 5S was released last year and has its OS tailor-fitted to its hardware; even so, many people have complained about the lack of RAM.
The iPhone 6 will have at least 2GB.
 
Generalizing to make it slightly back on topic: it's a given that you need more RAM on an OS that uses GC instead of reference-counted memory, but is there any indication even on Android that 1GB is severely limiting? How would a user even know? On an almost two-year-old iPhone 5, it has never even entered my mind that something was amiss on the memory front. Is this really much different for Android?
 

It's highly relative to how you handle your applications; if you browse the web with a fair number of tabs on Android (it doesn't make much difference which browser you use), you will notice the lack of RAM more than once, but nothing that IMO makes it completely unbearable.

iOS is obviously quite a different chapter, first and foremost because there's no current 2GB iPhone to compare the 1GB ones against.

For a >$600 device 1GB looks bad, but bite me, folks: who here really thinks it would have been a bigger sales success if it had had even 3GB of RAM?
 
Android 4.4 KitKat was slimmed down and runs just fine on 512MB; 1GB is more than enough.

For a mainstream/low-end device, definitely. "Just fine" and "more than enough" are highly relative to someone's demands. I don't do a lot of fancy stuff myself on a smartphone, but I wouldn't suggest that the 2 and 3GB RAM amounts on current and upcoming high-end smartphones are complete overkill either.
 
I'm not sure if this was posted earlier, but here are the theoretical differences between T4 and TK1:

[attached slide: slides23.jpg]
 
Interesting part - http://developer.download.nvidia.com/embedded/jetson/TK1/docs/Jetson_platform_brief_May2014.pdf
Some numbers from the paper: 13 * 2.5 = 32.5 fps in GLB3.0 1080p offscreen at ~7W (AP+DRAM) on an unoptimized OS and platform, which should translate to ~5W (AP+DRAM) on Android (TK1 delivers 1.5x the perf of the A7 at a constant 2605 mW on a mobile OS + LPDDR3, while the unoptimized Linux version + DDR3L consumes 3660 mW at the same performance level: 3660 / 2605 = 1.405, so 6980 / 1.405 = 4968 mW for the fully clocked version).
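
Spelling out that back-of-the-envelope math (same numbers as above; the only assumption is that the Jetson-vs-mobile-platform power ratio measured at the iso-performance point also holds at full clocks):

```python
# Rough reconstruction of the estimate above: scale the full-clock Jetson TK1
# power by the Jetson-vs-mobile-platform ratio at the iso-performance point.

mobile_mw = 2605.0   # TK1 on a mobile-optimized platform (LPDDR3, mobile OS)
jetson_mw = 3660.0   # Jetson TK1 (DDR3L, unoptimized Linux), same perf level
ratio = jetson_mw / mobile_mw            # ~1.405 platform overhead

jetson_full_mw = 6980.0                  # ~7 W (AP+DRAM) at 32.5 fps on Jetson
mobile_full_mw = jetson_full_mw / ratio  # ~4968 mW, i.e. ~5 W on a mobile platform

print(f"platform overhead ratio: {ratio:.3f}")
print(f"estimated mobile-platform power at 32.5 fps: ~{mobile_full_mw:.0f} mW")
```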
 
Tegra K1 Performance per watt represented in this chart were collected on a mobile optimized Tegra K1 platform (that uses LPDDR3, smart panel, and other mobile optimized platform components). Jetson TK1 platform is not optimized for mobile power levels.
Jetson TK1 will enable a new generation of applications for computer vision, robotics, medical imaging, automotive, and many other areas.
Powered by the revolutionary 192-core NVIDIA Tegra K1 mobile processor, the Jetson platform delivers over 300 GFLOPS of performance that is almost three times more than any similar embedded platform.
Performance for those 32.5 fps in Manhattan offscreen was measured on the Jetson TK1 board (which is not optimized for mobile power levels), yet the 7.29 fps/W figure was obtained from a mobile-optimized platform. One result is not necessarily directly connected to the other.

Apple A7 consumed 2560 mW to deliver 13 fps on iPhone 5S. An equivalent Tegra K1 mobile platform consumes 2605 mW to deliver 1.45x higher performance. Since Jetson uses DDR3L and OS is not tuned for mobile, it consumes 3658 mW to deliver an equivalent 1.45x performance advantage over Apple A7.
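
For what it's worth, those figures roughly reproduce the fps/W number above (a quick cross-check, assuming the 13 fps A7 baseline and the 1.45x factor from the quote):

```python
# Cross-check of the quoted perf/W numbers: TK1 at 1.45x the A7's 13 fps,
# measured at 2605 mW on the mobile-optimized platform.

a7_fps, a7_mw = 13.0, 2560.0
tk1_fps, tk1_mw = 13.0 * 1.45, 2605.0

print(f"A7:  {a7_fps / (a7_mw / 1000):.2f} fps/W")    # ~5.1 fps/W
print(f"TK1: {tk1_fps / (tk1_mw / 1000):.2f} fps/W")  # ~7.2 fps/W, close to the quoted 7.29
```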

On a side note: after the original 365 GFLOPS figure, I recently saw 326 GFLOPS being mentioned, and now it's suddenly just >300 GFLOPS. Either someone couldn't be bothered to recalculate the 326 GFLOPS, or not all Jetson TK1 boards have the same GPU frequency.
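
For reference, the GFLOPS figures map straight back to GPU clock via the usual Kepler FP32 formula (192 cores x 2 FLOPs/clock for FMA); back-solving the clocks implied by each quoted number:

```python
# Kepler FP32 throughput: cores * 2 FLOPs/clock (one FMA per core per clock) * clock.
# Back-solve the GPU clock implied by each quoted GFLOPS figure
# (300 is just the ">300 GFLOPS" floor from the Jetson brief).

CORES = 192
FLOPS_PER_CLOCK = 2

for gflops in (365, 326, 300):
    clock_mhz = gflops * 1e3 / (CORES * FLOPS_PER_CLOCK)
    print(f"{gflops} GFLOPS -> ~{clock_mhz:.0f} MHz GPU clock")
```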
 
NVIDIA Tegra K1 SoC Finally Detailed, Has 5-8W TDP, 11W at Top Load

So much for that 60 watt hit piece.

http://news.softpedia.com/news/Tegr...led-Has-5-8W-TDP-11W-at-Top-load-439981.shtml

In fact, the chip has already been pitted against the Tesla K40 GPU, or rather a Tesla K40 GPU + CPU system configuration.

Obviously, it failed to even scratch the performance of the K40, but that wasn't the point. The point was to illustrate the energy efficiency.

And indeed, the Tegra K1 has a very good power-performance ratio. It used a bit over 11W when at full power, a situation that will rarely occur in real life.

Sure, if some server maker or other decides to make a system based on it, with a special motherboard and everything, the Tegra K1 might be used at full throttle.

Tablets, however, and even superphones are unlikely to ever push the Tegra K1 that far, so the average TDP should be 5 to 8W. Definitely leagues away from the 45-60W that previous rumors and leaks suggested.
Another Article Link: http://wccftech.com/stats-nvidia-tegra-k1-superchip-wattage-finally-revealed
 