NVIDIA Tegra Architecture

Of course NVIDIA will not give up on tablets and high-end smartphones. They will not chase mainstream smartphones.
 
http://www.anandtech.com/show/8035/qualcomm-snapdragon-805-performance-preview/2

S805's Krait is losing even to T4's A15 R2; this is an easy win for TK1 with its A15 R3.

http://www.anandtech.com/show/8035/qualcomm-snapdragon-805-performance-preview/3

In GFXBench 3.0 it manages only 17 FPS on a 128-bit memory interface, while TK1 hits 30 FPS with only a 64-bit memory interface. That's another easy win for the TK1's Kepler GPU, and even more impressive considering the margin of victory on a smaller memory bus.

The gap will only grow bigger when Erista, with its ultra-efficient Maxwell GPU, launches next year at CES 2015.
 
Denver can only become a reality when 64-bit Android and 64-bit Windows on ARM become a reality. So the hardware and software need to be ready, which will probably not happen until the end of this year at the earliest.
 
TK1 in the Mi Pad takes 5 hours to drain a 6700 mAh battery while playing Real Racing 3.
That means it's sucking about 6.7 Ah / 5 h = 1.34 A.

Assuming a typical IC voltage of 3.3 V, this means the TK1 takes quite a bit less than 1.34 A × 3.3 V ≈ 4.4 W.

Take away the energy needed for voltage regulation (I'm assuming the display uses more than 3.3 V?), the display, sound, mass storage, touch panel, etc., and it seems the TK1 is probably consuming close to 3 W while playing a rather advanced 3D game. That seems really nice, and a big difference from the 6-12 W we saw in early rumours.

Of course, this would be assuming that the battery meter in the Mi Pad is very precise. It probably isn't.
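
Just to put the arithmetic above in one place, here's a minimal Python sketch; the 3.3 V figure is only the assumption made above, not a measured rail or battery voltage.

Code:
def average_power_w(capacity_mah, runtime_h, voltage_v):
    # average current (A) = capacity (Ah) / runtime (h); power (W) = current x voltage
    current_a = capacity_mah / 1000.0 / runtime_h
    return current_a * voltage_v

# Mi Pad draining 6700 mAh in ~5 hours of Real Racing 3
print(average_power_w(6700, 5, 3.3))  # ~4.4 W at the assumed 3.3 V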
 
Denver can only become a reality when 64-bit Android and 64-bit Windows on ARM become a reality. So the hardware and software need to be ready, which will probably not happen until the end of this year at the earliest.

Or quite simply, I feel NVIDIA doesn't need to emphasize the Denver K1 while they're currently trying to sell the 32-bit one.

Regarding the OSes, they are fairly processor-agnostic; I'm sure Google and Microsoft have already worked out, or can work out, the 64-bit-specific issues on x86-64 or even MIPS systems, but of course that doesn't spare them from building and validating for ARMv8 (including dealing with whatever hardware bugs there are in Denver, plus compiler bugs).
 
Regarding the OSes, they are fairly processor-agnostic; I'm sure Google and Microsoft have already worked out, or can work out, the 64-bit-specific issues on x86-64 or even MIPS systems
The kernel is indeed not a big issue (Linux has been running on AArch64 for a long time, be it on simulators or test chips). But JITs have become increasingly important (for Java apps on Android, or JavaScript), and that work is still ongoing (even for x86-64).
 
Of course, this would be assuming that the battery meter in the Mi Pad is very precise. It probably isn't.

That's a "price" you'll pay with all those applications, GFXBench 3.0 included; you can take them as an indication, but that's about it.

On a side note, that onscreen T-Rex result for the Mi Pad should be wrong; it should be somewhere in the 40+ region, since the Lenovo ThinkVision at 4K is already reaching 18+ fps in that test.
 
TK1 in the Mi Pad takes 5 hours to drain a 6700 mAh battery while playing Real Racing 3.
That means it's sucking about 6.7 Ah / 5 h = 1.34 A.

Assuming a typical IC voltage of 3.3 V, this means the TK1 takes quite a bit less than 1.34 A × 3.3 V ≈ 4.4 W.

The battery's not 3.3V (and most of the power consumption is probably not on a regulated 3.3V rail but on rails closer to 1V). Battery's going to be roughly some multiple of 3.7-3.8V (maybe a little higher), depending on how many cells it has in series. I can't find any information on this, but based on the charge times for the rated chargers it looks like one cell. That'd put power consumption under that load test around 5W.

Another possible consideration is that the voltage output of a battery is higher when it's nearly fully charged, so the tablet should draw less current in this case. Whether or not this affects the capacity measurements they took would depend on how those work.

A big variable is the brightness of the screen. Nexus 7 2013 allegedly runs at 44 hours idle with the screen at minimum brightness, using only a 16Wh battery, so the whole thing would be using only around 360mW. So you might be counting too much for the non-SoC part of the system.
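
Plugging those numbers into the same back-of-envelope formula (a single ~3.7-3.8 V cell for the Mi Pad, and the 16 Wh over 44 h idle figure for the Nexus 7), still only a rough sketch:

Code:
# Mi Pad: same 6700 mAh over 5 h, but at a single-cell battery voltage
for v_batt in (3.7, 3.8):
    print(v_batt, 6.7 / 5 * v_batt)   # roughly 5.0-5.1 W at the battery

# Nexus 7 (2013) idle sanity check: energy / time
print(16.0 / 44.0)                    # ~0.36 W (~360 mW) for the whole device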
 
The battery's not 3.3V (and most of the power consumption is probably not on a regulated 3.3V rail but on rails closer to 1V). Battery's going to be roughly some multiple of 3.7-3.8V (maybe a little higher), depending on how many cells it has in series. I can't find any information on this, but based on the charge times for the rated chargers it looks like one cell. That'd put power consumption under that load test around 5W.

I just checked, and I wrote the IC's typical voltage of 3.3 V, not the battery's.
Of course the battery would need a higher-than-3.3 V output; boost converters require switching, which is not very energy efficient.
 
I just checked, and I wrote the IC's typical voltage of 3.3 V, not the battery's.
Of course the battery would need a higher-than-3.3 V output; boost converters require switching, which is not very energy efficient.

Yes, you did write that, and I'm saying that if you want to determine the power consumption of the system relative to the capacity of the battery, you compute the power draw at the battery, which means using the voltage of the battery. Yes, there's some power lost in those regulators, but you already accounted for that separately, on top of the power consumption at the battery.

3.3 V is just one of the rails that the PMIC will be regulating, and most of the power consumption will NOT be coming off of this rail (although it depends on just what's in the system). The big consumers will be rails at around 1 V for the CPU cores and GPU, and 1.2 V for the memory and its interface. The SoC will actually have several power domains of varying voltages, probably a few regulated with high-efficiency switching converters and some low-power ones regulated with LDOs.
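
To illustrate that point with purely hypothetical per-rail numbers (not measured Mi Pad data), the power drawn from the battery is the sum over the rails of V × I divided by each regulator's efficiency:

Code:
# Hypothetical per-rail figures, only to show how rail power adds up at the battery.
rails = {
    # name: (rail voltage in V, average current in A, regulator efficiency)
    "cpu_core": (1.0, 1.5, 0.90),   # switching converter
    "gpu":      (1.0, 1.2, 0.90),   # switching converter
    "ddr_io":   (1.2, 0.6, 0.90),
    "io_3v3":   (3.3, 0.1, 0.85),   # the 3.3 V rail is a minor consumer here
}

load_w    = sum(v * i       for v, i, _   in rails.values())
battery_w = sum(v * i / eff for v, i, eff in rails.values())
print(load_w, battery_w)  # power at the loads vs. power drawn from the battery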
 
NVIDIA Tegra K1 platform for next generation automotive infotainment systems

Open Source automotive software news for Nvidia's K1

http://www.heraldonline.com/2014/05/27/5997857/pelagicore-announces-integration.html?sp=/100/773/385

NVIDIA UI Composer Studio is a groundbreaking HMI design tool used for instrument clusters and infotainment systems. Developed by NVIDIA, it's used by automakers and Tier 1 automotive suppliers to rapidly develop proofs of concept for evaluation, market research, usability testing, and ultimately final production.

Pelagicore’s integration of UI Composer and Qt combines photorealistic 3D graphics with pixel-perfect 2D graphical user interfaces on NVIDIA’s Tegra K1 processors. This allows highly integrated blending and controlling of UI Composer scenes by Qt applications.
 

Wow, this site is really jumping to conclusions.

- There are codenames "Volantis" and "Flounder", which must be the Nexus 8 because a flounder is a fish - no way could this be some other Nexus product or something else altogether
- There are Tegra K1 64-bit references in the Android kernel source, which apparently must mean that Google is using it, and not simply that nVidia themselves pushed it upstream
- There's a reference that this Volantis/Flounder is using ARM64, which must be the Tegra K1 64-bit, because as everyone knows no one else is ever working on an ARM64 SoC for Android, definitely not Samsung or Qualcomm

I really doubt Denver K1 will be in a product available less than a month from now, but I guess we'll know soon.
 