NVIDIA Tegra Architecture

I would say the risk is too high to claim just YET that existing technology can guarantee 100% autopilot safety. There are countless cases where moral problems could arise, and IMHO the whole issue is far too complicated on a legal basis.

Other than that, yes, of course NVIDIA could and would benefit from such a turn of events, in that they might fill the spot Mobileye decided to give up. However, if Tesla as an automotive manufacturer changes its policy on autonomous driving because of too many accidents, and that becomes a trend for other manufacturers too, NV might in the long run face far bigger damage in the automotive market than anyone can foresee today.
 
I think Nvidia is being smart and doing this in stages: first a safer auto cruise control, followed at a later date by a more advanced 'autopilot', but again rolled out in optional stages IMO, such as an automated 'parking valet' system and growing from there.
It still seems to me we are around 18 months away from a system that could be seen as a mass-production-ready, complete driverless system; however, Nvidia is making great strides in terms of the ecosystem required to develop such a relatively safe system.

Cheers
 
Maybe, although part of the rumor mill is that Tesla is trying to create more of its hardware and software in-house. Part of the reason for the split was rumored to be that Tesla's desire for a custom platform diverged from Mobileye's more general platform.
Perhaps Nvidia's platform would plug into that, or a custom version of it.

Tesla did hire AMD's chief architect for its self-driving hardware group. He'd probably want to do more than repackage Nvidia's reference boards.
 
Shield K1 tablet just got another update. V1.4. The fourth this year and third update of their Android 6.0.1 branch.


---------
This update for SHIELD Tablet K1 contains important system enhancements & bug fixes, including:

  • Conformity with OpenGL ES 3.2
  • Additional optimizations to Android Doze
  • Fixed issue with intermittent Auto-Rotation shut downs
  • Fixed audio playback stutter when the display is off
  • Update to Android 6.0 Security Patch Level July 1, 2016
  • Overall stability and security improvements
History
https://shield.nvidia.com/support/shield-tablet-k1/release-notes/1
 
Nvidia doesn't make the sensors, do they, just the chips?

So how would they replace Mobileye, which incidentally is partnering with other companies to offer what sounds like a turnkey product to manufacturers?
 
Not sure I follow, as they both offer similar products; Mobileye does not manufacture the image capture/camera/measurement devices (which also include radar/LiDAR) either.

This is what Mobileye says about the products that look to compete with Nvidia's:
Mobileye is the world leader in computer vision systems, mapping, localization and machine learning focused on the automotive domain. Delphi is a world leader in automated driving software, sensors and systems integration. Working together, the two companies will co-develop the market’s first turnkey Level 4/5 automated driving solution.
The automated driving solution will be based on key technologies from each company. These include Mobileye’s EyeQ® 4/5 System on a Chip (SoC) with sensor signal processing, fusion, world view generation and Road Experience Management (REM™) system, which will be used for real time mapping and vehicle localization. Delphi will incorporate automated driving software algorithms from its Ottomatika acquisition, which include the Path and Motion Planning features, and Delphi’s Multi-Domain Controller (MDC) with the full camera, radar and LiDAR suite.

The suite Nvidia is creating also has everything apart from the measurement/sensor devices and cameras (which come from large scientific labs and manufacturers). Nvidia, it seems, does include the algorithms/controller for cameras/radar/LiDAR, maybe similar to Delphi, and critically it is also building partnerships and an ecosystem with many companies.
Nvidia's approach seems to be broader than that of Mobileye/Delphi.
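
For what it's worth, here is a rough structural sketch of the kind of stack both camps describe: fused perception, map-relative localization, path planning, and a vehicle controller. Every class and function name below is invented purely for illustration; this is not Mobileye's, Delphi's, or Nvidia's actual API, just the shape of the pipeline.

```python
# Hypothetical sketch of the pipeline layering described above (perception/fusion,
# mapping/localization, path planning, vehicle control). All names are invented;
# the stages are stubbed out because the real implementations are proprietary.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    kind: str                       # "car", "pedestrian", ...
    position: Tuple[float, float]   # metres in the ego frame
    velocity: Tuple[float, float]   # metres per second

@dataclass
class Pose:
    x: float
    y: float
    heading: float                  # radians

def fuse_sensors(camera_frame, radar_tracks, lidar_points) -> List[Detection]:
    """Perception + sensor fusion (the EyeQ-style SoC work in the quoted stack)."""
    raise NotImplementedError

def localize(detections: List[Detection], hd_map) -> Pose:
    """Map-relative localization (the role REM plays in the quoted stack)."""
    raise NotImplementedError

def plan_path(pose: Pose, detections: List[Detection], route) -> List[Pose]:
    """Path/motion planning (the Ottomatika-style software in the quoted stack)."""
    raise NotImplementedError

def drive_step(sensors, hd_map, route, vehicle) -> None:
    """One control cycle: sense, localize, plan, actuate."""
    detections = fuse_sensors(*sensors)
    pose = localize(detections, hd_map)
    trajectory = plan_path(pose, detections, route)
    vehicle.follow(trajectory)  # multi-domain-controller territory in the Delphi description
```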

Cheers
 
At the moment I have Tegra Note 7, ASUS Transformer TF701 and Shield Tablet K1. I've been trying the Android release of KOTOR on each of them.

KOTOR has some graphics options: grass, shadows, frame buffer effects (FBE), and high quality graphics. The FBE option appears to kick the z-buffer to higher precision, because it eliminates distant z-fighting. The shadows option requires FBE; it also noticeably slows Tegra 4. The "high quality graphics" option makes the game run at the device's native resolution instead of an upscaled one. The upscaled resolution is unknown but less than 1280x800.
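
As a rough sanity check on the z-fighting observation, here is a small back-of-envelope sketch. It assumes a conventional perspective depth buffer (depth stored as a + b/z) and made-up near/far planes and distances; none of the numbers are measured from KOTOR.

```python
# With a standard perspective depth buffer, one LSB at eye distance z covers
# roughly z^2 / (2^bits * z_near) metres, so a 16-bit Z buffer gets very coarse
# in the distance. z_near, z_far and the 200 m test distance are just
# illustrative assumptions.

def depth_step(z: float, bits: int, z_near: float = 0.1, z_far: float = 1000.0) -> float:
    """Approximate world-space size of one depth-buffer step at eye distance z."""
    b = z_far * z_near / (z_far - z_near)   # from depth(z) = a + b / z
    return z * z / (b * (2 ** bits))

for bits in (16, 24):
    print(f"{bits}-bit Z: one step ~ {depth_step(z=200.0, bits=bits):.3f} m at 200 m")
# 16-bit Z: one step ~ 6.1 m   -> surfaces metres apart can alias (distant z-fighting)
# 24-bit Z: one step ~ 0.024 m -> far-distance fighting largely disappears
```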

Tegra K1 runs KOTOR at what looks like 60fps at native 1920x1200, and I did not notice any throttling after 30 minutes of play. Tegra 4 is playable only with upscaling, and is quite acceptable that way with all the other quality options enabled. It's impressive how huge a jump K1 is over T4.

The Transformer TF701 runs the game faster than Tegra Note 7 at the same settings. Maybe the TN7 throttles more. Or perhaps 2GB RAM is beneficial. Both are running Android 4.4.2, though TN7 can be updated to a buggy 5.1 release. The Shield K1 is Android 6.0.1.

Just some observations. Maybe someone will care. I get into this stuff. ;) I also have access to Tegra 2 and 3 but don't think those are worthwhile for KOTOR.
 
I still have a Tegra 3 Asus Transformer TF300T, which I haven't trashed yet because I use it for testing and tweaking my apps for the large screen size. However, it is a pain to use: it takes ages to boot a Lollipop Cyanogen release and it is buggy as hell as well. But I blame that more on the low-quality NAND employed than on Tegra 3. It has progressively gotten worse with time, even though it was quite decent when I bought it.

It's normal that TK1 was such a jump, since it finally employed a modern architecture (Kepler) versus Tegra 4's legacy G70-style pipeline with dedicated pixel and vertex shader "cores".
 
I had an ASUS MeMO Pad ME301T, which is essentially the same as the TF300, and I hear you on the flash performance. I've read there are some homebrew ROMs that can alleviate the problem, maybe through TRIM support. Unfortunately, ROM developers barely support the ME301T, maybe because it was much less common.

The TF701 is really fast, and so is the Tegra Note 7. Outside of games, they really aren't far off the general user experience of the K1.

It's just a bit mind-blowing how much faster K1 is than T4; it must be around 4x faster in KOTOR. I suppose the T3-to-T4 jump was similar, though.
 
I feel like adding something more here. Tegra 4's GPU is a curiosity to me. It has "72 cores!!!!", which I've seen compared to the 7600GT. If it can't run an upscaled (1024x600?) KOTOR at 60fps, it isn't even remotely close to 7600GT performance. It seems pretty amazing that NV could go from what must be a tiny, custom, mobile-oriented T4 GPU to a Kepler/Maxwell hybrid SMX on the same manufacturing process and still be in the same ballpark on power consumption.
 

The Tegra 4's raw performance should be a lot closer to a full G70/G71. The G70 had 8 vertex shaders and 24 pixel/fragment shaders, each able to do 2 MADDs: 16 MADDs on the vertex shaders + 48 MADDs on the pixel shaders. Tegra 4 does 24 MADDs on the vertex shaders + 48 MADDs on the pixel shaders.

Frequencies should also be similar between the Tegra 4 and the GeForce 7800 GTX, at ~550MHz.
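
To put rough numbers on that, here's a quick back-of-envelope calculation. It simply takes the per-clock MADD counts above at face value, assumes a shared ~550MHz clock for both parts, and counts each MADD as 2 floating-point ops; nothing here is measured.

```python
# Peak programmable-shader throughput implied by the MADD counts above,
# assuming a common ~550 MHz clock and 2 ops per MADD (multiply + add).
CLOCK_HZ = 550e6

def peak_gflops(vertex_madds: int, pixel_madds: int, clock_hz: float = CLOCK_HZ) -> float:
    """Peak shader throughput in GFLOPS for the given per-clock MADD counts."""
    return (vertex_madds + pixel_madds) * 2 * clock_hz / 1e9

print("G70/7800-class:", peak_gflops(16, 48), "GFLOPS")  # ~70 GFLOPS
print("Tegra 4       :", peak_gflops(24, 48), "GFLOPS")  # ~79 GFLOPS
```

On those assumptions the two come out in the same ballpark, which is the point of the comparison.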

KOTOR should top the panel's 60Hz at 1024*600 on such a GPU, unless it's severely bandwidth-limited (at that resolution? hmm...).
The original KOTOR rendered at 640*480 on an NV2A (a GeForce 3 Ti-ish GPU) plus a 733MHz Celeron. A single one of Tegra 4's Cortex-A15 cores at 1.8GHz should be plenty to run the game (which is probably single-threaded anyway).

My guess is that the Android port was just lazy, mostly because by the time it was released (last year?) any mid-range phone could run the game comfortably without much effort.
I bet a Bay Trail tablet running Windows could run the game at over 720p with max quality settings at 60 FPS.
 
Yeah, memory bandwidth probably isn't even half of what the 7800 had available. There's also the question of fillrate/bandwidth efficiency: the GeForce ULP is simplified in some ways. Tegra 4 is the first with >16-bit Z! Tegra K1 has all the goodies, even delta compression.
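
For a rough sense of the gap, here's a quick bandwidth comparison. The bus widths and transfer rates are assumed typical configurations (256-bit GDDR3 at 1200 MT/s for the 7800 GTX, dual-channel 32-bit LPDDR3 at 1866 MT/s for Tegra 4), not figures measured from any specific board.

```python
# Peak DRAM bandwidth = (bus width in bytes) x (transfers per second).
# The configurations below are assumptions, chosen to sanity-check the
# "isn't even half" guess above.

def bandwidth_gbs(bus_width_bits: int, transfer_mt_s: float) -> float:
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * transfer_mt_s * 1e6 / 1e9

geforce_7800_gtx = bandwidth_gbs(256, 1200)  # 256-bit GDDR3 @ 1200 MT/s -> 38.4 GB/s
tegra_4_lpddr3   = bandwidth_gbs(64, 1866)   # 2x32-bit LPDDR3 @ 1866 MT/s -> ~14.9 GB/s

print(f"7800 GTX: {geforce_7800_gtx:.1f} GB/s")
print(f"Tegra 4 : {tegra_4_lpddr3:.1f} GB/s ({tegra_4_lpddr3 / geforce_7800_gtx:.0%} of the 7800)")
```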

The optimization of KOTOR on Android is a question mark, but K1 has no trouble with it. The game actually has some changes compared to the older releases, including some altered shader effects. For example, the personal stealth generator no longer gives you that Predator cloak look but just makes your character transparent with some energy surrounding you; it's just texturing. This may have been done to deal with the very limited shader performance of older mobile chips like Tegra 3.

I tried the game on a Cherry Trail tablet today and noticed it was running 16-bit color by default; the dither was very obvious. Turning on the quality options kicked it to 24-bit. I don't think it ever runs 16-bit color on Tegra 4 or K1, because I cannot see any low-color-depth artifacts. I have seen other games run in 16-bit color, and usually the Intel GPUs do it better because they dither, whereas NVIDIA's output just gets very banded.
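
To illustrate what dithered vs. undithered 16-bit quantization does to a gradient, here's a toy sketch. It quantizes a smooth ramp to a 5-bit channel (as in RGB565), with and without an ordered-dither offset; it's purely illustrative and not how any particular driver implements it.

```python
# Quantising a smooth 8-bit ramp to a 5-bit channel leaves long flat runs
# (banding); adding a position-dependent Bayer offset first interleaves
# adjacent levels instead (noise rather than hard bands).

BAYER_ROW = [0, 8, 2, 10]  # one row of the classic 4x4 Bayer matrix (values 0..15)

def quantise_5bit(value_8bit: int) -> int:
    """Truncate an 8-bit channel value to 5 bits, then expand back to 8 bits."""
    q = value_8bit >> 3
    return (q << 3) | (q >> 2)

def dithered_5bit(value_8bit: int, x: int) -> int:
    """Same quantisation, but with an offset of roughly +/- half a step added first."""
    offset = (BAYER_ROW[x % 4] - 8) // 2
    return quantise_5bit(max(0, min(255, value_8bit + offset)))

gradient = list(range(64))  # a smooth 8-bit ramp
banded   = [quantise_5bit(v) for v in gradient]
dithered = [dithered_5bit(v, x) for x, v in enumerate(gradient)]

print("banded  :", banded[24:40])    # long flat runs -> visible bands
print("dithered:", dithered[24:40])  # neighbouring levels mixed together -> no hard bands
```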
 
To be honest, the game is 13 years old. It shipped in 2003 for the Xbox and Windows, was released for MacOS in 2004, re-released in 2009 as a Steam title, again in 2011 as a Mac App Store title, and again in 2013 on the iOS store.
For its 7th release, now on Android, what exactly makes Aspyr (I had to go look up the name of the guys who ported it) think that a straight-up port deserves an 11€ price?

Same thing for all those ports of 16/32-bit-era titles that Square Enix sells for 15-20€. Some of those are even Nintendo DS ports, which makes porting even easier.
 
Perfect example of why non-casual gaming never picked up on Android ;)

See the post above; give me one good reason why I should pay that much for such antiquated material. Give me something I wouldn't hesitate to pay for, even if the sum is considerably higher, and then I'll let you know why gaming has never picked up on any mobile platform until now.
 