NVIDIA Tegra Architecture

And which of those things requires 4x1.8+GHz Cortex-A15s and Tegra 4's GPU? Still not following this at all. All of it should be possible with a relatively modest CPU with competent media acceleration.
It's funny that that's exactly what I'm wondering about mobile phone chips and much less so about automotive. ;) After all, whatever chip is inside an iPad 3 is still very usable to me, and other than games, I don't see anything that's really going to require a lot more processing power.

I have a limited amount of experience in the field of automotive: the qualification cycles are absolutely brutal. Think chip process stability and aging, high and low temperature robustness, operation in a very noisy environment, and, let's not forget, rock-solid software stability over hundreds of thousands of cycles. And an army of anal-retentive German or Japanese engineers who will ding you for every little deviation from whatever spec they want to adhere to. And paperwork: mountains of it.

It takes a couple of years before a producer is willing to make the jump to include it in their platform, and only then do they start to actually design it in. With Tegra being only ~3 years old, the big ramp-up in volume, if there is one(!), should still be ahead of us. It's just too early to declare it a success or failure there.

As for price: I would not be the least bit surprised if this outdated $20 Android cell phone chip sells for $100 in the car market. In fact, that's probably low-balling it.

So with that said, bringing up Tegra 4 in this context is way too early. You can't see an automotive application now that requires a quad A15 and an amped-up GPU. But do you believe that automotive engineers won't be able to come up with something for deployment 5 years from now? Because if Tegra 4 ever makes it into automotive, that's when it's going to be in its prime.
 
It's funny that that's exactly what I'm wondering about mobile phone chips and much less so about automotive. ;) After all, whatever chip is inside an iPad 3 is still very usable to me, and other than games, I don't see anything that's really going to require a lot more processing power.

I believe CPU power is always nice, if only to run JavaScript, but what's needed is a dual Cortex-A15, not a quad. Hell, if it's about cell phones, I guess a single-core A15 (or some 1+1 scheme with A15 + A15, or A15 + A7) with a very toned-down GPU running Firefox OS is all I'd want.
 
I have a limited amount of experience in the field of automotive: the qualification cycles are absolutely brutal. Think chip process stability and aging, high and low temperature robustness, operation in a very noisy environment, and, let's not forget, rock-solid software stability over hundreds of thousands of cycles. And an army of anal-retentive German or Japanese engineers who will ding you for every little deviation from whatever spec they want to adhere to. And paperwork: mountains of it.

This. I expect any automobile I get to last me 10 years at a minimum. I'd be very disappointed if the included components for the dashboard and center console didn't also last that long. I'd rather have a lower-performance, proven architecture than something on or near the bleeding edge.

Regards,
SB
 
I think Renesas still has a decent share of the automotive navigation/infotainment SoC market, where they deploy some pretty high end PowerVR cores. They have to sample the SoCs years in advance of their actual installation into vehicle line-ups, though, as mentioned above.

The need for ever-increasing levels of performance will continue as devices start being able to interpret what their camera sees, what their mic hears, etc. and use that knowledge to perform useful new tasks. The pattern recognition, database indexing, etc. will require a lot of extra performance.

The iPad's A6X handles that same real-world CPU and GPU workload of that Need for Speed game (as well as high-end titles like Infinity Blade 2) without getting throttled over an extended session of play (let alone within 10 minutes).
 
It's funny that that's exactly what I'm wondering about mobile phone chips and much less so about automotive. ;) After all, whatever chip is inside an iPad 3 is still very usable to me, and other than games, I don't see anything that's really going to require a lot more processing power.

I have a limited amount of experience in the field of automotive: the qualification cycles are absolutely brutal. Think chip process stability and aging, high and low temperature robustness, operation in a very noisy environment, and, let's not forget, rock-solid software stability over hundreds of thousands of cycles. And an army of anal-retentive German or Japanese engineers who will ding you for every little deviation from whatever spec they want to adhere to. And paperwork: mountains of it.

That makes a lot of sense for anything that might have to control any critical system in the car, or affect driving in any way, but are the standards equally high for information and entertainment systems?
 
The iPad's A6X handles that same real-world CPU and GPU workload of that Need for Speed game (as well as high-end titles like Infinity Blade 2) without getting throttled over an extended session of play (let alone within 10 minutes).

You are arguing with a wall. When IMG trounced Nvidia last year in performance, the argument here was that we shouldn't use top performance as a metric but instead compare perf per mm² of die size and costs. If Nvidia had a 20% performance advantage it would be seen as HUGE, but if they have a 20% disadvantage in performance it's negligible.

When the iPad 4 and the 554MP4 came out it wasn't fair to compare them to Tegra 3, but now suddenly A6X vs. a future chip coming sometime next year is a legit comparison.

I'm excited about Logan, but the rest of the companies are not sitting still while Nvidia innovates. If and when they release competitive products, expect more cherry-picking of data here that favours Nvidia.
 
here we go again... definitive words like "epic fail" and "misleading customers" despite the product having just been launched... you'll never get it :devilish:
it's so easy to use this kind of rhetoric... I can do it too:
S800 is very late to market, announced on January 8th, but 8 months later no tablet is yet available = misleading customers and PR bullshit, that's why we hate Qualcomm so much here.
S800 has no design wins in tablets, it's a product no OEM wants. There has to be something deeply wrong. It's a complete disaster.
S800 has poor CPU performance, they have to bump the frequency to 2.3GHz to be competitive and still they can't beat T4 at 1.8GHz. It's the Bulldozer of the SoC world.
S800 in the Xperia Z Ultra has disappointing battery life in Engadget's review, despite the gargantuan 3,000mAh battery. EPIC FAIL!
This product is FAIL after FAIL after FAIL, Qualcomm is doomed. Will they still be around in 2015?
:rolleyes:

edit: I don't know why I'm still replying. I feel like a troll now... BTW "never argue with a troll, they will drag you down to their level and beat you with experience"
The crazy thing is that I'm not an NV fan (I don't own any green graphics card) but I can't keep my mouth shut when I read so much hate and so many biased opinions :???:

Ay?? There is a massive difference between Nvidia and Qualcomm when it comes to misleading consumers; Qualcomm is number one because it under-promises and over-delivers. Most Qualcomm SoCs/CPUs/GPUs/basebands etc. hit the targets Qualcomm advertises and sometimes even beat them :).

Nvidia promises a technological revolution and performance leadership through elaborate marketing.
When we get the final product it bears no resemblance to what they advertised; they often try to beef up their product by renaming things as small as ALUs as GPU "cores"!.. or insinuating Tegra 3 is as fast as/faster than an Intel Core Duo :)

As is the case with Tegra 4... promising unmatched performance and lower power consumption compared to Tegra 3. #laughable.

The S800 is not running late; it's scheduled to be arriving in devices now (2H 2013) and it is arriving in devices now, bang on time.

Performance of Krait 400 is actually HIGHER than Tegra 4's in the third-party products I have linked... so I don't know what you are using as your reference point... probably Nvidia slides or the fan-assisted oven that is SHIELD.
Even then, the Snapdragon S800 holds its own.

Edit: Well well, after a 30-second Google search I found a user complaining about the Nvidia SHIELD throttling itself after only ONE run of AnTuTu! Lol.

He says scores dropped from 40,000 to 30,000 in just 2 runs.
What a fantastic piece of engineering, Nvidia :)

"after doing a few benchmarks i noticed that the device throttles itself down. so while at first i scored 40k on antutu after doing a stability test i ended up doing another bench test and that time it dropped to 30k"
-xda user.
http://forum.xda-developers.com/showthread.php?t=2391036

Edit 2: Another 15-second YouTube search and I find another user complaining that his fan-assisted SHIELD throttles to about 50% of its performance in just TWO runs of 3DMark... and loading is noticeably affected even compared to a Galaxy Tab, which finished both tests posting virtually the same score.
https://www.youtube.com/watch?v=CK4cp8OLjws&feature=youtube_gdata_player

Point proven.
 
french toast... xpea was obviously being ironic.
The message is that you shouldn't use terms like "EPIC FAIL" or "TROUNCED" in this context.
It brings very little to the discussion and you kinda lose argumentative power.

Yes, T4 seems to be unable to fit in smartphones due to thermal reasons, but it could still become a commercial success if the Shield, Transformer Infinity, Slatebook X2 or the reference tablet take off.

The S800 seems to be a better chip, but it's probably a lot larger than T4, so nvidia needs to sell fewer chips to make the same amount of money, for example (not that I think T4 will ever give nVidia the same scale of profits that the S800 will give to Qualcomm).
 
french toast... xpea was obviously being ironic.
The message is that you shouldn't use terms like "EPIC FAIL" or "TROUNCED" in this context.
It brings very little to the discussion and you kinda lose argumentative power.

Yes, T4 seems to be unable to fit in smartphones due to thermal reasons, but it could still become a commercial success if the Shield, Transformer Infinity, Slatebook X2 or the reference tablet take off.

The S800 seems to be a better chip, but it's probably a lot larger than T4, so nvidia needs to sell fewer chips to make the same amount of money, for example (not that I think T4 will ever give nVidia the same scale of profits that the S800 will give to Qualcomm).

That's a bit like the pot calling the kettle black, isn't it? You constantly use such terms in your posts, as do other members.
The terminology is justified if the message is correct, which it is. You disagree? :/
 
french toast... xpea was obviously being ironic.
The message is that you shouldn't use terms like "EPIC FAIL" or "TROUNCED" in this context.
It brings very little to the discussion and you kinda lose argumentative power.

Yes, T4 seems to be unable to fit in smartphones due to thermal reasons, but it could still become a commercial success if the Shield, Transformer Infinity, Slatebook X2 or the reference tablet take off.

The S800 seems to be a better chip, but it's probably a lot larger than T4, so nvidia needs to sell fewer chips to make the same amount of money, for example (not that I think T4 will ever give nVidia the same scale of profits that the S800 will give to Qualcomm).

You're assuming that they're priced at the same point. I doubt that very much.
 
Continuing this slight OT:
Remember that the S800 has a full modem integrated, so when you compare its size to T4 you have to account for that. Unless they sampled APQ versions, just like they did with the S4 Pro.
 
french toast... xpea was obviously being ironic.
The message is that you shouldn't use terms like "EPIC FAIL" or "TROUNCED" in this context.
It brings very little to the discussion and you kinda lose argumentative power.

Yes, T4 seems to be unable to fit in smartphones due to thermal reasons, but it could still become a commercial success if the Shield, Transformer Infinity, Slatebook X2 or the reference tablet take off.

The S800 seems to be a better chip, but it's probably a lot larger than T4, so nvidia needs to sell fewer chips to make the same amount of money, for example (not that I think T4 will ever give nVidia the same scale of profits that the S800 will give to Qualcomm).

Not that I want to interfere in the debacle, but you realize that the higher the manufacturing volume, the lower the price at any foundry? The S800 might be (I honestly don't have exact figures) say 40-50% larger than T4, but with the volumes they're dealing with it's far less of an issue to even mention in a comparison.

And the above also assumes that both SoCs have the exact same yields while being manufactured; well, they don't. As a reminder, Qualcomm released 28nm SoCs quite some time ahead of NV.
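To put the die size vs. cost argument in rough numbers, here's a minimal back-of-the-envelope sketch. The wafer price, die areas, and defect density below are purely illustrative assumptions, not actual figures for the S800 or T4, and the yield model is the simplest Poisson one:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Rough gross dies per wafer using the standard approximation."""
    r = wafer_diameter_mm / 2.0
    return math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2)

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defect_density_per_cm2):
    """Cost per functional die with a simple Poisson yield model."""
    yield_frac = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)  # mm^2 -> cm^2
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * yield_frac)

# All numbers below are assumptions for illustration only.
wafer_cost = 5000.0   # USD per 28nm wafer (assumed)
for label, area in (("smaller die", 80.0), ("~50% larger die", 120.0)):
    print(f"{label} ({area:.0f} mm^2): ~${cost_per_good_die(area, wafer_cost, 0.5):.2f} per good die")
```

The per-die gap is real but modest next to the other levers mentioned above: volume pricing and how mature (low-defect) the process was when each SoC ramped.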
 
These ultra-mobile handheld devices tend to have strict system-level power and thermal constraints (i.e. CPU + GPU + screen + anything else that consumes power). By constantly ramping up CPU, then GPU, then CPU, then GPU in a benchmark that is already quite tough even under cool operating conditions, the system-level power and thermal constraints will clearly be reached, and as a result the system will be forced to clamp down on CPU and GPU clock operating frequencies. There is no way to get around this with a relatively high-performance quad-core CPU and a relatively high-performance GPU. With real-world gaming, looking at games that are clearly playable on these devices, the power and thermal headroom should be noticeably better than in the previously described loop for the vast majority of gaming applications, because the CPU should not be maxing out in a similar fashion and because the GPU should have higher framerates to begin with.
The purpose of that graphics test isn't to measure the performance in games that run at 60 fps at all times, which would obviously be pointless. It's to estimate the content complexity up to which the given device will be able to provide playable frame rates. That means testing peak sustainable GPU load (with a simultaneous moderate CPU load).
The fact that the frame rate is low in the test doesn't really mean much since even at higher frame rates the GPU might have no idle cycles, and the workload could be almost identical (in terms of composition of work performed by different functional units).

I'm not suggesting using a loop of alternating graphics and physics tests as some kind of standard. Looping just the graphics test would be a better idea.
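For what it's worth, the shared system-level budget being described can be illustrated with a toy first-order thermal model; the power figures, thermal resistance, time constant, and skin-temperature limit below are all invented for illustration and don't correspond to any real device:

```python
# Toy model: CPU and GPU power feed one shared "skin temperature" state.
# When the limit is reached, clocks (modeled here as power) get clamped once.
AMBIENT_C = 25.0
LIMIT_C = 45.0     # assumed skin-temperature limit
R_THERMAL = 4.0    # degC per watt (assumed)
TAU_S = 60.0       # thermal time constant in seconds (assumed)

def simulate(cpu_w, gpu_w, seconds):
    temp, throttled_at = AMBIENT_C, None
    for t in range(seconds):
        steady = AMBIENT_C + R_THERMAL * (cpu_w + gpu_w)
        temp += (steady - temp) / TAU_S        # first-order lag toward steady state
        if temp >= LIMIT_C and throttled_at is None:
            throttled_at = t
            cpu_w *= 0.6                       # crude frequency/voltage clamp
            gpu_w *= 0.6
    return round(temp, 1), throttled_at

# A loop that keeps both CPU and GPU near peak hits the limit within minutes;
# a game-like load with a lighter CPU share may never reach it.
print(simulate(cpu_w=4.0, gpu_w=3.0, seconds=600))   # -> throttles partway through
print(simulate(cpu_w=1.5, gpu_w=3.0, seconds=600))   # -> stays under the limit
```

Which is also why sustained looping of just the graphics test says more about playable content complexity than the CPU/GPU alternation does.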
 
It's funny that that's exactly what I'm wondering about mobile phone chips and much less so about automotive. ;) After all, whatever chip is inside an iPad 3 is still very usable to me, and other than games, I don't see anything that's really going to require a lot more processing power.

With phones you can make and install a wide variety of apps. One thing I can list that definitely needs the CPU power is higher-end console emulators. Games can benefit from it (even ones that don't constantly use much CPU can definitely benefit from bursting, for instance during loading), and web stuff sometimes can too.

This car media stuff sounds like a real embedded system that runs a few predefined programs. Or do people install apps in their cars?

I have a limited amount of experience in the field of automotive: the qualification cycles are absolutely brutal. Think chip process stability and aging, high and low temperature robustness, operation in a very noisy environment, and, let's not forget, rock-solid software stability over hundreds of thousands of cycles. And an army of anal-retentive German or Japanese engineers who will ding you for every little deviation from whatever spec they want to adhere to. And paperwork: mountains of it.

That should be true for the stuff that controls your car, but I doubt the same requirements are in place for media and even navigation. I've seen systems in cars where the performance was pretty erratic; it didn't strike me as rock solid at all.

If we were really talking "automotive grade" hardware I doubt Tegra would even qualify; there's a reason stuff like ARMv7-R exists, for instance.

I don't buy that it takes years to get these SoCs into cars either; nVidia was talking about specific vehicles with Tegra 2 not long after the chip came out.

As for price: I would not be the least bit surprised if this outdated $20 Android cell phone chip sells for $100 in the car market. In fact, that's probably low-balling it.

Surely somebody would be out-competing that if it's really just selling the same chip for much more.
 
The iPad's A6X handles that same real-world CPU and GPU workload of that Need for Speed game

Cherry-picking one particular game is not very helpful, especially when comparing it on two very different platforms. In fact, I don't even think the game is running at the same resolution on the two tablets. Anyway, throttling is a system-level phenomenon. The Toshiba tablet in question has a higher-resolution, more power-hungry screen, and a more powerful, more power-hungry quad-core CPU too.
 
I don't know what all the Shield hate is about in here. Sure, T4 failed expectations, but the throttling issue in the Shield has to do with the cores not waking properly after sleep. That is a software problem that NV is aware of and says it will fix (don't just skim, read the information).

When I first got the Shield the fan was on all the time. After the last major update I haven't been able to catch the fan running, and it hasn't slowed down at all, in general usage or gaming. I'm not gonna defend Nvidia for much, but in terms of the Shield, that thing is a beast on steroids. Don't knock it if you haven't tried it or used it.
 
Edit 2: Another 15-second YouTube search and I find another user complaining that his fan-assisted SHIELD throttles to about 50% of its performance in just TWO runs of 3DMark... and loading is noticeably affected even compared to a Galaxy Tab, which finished both tests posting virtually the same score.
https://www.youtube.com/watch?v=CK4cp8OLjws&feature=youtube_gdata_player

french toast, if you bothered to do any research on this, you would know that his Shield device has a sleep-mode issue where, after the device has been put into sleep mode, the CPU/GPU clock frequencies do not immediately ramp back up to normal operating speeds. This has nothing to do with throttling. In fact, if you actually read the NVIDIA Shield forum, you will see that, as long as his Shield is not going in and out of sleep mode, he is able to run 3DMark or other gaming applications many times without materially affecting his framerate. Nice try though.
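As an aside, anyone wanting to check which of the two effects they're seeing can watch the cpufreq and thermal sysfs nodes over adb while benchmarking: genuine thermal throttling shows clocks dropping as the reported temperature climbs, while the sleep-mode bug shows clocks pinned low on a cool device right after wake. A minimal sketch, assuming a debuggable device that exposes the usual Linux sysfs paths (exact node names vary between devices):

```python
import subprocess, time

def adb_cat(path):
    """Read a sysfs node over adb; returns None if it isn't exposed on this device."""
    out = subprocess.run(["adb", "shell", "cat", path], capture_output=True, text=True)
    return out.stdout.strip() or None

CUR_FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"   # kHz
MAX_FREQ = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_max_freq"   # kHz
TEMP     = "/sys/class/thermal/thermal_zone0/temp"                   # typically millidegrees C

# Sample once per second while the benchmark runs (or right after waking from sleep).
for _ in range(60):
    print(f"cur={adb_cat(CUR_FREQ)}  max={adb_cat(MAX_FREQ)}  temp={adb_cat(TEMP)}")
    time.sleep(1)
```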
 
blabla troll blabla
Point proven.
Oh my God, I've found a winner :LOL::LOL::LOL:

french toast... xpea was obviously being ironic.
The message is that you shouldn't use terms like "EPIC FAIL" or "TROUNCED" in this context.
It brings very little to the discussion and you kinda lose argumentative power.
Thanks TTTTranz; English not being my native tongue, for a second I thought that I was not clear enough.
 
Maybe some of you boys, and especially the newcomers, should breathe in a bit and look around; B3D is one of those great places where the "big boys" debate technical possibilities and where folks like me lean back, try to follow the conversation as much as we can, and just try to learn a couple of things.

If I want a sockpuppet show in a sandcastle I'll get it for free in my daughter's kindergarten, and you know what, it's got more wit than some of you present here.

If I want to read NV's obnoxious advertisements and/or PR/marketing drivel (or any other IHV's) I know where I can find that too.

It's really one of those cases where I'm starting to wonder what the hell I'm still doing here and I'm just about at the spot where my glass has gotten too full.
 
Once the Toshiba tablet throttles after several minutes of Need for Speed, the iPad is actually sustaining a higher CPU and GPU workload, since it's still running the game at its full frequencies. And the issue is not confined to Need for Speed; any sufficiently high-end game played for an extended session can result in the same comparison.

The fact that Tegra 4 has two extra higher-power CPU cores is not relevant, since its CPU (and other parts of the app processor) throttles under a given CPU workload where other competing devices don't.

I do concede that the Toshiba tablet has various challenges working against it, specifically the power-hungry display (though the iPad's display is no easy challenge to balance against either), but that indicates Tegra 4's lack of thermal headroom doesn't leave much flexibility for device makers' designs.
 