NVIDIA Tegra Architecture

Some news on Parker. My information is that it is still on TSMC 16nm (FF/FF+?) and they have not shifted to Samsung as rumoured. It is also expected to tape out sometime this month... or it has possibly already taped out, since my information is slightly dated. If this is correct, we're looking at availability in late Q1'16 at best. My source indicated that it has Denver cores and not Cortex-A72.
Any information on the GPU? Parker was originally going to use Maxwell, and I assume that's still true after its delay, but it would be interesting if it used Pascal now.
 
@AlNets: while it's tempting to think that automotive modules might allow for higher design power because they have room for active cooling, and in some cases that's true, the chip designer still needs to make a design suitable for passive installations.

It will also need to retain full functionality at ambient temperatures that the majority of tablets aren't rated for, and at which they throttle down to near-uselessness.
 
Any information on the GPU? Parker was originally going to use Maxwell, and I assume that's still true after its delay, but it would be interesting if it used Pascal now.

I don't think the first Pascal part to tape out will be a SoC. Pretty sure it's Maxwell again; I would just bet on 3 SMMs instead of the 2 in X1. Anyway, it's unclear whether Pascal is anything more than Maxwell plus HBM, and we won't see HBM in mobile SoCs anyway, but rather Wide I/O 2.
 
I don't think the first Pascal part to tape out will be a SoC. Pretty sure it's Maxwell again; I would just bet on 3 SMMs instead of the 2 in X1.
As Apple SoCs have proven over time, I would prefer to see Parker with 4 SMMs. You sacrifice a bit of silicon area, but you get better power consumption by running the GPU at a lower clock...
 
As Apple SoCs have proven over time, I would prefer to see Parker with 4 SMMs. You sacrifice a bit of silicon area, but you get better power consumption by running the GPU at a lower clock...

That'd be pretty big though, wouldn't it? 16/14FF isn't that much of a die saver over 20nm, so that'd be quite a lot bigger. But I would definitely like to see that, at something like 750-800MHz perhaps.
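For what it's worth, the wide-and-slow argument follows from the usual dynamic-power model P ≈ C·f·V², since a lower clock also permits a lower voltage. Here's a minimal sketch; the voltage/frequency points are made-up illustrative values, not real X1/Parker DVFS data:

```python
# Rough dynamic-power comparison: 2 SMMs at a high clock vs 4 SMMs at a low clock.
# P ~ C * f * V^2; doubling the SMM count roughly doubles switched capacitance C.
# The voltage/frequency pairs below are illustrative guesses, not real DVFS data.

def dyn_power(units, freq_ghz, volts, c_per_unit=1.0):
    """Relative dynamic power: capacitance * frequency * voltage^2."""
    return units * c_per_unit * freq_ghz * volts ** 2

narrow = dyn_power(units=2, freq_ghz=1.0, volts=1.0)   # X1-like: 2 SMMs @ 1 GHz
wide   = dyn_power(units=4, freq_ghz=0.5, volts=0.8)   # same throughput @ 0.5 GHz

print(f"throughput ratio: {4 * 0.5 / (2 * 1.0):.2f}x")   # identical ALU throughput
print(f"power ratio (wide/narrow): {wide / narrow:.2f}")  # ~0.64 with these guesses
```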
 
True; however, all Tegra GPU blocks so far have had record frequencies across the market, even when they were only feeding a handful of smartphones. No idea what their real plans could be, but 32 TMUs for a ULP mobile SoC sounds like overkill even for 2016.
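To put a rough number on the TMU point: peak texel fill rate is approximately TMUs × clock, so 32 TMUs at an X1-like 1 GHz would be about 32 GTexels/s. A quick back-of-the-envelope against a 4K60 display (my assumptions, for illustration only):

```python
# Back-of-the-envelope texel fill rate vs display demand.
# All figures are assumptions for illustration.

tmus, gpu_clock_ghz = 32, 1.0
fill_gtexels = tmus * gpu_clock_ghz            # ~32 GTexels/s peak

w, h, fps = 3840, 2160, 60                     # 4K at 60 Hz
pixels_per_s = w * h * fps                     # ~0.5 Gpixels/s
texels_per_pixel = fill_gtexels * 1e9 / pixels_per_s

print(f"peak fill: {fill_gtexels:.0f} GTexels/s")
print(f"~{texels_per_pixel:.0f} texture samples available per displayed pixel")
```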
 
Nvidia Shield stats & reviews ....


NVIDIA SHIELD Android TV Reviewed: Gaming And The Ultimate 4K Streamer

http://hothardware.com/reviews/nvid...reviewed-the-ultimate-streaming-device?page=1


NVIDIA SHIELD Android TV – The 4K Ultra HD Revolution

http://www.legitreviews.com/nvidia-shield-android-tv-review_164223
 
And how much would the SD810 consume if it had the same sustained performance?

What would be the consumption of a VW Beetle if it had the same horsepower as a Ferrari?

This is kind of a pointless question.
How much would the SD810 consume if it had the same sustained performance? It doesn't have the same performance as a TX1, not even before throttling, so how would you ever measure that, and why would it matter?
 
What would be the consumption of a VW Beetle if it had the same horsepower as a Ferrari?

This is kind of a pointless question.
How much would the SD810 consume if it had the same sustained performance? It doesn't have the same performance as a TX1, not even before throttling, so how would you ever measure that, and why would it matter?
Only 20W for a 2GHz quad-core A57 plus a 1GHz GPU delivering 512 FP32 GFLOPS on 20nm is very impressive in my book. Saying that it consumes 20W without taking the very high sustained performance into account is telling only half of the story, because X1 efficiency is at the top of the market and it can be adapted to a tablet easily.
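For reference, the 512 GFLOPS figure does fall out of the X1's published shader setup: 256 CUDA cores × 2 FLOPs per clock (one FMA) × 1 GHz. A quick sanity check, with the perf/W following from the 20W number above:

```python
# Sanity check on the TX1 figures quoted above.
cuda_cores = 256          # Maxwell GPU in Tegra X1
flops_per_clock = 2       # one FMA = 2 FLOPs per core per clock
gpu_clock_ghz = 1.0

gflops = cuda_cores * flops_per_clock * gpu_clock_ghz   # 512 FP32 GFLOPS
watts = 20.0                                            # whole-SoC figure quoted above
print(f"{gflops:.0f} GFLOPS -> {gflops / watts:.1f} GFLOPS/W at the SoC level")
```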

Too much for a tablet, which was the point.
In a tablet, where it will be clocked lower and will throttle to fit a 10W envelope like everybody else's chips do, it will be suitable and still be miles ahead of anything QC has to offer. Now, is Nvidia pursuing this market? That's the real question...
 
In a tablet, where it will be clocked lower and will throttle to fit a 10W envelope like everybody else's chips do, it will be suitable and still be miles ahead of anything QC has to offer. Now, is Nvidia pursuing this market? That's the real question...

Perhaps, but this wouldn't change the fact that you won't see this level of performance in a tablet, as Ryan said.
 
Only 20W for a 2GHz quad-core A57 plus a 1GHz GPU delivering 512 FP32 GFLOPS on 20nm is very impressive in my book. Saying that it consumes 20W without taking the very high sustained performance into account is telling only half of the story, because X1 efficiency is at the top of the market and it can be adapted to a tablet easily.
You asked why Tegra X1 isn't likely to get a tablet design win and you got your answer.
No one accused TX1 of being inefficient. It's just not a proper chip for a tablet.
They could underclock and undervolt everything towards a 10W consumption, but then it might not even pull far enough ahead of the cheaper TK1 in gaming performance.
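To put rough numbers on that last point: under the crude approximation that voltage tracks frequency (so dynamic power scales roughly with f³), halving the budget from 20W to 10W only costs about 21% of the clock, yet the resulting ~400 GFLOPS is only about 10% above the TK1's ~365 GFLOPS. A sketch under those assumptions:

```python
# Rough estimate: TX1 GPU throughput after scaling the whole budget 20 W -> 10 W.
# Assumes dynamic power ~ f * V^2 with V tracking f, i.e. P ~ f^3 (a crude model),
# and charges the whole power cut to the GPU clock for simplicity.

full_clock_ghz, full_watts, target_watts = 1.0, 20.0, 10.0
scaled_clock = full_clock_ghz * (target_watts / full_watts) ** (1 / 3)  # ~0.79 GHz

tx1_gflops = 256 * 2 * scaled_clock      # ~406 FP32 GFLOPS at ~0.79 GHz
tk1_gflops = 192 * 2 * 0.951             # ~365 GFLOPS (Kepler, 192 cores @ 951 MHz)

print(f"throttled TX1: ~{tx1_gflops:.0f} GFLOPS vs TK1: ~{tk1_gflops:.0f} GFLOPS")
```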

In a tablet, where it will be clocked lower and will throttle to fit a 10W envelope like everybody else's chips do, it will be suitable and still be miles ahead of anything QC has to offer.
You either did side-by-side, apples-to-apples comparisons yourself, limiting power consumption to 10W on both chips using dedicated development boards, or you own a magic crystal ball to be so sure of that.

Now, is Nvidia pursuing this market? That's the real question...
Like all Tegra chips before this one, nVidia is probably pursuing everything they can get their hands on.
The result so far is that nVidia could only find one device in which to place the TX1: their own set-top box. And maybe a Shield 2, also from themselves.

The truth is that the TX1 is a solution looking for a problem. The Android games that are making money don't need high-performing GPUs. In fact, Android game releases with sophisticated 3D graphics are slowing down: Gameloft is going back to cheap movie tie-ins, EA's last demanding 3D game is the >2-year-old Real Racing 3, and they're now betting on Bejeweled and Plants vs. Zombies, etc.
The F2P bubble didn't help either, because at the moment no one is willing to spend much money on Android games, which are always hit-and-miss.

Yes, nVidia is spending money on some exclusive PC ports, but the exclusivity is the problem. nVidia's greed is clouding their ability to understand that, to grow the market and the demand for such games, they'd need to let those games run on other SoCs.
nVidia would rather starve than let the competition take a bite of their investments. So they'll probably starve.
 