NVIDIA Tegra Architecture

More demanding games could help, but at the moment the market makes sure that any increase in GPU performance is eaten away by resolution increases.

Just because a smartphone or tablet has native resolution N, it doesn't necessarily mean that all mobile games actually run at that resolution. Besides, desktop gaming hasn't stopped at 1440p either; watch how things evolve once more 4K displays appear and 8K waltzes into town after that.

Back to NV, an issue they face is that one of their main advantages, software, doesn't do much for them in those markets. Simply put, most tasks are not that demanding, and the entry bar for "good enough" is getting easier and easier to reach.

You can have the best sw in the world, but if your hw stinks, the former is not going to save the day, and vice versa. NVIDIA has both excellent hw and drivers/support. The ULP SoC market, with the high number of large semiconductor manufacturers operating in it, is practically ideal for IP sales. So it shouldn't surprise anyone that ARM and IMG are as successful as they are at selling intellectual property. On top of that, NV, Intel and the like jumped on the ULP train too late; back then, folks were probably still giggling about smartphones and the mythical "potential" many saw in them, which has since blown up in their faces.
 
Most apps and games are not demanding, games are not reviewed, driver overhead is not (and can't be) measured, and so on.
As for Intel or NVIDIA jumping in too late and the market being ideal for IP sales: NV started the Tegra and embedded product lines a good while ago; whether the execution was always there is another matter.
As for Intel, I would not sell the bear's skin before the bear is dead. I haven't an inch of doubt that Intel will successfully find its place, a good one, in the mobile market; their strengths are simply overwhelming, on pretty much every front from finances to engineering to production capacity.

I quit the discussion; I think we are going way past the subject of the topic at hand. I hope AnandTech's review of the Nexus 9 comes out soon.
 
Yeah, I've been running Android 5.0 for a couple of days. Battery life seems quite a bit better; in fact everything is better, except the graphics when you turn it off (the old fade-out TV effect), it just boringly fades out now.
 
The Shield Tablet achieves 300+ minutes of battery life in that test when capped at 30fps.

Let's then cap all devices at 30fps for a bullshit metric and magically double battery life everywhere.
 

Since most tablets cannot even reach 30fps, they will consume exactly the same power as before, with no "magically doubled battery life".

For the ones that do have the performance, it is up to the user to either enable or disable the 30fps cap.

If gameplay at 30fps is not noticeably different to the user compared to uncapped, then I see no reason not to enable the cap and enjoy longer play time.
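A back-of-envelope sketch of the argument above: under a simple linear power model, a 30fps cap does nothing for a device that can't reach 30fps, but stretches battery life on one that would otherwise run at 60fps. The model and all numbers here are illustrative assumptions, not measured figures for any real tablet.

```python
def capped_fps(uncapped_fps: float, cap: float) -> float:
    # A device that can't reach the cap keeps rendering exactly as before.
    return min(uncapped_fps, cap)

def battery_minutes(capacity_wh: float, base_w: float,
                    w_per_fps: float, fps: float) -> float:
    # Hypothetical linear model: total draw = baseline power (screen, SoC idle)
    # plus a per-frame rendering cost. Real power curves are not this tidy.
    total_w = base_w + w_per_fps * fps
    return capacity_wh / total_w * 60

# Slow device (peaks at 25fps): the cap changes nothing.
slow = battery_minutes(20.0, 2.0, 0.05, capped_fps(25, 30))        # same as uncapped

# Fast device (60fps uncapped): capping at 30fps halves the rendering work.
fast_uncapped = battery_minutes(20.0, 2.0, 0.05, 60)               # ~240 min
fast_capped = battery_minutes(20.0, 2.0, 0.05, capped_fps(60, 30)) # ~343 min
```

Note that even for the fast device the cap doesn't literally "double" battery life, because the baseline power draw (screen, radios) is unaffected by the frame rate.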
 

The Air 2 would, and considering its sales volumes, it's not that far from "most tablets" for this generation of high-end tablets, unlike others.

http://gfxbench.com/result.jsp?benc...vice&os-check-OS_X_gl=0&os-check-Windows_gl=0


Is there a vote going on anywhere on whether vendors should suddenly promote a 30fps cap for everything? If NV wants to implement it, that's their can of beer, but it would still be wiser to design for lower power consumption from the get-go.
 
Has Nvidia sorted out its manufacturing issues (i.e. cracked case problems) for the Shield Tablet? I want to pick one up, but the problem is concerning. I wouldn't mind a refresh with a slightly better screen as well.

I thought the Nexus 9 would be my next tablet, but the lack of external microSD expansion killed that for me.
 
I doubt there'll ever be a Google (or Apple) device with µSD support.

If they ever make a budget device, they probably would. Otherwise, they'd rather put just enough storage in base models and then either charge extravagant fees for models with more of it or, for Google at least, get people to store their data in the cloud, which would provide various benefits to Google, assuming the storage is hosted by Google, as well as potential kickbacks from mobile internet providers for driving more traffic and getting people to pay more for internet on their devices.

Regards,
SB
 
ROFL. Since it's nothing more than a reworded CPU whitepaper (source: NVIDIA), I brushed through it quickly; I'm sure the Apple A7 is on 28"HPm" and contains an "SGX5MP4", and Erista will also be on the very same process. Word has it that the first Erista variant taped out quite a while ago on TSMC 20SoC and that it contains a 4*A57 + 4*A53 big.LITTLE config.
 

Don't see the need for ROFLing. Denver is an interesting CPU, I'd love to see more information about it, and this had some things that were new to me. This article has nothing to do with Erista except a tiny bit of speculation, I don't care if their guesses about Erista are wrong.
 
It doesn't deliver anything on top of NV's own Denver material. Did the author test and analyze the CPU himself, or is all the supplied data from the same source? Anyone can copy/paste a whitepaper and at best reword the whole thing to make it sound like something "interesting". It's not only that the author's guesses about Erista are wrong; he doesn't seem able to get other facts straight either. If you want me to cry over this piece of a wannabe article, I'll gladly do that too. It ain't worth more.
 
Isn't the Linley Group that company that charges manufacturers to release marketing material disguised as independent analysis?
 