Nvidia Tegra

Ok, let's go back to your initial argument: 2W display. 500mW CPU.

Those 500mW reduce the use time of your gadget from 10 to 8 hours. Or from 8 to 6.4 hours.
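
(Back-of-the-envelope, with hypothetical fixed-energy packs and a constant draw, just to show where those numbers come from:)

Code:
# Illustrative only: fixed battery energy, constant average draw.
def runtime_h(battery_wh, power_w):
    return battery_wh / power_w

for battery_wh in (20.0, 16.0):            # hypothetical pack sizes
    print(runtime_h(battery_wh, 2.0),      # 2W display alone
          runtime_h(battery_wh, 2.5))      # display + 500mW CPU
# prints 10.0 8.0 and 8.0 6.4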

When the media reports on tablets, one of the things that is always prominently mentioned is battery life: 10h on the iPad vs 6h on the Galaxy Tab vs the rumored (and denied) horrible use time on the PlayBook.

You don't reduce power in gigantic gobs at a time. Once the low-hanging fruit is gone, it's hard, tedious work and you have to fight for each mW. You can bet that integrators take notice when you're able to shave 50mW from a common use case.

In a vacuum, sure. But I noticed you chose specific examples. What if the design decision involved 12 hours vs 10? What if, for those 2 hours of battery life you give up, you get something in return? Like a more powerful GPU or faster memory? Hell, what if the lower-power part simply costs more?

At some point, lowering the power of the SoC becomes an afterthought compared to other areas to focus on. Nobody out there has built (or realistically will build) the absolute perfect device. At some point, you have to prioritize.

Also, if you add in the 2W for the RF radio, 1W for the memory, 1W for the flash, etc., that number stops being 10 hours vs 8 hours and becomes 10 hours vs 9.2 hours or some change. Your assumption also treated the baseline SoC power as zero. In reality, we can at best expect something like 350mW vs 1W, and only as peak consumption.
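
(Plugging those peak numbers into the same sort of toy model shows where the 9.2 hours comes from; these are the rough figures above, not measurements:)

Code:
# Toy model using the rough peak numbers above; not measurements.
base_w = 2.0 + 2.0 + 1.0 + 1.0      # display + RF radio + memory + flash
battery_wh = base_w * 10            # pack sized so the base case lasts 10h
print(battery_wh / base_w)          # 10.0 hours
print(battery_wh / (base_w + 0.5))  # ~9.2 hours with the 500mW CPU added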
 
Also, if you add in the 2W for the RF radio, 1W for the memory, 1W for the flash, etc., that number stops being 10 hours vs 8 hours and becomes 10 hours vs 9.2 hours or some change.
Your RF radio, memory and flash chips are using 0.5um technology?
 
Your RF radio, memory and flash chips are using 0.5um technology?
I don't know if this is what he meant, but most everyone's 3G Power Amplifier isn't made on a leading-edge bulk process, to say the least! ;) And (for reasons unrelated to the process) those obviously take a lot of power during a phone call. In fact, the GSM (2G) spec requires the device to be able to amplify a signal to a strength higher than is possible within USB 2.0's 2.5W maximum, so all 2G/3G data dongles need to have a bunch of capacitors for that reason alone!

The 130nm 3G CMOS PAs are pretty cool though, hopefully they'll take over the world one day. Right now they're only targeting the low-cost 2100MHz market but in theory they could certainly do high performance multi-band down the road, not sure what everyone's roadmap looks like.

And certainly the DRAM is going to take a lot of power, but I agree it's not realistic to expect 1W DRAM power for 250mW of CPU power even with a very efficient CPU system... unless you're testing a memcpy, but I thought we were talking real-world here! :p
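
(For the curious, a rough sizing sketch with assumed numbers: a GSM class 4 handset bursts at up to ~2W of RF output in one 577µs slot out of eight, and with PA efficiency and overhead the instantaneous draw can easily exceed USB 2.0's 5V x 500mA = 2.5W, so bulk capacitance has to bridge each burst.)

Code:
# Rough sizing sketch, assumed numbers only (not a real design).
usb_w   = 5.0 * 0.5                 # USB 2.0 budget: 2.5W
burst_w = 6.0                       # assumed total draw during a GSM TX burst
burst_s = 577e-6                    # one TDMA slot
droop_v = 0.25                      # voltage sag we tolerate on the 5V rail

extra_i = (burst_w - usb_w) / 5.0        # extra current the caps must supply
cap_f   = extra_i * burst_s / droop_v    # C = I * dt / dV
print(cap_f)                             # ~1.6e-3 F, i.e. on the order of a millifarad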
 
A mobile computer such as a smartphone accompanies a person through their daily interactions with others and the surrounding world, so the demand for it to get better at tasks like real-time facial and speech recognition, real-time language translation, image recognition, and biometrics will only continue to grow. Powerful GPGPU and CPU processing will continue to be very important.
You might have read about Word Lens. It seems like such an app.

Word Lens: How Future Hardware Will Enable Mobile Apps
 
I don't know if this is what he meant, but most everyone's 3G Power Amplifier isn't made on a leading-edge bulk process, to say the least! ;) And (for reasons unrelated to the process) those obviously take a lot of power during a phone call. In fact, the GSM (2G) spec requires the device to be able to amplify a signal to a strength higher than is possible within USB 2.0's 2.5W maximum, so all 2G/3G data dongles need to have a bunch of capacitors for that reason alone!

The 130nm 3G CMOS PAs are pretty cool though, hopefully they'll take over the world one day. Right now they're only targeting the low-cost 2100MHz market but in theory they could certainly do high performance multi-band down the road, not sure what everyone's roadmap looks like.

And certainly the DRAM is going to take a lot of power, but I agree it's not realistic to expect 1W DRAM power for 250mW of CPU power even with a very efficient CPU system... unless you're testing a memcpy, but I thought we were talking real-world here! :p

In the real world, the CPU isn't going to run anywhere close to its maximum intrinsic power draw either :)

These are simplified numbers, but my point stands. At some point, SoC power becomes an afterthought.
 
In the real world, the CPU isn't going to run anywhere close to its maximum intrinsic power draw either :)

These are simplified numbers, but my point stands. At some point, SoC power becomes an afterthought.
SOC: 800mW?
LPDDR2: 250mW
RF wifi: 200mW? Rarely used continuously.
RF GSM may be 2W, but that's peak, and it's never used all the time, both at the low-level protocol and in high-level use cases.
Flash: Don't know, but 1W is ridiculous, especially since the use/idle ratio is extremely low.

So, yeah, when doing things that consume the most power, like playing games, the SOC is a major part of the power consumption.
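
(Putting those rough numbers together for the offline-gaming case, with duty cycles guessed purely for illustration:)

Code:
# Rough average split for an offline 3D game; powers from above, duty
# cycles are guesses, not measurements.
parts = {                      # (active power in W, fraction of time active)
    "display": (2.0, 1.0),
    "SOC":     (0.8, 1.0),
    "LPDDR2":  (0.25, 1.0),
    "WiFi":    (0.2, 0.0),     # not used in an offline game
    "GSM":     (2.0, 0.02),    # idle paging only, a guess
}
avg = {name: p * d for name, (p, d) in parts.items()}
total = sum(avg.values())
for name, w in sorted(avg.items(), key=lambda kv: -kv[1]):
    print(f"{name:8s} {w:.2f}W  {100 * w / total:.0f}%")
# display ~65%, SOC ~26%: the two biggest consumers by far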
 
SOC: 800mW?
LPDDR2: 250mW
RF wifi: 200mW? Rarely used continuously.
RF GSM may be 2W, but that's peak, and it's never used all the time, both at the low-level protocol and in high-level use cases.
Flash: Don't know, but 1W is ridiculous, especially since the use/idle ratio is extremely low.

So, yeah, when doing things that consume the most power, like playing games, the SOC is a major part of the power consumption.

No, it isn't. The SoC isn't "used continuously" either. Also, you've not taken the display into account. It seems you're just trying to play a skewed numbers game to prove your point. Even in your own equation, the CDMA/GSM/UMTS radio is the main power draw.

In reality, the display will be the biggest portion of power, followed by the SoC if it is performing a computationally intensive task (which in a typical smartphone is never the case). Most of the time, I would say the WiFi (Atheros's chipset is about 600mW) and 3G radio chips are the most power-hungry silicon.
 
metafor said:
No, it isn't. The SoC isn't "used continuously" either. Also, you've not taken the display into account. It seems you're just trying to play a skewed numbers game to prove your point. Even in your own equation, the CDMA/GSM/UMTS radio is the main power draw.

In reality, the display will be the biggest portion of power, followed by the SoC if it is performing a computationally intensive task (which in a typical smartphone is never the case). Most of the time, I would say the WiFi (Atheros's chipset is about 600mW) and 3G radio chips are the most power-hungry silicon.

All I did was come back to your initial use case: the display is on.

Do you agree that there are a LOT of use cases where the display is on AND the SOC is at full power and nothing else? E.g. graphics-intensive games that are not internet connected?

Yes?

Well, in that case your display uses 2W and the SOC uses, say, 800mW.

Sounds like a non-trivial amount to me, and worth optimizing, but then, what do I know?
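
(Same sort of toy model with an assumed 20Wh pack, just to put a number on "worth optimizing":)

Code:
# Toy model, assumed 20Wh pack: what halving the SOC's draw would buy.
battery_wh = 20.0
print(battery_wh / (2.0 + 0.8))   # ~7.1h at display 2W + SOC 800mW
print(battery_wh / (2.0 + 0.4))   # ~8.3h with the SOC at 400mW, ~17% more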
 
All I did was come back to your initial use case: the display is on.

Do you agree that there are a LOT of use cases where the display is on AND the SOC is at full power and nothing else? E.g. graphics-intensive games that are not internet connected?

Yes?

That's not "a lot" of use cases in modern smartphones. At least, not yet. And even if so, the CPU in general is not taxed. The GPU and memory controller aren't going to push 800mW. Nowhere close.

Well, in that case your display uses 2W and the SOC uses, say, 800mW.

Sounds like a non-trivial amount to me, and worth optimizing, but then, what do I know?

It's not. Look at a modern smartphone design and tell me which one discriminates based on the SoC's active power (so long as it's below, say, 1W) in lieu of performance, features or price. If power were the primary discriminant, Tegra 2 would be shunned.
 
http://www.brightsideofnews.com/new...-tegra-2-3d-in-january2c-tegra-3-by-fall.aspx

[Image: Nvidia Tegra 2011 roadmap]


Nothing surprising considering the CPU. 3x faster graphics? :rolleyes:
 
While the GPU isn't a huge power draw on the system, performance per watt is the main criterion licensees use to make their selection and the primary design objective for the IP companies.
 
That's a pretty big leak, and Ailuros we both know 3x faster graphics is way too vague to guess what the architecture is really like, it might be utterly boring or extremely exciting. We'll know sooner or later. Personally the part I'm most curious about is 'ULP CPU Mode'. As far as I can tell it's either very boring or very exciting, and for now I'm betting on the former with a small hope for the latter.

Also BSN is comedy gold as always. The writer doesn't know the difference between 2010 and 2011 (thinks 4Q10 hasn't happened yet) and claims Project Denver will run the entire OS on a GPU core, but that's nothing compared to this gem:
http://www.brightsideofnews.com/news/2011/1/23/nvidia-thinks-world-domination-tegra-2-3d-in-january2c-tegra-3-by-fall.aspx?pageid=1 said:
The problem with such approach is that the competition uses Tile-Based Rendering principle, while nVidia utilizes its ultra-successful GeForce architecture and more efficient Shader principle.
Bwahahaha! :D This might be a bit mean, but my theory is that the only reason anyone ever leaks anything to BSN is to get a good laugh out of their analysis.
 
Does Tegra 3 really get to claim to be the world's first mobile quad core when the i.MX6 was already announced? I mean, so long as paper launches count on both ends.

Nice quote Arun, I wonder if tilers will ever beat shaders :D
 
Does Tegra 3 really get to claim to be the world's first mobile quad core when the i.MX6 was already announced? I mean, so long as paper launches count on both ends.
NV claims Tegra3 started sampling in 4Q10 whereas Freescale implied i.MX6x wasn't even sampling yet, so yes, they do get to claim that. Although if it's true that the PSP2 is also quad-core, then that would have taped-out first. Since that's a proprietary solution it wouldn't really be comparable though.
 
That's a pretty big leak, and Ailuros we both know 3x faster graphics is way too vague to guess what the architecture is really like, it might be utterly boring or extremely exciting.
I'd be disappointed if they don't unify the CPU/GPU memory space after having total control over the design.

Bwahahaha! :D This might be a bit mean, but my theory is that the only reason anyone ever leaks anything to BSN is to get a good laugh out of their analysis.
I have half a mind to sig that one. :D
 
NV claims Tegra3 started sampling in 4Q10 whereas Freescale implied i.MX6x wasn't even sampling yet, so yes, they do get to claim that. Although if it's true that the PSP2 is also quad-core, then that would have taped-out first. Since that's a proprietary solution it wouldn't really be comparable though.


Hum? As far as I know, all we'll get this week from Sony is an initial paper launch.
Nintendo also pre-announced the 3DS back in March 2010, while the console came out a year later. That said, the PSP2's CPU/GPU/SoC may have not taped out yet.

Besides, if we take a look at the fast-as-hell X360's development timeframe as an example, much of the PSP2's hardware could still be pending some major decisions.
 
That's a pretty big leak, and Ailuros we both know 3x faster graphics is way too vague to guess what the architecture is really like, it might be utterly boring or extremely exciting. We'll know sooner or later. Personally the part I'm most curious about is 'ULP CPU Mode'. As far as I can tell it's either very boring or very exciting, and for now I'm betting on the former with a small hope for the latter.

The slide looks real to me. It'll have "scalar" ALUs, but we won't tell BSN, agreed?

Also BSN is comedy gold as always. The writer doesn't know the difference between 2010 and 2011 (thinks 4Q10 hasn't happened yet) and claims Project Denver will run the entire OS on a GPU core, but that's nothing compared to this gem:
Bwahahaha! :D This might be a bit mean, but my theory is that the only reason anyone ever leaks anything to BSN is to get a good laugh out of their analysis.
How about the Apple A4 containing a Mali-55?

Hum? As far as I know, all we'll get this week from Sony is an initial paper launch.
Nintendo also pre-announced the 3DS back in March 2010, while the console came out a year later. That said, the PSP2's CPU/GPU/SoC may have not taped out yet.

ROFL :D
 