Nvidia Tegra

Discussion in 'Mobile Devices and SoCs' started by Frontino, Apr 15, 2008.

  1. metafor

    metafor Regular

    In a vacuum, sure. But I noticed you chose specific examples. What if the design decision involved 12 hours vs 10? What if for that 2 hours of battery life you give up, you get something for it? Like a more powerful GPU or faster memory? Hell, how about it just cost more?

    At some point, lowering the power of the SoC becomes an afterthought compared to other areas to focus on. Nobody out there is going to (or realistically can) build the absolute perfect device. At some point, you'll have to prioritize.

    Also, if you add in the 2W for the RF radio, 1W for the memory, 1W for the flash, etc., the comparison stops being 10 hours vs 8 hours and becomes 10 hours vs 9.2 hours or thereabouts. Your assumption also treated SoC power in isolation. In reality, we can at best expect something like 350mW vs 1W, and only as peak consumption.
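
    The arithmetic here can be sketched with a simple model. The battery capacity and the exact component figures below are assumed for illustration; only the 350mW-vs-1W SoC range and the 2W + 1W + 1W platform figures come from the discussion above:

```python
# Rough battery-life model: once fixed platform power (radio, memory,
# flash) is added, a difference in SoC power matters far less for total
# runtime. Battery capacity is an assumed illustrative figure.

def battery_life_hours(battery_wh, soc_w, other_w):
    """Runtime = stored energy / total average draw."""
    return battery_wh / (soc_w + other_w)

BATTERY_WH = 5.0            # assumed small smartphone battery
OTHER_W = 2.0 + 1.0 + 1.0   # RF radio + memory + flash (figures above)

# SoC considered alone: a large relative gap
lo = battery_life_hours(BATTERY_WH, 0.35, 0.0)
hi = battery_life_hours(BATTERY_WH, 1.0, 0.0)

# With the rest of the platform included: the gap shrinks dramatically
lo_sys = battery_life_hours(BATTERY_WH, 0.35, OTHER_W)
hi_sys = battery_life_hours(BATTERY_WH, 1.0, OTHER_W)

print(f"SoC only:     {lo:.1f} h vs {hi:.1f} h")
print(f"Whole system: {lo_sys:.2f} h vs {hi_sys:.2f} h")
```

    The SoC-only comparison overstates the runtime difference by roughly the ratio of platform power to SoC power, which is the point being made.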
     
  2. silent_guy

    silent_guy Veteran Subscriber

    Your RF radio, memory and flash chips are using 0.5um technology?
     
  3. Arun

    Arun Unknown. Legend

    I don't know if this is what he meant, but nearly everyone's 3G Power Amplifier isn't made on a leading-edge bulk process, to say the least! ;) And (for reasons unrelated to the process) those obviously take a lot of power during a phone call. In fact, the GSM (2G) spec requires the device to amplify a signal to a strength higher than is possible within USB 2.0's 2.5W maximum, so all 2G/3G data dongles need a bunch of capacitors for that reason alone!

    The 130nm 3G CMOS PAs are pretty cool though, hopefully they'll take over the world one day. Right now they're only targeting the low-cost 2100MHz market but in theory they could certainly do high performance multi-band down the road, not sure what everyone's roadmap looks like.

    And certainly the DRAM is going to take a lot of power, but I agree it's not realistic to expect 1W of DRAM power for 250mW of CPU power even with a very efficient CPU system... unless you're testing a memcpy, but I thought we were talking real-world here! :p
     
  4. 3dcgi

    3dcgi Veteran Subscriber

    You might have read about Word Lens. It seems like such an app.

    Word Lens: How Future Hardware Will Enable Mobile Apps
     
  5. metafor

    metafor Regular

    In the real world, the CPU isn't going to run anywhere close to its maximum intrinsic power draw either :)

    These are simplified numbers, but my point stands. At some point, SoC power becomes an afterthought.
     
  6. Lazy8s

    Lazy8s Veteran

    I hadn't heard about Word Lens yet. Very awesome. Thanks!
     
  7. silent_guy

    silent_guy Veteran Subscriber

    SoC: 800mW?
    LPDDR2: 250mW
    RF WiFi: 200mW? Rarely used continuously.
    RF GSM: maybe 2W, but that's peak, and it's never used all the time, both at the low-level protocol and in high-level use cases.
    Flash: don't know, but 1W is ridiculous, especially since the use/idle ratio is extremely low.

    So, yeah, when doing things that consume the most power, like playing games, the SoC is a major part of the power consumption.
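
    The budget above can be turned into a duty-cycle-weighted average. The duty-cycle figures below are assumptions chosen to match the "rarely used continuously" caveats, and the flash figure is a placeholder; only the peak numbers come from the list above:

```python
# Duty-cycle-weighted average power for the budget above. Peak draw
# alone is misleading: the GSM radio's 2W peak contributes little when
# it is active only a few percent of the time. Duty cycles are assumed.

components = {
    # name: (peak_watts, assumed_duty_cycle)
    "SoC (gaming)": (0.80, 1.00),
    "LPDDR2":       (0.25, 1.00),
    "WiFi":         (0.20, 0.10),   # rarely used continuously
    "GSM radio":    (2.00, 0.05),   # peak only, heavily duty-cycled
    "Flash":        (0.10, 0.02),   # extremely low use/idle ratio
}

avg = {name: peak * duty for name, (peak, duty) in components.items()}
total = sum(avg.values())

for name, watts in sorted(avg.items(), key=lambda kv: -kv[1]):
    print(f"{name:14s} {watts:5.3f} W ({100 * watts / total:4.1f}% of total)")
print(f"{'Total':14s} {total:5.3f} W")
```

    Under these assumptions the SoC ends up well over half the average silicon power in a gaming workload, which is the claim being made.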
     
  8. Periander

    Periander Newcomer

  9. metafor

    metafor Regular

    No, it isn't. The SoC isn't "used continuously" either. Also, you've not taken the display into account. It seems you're just playing a skewed numbers game to prove your point. Even in your own figures, the CDMA/GSM/UMTS radio dominates the power equation.

    In reality, the display will be the biggest portion of power, followed by the SoC if it is performing a computationally intensive task (which, in a typical smartphone, is never the case). Most of the time, I would say the WiFi (Atheros's chipset is about 600mW) and 3G radio chips are the most power-hungry silicon.
     
  10. silent_guy

    silent_guy Veteran Subscriber

    All I did was come back to your initial use case: the display is in use.

    Do you agree that there are a LOT of use cases where the display is on AND the SoC is at full power and nothing else is? E.g., graphics-intensive games that are not internet connected?

    Yes?

    Well, in that case your display uses 2W and the SoC uses, say, 800mW.

    Sounds like a non-trivial amount to me, and worth optimizing, but then, what do I know?
     
  11. metafor

    metafor Regular

    That's not "a lot" of use cases in modern smartphones. At least, not yet. And even then, the CPU is generally not taxed. The GPU and memory controller aren't going to push 800mW. Nowhere close.

    It's not. Look at a modern smartphone design and tell me which one is chosen based on the SoC's active power (so long as it's below, say, 1W) at the expense of performance, features or price. If power were the primary discriminant, Tegra 2 would be shunned.
     
  12. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

  13. Lazy8s

    Lazy8s Veteran

    While the GPU isn't a huge power draw on the system, performance per unit of power consumed is the main criterion licensees use to make their selection and the primary design objective for the IP companies.
     
  14. Arun

    Arun Unknown. Legend

    That's a pretty big leak, and Ailuros we both know 3x faster graphics is way too vague to guess what the architecture is really like, it might be utterly boring or extremely exciting. We'll know sooner or later. Personally the part I'm most curious about is 'ULP CPU Mode'. As far as I can tell it's either very boring or very exciting, and for now I'm betting on the former with a small hope for the latter.

    Also, BSN is comedy gold as always. The writer doesn't know the difference between 2010 and 2011 (thinks 4Q10 hasn't happened yet) and claims Project Denver will run the entire OS on a GPU core, but that's nothing compared to this gem:
    Bwahahaha! :D This might be a bit mean, but my theory is that the only reason anyone ever leaks anything to BSN is to get a good laugh out of their analysis.
     
  15. Exophase

    Exophase Veteran

    Does Tegra 3 really get to claim the world's first mobile quad core when the i.MX6 was already announced? I mean, so long as paper launches count on both ends.

    Nice quote Arun, I wonder if tilers will ever beat shaders :D
     
  16. Arun

    Arun Unknown. Legend

    NV claims Tegra 3 started sampling in 4Q10, whereas Freescale implied the i.MX6x wasn't even sampling yet, so yes, they do get to claim that. Although if it's true that the PSP2 is also quad-core, then that would have taped out first. Since that's a proprietary solution, it wouldn't really be comparable though.
     
  17. rpg.314

    rpg.314 Veteran

    I'd be disappointed if they don't unify the CPU/GPU memory space after having total control over the design.

    I have half a mind to sig that one. :grin:
     
  18. rpg.314

    rpg.314 Veteran

    Also from that piece,

    I suppose a short memory is also bliss. :)
     

  19. Hum? As far as I know, all we'll get this week from Sony is an initial paper launch.
    Nintendo also pre-announced the 3DS back in March 2010, while the console came out a year later. That said, the PSP2's CPU/GPU/SoC may not have taped out yet.

    Besides, if we take a look at the fast-as-hell X360's development timeframe as an example, much of the PSP2's hardware could still be pending some major decisions.
     
  20. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    The slide looks real to me. It'll have "scalar" ALUs, but we won't tell BSN, agreed?

    How about the Apple A4 containing a Mali-55?

    ROFL :D
     