Tegra 3 officially announced; in tablets by August, smartphones by Christmas

Discussion in 'Mobile Devices and SoCs' started by Mike11, Feb 16, 2011.

  1. Pottsey

    Pottsey Newcomer

    GL Benchmark does offscreen benchmarking, so all devices render the same scene at the same resolution. All of those games would run just fine on year-old PowerVR GPUs; in fact, some of them, like Shadowrun, already do. The problem for some people is that, for a 2012 SoC, Tegra has little GPU power, possibly being the slowest of the 2012 chips.
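    The point about offscreen benchmarking is simple arithmetic: every device renders into the same fixed-size offscreen target, so the per-frame pixel workload is identical regardless of native screen. A minimal sketch, assuming the common 1280x720 offscreen target and two invented device resolutions:

```python
# Offscreen-benchmark normalization: every device renders the same scene
# into a fixed 1280x720 target, so per-frame pixel counts match even when
# native screens differ. The device list below is hypothetical.

OFFSCREEN = (1280, 720)

devices = {
    "phone_wvga": (800, 480),    # hypothetical WVGA phone
    "tablet_wxga": (1280, 800),  # hypothetical WXGA tablet
}

def pixels(res):
    """Pixels rendered per frame at a given resolution."""
    w, h = res
    return w * h

onscreen = {name: pixels(res) for name, res in devices.items()}
offscreen = {name: pixels(OFFSCREEN) for name in devices}

# Onscreen workloads differ by ~2.67x between these two devices...
ratio_on = onscreen["tablet_wxga"] / onscreen["phone_wvga"]
# ...while offscreen workloads are identical, so scores are comparable.
ratio_off = offscreen["tablet_wxga"] / offscreen["phone_wvga"]

print(ratio_on, ratio_off)  # ~2.67 vs 1.0
```

    This is why onscreen scores flatter low-resolution devices, while offscreen scores isolate raw GPU throughput.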
     
  2. Please define what you mean by "any content", because I don't think you really mean "any content".

    Otherwise, the PSVita wouldn't need a SGX543MP4 with dedicated video memory for a 960*540 screen.
     
  3. Pressure

    Pressure Veteran

    I believe the correct nomenclature is "SGX543MP4+" for the graphics part in the PS Vita.

    One would think Sony is aiming for PS3-like visuals on that 960 x 540 screen at 60 FPS, or something along those lines.
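    As a rough sanity check on that target, the raw pixel-throughput arithmetic works out; all figures below ignore shader load, overdraw and bandwidth, so treat this as a ballpark only:

```python
# Raw pixel-throughput comparison: the Vita's qHD screen at 60 FPS versus
# a typical PS3 title running at 720p/30. Pure fill arithmetic; it says
# nothing about shading cost, overdraw or memory bandwidth.

def pixels_per_second(w, h, fps):
    return w * h * fps

vita = pixels_per_second(960, 540, 60)         # 31,104,000 pix/s
ps3_720p30 = pixels_per_second(1280, 720, 30)  # 27,648,000 pix/s

# The Vita target needs only ~12.5% more raw pixel throughput than a
# 720p/30 console title, which is why "PS3-like visuals at qHD/60"
# is at least arithmetically plausible.
print(vita / ps3_720p30)  # 1.125
```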

    But I digress: I find the biggest let-down of the Tegra 3 to be its GPU performance, and that will be the Achilles' heel of the platform.

    It will still take ICS to make the animations seem smooth. Overall, I still think a package like the Apple A5 is more desirable.
     
  4. sebbbi

    sebbbi Veteran

    Just one more year and we'll have enough performance to run 1:1 console ports at 720p. As a game developer I would really appreciate that, since it would make porting (high-quality) console games to mobile phones a lot more straightforward. You cannot spend millions of dollars developing a single game just for mobile phones... But if it were possible to port those games at 1:1 quality (graphics, physics, gameplay, AI, online features, etc.), we could have a massive improvement in the quality of games on mobile devices.
     
  5. Entropy

    Entropy Veteran

    So when is that "good enough" point reached for a significant number of consumers? Now? 28nm? 20nm?
    My iPad 2 already has slightly better performance than the 17" lamp iMac my elderly mother uses for all her computing and web-surfing needs. By 20nm we are down to the same game players and video transcoders that have justified all advancements on the desktop for the better part of the last decade.

    People may want to connect to a bigger screen for a lot of reasons, and to other input devices for efficiency of input, obviously. Connecting to a bigger screen is basically the issue, since input can already be taken care of via Bluetooth if need be. Most people just don't/won't need a laptop/desktop system for the sake of its computing power, only for the sake of backing up their mobile units. And that task can be taken over by the cloud (and possibly dedicated copy stations/disks).

    My contention is this: the upcoming generation of mobile devices will have sufficient computing resources to be the sole computing device of their owners. Inertia (on many fronts) is the only thing keeping x86 PCs from a much sharper decline in volume than what we are seeing.
     
  6. tangey

    tangey Veteran

    I wonder whether it'll be a case of OSes starting to bloat as SoCs get more powerful, mimicking what happened on the desktop.

    Also, there are probably many application areas still waiting for the right performance point before they can become mature technologies in handsets. In a few years' time we might be laughing at the limitations of Siri, and I know that TI is big on gesture recognition using the front-facing camera; the latter might be something that can benefit from GPU power via OpenCL.

    As smartphones are one of the (possibly THE) major leading-edge segments driving innovation, it might not be good for the industry overall if there is a maturity point perhaps only two or so years from now.
     
  7. Entropy

    Entropy Veteran

    Oh, you should never underestimate the power of inertia as a driving force!
    But it's a good idea to recognize it for what it is.
     
  8. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    He probably means any current or upcoming tablet/smart-phone target content.

    The PSVita is not only a handheld console, meaning that it has to handle console ports as Pressure already mentioned, but it also has to last probably at least half a decade. Are there any =/>5-year design cycles in the smart-phone/tablet markets? More like on a yearly basis, I'd say.

    In terms of pure paper specs, the upcoming TI OMAP5 and Apple's next SoC exceed the theoretical performance of the 543MP4+ in the PS Vita. The first will drive =/>720p screens; for the latter, rumors about a very high-resolution screen have been circulating forever.

    But to come back to metafor's actual point: the PSVita SoC is manufactured on Samsung's 45nm process. Take a guess how large it is, and whether you could today cram something that consumes a handful of Watts into a smart-phone.
     
  9. wco81

    wco81 Legend

    Are there such ports being worked on or contemplated?

    Yeah, given the prevailing ASPs of mobile games, nobody will spend millions on a mobile game.

    That would be the one trend that would support continued development of more powerful SoCs: more people relying on their mobile devices as their main computing platform, which they could hook up to bigger displays and keyboards or game controllers for general computing or entertainment purposes.
     
  10. The announcement of GTA 3 for Android and iOS could be the beginning of an era of cheap console and PC ports.
    I wouldn't mind that at all, to be honest. Proper controls will always be an issue for many games on tablets with only a multi-touch screen, though.
     
  11. wco81

    wco81 Legend

    What would it take to pair a bluetooth controller to a device?

    Could a third party write the libraries and profiles for a common BT controller like the DS3?

    Or would there have to be support at the OS level? Towards the end of the PS2's life, some companies came out with peripherals and announced free SDKs that game developers could use to integrate support for them -- I think it was a headset and maybe some kind of keyboard on a controller.

    Of course, Apple could put the kibosh on any game app that tried to use such tools -- or maybe they don't allow full access to the Bluetooth stack on iOS?

    You would think Google would welcome all comers, though developers and publishers have been more wary of console-quality games on Android.
     
  12. metafor

    metafor Regular

    You'd think so. But as with laptops, somehow the battery size just doesn't scale with screen size. And looking at comparable Android phones with differing screen sizes, that seems to have remained the case. I agree that they should be able to pack bigger batteries into bigger-screened phones, but alas...

    I really don't know of many websites that suffer from less-than-instantaneous load times due to something other than the network. Engadget may be the most complex example I can think of, and most of the time that has to do with the response time of the ad servers.

    WebGL games may be another example, but again, that's more of a GPU limitation, which, I agree, can always be better.

    My point isn't so much cost as in per-die cost; it's more cost as in opportunity cost. That is, instead of making such a high-powered CPU on a high-performance process with a beefy power-distribution grid to support it, one could have designed a light, low-power CPU and used the rest of the die area for more GPU pipelines.
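    That opportunity-cost argument can be made concrete with a toy area budget. Every number below is invented purely for illustration; real SoC floorplans are not public at this granularity:

```python
# Toy die-area budget illustrating the CPU-vs-GPU trade-off described
# above. All mm^2 figures are hypothetical, chosen only to show the
# shape of the trade-off, not to model any actual SoC.

DIE_AREA = 80.0    # hypothetical total SoC area, mm^2
OTHER_IP = 40.0    # hypothetical memory controller, video blocks, I/O, ...

big_cpu = 24.0       # hypothetical high-clocked multi-core CPU complex
small_cpu = 12.0     # hypothetical lighter, lower-power CPU complex
gpu_pipeline = 3.0   # hypothetical area per extra GPU pixel pipeline

def gpu_budget(cpu_area):
    """Area left over for the GPU after the CPU and other IP blocks."""
    return DIE_AREA - OTHER_IP - cpu_area

extra_pipelines = (gpu_budget(small_cpu) - gpu_budget(big_cpu)) / gpu_pipeline
print(extra_pipelines)  # 4.0: the smaller CPU frees room for 4 more pipelines
```

    With these made-up figures, halving the CPU complex frees enough area for four additional GPU pipelines at the same die size; the real question is which of the two budgets better matches the target workloads.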

    I think those kinds of things will be more automated in the future, with something similar to how Nehalem regulates its core frequency/voltage settings.
     
  13. metafor

    metafor Regular

    Well no, not "any content" as in go-wild-with-ray-tracing. But for titles that can be released on a mobile platform -- that is, you're targeting a ~4.3" screen -- I don't think developers are that hard-pressed for more compute power. I could be wrong, of course.
     
  14. Lazy8s

    Lazy8s Veteran

    The Dark Meadow is a pretty UE3 game for iOS, and I adored the smoothness of its rock-solid 60 fps on the iPad 2, especially as the player is constantly whipping the camera view around to explore the environments.

    Moderate CPU advances are fine, but focusing on the evolution of the GPU will be very much appreciated for future SoC development, even if it were only for games.
     
  15. Wishmaster

    Wishmaster Newcomer

    Last edited by a moderator: Nov 9, 2011
  16. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

  17. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    I'd expect as much from a company like NVIDIA, whose core business is actually graphics. My guess is that it's easier and cheaper that way. Granted, NV is not the only one carrying the same architecture over several SoCs (from Tegra 1 to 3), but that base architecture doesn't, at least to me, suggest a whole lot of resources spent in the GPU direction.
     
  18. frogblast

    frogblast Newcomer

    I highly doubt Nvidia deliberately chose to de-prioritize the GPU. I would be shocked if they knowingly designed an uncompetitive product.

    I speculate that their forecasts expected much weaker GPU competition, and they thus chose to keep the chip small to save on costs. About a year and a half ago, they realized they needed a full GPU architecture refresh to compete, which requires a multi-year lead time to get to market. They needed something else to sell in the meantime, and ARM provides turn-key solutions for putting down more CPU cores in much less time than a GPU refresh.

    The result is that the "CPU is dead" company sings the virtues of CPU speed. Sell what you have, downplay what you don't.
     
  19. swaaye

    swaaye Entirely Suboptimal Legend

    It'll be interesting to see if they come up with a bombshell GPU in the future. They certainly have significant graphics technology to leverage against everyone else, and they seem to be getting pretty serious about games on these devices, as shown by Tegra Zone for example. But I'm sure there is a limit, as with desktop IGPs, where it won't make sense to add more GPU power because few users would appreciate it and so the value is nil.
     
  20. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    Well yes and no. Architectures like SGX had been announced and analyzed years ago and IMG wasn't the only one that went multi-core. I even think that ARM was the first to announce multi-core GPU IP before Series5XT.

    http://imgtec.com/News/Release/index.asp?NewsID=449

    First announced in March 2009; a tad less than 3 years ago.

    I won't say that CPU power is unnecessary; rather the exact opposite. The only thing I'd personally wish for is a finer balance between CPU and GPU resources.

    I'm less worried about any competitor's hw in that regard; what might become a major headache for them down the line is rather sw, support and devrel. The resources and experience are already there at NV, and I'd dare to claim at this point that they're the only embedded contender that doesn't face any Win8 driver considerations.
     