Google tells me there was this HTC Qilin for China Mobile back in 2009, but it was a WM6.x device, not Android (many of the old Windows Mobile HTC smartphones used OMAP SoCs, BTW).
I doubt HTC intends to abandon Qualcomm, rather the contrary. I'm not aware of HTC's real intentions, but for any manufacturer to be too dependent on only one source doesn't sound healthy to me.
Nonetheless, it's been at least two years since HTC used anything other than a Qualcomm SoC for any of their smartphones.
In addition to OMAP3 phones (mostly for China, IIRC) as Ailuros said, HTC is now using the ST-Ericsson A9500 with ST-E's TD-SCDMA baseband for China Mobile.
If confirmed, isn't this HTC's first Android smartphone without a Qualcomm SoC?
As much as I love everything iPhone, I haven't seen a mainstream application that really requires all the GPU power of an iPhone 4S. Even if Android is notoriously more GPU-demanding to work around its inefficient real-time SW stack to get stutter-free scrolling (the next version of Android, no doubt), I don't see how the vast majority of users are going to benefit from all this graphics power.
Lazy8s said: I'm far more optimistic on the CPU side of Tegra 3 than I was before from the standpoint of efficiency (but not performance), after listening better to their explanation of how processing will be balanced.
Tegra 3 will still be a let-down, though, due to the underwhelming graphics performance. Tegra 3+ is kind of funny when you think about it in context.
As much as I love everything iPhone, I haven't seen a mainstream application that really requires all the GPU power of an iPhone 4S. Even if Android is notoriously more GPU-demanding to work around its inefficient real-time SW stack to get stutter-free scrolling (the next version of Android, no doubt), I don't see how the vast majority of users are going to benefit from all this graphics power.
Other than games (which are already pretty stunning on an iPhone 4), is there really any application that makes use of it? I can't think of anything.
I'd prefer higher CPU efficiency personally; if I am to trust metafor's word (and I don't see why I shouldn't), Tegra 3 is not going to have it easy against Krait, despite the latter being dual-core. Had T3 truly shipped in August as once projected, quite a few things might have been different.
From that POV, more CPU performance seems to be much better.
Instagram supposedly uses the GPU to accelerate image filters; I heard someone from their team state they're more interested in the GPU increase in the iPhone 4S than the CPU (a rough sketch of that kind of per-pixel math follows below).
As much as I love everything iPhone, I haven't seen a mainstream application that really requires all the GPU power of an iPhone 4S. Even if Android is notoriously more GPU-demanding to work around its inefficient real-time SW stack to get stutter-free scrolling (the next version of Android, no doubt), I don't see how the vast majority of users are going to benefit from all this graphics power.
Other than games (which are already pretty stunning on an iPhone 4), is there really any application that makes use of it? I can't think of anything.
From that POV, more CPU performance seems to be much better.
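For what it's worth, the kind of math a photo filter applies is just a per-pixel colour transform, which is exactly the sort of work a GPU chews through in parallel. A rough illustration, plain NumPy on the CPU rather than anything Instagram actually ships (the matrix is a commonly cited sepia transform, used here purely as an example):

    # Toy per-pixel colour transform of the sort a photo filter applies.
    # On a GPU this would run as a fragment shader over every pixel at once;
    # here it's just NumPy on the CPU for illustration, not any app's real code.
    import numpy as np

    def apply_filter(image: np.ndarray) -> np.ndarray:
        """image: H x W x 3 array of floats in [0, 1]."""
        sepia = np.array([[0.393, 0.769, 0.189],
                          [0.349, 0.686, 0.168],
                          [0.272, 0.534, 0.131]])
        out = image @ sepia.T          # same 3x3 multiply for every pixel
        return np.clip(out, 0.0, 1.0)  # keep values in displayable range

    if __name__ == "__main__":
        frame = np.random.rand(1080, 1920, 3)   # stand-in for a camera frame
        print(apply_filter(frame).shape)        # (1080, 1920, 3)

The point is simply that each output pixel depends only on its own input pixel, so the work maps perfectly onto a GPU.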
As much as I love everything iPhone, I haven't seen a mainstream application that really requires all the GPU power of an iPhone 4S. Even if Android is notoriously more GPU-demanding to work around its inefficient real-time SW stack to get stutter-free scrolling (the next version of Android, no doubt), I don't see how the vast majority of users are going to benefit from all this graphics power.
Other than games (which are already pretty stunning on an iPhone 4), is there really any application that makes use of it? I can't think of anything.
From that POV, more CPU performance seems to be much better.
Quite honestly, I think -- without some sort of new use-case for phones -- that we've hit the saturation point for performance needed. Yes, things like Flash will always drain as much processing power as you have but that's a load of software inefficiency, not a task that is inherently computationally intensive.
My own personal preference from now on would be lower active power.
"Saturation point for performance needed"? That's an utopia for portable devices, at least until the wireless communications become so fast and cheap that everything is done through the cloud.
I'd rather have a smartphone that doubles as a personal computer and workstation when connected to a large screen and mouse+keyboard.
Agreed, although I think my 'good enough' point is probably higher than yours. With a 2GHz 2xA15 (or 2.5GHz 2xKrait) you've got about 3x the average performance of a 1GHz 2xA9 for all workloads. That jump is still going to be a very perceptible one in web browsing even excluding Flash.
Quite honestly, I think -- without some sort of new use-case for phones -- that we've hit the saturation point for performance needed. Yes, things like Flash will always drain as much processing power as you have but that's a load of software inefficiency, not a task that is inherently computationally intensive.
My own personal preference from now on would be lower active power.
Agreed, and there are also severe TDP limitations for any handheld device even when plugged in. Where's the (expensive & thick) heatsink? Where's the active (even if practically silent) fan? And where's the air moving through? If there's not enough air, is the chassis able to sustain sufficiently high temperatures? And do you really want your device to be (literally) burning hot when plugged in? The ability to plug in a screen & keyboard/mouse is very nice, but there's no way you'll be competitive with a 17" notebook.
metafor said: That would be a "new use-case". I really don't think that's in very high demand currently; especially with things such as wireless syncing and mirroring.
After all, what would you prefer? A phone that's high-powered enough to act as a PC but has to be in low-power mode when not plugged in (from a silicon standpoint, you can't always have *best* of both worlds).
Agreed, although I think my 'good enough' point is probably higher than yours. With a 2GHz 2xA15 (or 2.5GHz 2xKrait) you've got about 3x the average performance of a 1GHz 2xA9 for all workloads. That jump is still going to be a very perceptible one in web browsing even excluding Flash.
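Back-of-the-envelope version of that 3x figure, assuming roughly 1.5x per-clock throughput for A15/Krait over A9 (my assumption for illustration, not a measured number):

    # Rough sanity check of the ~3x claim: clock ratio times an assumed
    # per-clock (IPC) advantage for Cortex-A15/Krait over Cortex-A9.
    a9_clock_ghz = 1.0
    a15_clock_ghz = 2.0
    assumed_ipc_gain = 1.5   # assumption for illustration only

    speedup = (a15_clock_ghz / a9_clock_ghz) * assumed_ipc_gain
    print(f"Estimated speedup: {speedup:.1f}x")   # ~3.0x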
Agreed, and there are also severe TDP limitations for any handheld device even when plugged in. Where's the (expensive & thick) heatsink? Where's the active (even if practically silent) fan? And where's the air moving through? If there's not enough air, is the chassis able to sustain sufficiently high temperatures? And do you really want your device to be (literally) burning hot when plugged in? The ability to plug in a screen & keyboard/mouse is very nice, but there's no way you'll be competitive with a 17" notebook.
There's a push for bigger screens and higher-resolution displays on phones. Not sure how large a market that is, since those kinds of components usually require more GPU power and push the overall price of the device to the high end.
But there must be enough of a market or all these Android manufacturers wouldn't keep pushing the bar higher on screen size and resolution. So that means GPUs and SOCs continue to become more powerful.
Hey, I would think so too but at the Arstechnica forum, one guy imported the Galaxy Note from Germany for a cool $818.
There must be enough people demanding these kinds of oversized devices. I'm not completely unsympathetic; smartphones are more interesting to me as handheld computers, not phones per se.
Samsung and HTC seem to be doing well producing these high end phones. If they can no longer cite higher clock speeds or bigger screens, how else would they differentiate from products they shipped 3 months ago, and therefore justify new purchases?
But then they'd be sued by Apple for copying their ideas (sorry, couldn't resist - I do feel it is noteworthy that the iPhone 4S has the slowest Cortex-A9 implementation to date despite their Intrinsity acquisition [which it's not clear they've used yet] - and that comes with the battery life benefit you'd expect). I don't think bigger screens hurt battery life though, since they allow you to fit a correspondingly bigger battery.
"You can use it for a full day now"
I'm mostly thinking of WiFi web browsing here. I think the 'good enough' level is higher than you claim because everyone has a desktop computer and so experiences on a daily basis how much faster it can be. And web pages are still increasing slightly in complexity (even before considering Flash), so a bit of future-proofing can't hurt.
I could easily see 2x 1.5GHz A9 (or 4x 1.5GHz A9 for tasks that are excessively parallel) being good enough for just about everything I do on the phone. Yes, there may be a small benefit to moving to 2GHz+ A15/Krait, but truth be told, if it weren't for the nerd-factor of owning a faster processor, I'd be hard-pressed to justify a purchase.
But then they'd be sued by Apple for copying their ideas (sorry, couldn't resist - I do feel it is noteworthy that the iPhone 4S has the slowest Cortex-A9 implementation to date despite their Intrinsity acquisition though - and that comes with the battery life benefit you'd expect). I don't think bigger screens hurt battery life though since they allow you to fit a correspondingly bigger battery.
Wait a second; from the pre-release Transformer 2 results that leaked for a while in the GLBenchmark database, Tegra 3 was scoring somewhere in the 53 fps league in Egypt 720p, which is roughly 60% of the SGX543MP2 in the iPad 2 (and there's a reason why I'm comparing a tablet with another tablet SoC). I wouldn't call that kind of performance meager or small at all.
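Quick check on that percentage. The ~88 fps reference isn't taken from any published result here; it's simply the iPad 2 score implied by reading "53 fps is roughly 60%" backwards:

    # Backing out the comparison above: 53 fps for Tegra 3 against whatever the
    # iPad 2's SGX543MP2 was scoring in GLBenchmark Egypt 720p at the time.
    # The ~88 fps reference is just what the "roughly 60%" claim implies.
    tegra3_fps = 53.0
    ipad2_mp2_fps = 88.0   # assumed reference implied by the 60% figure

    ratio = tegra3_fps / ipad2_mp2_fps
    print(f"Tegra 3 at {ratio:.0%} of the iPad 2's SGX543MP2")   # ~60%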