Windows tablets

Ah, I see. So wouldn't the Krait also have higher IPC, all other things being equal? And the A15 have higher frequency potential for the same reason?

All other things aren't equal :)

However, I heard the A15 has twice the execution units or something (8 vs. 4).
So shouldn't that give it higher IPC, and conversely mean the Krait can scale to higher frequencies?? Confused...

The A15 is a lot bigger and uses a lot more hardware throughout the pipe, similar to a classic desktop OoOE architecture. Krait is by comparison much simpler, though its execution resources are similar (9-way execution unit vs. 8 on the A15, I believe).

In reality, it's really difficult to tell a uarch's average IPC based solely on execution resources.
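
To make that concrete (a purely illustrative sketch, nothing from the posts above): a wide core only achieves high IPC when the instruction stream has enough independent work to fill its execution ports, while a loop-carried dependency chain crawls along at roughly one useful operation per cycle no matter how many ports the machine has.

```cpp
// Illustrative only: why issue width alone doesn't determine IPC.
#include <cstdint>
#include <cstdio>

int main() {
    const int N = 1 << 20;

    // Serial dependency: every add waits on the previous result, so the adds
    // can't execute faster than one per cycle, however wide the back end is.
    uint64_t dep = 0;
    for (int i = 0; i < N; ++i)
        dep += i;

    // Independent accumulators: a wide out-of-order core can overlap these
    // chains and retire several adds per cycle.
    uint64_t a = 0, b = 0, c = 0, d = 0;
    for (int i = 0; i < N; i += 4) {
        a += i;
        b += i + 1;
        c += i + 2;
        d += i + 3;
    }

    std::printf("%llu %llu\n",
                (unsigned long long)dep,
                (unsigned long long)(a + b + c + d));
    return 0;
}
```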
 
Right. So it looks like Qualcomm has been very clever about this: they have provided IPC competitive with the A15, even if it loses out in some scenarios, whilst likely being more compact and power efficient.

Sounds like the perfect processor to me. :D

How much bigger do you think these next-gen cores are going to be compared to A9s? Same process.
 
Remember, we haven't seen anything but marketing numbers from either Qualcomm or ARM. And pretty vague ones at that. They're pretty hard to trust verbatim.

Nothing really matters until you see decent benchmarks running on real hardware. Sadly, in the mobile space there aren't a lot of decent benchmarks being run, so you kind of have to take what you can get.
 
More importantly, relevant power numbers for similar platforms (with the same software stack and power management profile) really aren't reviewed at all. At best, the reviewer will turn the phone on and run it through loops.

With display and WiFi radios taking so much power, power savings due to the CPU can often go unnoticed -- unless you do something absurd like run 3W/core.
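
To put rough numbers on that (purely assumed figures for illustration, not measurements from any review): if the display and radios draw a couple of watts while the CPU averages well under a watt, even a large relative CPU improvement barely dents total platform power.

```cpp
// Back-of-the-envelope platform power budget; all figures are assumptions.
#include <cstdio>

int main() {
    const double display_w = 1.5;  // assumed display power (W)
    const double radios_w  = 0.8;  // assumed WiFi + cellular power (W)
    const double cpu_w     = 0.4;  // assumed average CPU power (W)

    const double total    = display_w + radios_w + cpu_w;
    const double improved = display_w + radios_w + cpu_w * 0.7;  // CPU 30% better

    std::printf("CPU share of platform power: %.0f%%\n", 100.0 * cpu_w / total);
    std::printf("Whole-platform saving from a 30%% CPU improvement: %.1f%%\n",
                100.0 * (total - improved) / total);
    return 0;
}
```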
 
Unless one architecture has a big advantage, manufacturers are going to go by costs.

Of course AMD chips are less costly than Intel chips but the latter have a big performance advantage as well as better marketing support.
 
If that were true, nobody would be making handsets with separate modem chips.
 
Unless one architecture has a big advantage, manufacturers are going to go by costs.

Of course AMD chips are less costly than Intel chips but the latter have a big performance advantage as well as better marketing support.

Is that really true? AMD seems to have a large GPU advantage while Intel has a large CPU advantage. However, a 17W Trinity should be a compelling choice vs. a 17W Ivy Bridge.
 
If that were true, nobody would be making handsets with separate modem chips.

To be fair, the only company that does 100% integrated modem chips really well is Qualcomm. And modem chips usually aren't dictated by the manufacturer, but the carrier you're building the phone for. For example, I'm sure Samsung would love to stick Exynos inside every Galaxy S II product they make, but it doesn't support 42Mbps HSPA+ or LTE. So even in their home market of South Korea, their most popular phone has to use a 3rd party SoC to get LTE support. Not exactly ideal, and it's something I expect to be resolved in 2012.
 
Qualcomm's modems pretty much cover every major market's carrier protocol -- with a few exceptions here and there. My point is, that is a major cost saver, yet it doesn't guarantee them exclusivity when it comes to handhelds. Obviously there's a market where a slightly bigger cost is acceptable for some sort of technical advantage elsewhere, be it a faster discrete modem -- in the case of 42Mbps HSPA+ -- or a faster apps processor like Exynos.
 
DX10 GPU on ARM? I thought the plan was to go to market with SoCs like OMAP 5 which only have DX 9.3

You're getting confused by the sloppy naming conventions which Microsoft has allowed. Here's a quote that explains it better than I ever could, as I asked the same question:

It's actually more complex than that. When it comes to programming for Direct3D11, there are a number of different GPU feature level targets. The idea is that developers will write their application in DX11, and then have customized render backends to target each feature level they want to hit.

As it stands there are 6 feature levels: 11, 10_1, 10, 9_3, 9_2, and 9_1. Unfortunately everyone has been lax in their naming standards; DirectX and Direct3D often get thrown around interchangeably, as do periods and underscores in the feature levels (since prior to D3D 11, we'd simply refer to the version of D3D). This is how you end up with DirectX 9.3 and all permutations thereof. The article has been corrected to be more technically accurate to clear this up.

In any case, 9_1 is effectively identical to Direct3D 9.0. 9_3 is somewhere between D3D 9.0b and 9.0c; it implements a bunch of extra features like multiple render targets, but the shader language is 2.x (Vertex Shader 2.0a, Pixel Shader 2.0b) rather than 3.0

Source: http://www.anandtech.com/show/4940/qualcomm-new-snapdragon-s4-msm8960-krait-architecture/3 (3rd comment)
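
For anyone curious how those feature levels surface to a developer, this is roughly what the negotiation looks like (a sketch against the public Direct3D 11 API; error handling omitted): the app hands the runtime a list of acceptable levels, gets back the highest one the GPU supports, and picks a render backend accordingly.

```cpp
// Sketch of Direct3D 11 feature-level negotiation (error handling omitted).
#include <windows.h>
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

int main() {
    // One Direct3D 11 codebase, several possible hardware targets.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,  D3D_FEATURE_LEVEL_9_2,  D3D_FEATURE_LEVEL_9_1,
    };

    ID3D11Device*        device  = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL    chosen  = D3D_FEATURE_LEVEL_9_1;

    // The runtime returns the highest level in the list the GPU supports.
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      levels, ARRAYSIZE(levels), D3D11_SDK_VERSION,
                      &device, &chosen, &context);

    // 'chosen' tells the renderer which backend (11, 10_x, 9_x) to target.
    if (context) context->Release();
    if (device)  device->Release();
    return 0;
}
```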
 
Some minimum specs (same for ARM and x86): 1366x768 display, DX10 GPU, USB 2.0 port (at least one), Wi-Fi, Bluetooth, webcam (720p), accelerometer/gyroscope. So it seems that 3G/4G/GSM modem and GPS tracking are both optional features.

Are those for laptops or tablets?
 
Hmm, that resolution is standard for all Windows laptops. Do they even make 7-inch or 10-inch displays with those resolutions?

Probably the latter.

How well will W8 scale between different resolutions? Will W8 apps be resolution-independent?
 
DX10 GPU on ARM? I thought the plan was to go to market with SoCs like OMAP 5 which only have DX 9.3
PowerVR SGX545 supports DX10.1 (http://www.dvhardware.net/article21698.html), PowerVR Rogue supports DX11.1 (http://www.anandtech.com/show/5364/powervr-series-6-rogue-gpus-released-to-licensing). ARM Mali-T604 supports DX11 (http://www.tomshardware.com/news/ARM-GPU-GPGPU-Mali-T604-Mali-400,11616.html). Tegra 4 supports DX11 (http://www.hi-technonews.com/nvidia-tegra-4-to-8-core-processing-directx-11.html). All the new ARM SoCs that will be released later this year (and onwards) will support either DX10 or DX11.

Do they even make 7-inch or 10-inch displays with those resolutions?
Galaxy Note already has 5.3" (16:10) 1280x800 display. 7" display at 1366x768 would have lower DPI, so it should pose no problems (only 3% more pixels compared to Galaxy Note). The rumors also say next Galaxy Tab will have a 2560x1600 display and next iPad will have a 2048x1536 display, so the Win8 resolution requirements are not especially high.
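
For reference, the arithmetic behind those numbers (using the panel sizes and resolutions mentioned above): the 7" 1366x768 panel has a noticeably lower pixel density than the Galaxy Note, and only a few percent more total pixels.

```cpp
// Quick check of the DPI and pixel-count comparison above.
#include <cmath>
#include <cstdio>

// Pixels per inch of a panel, given its resolution and diagonal size.
static double ppi(double w, double h, double diag_inches) {
    return std::sqrt(w * w + h * h) / diag_inches;
}

int main() {
    const double note_ppi = ppi(1280, 800, 5.3);  // Galaxy Note, 5.3"
    const double win8_ppi = ppi(1366, 768, 7.0);  // a 7" 1366x768 tablet

    const double extra_pixels =
        100.0 * (1366.0 * 768.0 / (1280.0 * 800.0) - 1.0);

    std::printf("Galaxy Note: %.0f PPI, 7\" 1366x768 panel: %.0f PPI\n",
                note_ppi, win8_ppi);
    std::printf("Extra pixels vs. the Galaxy Note: %.1f%%\n", extra_pixels);
    return 0;
}
```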
 
What's the minimum for a non-16:9 display then, or is that option out?

Galaxy Note already has 5.3" (16:10) 1280x800 display. 7" display at 1366x768 would have lower DPI, so it should pose no problems. The rumors also say next Galaxy Tab will have a 2560x1600 display and next iPad will have a 2048x1536 display, so the Win8 resolution requirements are not especially high.

It won't be a problem necessarily, but it might keep the price up for a while. You won't see a lot of bargain Windows tablets.
 
How well will W8 scale between different resolutions? Will W8 apps be resolution-independent?

With Metro there is no need for, or (I believe) ability to support, old legacy Windows applications. Hence, in theory everything developed for Metro should be resolution independent and thus able to scale gracefully with the available resolution of the device.

Those legacy applications are the biggest hurdle to a truly resolution-independent Windows environment, something I've been wanting for the longest time, especially as I grow older and my vision inevitably grows worse. Being able to scale the Windows environment down or up without applications doing weird things, because they were designed to only work at a certain desktop DPI, would be a huge boon.

Regards,
SB
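
As a rough illustration of what resolution independence buys (a generic sketch, not Metro's actual API): the app expresses layout in device-independent units, and a DPI-derived scale factor maps them to physical pixels, so the same layout renders at a sensible physical size on any panel.

```cpp
// Generic illustration of DPI-based UI scaling; not Metro's actual API.
#include <cstdio>

// Map a length in device-independent units (here defined as 1/96 inch)
// to physical pixels for a panel of the given density.
static int to_pixels(double device_independent_units, double panel_dpi) {
    return static_cast<int>(device_independent_units * panel_dpi / 96.0 + 0.5);
}

int main() {
    const double button_height_dip = 48.0;  // layout specified once, in DIPs

    // The same layout rendered on panels of different density.
    std::printf("96 DPI laptop:  %d px\n", to_pixels(button_height_dip, 96.0));
    std::printf("135 DPI tablet: %d px\n", to_pixels(button_height_dip, 135.0));
    std::printf("285 DPI phone:  %d px\n", to_pixels(button_height_dip, 285.0));
    return 0;
}
```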
 