Windows tablets

Do you think we are going to see 8W Llano chips this year? The leaked list only shows a low of 65W for Llano.

Not a chance, IMO. Llano is being done on 32nm SOI in GF's Fab 1, which I think is geared towards high-performance, high-power (and probably high-leakage) chips.
Notice that no CPU clock speeds have been announced for Llano yet, but the 65W TDP remains even for the lowest-specced dual-core + 160-shader GPU.
 
So it doesn't seem like AMD will have more compelling tablet chips this year, then. Oh well, I guess I can wait until next year.
 
As for the OS, did you happen to notice that this thing actually features a Windows button and a Start menu? This interface paradigm is as misplaced here as it was in Windows Mobile, and the sooner some touch-friendly replacement is standardised on, the better. But honestly, Acer just isn't the kind of company that has either the ambition or the pull to make that happen.

Check out the ExoPC; they have a very attractive and functional touch UI shell that is quite responsive and, IMO, far better than any of the Android or iOS UI shells. They've also started licensing it out to other slate makers, but I'm not sure it's going to gain much traction.

Even with that, I still prefer a Windows desktop.

A desktop OS running on a tablet doesn't work. It becomes too quirky and fiddly to be usable, and nobody (except a few weird people) ends up liking it. All Windows tablets released so far have been dismal or spectacular failures, every single one of them, even those that seemed to have everything going for them in terms of style and execution (like the OQO, for example).

Purely an opinion. The ExoPC was sold out for almost four months from launch. The HP Slate was sold out for roughly 2-3 months after launch. The ASUS slate continues to sell out.

None of those can be considered a failure, although there are some growing pains with the early slates. Despite that, they continue to sell well.

Obviously not in iPad numbers, but then again, just like MP3 players easily outsold laptops, a dedicated media consumption slate is going to sell more than a slate that is capable of everything a laptop is capable of.

The market for what is basically a media consumption device is obviously going to be far larger than that for devices capable of far more. But that doesn't mean there is no market for those devices, and it certainly doesn't mean they are failures.

What's basically happening now with Windows based tablets is that they're trying to copy what Apple is doing but cannot do it as well because they're using a desktop OS and GUI that was never designed for touch usage.

My god, I hope MS utterly fails in that arena. I DO NOT WANT yet another iPad. If I wanted one, I'd buy one. But the iPad has absolutely zero attraction for me. IMO, MS would be best served by continuing to evolve the Windows tablet ecosystem. Perhaps adopt a touch-friendly UI shell (like the ExoPC shell, for example) but leave the underlying system intact, as well as the ability to dismiss the "touch-friendly" (ugh) shell in favor of the much more functional Windows desktop.

And as far as I'm concerned, the Windows default UI is already quite touch-friendly as it is, giving far easier access to multiple applications and far easier access to multi-tasked applications than either iOS or Android.

The only time you get into issues with touch-only use is in applications that were never designed with touch in mind. For example, Adobe Reader's bookmarks pane features extremely small tree-expansion icons. But that's a problem with the applications, NOT with the Windows UI.

One thing I found interesting: while Adobe Reader takes the longest to load initially out of all the PDF readers, it also happens to be the fastest Windows PDF reader by far on low-power devices. Weird.

Regards,
SB
 
http://www.engadget.com/2011/04/17/asus-eee-pad-slider-making-the-jump-from-tegra-2-to-atom-z670/




Sounds interesting. I wonder how well the new Atom will do with 1080p playback and other things.



Another thing I was reading is that the Bobcat cores are supposed to be capable of a 1W power draw. Obviously, on the 8W C-50, the two 1GHz cores are using much less power than the GPU. Why isn't AMD selling quad cores in both the 1GHz and 1.6GHz flavors, if it adds so little size and power to such a chip?
 
Sounds interesting. I wonder how well the new Atom will do with 1080p playback and other things.

The VXD should handle 1080p decoding just fine, but the SGX535 (shudders) in Oak Trail should make it nearly useless for almost any kind of 3D gaming.
Plus, AFAIK there haven't been any architectural changes in the Atom core, so the performance should still be roughly the same as the 3-year-old Atom N270. And that's just not good enough for a "Windows experience", IMO.
Using any of the new hardware-accelerated browsers, even the C-30 should run circles around that Atom Z670 for web browsing, and 3D games are probably an order of magnitude faster.



Another thing I was reading is that the Bobcat cores are supposed to be capable of a 1W power draw. Obviously, on the 8W C-50, the two 1GHz cores are using much less power than the GPU. Why isn't AMD selling quad cores in both the 1GHz and 1.6GHz flavors, if it adds so little size and power to such a chip?
The cores may consume that little power, but then there's the L2 cache, northbridge I/O functions, memory controller, etc., which should also take a big part of the power budget.
For some reason, AMD managed to make a 5W C-50 just by laser-cutting some USB and SATA ports and limiting the memory options, while leaving the Bobcats and GPU at full performance.

Quad-core Bobcats are already in the pipeline (Krishna), but for the current Brazos platform (the first APU iteration in the market), I think AMD just wanted to get a fully functional chip out for a good-enough Windows experience. We can agree they managed that just fine :)
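A back-of-the-envelope sketch of that budget argument, using the ~1W-per-core figure quoted above; the uncore number here is purely an illustrative assumption, not measured data:

```python
# Rough TDP budget for a 9 W Bobcat-class APU. CORE_W comes from the
# "1 W per core" figure in the post above; UNCORE_W (L2, northbridge,
# memory controller, I/O) is an assumed, illustrative value.
TDP_W = 9.0
CORE_W = 1.0      # per Bobcat core at 1 GHz (post's figure)
UNCORE_W = 2.5    # assumed: L2 + northbridge + memory controller + I/O

def gpu_budget(cores):
    """Watts left for the GPU after the cores and uncore take their share."""
    return TDP_W - cores * CORE_W - UNCORE_W

print(gpu_budget(2))  # dual-core leaves 4.5 W for the GPU
print(gpu_budget(4))  # quad-core leaves only 2.5 W
```

With numbers anywhere in that ballpark, two extra cores cost the GPU roughly half its power budget, which is one plausible reason a quad-core doesn't show up at this TDP.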


My god, I hope MS utterly fails in that arena. I DO NOT WANT yet another iPad. If I wanted one, I'd buy one. But the iPad has absolutely zero attraction for me.

I hope the same as you, but there's currently a mass hysteria among software and hardware houses to blatantly copy everything Apple does.
The almighty Microsoft completely dumped a 10-year-old ecosystem with PC-like functionality in mobile devices just to make a blatant copy of iOS with the functionality of a dumbphone. The almighty Nokia decided to do the same, even though WP7 sales are piss-poor.
Microsoft has stated several times that their plan for tablets is to go with the full Windows 7/8, but dumbing down to the phone version for smaller tablets wouldn't be all that surprising, and would just follow the same trend as everyone else.



It scares me a little that Apple seems to be the only company that's been confident enough to launch "disruptive" products, and all the others are just following like sheep.
Is the IT world lacking in imagination that much?!
 
BTW, look what the Easter Bunny brought us for Q3:


400MHz Cedar in a 9W APU, now we're talking :D

I guess we should expect the Zacates in Q3 to also enable Turbo Core, for up to 2GHz Bobcats and ~650MHz Cedar.
 
If the C-50 and C-60 use the same process, I wonder how they can achieve the same TDP while turboing both the GPU and CPU (the page you linked wonders the same).
 
I doubt this alone would allow increasing both the CPU clock by 33% and the GPU clock by 45%, unless power is already quite bad on the C-50. Let's wait for the release and see :)

BTW, I wonder what the difference between the HD 6250 and HD 6290 is. Only the frequency?
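For reference, those percentages fall straight out of the clock ratios, assuming the C-50 bases discussed in this thread (1.0GHz CPU, 276MHz GPU) against the C-60's turbo clocks:

```python
# Where the 33% / 45% figures come from. The 276 MHz C-50 GPU base is
# an assumption consistent with the quoted 45% (a 280 MHz base would
# give 43% instead).
def pct_increase(base_mhz, turbo_mhz):
    """Percentage clock increase, rounded to the nearest integer."""
    return round((turbo_mhz / base_mhz - 1) * 100)

print(pct_increase(1000, 1333))  # CPU: 1.0 -> 1.333 GHz = 33%
print(pct_increase(276, 400))    # GPU: 276 -> 400 MHz  = 45%
```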
 
I doubt this alone would allow increasing both the CPU clock by 33% and the GPU clock by 45%, unless power is already quite bad on the C-50. Let's wait for the release and see :)

BTW, I wonder what the difference between the HD 6250 and HD 6290 is. Only the frequency?

You do know that Turbo doesn't work that way. It's a single core or the GPU that will increase its clocks while the others are downclocked.
 
You do know that Turbo doesn't work that way. It's a single core or the GPU that will increase its clocks while the others are downclocked.
I know it has to be a trade-off. But, for instance, on Intel's latest chips the trade-off is more fine-grained than just the CPU as a whole; some parts of the CPU can be clock-gated.
I just hope AMD won't have to downclock the CPU too much while upclocking the GPU, because usually when GPU demand is high, CPU demand is somewhat high too.
 
I doubt this alone would allow increasing both the CPU clock by 33% and the GPU clock by 45%, unless power is already quite bad on the C-50. Let's wait for the release and see :)

TurboCore, like Intel's TurboBoost, analyses the overall system's power consumption, temperature readings and CPU/GPU load, and clocks the various parts for maximum performance while staying within the TDP.

It's probably possible to have both Bobcats at 1.3GHz and the Cedar at 400MHz at the same time, since power consumption isn't dictated by frequency alone.

For example, Intel offers Turbo overclocks with no sleeping cores in all Nehalem i7 CPUs. The i7-720QM is stated as a 1.6GHz quad-core CPU, but the Turbo offers a 1x multiplier increment with all four cores enabled, so it's practically a 1.73GHz quad-core (limited only by temperature constraints, I'd guess).
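That budget-driven behaviour can be sketched with a toy model; every coefficient below is a made-up illustrative number, not AMD's actual TurboCore data:

```python
# Toy TurboCore model: greedily raise GPU and then CPU clocks in small
# steps while an estimated package power stays under the TDP cap.
# All coefficients are invented for illustration only.

def estimate_power(cpu_mhz, gpu_mhz):
    """Crude linear power model: fixed uncore cost plus per-MHz costs."""
    UNCORE_W = 2.0          # assumed uncore floor
    CPU_W_PER_MHZ = 0.002   # assumed cost for the core pair
    GPU_W_PER_MHZ = 0.006   # assumed GPU cost
    return UNCORE_W + cpu_mhz * CPU_W_PER_MHZ + gpu_mhz * GPU_W_PER_MHZ

def turbo_clocks(tdp_w, cpu=1000, gpu=276, cpu_max=1333, gpu_max=400, step=10):
    """Raise the GPU first, then the CPU, without exceeding the TDP."""
    while gpu + step <= gpu_max and estimate_power(cpu, gpu + step) <= tdp_w:
        gpu += step
    while cpu + step <= cpu_max and estimate_power(cpu + step, gpu) <= tdp_w:
        cpu += step
    return cpu, gpu

cpu, gpu = turbo_clocks(9.0)
print(cpu, gpu)  # with these assumed coefficients, both clocks end near max
```

The point of the sketch: if per-MHz costs are modest relative to the cap, both the CPU and GPU can sit near their turbo clocks at the same time inside the TDP, which is the argument being made above.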


BTW, I wonder what the difference between the HD 6250 and HD 6290 is. Only the frequency?
One is stuck at 280MHz, the other can Turbo up to 400MHz.



I know it has to be a trade-off. But, for instance, on Intel's latest chips the trade-off is more fine-grained than just the CPU as a whole; some parts of the CPU can be clock-gated.
I just hope AMD won't have to downclock the CPU too much while upclocking the GPU, because usually when GPU demand is high, CPU demand is somewhat high too.
I'm almost 100% sure the C-60 will always be faster than the C-50, no matter what software is running.
Unless you're using the C-60 in a very hot place and the temperature threshold kicks in.
 
The VXD should handle 1080p decoding just fine, but the SGX535 (shudders) in Oak Trail should make it nearly useless for almost any kind of 3D gaming.

There's a crap-load of applications out there in the mobile space that aren't shader-bound but are fill-rate-bound; again, it might measure up badly against higher-end SGX variants, but at that frequency it's anything but a slouch compared to many embedded GPUs out there. Start counting how many embedded GPUs really have 800MTexels/s of raw fill-rate, let alone effective fill-rate.

No one ever claimed it'll be amongst the "hottest" embedded GPUs of its time, but ruling out any 3D gaming is a wild exaggeration.
 
There's a crap-load of applications out there in the mobile space that aren't shader-bound but are fill-rate-bound; again, it might measure up badly against higher-end SGX variants, but at that frequency it's anything but a slouch compared to many embedded GPUs out there. Start counting how many embedded GPUs really have 800MTexels/s of raw fill-rate, let alone effective fill-rate.

No one ever claimed it'll be amongst the "hottest" embedded GPUs of its time, but ruling out any 3D gaming is a wild exaggeration.

It's an x86 Windows device, so I wasn't comparing it to embedded GPUs in SoCs for Android/iOS devices. 800MTexels/s goes back to the 11-year-old Radeon or GeForce2 MX.

Compared to a C-30/C-50, the 400MHz SGX535 is an order of magnitude slower and hardly of any use even for last-gen DX8.1 and first-gen DX9 titles.

I still wonder if the GMA600 is any faster than the original 945G IGP in Calistoga.

And I never claimed Intel said Oak Trail would bring a decent GPU. It's just too little, too late for Q2 2011, IMHO.
 
It's an x86 Windows device, so I wasn't comparing it to embedded GPUs in SoCs for Android/iOS devices. 800MTexels/s goes back to the 11-year-old Radeon or GeForce2 MX.

The GF2 MX didn't even have that much fill-rate; you're probably confusing it with the higher-end GF2 Pro, which had 4 TMUs. There were more than a few spots where a 350MTexels/s K2 embarrassed a GF2, and we're still talking over 2x more fill-rate than that.

For the record, the GMA600 doesn't have exclusively Windows support; what vendors will go for in the end is a completely different chapter.

Compared to a C-30/C-50, the 400MHz SGX535 is an order of magnitude slower and hardly of any use even for last-gen DX8.1 and first-gen DX9 titles.

I guess it's too hard to understand that there are worlds of difference between something aimed at consuming milliwatts and something consuming watts; try to cram any Cxx SoC into something like a smartphone.

Against higher-end SoCs like the C-30 or C-50, Intel will most likely place something out of its own GenX stuff.

I still wonder if the GMA600 is any faster than the original 945G IGP in Calistoga.

It would be challenging to find out, wouldn't it? As I said, in fill-rate-bound scenarios it could very well be, as I doubt the 945G was able to yield 60fps in Q3A with trilinear at 1080p.

And I never claimed Intel said Oak Trail would bring a decent GPU. It's just too little, too late for Q2 2011, IMHO.

That's subject to Intel's rather awkward graphics philosophy. We've already had an indirectly related debate about what they might integrate in their embedded generation beyond the GMA600. Imagine how things might look if everyone else comes along with high-clocked Series5XT MP GPU blocks and Intel is still chewing on a 545.
 
The GF2 MX didn't even have that much fill-rate; you're probably confusing it with the higher-end GF2 Pro, which had 4 TMUs. There were more than a few spots where a 350MTexels/s K2 embarrassed a GF2, and we're still talking over 2x more fill-rate than that.
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units
The GF2 MX has 4 TMUs, with the 400 model clocked at 200MHz, so 800MTexels/s.
The GF2 Ti had 8 TMUs, doubling that to 1.6GTexels/s.
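Those numbers are just TMU count times core clock. A quick sanity check, using the post's own figures; the SGX535's 2 TMUs are an assumption from public IMG specs, consistent with the earlier 800MTexels/s claim:

```python
def texel_fillrate(tmus, clock_mhz):
    """Peak texel fill rate in MTexels/s: TMU count times core clock."""
    return tmus * clock_mhz

print(texel_fillrate(4, 200))  # GeForce2 MX 400: 800 MTexels/s
print(texel_fillrate(8, 200))  # GF2 Ti per the post's 1.6 GTexels/s figure
print(texel_fillrate(2, 400))  # SGX535 @ 400 MHz: 800 MTexels/s (assumed 2 TMUs)
```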




For the record, the GMA600 doesn't have exclusively Windows support; what vendors will go for in the end is a completely different chapter.

I guess it's too hard to understand that there are worlds of difference between something aimed at consuming milliwatts and something consuming watts; try to cram any Cxx SoC into something like a smartphone.
Thread title: Windows tablets.
:p

I wasn't trying to discuss the performance/power of SoCs with the SGX family, or to compare it to Fusion APUs in any way. There are plenty of discussions about that in these forums.
I'm simply commenting on the usability of the 400MHz SGX535 in a 2011 Windows tablet: what can and what cannot be done.




Against higher-end SoCs like the C-30 or C-50, Intel will most likely place something out of its own GenX stuff.

Not really. Oak Trail is going for $75 (CPU only). That's the same as the 1.87GHz single-core N475, whose netbooks cost about the same as the ones with the C-50.
Intel won't be countering the C-50 with GenX GPUs; it's either the GMA600 in the Z650/Z670 or the GMA3150 (a rebadged GPU from the 945G).

They might try to counter the E-350 with some CULV i3, but the pricing will be well above the E-350's.
 
http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units
The GF2 MX has 4 TMUs, with the 400 model clocked at 200MHz, so 800MTexels/s.
The GF2 Ti had 8 TMUs, doubling that to 1.6GTexels/s.

Bad memory, then; however, that puts the entire GF2's fill-rate in an even worse light:

http://www.firingsquad.com/hardware/kyro2/page11.asp
http://www.firingsquad.com/hardware/kyro2/page12.asp
http://www.firingsquad.com/hardware/kyro2/page14.asp
http://www.firingsquad.com/hardware/kyro2/page15.asp
http://www.firingsquad.com/hardware/kyro2/page16.asp

Again, that's only 350MPixels/s for the K2, and where "FSAA" appears it's ordered-grid supersampling for both the GeForces and the KYRO2.

Thread title: Windows tablets.
:p
For the last time: the Z670 is meant for both tablets and smartphones. Using Windows as an OS on something like the Z670 is a silly idea from the get-go, but as I said, that's a different chapter.

I wasn't trying to discuss the performance/power of SoCs with the SGX family, or to compare it to Fusion APUs in any way. There are plenty of discussions about that in these forums.
I'm simply commenting on the usability of the 400MHz SGX535 in a 2011 Windows tablet: what can and what cannot be done.
It would be a stupid idea even if the Z670 carried an MP4 @ 400MHz.

Not really. Oak Trail is going for $75 (CPU only). That's the same as the 1.87GHz single-core N475, whose netbooks cost about the same as the ones with the C-50.
In 1,000-unit quantities. NVIDIA doesn't even sell the Tegra 2 SoC in quantities lower than 100,000, if that should ring a bell.

Intel won't be countering the C-50 with GenX GPUs; it's either the GMA600 in the Z650/Z670 or the GMA3150 (a rebadged GPU from the 945G).

They might try to counter the E-350 with some CULV i3, but the pricing will be well above the E-350's.
What does GenX stand for in your book, exactly? Because the 3150 certainly wasn't developed by Bitboys Oy.

Let me put it in a different light: the iPhone 4 (SGX535 @ 200MHz?) gets 1132 frames in GLBenchmark 2.0 at 960*540. At 400MHz the result would be significantly higher, if not twice as much. For comparison: http://www.glbenchmark.com/phonedetails.jsp?benchmark=glpro20&D=HTC+Sensation&testgroup=overall (Adreno 220 at 960*540)
 
Bad memory, then; however, that puts the entire GF2's fill-rate in an even worse light:

http://www.firingsquad.com/hardware/kyro2/page11.asp
http://www.firingsquad.com/hardware/kyro2/page12.asp
http://www.firingsquad.com/hardware/kyro2/page14.asp
http://www.firingsquad.com/hardware/kyro2/page15.asp
http://www.firingsquad.com/hardware/kyro2/page16.asp

Again, that's only 350MPixels/s for the K2, and where "FSAA" appears it's ordered-grid supersampling for both the GeForces and the KYRO2.
Still piss-poor for a 2011 Windows 7 device. Unless all you want to play is 1999 3D games.
Wait, what was the discussion again?


For the last time: the Z670 is meant for both tablets and smartphones. Using Windows as an OS on something like the Z670 is a silly idea from the get-go, but as I said, that's a different chapter.

Err.. no. You won't see a Z670 in any smartphone, I can promise you that.

Z670 is Oak Trail. It's still a "Gen2" Atom built in 45nm.
"The new Intel Atom 'Oak Trail' platform, with 'Cedar Trail' to follow, are examples of our continued commitment to bring amazing personal and mobile experiences to netbook and tablet devices, delivering architectural enhancements for longer battery life and greater performance," said Doug Davis, vice president and general manager of the Netbook and Tablet Group at Intel.
From the Z670 press release. See any mention of smartphones in there?

Oak Trail is basically a Moorestown with an embedded Whitney Point (PCI bus, IDE, etc.) to support Windows 7, aimed at tablets and ultra-thin netbooks. You could say Oak Trail was purposely made for Windows. The GPU is clocked higher, but that comes mainly as a perk of the shrink to 45nm.

TDP for Oak Trail Z670 is 3W, which is ~5x higher than Z500 Moorestown (Intel's only actual - and failed - attempt at smartphones so far).



Perhaps you're mistaking it for Medfield (which still hasn't been given a "market name")?
Medfield is the actual successor to Moorestown, and is the chip that's been shown in working handhelds this year, made on 32nm and with an actual "Gen3" Atom CPU.
Since the chip has been in production since early 2011, news of smartphones with Medfield coming somewhere in 2H 2011 shouldn't take long to appear.
There have been rumours of both ZTE and LG (not to mention the pseudo-announced Nokia N950) launching MeeGo smartphones with a Medfield.



Sure, you can install Android or MeeGo on an Oak Trail device (as you can on any 3-year-old Atom netbook), but then the added die area and power consumption needed for Windows 7 compatibility become a bit of a waste, given that Medfield is almost on the market and will do the same at (supposedly) a fraction of the TDP, don't you think?


What does GenX stand for in your book exactly? Because the 3150 wasn't developed by the Bitboys OY for sure.

AFAIK, GenX is every IGP from Intel with unified shaders (or Execution Units), which started with the GMA X3000 (965G northbridge).
The GMA3150 in current Atom netbook solutions has 2 pixel shaders (SM2.0), with vertex shading (SM3.0) done in software. It's a cut-down version of the GMA 3000, which is a rebadge of the IGP in the 945G.

So as I said before, GMA3150 in current netbook solutions sucks. Hard.

With Atom out of that equation, the only way for Intel to counter the C-60 or E-350 is with a CULV Sandy Bridge i3, but those have always been very expensive (as were the CULV Core 2 CPUs). Maybe AMD's APUs will force Intel to price them a lot lower, which would be really interesting.
But that's Windows UMPCs, and the topic is Windows tablets.
 
Still piss-poor for a 2011 Windows 7 device. Unless all you want to play is 1999 3D games.
Wait, what was the discussion again?

The discussion was about you being unwilling to understand that milliwatts and high 3D performance don't fit in the same sentence.

Err.. no. You won't see a Z670 in any smartphone, I can promise you that.

The very same SoC at lower frequencies was/is aimed at smartphones as well.

Z670 is Oak Trail. It's still a "Gen2" Atom built in 45nm.

From the Z670 press release. See any mention of smartphones in there?

Oak Trail is basically a Moorestown with an embedded Whitney Point (PCI bus, IDE, etc.) to support Windows 7, aimed at tablets and ultra-thin netbooks. You could say Oak Trail was purposely made for Windows. The GPU is clocked higher, but that comes mainly as a perk of the shrink to 45nm.

TDP for Oak Trail Z670 is 3W, which is ~5x higher than Z500 Moorestown (Intel's only actual - and failed - attempt at smartphones so far).

Perhaps you're mistaking it for Medfield (which still hasn't been given a "market name")?
Medfield is the actual successor to Moorestown, and is the chip that's been shown in working handhelds this year, made on 32nm and with an actual "Gen3" Atom CPU.
Since the chip has been in production since early 2011, news of smartphones with Medfield coming somewhere in 2H 2011 shouldn't take long to appear.
There have been rumours of both ZTE and LG (not to mention the pseudo-announced Nokia N950) launching MeeGo smartphones with a Medfield.

Huge difference between those two SoCs.


AFAIK, GenX is every IGP from Intel with unified shaders (or Execution Units), which started with the GMA X3000 (965G northbridge).
The GMA3150 in current Atom netbook solutions has 2 pixel shaders (SM2.0), with vertex shading (SM3.0) done in software. It's a cut-down version of the GMA 3000, which is a rebadge of the IGP in the 945G.

So as I said before, GMA3150 in current netbook solutions sucks. Hard.

In my book, GenX describes their graphics chipset team; it hasn't changed much from the past to now.

With Atom out of that equation, the only way for Intel to counter the C-60 or E-350 is with a CULV Sandy Bridge i3, but those have always been very expensive (as were the CULV Core 2 CPUs). Maybe AMD's APUs will force Intel to price them a lot lower, which would be really interesting.
But that's Windows UMPCs, and the topic is Windows tablets.

Either way, Intel will find itself in a tough spot; AMD is, at least IMHO, completely on the right track with its focus on GPU acceleration in all of its Fusion APUs (top to bottom). Even worse, Intel should by now be forced to sell its hardware at its real value...
 