Intel Silvermont (Next-gen OOE Atom)

This ASUS T100 review certainly seems to show that the SoC's power draw is way higher than Intel's BS 2W SDP. It would even seem that 5W is below reality:
Don't forget that TDP isn't the upper limit the CPU can draw either - thanks to turbo it can exceed that as well (but this is configurable, so you could configure it not to exceed TDP, in contrast to the completely meaningless SDP, which just relies on a "nice" workload to stay under the number). I have no idea what the real TDP is - Intel was careful not to mention it anywhere for Bay Trail (not so much for those SDP Haswells). Certainly, though, the discrepancy between SDP and TDP gets larger the smaller the numbers are (I guess those low-power chips just use a lighter scenario...). All things considered, around 5W looked reasonable to me, but of course it's really a guesstimate. And obviously it has to be said this is still quite decent, and those ARM-based tablet chips are no better there either (and those certainly don't come with TDP numbers).
 
This ASUS T100 review certainly seems to show that the SoC's power draw is way higher than Intel's BS 2W SDP. It would even seem that 5W is below reality:
- idle 1.7-3.9W
- load 10.7-11.8W

For reference:
iPad 4
- idle 2.6-8.4W
- load 9.8-12.5W
iPad Air
- idle 1.8-7.1W
- load 7.5-10.4W


Those are total system consumption numbers, not SoC consumption.
Furthermore, they probably measured at the wall, so the actual draw is less than that, depending on the PSU's efficiency.

The screen should be the thing that consumes the most; then there's also RAM, mass storage, power converters, etc.
Intel's numbers are probably right.

All those iPads would be blowing up or burning people's hands if their SoCs were consuming 12W.
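The subtraction being argued here can be spelled out with a quick sketch. The T100 load readings (10.7-11.8W) come from the review quoted above; the adapter efficiency, display draw, and "rest of system" draw below are purely illustrative assumptions, not measurements.

```python
# Rough sanity check: how much of a measured at-the-wall load figure
# could plausibly reach the SoC. Only the wall readings come from the
# thread; efficiency and component draws are assumed for illustration.

def soc_power_estimate(wall_watts, psu_efficiency=0.85,
                       display_watts=2.5, other_watts=1.5):
    """Upper-bound estimate of SoC power from a wall measurement."""
    dc_watts = wall_watts * psu_efficiency          # what the device actually receives
    return dc_watts - display_watts - other_watts   # leftover attributable to the SoC

# T100 load readings from the review quoted above
low = soc_power_estimate(10.7)
high = soc_power_estimate(11.8)
print(f"SoC estimate under load: {low:.1f} - {high:.1f} W")
```

Under these assumed numbers the leftover lands in the 5-6W range, which is roughly where the "around 5W" guesstimate earlier in the thread sits.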
 
Don't forget that TDP isn't the upper limit the CPU can draw either - thanks to turbo it can exceed that as well
You're right, my non-OC 4770K has no trouble eating 140W for a few seconds - it's frightening the first time you see it :)

I have no idea what the real TDP is - Intel was careful not to mention it anywhere for Bay Trail (not so much for those SDP Haswells). Certainly, though, the discrepancy between SDP and TDP gets larger the smaller the numbers are (I guess those low-power chips just use a lighter scenario...). All things considered, around 5W looked reasonable to me, but of course it's really a guesstimate. And obviously it has to be said this is still quite decent, and those ARM-based tablet chips are no better there either (and those certainly don't come with TDP numbers).
For the end user, TDP is a useless number (SDP is even worse). What matters is how long the device you want lasts, and whether it's fast enough for your needs.

Comparing the T100 vs the iPad Air according to the links I posted, I have no hesitation: similar battery life, I don't care about Windows on a tablet, and having played with a T100 for a few minutes, it really feels poor and has a poor screen resolution. So I don't care whether it has a better SoC or not; the T100 looks like a worse tablet to me than the iPad.
 
Took a look at the Venue 11 Pro this morning; no IGP settings that I can see other than Screen Brightness.

There is an option to "Enable Turboboost™" which is set to Enabled by default - "Allows the Intel Turbo driver to increase the performance of the CPU OR graphics processor."

This is the IGP page of my Venue 8 Pro's BIOS when accessed from Win8's advanced startup UEFI settings option. Sorry about my phone's terrible photo quality. "GFX Boost" is disabled by default though I'm not sure what it does.

There is an incredible number of options in this BIOS. Pages of mysterious things. Though I don't think you can overclock it. Heh.
 
The Venue 11 Pro has a completely different BIOS, nothing like that.

Almost looks like the 8 has the reference design BIOS on it.
 
The Venue 11 Pro has a completely different BIOS, nothing like that.

Almost looks like the 8 has the reference design BIOS on it.
Yup, it definitely has a reference feel to it. Extremely configurable. Not a good idea from Dell's perspective, considering it looks very easy to make the device inoperable. And I'm not sure how to wipe settings in that situation...

In their defense, though, the majority of the settings are hidden if you enter the BIOS during a typical boot. You need to go in through the convoluted Win8 advanced startup.
 
The BIOS definitely has a "not for consumers" feel to it because of the sheer number of mystery options I have never seen before. Could really mess the device up and I'm not sure how to clear settings on it. ;)

Hard to say without a better view of the Venue 8 Pro BIOS, but looks very similar to the BIOS on the Baytrail FFRD I've used in the past.
 
The screen should be the thing that consumes the most; then there's also RAM, mass storage, power converters, etc.
Intel's numbers are probably right.
Saw a presentation where the breakdown had the SoC at #4 or #5 among the things that consume the most power. The display was #1.
 
Saw a presentation where the breakdown had the SoC at #4 or #5 among the things that consume the most power. The display was #1.
Sorry, but that's pretty meaningless without knowing the exact components used :)

A simple example: the consumption of two screens built using the exact same technology will depend on the resolution - the denser the pixel array is, the brighter the backlight has to be, which increases consumption.

I also bet that on the Surface Pro 2 the CPU consumes more than the screen :)
 
I also bet that on the Surface Pro 2 the CPU consumes more than the screen :)

Over any meaningful span of time, I'd take your bet. Unless you're compressing video or playing a video game, the supermajority of your time using any tablet is going to have the CPU near idle; the same cannot be said about your screen.
 
Over any meaningful span of time, I'd take your bet. Unless you're compressing video or playing a video game, the supermajority of your time using any tablet is going to have the CPU near idle; the same cannot be said about your screen.
So according to your logic, you don't need that powerful CPU, an old generation Atom will be enough, because you're basically saying CPU performance doesn't matter.
 
So according to your logic, you don't need that powerful CPU, an old generation Atom will be enough, because you're basically saying CPU performance doesn't matter.

For the majority of the time, as he stated, you probably don't. Web browsing, playing music, reading e-mail, working on a Word document, etc. aren't going to be taxing even for an Atom or ARM chip, much less a Haswell i-series in an ultrabook.

Regards,
SB
 
So according to your logic, you don't need that powerful CPU, an old generation Atom will be enough, because you're basically saying CPU performance doesn't matter.

Yeah Albuquerque why are you saying the CPU doesn't matter?

:D

Day to day, in a tablet the CPU will idle most of the time. Baytrail has really, really great idle numbers. Peak performance of the Z3740 in my T100 is perfectly acceptable as well for office use. Really great chip. When I first heard Intel would be using Baytrail in Celeron and Pentium I was disheartened, but now with some experience with it I'd say it is a good move.
 
I dunno, though - I actually think it's the opposite: the graphics has high bandwidth for that class but low ALU capacity. The bandwidth exceeds that of half of AMD's discrete mobile GPU lineup (which says a lot about that lineup...), whereas those 4 EUs aren't a whole lot. A Haswell GT3 (without eDRAM) part will have 1.5 times the bandwidth but 10 times the EUs...

Yeah I suppose I should stop thinking in terms of 2004-2005 GPUs. It might help if I wasn't trying old games too!

From a weird timelessness perspective, it is neat seeing a 2-5W tablet fluidly run a game that the best PCs could barely run once upon a time. Deus Ex Invisible War for example runs well, with its normal mapping and demanding lighting/shadowing, perhaps faster than a power hungry FX 5900 or Radeon 9800 can manage. And the CPU here is a quad core with each core surely outperforming processors of that time too.
 
Yeah I suppose I should stop thinking in terms of 2004-2005 GPUs. It might help if I wasn't trying old games too!

From a weird timelessness perspective, it is neat seeing a 2-5W tablet fluidly run a game that the best PCs could barely run once upon a time. Deus Ex Invisible War for example runs well, with its normal mapping and demanding lighting/shadowing, perhaps faster than a power hungry FX 5900 or Radeon 9800 can manage. And the CPU here is a quad core with each core surely outperforming processors of that time too.
Well, let's see just for fun: 2005 (let's say the beginning of 2005, otherwise I have to mention dual cores...) - if you ran some near-high-end CPU, that would be something like a 2.2GHz A64 (on Socket 939...). The Z3770, much less the Z3740, cannot actually outperform that (per core only, of course...) in general. The Z3770 would be somewhat close, though.
A Radeon 9800 would have been quite old in 2005, so let's use an X800 instead. That would be 12x4 MAD (fp24) + 6x4 MAD (fp32) per clock (at a 400MHz clock). That's something like 30 Gflops, which is just a bit below that of the Bay Trail chips (roughly 40 Gflops assuming max GPU clock), though obviously the Bay Trail flops should be far more flexible. TMU-wise the X800 has a factor-of-2 advantage (taking clocks into account), and the X800 also has about 3 times the ROP capability. Memory bandwidth would also be higher on the X800, though not massively so: 22GB/s vs. 17GB/s.
So you are quite right that even with Bay Trail, which has comparatively high bandwidth per flop, the X800 still had quite a bit more bandwidth per flop.
(FWIW it's not _quite_ true that a Haswell GT3 has 10 times the EUs but only 1.5 times the bandwidth - while it actually has potentially more than 10 times the ALU capacity (as it has a higher max GPU clock), it also has access to the LLC, which of course helps some with bandwidth issues, though I can't put a number on that.)
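The back-of-envelope arithmetic above, spelled out in code. A MAD is counted as a single op here, exactly as in the post (an undercount that gets corrected further down the thread); the unit counts, clocks, and the 22GB/s vs 17GB/s bandwidth figures are as stated above, and the ~40 Gflops Bay Trail number is taken at face value.

```python
# Reproduce the post's X800 ALU estimate and compare memory bandwidth
# per unit of ALU throughput. MAD counted as 1 op, as in the post.

x800_ops = (12 * 4 + 6 * 4) * 0.400   # vec4 pixel + vertex units, 400 MHz -> ~28.8 Gops/s
print(f"X800 ALU rate: {x800_ops:.1f} Gops/s")  # "something like 30"

# bytes of bandwidth available per ALU op
x800_bw_per_op = 22 / x800_ops        # 22 GB/s on the X800
baytrail_bw_per_op = 17 / 40          # 17 GB/s against the ~40 Gflops estimate
print(f"X800: {x800_bw_per_op:.2f} B/op, Bay Trail: {baytrail_bw_per_op:.2f} B/op")
```

On this (uncorrected) counting, the X800 indeed comes out with noticeably more bandwidth per op than Bay Trail, matching the conclusion drawn above.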
 
Over any meaningful span of time, I'd take your bet. Unless you're compressing video or playing a video game, the supermajority of your time using any tablet is going to have the CPU near idle; the same cannot be said about your screen.

Actually, I'll take you up on that bet, seeing as the very vast majority of the time the screen is completely off (like ~21 hours a day for me). All you need is a few HD videos with stupid encode options that force you to software-decode to really get the CPUs going (happens semi-regularly to me on my Sony XTZ) and chewing power. I'm an old-school gaming buff, and emulators seem to use a disproportionately high amount of power from what you would expect.
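Which side of this bet wins is really a duty-cycle question: daily energy is just power integrated over time. A minimal sketch, where every power and hours figure is an assumed illustration rather than a measurement:

```python
# Simple duty-cycle energy model: whether the screen or the CPU
# dominates the daily energy budget depends entirely on usage pattern.
# All powers and durations below are assumed, not measured.

def daily_energy_wh(states):
    """states: list of (power_watts, hours_per_day) tuples."""
    return sum(p * h for p, h in states)

# Light-use day: screen on 3 h (off the other 21, as argued above)
screen = daily_energy_wh([(2.5, 3), (0.0, 21)])
# CPU: mostly idle, some light use, a half hour of software video decode
cpu = daily_energy_wh([(0.1, 21), (0.8, 2.5), (4.0, 0.5)])
print(f"screen: {screen:.1f} Wh/day, cpu: {cpu:.1f} Wh/day")
```

With these assumed numbers the two come out surprisingly close, which is why both sides of the bet can sound plausible: a bit more software decoding or emulation tips it toward the CPU, a bit more screen-on time tips it the other way.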
 
Well, let's see just for fun: 2005 (let's say the beginning of 2005, otherwise I have to mention dual cores...) - if you ran some near-high-end CPU, that would be something like a 2.2GHz A64 (on Socket 939...). The Z3770, much less the Z3740, cannot actually outperform that (per core only, of course...) in general. The Z3770 would be somewhat close, though.
A Radeon 9800 would have been quite old in 2005, so let's use an X800 instead. That would be 12x4 MAD (fp24) + 6x4 MAD (fp32) per clock (at a 400MHz clock). That's something like 30 Gflops, which is just a bit below that of the Bay Trail chips (roughly 40 Gflops assuming max GPU clock), though obviously the Bay Trail flops should be far more flexible. TMU-wise the X800 has a factor-of-2 advantage (taking clocks into account), and the X800 also has about 3 times the ROP capability. Memory bandwidth would also be higher on the X800, though not massively so: 22GB/s vs. 17GB/s.
So you are quite right that even with Bay Trail, which has comparatively high bandwidth per flop, the X800 still had quite a bit more bandwidth per flop.
(FWIW it's not _quite_ true that a Haswell GT3 has 10 times the EUs but only 1.5 times the bandwidth - while it actually has potentially more than 10 times the ALU capacity (as it has a higher max GPU clock), it also has access to the LLC, which of course helps some with bandwidth issues, though I can't put a number on that.)

Don't the pixel shaders of that era have two sets of ALU per pipeline?
 
Don't the pixel shaders of that era have two sets of ALU per pipeline?
I don't think so (though I used a simplified representation: the pixel shader is actually vec3+1, and the vertex shader is really 4+1, though that last "1" in the VS case can't do MAD and may have other limitations w.r.t. co-issue with the vec4, so I didn't count it). Only the later R5xx chips increased the ALUs inside the pixel pipes (by a factor of 3, though, not 2).
Nevertheless, the calculation I did is buggy, as I didn't count a MAD as 2 flops for the X800, so you can double that number... Meaning the X800 actually has a very similar ALU/bandwidth ratio to Bay Trail - a somewhat surprising result after all...
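The correction above, in numbers: counting a MAD as 2 flops doubles the X800 estimate, after which the two chips' flops-per-byte ratios come out quite close. The ~40 Gflops Bay Trail figure and both bandwidth numbers are taken from earlier in the thread; everything else follows from them.

```python
# Redo the X800 vs Bay Trail comparison with MAD counted as 2 flops,
# per the correction above. Bandwidths and the Bay Trail Gflops figure
# are the ones quoted earlier in the thread.

x800_gflops = (12 * 4 + 6 * 4) * 2 * 0.400   # 72 MADs/clk, 2 flops each, 400 MHz
baytrail_gflops = 40.0                        # "roughly 40" at max GPU clock

x800_ratio = x800_gflops / 22                 # flops per byte at 22 GB/s
baytrail_ratio = baytrail_gflops / 17         # flops per byte at 17 GB/s
print(f"X800:      {x800_gflops:.1f} Gflops, {x800_ratio:.2f} flops/byte")
print(f"Bay Trail: {baytrail_gflops:.1f} Gflops, {baytrail_ratio:.2f} flops/byte")
```

The X800 lands around 2.6 flops per byte versus roughly 2.4 for Bay Trail, so the corrected ratios really are in the same ballpark, as the post concludes.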
 