Intel Silvermont (Next-gen OOE Atom)

I ran 3DMark03 on it. This benchmark was fairly GPU-limited on GPUs of 2003-2004.

Venue 8 Pro on AC power (don't know if being plugged in matters)
3DMark Score
4862.0 3DMarks
GT1 - Wings of Fury
107.1 FPS
GT2 - Battle of Proxycon
35.1 FPS
GT3 - Troll's Lair
30.5 FPS
GT4 - Mother Nature
34.8 FPS
CPU Score
919.0 CPUMarks
CPU Test 1
75.4 FPS
CPU Test 2
20.9 FPS
Fill Rate (Single-Texturing)
829.0 MTexels/s
Fill Rate (Multi-Texturing)
2156.9 MTexels/s
Vertex Shader
33.9 FPS
Pixel Shader 2.0
44.2 FPS
Ragtroll
25.1 FPS


9800 Pro with Core 2 E4400 (I had a 775 + AGP board)
3DMark Score
7646.0 3DMarks
GT1 - Wings of Fury
249.6 FPS
GT2 - Battle of Proxycon
50.1 FPS
GT3 - Troll's Lair
44.9 FPS
GT4 - Mother Nature
48.0 FPS
CPU Score
1091.0 CPUMarks
CPU Test 1
134.8 FPS
CPU Test 2
17.2 FPS
Fill Rate (Single-Texturing)
1741.8 texels/s
Fill Rate (Multi-Texturing)
2945.6 texels/s
Vertex Shader
26.5 FPS
Pixel Shader 2.0
59.4 FPS
Ragtroll
25.2 FPS
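As a sanity check, the overall 3DMark03 score can be reconstructed from the four game-test frame rates alone, using the weights published in Futuremark's 3DMark03 whitepaper. This is just an illustrative sketch; the small deviations from the reported totals come from the FPS values being rounded to one decimal in the result screens:

```python
# Weights for GT1..GT4 as published in the 3DMark03 whitepaper.
GT_WEIGHTS = (7.3, 37.0, 47.1, 38.7)

def score_3dmark03(gt1, gt2, gt3, gt4):
    """Overall 3DMark03 score: weighted sum of the four game-test FPS values."""
    return sum(w * fps for w, fps in zip(GT_WEIGHTS, (gt1, gt2, gt3, gt4)))

print(round(score_3dmark03(107.1, 35.1, 30.5, 34.8)))  # Venue 8 Pro: ~4864 (reported: 4862)
print(round(score_3dmark03(249.6, 50.1, 44.9, 48.0)))  # 9800 Pro:    ~7648 (reported: 7646)
```

Both totals land within a couple of points of the reported scores, which is about what you'd expect given the one-decimal FPS rounding.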

Generally outclassed, but the CPU results are interesting, and unified shaders win it the vertex shader test.
 
Hmm, interesting. Looks like the "more flexible" FLOPs don't help much in that test. It isn't outclassed that badly, though; the results are probably skewed a bit by the very simple GT1 test, which is a much larger win for the 9800 Pro than the others. The CPU definitely isn't 2004-era (this particular model is from 2007, and while faster Core 2 Duos were available in 2006, nothing like that existed in 2004). Not sure why the Core 2 wins one CPU test easily but loses the other - is one of the tests single-threaded and the other multithreaded?
The power consumption numbers would probably put this comparison into perspective a bit :).
 

I ran CPU test 2 again and graphed CPU load. There is indeed some multithreading, reaching around 90% load at times.
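For anyone wanting to reproduce that load graph, the sampling part is trivial to script. A minimal sketch follows; the `sample_load` helper and the fake reader are purely illustrative (on an actual system you'd pass something like psutil's `cpu_percent` as the reader and plot the result):

```python
import time

def sample_load(read_load, samples=60, interval=1.0, sleep=time.sleep):
    """Poll a load-reading callable at a fixed interval and return the samples.

    read_load is any zero-argument callable returning total CPU load in
    percent -- e.g. psutil.cpu_percent (third-party) on a real system.
    """
    readings = []
    for _ in range(samples):
        readings.append(read_load())
        sleep(interval)
    return readings

# Deterministic stand-in for a real load reader, so the sketch runs anywhere.
fake = iter([12.0, 55.0, 91.0, 88.0, 30.0])
readings = sample_load(lambda: next(fake), samples=5, interval=0, sleep=lambda _: None)
print(max(readings), sum(readings) / len(readings))  # peak vs average load -> 91.0 55.2
```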

Here is what Futuremark says about those tests in their whitepaper.
The 3DMark03 CPU test allows users to evaluate CPU performance for 3D graphics workloads. It is important to note that this test is intended to only measure CPU performance for 3D graphics usage and not for general PC usage. For the latter, a benchmark such as PCMark™, available also from Futuremark, is more appropriate.

This test runs game test 1 at a resolution of 640x480 using software vertex shaders. Game test 1 is very amenable to CPU measurement as the materials used are simple and most of the screen consists of single-textured, low polygon background objects. We also run game test 3 at a resolution of 640x480 with the use of dynamic shadows disabled. The optimized code path for 1.4 pixel shaders is disabled, since that feature should not be credited in this test. Note that this test still depends somewhat on the graphics hardware, but these settings reduce the dependency.

The CPU test result can also be used as good measure of the software vertex shader speed. Vertex shading is an important part of 3D game performance. Similar calculations as vertex shading are needed in other parts of 3D game software, such as real-time physics.

Note that the CPU test by itself is not multithreaded, but multithreading in DirectX should give some benefit to the CPU score, when run on a dual CPU system or on a CPU with hyper threading support. 3DMark03 has been programmed and compiled as a 32-bit application. Therefore, 64bit support in the CPU should not improve the scores.

The CPU test reports a numerical CPU Score.
 
Actually I'll take you on that bet.
Then be prepared to lose, because your reading comprehension has let you down. Let me help you get out your checkbook in reference to the bolded, supersized section of my quote below:
...the supermajority of your time using any tablet is going to have the CPU near idle; the same cannot be said about your screen.

Now, here you are talking about:
Seeing the very vast majority of the time the screen is completely off (like ~21 hours a day for me).
What are you doing USING a tablet with the screen off for 21 hours a day? Are you interested in an 8" replacement for your iPod shuffle, is that what you're trying to say? Because that's what you're insinuating.

When the screen is off, the machine is either in connected standby (a feature of Win8 and Haswell ULV / Atom Z27xx / Atom Z37xx which is so ridiculously low powered that, honestly, your wireless radio is probably still using more power than the coalesced interrupt handling of the IO block on the CPU) or else you're in full standby / hibernate which is rounding error at best.

So what you're saying is this: you leave your tablet in non-connected standby for 21 hours a day, you wake it up only long enough to encode a video (simply watching an HD video using BayTrail will still use less power than the display output itself), and then play games. And there's absolutely nothing else you do.

Great, so you found exactly the two corner cases that I've already described: playing games, and encoding video. So yeah, go ahead and take my bet, and then send the checks to my PM box for as long as you like handing out free money.
 
My tablet has SSH and FTP servers and Wireshark installed on it. I can and do use those without the screen being on. I use it all the time at work in datacentres as a wireless serial/network terminal that I SSH to from my laptop and then serial/IP out into the device in question. That way I'm not in the loud-as-hell 21 °C, low-humidity room; I'm in the much nicer room behind the glass wall.

Also, when I use it to listen to music every morning, the screen is off.

I didn't fail to comprehend what you said; you failed to frame it within bounds. So depending on who is reading your post, the context of your words can vary. I use my tablet a lot with the screen off, and thus I'm willing to take your bet, because you made it on the reader's terms, not on the average drone consumer sheep's terms.

So I also love that playing games, playing video (decoding, not encoding) and listening to music all fall into the corner cases in terms of use for a portable device :oops:.

Maybe you shouldn't be so gung-ho in the aggressiveness of your reply when you list some of the most common activities on mobile devices as corner cases.
 
You host... a server... on a tablet? A tablet that you take with you everywhere? A tablet that you've configured to never go to sleep, so that it stays awake the whole time and can be available as a server -- otherwise, what good would it be as a server? And then when you do wake it up, you immediately encode videos?

Here's a hint: your SFTP / SSH server isn't using more power (via CPU) than your wireless radio and the disk itself needs to move the data. At the end of your concocted story, the CPU still uses less power than the rest of the system.

Your use case is not even a corner case; it's probably not even a niche case. It's so far off the map that it doesn't even count. It has become obvious that your sole reason for posting that utter tripe is to somehow prove that you're right. It doesn't work, because you're using the most asinine, far-fetched and blatantly constructed false paradigm to do so. I have no interest in someone who has invented such a wild story to disprove what is already known by all of the industry as a standard usage model (the CPU being idle for the vast majority of USAGE time); I have no further interest in your questionable contributions to this topic.
 
I can't wrap my head around using something as highly mobile and disconnected as a tablet to run an SFTP server. SFTP is not a reliable protocol by itself, and serving it up over wireless is just begging for a session interrupt that forces you to re-download the file again and again. And if you're doing this from the inside of a datacenter, then you're probably pulling firmware images for things like routers and switches, which makes such a thought even more questionable.

Anyone who uses a tablet for its intended purpose (e.g. my wife looking at Pinterest, Facebook, emails, reading books, watching YouTube videos of cats doing dumb things, doodling in OneNote, unloading RAW photos from our DSLR, touching them up, and then saving them to our WHS box) will have a CPU that hits max performance for perhaps 3% of total power-on time. The rest of the time? It's showing something on the screen, but it's otherwise basically asleep.

All of Intel's design paradigms for mobile follow this thought process, because it's the de facto use case: turbo up to get the job done and over with, and then go back to sleep. That entire paradigm is why Intel developed both the active idle (S0ix) and connected standby (S3ix) power states for Haswell and latest-gen SoC Atom processors. The entire chip pretty much goes to sleep, except for the display interface and enough I/O to monitor for HID and specific types of network chatter.
 
3DMark05 fun. I have a collection of results in my Futuremark account, and my old, madly overclocked Athlon XP seemed like a nice comparison.

Code:
3DMark05                 Atom Z3740D    Athlon XP @ 2.5 GHz
                                        (6600GT 500/600)
3DMarks                  3295           3997
GT1 Return to Proxycon   14.6 fps       16.7 fps
GT2 Firefly Forest       10.5 fps       11.7 fps
GT3 Canyon Flight        14.9 fps       20.9 fps
CPU Score                8473           3803
CPU1                     15.13 fps      11.72 fps
CPU2                     26.22 fps      21.72 fps
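For what it's worth, the overall 3DMark05 score is derived purely from the three game tests: 250 × their geometric mean, per Futuremark's 3DMark05 whitepaper. A quick sketch (illustrative only; deviations come from the FPS values being rounded) reproduces both totals in the table:

```python
# 3DMark05 overall score per the whitepaper:
# 250 x the geometric mean of the three game-test frame rates.
def score_3dmark05(gt1, gt2, gt3):
    return 250.0 * (gt1 * gt2 * gt3) ** (1.0 / 3.0)

print(round(score_3dmark05(14.6, 10.5, 14.9)))  # Atom Z3740D: ~3292 (reported: 3295)
print(round(score_3dmark05(16.7, 11.7, 20.9)))  # Athlon XP:   ~3996 (reported: 3997)
```

Note this only covers the graphics total; the CPU Score column is computed separately and doesn't follow from these two CPU-test FPS figures in any obvious way.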
 
Anyone have any insights into the real, practical pros and cons of the Pentium Bay Trails vs. the Atom ones?

As far as I can tell, the Pentium gains a bit on max clocks, has 64-bit support and typically comes in 4GB SKUs. However, it loses the lowest-power idle states and connected standby.

Anything else? And what about the Celerons?
 
Bay Trail Atoms have full 64-bit support. Though apparently Windows only supports connected standby with the 32-bit version for now (don't ask me why; it seems totally silly), and so far it looks like the UEFI implementations in shipping systems make it a pain to get a 64-bit OS to boot.
The Pentium doesn't actually have a higher max clock (not noticeably, anyway), but it should be faster with workloads requiring a lot of power due to its increased TDP (though note that chips like the Z3740 can maintain max turbo clock all the time, even under multithreaded load, as long as the GPU isn't stressed; I believe that isn't quite true of the Z3770).
It also has a higher max GPU clock and higher max memory bandwidth (though the latter probably doesn't really matter).
As per usual tradition, Pentiums and Celerons drop things like Quick Sync, but at least no CPU instruction set extensions this round, AFAIK (don't quote me on that; ark.intel.com left out some things there).
Celerons drop the max memory clock back to Atom level, and clock speeds are a bit lower, but I believe there are no other differences between Celerons and Pentiums.
Some Celerons also have only 2 cores, and some low-clocked (or 2-core) models have a lower TDP - so the difference between different Celerons is way larger than that between the fastest Celeron (N2920) and the slowest Pentium (N3510), which is just about nothing. That's for the mobile chips, at least, but I think it's the same story on the desktop as well.
 
Thanks. I see now that (unlike last gen) the Z-series tablet Atoms actually list 64-bit support (though I haven't seen anything but 2GB, 32-bit models for sale).

With the available models showing decent speed on their storage subsystems (it seems the latest eMMC spec has some improvements over the last gen), losing connected standby might not be that big a deal - at least if going for a model with 128GB of storage, so that a 4GB hibernate file doesn't cut too much into the available space.

A slightly higher max clock and a higher TDP will be welcome, especially on AC power. On the flip side, I guess I'll have to wait for some reviews to see how the Pentium models fare when it comes to battery life during lower-intensity workloads such as video, productivity and web browsing.
 
The Pentium loses Quicksync support vs. the Atom, which is a shame.
 

9600XT/2500+ (Barton):


3DMark Score
4184.0 3DMarks
GT1 - Wings of Fury
131.4 FPS
GT2 - Battle of Proxycon
27.2 FPS
GT3 - Troll's Lair
24.6 FPS
GT4 - Mother Nature
27.5 FPS
CPU Score
424.0 CPUMarks
CPU Test 1
46.8 FPS
CPU Test 2
7.6 FPS
Fill Rate (Single-Texturing)
1109.7 MTexels/s
Fill Rate (Multi-Texturing)
2038.2 MTexels/s
Vertex Shader
14.2 FPS
Pixel Shader 2.0
35.5 FPS
Ragtroll
17.5 FPS

I want more mobile games like Modern Combat 4 for my Acer W4.
 
But the Quick Sync encoding is garbage - more so even than the one on Haswell.

Outside of some niche applications, why not simply do an Xvid encode at 480p or 540p if you want it to be quicker (with 256K AAC for the soundtrack, please)?
 
You're comparing Quick Sync's H.264 1080p quality to software Xvid at 540p?
A wee bit too extreme, wouldn't you say?
Who wants sub-720p nowadays, anyway? Is there anyone left without at least a 720p screen in their tablet/PC/TV? Give or take a year and the same will be true of smartphones too.

I did a bit of testing with software, CUDA and Quick Sync encoding in Sony Vegas, and Quick Sync is fine for home movies and short presentations. It isn't good for professional-grade encoding, but it isn't garbage either.
 
There are people with computers not powerful enough to play H.264 (or that can, but only with heavy CPU use), and DVD players and similar devices that can play MPEG-4 but not H.264.
Maybe there are fewer such old PCs now (though you can have underpowered PCs with no hardware decoding that aren't old, e.g. a netbook with a 32nm Atom running Linux).
 
Intel's 32nm Atoms for netbooks had a PowerVR video decoder with full support for High Profile H.264 @ 1080p.

You'd have to go all the way back to 2010/2011 to find a netbook with a 45nm Atom that can't hardware-accelerate H.264 video. There shouldn't be that many people in this situation, not in first-world countries, and even fewer among people who actually care about watching movies at decent quality.
 
Well, I still have a working Athlon XP-M 1400+ laptop, and it struggles even with MPEG-4 decoding :LOL:.
On the other hand, at least the 2 previous generations of my smartphones were good for 1080p H.264...
 
But do you use it to watch videos?
 