Intel Broadwell for desktops

Consumers are buying i5s/i7s all the time, because they don't quite know what they want. With the iGPU and low requirements, it's very easy for an OEM to throw together an i7, 8GB of RAM and a 1TB-or-bigger HDD in a tower (low requirements meaning that a 300W PSU and the cheapest motherboard will do). And that's not even a bad PC: that kind of money used to buy a MediaGX PC, or a Celeron or P4 with SDRAM and SiS chipset graphics and so on, if that.

An OEM may often include a lower-end graphics card with 2GB or so of DDR3 memory, but presumably not when using an i5 or i7 "C" part.
 
At this point, I don't think I'd ever spend money on a dual-core CPU. It's quad-core or bust. Not sure if I'd go i3 either, but I'm not fully aware of the differences between i3 and i5.
 
At this point, I don't think I'd ever spend money on a dual-core CPU. It's quad-core or bust. Not sure if I'd go i3 either, but I'm not fully aware of the differences between i3 and i5.

i3 is dual-core and i5 is quad-core, generally speaking; there are a few minor exceptions. i3 also has Hyper-Threading while i5 does not, again generally speaking.
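That generalization can be captured in a tiny lookup table. A minimal sketch; the cores/threads figures are the typical desktop lineup of this era, and as noted there are real-world exceptions in both directions:

```python
# Rough desktop Core lineup of the Haswell/Broadwell era, "generally
# speaking" - real SKU lists have exceptions in both directions.
LINEUP = {
    "i3": {"cores": 2, "threads": 4},  # dual core, Hyper-Threading on
    "i5": {"cores": 4, "threads": 4},  # quad core, Hyper-Threading off
    "i7": {"cores": 4, "threads": 8},  # quad core, Hyper-Threading on
}

def has_hyperthreading(tier):
    """A part has Hyper-Threading when it exposes more threads than cores."""
    spec = LINEUP[tier]
    return spec["threads"] > spec["cores"]
```

So an i3 actually runs four threads on its two cores, which is part of why it punches above its weight in threaded workloads.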
 
I mean no offense, but I have to ask: in all seriousness, were you not aware that Haswell Iris Pro already did this? I thought it was fairly common knowledge that Iris Pro competes well with AMD's A-series parts, but I've seen a lot of comments similar to yours after the reviews here, so maybe not?

I'd seen benchmarks where Iris Pro was beating AMD, but always (to my knowledge) in laptops or other power-constrained systems, and I think I'd just assumed that once both were freed of those restrictions in a desktop, AMD would perform a lot better. Obviously I was wrong!

I'm also pleasantly impressed by the CPU performance given the low clock speed. I can't wait to see what you do with Skylake and 72 EUs. I'm guessing you'll be playing in or near XBO territory by then.
 
Personally, I'd like to see GT3e on an i3 rather than an i5 or i7 for desktop. It makes a lot more sense there, at least for me: it'd be perfect for gaming on my HTPC/server machine. When I move up to a desktop i5/i7 part, I'd be looking for better graphics performance than GT3e can provide. Granted, this may not mesh with the average consumer. But then, is the average consumer going to be getting an i7? And if one does, chances are they'd know enough to want better graphics performance. And if they don't need graphics performance, then they likely wouldn't need the boost that the eDRAM provides.

These desktop Broadwell parts are too close to Skylake for me to be in the market for one, but it's interesting to see the evolution. Can't wait to see what Skylake brings, and hopefully there will be an i3 Skylake version with eDRAM.

Regards,
SB

That's the one thing I don't like about these new CPUs: too much GPU. Don't get me wrong, the GPU performance is amazing if that's what you want, but more serious gamers are going to be getting a dGPU. Look at that die shot: it seems about two-thirds of what I'm paying for in the CPU will never actually be used by me. I know Intel caters for this with the "E" range, but it's late and overpriced by comparison, and I'm not sure I understand why. It seems to me like an 8-core CPU without a GPU would still be smaller (and thus cheaper) than a 4-core CPU with a GPU. I'd rather Intel gave us that choice from day 1, at the same cost as the CPU+GPU options.
 
i3 is dual-core and i5 is quad-core, generally speaking; there are a few minor exceptions. i3 also has Hyper-Threading while i5 does not, again generally speaking.
In notebook land you don't get quad cores until you get into the high-TDP side of the i7 range. You do get Hyper-Threading much more often, though.

Also, a nod to those sneaky budget Bay Trail/Cherry Trail chips... they can be quads, however!
 
I know Intel caters for this with the "E" range, but it's late and overpriced by comparison, and I'm not sure I understand why. It seems to me like an 8-core CPU without a GPU would still be smaller (and thus cheaper) than a 4-core CPU with a GPU. I'd rather Intel gave us that choice from day 1, at the same cost as the CPU+GPU options.
It's a lot more complicated than that simple accounting, although that sort of logic is common from enthusiasts. Suffice it to say: compare this die to Haswell-E and Xeon D and you'll get an impression of why it's not just a "hey, it feels like I can fit N more cores in that space, I'll get out my glue stick!" :)
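For the curious, the naive "smaller die, cheaper chip" accounting usually starts from the standard first-order dies-per-wafer approximation. A minimal sketch with made-up die areas (these are not real Broadwell or Haswell-E figures), which also shows exactly what the simple accounting leaves out:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """First-order approximation: usable wafer area divided by die area,
    minus an edge-loss correction term. Ignores scribe lines, yield and
    defect density, which is where the simple accounting breaks down."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Hypothetical 300mm-wafer comparison of a smaller vs a larger die:
small_die_count = dies_per_wafer(300, 180)  # illustrative area only
large_die_count = dies_per_wafer(300, 260)  # illustrative area only
```

The smaller die does yield more candidates per wafer, but per-die area is only one term in the actual cost of shipping a product, alongside masks, design, validation and the rest.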

Regarding cost, the 6-core 5820K is about the same price as some of these CPUs, I believe... it's certainly a good option if you don't need the iGPU and prefer more cores/cache/PCIe lanes instead. Different options for different people.

I really do feel that between the regular quad cores, these, and the -E series parts, consumers have lots of choices between design points. It's not as if you could magically fit a 140W HSW-E into a 65W chip with the same performance or energy characteristics.

Ultimately, for games and consumer workloads, frequency is still far more important than the number of cores, so it's a bad trade-off to add more cores and lower the frequency.
 
A very-low-power quad with the latest integrated graphics can be great for some uses, e.g. Atom, AMD, even Tegra X1.
I would take it for a bad reason: some programs waste a lot of CPU. There's a Linux window manager that uses OpenGL and is very good, but wastes ungodly CPU power when moving or resizing windows. There's a web browser that I use for only one Flash game. HTML5 streaming is even heavier than Flash, which was heavier than Java (remember WMV streaming? That didn't waste CPU, lol). If/when HTML5 wins over Flash for good, or Flash becomes impractical (maybe mid-2017), it will warrant its own separate Firefox instance just to watch/listen to stuff.
Oh, and I forgot that a JavaScript implementation of Flash is in the works (by Mozilla), so there's never an end to the layering of CPU-wasting technologies :D

tl;dr: with a 15W quad-core CPU+GPU you can have a stupid process using 80% of one core and not worry about noise or the power bill; three whole cores are left to run mildly more useful code.

At this point, I don't think I'd ever spend money on a dual-core CPU. It's quad-core or bust. Not sure if I'd go i3 either, but I'm not fully aware of the differences between i3 and i5.

With a desktop i3 you can't go far wrong; let's call it a 2C/4T (dual core, four threads) part.
It's ungodly fast: a 3.6GHz Haswell, so it's much like a 4770 unless you're going to use all the threads.
On mobile, if Intel makes 28W 2C/4T Broadwell and Skylake parts with high-ish clock speeds, those would be some of the best options.
 
I suppose it's fixed in Broadwell-E; it was in Haswell-EX.
For desktop and laptop Broadwell that may be a rather low-priority feature. Consumer Broadwell is actually kind of old, perhaps older than Haswell-EX?

Ah, there's even this on Wikipedia:

source here
http://www.intel.com/content/dam/ww...dates/core-m-processor-family-spec-update.pdf
I'm not sure I understand whether TSX is usable or not: erratum "BDM36" seems to say it's broken.
Perhaps we don't need it at all anyway; if it's borked, it will only hurt you if you're a developer of high-end server apps.
Thanks. I guess I'll wait a couple of months and check the Intel site for a doc with the errata for desktop Broadwell.
 
And yes, by the way: since the only cost of the 28W options is better cooling (power consumption at idle should stay the same), we do need more laptops with those.

At least for the Haswell parts, the 28W chips have higher C-state (close to idle) power use than the 15W ones, so that's not true.
 
Since Broadwell allegedly fits all current socket 1150 motherboards (provided the appropriate firmware has been flashed), I've been thinking I should still upgrade my current rig once given the chance, rather than waiting even longer and spending a huge chunk of money on an entirely new PC, when this one could serve me well for a while longer. Are there any signs that Broadwell-C chips are heading out into the retail channel yet? I've not seen anything myself so far...

Cheers! :D
 
Broadwell-C really is a very nice upgrade option for existing 1150 boards, and that doesn't happen often lately, to be honest, with Intel's barrage of sockets. The situation is very similar to my decision to prolong the life of my old LGA1366 workstation by throwing in a dirt-cheap 6-core Xeon, and meanwhile spending the rest of the budget on a new high-end IPS monitor that I badly needed.
 
Ultimately, for games and consumer workloads, frequency is still far more important than the number of cores, so it's a bad trade-off to add more cores and lower the frequency.

While that's true, it's hard to shake the feeling of paying for a lot of extra transistors we don't need. That's only going to get worse as the iGPU consumes more of the die.

There probably isn't a lot of incentive for Intel to sell "pure" quad cores for the consumer market though.
 
While that's true, it's hard to shake the feeling of paying for a lot of extra transistors we don't need. That's only going to get worse as the iGPU consumes more of the die.

There probably isn't a lot of incentive for Intel to sell "pure" quad cores for the consumer market though.

It's not just the transistors the customer pays for. The unique masks, design, engineering, marketing, and production costs are also paid for. The integrated client chips have those costs spread out over a broader set of markets. Some, like the business and mobile markets, appreciate the GPU very much.

The possibility exists that a desktop consumer could wind up paying more for just the transistors they need, rather than accepting that the price of having the whole business and mobile markets subsidize their purchase is a few mm² of silicon they don't interact with.
 
my decision, to prolong the life of my old LGA1366 workstation, by throwing in a dirt cheap 6-core Xeon
Where did you source that cheap 6-core Xeon? My older PC is a socket 1366 machine, and it could be fun to tinker with it again, provided it doesn't cost too much money. :) It could help me buff my Folding@home standing if nothing else! ;)

Don't you need a newer chipset for that? I have a Z87 chipset, so IIRC I can't upgrade to Broadwell.
The wary would ordinarily do well to assume that, considering past Intel history! However, according to Anandtech:
The two ‘C’ models will be socketed LGA parts, meaning that with a BIOS upgrade should be compatible in all Z87 and Z97 motherboards.
...So it looks like we're in luck! :D
 
It's not just the transistors the customer pays for. The unique masks, design, engineering, marketing, and production cost are also paid for. The integrated client chips have that cost spread out over a broader set of markets.
Yeah, while I get the sentiment and all, my response to the whole "it feels bad to waste all these transistors" is kind of leaning towards "you let us worry about that" :) Get whatever processor is most appropriate for you (between Devil's Canyon, Haswell-E and these new Broadwell chips, I'd argue you have a good range of choices at the major design points) if you feel the price is justified by what you get out of the parts of it that you use. Let economics work out the rest :)
 
Take it with a grain of salt. It may be that there are certain authors on Anandtech that are prone to making assumptions and not labeling them as such.
That's quite an accusation. ;) The review has been up for a couple of days now and the quote in question still stands. If it were inaccurate, you'd think someone would have pointed it out by now so they could fix it. ;) Also, AT should be a big enough operation to have a fact-checking editor... *shrug*
 
From where did you source that cheap 6-core xeon? My older PC is a socket 1366, and it could be fun to tinker with it again, provided it doesn't cost too much money. :) It could help me buff my folding@home standing if nothing else! ;)

Just search for it on eBay. I got mine from the local classifieds.

The price varies over a wide range. The lowest I've stumbled on was $60, no warranty, just a 3-day money-back. Mine was pulled from a small-scale corporate mail server.
 