Intel Broadwell for desktops

They talk of a potential 10-core desktop version too, which would be pretty cool. Although having to wait until mid-2015 for it isn't.
 
In one way of thinking, there's no significant need for faster processors. I suppose this shouldn't be a surprise, really...
 
In one way of thinking, there's no significant need for faster processors. I suppose this shouldn't be a surprise, really...

Yep - if anything, the only processor I'm really looking forward to is Cherry Trail, and that's only because it should do away with the Bay Trail graphics bottleneck.
 
In one way of thinking, there's no significant need for faster processors. I suppose this shouldn't be a surprise, really...
But surely many people would want the same performance in a much tighter energy and thermal envelope, not only for mobile but also in the rising class of compact desktop "bricks". ;)

Apple actually did this already with its Mini line a while ago; now the rest are following. It's curious that VIA had such an idea a few years back too, but it didn't catch much traction due to the wimpy CPU and IGP options.
 
I see the Haswell Celeron is out in stores with real availability!
That was this year's CPU news (though there will still be cut-down Kaveri parts and a Kabini/Temash refresh).

News sites seem to ignore it. Sure, the Haswell Pentium is better, but you can spend the small price difference on memory, storage and other stuff instead.
Now what's missing is DisplayPort on low-end motherboards (50 euros and less). That's braindead, because the connector is royalty-free and the hardware is worth using for ten years, so a way to plug a 3K or 4K screen into the IGP would be useful.
 
http://www.anandtech.com/show/7875/new-unlocked-iris-pro-cpu-broadwell

[Slide image from the linked article]
 
So the i5-5670K and i7-5770K will have Iris Pro graphics equipped with 128MB of eDRAM? Very nice.
Will the eDRAM help CPU performance?

I hope these chips will have the TIM & IHS improvements that Devil's Canyon chips will have for improved overclocking.

A few years ago AMD had a huge GPU advantage and Intel had a huge CPU advantage. By next year Intel will have just about closed the GPU gap, but the CPU gap has only widened.
 
http://www.anandtech.com/show/6993/intel-iris-pro-5200-graphics-review-core-i74950hq-tested/3

Unlike previous eDRAM implementations in game consoles, Crystalwell is true 4th level cache in the memory hierarchy. It acts as a victim buffer to the L3 cache, meaning anything evicted from L3 cache immediately goes into the L4 cache. Both CPU and GPU requests are cached. The cache can dynamically allocate its partitioning between CPU and GPU use. If you don’t use the GPU at all (e.g. discrete GPU installed), Crystalwell will still work on caching CPU requests. That’s right, Haswell CPUs equipped with Crystalwell effectively have a 128MB L4 cache.

Even if you don't use the Broadwell Gen8 Iris Pro GPU, you still get a gigantic 128MB L4 cache.
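
To illustrate the victim-buffer idea, here is a minimal toy model in Python (a sketch only, not Intel's hardware logic; the capacities, the LRUCache class and the access helper are all hypothetical, and inclusivity/partitioning details are glossed over). The point is simply that lines evicted from the L3 land in the much larger L4, so a later L3 miss can still be served from eDRAM instead of going all the way to DRAM:

from collections import OrderedDict

class LRUCache:
    """Toy LRU cache of cache-line tags (illustrative only)."""
    def __init__(self, capacity_lines):
        self.capacity = capacity_lines
        self.lines = OrderedDict()  # tag -> data

    def lookup(self, tag):
        if tag in self.lines:
            self.lines.move_to_end(tag)  # mark as most recently used
            return self.lines[tag]
        return None

    def insert(self, tag, data):
        """Insert a line; return the evicted (victim) line, if any."""
        self.lines[tag] = data
        self.lines.move_to_end(tag)
        if len(self.lines) > self.capacity:
            return self.lines.popitem(last=False)  # drop the least recently used line
        return None

# Hypothetical capacities in 64-byte lines: 8MB L3, 128MB eDRAM L4.
l3 = LRUCache(8 * 1024 * 1024 // 64)
l4 = LRUCache(128 * 1024 * 1024 // 64)

def access(tag):
    """Model one access: try L3, then L4, otherwise 'fetch' from DRAM."""
    data = l3.lookup(tag)
    if data is not None:
        return data, "L3 hit"
    data = l4.lookup(tag)
    source = "L4 (eDRAM) hit" if data is not None else "DRAM"
    if data is None:
        data = "line@%d" % tag  # pretend we fetched it from DRAM
    victim = l3.insert(tag, data)  # fill the line into the L3
    if victim is not None:
        l4.insert(*victim)  # the L3 victim goes into the L4
    return data, source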
 
No, the plan is to make the GPU much bigger and much improved, but not to use eDRAM on those versions. That makes it similar to AMD APUs.

The news is it will get Iris branding.
 
Ah, my source for that was this thread's very first post, which links to an article that says:

Intel is bringing Iris Pro (GT3) graphics to the desktop with the Broadwell-K CPU line. Expected to arrive towards the end of 2014 or 2015, these new unlocked LGA1150 processors promise over 80 percent more graphics performance than the Core i7-4770K.

80% faster than the 4770K is roughly just catching up with Haswell GT3e and Richland, and thus I immediately concluded that the 5670K/5770K would have the full GPU but no eDRAM L4.
 
Not sure what you are talking about; the slide clearly states that the -K SKUs will get Iris Pro, and Iris Pro means both a larger GPU with more EUs and the 128MB eDRAM L4 cache.
 
Iris Pro is a brand. Where is it stated that eDRAM is used?
So far, Iris means the biggest GPU is used, and "Pro" is the variant that's actually significantly fast.

The "K" CPU will have a higher TDP than mobile ones and this alone will make the GPU performance faster.
 
Yes, under my little theory it would be bandwidth starved. Official DDR3 support would have to be bumped to a 2133 maximum instead of only 1600.
I'm waiting to be proven hilariously wrong; it would be pretty amazing to see that eDRAM on those top gamer parts. But it would come at a high cost.
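
For reference, the rough peak numbers behind that concern, as a back-of-the-envelope sketch (assuming a dual-channel DDR3 interface at 64 bits per channel; the helper function is mine, not anything official):

def ddr3_peak_bandwidth_gbs(transfer_rate_mts, channels=2, bus_width_bits=64):
    """Theoretical peak bandwidth in GB/s for DDR3 at a given MT/s rate."""
    bytes_per_transfer = bus_width_bits // 8  # 8 bytes per channel per transfer
    return transfer_rate_mts * 1e6 * bytes_per_transfer * channels / 1e9

print(ddr3_peak_bandwidth_gbs(1600))  # ~25.6 GB/s for dual-channel DDR3-1600
print(ddr3_peak_bandwidth_gbs(2133))  # ~34.1 GB/s for dual-channel DDR3-2133

So going from 1600 to 2133 buys roughly a third more peak bandwidth, which is why the eDRAM question matters so much for a big IGP.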
 