Intel Broadwell for desktops

Turbo runs cores at different clocks insofar as some cores are clocked at ~0Hz, and others are clocked >0Hz... :) All cores actually being clocked receive the same clock AFAIK.

I've seen no diagnostic/benchmarking utility that suggests otherwise...
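For what it's worth, one quick way to sanity-check this yourself on Linux (an assumption here; Windows tools like CPU-Z show the same information) is to read the per-core "cpu MHz" values the kernel reports. A minimal sketch:

```python
# Sketch: read the kernel-reported clock of each logical core on Linux.
# Assumes /proc/cpuinfo exists and carries a "cpu MHz" line per core;
# returns an empty dict elsewhere (e.g. on non-Linux systems).
def per_core_mhz(path="/proc/cpuinfo"):
    clocks = {}
    core = None
    try:
        with open(path) as f:
            for line in f:
                key, _, value = line.partition(":")
                key = key.strip()
                if key == "processor":
                    core = int(value)
                elif key == "cpu MHz" and core is not None:
                    clocks[core] = float(value)
    except OSError:
        pass
    return clocks

if __name__ == "__main__":
    for core, mhz in sorted(per_core_mhz().items()):
        print(f"core {core}: {mhz:.0f} MHz")
```

On chips without independent core clocks you'd expect every active core to report the same value, modulo sampling jitter.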
 
Pretty sure Nehalem was the first Intel chip to feature independent core clocks. At least Core 2 Duos did not.
 
I think Core 2 Quad has independent die clocks. The two separate dies seem to do their own thing.
 
I think Core 2 Quad has independent die clocks. The two separate dies seem to do their own thing.
Makes sense. I guess for C2D it wouldn't work because of the shared L2 being in the same clock domain. Though by that logic it could have worked for the Pentium D too, as it also was two completely separate cores not sharing anything at all (be it on one or two dies).
 
Wow, AMD just got taken to school on the integrated graphics front, I wasn't expecting that! Looks like AMD is going to have to up CU counts and bring HBM to their APUs sooner rather than later. I expect 10 CUs with HBM would be enough, but I'd sooner see 12 for console-level performance in an APU.

Incidentally, why isn't AMD using the Tonga IP in its latest APUs? Surely they would benefit hugely from the colour compression tech?
 
Wow, AMD just got taken to school on the integrated graphics front, I wasn't expecting that! Looks like AMD is going to have to up CU counts and bring HBM to their APUs sooner rather than later. I expect 10 CUs with HBM would be enough, but I'd sooner see 12 for console-level performance in an APU.

Incidentally, why isn't AMD using the Tonga IP in its latest APUs? Surely they would benefit hugely from the colour compression tech?
That IP level would not have been available in time for validating Kaveri, which is what lies at the core of their current APUs.
 
Wow, AMD just got taken to school on the integrated graphics front, I wasn't expecting that!
I mean no offense, but I have to ask: in all seriousness, were you not aware that Haswell Iris Pro already did this? I thought it was fairly common knowledge that Iris Pro competes well with AMD's A-series parts, but I've seen a lot of comments similar to yours after the reviews here, so maybe not?
 
I mean no offense, but I have to ask: in all seriousness, were you not aware that Haswell Iris Pro already did this?
There weren't desktop Haswells with Iris Pro, were there?

In regards to Broadwell, color me impressed: 65 W at the benchmarked performance. Both CPU- and GPU-wise, it has me wishing I had held out a year for a new machine.

BTW are the problems with transactional memory (in Haswell) fixed in it?
 
There weren't desktop Haswells with Iris Pro, were there?
There were no socketed (LGA) ones, no. But there were 47 W and 65 W ones (such as the 4770R). I realize it's nice to have socketed ones, and indeed by my understanding Intel doing this is largely a response to consumer demand, but any general notion of whether AMD's iGPUs were better than Intel's (as per the sentiment above) should have considered Iris Pro, socketed or not, no?
 
BTW are the problems with transactional memory (in Haswell) fixed in it?

I suppose it's fixed in Broadwell-E; it was in Haswell-EX.
For desktop and laptop Broadwell that may be a rather low-priority feature. Consumer Broadwell is actually kind of old, perhaps older than Haswell-EX?

Ah, there's even this on Wikipedia:
https://en.wikipedia.org/wiki/Trans...ion_Extensions#cite_note-intel-spec-update-10

source here
http://www.intel.com/content/dam/ww...dates/core-m-processor-family-spec-update.pdf
I'm not sure I understand whether TSX is usable or not: erratum "BDM36" seems to indicate it's broken.
Perhaps we don't need it at all anyway. If it's borked, it will only hurt you if you're a developer of high-end server apps.
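One practical way to tell whether TSX is actually usable on a given chip: when Intel disables it via a microcode update, the CPUID feature bits are cleared, so on Linux the "hle" and "rtm" flags simply disappear from /proc/cpuinfo. A minimal sketch (assumes Linux; reports False for both flags elsewhere):

```python
# Sketch: check whether the kernel exposes the TSX feature flags.
# If a microcode update has disabled TSX (as on early Haswell and
# Broadwell steppings), the CPUID bits are cleared and "hle"/"rtm"
# no longer appear in the flags line. Assumes Linux /proc/cpuinfo.
def tsx_flags(path="/proc/cpuinfo"):
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags = set(line.split(":", 1)[1].split())
                    return {"hle": "hle" in flags, "rtm": "rtm" in flags}
    except OSError:
        pass
    return {"hle": False, "rtm": False}

if __name__ == "__main__":
    print(tsx_flags())
```

Software using TSX is expected to check these bits before issuing transactional instructions, so on a chip where it's fused off you lose the speedup but nothing breaks.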
 
but any general notion of whether AMD's iGPUs were better than Intel's (as per the sentiment above) should have considered Iris Pro, socketed or not, no?
Actually, market segment does matter. An important thing Intel is doing now is bringing GT3 parts to way more laptops.
They have decreased the price premium of GT3 parts by a lot compared to Haswell, there are more GT3 parts now than there were with Haswell, and they begin lower in the stack.
So for practical purposes, Broadwell is where Intel becomes better than AMD for a large part of the market (mainstream laptops).

Broadwell-C is still too expensive for the graphics performance it provides. It is a nice bonus for those who just want a fast and efficient CPU with eDRAM, though. If graphics is what you want, won't Skylake GT4e provide an even bigger boost?
 
Personally, I'd like to see GT3e on an i3 rather than an i5 or i7 for desktop. It makes a lot more sense there, at least for me: it'd be perfect for gaming on my HTPC/server machine. When I move up to a desktop i5/i7 part, I'd be looking for better graphics performance than GT3e can provide. Granted, this may not mesh with the average consumer. But then, is the average consumer going to be getting an i7? And if one does, chances are they'd know enough to want better graphics performance. And if they don't need graphics performance, then they likely wouldn't need the boost that the eDRAM provides.

These desktop Broadwell parts are too close to Skylake for me to be in the market for one, but it's interesting to see the evolution. Can't wait to see what Skylake brings, and hopefully there will be an i3 Skylake version with eDRAM.

Regards,
SB
 
Can't wait to see what Skylake brings, and hopefully there will be an i3 Skylake version with eDRAM.
Yup, I hope so too. GT3 is now far more prevalent among i3 ULT parts; previously, the only GT3 in an i3 ULT part was the sole 28 W option.
So I am hoping that Skylake brings GT4e to dual-core desktop i3 processors.
And yes, by the way: since the only cost of the 28 W options is better cooling (power consumption at idle should stay the same), we do need more laptops with those.
 