Intel Broadwell (Gen8)

I did notice the context of the initial statement.
I was simply expressing my amazement; indeed, the problem is more one of priority (i.e. management/strategy/vision) than of resources.
 
Just had a Broadwell Pentium (3805U) laptop in my hands.
I was surprised to find it only supports OpenCL 1.2, while the i3 variants support 2.0.
I wonder whether that's due to segmentation or because GT1 is on a different die or something.
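
For what it's worth, here's a minimal sketch of how to check what the driver actually reports (assumes the OpenCL headers and an ICD are installed; picking the first platform and first GPU device is just for illustration):

    /* Minimal OpenCL version query - compile with e.g. gcc query.c -lOpenCL */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platform;
        cl_device_id device;
        char version[256];

        if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS)
            return 1;
        if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS)
            return 1;

        /* CL_DEVICE_VERSION is a string like "OpenCL 1.2" or "OpenCL 2.0" */
        clGetDeviceInfo(device, CL_DEVICE_VERSION, sizeof(version), version, NULL);
        printf("Device reports: %s\n", version);
        return 0;
    }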
 
They really pushed forward with GPU performance and the eDRAM does some nice things for CPU performance as well. It's going to be interesting to see what this does to NV in the notebook market.
 
I'm really looking forward to Intel Skylake. GT4e should have 72 EUs, on top of some significant architectural changes. This Iris Pro 6200 looks pretty good though. That's a pretty power-friendly integrated GPU with decent performance. It should look even better with DX12 on Windows 10.
 
Yeah, I don't understand why Apple won't put eDRAM chips in their 13" line; the screen resolution is only marginally lower than the 15" MacBook's, so it could really use the extra graphics performance. Besides, CPU performance and battery life would both get a boost from it.
 
Nice. Now we just need to wait for the mobile version.

Unfortunately the brand new Retina MacBook Pro 13" doesn't have eDRAM. Would have been an instabuy for me :). And the new 15" version is still based on Haswell and sports an old (renamed) GCN 1.0 GPU :(

I'm interested in a mini PC with a quad-core i5 and an Iris Pro GPU in the Skylake timeframe. Ideally, a Surface Pro 4 would be great, but I don't think Iris Pro will ever make its way into that device because of power.
 
Not sure if it's a problem with our setup and/or our individual CPU, but with the Gaming 6 board and the UEFI from June 1st, the GT cores draw decidedly more power than would fit within the 65 watt thermal envelope for the whole processor package. It might be that the initial Iris Pro Graphics 6200 performance in our testing is higher than normal.
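
If you want a quick sanity check on package power, here's a rough sketch using the Linux RAPL powercap interface (the intel-rapl:0 path is the usual package domain, but that path and the counter behaviour are assumptions and may differ per system; this is not a calibrated measurement):

    /* Rough package-power estimate from the RAPL energy counter */
    #include <stdio.h>
    #include <unistd.h>

    static long long read_energy_uj(void)
    {
        long long uj = -1;
        FILE *f = fopen("/sys/class/powercap/intel-rapl:0/energy_uj", "r");
        if (!f)
            return -1;
        fscanf(f, "%lld", &uj);
        fclose(f);
        return uj;
    }

    int main(void)
    {
        long long e0 = read_energy_uj();
        sleep(1);                       /* sample over one second */
        long long e1 = read_energy_uj();
        if (e0 < 0 || e1 < 0) {
            fprintf(stderr, "RAPL sysfs node not available\n");
            return 1;
        }
        /* energy_uj is cumulative microjoules, so the 1 s delta is watts */
        printf("Package power: ~%.1f W\n", (e1 - e0) / 1e6);
        return 0;
    }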
 
Btw, is there a way to know how much Iris Pro advantage is more because of the eDRAM or the improved GPU?
 
The Z97A Gaming 6 has a UEFI option to disable the eDRAM multiplier, but the eDRAM continues to function anyway. So no, apparently not until Intel releases a SKU with Iris Graphics 6100 and a 65 watt TDP.
 
Btw, is there a way to know how much Iris Pro advantage is more because of the eDRAM or the improved GPU?
A faster GPU + more bandwidth (eDRAM) is a good combo. Iris 6100 (without eDRAM) shows much smaller gains in benchmarks (this was expected, since the Haswell GPU was already bandwidth-starved and the memory speed remains the same).
 
Problem is, even with 14 nm and the apparent power savings, Iris Pro Graphics seems to be power limited in 65 watt parts.
 
Yeah, you can do that in the UEFI. I haven't tested whether there's a limit, though.
 
Ah, I see. No, not comprehensively yet. I've seen an unconfirmed (read: not thoroughly reproduced) gain of about 14 percent in Luxmark 2.0 Sala from a (way) higher TDP limit.
But of course it strongly profits from higher memory clocks as well as a higher power budget.
 
Problem is, even with 14 nm and the apparent power savings, Iris Pro Graphics seems to be power limited in 65 watt parts.
Are you seeing it drop GPU clocks when running a workload with the CPU ~idle? I've actually seen that far less on BDW than HSW (HSW's turbo clocks were a bit higher to start with). Obviously the CPU can easily eat the entire TDP and more if asked to, but in GPU-only workloads my BDW GT3e 65W tends to stay pegged to max turbo clock much more than my HSW machines did.

Haven't played with configurable TDP at all though so curious what you're seeing.
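
On the clock-drop question, the quickest way I know to watch it is to poll the i915 frequency node while the workload runs. Sketch below assumes a Linux i915 driver and that the iGPU is card0 (both assumptions; adjust to your setup):

    /* Poll the current i915 GPU clock to see whether it drops under load */
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        for (int i = 0; i < 30; i++) {          /* ~30 seconds of samples */
            FILE *f = fopen("/sys/class/drm/card0/gt_cur_freq_mhz", "r");
            if (!f) {
                fprintf(stderr, "i915 sysfs node not found\n");
                return 1;
            }
            int mhz = 0;
            fscanf(f, "%d", &mhz);
            fclose(f);
            printf("GPU clock: %d MHz\n", mhz);
            sleep(1);
        }
        return 0;
    }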

In terms of turning off the eLLC, no, I don't believe there is any consumer way to do that. Intel has published some numbers on how much it helps, though, and the effect is obviously quite large. If you read the Ars review of the i7 BDW NUC you can see that even the Iris 6100 is memory-bandwidth-limited: it scales almost perfectly across the board from DDR3-1600 to DDR3-1866. That's obviously why we have the eLLC there in the first place :)
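
Back-of-the-envelope sketch of why that scaling is plausible (assuming the usual dual-channel, 64-bit-per-channel DDR3 configuration; just peak-rate arithmetic, not a measurement):

    /* Peak dual-channel DDR3 bandwidth: transfers/s * 8 bytes * 2 channels */
    #include <stdio.h>

    static double peak_gbs(double mt_per_s)
    {
        return mt_per_s * 1e6 * 8.0 * 2.0 / 1e9;
    }

    int main(void)
    {
        double bw1600 = peak_gbs(1600.0);   /* ~25.6 GB/s */
        double bw1866 = peak_gbs(1866.0);   /* ~29.9 GB/s */
        printf("DDR3-1600: %.1f GB/s, DDR3-1866: %.1f GB/s (+%.0f%%)\n",
               bw1600, bw1866, (bw1866 / bw1600 - 1.0) * 100.0);
        return 0;
    }

That's roughly a 17 percent bandwidth bump, which lines up with the near-linear gains seen in that review and underlines how bandwidth-bound the GPU is without the eLLC.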
 