Intel Gen9 Skylake

GT4e in particular is interesting in that it gets into roughly the performance territory of the Xbox One and thus games targeted at console minimums will generally work well on it too.

And don't get me started on external graphics - it seems cool in your head, but in practice it's really pointless vs. just having a desktop PC the size of a bread box. dGPUs are a significant portion of the cost and power use of a modern gaming PC, so it makes little sense to slave them to some laptop that has been optimized to be portable and light vs. just having two separate PCs.

Also the Surface Pro 4 i7 w/ Iris graphics (GT3e) is awesome. Quite fast and great cooling solution. Expensive, but you really can't do much better in the form factor currently.


I disagree about the external GPU. Say I want a laptop that's light, quiet and power efficient, but I want to play games sometimes. My options are either to buy a gaming laptop, which is heavy and huge, or to buy an appropriate laptop and a second desktop PC. Much easier to put a GPU in a case and use it as a dock. You're only ever going to use a gaming GPU somewhere you can plug your laptop in anyway. Right now, I have a MacBook with a quad-core mobile i7 and GT3e. It's great. Being able to dock it with an external GPU would be fantastic.

Now, an NUC-type mini computer with GT4e is a pretty interesting option. Basically a full Windows PC with Xbox One-class performance (in D3D12 games). For me, it comes down to price: which option gives me the best convenience and performance at an affordable price, considering I already have a nice laptop?
 
Much easier to put a GPU in a case and use it as a dock. You're only ever going to use a gaming GPU somewhere you can plug your laptop in anyway.
No, it's more complicated and more annoying to do it that way vs. just having the "dock" be a computer too. You're sacrificing performance and paying more for "gaming" props in your laptop, which will barely compete with fairly low-end desktops (in most cases the "thin and light" machines people pair these with have 15W CPUs, but it holds even with a 47W one).

*Far* better to just get a nice ultraportable for a laptop and a proper desktop for a desktop. Faster, more robust *and* cheaper than all the options out there for external GPUs. As a bonus you have two computers that don't need to be tethered to one another and can be used simultaneously...

Like I said, naively it seems like an interesting option but when you actually work out the practical reality, pricing, performance, etc. it's just a stupid plan vs. having a full mini-ITX PC in a similar amount of space. Laptop hardware has absolutely nothing to add to a proper desktop PC.
 
*Far* better to just get a nice ultraportable for a laptop and a proper desktop for a desktop.

And far better than a proper $1000 desktop is a $4000 enthusiast desktop. But that doesn't mean there isn't plenty of room for something in between.

You'll never take a desktop out of your home.

A laptop + discrete GPU in a separate box can be put in a backpack and you can take it anywhere within a car trip's reach. Plus you don't have the compromise of having to carry a big laptop when you know you're not going to play games (work, school, etc.).


It's not like external GPUs for laptops are going to kill the desktop. When Thunderbolt v3+ becomes standard through USB-C, it might make a dent in thick and heavy gaming laptops, though. I know I'd probably never get such a laptop again for myself.
 
Laptops with a secondary GPU, even that weird case of a tablet PC with a GPU in the keyboard, all have integrated software/firmware to deal with that, e.g. NVIDIA Optimus and AMD Switchable Graphics.

What happens when you plug in an external GPU that isn't covered by that support? I don't know, although Windows 7 and up does have the ability to load a graphics driver on the fly. Maybe you'll make do with some weird freeware helper that runs in the system tray.
On Linux, you would have to reboot, or at the very least restart the whole display stack and lose the current graphical session - until things improve on that front, if they ever do.

What if you plug some arbitrary GPU in there, like a Fermi Quadro?
Will Thunderbolt act as a DisplayPort input on the laptop? I doubt it (though that could be a cool feature in the long run).

All in all, external GPUs could be a feature for geeky users, but I can see why they might not be pushed to the general public.
Nothing will stop us from using them, though - heck, if we want to attach a big expansion box with serial-port cards and whatnot to a netbook- or NUC-sized PC, I guess we can.
 
Will Thunderbolt act as a DisplayPort input on the laptop? I doubt it (though that could be a cool feature in the long run).

Yes, at least on the Razer solution, the external GPU can also drive the internal display on the laptop. It even supports it as part of a multi-monitor layout. And it should be able to switch with Optimus.

So yeah: get home and plug in a single Thunderbolt 3 cable for power, an external GPU, some USB 3.1 devices and Gbit (or even 10Gbit) Ethernet to the home LAN?
I love it.

Now I just need a Skylake Mac with a fast 4/8 core CPU to materialize. A laptop or even a Mac Mini, I'm game.
 
Yeah, I'm quite interested to see what the Surface Pro 4 i7 with Iris can do. Waiting to see a website do some benches on it.

Regards,
SB



From the available benchmarks we have (GFXBench, 3DMark 11, 3DMark 2013, BioShock Infinite), Iris 540 is roughly twice as fast as HD Graphics 5500 (the 15W GT2 from Broadwell).
 
The SP4 i7 setup is particularly great, as they did a really good job on cooling and thermals. It's basically allowed to use up to 25W depending on the chassis temperature. In practice it tends to sit around 18W, but if you throw a fan across the back of it you can actually sustain 25W all the time, which is better than the timed PL2 setup on previous designs.
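The difference between the two schemes can be sketched as a toy model. The 18 W/25 W figures come from the post above; everything else (the PL1 default, the burst window, the temperature cap) is an illustrative assumption, not a real firmware value:

```python
# Toy model of Intel RAPL-style package power limiting.
# Classic designs allow PL2 (burst) only for a fixed time window tau,
# then clamp to PL1 (sustained). The SP4 i7 instead keys the limit off
# chassis temperature, so a well-cooled unit can sustain the burst level.
# All defaults are illustrative, not real firmware values.

def timed_power_limit(t_seconds, pl1=15.0, pl2=25.0, tau=28.0):
    """Timed PL2: burst power for tau seconds after load starts, then PL1."""
    return pl2 if t_seconds < tau else pl1

def thermal_power_limit(chassis_temp_c, pl1=18.0, pl2=25.0, temp_cap_c=45.0):
    """SP4-style: sustain PL2 as long as the chassis stays cool enough."""
    return pl2 if chassis_temp_c < temp_cap_c else pl1

print(timed_power_limit(5))     # inside the burst window
print(timed_power_limit(60))    # window expired, back to PL1
print(thermal_power_limit(40))  # a fan across the back keeps it bursting
```

The point of the thermal scheme is the last line: the limit never times out, it only drops when the chassis actually gets hot.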
 
Also the Surface Pro 4 i7 w/ Iris graphics (GT3e) is awesome. Quite fast and great cooling solution. Expensive, but you really can't do much better in the form factor currently.
But therein lies the problem, right? Almost nobody will have GT3e graphics, because a device like that is amazingly expensive. Budget-wise it is far better to get something with a GM107 and a decent mobile i3 or i5. As for competing in the console market, Intel could dominate with a GT3e NUC at ~$450, but they will never do it. I think in the long run it would be okay to sacrifice margins on a device like that to get some real midrange market penetration on graphics (Intel graphics = suck to almost any gamer; no matter how power efficient they are, the performance just isn't there outside of a multi-thousand-dollar Surface Pro). If you're gonna spend >$500, folks will just get a real PC or console.

Intel is fighting an uphill battle against NVIDIA wrt graphics, so they need to offer their products at decent prices if they ever want word of mouth to spread that not all Intel graphics suck ass.
 
From the available benchmarks we have (GFXBench, 3DMark 11, 3DMark 2013, BioShock Infinite), Iris 540 is roughly twice as fast as HD Graphics 5500 (the 15W GT2 from Broadwell).
Since the bigger GPUs were strongly bandwidth limited on previous generations, I suspect that framebuffer compression as well as faster main memory in SKL will help them even more than in BDW/HSW.
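The bandwidth side is easy to sanity-check: peak DRAM bandwidth is just channels × 8-byte bus width × transfer rate. A quick sketch comparing typical Broadwell-era DDR3 against Skylake's DDR4 (the eDRAM on GT3e/GT4e parts sits on top of this; Haswell's Crystalwell was quoted at ~50 GB/s each way, and I'm assuming Skylake is in the same ballpark):

```python
# Theoretical peak bandwidth of a DDR memory interface:
#   channels x bus width (bytes) x transfer rate (MT/s)
def ddr_bandwidth_gbs(channels, mts, bus_bytes=8):
    return channels * bus_bytes * mts / 1000.0  # GB/s

# Dual-channel DDR3-1600 (typical Haswell/Broadwell laptop)
print(ddr_bandwidth_gbs(2, 1600))  # 25.6 GB/s
# Dual-channel DDR4-2133 (Skylake)
print(ddr_bandwidth_gbs(2, 2133))  # ~34.1 GB/s
```

So the memory upgrade alone is worth roughly a third more bandwidth, before framebuffer compression or eDRAM enter the picture.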
 
To further my point, do you think NVIDIA would have the great rep on graphics that it does if only their absolute highest end parts were capable of solid gaming? I think not. This is the position Intel has forced itself into. They don't even offer good graphics on the desktop and I promise there is a market for it. An i5 with GT3e would sell like bananas to a rich monkey.
 
To further my point, do you think NVIDIA would have the great rep on graphics that it does if only their absolute highest end parts were capable of solid gaming? I think not. This is the position Intel has forced itself into. They don't even offer good graphics on the desktop and I promise there is a market for it. An i5 with GT3e would sell like bananas to a rich monkey.

Yeah, I've kind of been hoping that Intel would offer a package that's competitive with the PS4. They'd need something more powerful than GT4e to do that. It seems they only pair GT3e and GT4e with i7s, but an i5 option would be nice. Not sure what the difference in die size and power consumption between the i7 and i5 is.
 
There have been desktop and mobile Core i5 models with GT3e since Broadwell (see the Core i5-5350H). And if Intel's own ARK prices are anything to go by, for Skylake the price difference between a GT2 and a GT3e CPU is getting really small ($23 between a 6200U and a 6260U).

I don't really understand why the laptop OEMs keep pumping out only GT2 models; it doesn't make much sense. Even the Surface Pro 4 should have an i5 + GT3e option, IMO.
Perhaps it's a matter of availability.


As for PS4-level GPU performance in an APU, I don't think it's coming before Cannonlake and the Zen APUs somewhere in 2017. By then, the consoles will be almost four years old, though.
The GT4e actually gets really close to an Xbox One in theoretical performance (1.31 TFLOP/s for the Xbox One vs. 1.152 TFLOP/s for GT4e), though Intel's Gen9 is probably pretty distant from GCN as far as compute and async capabilities go.
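Those theoretical numbers fall straight out of ALU lanes × 2 FLOPs (FMA) × clock. A quick check (the ~1.0 GHz GT4e boost clock below is an assumption picked to reproduce the quoted figure):

```python
# Peak FP32 throughput = FP32 lanes x 2 FLOPs per FMA x clock (GHz)
def peak_tflops(lanes, clock_ghz):
    return lanes * 2 * clock_ghz / 1000.0

# Gen9 GT4e: 72 EUs x 8 FP32 lanes per EU, assumed ~1.0 GHz boost
print(peak_tflops(72 * 8, 1.0))     # 1.152 TFLOP/s
# Xbox One: 12 GCN CUs x 64 lanes, 853 MHz
print(peak_tflops(12 * 64, 0.853))  # ~1.31 TFLOP/s
```

Each Gen9 EU has two SIMD-4 FPUs, hence the 8 lanes per EU; of course, peak FLOP parity says nothing about how the two architectures compare on real shader workloads.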
 
To further my point, do you think NVIDIA would have the great rep on graphics that it does if only their absolute highest end parts were capable of solid gaming? I think not. This is the position Intel has forced itself into. They don't even offer good graphics on the desktop and I promise there is a market for it. An i5 with GT3e would sell like bananas to a rich monkey.
I disagree, and mainly because the broader market has no idea what it needs performance-wise. SKL GT2 is sufficient for playing almost any game at some set of settings - hell, it's faster than some discrete cards that still get sold (mainly in China). And that's really the point: there's strong evidence that the market simply associates "integrated = bad" and "discrete = good" regardless of the relative performance of the various parts. On top of that there's an obvious brand halo effect - i.e. "Titan X is the fastest, so I should go buy a 720M!".

The market is pretty irrational here and I don't think a different SKUing strategy would address it at this point to be honest. Would you and I love to see higher end graphics in a wider variety of parts? Definitely! Could they sell it for any more money than GT2 today? Almost certainly not. Even if it was the same price would you get tons of enthusiasts bitching that they should get more CPU in the same area instead? Guaranteed.

Anyways, I'm not trying to be negative - I obviously agree and would love to see higher-end graphics across a wider range of chips - but right now the reality is that user perception has almost nothing to do with actual performance and everything to do with "integrated vs. discrete", as is evidenced by both AMD's and NVIDIA's marketing and their various successes/failures with their SKUs.

It's worth noting as well that you could easily put an Iris 15W chip in much cheaper machines than the SP4 (and indeed there are several announced/available), but it's really a question of the whole package. For a "premium" device like the SP4 Microsoft did a great job on SKU-specific power management, cooling, etc. For more budget devices you'll likely hit throttling and other stuff just due to OEMs having restricted time/budget for those lower margin designs.
 
I am proposing that Intel take steps to dispel the notion that Integrated Sucks, and that would mean sacrificing a bit on the precious margin. I know margin is important, but so is market penetration and consumer perception.

As for enthusiasts bitching about wanting more cores instead of better graphics:
A) X99 offers just that.
B) Intel should really start thinking about offering >4 cores on their mainstream socket, and even more cores on the big socket. There is more than enough room for SKUs with better graphics + eDRAM (or maybe it's SRAM now, I dunno) vs. SKUs with more cores, and vice versa. It would be far more useful than the pedantic segmentation we have now with the i5-6400 vs. 6500, etc.

And why in the blazes isn't the large L4 cache offered on desktop? I promise it would sell even to those not using IGP.
 
I am proposing that Intel take steps to dispel the notion that Integrated Sucks
It's far from clear to me that what you're suggesting would help at all in that case. It certainly hasn't for AMD despite them putting fairly large graphics on all of their APUs and selling them for very competitive prices.

It would be far more useful than the pedantic segmentation we have now with i5-6400 vs 6500 etc.
Not really - you take a frequency hit for it, and that'll impact "regular" workloads. E.g. HSW-E runs games slower than Devil's Canyon. I'm also not convinced there are enough commonly used and compelling workloads out there that are sufficiently multithreaded for people to make that choice (because it ultimately is a trade-off), and the people who do have those workloads can get the -E parts today.

And why in the blazes isn't the large L4 cache offered on desktop? I promise it would sell even to those not using IGP.
It is - there's the 5775C and its ilk. It helps a little in some workloads (WinRAR, IIRC), but it's hardly a panacea for the cost (remember, it's a whole 'nother chip on the same package). The CPU already has a pretty large LLC, and the number of workloads with working sets >8MB but <128MB is fairly limited in the client space.
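That working-set argument can be made concrete with a toy average-latency model. The 8 MB LLC and 128 MB eDRAM sizes come from the discussion above; the latency numbers are illustrative assumptions, not measurements:

```python
# Toy memory-latency model: an eDRAM L4 only pays off when the working
# set overflows the 8 MB LLC but still fits in the 128 MB L4.
# Latency numbers (cycles) are assumed for illustration.
LLC_MB, L4_MB = 8, 128
LAT_LLC, LAT_L4, LAT_DRAM = 40, 60, 180

def avg_latency(working_set_mb, has_l4):
    if working_set_mb <= LLC_MB:
        return LAT_LLC          # fits in the on-die cache either way
    if has_l4 and working_set_mb <= L4_MB:
        return LAT_L4           # the band where eDRAM actually helps
    return LAT_DRAM             # too big: off to main memory

for ws in (4, 64, 512):
    print(ws, avg_latency(ws, has_l4=False), avg_latency(ws, has_l4=True))
```

Only the middle row changes with the L4 present, which is exactly the ">8MB but <128MB" band the post describes as rare in client workloads.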

I'm playing devil's advocate as I noted in my previous post, but my point is that the options really are there at the moment, yet people are still typically best served by the standard i5's and i7's. Beyond "I want more for free!" and pretending there are no performance trade-offs, I'm not sure the perception issues are easily overcome via SKU strategy. People ultimately *don't want* to understand the subtleties of reality and instead just prefer broad stroke brand loyalty.

Anyways getting a bit off-topic and I've said my bit :)
 
It is - there's the 5775C and its ilk. It helps a little in some workloads (WinRAR, IIRC), but it's hardly a panacea for the cost (remember, it's a whole 'nother chip on the same package). The CPU already has a pretty large LLC, and the number of workloads with working sets >8MB but <128MB is fairly limited in the client space.

The Tech Report found that stock Broadwell-C was surprisingly competitive with stock Skylake-K in gaming situations when judged through the lens of frame-time consistency. Granted, it is a slim lead, and CPU performance never really provides stunning leaps in gaming performance. The slim lead is likely within the review's margin of error, but nothing looked egregiously out of whack, so I give it a pass in that respect.
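For reference, the frame-time-consistency metrics The Tech Report uses are simple to compute. A sketch with made-up sample data (the 16.7 ms budget corresponds to 60 fps):

```python
# Frame-time consistency metrics in the Tech Report style:
# the 99th-percentile frame time, and the total time spent beyond
# the 60 fps budget of 16.7 ms. Sample data below is made up.
def percentile_99(frame_times_ms):
    xs = sorted(frame_times_ms)
    # with small samples this simply lands near the worst frame
    idx = min(len(xs) - 1, int(len(xs) * 0.99))
    return xs[idx]

def time_beyond_ms(frame_times_ms, budget=16.7):
    return sum(t - budget for t in frame_times_ms if t > budget)

frames = [15.2, 16.0, 15.8, 33.4, 16.1, 15.5, 40.0, 16.3, 15.9, 16.0]
print(percentile_99(frames))             # dominated by the two spikes
print(round(time_beyond_ms(frames), 1))  # total ms over the 60 fps budget
```

The appeal of these metrics over a plain average fps number is that the two 30+ ms spikes dominate both results, which is exactly what they feel like in play.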

[Chart: The Tech Report's value vs. gaming performance scatter plot]


So while gaming is only a sliver of the potential use cases for a modern computer, I think it's a decent chunk of the market for these kinds of CPUs - decent enough for Intel to keep putting out these kinds of CPUs at least.
 
Thing is, a quad-core with GT4e will probably suit my gaming needs perfectly (I'm the guy that mostly buys stuff like the HD 2600, HD 4670 and GT 630 (the fast version)).
Even being socketed is probably not a requirement; upgrading an i5 or an i7 does seem silly. Though this only works if there are very few GT4e SKUs - otherwise socketing is what gives you the ability to mix and match motherboard features and CPU power.
However, 45W for four cores + GT4 does seem a bit too little. Not to mention that I would like a custom cooler, enough expansion slots and SATA ports, etc.
 
It might become a commercial reality with the next "half gen", Kaby Lake. Such a desktop version would likely be calibrated for 65 watts, as Broadwell-C and Skylake-S are.
 