Intel Skylake Platform

*AHEM* This is an Intel thread, please keep the AMD talk out of it. You know, for being the self-proclaimed Master Race, the PC warriors are far worse than the console warriors. Now I see why there's talk about handing out vacations on the PC side of things too.
 
No. It's one of the ways of supporting their huge R&D expenses.
In other words, I'm happy that octacores cost $999 instead of $500, because it probably means that the quadcore I want costs $200 instead of $250.
Intel released mainstream quad-cores almost a decade ago on a 65nm process. Today Intel has 18-core CPUs and will release 28-core CPUs in the next generation. Quad-cores should be dirt cheap.
 
Intel released mainstream quad-cores almost a decade ago on a 65nm process. Today Intel has 18-core CPUs and will release 28-core CPUs in the next generation. Quad-cores should be dirt cheap.

But they are not. I don't want a quad-core processor for $200 instead of $250, I want an 18-core for $500-600 instead of $2000.

*AHEM* This is an Intel thread, please keep the AMD talk out of it. You know, for being the self-proclaimed Master Race, the PC warriors are far worse than the console warriors. Now I see why there's talk about handing out vacations on the PC side of things too.

BRiT, what is your opinion about Skylake? Do you like it? Do you have something to share about it, perhaps we haven't heard yet?
 
What's your definition of dirt cheap? You can get an unlocked i5-4690K for $238 on Amazon today, which is the very top of the i5 line. It will crush nearly all of the competition from AMD outside of a scant few heavily multithreaded applications at stock clocks, and it will overclock to 4.5GHz on the cruddy OEM cooler.

At that price and capability, I cannot fathom why you need anything else. If "four real cores" is your only metric, then you're stuck buying an A8 from AMD, which is going to skew your complaint about dirt cheap...

And then you're going to lose on IPC, power consumption and memory access latency. To some degree you are getting what you pay for.

To whoever wanted to blame "greed" for the pricing of multi-zillion-core sockets, please tell us all the name of a competitor who is doing anything similar at a lesser price.

I will not be holding my breath for that list...
 
What's your definition of dirt cheap? You can get an unlocked i5-4690K for $238 on Amazon today...
If Intel had mainstream quad-cores at 65nm almost a decade ago, AMD had six-cores at 45nm for $199 half a decade ago, and today even a $35 Raspberry Pi 2 is quad-core, then at 22nm and 14nm my definition of "dirt cheap" is "Celeron cheap".
Your own words indicate that Intel's mainstream prices have much more to do with lack of competition than anything else.
 
If Intel had mainstream quad-cores at 65nm almost a decade ago, AMD had six-cores at 45nm for $199 half a decade ago, and today even a $35 Raspberry Pi 2 is quad-core, then at 22nm and 14nm my definition of "dirt cheap" is "Celeron cheap".
Your own words indicate that Intel's mainstream prices have much more to do with lack of competition than anything else.
If you are going to lump a mobile ARM chip into your discussion as an example of a "dirt cheap" quad-core and somehow hold that against Intel, then to be fair you should be comparing against the Atom line, which has delivered quad-cores for two generations now.

http://ark.intel.com/products/80274/Intel-Atom-Processor-Z3735F-2M-Cache-up-to-1_83-GHz

If you're going to play that game, then there's a $17 MSRP tray price for a quad-core x86 part, released back in Q1 of 2014. This link on the ARK shows all the Bay Trail series; none of the Atom line exceeds $50 at a cursory glance:
http://ark.intel.com/products/codename/55844/Bay-Trail#@All

Edit: If you're willing to add $3 MSRP to the Z3735F Bay Trail I pointed out above, you can get a current-gen Z8300 Cherry Trail with a 10% drop in TDP, a 20% increase in main memory bandwidth, and probably a 50% gain in graphics capability. Both are 64-bit capable, and given the built-in PCI-E, would (in pure performance) annihilate the ARM quad you'd put it up against. I'm sure you can source a cheaper ARM, but the race to the bottom isn't interesting to me.
 
Your arguments don't challenge the fact that quad-cores are not special or expensive anymore, except in Intel's desktop lineup, where quad-cores are sort of high-endish despite the fact that Intel can put 20x more stuff per chip than it could a decade ago, when its first mainstream quad-core was released.
Quad-core is the sweet spot nowadays on the desktop, but since Bulldozer sucks and AMD is faltering, Intel still wants to force you into paying almost $200 for the cheapest quad-core when even Skylake Celerons should be quad-core in this day and age.
 
quad-core is the sweet spot nowadays on the desktop
You will need to defend this with data. Just as a thought experiment, you must realize that the super-majority of the consumer computing space is working in the Microsoft Office suite (Outlook, Word, Excel, OneNote, PowerPoint, Project, Visio) or some other open-source version of the same. When those consumers aren't using local apps, they're using browser interfaces that are probably server-side execution with little Java applets. Even when they are gaming, the largest portion of that gaming is browser-based gaming, not locally installed, multi-gigabyte powerhouse games like we would talk about here on B3D.

All of the above items are lightly threaded at best. Consider that a desktop Core i3 can run (up to) four threads, which means it is likely more CPU than most of the population really needs.

but since Bulldozer sucks and AMD is faltering, Intel still wants to force you into paying almost $200 for the cheapest quad-core when even Skylake Celerons should be quad-core in this day and age.
If you're intent on making this a "Cry about how Intel makes us buy expensive quad cores" conversation, then go make your own thread to do it.
 
Your arguments don't challenge the fact that quad-cores are not special or expensive anymore, except in Intel's desktop lineup, where quad-cores are sort of high-endish despite the fact that Intel can put 20x more stuff per chip than it could a decade ago, when its first mainstream quad-core was released.
Quad-core is the sweet spot nowadays on the desktop, but since Bulldozer sucks and AMD is faltering, Intel still wants to force you into paying almost $200 for the cheapest quad-core when even Skylake Celerons should be quad-core in this day and age.

Hopefully DX12 will start showing better scaling with multiple cores, which may force a "core arms race" between AMD (with Zen) and Intel.

I absolutely agree that we're being short-changed with quad-core CPUs in 2015, when it's possible to fit twice that many cores on a die without pushing boundaries, even for mainstream chips. I find the 8-core "enthusiast range" equally rubbish. Fair enough, there's an argument to be made that 4 cores are all the "mainstream" (non-gamer) market requires, but the E range is supposed to be aimed at enthusiasts, mainly gamers. And yet we're still only getting 8 cores when twice that number should be easily feasible without an iGPU.
 
But why? What is this fascination with tons and tons of CPU cores?

What on this green earth are you doing with that kind of compute resource? There are incredibly few, and incredibly niche, applications that can use "lots of cores." At some point, sanity has to prevail. Yes, there are chips out there that provide 18 cores / 36 threads in a single socket, but you've now started oversubscribing the I/O interfaces to that socket. Only very particular workloads make sense to place in a system with such low I/O capacity relative to compute.

Yes, you can make the academic argument about "lots of cores have been around forever, why aren't they on the desktop?!?" Until you can find something that would actually use those resources, it's pointless.
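To put a rough number on that (back-of-envelope only, and the figures below are my assumptions, not measurements): an 18-core Haswell-EP socket feeds all of those cores from four DDR4-2133 channels, roughly 68 GB/s aggregate, so each core's fair share is about 68 / 18 ≈ 3.8 GB/s. A mainstream dual-channel DDR3-1600 quad gets about 25.6 / 4 ≈ 6.4 GB/s per core. The more cores you hang off the same pins, the thinner each one's slice of memory and PCIe bandwidth gets, which is exactly the oversubscription I'm talking about.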
 
But why? What is this fascination with tons and tons of CPU cores?

What on this green earth are you doing with that kind of compute resource? There are incredibly few, and incredibly niche, applications that can use "lots of cores." At some point, sanity has to prevail. Yes, there are chips out there that provide 18 cores / 36 threads in a single socket, but you've now started oversubscribing the I/O interfaces to that socket. Only very particular workloads make sense to place in a system with such low I/O capacity relative to compute.

Yes, you can make the academic argument about "lots of cores have been around forever, why aren't they on the desktop?!?" Until you can find something that would actually use those resources, it's pointless.

I'd argue the reason most applications don't use more than 4 cores is that we don't have CPUs with more than 4 cores in the mainstream. Pretty much all games these days make good use of 4 cores, and most scale to some extent to 8. I've little doubt that if 8 cores were mainstream now (and had been for the last few years) then we'd be seeing pretty good scaling on them in lots of applications.
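
For what it's worth, this kind of scaling is easy to sanity-check yourself. Below is a throwaway sketch (my own toy code, not from any real game or benchmark suite) that times the same embarrassingly parallel busy-loop with 1, 2, 4 and 8 threads; on a quad it will show close to 4x at four threads, which is the best case that real, less evenly divisible workloads only approach.

Code:
// scale_test.cpp -- toy thread-scaling check (hypothetical example, not a real benchmark)
// build: g++ -O2 -std=c++11 -pthread scale_test.cpp -o scale_test
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Burn CPU on one slice of a dummy, perfectly parallel workload.
static void work(std::size_t begin, std::size_t end, double* out) {
    double acc = 0.0;
    for (std::size_t i = begin; i < end; ++i)
        acc += static_cast<double>(i % 1000) * 1e-3;
    *out = acc;  // keep the result alive so the loop is not optimized away
}

int main() {
    const std::size_t N = 200000000;            // total iterations, split across threads
    const unsigned counts[] = {1, 2, 4, 8};     // thread counts to try
    for (unsigned threads : counts) {
        std::vector<std::thread> pool;
        std::vector<double> results(threads, 0.0);
        const std::size_t slice = N / threads;
        auto t0 = std::chrono::steady_clock::now();
        for (unsigned t = 0; t < threads; ++t)
            pool.emplace_back(work, t * slice, (t + 1) * slice, &results[t]);
        for (auto& th : pool)
            th.join();
        auto t1 = std::chrono::steady_clock::now();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
        std::printf("%u thread(s): %lld ms\n", threads, static_cast<long long>(ms));
    }
    return 0;
}

Swap the busy-loop for anything with shared state and that near-linear line flattens fast, which is where the real engineering effort in games goes.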
 
Is anyone else frustrated by the near-complete lack of uarch information released on Skylake? It's still supposed to be coming out a mere few weeks from now, isn't it? When was the last time Intel was this close to a CPU release with so little disclosed about what they've improved? Especially for a "tock." That old rumor that Skylake is a huge uarch revamp (on par with Prescott to Conroe, no less) is looking more and more dubious.

I suppose they could be more secretive now that more competitors with relevant CPU designs have entered the fray.
 
You will need to defend this with data. Just as a thought experiment, you must realize that the super-majority of the consumer computing space is working in the Microsoft Office suite (Outlook, Word, Excel, OneNote, PowerPoint, Project, Visio) or some other open-source version of the same. When those consumers aren't using local apps, they're using browser interfaces that are probably server-side execution with little Java applets. Even when they are gaming, the largest portion of that gaming is browser-based gaming, not locally installed, multi-gigabyte powerhouse games like we would talk about here on B3D.

All of the above items are lightly threaded at best. Consider that a desktop Core i3 can run (up to) four threads, which means it is likely more CPU than most of the population really needs.
The majority of people also don't need 3GHz CPUs, 8GB of RAM or a 1TB HDD, but they have them because they come by default and they're cheap. Their video requirements are very basic, and yet Intel spends a huge chunk of the silicon die on an IGP that is overkill.

There are Excel benchmarks at the usual sites showing that some Office functions do benefit from quad-cores. Web browsers are increasingly multithreaded, some people keep dozens or hundreds of tabs open, and some sites and web apps can be demanding. People often multitask with many apps running simultaneously. Media consumption is increasing, and some people need to transcode their videos to put them on their phones and tablets.

A few years ago I used to dismiss quad-cores, saying that dual-cores were the sweet spot since most apps were single-threaded. Despite the fact that the situation hasn't greatly changed, I still think it has improved enough to justify quad-cores as the new sweet spot on the desktop. And if dual-cores are better than single-cores for running single-threaded programs, then quad-cores are better than dual-cores for running dual-threaded programs. We don't really need to max out all four cores to justify a quad-core.
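
To make the multitasking point concrete, here's another toy sketch along the same lines (again my own hypothetical code, not a measurement of any real app): two independent jobs, each using only two threads, run first back-to-back and then at the same time. On four or more cores the concurrent run finishes in roughly the time of a single job, while a dual-core has to spread the same work over half the hardware; neither job ever needed four cores by itself.

Code:
// multitask.cpp -- toy demo: two 2-thread jobs, sequential vs concurrent
// build: g++ -O2 -std=c++11 -pthread multitask.cpp -o multitask
#include <chrono>
#include <cstdio>
#include <thread>

static volatile double sink;  // defeat dead-code elimination

// One "application": two worker threads grinding through a fixed busy-loop.
static void dual_threaded_job() {
    auto worker = [] {
        double acc = 0.0;
        for (long i = 0; i < 400000000L; ++i)
            acc += static_cast<double>(i) * 1e-9;
        sink = acc;
    };
    std::thread a(worker), b(worker);
    a.join();
    b.join();
}

static long long run_ms(void (*fn)()) {
    auto t0 = std::chrono::steady_clock::now();
    fn();
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();
}

int main() {
    // Run job A, then job B, one after the other.
    long long sequential = run_ms(dual_threaded_job) + run_ms(dual_threaded_job);

    // Run job A and job B at the same time -- with 4+ cores this should
    // take roughly as long as a single job.
    auto t0 = std::chrono::steady_clock::now();
    std::thread a(dual_threaded_job), b(dual_threaded_job);
    a.join();
    b.join();
    auto t1 = std::chrono::steady_clock::now();
    long long concurrent =
        std::chrono::duration_cast<std::chrono::milliseconds>(t1 - t0).count();

    std::printf("sequential: %lld ms, concurrent: %lld ms\n", sequential, concurrent);
    return 0;
}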

If you're intent on making this a "Cry about how Intel makes us buy expensive quad cores" conversation, then go make your own thread to do it.
If you're intent on making this a "let's praise Intel and their monopoly margins because they know what we users need" conversation, then go make your own thread to do it.
 
If you're intent on making this a "let's praise Intel and their monopoly margins because they know what we users need" conversation, then go make your own thread to do it.
This thread is about Skylake, not your misperception of the usefulness of a thousand cores.
 
I'd argue the reason most applications don't use more than 4 cores is that we don't have CPUs with more than 4 cores in the mainstream. Pretty much all games these days make good use of 4 cores, and most scale to some extent to 8. I've little doubt that if 8 cores were mainstream now (and had been for the last few years) then we'd be seeing pretty good scaling on them in lots of applications.
Good is relative, and non-trivial parallel programming is not easy. Also, go beyond games and you'll find an even more barren landscape; there aren't throngs of apps that show linear scaling or are crying out for more cores in the consumer space. The cost/benefit just isn't there for software development houses at the moment (AFAICS). Whether some up-and-coming thing will change this (which would keep the wheels of commerce turning) remains to be seen.
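
Amdahl's law is the short version of why that cost/benefit stays stubborn. A tiny sketch (the 80% parallel fraction below is made up purely for illustration) shows how quickly the curve flattens: speedup = 1 / ((1 - p) + p / n), so with p = 0.8 you get roughly 2.5x on four cores, 3.3x on eight and only 4x on sixteen.

Code:
// amdahl.cpp -- quick Amdahl's law table (illustrative numbers only)
// build: g++ -O2 -std=c++11 amdahl.cpp -o amdahl
#include <cstdio>

int main() {
    const double p = 0.80;                 // assumed parallel fraction (made up)
    const int cores[] = {1, 2, 4, 8, 16};  // core counts to tabulate
    for (int n : cores) {
        double speedup = 1.0 / ((1.0 - p) + p / n);
        std::printf("%2d cores: %.2fx\n", n, speedup);
    }
    return 0;
}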
 
This thread is about Skylake, not your misperception of the usefulness of a thousand cores.
We are talking about four (4) cores as the new baseline for desktop Skylake. It's you who are talking about "thousands" or "zillions" of cores on the desktop.
 
Any hints of any L4/Crystalwell Skylake chips in the pipe - especially for the desktop?

I'm not upgrading until I can get my hands on such a beastie...
 
Any hints of any L4/Crystalwell Skylake chips in the pipe - especially for the desktop?

I'm not upgrading until I can get my hands on such a beastie...

The roadmaps that have leaked so far show Skylake below the top Broadwell SKUs. I assume this is because they lack eDRAM. But surely there will be Skylake models with eDRAM at some point.
 
But surely there will be Skylake models with eDRAM at some point.
Yeah, it's the "at some point" which galls me... :D Then again, I bought a new gadget recently, so maybe it's not time to splurge on the guts of an entirely new PC just yet, especially as the current one is perfectly serviceable and there aren't any new GPUs worth upgrading to for me anyway... When high-end 16nm FF GPUs hit, well, then we're talking.

Bit of a first-world problem though, isn't it, to want to shop but not have anything worth shopping for?
 