Is everything on one die a good idea?

I have been thinking about this a lot lately because Linus Torvalds said something along the lines of dGPUs no longer being made ten years from now. I have always thought Intel (but not just Intel) went the wrong path with iGPUs (given the versatility of programmable rendering and how decent its performance could be), so I want to hear what the experts here have to say about them: advantages, disadvantages, and alternatives.
 
Integration is generally accepted as a positive progression because it avoids duplicating certain components between different processors and (co-)processors. A dGPU needs its own RAM, its own power/voltage circuitry, lots of I/O, a multi-layer PCB to house all of that, and so on. A motherboard that only takes an APU is simpler, cheaper and easier to build.


10 years for dGPUs to stop existing is probably a good prediction.

As a downside, we'll have far fewer options to mix and match CPUs and GPUs from different vendors. We'll probably get more differentiated APUs (i.e. a gaming APU line, a business APU line, a professional CAD/design APU line), and each line should have different segments and price points.

For example, nVidia has announced that they'll start licensing their GPUs as IP for SoCs. I wouldn't be surprised if Intel starts selling APUs/SoCs with nVidia iGPUs for their gaming range and keeps their existing iGPU line for business parts only. OTOH, I doubt we'll see an Intel SoC using an AMD iGPU, or an AMD SoC using an nVidia iGPU.
 
10 years for dGPUs to stop existing is probably a good prediction.

Hasn't this prediction been made before, when the first GPU was integrated into a chipset?

And again when the GPU was integrated into the CPU?

This "10 years for dGPUs to stop existing" has been stated for what seems like well over 10+ years now.

So I guess we will keep seeing this "10 years for dGPUs to stop existing" year after year never ending.
 
There's actually a cycle of reincarnation in silicon design: peripheral hardware gets subsumed into the primary chip, then new peripherals are created for new niches, which may in turn be subsumed by the main chip, and so on.

There are factors that get weighed against each other: how accommodating is the main chip to integrating the peripheral, and how demanding is the peripheral?
The demands can range from raw density to things like external bandwidth, power draw, design cycles, or sometimes physical quirks that keep things like antennae (edit: or, more correctly, the analog components attached to them) from going on-die.

Over time, SoC density has increased, while for a wide range of devices the utility or physical usability of more CPU cores has dropped.
Meanwhile, other silicon like the GPU has reached enough general acceptance that it will likely enhance the value of any SoC it is put into.

Memory bandwidth demands and new interfaces are likely to align GPU and CPU memory directions, so one of the big differences is likely to go away.
Separate power budgets are one thing, but that is counterbalanced by things like the need for tighter integration to eliminate overheads, and by the increase in programmability. Besides, there are higher-end CPU solutions whose power draw isn't significantly different from all but the highest-end GPUs.

To keep GPUs separate in the long term, there needs to be a strong enough reason for them to stay separate for a large enough part of the market to justify the increasing expense of designing and producing any silicon product. Right now, though, it looks more like we're closing in on the point where they do better more closely integrated, and we haven't found an evolutionary niche that justifies keeping them separate.
 
The main obstacle I see to widespread integration in the short term is that no one company holds all the right cards. Intel has a fine CPU architecture, but AMD and nVidia are still miles ahead in GPU land, especially in terms of driver and software support. On the other hand, nVidia has absolutely no presence in the desktop CPU market.

If I look at recent trends and my own upgrade history, the demand for more graphics processing power far outstrips that for conventional CPU workloads. You can keep up with the latest games by upgrading only your discrete graphics card. A good quad-core CPU can easily last 5+ years.

I would be happy to upgrade an APU every couple of years instead of a GPU, assuming there's no additional cost versus a discrete GPU of comparable performance. However, I don't think any company is currently positioned to meet that standard.
 
A good quad-core CPU can easily last 5+ years.

The PS3 and XB360 were not using x86-64, nor were they designed to spread the main workload over 6-8 equally powered cores.
The PS4 and XB1 are. This means that new games will naturally be optimized for that kind of parallelism.
Intel's advantage is not really in multithreading, but rather in its huge IPC compared to the weak AMD cores.
That advantage will slowly become less important over time. An 8-core CPU will likely age better.
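
As a rough illustration of why (my numbers, purely made up, nobody's benchmarks): Amdahl's law says a quad with, say, ~30% stronger cores wins as long as a good chunk of the frame is serial, and the 8 weaker cores only pull ahead once most of the work can actually be spread across them.

[code]
# Amdahl's-law sketch with made-up numbers: 4 strong cores vs 8 weak cores.
def throughput(parallel_fraction, cores, per_core_speed):
    """Relative throughput: per-core speed times Amdahl's-law speedup."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return per_core_speed * speedup

for p in (0.3, 0.6, 0.8, 0.95):
    quad = throughput(p, 4, 1.3)  # 4 cores, ~30% higher per-core speed (assumed)
    octo = throughput(p, 8, 1.0)  # 8 weaker cores (baseline per-core speed)
    print(f"parallel fraction {p:.2f}: quad {quad:.2f} vs octo {octo:.2f}")
[/code]

With these assumptions the 8-core only overtakes the quad somewhere around an ~80% parallel fraction, which is basically the "games get better threaded over time" argument in numbers.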
 
A good quad-core CPU can easily last 5+ years.

Although to be fair, part of the problem there is that the CPU hasn't had much in the way of improvement in the past half decade. I'd love to upgrade my CPU to something 4x faster, but in the last 4-1/2 years, there hasn't quite been that level of improvement. Instead, we've seen investment in APUs -- great for laptops, pointless for forcing desktop upgrade cycles imho, ymmv.
 
Intel's advantage is not really in multithreading, but rather in its huge IPC compared to the weak AMD cores.
That advantage will slowly become less important over time. An 8-core CPU will likely age better.

The comparison is not really valid, since if Intel sold chips at AMD's margins we would get 6-10 core Xeons for the price of an FX-8350, in a similar power envelope. Also, single-threaded performance will continue to matter for years to come.

Although to be fair, part of the problem there is that the CPU hasn't had much in the way of improvement in the past half decade. I'd love to upgrade my CPU to something 4x faster, but in the last 4-1/2 years, there hasn't quite been that level of improvement. Instead, we've seen investment in APUs -- great for laptops, pointless for forcing desktop upgrade cycles imho, ymmv.

CPUs are a lot more powerful than they were 5 years ago, but there isn't much end-user software (games) that can take advantage of 8-20 threads. There's also no competition, so Intel can drive margins instead while still leading considerably in performance.
 
Hasn't this prediction been made before, when the first GPU was integrated into a chipset?

And again when the GPU was integrated into the CPU?

No, not really. Not for either case, AFAIK.
Unless you're willing to show sources for such claims, I'd say they're wrong.

APUs and heterogeneous computing stand in a very different position from where they were 10 years ago.
I have no idea how someone could have assumed dGPUs would be dead by now back when the Radeon 9700 Pro was released. I can't really make any sense of it.
 
The PS3 and XB360 were not using x86-64, nor were they designed to spread the main workload over 6-8 equally powered cores.
The PS4 and XB1 are. This means that new games will naturally be optimized for that kind of parallelism.

Yep hopefully. Though such optimizations will benefit quad cores greatly as well. I don't think I've ever seen all 4 of my cores fully or equally loaded in any title.

Intel's advantage is not really in multithreading, but rather in its huge IPC compared to the weak AMD cores.
That advantage will slowly become less important over time. An 8-core CPU will likely age better.


Maybe. It won't really change the general trajectory of GPU vs CPU demand though. That's primarily a software issue. Progress on that front would make it easier to configure high-end APU SKUs that make sense.
 
No, not really. Not for either case, AFAIK.
Unless you're willing to show sources for such claims, I'd say they're wrong.

You can go tell Jon Peddie that he is wrong then.

The death of discrete GPUs by integrated graphics has been predicted every year for the past ten years, and every year for the past ten years the reports of death have been highly exaggerated, even this year. If you look over time at the ratio of discrete GPUs to integrated GPUs (including the embedded GPUs in CPUs like Fusion and Sandy Bridge), you can see an obvious trend.

http://jonpeddie.com/back-pages/comments/discretes-are-deadlong-live-discretes

We may not see that this year but our forecast indicates a good future, and a long future for Discretes—long live the Discrete.
 
Yep hopefully. Though such optimizations will benefit quad cores greatly as well. I don't think I've ever seen all 4 of my cores fully or equally loaded in any title.

I have. MMORPGs. Especially when you get a lot of characters on screen at once. Systems can get CPU starved quite easily. I've got an i5 2500k at 4.4 GHz, and it can easily get CPU starved (100% CPU usage, which then leads to slowed gameplay and rendering) in some MMOs.

You won't likely see that situation in single-player games, where developers can easily control how many players/NPCs are on screen at any given time. But in an MMO, once you start to get a few hundred players in the same area, all with physics animations, pathing, movement prediction, etc., the CPU gets greatly stressed.

Regards,
SB
 
You can go tell Jon Peddie that he is wrong then.

It's true, that statement from Jon Peddie makes no sense. Where on earth did people in 2001 start claiming the dGPU was going to die ten years later? That was the rise of the discrete GPU, with the first Radeon and GeForce chips introducing hardware T&L.

Who the hell was saying back then that dGPUs were about to go extinct, when they were just starting to show real promise?

This is a completely different era. We're looking at the years of high-performance SoCs, heterogeneous computing, tight CPU/GPU latencies and high-bandwidth memory with 2.5D stacking on MCMs or over SoCs. Slowly ending the dGPU within 10 years (we're talking 2024) makes sense from where we're standing.
It would make zero sense to claim that 13 years ago. Jon Peddie just decided to open that article with a punchline that happened to be a bullshit claim.


I have. MMORPGs. Especially when you get a lot of characters on screen at once. Systems can get CPU starved quite easily. I've got an i5 2500k at 4.4 GHz, and it can easily get CPU starved (100% CPU usage, which then leads to slowed gameplay and rendering) in some MMOs.

Draw-call limitation? Sounds like a case to be solved by lower overhead drivers :)
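
Back of the envelope (hypothetical per-call costs, not measurements of any particular driver): a few hundred characters at a handful of draw calls each, with tens of microseconds of driver overhead per call, blows straight past a 60 fps frame budget on the CPU side, which is exactly what lower-overhead drivers and batching/instancing attack.

[code]
# Sketch of why a crowded MMO scene becomes draw-call/CPU bound.
# Per-call overheads are hypothetical, just to show the scaling.
def cpu_ms_per_frame(characters, calls_per_character, us_per_call):
    """Estimated CPU time per frame spent just submitting draw calls (ms)."""
    return characters * calls_per_character * us_per_call / 1000.0

print(cpu_ms_per_frame(300, 5, 30))  # 45.0 ms -> way over a 16.7 ms (60 fps) budget
print(cpu_ms_per_frame(300, 5, 5))   # 7.5 ms  -> a lower-overhead path fits again
[/code]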
 
It's true, that statement from Jon Peddie makes no sense. Where on earth did people in 2001 start claiming the dGPU was going to die ten years later? That was the rise of the discrete GPU, with the first Radeon and GeForce chips introducing hardware T&L.

Who the hell was saying back then that dGPUs were about to go extinct, when they were just starting to show real promise?

This is a completely different era. We're looking at the years of high-performance SoCs, heterogeneous computing, tight CPU/GPU latencies and high-bandwidth memory with 2.5D stacking on MCMs or over SoCs. Slowly ending the dGPU within 10 years (we're talking 2024) makes sense from where we're standing.
It would make zero sense to claim that 13 years ago. Jon Peddie just decided to open that article with a punchline that happened to be a bullshit claim.




Draw-call limitation? Sounds like a case to be solved by lower overhead drivers :)

I thought Tim Sweeney (amongst others?) said dGPUs would be gone in ten years, around 2001-ish IIRC? As GPUs were getting more programmable, the thought was that everything would return to software-based rendering and run on 10 GHz Pentium 4s.
 
It would make zero sense to claim that 13 years ago. Jon Peddie just decided to open that article with a punchline that happened to be a bullshit claim.

Why don't you contact him and ask him about his "bullshit claim" (your words) then.

http://www.jonpeddie.com/about

Dr. Jon Peddie is one of the pioneers of the graphics industry, and formed Jon Peddie Research (JPR) to provide customer-intimate consulting and market forecasting services. Peddie lectures at numerous conferences on topics pertaining to graphics technology and emerging trends in digital media technology. Recently named one of the most influential analysts, he regularly advises investors in the GLG network, is frequently quoted in trade and business publications, was the former president of the Siggraph Pioneers, and is the author of several books, his most recent being The History of Visual Magic in Computers.
 
The main reason discrete GPUs are still alive is bandwidth. When you're limited to a 128-bit bus of DDR3, you're just not going to get much out of any integrated graphics, even if you throw a lot of computing resources at it.

Stacked memory will change that, and only the biggest dedicated GPUs will still make sense.
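
For a rough sense of the gap (peak theoretical numbers for typical 2014-era parts, assuming DDR3-1600 on the APU side and 7 Gbps GDDR5 on the dGPU side):

[code]
# Peak memory bandwidth: bus width (bits) / 8 * effective data rate (GT/s) = GB/s
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_gts):
    """Peak theoretical bandwidth in GB/s."""
    return bus_width_bits / 8 * transfer_rate_gts

print(peak_bandwidth_gbs(128, 1.6))  # dual-channel DDR3-1600: ~25.6 GB/s
print(peak_bandwidth_gbs(256, 7.0))  # 256-bit GDDR5 @ 7 GT/s:  ~224 GB/s
[/code]

Close that roughly order-of-magnitude gap with stacked memory and the bandwidth argument for most discrete cards disappears.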
 
Discrete Graphics: The Sky is Not Falling

No, not really. Not for either case, AFAIK.
Unless you're willing to show sources for such claims, I'd say they're wrong.

Another article/link

Ever since the Intel Ivy Bridge launched, people are coming out of the woodwork to declare that the discrete graphics card market is dead, which of course would be negative for companies like NVIDIA and AMD. This is a case of deja vu as I have heard this nearly every year for the last 5 years. Intel Ivy Bridge graphics have made impressive gains in performance, but even Intel isn’t making claims about what this will do to the discrete graphics market. So is this some big conspiracy from people who don’t like discrete graphics? Absolutely not. It is easy to fall into the trap of misunderstanding. And you really need to dig into the details to get a real understanding of what is going on to predict what will happen in the future of discrete graphics.

http://www.forbes.com/sites/patrickmoorhead/2012/05/10/discrete-graphics-the-sky-is-not-falling/
 