Intel Planning CPU-integrated Graphics?

Jawed

http://www.dailytech.com/article.aspx?newsid=2850

Still, Bhandarkar feels that its server products will offer enough of a performance advantage that it won't need to delve into integrated memory controllers for now. In the meantime, integrated memory controllers are definitely in the pipeline, as is an integrated graphics controller, but no timeline was given for either.
Does that seem at all credible? First I've heard of it.

Jawed
 
'Graphics controller' would be a PCI Express controller rather than an integrated graphics chip, surely?
I don't see why they would waste time with an integrated low-performance 3D chip.
 
arrrse said:
'Graphics controller' would be a PCI Express controller rather than an integrated graphics chip, surely?
I don't see why they would waste time with an integrated low-performance 3D chip.


If it was standard in the processor? It would be one more plus for every Intel CPU, and one less separate chip they would need for their motherboards with integrated graphics. There are a very, very large number of users who only use their PCs for general day-to-day tasks and have absolutely zero need for a high-end or even a low-end graphics accelerator like those from ATI and Nvidia. Intel's market share > ATI's or Nvidia's by a large percentage. Intel has been pursuing this for quite a while, and it's also one reason behind the rumor that AMD wants to purchase ATI, since AMD is looking into the same idea.

They aren't looking to break performance records; they just want to move what is now integrated 3D graphics on motherboards into the processor.


get off my typo!
 
Perhaps it would be prudent to have two different lines of CPUs. But given Intel's penetration into IGP mobos, they might not save much on chip production in the long run anyway. How many transistors are in Intel's IGP chips?

On AMD's side, I sure wouldn't mind an X1300Pro-level chip integrated with the CPU (whatever the new "fast" low-end chip happens to be). I'm not too sure how that would play out for heat dissipation though.

Question: I assume they mean "integrated" as two dice on one package... Can there be different clock speeds (non-integer multipliers) if the CPU and GPU are actually one die?

SugarCoat said:
... graphics excellerator...
:LOL: sorry, that just made me think of the commercial for the gum. :LOL:
 
Not anytime soon (I would be shocked if it happens within five years). Not on desktop processors, either. Ever. I imagine this will be only for ULV chips.
 
The Baron said:
Not anytime soon (I would be shocked if it happens within five years). Not on desktop processors, either. Ever. I imagine this will be only for ULV chips.
wanna bet? (hint: Cyrix MediaGX)
 
This idea isn't as preposterous as it sounds, actually, now that I've thought about it a bit. I can certainly see it happening within 5 years - which is an enormous amount of time in this business. Remember, GeForce 3 was intro'd just a little over 5 years ago, and look what has happened since then, both feature- and performance-wise!

A GPU on the CPU die along with the memory controller could even lead to better graphics and CPU performance, since the hardware would be better able to schedule memory requests to best suit both types of tasks. At least slightly, in theory.
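
To put some flesh on that, here's a toy sketch in C (purely hypothetical on my part, not anything Intel has described) of how a shared controller might arbitrate the two kinds of traffic: latency-sensitive CPU reads jump the queue, but the GPU is guaranteed a minimum share of slots so its streaming never starves.

[code]
#include <stdio.h>

enum requester { CPU, GPU };

#define GPU_SHARE 4  /* GPU gets at least 1 grant in every 4 slots */

static int cpu_streak = 0;  /* consecutive CPU grants so far */

/* Pick the next requester to service. */
enum requester arbitrate(int cpu_waiting, int gpu_waiting)
{
    if (cpu_waiting && (!gpu_waiting || cpu_streak < GPU_SHARE - 1)) {
        cpu_streak++;          /* latency-sensitive CPU read goes first */
        return CPU;
    }
    cpu_streak = 0;            /* GPU's guaranteed slot */
    return gpu_waiting ? GPU : CPU;
}

int main(void)
{
    /* Both sides always have work queued: watch the grant pattern. */
    for (int slot = 0; slot < 8; slot++)
        printf("slot %d -> %s\n", slot,
               arbitrate(1, 1) == CPU ? "CPU" : "GPU");
    return 0;
}
[/code]

One counter and a cap is obviously simplistic - a real controller would also weigh DRAM page hits, refresh and display deadlines - but it shows why having both clients under one scheduler could help, at least slightly.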

Not saying I'd want a solution like that myself, hell no, but it does make sense from a certain perspective. It would also make Tim Sweeney right in his crazy predictions of CPUs again taking over graphics rendering...sort of! ;)
 
Why not an integer multiplier/divider? You could have GPU frequency == RAM frequency, for instance.
While it would certainly not be the greatest GPU around, it would be able to fully benefit from the bandwidth of dual-channel DDR or DDR3 (by then...), whereas integrated chipsets for AMD are limited by HyperTransport bandwidth. In the end, maybe that would make it a draw in the contest for a cheap, halfway-decent GPU.
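
Rough numbers to illustrate that (the DDR400 and HyperTransport figures below are my own example assumptions, not from the article):

[code]
#include <stdio.h>

/* Back-of-envelope comparison: an IGP behind the HT link sees the link's
 * bandwidth at best; a GPU sitting next to the CPU's own memory controller
 * would see the full DRAM bandwidth instead. */

int main(void)
{
    double ddr400_dual = 2 * 8 * 400e6 / 1e9;  /* 2 ch x 64-bit x 400 MT/s = 6.4 GB/s */
    double ht16_1ghz   = 2 * 2000e6 / 1e9;     /* 16-bit x 2000 MT/s = 4.0 GB/s, one way */

    printf("dual-channel DDR400: %.1f GB/s\n", ddr400_dual);
    printf("HT 16-bit @ 1 GHz  : %.1f GB/s per direction\n", ht16_1ghz);
    return 0;
}
[/code]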

I can imagine some wicked thing too: say, a 64-bit memory controller for the CPU, serving three RAM slots, and another one for the GPU, attached to one slot only. Put in 1GB or 2GB of main RAM, and 256MB or 512MB for the GPU as you like. (Or, same memory controller, you could say, but one channel for each.) How cool and dumb a brainfart is that?

No timeline is given at all, though, so it may not be worth speculating about insane things.
Maybe it's for the time Intel goes Cell-like (AMD will too). Vector units could be used as SM 4.0 shader ALUs if needed; Intel would provide a few TMUs and ROPs (and whatever little circuitry is needed) on chip, in case you don't use an external GPU.
And that would be well enough for Vista, Vista's successor and 3D Solitaire.
 
Intel stated its plans to integrate graphics onto its CPUs in the "Platform 2015" outline back in, what, early 2005?

http://www.intel.com/technology/magazine/computing/platform-2015-0305.htm

Intel's CMP architectures will also provide the essential special-purpose performance and adaptability that future platforms will require. In addition to general-purpose cores, Intel's chips will include specialized cores for various classes of computation, such as graphics, speech recognition algorithms and communication-protocol processing. Moreover, Intel will design processors that allow dynamic reconfiguration of the cores, interconnects and caches to meet diverse and changing requirements.

2. Special Purpose Hardware
Over time, important functions once relegated to software and specialized chips are typically absorbed into the microprocessor itself. Intel has been at the forefront of this effort, which has been the driving force behind our business model for over 35 years. By moving functions on chip, such capabilities benefit from more-efficient execution and superior economies of scale and reduce the power consumption drastically. Low latency communication between special purpose hardware and general purpose cores will be especially critical to meet future processor architecture performance and functionality expectations.

Special-purpose hardware is an important ingredient of Intel's future processor and platform architectures. Past examples include floating point math, graphics processing and network packet processing. Over the next several years, Intel processors will incorporate dedicated hardware for a wide variety of tasks. Possible candidates include: critical function blocks of radios for wireless networking; 3D graphics rendering; digital signal processing; advanced image processing; speech and handwriting recognition; advanced security, reliability and management; XML and other Internet protocol processing; data mining; and natural language processing.
 
That's interesting from a platform PoV. If every CPU has integrated graphics built in, that pretty much kills everybody else's IGPs, doesn't it?
 
First quote: "graphics" there includes your Photoshop and all other image processing.

Second quote: a general statement that the CPU will do everything; we heard that from Intel in the MMX days. Intel has always made propaganda about how the CPU is all you need, and it somewhat succeeds when you see the Mac mini and laptops pairing an expensive, fast CPU with terrible "Extreme Graphics" :)

Then there's a list of what the Cell-like SIMD units are for. "3D graphics rendering" could well be software, offline rendering ;)
 
Didn't Philips do something like this 10 years ago or so?
I'm talking about a CPU + GPU + APU multimedia processor for desktop PCs, all in one.
 
On a similar note, one of ars' reasons for AMD being interested in the merger with ATI was beefing up its co-processor capabilities along the same lines.
 
INKster said:
Didn't Philips do something like this 10 years ago or so?
I'm talking about a CPU + GPU + APU multimedia processor for desktop PCs, all in one.
The Cyrix MediaGX was exactly that: an all-in-one CPU + GPU + audio + memory controller.
I still remember that a motherboard with a 200MHz MediaGX was ~$100.
 
Can they realistically jam Vista-ready graphics capability into a CPU? Dropping an extra hundred million or so transistors into a CPU just to save a few bucks on an IGP part doesn't seem justifiable.
 
Ratchet said:
Can they realistically jam Vista-ready graphics capability into a CPU? Dropping an extra hundred million or so transistors into a CPU just to save a few bucks on an IGP part doesn't seem justifiable.

For DX9L? Half that might not be out of the question. RV370, the basis of X300, is 75M. RS482 has two pipes to X300's four... There's also been some suggestion that a USC (if Intel goes that way) saves you transistors at the low end because you have fewer units.
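
Quick back-of-envelope on those figures (the CPU transistor count below is my own assumption for scale, not from the article):

[code]
#include <stdio.h>

/* RV370 (X300-class, four pipes) is ~75M transistors per the post above,
 * so a two-pipe DX9L part might land near half of that. */

int main(void)
{
    double rv370    = 75e6;
    double two_pipe = rv370 / 2;   /* crude halving: ~37.5M */
    double cpu      = 150e6;       /* assumed dual-core CPU budget of the era */

    printf("two-pipe GPU estimate: ~%.1fM transistors\n", two_pipe / 1e6);
    printf("relative CPU growth  : ~%.0f%%\n", 100 * two_pipe / cpu);
    return 0;
}
[/code]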
 
While this might not make sense for the mainstream market right now, it makes perfect sense for the server market, in Xeon chips where dedicated graphics are not needed.
 