Really off-the-wall question about video chips.

SithSolo1

Newcomer
My friend wants to know why they don't just make video processors like CPUs, i.e. you buy a "video chip" and plug it into a socket on your motherboard just like you would a P4 or A64. I told him I didn't know if it was possible, and that even if it was, it would cost too much money for both the IHVs and the consumers. I just don't know enough to explain it to him. Can anyone here help? :?: :oops: :?
 
Well, a modern graphics card is a bit like a mini-computer attached to the rest of the computer. It is basically a motherboard (the card itself) with a CPU (the GPU chip) and memory modules (the, well, memory modules). So it's already as compact as it can be, and you do plug it into the main motherboard :)
 
Basically, the problem is memory. With a socket on the mobo you'd pretty much be confined to system memory, and even the fastest dual-channel memory setups available in PCs today do not compare with the dedicated video memory on a graphics card. Also, you probably don't want to permanently dedicate 128MB or so of your 256-512MB of RAM just to graphics. ;)
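To put rough numbers on that, here's a back-of-envelope comparison in Python (figures assumed: dual-channel DDR400 vs. a 256-bit GDDR3 card in the 6800 Ultra class; treat them as illustrative, not exact):

Code:
# Peak bandwidth = transfer rate (MT/s) * bus width in bytes.
def peak_gb_s(mt_per_s, bus_bits):
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

system = peak_gb_s(400, 128)   # dual-channel DDR400: 2 x 64-bit channels
video  = peak_gb_s(1100, 256)  # 550 MHz GDDR3 (1100 MT/s), 256-bit bus
print(f"system: {system:.1f} GB/s, video: {video:.1f} GB/s")
# -> system: 6.4 GB/s, video: 35.2 GB/s

That's more than a 5x gap before you even count the latency and sharing penalties.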
 
Plus bandwidth: you don't want to be sharing bandwidth with the CPU over a common bus.

I'm guessing any other solutions would probably drop the price of a graphics card by $100 and raise the price of a motherboard by $100 (random numbers picked out of my head, just to illustrate a point ;)).
 
Yeah, it's probably mainly a question of memory bandwidth, but I'm sure MB size/complexity and now power consumption are also factors.

I'm curious how feasible a 512-bit (quad channel) memory controller on a MB would be, though. I'd imagine providing for four RAM slots that could operate as quad-channel when all four are filled would be an intriguing solution for MBs with both a CPU and a GPU socket.
 
The other big problem is you have to pay for 'infrastructure' that can support an X800/6800, even if you only want to plug an MX440 into it.
Pete said:
I'm curious how feasible a 512-bit (quad channel) memory controller on a MB would be, though. I'd imagine providing for four RAM slots that could operate as quad-channel when all four are filled would be an intriguing solution for MBs with both a CPU and a GPU socket.
I'm pretty sure quad-channel DDR (on the sticks we use today) would only be 256-bit: each DIMM is 64 bits wide, so four channels gives you 4 x 64 = 256 bits.
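The arithmetic, assuming today's 64-bit DIMMs:

Code:
# Each DDR DIMM exposes a 64-bit data path, so bus width = channels * 64.
for channels in (2, 4, 8):
    print(f"{channels} channels -> {channels * 64}-bit")
# -> 2 channels -> 128-bit, 4 channels -> 256-bit, 8 channels -> 512-bit

So a true 512-bit controller would need eight populated slots, not four.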
 
Pin counts for high-end GPUs are extremely high; AFAIK, about 1500-1700 for 256-bit GPUs. That would increase the number of needed board layers to something like 10-12, which is generally considered unacceptable for a motherboard. Also, using a socket for the GPU and slots for its GDDR1/2/3 memory may adversely affect the clock speed the memory can run at (longer trace lengths + multiple contact capacitance loads => reduced signal integrity => lower clock speeds). Finally, making such a motherboard forward compatible with ANY future GPU sounds like a hopeless task, considering that each new GPU will come with its own demands for power supply, signal integrity, memory types and voltages, and people are not too keen on replacing their motherboards for every GPU upgrade.
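A very rough pin-budget sketch of why the count balloons (every figure below is a guess for illustration, not vendor data):

Code:
data          = 256                  # 256-bit memory interface: data pins
addr_cmd_ctrl = 150                  # strobes, address, command, control (guess)
host_misc     = 200                  # host bus, display, straps, test pins (guess)
signal        = data + addr_cmd_ctrl + host_misc
power_ground  = int(signal * 1.5)    # high-current chips need many P/G pins (guess)
print(signal + power_ground)         # -> ~1500, the ballpark quoted above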
 
Integrated graphics chips do pretty much what you're describing. They live on the motherboard although generally don't sit in a pluggable socket.
 
Well, the other big one: heat. Going to slap a 200 million transistor chip on your motherboard? Good luck fitting a cooler.
 
chrisATI said:
Integrated graphics chips do pretty much what you're describing. They live on the motherboard although generally don't sit in a pluggable socket.
Not all of them. Most motherboards sold with integrated graphics have the graphics integrated into the chipset itself, so there is no extra chip for graphics. This includes nVidia's nForce and Intel's integrated graphics (I think ATI's chipset also falls into this category, but I haven't seen much info on it).
 
The Baron said:
Well, the other big one: heat. Going to slap a 200 million transistor chip on your motherboard? Good luck fitting a cooler.

Uhm, the GPU is already in the same case, producing the same heat into the same air..
 
arjan de lumens said:
Cooling a hypothetical socketed GPU shouldn't be harder than cooling a modern CPU; AFAIK, Prescott draws more power than even the NV40.
Except now you'd be cooling two of 'em. This could get challenging if they're located close to one another on the motherboard.
 
arjan de lumens said:
Cooling a hypothetical socketed GPU shouldn't be harder than cooling a modern CPU; AFAIK, Prescott draws more power than even the NV40.
I don't believe this is correct. Prescott tops out at what, 110W, and isn't NV40 quite a bit higher than that?

The problem is that you'd have to have two giant heatsinks in very close proximity, and you'd have to have some insane case cooling in order to keep the hot exhaust from one from blowing into the other.
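A quick sanity check on the heat side, using the standard electronics-cooling rule of thumb (numbers illustrative):

Code:
# Airflow needed to remove W watts with an air temperature rise of dT degrees C.
# Rule of thumb: CFM ~= 1.76 * W / dT, derived from Q = m_dot * c_p * dT for air.
def cfm_needed(watts, delta_t_c):
    return 1.76 * watts / delta_t_c

print(cfm_needed(110, 10))        # one ~110 W chip: ~19 CFM
print(cfm_needed(110 + 110, 10))  # two of them in one airflow path: ~39 CFM
# ...and that assumes the second cooler isn't inhaling the first one's exhaust.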
 
The Baron said:
arjan de lumens said:
Cooling a hypothetical socketed GPU shouldn't be harder than cooling a modern CPU; AFAIK, Prescott draws more power than even the NV40.
I don't believe this is correct. Prescott tops out at what, 110W, and isn't NV40 quite a bit higher than that?

The problem is that you'd have to have two giant heatsinks in very close proximity, and you'd have to have some insane case cooling in order to keep the hot exhaust from one from blowing into the other.

Yeah, as if they aren't close already: in a tower system the GPU sits directly, physically under the CPU cooling system, which means all its heat rises straight up onto the CPU cooler itself..

ouch..

so this is definitely NOT a new issue.
 
The Baron said:
arjan de lumens said:
Cooling a hypothetical socketed GPU shouldn't be harder than cooling a modern CPU; AFAIK, Prescott draws more power than even the NV40.
I don't believe this is correct. Prescott tops out at what, 110W, and isn't NV40 quite a bit higher than that?
Keep in mind that the power draw figure for an NV40 card includes more than just the chip; the memory, fan, and voltage regulator losses are in there too.
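For illustration only, a hypothetical split of a ~110 W board figure (all numbers invented, just to show the die itself is one slice of the total):

Code:
# Hypothetical board-power breakdown for a high-end card (guesses, not measurements).
board_watts = {"GPU core": 65, "GDDR3 memory": 25, "VRM losses": 15, "fan + misc": 5}
print(sum(board_watts.values()), "W board total; the GPU die alone is well under that")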
 