Is it possible to realize videogames completely hardware accelerated?

Frontino

Newcomer
I'm dreaming about a gaming PC that doesn't need a CPU or system RAM, able to run any application completely in hardware on the GPU(s) (which would become a General Purpose Unit, de facto).
Since the introduction of the first graphics cards, many rendering processes have been moved from the CPU to the video card.
And now physics is also making its way off the CPU.
What's next?
Will A.I. also be hardware accelerated?
Will CUDA, Stream and OpenCL make CPUs optional?
I hope so.
If I can choose to buy just the GPU, instead of CPU+RAM+GPU, I'm happy.
 
Once that happens, your GPU will simply become a CPU. Except that it's less customisable, since you can't add or remove RAM or shift more power to graphics/CPU (by upgrading components) as your needs change.

You can already get such things. Simply buy a pre-built computer with set components and you'll get the lack of customisability that you desire.
 
You can't buy only a GPU even now; you buy a video card. And RAM will never be optional, even if CPUs and GPUs merge down the line.
 
Gosh!
It's pretty clear that none of you guys understood my topic.
I'm talking 'bout running an OS on the GPU solely.
And so for multimedia applications, games, visual effects renderer for cinema, everything that CUDA, Stream and OpenCL make possible to do.
 
No, I don't think you understood their responses.

Memory will never go away. Ever. A "GPU" is simply the processor chip on a video card; you can't run software on a processor alone -- you need memory. And you also need an interface to storage media, like hard drives -- which means you need some underlying system connecting that processor and memory to the storage.

Which means you've got a processor (GPU in your example), memory, and a "board" of some sort to connect it all together.

So no, your original question is bunk -- you will never run an entire system on a GPU by itself.
 
Something along similar lines: the Killer network card runs its own OS, a version of Linux. I'm not sure how fully featured it is, but you can run applications on it.
 
Come to think of it, these days (i.e. the era of the programmable graphics pipeline) even GPUs render a lot of stages in software, so "hardware acceleration" is for the most part a misnomer. I remember reading Intel's Larrabee paper: they had profiled their software renderer and found >50% of the load in shading, which is done in software (albeit on very fast hardware).
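To make the "shading is software" point concrete, here is a minimal sketch (plain Python, purely illustrative, not from any actual driver or shader) of the per-fragment Lambert diffuse term a pixel shader typically evaluates -- nothing about it is fixed-function:

```python
# Illustrative sketch: per-pixel Lambert diffuse shading done "in software".
# A GPU fragment shader runs essentially this program, once per fragment.

def normalize(v):
    length = sum(c * c for c in v) ** 0.5
    return tuple(c / length for c in v)

def lambert(normal, light_dir, albedo):
    # Diffuse term: albedo * max(0, N . L)
    n = normalize(normal)
    l = normalize(light_dir)
    ndotl = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(c * ndotl for c in albedo)

# Shade a tiny "image" whose normals all face the light directly.
light = (0.0, 0.0, 1.0)
pixels = [lambert((0.0, 0.0, 1.0), light, (0.8, 0.2, 0.2)) for _ in range(4)]
print(pixels[0])  # full diffuse intensity, since N . L == 1
```

The GPU runs this kind of program across thousands of fragments in parallel, but each invocation is ordinary software.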
 
Yes, Intel's Larrabee is basically a many-core x86 CPU which can 'feed itself' instructions; it doesn't need a host to function. Whether they'll make socketable versions, or actual motherboards carrying it without a regular CPU... who knows.

Not sure why you'd want to substitute a powerful out-of-order processor with a bunch of shaders on a chip, though. The CPU is good at what it does and the GPU is good at what it does, and they do have some overlap, but there's a reason the Cell still comes with a little PPC core on top of the pie.
 
The CPU is good at what it does and the GPU is good at what it does, and they do have some overlap, but there's a reason the Cell still comes with a little PPC core on top of the pie.

Absolutely, shaders are great, fast and everything. But they only cover a subset of the entire spectrum of computing: virtual memory, context switching, protected memory, single-threaded IPC, system calls and so on are all missing. And those requirements are not going to go away.
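A toy analogy for why the host stays in the picture (plain Python; all names below are invented for illustration): the "device" side runs only pure, data-parallel math per element, while anything touching OS services -- file I/O, scheduling, system calls -- has to live on the "host" side:

```python
# Toy analogy (all names invented): a "device" runs only pure, data-parallel
# kernels; the "host" owns I/O, memory management, and kernel launches.

def device_kernel(x):
    # Shader-like work: pure math, no syscalls, no I/O,
    # the same program applied independently to each element.
    return x * 2.0 + 1.0

def host_program(path):
    # Host-side work: touching the OS (file I/O here) is something a shader
    # core has no machinery for -- no syscall path, no kernel to call into.
    with open(path) as f:
        data = [float(line) for line in f]
    # "Launch" the kernel over the whole buffer.
    return [device_kernel(x) for x in data]
```

This is the same shape as a real CUDA or OpenCL program: the host allocates, reads and schedules; the device only transforms buffers.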
 
This is a silly thread, and the notion of "hardware accelerated" is an archaic legacy of fixed-function elements. Anything that runs on a processor is "hardware accelerated", and adding more hardware (e.g. GPUs, audio processors, physics processors, etc.) doesn't really accelerate anything (assuming you're not fully decoupled from the time domain); you're just redistributing the workload. The closest you could come to the idea would be to write an entire game in a hardware description language and load it onto an FPGA...
 
Come to think of it, these days (i.e. the era of the programmable graphics pipeline) even GPUs render a lot of stages in software, so "hardware acceleration" is for the most part a misnomer.

Yeah, I've always found that puzzling. "Hardware accelerated" usually refers to fixed-function hardware doing something a CPU would take longer to emulate. But now that GPUs are becoming more programmable and are executing user-defined programs, things are actually less hardware accelerated. Some of the remaining fixed-function bits like rasterization, texturing and blending should eventually go the way of the dodo as well.
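For what it's worth, the rasterization stage is straightforward to express in software. A minimal half-space (edge-function) triangle fill, sketched in Python purely for illustration:

```python
# Minimal software rasterizer sketch: fill a triangle using edge functions.
# This is the kind of fixed-function stage that could move into software.

def edge(ax, ay, bx, by, px, py):
    # Signed area test: >= 0 means p is on the left of edge a->b (CCW winding).
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize(w, h, tri):
    (x0, y0), (x1, y1), (x2, y2) = tri
    covered = []
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5  # sample at pixel centers
            inside = (edge(x0, y0, x1, y1, px, py) >= 0 and
                      edge(x1, y1, x2, y2, px, py) >= 0 and
                      edge(x2, y2, x0, y0, px, py) >= 0)
            if inside:
                covered.append((x, y))
    return covered

print(rasterize(4, 4, ((0, 0), (4, 0), (0, 4))))
```

Real hardware rasterizers do the same coverage test hierarchically and many pixels at a time, which is exactly why dedicated silicon still wins on cost per pixel.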
 
The closest you could come to the idea would be to write an entire game in a hardware description language and load it onto an FPGA...
...and that path leads to insanity.

"Hardware accelerated" usually refers to fixed-function hardware doing something a CPU would take longer to emulate. But now that GPUs are becoming more programmable and are executing user-defined programs, things are actually less hardware accelerated. Some of the remaining fixed-function bits like rasterization, texturing and blending should eventually go the way of the dodo as well.
Only if you are willing to pay a lot more for your hardware. If you have common fixed tasks, then it is (typically) more economical to use dedicated hardware.
 
Only if you are willing to pay a lot more for your hardware. If you have common fixed tasks, then it is (typically) more economical to use dedicated hardware.

Sure, that's the case when those tasks are fixed and the performance cost of flexibility is too high. But eventually those factors will go away. Isn't Larrabee going to have only fixed-function texturing? That's already a big step in that direction.
 
Um, aren't they "hardware accelerated" already? Last time I checked, the CPU was still in the "hardware" category... And even if we say physics isn't hardware accelerated -- well, it is. Through the CPU...
 
"Hardware accelerated" usually refers to dedicated hardware. Of course, you need something that exists in the physical world to do the work, so everything is hardware accelerated if you use that definition :)
 
Well, some of these terms aren't exactly well defined or consistently used.
Take audio hardware acceleration, for example. We say it's hardware accelerated when the mixing isn't done by the CPU. However, the audio decoding itself is still done by the CPU; the decoded streams are sent to the sound card and mixed there.
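Concretely, the part that moves off the CPU in that setup is just a sum-and-clamp over the decoded sample buffers. A toy Python sketch (assuming signed 16-bit PCM frames that have already been decoded on the CPU):

```python
# Toy sketch of audio mixing: sum decoded PCM streams, clamp to 16-bit range.
# "Hardware-accelerated audio" usually means only this mixing step leaves the
# CPU; decoding (MP3, Vorbis, ...) still runs on the CPU, and the resulting
# buffers are sent to the sound card.

def mix(streams):
    mixed = []
    for samples in zip(*streams):  # one frame across all streams at a time
        s = sum(samples)
        # Clamp to the signed 16-bit range to avoid wrap-around distortion.
        mixed.append(max(-32768, min(32767, s)))
    return mixed

a = [1000, -2000, 30000]
b = [500, -500, 10000]
print(mix([a, b]))  # last frame clamps at 32767
```

So the "accelerated" portion is a modest amount of arithmetic; the expensive decode stays on the CPU, which is why the term is used so loosely.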
 
Gosh!
It's pretty clear that none of you guys understood my topic.
I'm talking 'bout running an OS on the GPU solely.
And so for multimedia applications, games, visual effects renderer for cinema, everything that CUDA, Stream and OpenCL make possible to do.
As was alluded to before, it technically wouldn't be much of a GPU anymore, since it wouldn't be used as a specialized processor for just graphics. It would effectively be a CPU for the kind of applications it's running -- almost a return to the days when everything ran in software on a general-purpose processor. Even though it's theoretically the reverse, technically it's the same thing.
 