A dream I had (partial fixed function 'GPUs')

This thread is about the merits of adding fixed functions in the GPU, which is almost the opposite of claiming that a GPU is getting more like a CPU.


The standard pattern always goes like this: new kind of technology -> dedicated HW -> CPU/GPU becomes faster -> dedicated HW goes away and becomes programmable.

Remember the days when CPUs had a hard time keeping up with ordinary DVDs? Now they eat them for lunch.

UVD and PureVideo were not added to play low-bitrate MPEG-2 clips but to crunch through HD DVD/Blu-ray. Give it another 2 or 3 years and the need for dedicated HW will once again be greatly diminished.

CPUs are adding FF hardware, so are GPUs (while also becoming more programmable). Where's the confusion? Trending towards overall greater programmability doesn't preclude adding FF hardware.
 
Ail, I'm talking like 20 or 30 years from now. Out of my arse of course, since I don't own a crystal ball :) But I mean stuff like 3D gfx, audio, video, blah chips will all disappear, since it'll all be integrated or emulated somehow. Basically a "gamer" in that future will just buy a small device that uses very little energy, needs no expansions to do its job, and has computing power in abundance.

Any attempt to project anything concerning graphics two or three decades out brings a knee-slapping reaction at best. I doubt even a professional can safely predict anything beyond half a decade, and even up to that it's pretty risky.

Look at Simon's comment above; I wouldn't be one bit surprised if a couple of years down the road hardware makes another cycle like that. For the time being we see transistor budgets at least doubling each generation. Emulation is IMHO out of the question; integration yes, but most likely for less demanding stuff.

At the pace graphics demands are rising, I don't see any sort of universal chip being capable of replacing high-end dedicated graphics hardware.
 
There will always be a need for fixed function parts. For instance I don't see dedicated depth testing hardware going out of fashion anytime soon. Texture units will probably remain fixed function for generations to come. I don't see programmable rasterization on the horizon either.

Fixed-function depth and stencil testing (with hierarchical Z, and hierarchical stencil in the future), triangle setup and texture sampling will stay. However, I could see programmable triangle interpolators, as perspective-corrected linear interpolation is not well suited for all values.
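A toy sketch (Python, everything illustrative, no real hardware or API implied) of why one fixed interpolation mode doesn't suit every attribute: screen-space linear interpolation and the perspective-corrected form disagree whenever the endpoints' w values differ.

```python
# Sketch: compare plain screen-space linear interpolation with the
# perspective-corrected form, which interpolates a/w and 1/w linearly
# and then divides. Function names are made up for illustration.

def linear_interp(a0, a1, t):
    """Plain screen-space linear interpolation of an attribute."""
    return (1 - t) * a0 + t * a1

def perspective_interp(a0, w0, a1, w1, t):
    """Perspective-corrected interpolation across an edge."""
    num = (1 - t) * (a0 / w0) + t * (a1 / w1)
    den = (1 - t) * (1 / w0) + t * (1 / w1)
    return num / den

# Midpoint of an edge whose endpoints sit at very different depths:
a_lin = linear_interp(0.0, 1.0, 0.5)                  # 0.5
a_per = perspective_interp(0.0, 1.0, 1.0, 10.0, 0.5)  # ~0.091, pulled toward the near vertex
```

Texture coordinates need the corrected form; screen-space depth does not, which is exactly why a single hardwired mode pinches somewhere.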

The feature I am currently most waiting for is the ability to get the original pixel color as an input to the pixel shader. It would allow many new algorithms that are impossible to implement efficiently without it (like robust volumetric lights with shadows), without expensive ping-pong rendering between 2 buffers. And of course it would also allow the pixel shader to do any kind of programmable blending. If this was the only new feature in DX11, I wouldn't complain :)
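As a rough illustration of what that would enable (a hypothetical model in Python, not any real shading API): if the shader receives the destination color, any blend function becomes expressible, not just the fixed blend equations.

```python
# Hypothetical model of "destination color as pixel shader input".
# The 'shader' reads the current framebuffer value and applies an
# arbitrary blend function. All names here are illustrative.

def shade_with_dst(framebuffer, x, y, src_color, blend_fn):
    """Run a pixel 'shader' that reads the destination color."""
    dst = framebuffer[y][x]
    framebuffer[y][x] = blend_fn(src_color, dst)

def custom_blend(src, dst):
    """A blend no fixed-function unit offers: channel-wise max,
    scaled by source alpha (purely illustrative)."""
    r, g, b, a = src
    return (max(r, dst[0]) * a, max(g, dst[1]) * a, max(b, dst[2]) * a, 1.0)

fb = [[(0.2, 0.8, 0.1, 1.0)]]  # a 1x1 framebuffer
shade_with_dst(fb, 0, 0, (0.5, 0.5, 0.5, 0.5), custom_blend)
# fb[0][0] is now (0.25, 0.4, 0.25, 1.0)
```

With fixed-function blending, anything beyond the built-in equations forces the ping-pong-between-two-buffers pattern the post describes.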
 
UVD and PureVideo were not added to play low-bitrate MPEG-2 clips but to crunch through HD DVD/Blu-ray. Give it another 2 or 3 years and the need for dedicated HW will once again be greatly diminished.
Sure, but there are many reasons why, this time around, the dedicated HW for video is likely to stick:
- Perf/watt is more important than ever before.
- The RTL is already done and reusable in future generations.
- Extending the same logic to higher bitrates via higher clocks isn't a problem.

Furthermore, CPUs risk becoming a commodity in the low-end, with their performance no longer being a selling point. If this does happen in the next few years, then CPU-only 20Mbps+ H.264 might remain a difficult workload for a substantial part of the market.

sebbi: Don't forget that one of the key reasons for making things 'programmable' is actually to get rid of potential bottlenecks. That's why it makes a lot of sense to do triangle setup in software, and even stencil. As for texture filtering, my guess is you'll want a 'fast path' for INT8 and maybe some slightly more expensive formats, but beyond that it might make sense to use the ALUs directly.
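For a sense of what 'using the ALUs directly' for filtering means, here's a sketch (Python, names illustrative) of bilinear filtering built in shader-style arithmetic from four point-sampled fetches:

```python
# Sketch: manual bilinear filtering from four point samples plus lerps,
# as a shader could do for formats the fixed-function filter has no
# fast path for. Function names and conventions are assumptions.

def fetch(texture, x, y):
    """Point-sample with clamp-to-edge addressing."""
    h, w = len(texture), len(texture[0])
    return texture[max(0, min(h - 1, y))][max(0, min(w - 1, x))]

def bilinear(texture, u, v):
    """u, v in texel space (the half-texel offset already applied)."""
    x0, y0 = int(u), int(v)
    fx, fy = u - x0, v - y0
    t00 = fetch(texture, x0,     y0)
    t10 = fetch(texture, x0 + 1, y0)
    t01 = fetch(texture, x0,     y0 + 1)
    t11 = fetch(texture, x0 + 1, y0 + 1)
    top = t00 * (1 - fx) + t10 * fx
    bot = t01 * (1 - fx) + t11 * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]
print(bilinear(tex, 0.5, 0.5))  # 0.5: an equal mix of all four texels
```

Four fetches and three lerps per channel is cheap enough for wide formats, which is the trade-off behind keeping only a narrow fixed-function fast path.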

Depth testing is a slightly more difficult question; my guess is it'll remain a performance advantage to keep it fixed-function for at least a few more years. Beyond that, who the hell knows!
 
A programmable depth test doesn't make much sense because it would break multisampling, unless we devote a new programmable stage to it (perhaps in conjunction with the other per-fragment tests). But then we might lose the ability to properly early-reject stuff.
 
That depends on how it's exposed. Imagine if you could have two programs: the basic depth test (D1, D2 -> Pass / Fail) and the min-max depth test (D, DMin, DMax -> Pass / Fail / Might Pass).
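A minimal sketch of that two-program idea (Python; the names and the LESS comparison are assumptions, not a real API). The exact test decides a single sample; the min-max test is conservative over a tile's depth range, so it may answer "might pass" and fall through to the exact test.

```python
# Sketch of the two depth-test programs described above.
PASS, FAIL, MIGHT_PASS = "pass", "fail", "might pass"

def depth_test(d_frag, d_stored):
    """Exact per-sample test (LESS comparison assumed)."""
    return PASS if d_frag < d_stored else FAIL

def minmax_depth_test(d_frag, d_min, d_max):
    """Conservative tile-level test against the tile's [min, max] depth range."""
    if d_frag < d_min:
        return PASS        # in front of everything in the tile
    if d_frag >= d_max:
        return FAIL        # behind everything in the tile
    return MIGHT_PASS      # ambiguous: run the exact per-sample test
```

The conservative answer is what lets the tile-level test coexist with early rejection: a definite Pass or Fail skips per-sample work, and only the ambiguous middle band pays full cost.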

If done properly, this shouldn't break any of the current GPU optimizations... Heck, it wouldn't even break CSAA! Whether it is desirable or not is another question entirely, of course. It's also easier said than done to use the shader core to run small 'jobs' like that efficiently!
 
It's not so easy; current Hi-Z implementations do more complex stuff than just a min-max test. There are different tests performed at different rates that can trigger not-so-trivial actions behind the scenes (including non-deterministic behaviours!).
I'm not saying it's not doable, but it's not as straightforward as it may seem.
 
Holy shit non-deterministic :| Out of curiosity, is that to save die space (i.e. it's not more efficient than the alternatives, just cheaper to implement) or does it actually result in better performance than a 'simple' min-max test?
 
Holy shit non-deterministic :| Out of curiosity, is that to save die space (i.e. it's not more efficient than the alternatives, just cheaper to implement) or does it actually result in better performance than a 'simple' min-max test?
It's to avoid stalling some stage of the GPU in certain cases.. (and it doesn't improve performance..)
 
Glad my post wasn't completely without merit and some decent discussion came of it.

What are SSRIs and why don't you dream of large boobies like the rest of us?
I dreamt of a gpu/dsp hybrid, maybe I'm nuts too :D

They're anti-depressants and I won't go into the booby dreams...
 