The End of the GPU Roadmap

Berek

http://www.ubergizmo.com/15/archives/2009/08/tim_sweeney_the_end_of_the_gpu_roadmap.html

Sweeney sees 2020 as the point when graphics become realistic enough that we simply won't need more feature-rich GPUs, just higher-performance ones. When was the last time a prediction 10+ years out turned out to be accurate?

"In the next generation we’ll write 100-percent of our rendering code in a real programming language--not DirectX, not OpenGL, but a language like C++ or CUDA," he said last year.

That said, we're already watching the desktop as a category give way to cheap, fully capable laptops, netbooks, and the like. Sure, they may not have a GPU as fast as a desktop's, but when will that stop mattering, whether because of need, worthwhile new features, or the ability to actually use them?

"Hardware will become 20X faster, but: Game budgets will increase less than 2X."

You can grab the full PDF here: http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf
 
...and so history repeats itself. I guess some weird predictions never die.

Dumb layman's question: how do you code a game in CUDA and not have it run on a GPU? :oops:
 
That was very readable! Not much of the Carmack style, "refractions are achieved through rerouting the plasma beams inside the depolarisation vortex conundrum".

I liked the (putative) use of the inherent concurrency in pure functional languages, and the atomic/transactional stuff, concepts I'd only read about out of personal curiosity. Good to know about that "multi-tier" approach combining multithreading and vectorising. Every method is bad at some tasks, so with a good generic manycore chip you can mix several unrelated methods.
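For anyone curious, here is a toy C++ sketch of that "multi-tier" idea as I understand it (my own illustration, not code from the slides): worker threads on each core grab chunks of work through an atomic counter, and the inner per-element loop is kept simple and branch-free so the compiler can vectorise it.

#include <algorithm>
#include <atomic>
#include <cstddef>
#include <thread>
#include <vector>

int main() {
    std::vector<float> data(1 << 20, 1.0f);
    const std::size_t chunk = 4096;
    std::atomic<std::size_t> next{0};   // atomically hands out the next chunk of work

    auto worker = [&] {
        for (;;) {
            std::size_t begin = next.fetch_add(chunk);
            if (begin >= data.size()) break;
            std::size_t end = std::min(begin + chunk, data.size());
            // Tier 2: straight-line inner loop, easy for the compiler to vectorise.
            for (std::size_t i = begin; i < end; ++i)
                data[i] = data[i] * 0.5f + 1.0f;
        }
    };

    // Tier 1: one thread per hardware core.
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < cores; ++t) pool.emplace_back(worker);
    for (auto& t : pool) t.join();
}

The transactional memory stuff in the slides goes well beyond a single atomic counter, of course; this just shows the general shape.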

Now I see Intel's Larrabee as a kind of totally mad R&D budget offshoot (after that 80-core chip built for test purposes only).

Lastly, the conclusion seems to be that everything is about BANDWIDTH, DATA and BANDWIDTH. That was a known problem already, but perhaps it's a bigger one than the Moore's Law limit or even the power budget.
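To put a rough number on that (all figures below are my own assumptions for illustration, not numbers from the talk): a chip with around 1 TFLOP/s of compute but only around 100 GB/s of memory bandwidth has to do on the order of 40 operations on every float it fetches, or its ALUs just sit idle.

#include <cstdio>

int main() {
    // Assumed, illustrative figures (roughly 2009-era GPU-class numbers).
    const double flops_per_sec   = 1.0e12;   // ~1 TFLOP/s of arithmetic
    const double bytes_per_sec   = 100.0e9;  // ~100 GB/s of memory bandwidth
    const double bytes_per_float = 4.0;

    double floats_per_sec  = bytes_per_sec / bytes_per_float;   // floats you can stream in
    double flops_per_float = flops_per_sec / floats_per_sec;    // work needed to hide memory

    std::printf("Need ~%.0f flops of work per float fetched to keep the ALUs busy.\n",
                flops_per_float);
}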
 
Oh great, for a while Larrabee held off the "we need fat cores" crowd ... but I see they are already regrouping :/
 
I remember the 6800 launch in Geneva, where Sweeney - to the obvious shock of Nvidia reps present - proclaimed anti-aliasing to be dead. Granted, it's not dead (yet), but the increasing number of titles trading AA-support for some üb0r-effect speaks for itself. :(
 
The GPU as we know it will need a new name. Maybe 'compute processing unit', or in short: 'computer' :p
Now that's something Scott McNealy would've never imagined: The GPU IS Teh Computer! ;)
 
Lastly, the conclusion seems to be that everything is about BANDWIDTH, DATA and BANDWIDTH. That was a known problem already, but perhaps it's a bigger one than the Moore's Law limit or even the power budget.
Well, Moore's Law has been predicted to end sometime around 2010, when we get hit by either the power wall or the bandwidth wall.

Anyway, I still remember when Intel said the Pentium 4 would last nearly a decade and that we'd have dual-core CPUs running at 10+ GHz. And Jensen from NV himself said that by 2010 we'd have real-time ray-traced graphics from our graphics cards (back in 1998-2000).

And I forgot... PC gaming is pretty much dead (in the sense of making a profit). Most game developers will spend their 0.5X budget somewhere else: the mobile gaming platforms, be it iPhone/iPod, NDS, or PSP, where everything from power, bandwidth, heat, storage and speed is limited.
 
I remember the 6800 launch in Geneva, where Sweeney - to the obvious shock of Nvidia reps present - proclaimed anti-aliasing to be dead. Granted, it's not dead (yet), but the increasing number of titles trading AA-support for some üb0r-effect speaks for itself. :(

Sweeney passionately hates multisampling, that's all. The lack of AA in some titles is mostly related to them using the UE3 engine.
 
And I forgot... PC gaming is pretty much dead (in the sense of making a profit). Most game developers will spend their 0.5X budget somewhere else: the mobile gaming platforms, be it iPhone/iPod, NDS, or PSP, where everything from power, bandwidth, heat, storage and speed is limited.

Even if people keep repeating this, it won't become any truer. The boxed PC gaming market is dying, but the online game market is growing faster than any other game sector. Like the mobile platforms, though, we're not talking about high-end hardware here.

Sweeney passionately hates multisampling, that's all. The lack of AA in some titles is mostly related to them using the UE3 engine.

Hate is a strong word. I would say he just doesn't care about AA when making technical decisions.
 
Strange how he downplays DX10 but doesn't mention DX11
*cough*regurgitating*cough*

(September 2008)

Twilight of the GPU: Goodbye, graphics APIs

http://arstechnica.com/gaming/news/2008/09/gpu-sweeney-interview.ars

(March 2008)
DX10 is the last relevant API, Hardware Acceleration will be gone
http://www.tgdaily.com/content/view/36436/118/1/1/

(Jan 2000)
2006-7: CPUs become so fast and powerful that 3D hardware will be only marginally beneficial for rendering relative to the limits of the human visual system, therefore 3D chips will likely be deemed a waste of silicon (and more expensive bus plumbing), so the world will transition back to software-driven rendering.
http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Interviews

I think Mister Sweeney has a bad case of "as long as I shout it often enough and hard enough, it will eventually become true!"

I cannot believe anyone who coded in the '90s doesn't see the benefits of GPU-based rendering. Spending ages getting a donut right with a texture and Phong shading was fun, but there's a reason you want to offload that work.
 
Well, DX11 doesn't count for him, as it is just another "irrelevant" incremental step that doesn't go in his preferred direction.

And remember, kids: the return of software-only 3D solutions is always just 5 years away.
 
Sweeney passionately hates multisampling, that's all. The lack of AA in some titles is mostly related to them using the UE3 engine.
Oh, that we have in common then. I'd much prefer user-selectable SG supersampling, or maybe analytical AA, if it works as well as it sounds. :)

There are other games/engines, though, which unfortunately do not support MSAA: Gothic 3, GTA IV, ArmA 2 and more, not to mention the MSAA problems reported for the upcoming DX11-ported DICE engine when using that compute-shader stuff.

With older games, you could almost universally force supersampling or multisampling and it would work.
 
There are other games/engines, though, which unfortunately do not support MSAA: Gothic 3, GTA IV, ArmA 2 and more, not to mention the MSAA problems reported for the upcoming DX11-ported DICE engine when using that compute-shader stuff.
ArmA 2 has MSAA in-engine; it was added with the first patch. It doesn't work right because of the order of AA resolve and tonemapping, leaving high-contrast edges with aliasing.
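For anyone wondering why the order matters: averaging the raw HDR samples first and then tonemapping is not the same as tonemapping each sample and then averaging, and at a high-contrast edge the difference is exactly the aliasing you still see. A tiny numeric sketch (my own, with a simple Reinhard-style curve standing in for whatever the engine really uses):

#include <cstdio>

// Simple Reinhard-style curve, standing in for the engine's real tonemapper.
static float tonemap(float x) { return x / (1.0f + x); }

int main() {
    // Two MSAA samples straddling a high-contrast edge: bright sky vs dark geometry.
    const float bright = 16.0f;
    const float dark   = 0.1f;

    // Hardware resolve first, tonemap afterwards (the problematic order).
    float resolve_then_map = tonemap(0.5f * (bright + dark));

    // Tonemap each sample, then average (the order you actually want).
    float map_then_resolve = 0.5f * (tonemap(bright) + tonemap(dark));

    std::printf("resolve then tonemap: %.3f\n", resolve_then_map);   // ~0.89, nearly sky-bright
    std::printf("tonemap then resolve: %.3f\n", map_then_resolve);   // ~0.52, a proper mid-tone
}

Instead of a blended mid-tone, the resolved edge pixel comes out nearly as bright as the sky, so the edge still looks hard.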

The fact is MSAA is only useful for all kinds of rendering methods from D3D10.1 onwards, when both the hardware and the API allow the developer to get all of the MSAA sample data and use it as they please. Not sure which version of OpenGL allows equivalent access...

Jawed
 
I cannot believe anyone who coded in the '90s doesn't see the benefits of GPU-based rendering. Spending ages getting a donut right with a texture and Phong shading was fun, but there's a reason you want to offload that work.

Sweeney still sounds bitter after being forced to put 3DFX support into Unreal, having said for months that it was a waste of time and unnecessary. He's been predicting the end of the GPU ever since, and every time he's been wrong.

Just because he manages teams of programmers for 3D engines, it doesn't mean he can see into the far future of such a fast moving and innovative field. If history has told us anything, predictions in the computer field are more likely to put egg on your face than anything else.
 