The End of The GPU Roadmap

Discussion in 'Architecture and Products' started by Berek, Aug 15, 2009.

  1. Berek

    Regular

    Joined:
    Oct 17, 2004
    Messages:
    271
    Likes Received:
    4
    Location:
    Houston, TX
    http://www.ubergizmo.com/15/archives/2009/08/tim_sweeney_the_end_of_the_gpu_roadmap.html

    Sweeney sees 2020 as the point when graphics will be realistic enough that we won't need more feature-rich GPUs, just faster ones. When was the last time a prediction made 10+ years out turned out to be accurate?

    "In the next generation we’ll write 100-percent of our rendering code in a real programming language--not DirectX, not OpenGL, but a language like C++ or CUDA," he said last year.

    Though we're already beginning to see the era of desktop systems as a whole cave in the face of cheap and fully functional laptops, netbooks, etc. Sure, they may not have a GPU as fast as a desktop's, but when will that stop mattering as much, whether through limits of need, worthy new features, or the ability to use them?

    "Hardware will become 20X faster, but: Game budgets will increase less than 2X."

    You can grab the full PDF here: http://graphics.cs.williams.edu/archive/SweeneyHPG2009/TimHPG2009.pdf
     
    #1 Berek, Aug 15, 2009
    Last edited by a moderator: Aug 16, 2009
  2. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    ...and so history repeats itself....I guess some weird predictions never die.

    Dumb layman's question: how do you code a game on CUDA and not let it run on a GPU? :shock:
     
  3. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,268
    Likes Received:
    1,785
    Location:
    Winfield, IN USA
    Oh noes, teh ends of the GPU!

    :runaway: :runaway: :runaway: :runaway:
     
  4. Martin Eddy

    Regular

    Joined:
    Oct 5, 2003
    Messages:
    491
    Likes Received:
    4
    Location:
    Australia,Brisbane
    I've heard that prediction so many times I've learned to not listen anymore.
     
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,430
    Likes Received:
    433
    Location:
    New York
    He's not predicting the end of the GPU, guys. Just calling for more programmability and fewer "APIs".
     
  6. Blazkowicz

    Legend Veteran

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    That was very readable! Not much Carmack style, "refractions are achieved through rerouting the plasma beams inside the depolarisation vortex conundrum".

    I liked the (putative) use of the inherent concurrency in pure functional languages, and the atomic/transaction stuff - concepts I'd only read about for general knowledge. Good to know about that "multi-tier" approach with multithreading and vectorising. All methods suck for some tasks, so if you've got a good generic manycore chip, you can use several unrelated methods.

    Now I see Intel's Larrabee as a kind of totally mad R&D budget offshoot (after that 80-core chip for test purposes only).

    Lastly, the conclusion seems to be that everything is about BANDWIDTH, DATA and BANDWIDTH. That was a known problem already, but perhaps it's a bigger one than the Moore's Law limit or even the power budget.
     
  7. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    6,806
    Likes Received:
    473
    Oh great, for a while Larrabee held off the "we need fat cores" crowd ... but I see they are already regrouping :/
     
  8. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,420
    Likes Received:
    179
    Location:
    Chania
    Fat should be misspelled as phat to emphasise it :twisted:
     
  9. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
    I remember the 6800 launch in Geneva, where Sweeney - to the obvious shock of Nvidia reps present - proclaimed anti-aliasing to be dead. Granted, it's not dead (yet), but the increasing number of titles trading AA-support for some üb0r-effect speaks for itself. :(
     
  10. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    6,806
    Likes Received:
    473
    Now he is calling for greater precision in the AA.
     
  11. madyasiwi

    Newcomer

    Joined:
    Oct 7, 2008
    Messages:
    194
    Likes Received:
    32
    The GPU as we know it will need a new name. Maybe 'compute processing unit', or in short: 'computer' :razz:
    Now that's something Scott McNealy would never have imagined: the GPU IS Teh Computer! :wink:
     
  12. iwod

    Newcomer

    Joined:
    Jun 3, 2004
    Messages:
    179
    Likes Received:
    1
    Well, Moore's Law watchers have predicted that it will end sometime around 2010, when we get hit by either the power wall or the bandwidth wall.

    Anyway, I still remember when Intel said the Pentium 4 would last nearly a decade and we would have dual-core CPUs running at 10+ GHz. And Jensen from NV himself said that by 2010 we would have real-time ray-traced graphics from our graphics cards (back in 1998-2000).

    And I forgot... PC gaming is pretty much dead (in the sense of making a profit)... Most game developers will spend their 0.5X budget somewhere else: mobile gaming platforms, be it iPhone/iPod, NDS, or PSP.
    Where everything from power/bandwidth/heat to storage and speed is limited.
     
  13. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    Sweeney passionately hates multisampling, that's all. The lack of AA in some titles is mostly related to them using the UE3 engine.
     
  14. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    17,268
    Likes Received:
    1,785
    Location:
    Winfield, IN USA
    Odd, I passionately hate those that hate multisampling. :???:
     
  15. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    However often people repeat this, it doesn't become any truer. The boxed PC gaming market is dying, but the online game market is growing faster than any other game sector. Like the mobile platforms, though, that's not high-end hardware territory.

    Hate is a strong word. I would say he just doesn't care about AA when making technical decisions.
     
  16. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Strange how he downplays DX10 but doesn't mention DX11.
    *cough*regurgitating*cough*

    (September 2008)

    Twilight of the GPU, Goodbye, graphics APIs

    http://arstechnica.com/gaming/news/2008/09/gpu-sweeney-interview.ars

    (March 2008)
    DX10 is the last relevant API, Hardware Acceleration will be gone
    http://www.tgdaily.com/content/view/36436/118/1/1/

    (Jan 2000)
    2006-7: CPUs become so fast and powerful that 3D hardware will be only marginally beneficial for rendering relative to the limits of the human visual system; therefore 3D chips will likely be deemed a waste of silicon (and more expensive bus plumbing), so the world will transition back to software-driven rendering
    http://www.scribd.com/doc/93932/Tim-Sweeney-Archive-Interviews

    I think Mr. Sweeney has a bad case of "as long as I shout it often enough, hard enough, it will eventually become true!"

    I cannot believe anyone who coded in the '90s doesn't see the benefits of GPU-based rendering. Spending ages getting a donut right with a texture and Phong shading was fun, but there's a reason you want to offload that work.
     
    #16 neliz, Aug 17, 2009
    Last edited by a moderator: Aug 17, 2009
  17. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    Well, DX11 doesn't count for him, as it is just another "irrelevant" incremental step that doesn't go in his preferred direction.

    And remember, kids: the return of software-only 3D solutions is always just five years away.
     
  18. CarstenS

    Veteran Subscriber

    Joined:
    May 31, 2002
    Messages:
    4,798
    Likes Received:
    2,056
    Location:
    Germany
    Oh, that we have in common then. I'd much prefer user-selectable SG supersampling - or maybe analytical AA, if it works as well as it sounds. :)

    There are other games/engines, though, which unfortunately do not support MSAA: Gothic 3, GTA IV, ArmA 2 and more - not to mention the problems reported for the upcoming DX11-ported DICE engine when using that compute-shader stuff.

    With older games, you could almost universally force supersampling or multisampling and it would work.
     
  19. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    ArmA 2 has MSAA in-engine; it was added with the first patch. It doesn't work right because of the order of AA resolve and tonemapping, which leaves high-contrast edges aliased.

    The fact is, MSAA only becomes usable with all kinds of rendering methods from D3D10.1 onwards, when both the hardware and the API let the developer fetch all of the MSAA sample data and use it as they please. Not sure which version of OpenGL allows equivalent access...

    Jawed
     
  20. Bouncing Zabaglione Bros.

    Legend

    Joined:
    Jun 24, 2003
    Messages:
    6,363
    Likes Received:
    82
    Sweeney still sounds bitter after being forced to put 3dfx support into Unreal after months of saying it was a waste of time and unnecessary. He's been predicting the end of the GPU ever since, and every time he's been wrong.

    Just because he manages teams of programmers building 3D engines doesn't mean he can see into the far future of such a fast-moving and innovative field. If history has taught us anything, it's that predictions in the computing field are more likely to put egg on your face than anything else.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.