Sweeney and Richards on the future of 3D

Discussion in 'Rendering Technology and APIs' started by trinibwoy, Sep 24, 2010.

  1. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,491
    Likes Received:
    514
    Location:
    New York
  2. Simon F

    Simon F Tea maker
    Moderator Veteran

    Joined:
    Feb 8, 2002
    Messages:
    4,560
    Likes Received:
    157
    Location:
    In the Island of Sodor, where the steam trains lie
    Started watching the first... have these guys never heard of a tripod?
     
  3. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    15,362
    Likes Received:
    2,554
    So, to save me having to watch all six videos, what is the future of 3D?
     
  4. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Not having watched the videos, I'd have to say... software renderer for Tim.

    @Simon: Semi Accurate... positioning. It's in the name!™
     
  5. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,495
    Likes Received:
    599
    Location:
    WI, USA
    They talk about a lot of stuff.

    They talk about how DirectX is a dead end due to how it fundamentally works. Tim thinks DirectX is holding everything back and is the cause of diminishing returns, despite hardware many times more powerful than years ago. They want to get rid of the usual rasterization artifacts (texture and polygon aliasing). Tim says that if they'd continued working on UE1's software renderer, it would probably have fewer artifacts today than DirectX does. There were aspects of the software renderer that were better than what they could do with 3dfx hardware (due to its limitations). Software innovation outpaces fixed-function hardware, which becomes unimpressive and limiting too quickly.

    There is some chat about how hardware architectures aren't around long enough to be fully explored before they are obsolete.

    They talk about software rasterization on custom hardware or via GPGPU. Also, current GPGPU is pretty useless from both the software and hardware angles. Separate GPU and CPU chips are bad due to communication problems: the two are good at different things and don't work well alone, so they need to communicate quickly, and separation means unusable latency. Cell is interesting but not all that great (lots of things need to change), yet they are stuck dealing with Cell for years to come. They also talk about the economic viability for the various parties involved in the hardware and the games: who would be interested in breaking the mold, who has the power to do so, and so on.
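
    To put rough numbers on that communication problem, here is a quick back-of-envelope (my illustration, not from the talks; the PCIe 2.0 x16 bandwidth and the 60 fps budget are assumed 2010-era figures):

    ```cpp
    // Back-of-envelope: cost of bouncing a framebuffer-sized buffer between a
    // discrete GPU and the CPU. Assumed figures: PCIe 2.0 x16 at roughly
    // 8 GB/s in one direction, and a 60 fps frame budget.
    #include <cstdio>

    int main() {
        const double pcie_bw  = 8.0e9;                // bytes/s, assumed
        const double frame_s  = 1.0 / 60.0;           // 16.7 ms budget
        const double fb_bytes = 1920.0 * 1080.0 * 4;  // one 32-bit 1080p surface

        const double copy_s = fb_bytes / pcie_bw;     // one full-frame copy
        printf("one 1080p copy: %.2f ms (%.1f%% of a 60 fps frame)\n",
               copy_s * 1e3, 100.0 * copy_s / frame_s);

        // If an algorithm needs several CPU<->GPU hand-offs per frame, bus
        // time alone eats the budget before any computation happens.
        for (int hops = 2; hops <= 8; hops *= 2)
            printf("%d hand-offs: %.2f ms of pure transfer\n",
                   hops, hops * copy_s * 1e3);
        return 0;
    }
    ```

    And that's only bandwidth; the round-trip latency of kicking work to the other chip and waiting for the result is what really kills fine-grained cooperation.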

    There is chat about multicore CPUs and their chicken-and-egg issue: the CPUs were designed while few multithreaded apps existed. Some compiler discussion. Epic made big investments in developing for multicore once it became clear that it was the future. Chat about the major challenge of heavily threading games: you need to find ways to go even more fine-grained to utilize more and more threads, and it isn't getting easier. Cue more chat of GPU/CPU "fusion". The current trend of ever-wider general-purpose CPUs is not particularly useful in the long term.
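
    For a flavour of what "more fine-grained" means in code, here is a minimal sketch (mine, not Epic's; it assumes C++11 std::async and invents a trivial per-entity update as the workload). Instead of one thread per subsystem, a frame's work is cut into many small tasks that can spread over however many cores exist:

    ```cpp
    // Fine-grained task parallelism sketch, assuming C++11.
    // The frame's work is split into many more chunks than there are cores,
    // so the same code scales as core counts grow.
    #include <cstdio>
    #include <cstddef>
    #include <future>
    #include <vector>

    int main() {
        std::vector<float> entities(100000, 1.0f);   // hypothetical per-entity state
        const std::size_t chunks = 32;               // far more tasks than cores
        const std::size_t step = entities.size() / chunks;

        std::vector<std::future<float>> tasks;
        for (std::size_t c = 0; c < chunks; ++c) {
            const std::size_t lo = c * step;
            const std::size_t hi = (c + 1 == chunks) ? entities.size() : lo + step;
            tasks.push_back(std::async(std::launch::async, [&entities, lo, hi] {
                float sum = 0.0f;                    // stand-in for real per-entity work
                for (std::size_t i = lo; i < hi; ++i)
                    sum += entities[i] * 0.5f;
                return sum;
            }));
        }

        float total = 0.0f;
        for (auto& t : tasks)
            total += t.get();                        // join all the small tasks
        printf("total = %f\n", total);
        return 0;
    }
    ```

    A real engine would use a work-stealing scheduler rather than raw std::async, but the decomposition problem, finding enough independent chunks, is the hard part they're describing.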

    Power consumption comments. Andrew thinks that the next consoles can't exceed the power usage of the current ones.
     
    #5 swaaye, Sep 24, 2010
    Last edited by a moderator: Sep 24, 2010
  6. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    Quelle surprise! :sleeping:
     
  7. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,495
    Likes Received:
    599
    Location:
    WI, USA
    :cool:
     
  8. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    Here is the transcript of the conversation (thanks Harison for pointing it out to us ;) )

    I'm really grateful to the ones who wrote this down, because between my bad English and the noisy vids I barely understand one word in two...

    By the way, I missed a pretty interesting part:
    I find this POV interesting, especially after reading this thread.
    It's a pretty fresh take, as comparing a fusion chip (obviously one chip) to a dedicated CPU + dedicated GPU (obviously two chips) is the basis of most conversations I read here and there.
    Dual-socket mobos to strike back?
     
  9. HAL-10K

    Newcomer

    Joined:
    Jul 28, 2002
    Messages:
    32
    Likes Received:
    0
    Yes, he has accomplished a lot, but it is simply absurd how Sweeney insists that a software rendering pipeline is the certain future of game rendering.

    Maybe there will be some cases where an alternative to classic rasterization could lead to a new pipeline standard. But why in hell would this not be implemented in (a) hardware (standard)? He has completely lost touch with reality by suggesting that even a marginal number of people in the industry are interested in creating a complete render pipeline from the ground up.
     
  10. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    Well, I would not say absurd. When fixed-function GPUs overtook software rendering, the hardware that would have allowed software rendering to survive didn't exist, and more importantly it still doesn't exist. As AR stated, one could have tried (some actually did try) to push such a chip and failed for whatever reason: power consumption, missed clock speeds, etc. Even if one had succeeded, the point is that history took another direction. Does that make his statement or belief absurd? Honestly, I don't think so.
    On the other hand, A. Richards' points and statements are more "down to earth" and interesting in their own way, as they account for the world as it is.
     
  11. 3dilettante

    Legend Alpha

    Joined:
    Sep 15, 2003
    Messages:
    8,196
    Likes Received:
    3,159
    Location:
    Well within 3d
    The implication of the software-only approach is that everyone not willing to build their own software platform and ecosystem would buy a software engine from a company that builds software engines.
     
  12. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,845
    Likes Received:
    329
    Location:
    35.1415,-90.056
    And I'm sure Epic and id would be right there to help you out with that, for a phenomenal fee ;)
     
  13. Colourless

    Colourless Monochrome wench
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,274
    Likes Received:
    30
    Location:
    Somewhere in outback South Australia
    His software rendering rhetoric is annoying to me. He *could* have developed a high-performance software renderer if he really thought it was better than hardware. Why would he continue to develop hardware renderers if he didn't think that was the way to go? And if he still doesn't think it's the way to go, why does he keep doing it?

    The major issue with a software renderer is that the CPU just doesn't have anywhere near enough memory bandwidth to support a software renderer that gets anywhere near the performance of a GPU. A Cell-like architecture might be able to produce a quick deferred renderer by rendering to local store, but an x86 CPU would be shithouse. I almost think he is talking about what software rendering would be like if CPU advancement had gone exactly the way he wanted since 1997, and it hasn't.
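
    A back-of-envelope on the bandwidth point (my numbers, all assumed: 1080p at 60 fps, 3x overdraw, 32-bit colour and depth, one bilinear fetch of four 32-bit texels per shaded pixel; ~21 GB/s for 2010-era dual-channel DDR3 against ~154 GB/s for an HD 5870's GDDR5):

    ```cpp
    // Naive framebuffer + texture traffic for a software renderer.
    // All workload figures are illustrative assumptions, not measurements.
    #include <cstdio>

    int main() {
        const double pixels   = 1920.0 * 1080.0;
        const double fps      = 60.0;
        const double overdraw = 3.0;
        const double shaded   = pixels * overdraw;  // fragments actually shaded

        const double color_w  = shaded * 4;         // colour writes (32-bit)
        const double depth_rw = shaded * 4 * 2;     // depth read + write
        const double tex_r    = shaded * 4 * 4;     // bilinear: 4 texels/pixel
        const double per_sec  = (color_w + depth_rw + tex_r) * fps;

        printf("required:                     %.1f GB/s\n", per_sec / 1e9);
        printf("dual-channel DDR3 (~21 GB/s): %.0f%% consumed\n",
               100.0 * per_sec / 21e9);
        printf("HD 5870 GDDR5 (~154 GB/s):    %.0f%% consumed\n",
               100.0 * per_sec / 154e9);
        return 0;
    }
    ```

    Even this naive count eats about half the CPU's theoretical bandwidth (which is shared with the rest of the game, and sustained bandwidth is well below theoretical), versus under 7% of the GPU's.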
     
  14. Harison

    Newcomer

    Joined:
    Mar 29, 2010
    Messages:
    195
    Likes Received:
    0
    To be fair, both sides are right and wrong :smile: For now, the hardware implementation is the way to go because it delivers much faster performance, with acceptable quality, within manageable power consumption. DirectX, while not perfect, unifies the market and simplifies the creation of new games. So for the foreseeable future, the hardware approach + DirectX is the way to go.

    However, Tim is right that devs need more flexibility and a more programmable approach, and that's exactly where the industry is moving, just in baby steps: DirectX and each new generation of video cards are more flexible than ever. Larrabee would be what Sweeney is praying for, but IMO it will take a few more years until Intel releases drivers for gamers (the chips will be out to the mass market as soon as next year); give it several more years to mature, and you get exactly that: fully programmable chips, if you want them.

    The rest of the market will stick with DirectX for many years to come, regardless of what HW they run it on: AMD, Intel or NV. And that's good. As much as I respect Sweeney and Carmack, I would hate it if the market depended totally on a fully software approach; it would only make them richer, while splintering the market across different engines. It could fix a few DirectX issues, but it would introduce a truckload of other issues and reduce the overall quality of games. Not everyone is as talented or has as many resources as Sweeney/Carmack, and if you think DirectX is buggy/limited, watch out for loads of half-baked software engines, filled with bugs.

    For now, I'm happy with HW + DirectX :wink:
     
  15. rpg.314

    Veteran

    Joined:
    Jul 21, 2008
    Messages:
    4,298
    Likes Received:
    0
    Location:
    /
    I think that while the programmability of GPUs will continue to grow, some amount of fixed-function hardware (like the rasterizer) will always be there.
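
    For reference, here is the job that particular bit of fixed-function hardware does, as a deliberately naive software sketch (mine: edge functions over a bounding box; no clipping, no fill rules, no sub-pixel precision, no hierarchical traversal):

    ```cpp
    // Minimal software triangle rasterizer using edge functions: the inner
    // loop that a hardware rasterizer evaluates massively in parallel.
    #include <algorithm>
    #include <cstdio>

    struct Vec2 { float x, y; };

    // Signed twice-area of triangle (a, b, c); its sign says which side of
    // the edge a->b the point c lies on.
    static float edge(const Vec2& a, const Vec2& b, const Vec2& c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    static void rasterize(const Vec2& v0, const Vec2& v1, const Vec2& v2,
                          int w, int h, char* fb) {
        // Bounding box of the triangle, clamped to the render target.
        int x0 = std::max(0, (int)std::min({v0.x, v1.x, v2.x}));
        int y0 = std::max(0, (int)std::min({v0.y, v1.y, v2.y}));
        int x1 = std::min(w - 1, (int)std::max({v0.x, v1.x, v2.x}));
        int y1 = std::min(h - 1, (int)std::max({v0.y, v1.y, v2.y}));

        for (int y = y0; y <= y1; ++y)
            for (int x = x0; x <= x1; ++x) {
                const Vec2 p = { x + 0.5f, y + 0.5f };  // sample pixel centre
                // Inside if the sample is on the same side of all three edges.
                if (edge(v0, v1, p) >= 0.0f && edge(v1, v2, p) >= 0.0f &&
                    edge(v2, v0, p) >= 0.0f)
                    fb[y * w + x] = '#';
            }
    }

    int main() {
        const int W = 40, H = 20;
        char fb[W * H];
        std::fill(fb, fb + W * H, '.');
        rasterize({2, 2}, {36, 6}, {12, 18}, W, H, fb);  // consistently wound
        for (int y = 0; y < H; ++y)
            printf("%.*s\n", W, fb + y * W);
        return 0;
    }
    ```

    Everything in that loop is trivially parallel and fixed in structure, which is exactly why it maps so well onto dedicated hardware.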
     
  16. HAL-10K

    Newcomer

    Joined:
    Jul 28, 2002
    Messages:
    32
    Likes Received:
    0
    He is not talking about making the existing render pipeline more programmable. He is saying that hardware should become completely general-purpose, so that it is not designed around any particular render pipeline.

    I don't understand why there are still people here who say that things like a software rasterizer/raycaster/whatever will turn out all right in the future, considering that even Intel now openly says this idea is ridiculous.

    It has been PROVEN with Larrabee that hardware tailored to a certain render pipeline is at least two times more efficient than general-purpose hardware (Larrabee even had texture units). No sane company would pay that price for the freedom to write its own, COMPLETE render pipeline. And the worst part is that the vast majority don't even see that goal as a benefit in the first place.

    Yes, we might see some completely new render pipelines in the future, but it would be insane not to make them a standard and mold them into specialized hardware.
     
  17. liolio

    liolio Aquoiboniste
    Legend

    Joined:
    Jun 28, 2005
    Messages:
    5,724
    Likes Received:
    194
    Location:
    Stateless
    Who knows, if one were to spend a lot of time on Larrabee, he could prove Sweeney right.
    I mean, he might end up with lower fps and lower resolution, but with "better" pixels, as Sweeney put it.
    A huge problem with Larrabee, no matter what "absolute" performance the chip achieved in the end, is that it had to play a game it was not really designed to play.

    But Sweeney indeed asks a lot and is a bit dishonest. From my outsider POV, real-time rendering looks like an adventure built by people and companies, some of which have disappeared. Sweeney would have wanted a stable language; after so many man-years invested, that language still doesn't exist. It's kind of a stretch: not only did the hardware not exist when software rendering died, but neither did the software language.
    Actually, I wonder whether he would have been howling with the wolves if, back then, a company had come up with hardware that allowed software rendering to persist. Say four tiny, simplistic VLIW cores; I can hear him and others screaming as they screamed some years ago (when multi-core CPUs happened): we don't want multi-core, we want more serial performance; which languages are we supposed to use to make the most of such a chip; etc.
    Sweeney has to be French, he is always complaining :lol:
     
    #17 liolio, Sep 29, 2010
    Last edited by a moderator: Sep 29, 2010
  18. Freak'n Big Panda

    Regular

    Joined:
    Sep 28, 2002
    Messages:
    898
    Likes Received:
    4
    Location:
    Waterloo Ontario
    I'm interested in why Sweeney would say that DirectX is the reason we are seeing diminishing returns in graphics... I thought the reason was simply physical reality, which demands exponentially more horsepower for every unit increase in detail. Does anybody have any other thoughts on that? How could DirectX be producing these diminishing returns?

    As for the whole software rendering thing, I doubt it, simply because any algorithm can be accelerated in hardware, and hardware is always faster.
     
  19. flynn

    Regular

    Joined:
    Jan 8, 2009
    Messages:
    400
    Likes Received:
    0
    Unfortunately, this is not true. Otherwise you would see accelerator cards for all the major offline renderers. Some attempts have been made, and all of them failed miserably.

    Current real-time 3D graphics are constrained by what GPUs can do.
     
  20. MDolenc

    Regular

    Joined:
    May 26, 2002
    Messages:
    690
    Likes Received:
    425
    Location:
    Slovenia
    It's also a question of what pays off and what doesn't. Is there really a market for dedicated hardware for every software iteration of every major offline renderer? How much would such hardware cost to design, and how much would you be able to sell it for?
     