OpenGL deprecation myths

Discussion in 'Rendering Technology and APIs' started by rpg.314, Oct 23, 2009.

  1. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
Unfortunately I only had access to the high-level planning for Direct3D 10, where tessellation was listed as a feature. But based on what I have seen, I expect it was equivalent to the tessellation on the Xbox 360.
     
  2. Novum

    Regular

    Joined:
    Jun 28, 2006
    Messages:
    335
    Likes Received:
    8
    Location:
    Germany
    And how is it done there?
     
  3. MrGaribaldi

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    611
    Likes Received:
    0
    Location:
    In transit
Well, Blizzard, EA and TransGaming are contributing members of Khronos, and they gave input on 3.0.
Additionally, Khronos ran a "school" at GDC, so they do seem interested in getting developers to use it, but (as far as I can tell) not quite to the point of actively going out and asking for input.

repi tweeted that he was going to a meeting "with the real powers that be at Intel in a few weeks about future gamedev parallel SW/HW utopia."

Khronos could probably benefit from doing something similar with lead graphics devs from the game industry, to find out what they think are the major hurdles to using OpenGL instead of DX.
     
    #43 MrGaribaldi, Oct 28, 2009
    Last edited by a moderator: Oct 28, 2009
  4. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
It’s a fixed-function unit that can generate barycentric coordinates as additional vertex shader inputs. These coordinates are based either on a fixed tessellation factor or on per-edge factors read from memory. Adaptive tessellation would therefore require two passes instead of one.
     
  5. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
I view OpenGL's model as something similar to the X Window System: a strict separation of functionality based on the usage requirements and hardware of the time, resulting in very efficient use of resources, but one that can be very restrictive and break down severely.

A while back, I was running my PCI Express slot at 1x. In Windows with DirectX, games often slowed to a crawl (even old DX7 games) on a 9800 GTX+. Under Wine, translated to OpenGL, the same games still ran at full speed. Not identical code, but I'd assume as close as you can get, probably closer even than a port of a D3D engine to OpenGL. My guess: OpenGL lends itself more to a dumb display model and not so much to a coprocessor model.
     
  6. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
Sounds like a driver issue to me, not an API issue. In Windows, some resources were probably being placed in system memory, causing slow performance over the slow PCIe slot. The OpenGL driver under Wine was probably putting those resources into the GPU's on-board memory instead.
     
  7. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
you may know about those "Hitler rants about X" videos. I find the well-made ones hilarious.
There's one that's surprisingly on-topic!

    Hitler Not Impressed With OpenGL 3.0
    http://www.youtube.com/watch?v=sddv3d-w5p4

    (I had to use a US proxy as youtube randomly disallows some videos)
     