V-Sync vs. FPS (revisited)

Discussion in 'Architecture and Products' started by Chris_T, Oct 22, 2005.

  1. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    Agreed, and tearing SUCKS!!!!!

    I don't think I play any games with v-sync disabled; I always force it on through the control panel if the game doesn't have an option.

    More D3D games need to support triple buffering, although Ray Adams' ATI Tray Tools does provide a great work-around. :)
     
  2. Diplo

    Veteran

    Joined:
    Apr 17, 2004
    Messages:
    1,474
    Likes Received:
    64
    Location:
    UK
    Well, it's down to semantics, but I personally wouldn't refer to a Voodoo 2 as a high-end card simply because it was once cutting edge, but there you go :)
     
  3. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    No, I'd refer to it as a well-aged high-end card :)
     
  4. Rolf N

    Rolf N Recurring Membmare
    Veteran

    Joined:
    Aug 18, 2003
    Messages:
    2,494
    Likes Received:
    55
    Location:
    yes
    I beg to differ! There is VSync in OpenGL. I don't know who your lead programmer friend is, how old that quote is, or what context it came from, but I know that there is VSync in OpenGL because I have used it millions of times.
    And glFlush/glFinish has nothing at all to do with VSync. It's definitely not usable as a replacement; that suggestion is just insane.
     
  5. tahrikmili

    Newcomer

    Joined:
    Sep 16, 2003
    Messages:
    68
    Likes Received:
    0
    Location:
    Istanbul, Turkey
    I used to think VSync was cool on my 9800XT, when I could run everything at 1280x960 with 4xAA/16xAF at 100fps...

    Nowadays, when I only get 40-50fps with 0xAA/8xAF, enabling VSync REALLY hurts performance.

    The lack of D3D triple buffering in Catalyst makes my troubles even worse; I've just had to get used to tearing. I'm hoping ATI will provide the option to enable triple buffering for D3D as well as OGL so I can turn VSync on once again.
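    The slowdown described above follows from how vsync with double buffering quantizes the frame rate: a finished frame can only be presented on a refresh boundary, so a renderer that just misses one refresh waits for the next. A minimal sketch of that arithmetic (the numbers are illustrative, not from the post):

    ```c
    #include <stdio.h>

    /* With vsync and double buffering, each frame occupies the display for a
     * whole number of refresh cycles, so the effective rate drops to
     * refresh / ceil(refresh / raw). Triple buffering avoids this stall
     * because rendering can continue into a third buffer. */
    static int effective_fps(int refresh_hz, int raw_fps)
    {
        int refreshes_per_frame = (refresh_hz + raw_fps - 1) / raw_fps; /* ceil */
        return refresh_hz / refreshes_per_frame;
    }

    int main(void)
    {
        /* Rendering at 50 fps on a 60 Hz display: each frame spans two
         * refreshes, so vsync drops the rate all the way to 30 fps. */
        printf("%d\n", effective_fps(60, 50));
        return 0;
    }
    ```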
     
  6. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    There's WGL_EXT_swap_control and GLX_SGI_swap_control. They specify the minimum number of refresh cycles a frame is shown for, and default to 1, which means vsync on. Curiously, the GLX version doesn't allow you to disable vsync, as it doesn't accept 0.
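    For reference, the WGL version of that extension is typically used like this. This is a Windows-only sketch; it assumes an OpenGL context is already current (wglGetProcAddress only works with one), so it is a fragment rather than a standalone program:

    ```c
    #include <windows.h>
    #include <GL/gl.h>

    typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

    /* Sets the minimum number of refresh cycles per frame:
     * 1 = vsync on (the default), 0 = vsync off (WGL only, per the post
     * above -- the GLX variant rejects 0). */
    void set_swap_interval(int interval)
    {
        PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
            (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
        if (wglSwapIntervalEXT)
            wglSwapIntervalEXT(interval);
    }
    ```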
     
  7. SmuvMoney

    Newcomer

    Joined:
    May 24, 2003
    Messages:
    67
    Likes Received:
    0
    Location:
    Chicago, IL
    To be honest, that probably won't ever happen in the Catalyst or Detonator suite. Unlike OpenGL, I believe the number of frame buffers in Direct3D has to be specified by the game or application at startup; in OGL, it's the driver's call. This is why ATi - and now nV, as of a few releases ago - include triple buffering as an OpenGL-only option. There are third-party programs which give you the option to force triple buffering in D3D; they essentially pass the startup parameter(s) that allow it. ATI Tray Tools and DXTweaker, from one of our forum members whose name I've suddenly forgotten (Demirug, I think?), come to mind. I believe natively including this option would prevent or disallow WHQL certification for ATi or nV. Don't quote me on the WHQL statement or the author of the DXTweaker tool, though. :)
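    The startup parameter in question is the back buffer count in the presentation parameters a D3D9 application fills in before creating its device. A minimal fragment showing just that part (field values are illustrative; this is not a complete device setup):

    ```c
    #include <string.h>
    #include <d3d9.h>  /* DirectX SDK, Windows only */

    /* Fills in the presentation parameters the app passes to CreateDevice().
     * The back buffer count is fixed here, at startup -- the driver never
     * gets a say, which is why forcing triple buffering in D3D requires
     * intercepting these parameters (as ATI Tray Tools / DXTweaker do). */
    void fill_present_params(D3DPRESENT_PARAMETERS *pp)
    {
        memset(pp, 0, sizeof(*pp));
        pp->Windowed             = TRUE;
        pp->SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp->BackBufferCount      = 2;  /* two back buffers + front = triple buffering */
        pp->PresentationInterval = D3DPRESENT_INTERVAL_ONE;  /* vsync on */
    }
    ```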
     
  8. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Just note that nVidia doesn't list triple buffering as an OpenGL-only option, but as a global driver option. I haven't tested it, however. Maybe I'll do it tonight.
     
  9. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    I have; at least for me, it did nothing in D3D games. Which makes sense, as from what I understand, D3D and the application determine how many back buffers are to be used, so the driver doesn't get any say in that.
     
  10. overclocked

    Veteran

    Joined:
    Oct 25, 2002
    Messages:
    1,317
    Likes Received:
    6
    Location:
    Sweden
    Chalnoth, the thing you said about the extra memory footprint for triple buffering - is this something to consider more on cards with 128MB of memory? How much more memory/bandwidth does it use compared to if you uncheck this feature in the ForceWare drivers - is it significant?
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Well, first of all, the memory bandwidth hit is basically zero (provided the hardware can make use of buffer swapping).

    The extra storage for the frame that is done rendering, but is waiting for the next refresh cycle, doesn't require any memory bandwidth (it's just sitting there).

    So, the only cost is in memory space. And if the hardware does the downsampling at buffer swap (i.e. so that the front and middle buffers are both downsampled), then, for example, a 1600x1200x32 buffer would only cost about 7.5MB. So the extra memory hit isn't that big any longer, either.
     
  12. overclocked

    Veteran

    Joined:
    Oct 25, 2002
    Messages:
    1,317
    Likes Received:
    6
    Location:
    Sweden
    If we take 1024x768 with 4xAA as an example, then? And is this compared to "normal" double buffering, or is that the total for the two/three buffers?
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Well, AA makes no difference, provided the downsampling is done at buffer swap. You can do the calculation for 1024x768 yourself. It's just 1024x768x32 / 8 (dividing by 8 to convert bits to bytes).

    And yes, this is the difference from double buffering. It's obvious, really: it's just one extra buffer, so the cost is the number of pixels (e.g. 1024x768) times the bytes per pixel (4 bytes in the case of 32-bit color).
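    The arithmetic from the last two posts, sketched out (assuming 32-bit color and downsampling at buffer swap, so the AA level doesn't enter into it):

    ```c
    #include <stdio.h>

    /* Extra memory for triple vs. double buffering: exactly one additional
     * buffer of width * height * 4 bytes (32-bit color). With
     * downsample-on-swap the stored buffers are at display resolution
     * regardless of the AA level. */
    static long extra_bytes(long width, long height)
    {
        return width * height * 4;
    }

    int main(void)
    {
        printf("1024x768:  %ld bytes (%.1f MiB)\n",
               extra_bytes(1024, 768),
               extra_bytes(1024, 768) / (1024.0 * 1024.0));
        printf("1600x1200: %ld bytes (%.1f MiB)\n",
               extra_bytes(1600, 1200),
               extra_bytes(1600, 1200) / (1024.0 * 1024.0));
        return 0;
    }
    ```

    For 1024x768 that comes to 3 MiB, and for 1600x1200 about 7.3 MiB - consistent with the "about 7.5MB" figure above.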
     
  14. SmuvMoney

    Newcomer

    Joined:
    May 24, 2003
    Messages:
    67
    Likes Received:
    0
    Location:
    Chicago, IL
    Does the help or mouseover tooltip for that option specify whether or not it is for OpenGL only?
     
  15. kyleb

    Veteran

    Joined:
    Nov 21, 2002
    Messages:
    4,165
    Likes Received:
    52
    No, it doesn't specify anything about APIs.
     