ATI Benchmarking Whitepaper

Discussion in 'Graphics and Semiconductor Industry' started by Dave Baumann, Oct 15, 2003.

  1. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    I agree, this is one optimisation that has remained 'beyond the pale' for many years and I hope it remains so. But let's make sure we all watch for it, by keeping the framerate down to the point where we might see it!

    The point I'm making is that if the original rate is down under the refresh rate, then it is possible to spot, while if it is above vsync, then it is virtually impossible (although backend scene capture should still be able to work it out if someone is prepared to go through every frame and count the tears - since we don't have backend scene capture, this is pretty much a moot point!).

    The ideal solution for me is a reasonably high refresh rate (the ones available are all good) plus triple buffering - I hate tearing, and I find framerate stutter easier to put up with.
     
  2. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    I think I can at least partially agree here--that most people are not much interested in the "why" behind differences in IQ. But I'd have to disagree if you are asserting that they aren't interested in whether or not IQ differences can be demonstrated to exist. I really think most people who read hardware reviews would be interested in knowing about such differences, even if no attempt is made to explain them in detail. To that end the important thing would be to demonstrate what differences in IQ, if any, exist. For instance, when fog isn't rendered, lights in a scene (or lighting in general) aren't rendered the same way, when FSAA doesn't work as advertised, shader effects don't render properly, etc. I would think that most people taking the time to read a review would be interested in this information--those that aren't would likely not read the review at all, but rather make a purchasing decision based on some other criteria--like a friend's recommendation, for instance.

    Basically, I would think that the "why" of rendering differences is not important in a review from the standpoint of the general consumer--but a comparison of rendering IQ between products is very important to a substantive product review.

    I agree with you on the frustration of reading remarks like, "It all looks the same to me," and so forth. But really, all those people are doing is admitting that they haven't actually looked for any differences in rendering IQ.

    What I'm suggesting for ATi as a part of its reviewer's guide is that they walk a reviewer through the steps of what to look for when evaluating differences in IQ among competing products--laying out a basic, practical method for doing it to get them started. You could start with a discussion on the "proper" way to grab a frame, up to and including screen shot examples illustrating the types of things commonly spotted. The screen shots would show them what to look for, in other words, because I agree with you that some of them simply don't know. You might even begin with a summary of why IQ differences affect frame-rate performance, which some might not know or clearly understand. Basically, I think a "reviewer's guide" should illustrate a basic framework of a coherent product review--especially since most product reviews are actually product comparisons. It may not move mountains, but something like that couldn't hurt...:)
     
  3. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Heh...:) OK, I can't see how...:) How does turning off vsync enable my CPU to process more data at a faster rate, or change the load on the card, for instance? In the cases I refer to, simply turning off vsync resolved the problem. Therefore, the vsync frame-rate cap governed the performance of the 3d card in those instances. Right?

    The way it looked to me was that the vsync cap not only affects the maximum frame rate, it also affects the minimum, since the minimum can never be above the maximum cap. My idea was that the minimum frame rate for those scenes needed to be higher than the vsync-forced cap allowed in order for those scenes to process smoothly in terms of visible frames. Since turning off vsync solved the problem, that would seem to be the case, and would seem to indicate that vsync had actually governed the frame-rate performance of the card--in those instances. I mean, the data to be processed didn't change, nor did the resolution, nor any other factor--except that the vsync cap was removed. The frame rate, as I recall (UT had a counter), picked up quite a bit at the same time, too.

    It just appeared to me that had it been merely a matter of data load on a part of the system then turning vsync off would not have cured the problem.
     
  4. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Exactly. :)
    And looking back through time, I draw the same conclusion you (partly) did - that over the last several years expectations regarding graphics quality have increased continuously, whereas newly released games have had roughly the same typical framerates as games had 5-7 years ago running on the hardware of their time.

    This means that pretty much all the additional horsepower we've gained over these years has been invested by the game designers in increasing graphics quality rather than frame rate. So although reviews still use predominantly fps graphs, what the graphs represent is not merely "achievable fps", but actually the trade-off between frame rate and quality level (as on this site).

    Since, as you say, the goal posts are moving as regards graphics quality (although seemingly very little for frame rate), this trade-off is likely to be critical for the foreseeable future, and reviews will continue to reflect that.
    Which is actually eminently reasonable.
     
  5. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    Sadly, this is nothing that can be fixed by one reviewer, one IHV, or one set of directions. "Joe Public" is already swayed by FPS numbers he doesn't know how to put into perspective, so things will be even more difficult with IQ examinations. People like Joe can only get better if they have the inclination to do the broadscale reading and learning that turns "relatively little knowledge" into seeing things a lot more clearly. The impetus has to come from Joe, though; without it he functions basically by crap-shoot on which site to look at, and glosses over charts while not reading the text anyway--or perhaps just skips right to the conclusion before happily opening his wallet.

    It's a sad state, but most consumers just don't care enough. <shrugs>
     
  6. AAlcHemY

    Newcomer

    Joined:
    Jun 17, 2003
    Messages:
    215
    Likes Received:
    0
    Location:
    Belgium
    WaltC,
    wasn't the choppy gameplay solved by enabling triple buffering?
     
  7. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Both of the games, as I recall, were played in D3D. But I have used triple buffering quite a bit in the past. As I say, though, in both of these instances the problem was solved merely by removing the vsync cap on frame rates. It's very possible that TB could have solved it too, though...:)

    These days I rarely if ever experience that kind of thing with vsync on, as I'm running much higher refresh rates than my monitor supported at the time. It was one of those "Ah, ha!" solutions of the type I remember.
     
  8. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    Yes, but, again, it's not the video card skipping frames but the application.

    When you disable vsync, then the video driver doesn't have to wait for the next vsync before submitting a frame, which means the CPU spends less time in an idle loop, which means the application has more time to process a new frame.
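    OpenGL guy's point - that it's the buffer swap waiting on the refresh, not the card skipping frames - can be sketched numerically. This is a minimal, illustrative model (the 60 Hz refresh, double buffering, and a constant per-frame render time are all assumptions; real pipelines overlap CPU and GPU work):

```python
import math

# Illustrative model: with double buffering and vsync on, a finished frame
# must wait for the next refresh boundary, so frame times snap to whole
# multiples of the refresh interval (60 -> 30 -> 20 fps, and so on).

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms

def effective_fps(render_time, vsync=True):
    """Displayed frame rate for a constant per-frame render time (seconds)."""
    if not vsync:
        # Frames are presented as soon as they finish (at the cost of tearing).
        return 1.0 / render_time
    # With vsync, the swap waits for the next refresh tick after rendering ends.
    ticks = math.ceil(render_time / REFRESH_INTERVAL)
    return 1.0 / (ticks * REFRESH_INTERVAL)

# A frame taking 17 ms (barely over one refresh) halves the displayed rate:
print(round(effective_fps(0.017, vsync=True)))   # 30
print(round(effective_fps(0.017, vsync=False)))  # 59
```

    This is also why the minimum frame rate WaltC saw jumped when vsync came off: a scene hovering just past one refresh interval gets quantized all the way down to half the refresh rate.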
     
  9. Fred da Roza

    Newcomer

    Joined:
    May 6, 2003
    Messages:
    178
    Likes Received:
    2
    When does providing better image quality become a large enough factor that it 'wins' comparative reviews--never? If that were true, ATI would be losing market share. The fact that there has been so much talk on forums about this illustrates it's not all about frame rates. Obviously a minimum acceptable frame rate is the first criterion, but IMO once a card's minimum frame rate in a benchmark exceeds the refresh rate (especially at the highest IQ settings), it's primarily about image quality. So if we are talking about "benchmarking" top-of-the-line cards (9800, 5900) using UT2003, IMO it's all about image quality.
     
  10. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    Would it be possible to turn off aniso/AA every couple of frames to make it look like the frame rate was much faster? If you were running at 100fps, how would you know if every 5th frame had quality removed?
     
  11. Xmas

    Xmas Porous
    Veteran Subscriber

    Joined:
    Feb 6, 2002
    Messages:
    3,344
    Likes Received:
    176
    Location:
    On the path to wisdom
    It would flicker.
     
  12. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    Most likely not possible for AA either.
    But for AF, you might be able to do a slow degradation and restoration (gradually switching between full bilinear and full trilinear), and then maybe you could also detect when a screenshot is coming and deliver max quality then.

    Would be pretty icky stuff though! Not sure it'd be worth the time programming it either :)


    Uttar
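    rwolf's hypothetical every-Nth-frame quality drop would, as Xmas says, flicker - and that flicker is exactly what a reviewer could measure. A minimal sketch of the idea, assuming captured frames are available as flat lists of grayscale pixel values (the frame data, function names, and threshold here are all illustrative, not taken from any real capture tool):

```python
# Flag frames whose difference from their neighbour is far above the typical
# frame-to-frame change - the signature of a periodic quality toggle in an
# otherwise static scene. All data below is synthetic.

def mean_abs_diff(frame_a, frame_b):
    """Average per-pixel absolute difference between two equal-sized frames."""
    return sum(abs(a - b) for a, b in zip(frame_a, frame_b)) / len(frame_a)

def flag_quality_toggles(frames, threshold=10.0):
    """Return frame indices where the image changes far more than usual."""
    diffs = [mean_abs_diff(frames[i], frames[i + 1])
             for i in range(len(frames) - 1)]
    baseline = sorted(diffs)[len(diffs) // 2]  # median frame-to-frame change
    return [i + 1 for i, d in enumerate(diffs) if d > baseline + threshold]

# Static scene, but every 5th frame is 'degraded' (uniformly brighter here):
clean = [100] * 64
degraded = [120] * 64
frames = [degraded if i % 5 == 0 else clean for i in range(10)]
print(flag_quality_toggles(frames))  # flags the transitions around degraded frames
```

    At 100 fps the eye might register the toggle only as shimmer, but in captured frames the spikes stand out immediately.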
     
  13. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
    Hmm, someone was saying something about aggressive PR...

    http://www.gamespot.com/pc/news/news_6077157.html
     
  14. YeuEmMaiMai

    Regular

    Joined:
    Sep 11, 2002
    Messages:
    579
    Likes Received:
    4
    Where were you guys when nVidia released all of the hacks for the nV30 cards (cheating drivers, application detection, etc.)?

    Here we are a year later and nVidia still has not caught up to the original R300 technology in terms of DX9 performance.

    Let's see here

    1. We have companies wasting their time developing nV30-specific code paths because it is a WELL KNOWN FACT THAT NV3X DOES NOT HAVE WHAT IT TAKES TO COMPETE. This will have an impact on what we pay for games and graphics hardware.

    2. We have nVidia encrypting their drivers to prevent anyone from seeing what they are doing (I will just wait for the MIT boys to crack these open).

    3. nV30, the dawn of cinematic computing? Now that is a good one, since they cannot even do 4xFSAA right, or trilinear for that matter.

    4. Wasn't nVidia the one pushing for FP32, claiming that ATi's FP24 was inferior? Now here we are over a year later, and when you force both cards to run at max IQ the R3X0 cards completely dominate everything nVidia has to offer. I was under the impression that when you pay $500 for a video card, you should get $500 worth of performance. I have seen situations where even a 9500 Pro is faster than a 5900 Ultra running under max IQ settings.

    5. PS 2.0 Even Nvidia's latest card cannot touch the ATI cards when it comes to PS 2.0 performance....


    Nvidia needs to do one of the following

    1. pack up their bag and go home since they cannot play

    or

    2. Get serious about competing and produce a QUALITY PRODUCT and quit wasting their time by trying to smear other companies.


    PS my favorite quote of the day "Intel is our main competition"
     
  15. ram

    ram
    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    218
    Likes Received:
    0
    Location:
    Switzerland
    The content of this document is one part of the 9600XT reviewer's guide. I liked that part, as it was written quite "objectively" and provided a lot of facts. But the full reviewer's guide also contains controversial parts, e.g. about texture filtering. They bash NVDA for not doing correct trilinear filtering, proving it with screenshots made with common filter-testing tools. At the same time, they themselves switch back to non-trilinear anisotropic filtering for most texture stages if you force anisotropic filtering. I think I already know the content of NVDA's upcoming "reviewers guide". :roll:
     
  16. PiXEL_ShAdER

    Newcomer

    Joined:
    Oct 24, 2003
    Messages:
    114
    Likes Received:
    0
    Location:
    England
    I cannot be arsed to quote YeuEmMaiMai,

    Perhaps you should take a look at the nVidia tech demos; that would give you an idea of "the dawn of cinematic computing".

    It just shows you what the card can do and how realistic realtime graphics are getting.

    The Vulcan demo is the most impressive realtime graphics demo I've ever seen, and I've never seen volumetric textures like that fire before, jesus. :roll:
     
  17. CorwinB

    Regular

    Joined:
    May 27, 2003
    Messages:
    274
    Likes Received:
    0
    Another personal attack? Great way to improve your standing on these forums...

    Like the Dawn demo which runs faster on ATI hardware, even though the specific OpenGL extensions have to go through a wrapper?

    And I agree with YeuEmMaiMai: you can't pretend to provide "the dawn of cinematic computing" while you are busy cheating as much as possible and bringing real-time graphics back to circa 1999 in order to win benchmarks...

    http://www.ati.com/developer/demos/r9800.html
    http://www.ati.com/developer/demos/r9700.html

    Every IHV worth its salt can make some fancy tech demos.

    NV-colored glasses... Don't go outside without them.
     
  18. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,533
    Location:
    Winfield, IN USA
    But no games support those features, you'll only find them in them fancy nVidia tech demos.

    It's kind of misleading. Yeah, nVidia's cards can do some neat nVidia tricks; but it can't play the games the way they're supposed to.
     
  19. PiXEL_ShAdER

    Newcomer

    Joined:
    Oct 24, 2003
    Messages:
    114
    Likes Received:
    0
    Location:
    England
    I used to have a Radeon 9700 Pro and can play the demos on my current card; they are not as impressive as the nVidia demos.

    Why is it you come back with the same old crappy remarks, FFS, and why the hell do you think I'm having a go at ATI? I did not even mention ATI in the fricking post. :roll:

    Jesus, you cannot say anything good about nVidia without the ATI trolls coming marching in. Get a grip, man.
     
  20. PiXEL_ShAdER

    Newcomer

    Joined:
    Oct 24, 2003
    Messages:
    114
    Likes Received:
    0
    Location:
    England
    Oh FFS, are you saying only nVidia can do the effects? I mean, what the f*** are you talking about, man? If developers got off their arses and put these effects in the games, you wouldn't be saying that.

    The GF2 could do per-pixel lighting and shadows but no developers used the technology.
     
  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.