6800Ultra or X800XT PE

Discussion in '3D Hardware, Software & Output Devices' started by boxleitnerb, Aug 27, 2004.

  1. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I lump 'em together because there's only one piece of hardware now that supports PS/VS 3.0 and FP blending. As far as I know, they'll always be connected, so I just lump them together with the term SM3.
     
  2. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    That's because you didn't look at my previous post.

    I said:
    Note what I said: in most games, even current games, 6x FSAA temporal is very playable. In a good number of others plain 6x is, and in the rest 4x temporal is.

    That is because plain 6x only needs 50-ish fps to be playable, whereas with temporal you want at least 65, more likely 75, average to make it worth it.

    Then you jumped in with this.
    Why the personal attack?

    Then there was this.

    So I replied with the newest set of benchmarks showing 6x very playable at those resolutions.

    So the fact that the Radeon X800 XT PE is faster in eye-candy modes is important, because the difference will be 4x on NVIDIA vs 6x on ATI at close to the same frame rates. Which is why I brought it up. If you're looking for a video card for 1024x768 or 1280x960, the X800 XT PE is faster than the 6800 Ultra in eye-candy modes. In the majority of cases 6x is going to be playable, much more often than 8xS from NVIDIA. So at those resolutions the X800 XT PE provides the best image quality and frame rate.

    Can you disagree?



    But what you fail to understand is this: yes, NVIDIA has a slight edge in AF, I will admit it, but that's only with the optimizations off. If you turn them on, it looks on par with ATI's AF. Most benchmarks are done with the optimizations on, so you have to make sure the benchmark results are showing you the speed that comes with that improved image quality, or the lack of it.

    Then you have to understand that there is a stigma around ATI's 6x FSAA. Since NVIDIA doesn't have a 6x mode, and 8x is so demanding on their cards, we never see ATI's 6x FSAA benchmarked. The thing is, in some tests, like the one I posted, it's only a 4 fps drop from ATI's 4x.

    Which is why I've been calling for more max-IQ benchmarks, if for nothing else than to see what the highest playable settings are going to be with these cards.

    Yes, I love the gamma-corrected AA.

    With temporal AA, it's funny how most reviewers fail to even mention it.

    I think everyone can agree that a mode that looks almost as good as 4x but has the performance hit of 2x, a mode that looks almost as good as 8x with the performance hit of 4x, and a mode that looks almost as good as 12x with the performance hit of 6x is a good thing all around. But reviewers don't want to talk about it.
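    The arithmetic behind that claim can be sketched in a few lines of Python. This is only an illustration of the idea behind temporal AA - alternating two cheap sample patterns from frame to frame so that, at high framerates, the eye integrates twice as many distinct sample positions - and the sample offsets below are invented, not ATI's actual patterns.

```python
# Temporal AA idea: shade only 2 samples per pixel per frame, but alternate
# between two different 2x patterns every frame. Offsets are illustrative.
pattern_a = [(0.25, 0.25), (0.75, 0.75)]  # frame N sample offsets
pattern_b = [(0.75, 0.25), (0.25, 0.75)]  # frame N+1 sample offsets

per_frame_cost = len(pattern_a)                      # work done each frame
effective_samples = len(set(pattern_a + pattern_b))  # positions seen over 2 frames

print(per_frame_cost)     # 2 -> the "performance hit of 2x"
print(effective_samples)  # 4 -> "looks almost as good as 4x" at high fps
```

    The catch, as noted above, is that the framerate must stay high; when it drops, the alternating patterns stop fusing and show up as flicker instead.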


    Well, let's take for example, say, JVD Quest 44. You love that game, right? Suddenly a mod comes out for it and the stellar performance you bought your video card for suddenly disappears. You won't be pissed about that?

    Yes, that is nice, but we are talking about two different things. You're talking about app profiles, where if Doom 3 loads up, your control panel sets, say, 4x FSAA and 8x AF. I'm talking about the driver replacing shaders or lowering output quality to get higher fps and make the card look faster.

    How about my example above? In JVD Quest a mod comes out, now the rewritten shaders no longer apply, and your card is much, much slower than it was with them. You won't be pissed at that?

    You bought your card already, but the person asking hasn't. So saying it doesn't matter because you already own it isn't helping your case for him buying a 6800 Ultra over an X800 XT PE.

    Can you point to aniso degradation from that certain vendor's bag of aniso trickery? Because after almost a year and a half of products with that aniso enhancement, no one has been able to show a loss of IQ caused by it in any game.

    So if you can't show it, then please don't bring it up. It doesn't help your point.
     
  3. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    You're right, currently NVIDIA has a better interface. And yes, NVIDIA has better overall OpenGL performance.

    Though Half-Life 2 should show excellent HDR effects across all DX9 hardware.

    So Far Cry can do HDR on other cards; they just choose not to.
     
  4. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    That's bs. There are HDR effects that you simply cannot do on hardware that does not support FP blending. In essence, since every game out there that I know of uses blending for something, you just can't output color data at higher than 8 bits on anything but a GeForce 6x00. This means that any HDR effects that are done will be specially coded for specific scenarios, and cannot be general.

    For example, in HL2, you may get bright reflections (since the reflection calculations can be done at high precision), but you won't get bloom effects on reflections. You can get bloom effects on light sources, of course, but we've seen those for years (I think I first saw them in the coronas in the original Unreal....they've definitely improved over the years, but it's still limited).
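    The clamping problem described above can be shown with a toy Python model. The functions and numbers here are hypothetical - real blending happens in the GPU's ROPs - but they illustrate why an 8-bit framebuffer loses "brighter than white" information at blend time, while a floating-point (FP16) buffer keeps it around for a later tone-mapping or bloom pass.

```python
# Toy model of additive framebuffer blending (values illustrative).

def blend_8bit(dst, src):
    # 8-bit fixed point: result is clamped to [0, 1] and quantized to
    # 1/255 steps, so any overbright energy above 1.0 is discarded.
    out = min(dst + src, 1.0)
    return round(out * 255) / 255

def blend_fp16(dst, src):
    # Floating-point blending (idealized): overbright values survive,
    # so a later tone-mapping/bloom pass can still see them.
    return dst + src

print(blend_8bit(0.9, 0.6))  # 1.0 -- the extra 0.5 of brightness is gone
print(blend_fp16(0.9, 0.6))  # 1.5 -- overbright value preserved
```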
     
  5. martrox

    martrox Old Fart
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,065
    Likes Received:
    16
    Location:
    Jacksonville, Florida USA
    So, Chalnoth, exactly which DX9 cards won't be able to do HDR effects????

    HMMMmmmmm? :wink:
     
  6. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    All DX9 cards can do limited HDR. Only the GeForce 6x00 can do general HDR effects.
     
  7. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
    jvd, I addressed most of your points when you brought them up the first time earlier in the discussion. There would be little point in rehashing all this; go back and skim the thread.

    Of course, this takes the biscuit:

    ... since I had addressed the very issue of app-specific driver optimizations - which is another aspect of app detection - right in the next sentence, which you also quoted afterwards. To repeat, game-specific driver optimizations have the consequence that the games in question are not fair benchmarks for performance in other games, which is obviously an issue to be wary of when looking at benchmarks to compare cards but it ceases to be an issue once you have made your choice.

    If a mod or patch causes a game to 'fall off the fast path' (JC) then the framerate will drop back to the base performance level. For example, we don't know yet how much of the NV40's performance in Doom³ is due to black driver magic - shader replacements or special support for certain instruction sequences - but so far it seems that there is nothing dramatic. A few percent? Anybody's guess is probably as good as mine, and we have to wait until the commission in the other thread comes up with a report. Unlike the NV3x, the NV40 is quite a capable chip, which can be seen from the fact that it gives the R420 a good run for its money despite its lower clock. For the time being my guess is that it will not be noticeable unless you compare benchmark numbers before and after; the actual framerate in-game can change much more dramatically if you merely look in a slightly different direction or an enemy opens fire (which is why I would like to see more reviews giving the minimum framerate in addition to the average).

    As regards drivers lowering IQ in order to get higher framerate, that would be more your area of expertise since I know of no such issues regarding the NV40 but at least one regarding R420.

    I have already posted a link to a side-by-side comparison of 16AF by Elite Bastards with a scene where the lower filtering quality of the R420 was clearly apparent even in the still image; normally such issues are only noticed in-game when moving or turning (esp. things like brilinear vs. trilinear, LOD 'optimizations', shimmer from sampling 'optimizations' and so on). In this particular case the degradation was so obvious that I think there may have been something wrong with the testing procedures or perhaps a driver quirk; however, this is not the only example by any means.

    Such optimizations are fine as long as they can be enabled and disabled via the control panel in some fashion (slider for selecting Quality vs. Performance or whatever); they are not so fine if the driver quietly decides that lesser filtering is good enough for you.

    I think both the R420 and the NV40 have enough horsepower to filter more or less correctly in many games. If we're lucky then IQ will become more prominent in reviews and perhaps it will become possible to measure objectively what we can now only subjectively describe as 'waves of alternating sharpness/blurriness moving with the player', 'shimmer', 'interference patterns on regular textures' etc. when actually playing the games. ATI's adaptive scheme is particularly difficult to assess because it seems that it automatically switches to full quality whenever mip maps are uploaded instead of instructing the driver to generate them (at least that's what an ATI guy said somewhere) and for the time being we have to stick with pondering still images and with subjective reports of other players and reviewers.
     
  8. glawton

    Newcomer

    Joined:
    Aug 15, 2004
    Messages:
    17
    Likes Received:
    0
    x800 xt pe
     
  9. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey

    Well, we honestly don't know. But do you think NVIDIA would have spent the time and effort on shader replacements if they were only worth 1-5 fps? I highly doubt it.

    It gives the R420 a run for its money because it has the same bandwidth (well, 10MHz less).



    Well, now you're talking about the angle optimization, which has been in hardware since the 8500 series.

    But hey, like I said before, as a 6800 user you're limited to 4x, whereas in most cases I can use 6x.

    I feel the one or two cases where the AF problem happens are far outweighed by the fact that 6x always looks much better than 4x.

    But I guess it doesn't work that way in your mind, which is cool. You may value the one or two cases where the AF looks better on NVIDIA cards over the much larger number of times that 6x FSAA is usable.

    So I take it you can post a link to more?



    So then what video card do you use, as NVIDIA does the same things? For every optimization we know of from ATI and NVIDIA, there are ten more that we don't know about.

    And for ATI's adaptive scheme there has been no proven IQ loss in anything. The reason why it switches to full trilinear with colored mip maps is that full trilinear filters them best when the transitions between levels are very abrupt, if I'm explaining it correctly.
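    A heuristic of the kind being described might look like the Python sketch below. This is purely speculative - ATI has not published the actual algorithm - but it shows the general shape: compare a mip level against the next one, and fall back to full trilinear when the transition is abrupt (as with colored test mipmaps), otherwise use a narrower, brilinear-style blend.

```python
# Hypothetical adaptive-trilinear heuristic (not ATI's real algorithm).
# Mip levels are modeled as 1D lists of grayscale texel values.

def mip_difference(mip_hi, mip_lo):
    # Average difference between every other high-res texel and the
    # corresponding texel in the next (half-resolution) mip level.
    diffs = [abs(a - b) for a, b in zip(mip_hi[::2], mip_lo)]
    return sum(diffs) / len(diffs)

def choose_filtering(mip_hi, mip_lo, threshold=0.25):
    # Abrupt transition between levels -> full trilinear; otherwise brilinear.
    if mip_difference(mip_hi, mip_lo) > threshold:
        return "full trilinear"
    return "brilinear"

normal_hi = [0.2, 0.2, 0.4, 0.4, 0.6, 0.6, 0.8, 0.8]
normal_lo = [0.2, 0.4, 0.6, 0.8]   # looks like a downscale of normal_hi
colored_lo = [1.0, 0.0, 1.0, 0.0]  # colored test mipmap: abrupt transition

print(choose_filtering(normal_hi, normal_lo))   # brilinear
print(choose_filtering(normal_hi, colored_lo))  # full trilinear
```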


    You seem to be fixated on the one or two cases where ATI's AF is a problem, though many sites test with NVIDIA's optimizations on. Which means that in the reviews you're comparing NVIDIA and ATI with, both cards have the IQ problems you passed up the Radeons over.

    Not to mention the fact that the reviews stop at 4x FSAA. If they continued on like the CS: Source tests did, you would see exactly how many games stay playable with 6x FSAA compared to 8x from NVIDIA.

    Anyway, I guess we will wait and see what he picks.
     
  10. fallguy

    Veteran

    Joined:
    Jun 17, 2003
    Messages:
    1,367
    Likes Received:
    11
    I would rather have an XT/PE. I changed from an X800 Pro to a 6800 GT, mainly because I wanted DD's new GPU+RAM waterblock. I would change to an XT/PE in a heartbeat, even if I had a 6800 Ultra. The only game I've played that would perform much better on the Ultra versus the XT/PE is Doom 3. I beat it three weeks ago and haven't touched it since; I didn't care for the single-player much, and the "multi"player isn't very good to me.

    The XT/PE seems to be faster in most other games, but the two are very close to each other. It would all come down to price, though. If the Ultra was $50 cheaper, I would have a hard time not choosing it. If they were the same price, I would grab the XT/PE.

    Saying one card "has a better interface" isn't a fact. It's your opinion; don't try to pass off your opinion as a fact.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Um, let's see.

    nVidia's drivers include digital vibrance. It's not useful for everything, but I, for one, think it looks great for movies.

    nVidia's drivers have application profiles, which was a huge deal for me since I often play a couple of games at a time, and don't like having to switch settings between each game.

    nVidia's drivers include nView. Not that useful for me, but still fun to play with.

    nVidia's drivers have a much more full-featured taskbar utility (you can change most options straight from the taskbar without even opening the control panel).

    Yes, I'd say nVidia's drivers have a much better interface.
     
  12. PowerK

    Regular

    Joined:
    Jun 12, 2004
    Messages:
    312
    Likes Received:
    0
    Location:
    Seoul, Korea
    Without AA? :)
     
  13. pat777

    Newcomer

    Joined:
    May 19, 2004
    Messages:
    230
    Likes Received:
    0
    What about NV40's vertex texture look-ups? Pacific Fighters is using them.
    ftp://download.nvidia.com/developer/Papers/2004/Vertex_Textures/Vertex_Textures.pdf
    I think they should use parallax mapping for ATI cards.
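    For readers unfamiliar with the feature: a vertex texture lookup lets the vertex shader sample a texture (e.g. a heightmap for water) and displace geometry with it, something SM3.0 on the NV40 exposes but the R420 does not. Here is a rough CPU-side Python sketch of the idea; the heightmap, names, and values are invented for illustration, not taken from Pacific Fighters.

```python
# CPU-side sketch of vertex-texture displacement: each vertex samples a
# height texture and is pushed along its normal. All values illustrative.

heightmap = [
    [0.0, 0.1, 0.0],
    [0.1, 0.5, 0.1],
    [0.0, 0.1, 0.0],
]

def displace(vertex, normal, uv, scale=2.0):
    x, y, z = vertex
    nx, ny, nz = normal
    u, v = uv
    # nearest-neighbour "texture fetch" done in the vertex stage
    h = heightmap[int(v * 2)][int(u * 2)]
    return (x + nx * h * scale,
            y + ny * h * scale,
            z + nz * h * scale)

# A vertex at the center of the patch gets lifted by 0.5 * 2.0 = 1.0:
print(displace((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.5, 0.5)))  # (0.0, 1.0, 0.0)
```

    Parallax mapping, by contrast, fakes the depth per pixel without moving any geometry, which is why it works on hardware without vertex texture fetch.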
     
  14. pat777

    Newcomer

    Joined:
    May 19, 2004
    Messages:
    230
    Likes Received:
    0
    Don't forget about PowerVR. :)
     
  15. fallguy

    Veteran

    Joined:
    Jun 17, 2003
    Messages:
    1,367
    Likes Received:
    11
    I don't care about DV; I think it's useless.

    Yes, app profiles are nice.

    I don't care about nView; I don't have dual monitors.

    I don't care about the taskbar utility; I don't like things in my taskbar.

    So as I said, it's your opinion; don't try to pass it off as fact. Personally I like how ATI's drivers are set up. But then again, that's my opinion.
     
  16. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,062
    Likes Received:
    3,119
    Location:
    New York
  17. Blastman

    Newcomer

    Joined:
    Jul 15, 2003
    Messages:
    176
    Likes Received:
    2
    I’m not so sure those have anything to do with filtering quality. The 6800 has the same ground anomalies as the X800, they’re just pushed farther back into the courtyard corner.
     
  18. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Since they don't have anything on the desktop at the moment, I don't care about PowerVR.
     
  19. DarthFrog

    Newcomer

    Joined:
    Aug 10, 2004
    Messages:
    52
    Likes Received:
    0
    You are right. I took a closer look and found tell-tale aliasing on the cracks between the stones. In the still image it is much more subtle than the Radeon effect but let me tell you, in-game it is awful. You guessed it - my curiosity was piqued and so I installed the Serious Sam SE demo from a gaming mag CD. The scene in question is right at the start of the technology demo map, you only need to turn 90 degrees.

    Test conditions: GF6800U with ForceWare 61.77, NoAA/16AF forced via control panel, resolution in-game 1024x768 to match the setup in Hanners' test.

    For the first test I set the quality slider in the control panel to "High Performance" and enabled both filtering 'optimizations'. Then I loaded the demo and compared the screen to Hanners' GF6800 image - a perfect match (except for the crosshair). And, as I said, the effect may be only barely noticeable in the still image, but when moving or turning in the game it is glaringly obvious.

    Then I disabled the 'optimizations' one by one and compared, but the effect remained until I moved the quality slider to "High Quality" (which is where it belongs for a R420/NV40 class card anyway :D ). Then the effect was completely gone - no noise/shimmer/flicker on the floor whatsoever.

    Perhaps somebody with an X800 could do the same test? The demo is free and I guess quite a few here may have the full version of Sam anyway.
     
  20. poly-gone

    Newcomer

    Joined:
    May 22, 2004
    Messages:
    93
    Likes Received:
    0
    Now that's nonsense. Geometry displacement for the water surface can be achieved using sine waves in the vertex shader, which would simulate dynamic waves far better than a static displacement map. Even if one were to use scrolling displacement maps, I think the sine-wave method would generate better-looking waves.
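    The sine-wave approach can be sketched in a few lines of Python; the wave parameters below are invented for illustration. Because the height is a function of time, the surface animates without any stored displacement texture:

```python
import math

# Water height as a sum of moving sine waves - the kind of function a
# vertex shader could evaluate per vertex. Parameters are illustrative.
waves = [
    # (amplitude, frequency, speed, direction_x, direction_z)
    (0.30, 1.0, 1.2, 1.0, 0.0),
    (0.15, 2.3, 2.0, 0.6, 0.8),
    (0.05, 5.1, 3.5, -0.7, 0.7),
]

def water_height(x, z, t):
    return sum(a * math.sin(f * (dx * x + dz * z) + s * t)
               for a, f, s, dx, dz in waves)

# The same vertex moves up and down as time advances -> dynamic waves:
print(water_height(1.0, 2.0, 0.0))
print(water_height(1.0, 2.0, 0.5))
```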
     