X850 XT PE > 7800 GT in F.E.A.R.

Discussion in '3D Hardware, Software & Output Devices' started by Matas, Feb 20, 2006.

Thread Status:
Not open for further replies.
  1. Kocur

    Newcomer

    Joined:
    Feb 4, 2006
    Messages:
    23
    Likes Received:
    0
    Matas,

    Whenever X850 series cards score a win against 7800GT series in a game:

    - the game is very simple when it comes to shaders,
    - the game has a lot of simple operations on vertices,
    - the game heavily depends on memory bandwidth.

    Remember that X850 cards are mighty weak in shader performance even in comparison with old 6800 cards (which are weak by today's standards). You cannot compare them with 7800 or X1900 cards in that area. I hope that will be clearly seen in upcoming games. FEAR, in my opinion, is only thought to be intensive on complex shaders :).
     
  2. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    347
    Likes Received:
    126
    That's just the outcome - a certain set of ATI cards perform better than a certain set of Nvidia cards in X situation. No one is disputing that. What we're disputing is your rationale for this.

    You're attributing the performance difference to selective optimisations that give one card an unfair advantage. There's no evidence for this. There is evidence to suggest that the architectural differences are the cause.

    It's only natural that different architectures will excel at different games - that's the case across a broad spectrum of titles. Why is that whenever such situations are highlighted, people assume there's a bias towards one hardware vendor?

    Look at Half-Life 2. Everybody assumed that Half-Life 2 would always be 'biased' towards ATI, because of Valve's relationship with the company. Yet, the CS:S performance figures come out and guess what, Nvidia cards are faster! And with the launch of the X1800 series, guess what, the gains in Half-Life 2 over the 7800 GTX are minimal, especially compared with those in BF2 (a game that seemed to favour Nvidia hardware). The X1800 loses in DoD:S, a Source engine game. Then Lost Coast is released, after much speculation that Valve had deliberately delayed the game to coincide with the X1800 launch, and the 7800 GTX is faster than the X1800 XT again!

    The X1900 XTX is released, giving ATI a substantial (and atypical) lead in Lost Coast, and suddenly it seems ATI hardware is back in favour.

    So this is why you have to be careful about attributing bias.
     
    #42 Subtlesnake, Feb 21, 2006
    Last edited by a moderator: Feb 21, 2006
  3. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    OK, then explain this one. The X800 GTO16 has 16 TMUs and the 6800GS only has 12, yet the X800 is getting crushed. Start here and go forward, paying very close attention to all the OGL-based games.

    http://www.xbitlabs.com/articles/video/display/powercolor-x800gto16_6.html

    Try to explain how a 12-pipe card (NV) is beating up on a 16-pipe card (ATI).
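    The puzzle above can be put in numbers. A back-of-the-envelope Python sketch of theoretical pixel fillrate from pipe count and core clock; the clock figures are approximate launch specs quoted from memory, so treat them as illustrative, not authoritative. On paper the 16-pipe card wins, which is exactly why the benchmark results need another explanation (drivers, shader throughput, scheduling):

```python
def fillrate_mpix(pipes, core_mhz):
    # Theoretical peak fillrate: one pixel per pipeline per clock,
    # expressed in Mpixel/s.
    return pipes * core_mhz

# Approximate launch core clocks, quoted from memory (illustrative only).
x800_gto16 = fillrate_mpix(16, 400)      # X800 GTO unlocked to 16 pipes
geforce_6800gs = fillrate_mpix(12, 425)  # 12-pipe 6800 GS

print(x800_gto16, geforce_6800gs)  # on paper, the 16-pipe card is ahead
```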
     
  4. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Shadows: stencil shadows. What Carmack asked for, nVidia gave, so any game using stencil shadows will show major performance gains on nV cards.
    Doom3 was even playable on a 5800U, but that's the only game it looks good in compared to the 9700.
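    For reference, the stencil-shadow counting rule Doom3 leans on (the depth-fail variant, often called "Carmack's reverse") can be modelled in a few lines. This is a toy Python sketch with made-up depth values; real hardware does this per pixel in the stencil buffer, which is why fast z/stencil rates matter so much:

```python
def depth_fail_stencil(pixel_depth, volumes):
    # Depth-fail stencil shadows: render shadow-volume back faces with
    # stencil-increment-on-depth-fail, front faces with
    # stencil-decrement-on-depth-fail. A nonzero stencil count means
    # the visible surface at this pixel lies inside a shadow volume.
    stencil = 0
    for front, back in volumes:  # (front-face depth, back-face depth)
        if back > pixel_depth:   # back face fails the depth test
            stencil += 1
        if front > pixel_depth:  # front face fails the depth test
            stencil -= 1
    return stencil != 0

print(depth_fail_stencil(5.0, [(3.0, 8.0)]))  # inside the volume: shadowed
print(depth_fail_stencil(2.0, [(3.0, 8.0)]))  # in front of it: lit
print(depth_fail_stencil(9.0, [(3.0, 8.0)]))  # behind it: lit
```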

    The IL-2 games used vertex buffering, which seems to be a weak spot in ATI's OGL driver.

    But the X1900 XTX already showed that their current generation is much better in OGL; Doom3 doesn't exhibit those big performance differences anymore.

    The problem with everyone complaining about OGL performance is that the difference under D3D is bigger the other way...
     
  5. Matas

    Newcomer

    Joined:
    Feb 15, 2006
    Messages:
    159
    Likes Received:
    0
    But I think that the 7800 GT will be faster than the X850 in future games like Oblivion... and that F.E.A.R. is just an exception among today's games.
     
  6. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    That doesn't explain the D3D games in that comparison.
     
  7. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    Quite the opposite, actually. Far Cry and FEAR have quite long shaders, and run faster or similarly on the X850 XT. Doom3 and Quake4 have short shaders, which is why NVidia's free FP16 normalize comes in handy.

    My suspicion is that this isn't due to hardware, but rather to the driver's shader compiler. I've experienced this first-hand with my own long shaders on both vendors' hardware. NVidia can hand-tune ShaderMark, RightMark and 3DMark quite well, but games just have way too many shaders right now for hand tuning.

    X850XT is far from weak compared to the 6800. Maybe on a per-clock standpoint it loses a bit, mostly in standard shader tests, but not by much. It has the same bandwidth as a 6800U as well. The funny thing is that no-one really recognized the X850XT as particularly fast until it started holding up to the 7800 surprisingly well (by holding up I mean not getting hammered, except, of course, in Doom3).

    It was never shader performance that hurt the X850. It was the lack of FP blending and poor Doom3 performance.
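    The "free FP16 normalize" mentioned above is just the ordinary vector normalize carried out at half precision. A Python sketch, using `struct`'s half-precision format to emulate FP16 rounding (the input vector is made up); for the short shaders in Doom3-style lighting, the FP16 result is accurate enough while costing the hardware nothing extra:

```python
import math
import struct

def to_fp16(x):
    # Round a Python float to IEEE 754 half precision and back,
    # emulating what an FP16 register would store.
    return struct.unpack('e', struct.pack('e', x))[0]

def normalize_fp32(v):
    # Full-precision vector normalize: v / |v|.
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

def normalize_fp16(v):
    # Same operation, with every intermediate rounded to FP16,
    # as a half-precision normalize unit would do.
    v = [to_fp16(c) for c in v]
    sq = to_fp16(sum(to_fp16(c * c) for c in v))
    length = to_fp16(math.sqrt(sq))
    return [to_fp16(c / length) for c in v]

n = [0.3, 0.5, 0.8]  # a made-up, un-normalized surface normal
print(normalize_fp32(n))
print(normalize_fp16(n))  # agrees to roughly 3 decimal places
```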
     
    #47 Mintmaster, Feb 21, 2006
    Last edited by a moderator: Feb 21, 2006
  8. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    I don't mean to resurrect this toasty conversation, but NV did something similar with D3. Not quite the same, as NV doesn't sell cards under their own brand, but it seems to me NV and their AIBs pushed D3 as hard as ATI pushed HL2. See the NV35 previews with bonus Doom 3 benches (sponsored by NV). The reasons may be less nefarious than we'd all like to believe. NV listened to Carmack and molded their GPUs to his engine while Carmack aimed his game at NV hardware (GF1) for various reasons (likely mainly b/c iD was/is an OGL haus). Valve molded their game to the basic DX9 spec, which ATI was simply closer to and faster at with their 9-series for various reasons (and you can see the 6800, no slouch in the DX9 department, was no slouch in HL2, either, thus bypassing ATI/Valve's supposed anti-FX voodoo).

    As for HL2 and its optimizations, wasn't there that guy who got FX cards to run DX9 water effects at decent speed? I forget whether he did it by forcing shaders to FP16 or by translating them to PS1.4, but that stirred the Valve+ATI rumors for a bit, rightfully or not.

    I don't know how to interpret Riddick and IL2/Pacific Fighters, b/c I don't know the technical reasons behind NV hardware performing so well with them. It's possible Riddick was designed with Xbox (1) in mind, so the devs simply spent more time with NV hardware (or NV ppl with Riddick s/w). Or maybe they use DST + PCF to NV's great advantage. Or maybe it's just the double-z/stencil showing itself, though you'd think that'd change with AA. Or it's the HyperZ issue that IIRC Mint pointed out with D3. (BTW, can the DX9 Radeons use Riddick's "PS2++" soft shadows?) As for PF, I think Rage3D's benchmarks offer a clue: ATI takes a dive with AF, whereas NV doesn't. Game engine or poorly (generically) optimized drivers? Dunno.

    Anyway, I can't keep a good specious argument down, so resume if you must.
     
    #48 Pete, Feb 21, 2006
    Last edited by a moderator: Feb 21, 2006
  9. Matas

    Newcomer

    Joined:
    Feb 15, 2006
    Messages:
    159
    Likes Received:
    0

    What about future games like Oblivion?
     
  10. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    I thought FEAR was already a good indication of where the future is heading.
    Even though FEAR is a TWIMTBP game, it still required a specific set of drivers to get decent performance. The preliminary benchmarks with the older (77?) ForceWare drivers showed the R520 hammering the 7800 all over the place, but with the release of the 78? series it suddenly became an even game.
    http://www.xbitlabs.com/articles/video/display/geforce-7800_5.html


    Anyone sane would expect the G70 to beat the R420 senseless, but remember the benchmarks when it first came out? The original GTX got hammered by the X800 XL in Far Cry:
    http://www.xbitlabs.com/articles/video/display/geforce-7800_4.html Yet it dominantly owned Half-Life 2, which again shows that TWIMTBP or GITG doesn't mean a thing.


    BTW, is LO:MAC a D3D or OGL game? It seems to be about the only flight sim where the X8-series actually performed decently.


    And, to really let Hella's weary mind have a rest, check this article from t'Inq to see if, before launch, ATI was really better than NV.
    http://www.theinquirer.net/?article=17978
     
  11. DOGMA1138

    Regular

    Joined:
    Oct 8, 2004
    Messages:
    533
    Likes Received:
    0
    Location:
    IsraHELL
    As mentioned in this thread several times already, the Source engine (at least the current games based on it) treats the FX card line as DX8/8.1 cards. Yes, you can force the game into DX9 mode, but at the cost of horrible performance. And yes, forcing the shaders to run at half precision gives you a playable frame rate with the faster FX cards, mostly the FX 5900s, but at the cost of shaders displaying horrible artifacts.

    The water shader did work rather fine under half precision, because it was rather simple (pretty ugly water, imo). But a lot of the more complex shaders, like the ones they use for glass and some of the lighting, did display artifacts. You could see all kinds of them, from very crude gradients on different materials like glass and metal, to completely screwed-up lighting, especially on the Ravenholm level.

    The only thing you can blame Valve for is that they didn't write a complete set of specially optimized DX9 shaders for the FX line. And not even for the whole line, since pretty much nothing but the FX 5900s seemed able to run the game's DX9 shaders, even at half precision. But why should they? What reason did they have to spend a lot of time and money optimizing DX9 shaders to run at half precision, without displaying artifacts, for a very small, almost negligible number of cards?
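    The kind of artifact described above is easy to reproduce in miniature. A toy Python sketch of a Phong-style specular term (n·h)^64 computed by repeated squaring, with every intermediate rounded to half precision via `struct`'s FP16 format; the input value and exponent are made up. The tiny FP16 rounding errors get amplified through the power chain until adjacent shades land more than one 8-bit display step apart, which shows up on screen as the crude banding described:

```python
import struct

def to_fp16(x):
    # Round a Python float to IEEE 754 half precision and back,
    # emulating an FP16 shader register.
    return struct.unpack('e', struct.pack('e', x))[0]

def specular(n_dot_h, precision):
    # (n.h)^64 by repeated squaring; 'precision' rounds each
    # intermediate result (float = full precision, to_fp16 = half).
    x = precision(n_dot_h)
    for _ in range(6):  # x^2, x^4, ..., x^64
        x = precision(x * x)
    return x

full = specular(0.99, float)    # full-precision reference
half = specular(0.99, to_fp16)  # every step rounded to FP16
print(full, half, abs(full - half))
```

The gap between the two results exceeds 1/255, i.e. more than one step of an 8-bit framebuffer, so the error is visible rather than hidden in the display quantization.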
     
    #51 DOGMA1138, Feb 22, 2006
    Last edited by a moderator: Feb 22, 2006
  12. ElMoIsEviL

    Newcomer

    Joined:
    Nov 3, 2003
    Messages:
    21
    Likes Received:
    0
    Location:
    Ottawa, Canada
    Seems some peeps are forgetting a few things. Half-Life 2 was optimised to take advantage of the DirectX 9 specs. ATi molded their architecture (R300) to the DX9 spec. Valve noticed that ATi cards ran the game beautifully, and ATi took advantage of this by offering a marketing partnership.

    So technically the game was more optimised for ATi's architecture at the time, because it was built around the DirectX 9 specifications, which ATi followed to a "T". nVidia, on the other hand, wanted to shove their own rendering API into people's faces and opted to take a different route than Microsoft (heck, they left the negotiating table, as I remember). Thus was born Cg.

    So it doesn't seem that ATi and Valve opted to cheat nVidia users by building the game around ATi's hardware. Just seems both opted to stick with Microsoft's specifications.

    The same happened with id and nVidia. Mr Carmack opted to stick with OpenGL and made some suggestions as to what he was looking for; nVidia listened, and ATi, well, they ignored it (as they concentrate on Direct3D, if you haven't noticed).

    Doom3 comes out and inferior hardware like the GeForceFX 5900 Ultra outperforms the superior Radeon 9800 Pro.

    End of story..;)

    No one specifically optimised for the other... it just ended up happening.
     
  13. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    Thanks for the clarification. All I vaguely remember is that someone got the DX9 water working at a minimal performance hit with the FX series, but I never really followed up on it.
     
  14. Neeyik

    Neeyik Homo ergaster
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,231
    Likes Received:
    45
    Location:
    Cumbria, UK
    The thread was heavily edited, and then edited some more... and then needed some more after that. I then noticed that it had gone wrong almost 5 posts in; it's not really worth saving, but I'm also thinking the same about a few user accounts too.
     
    Ailuros likes this.