x800 texture shimmering: FarCry video

Discussion in 'Architecture and Products' started by Grestorn, Jun 7, 2004.

  1. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    I'd be interested in an investigation into the "It's the engine, not the driver" point brought up... somewhere. Has anyone got a non-Crytek engine example?
     
  2. chavvdarrr

    Veteran

    Joined:
    Feb 25, 2003
    Messages:
    1,165
    Likes Received:
    34
    Location:
    Sofia, BG
    On the iXBT forums there were examples with Halo (already removed, so no link).
     
  3. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
    While I don't have one of these cards and likely won't, your experience pretty much sums up what I expected to come out of this matter. ATi has filtering optimizations; they don't affect IQ in noticeable ways but increase performance, and they aren't app-specific. The only reason the affair is being blown out of proportion is that we have a seething mass of NV fans frothing at the mouth because NV has gone through insufferable amounts of cheating allegations. Now these characters are looking for disparities, and what they come up with are a few pixels here and there. Suddenly ATi is "cheating" and these characters are screaming bloody murder!

    The lower-IQ accusations are blown way out of proportion, and when confronted by people such as yourself, the only thing that can be said is that instances are extremely rare and that there are few to none in terms of in-game examples. Anyone who actually decides that this issue is a worthwhile complaint to go on and on about ad nauseam needs their head examined. Before I actually saw any comparisons, I was expecting to see massive aliasing or blurring the likes of which we haven't seen since the introduction of Quincunx blur filtering.

    Sure, the issue was noteworthy, and even a few threads related to it would have been interesting, but this has turned into idiocy. So much so that I am beginning to wonder about my enthusiast status... maybe I'm not as much of one as I thought, or maybe the people being overly critical of the optimization are... over-enthusiastic. At any rate, your experience really does sum up what I thought would come of this. I think the over-analysis of the filtering method is simply an attempt to smear ATi's reputation and spread FUD among potential consumers.
     
  4. Dutch Guy

    Newcomer

    Joined:
    Jun 9, 2004
    Messages:
    38
    Likes Received:
    0
    Location:
    Amsterdam, the Netherlands
    Yes, but everyone was told that the card was doing full trilinear, and there was no way to run the card in full trilinear to see the IQ change :wink:
     
  5. HaLDoL

    Newcomer

    Joined:
    Jun 10, 2004
    Messages:
    140
    Likes Received:
    2
    The GeForce 6800 Ultra is not an FX. There is no such thing as an FX 6800.

    Why bother? It's in the spec of DX9.0c, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9.0c compliant. So don't diss NV on DX; ATI doesn't even follow it.
     
  6. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    That's because the pictures showed another issue, not this filtering issue, and it's seen on all ATI cards and on some NVIDIA cards, I believe.
     
  7. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    What does it matter? I'm sure people running the 9600s knew what trilinear was. I'm sure if there was an image quality problem they would have said, "Hey, this looks like crap compared to my GeForce 4/3/2/1," or "This looks like crap compared to my Radeon 8500 or 7000."

    I can spot the difference between bilinear and trilinear. I can spot the difference with aniso on and off. I can't spot a difference between trylinear and trilinear.
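    The bilinear/trilinear/trylinear distinction being debated here can be sketched in code. This is an illustrative toy only: the blend model, the `sample_mip` callback, and the `band` parameter are all assumptions made for the sketch, not any vendor's actual hardware path.

```python
# Toy model: full trilinear vs. a reduced-blend ("brilinear"-style)
# filter. Illustrative only; not ATi's or NVIDIA's actual algorithm.

def trilinear(sample_mip, u, v, lod):
    """Blend bilinear samples from the two mip levels bracketing lod."""
    level = int(lod)        # finer of the two bracketing mip levels
    frac = lod - level      # fractional distance toward the coarser level
    finer = sample_mip(level, u, v)
    coarser = sample_mip(level + 1, u, v)
    return finer * (1.0 - frac) + coarser * frac

def reduced_trilinear(sample_mip, u, v, lod, band=0.5):
    """Like trilinear, but blend only inside a central band of each mip
    transition; outside it, behave as plain bilinear on one level."""
    level = int(lod)
    frac = lod - level
    lo_edge = (1.0 - band) / 2.0
    # Remap frac so the blend happens only across the central band.
    t = min(max((frac - lo_edge) / band, 0.0), 1.0)
    finer = sample_mip(level, u, v)
    coarser = sample_mip(level + 1, u, v)
    return finer * (1.0 - t) + coarser * t
```

    With these toy weights, the two filters agree exactly at the midpoint of each mip transition and differ only near its edges, which is consistent with any differences being confined to narrow bands on screen.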
     
  8. Sandwich

    Regular

    Joined:
    Mar 25, 2004
    Messages:
    288
    Likes Received:
    1
    And? The GF3 wasn't DX8.1 compliant either; the Radeon 8500 was. It didn't matter then, and DX9.0c doesn't mean much now.
    ATi cards have been perfectly DX9 compliant for two years, unlike anything NVIDIA had until now, and that's just the 6800.
     
  9. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
    When was I told, as consumer Joe? And then you get into the word game of what trilinear is... and degrees of trilinear... As a consumer, it's the picture that counts; as a graphics geek, it's fun to talk about real, old, true IQ.
     
  10. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
    FX was not DX9 compliant?
     
  11. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    You're missing the point, IMO. ATi claims that it has the same IQ as full trilinear, or even better. Most investigations have been trying to find out if this is the case, and it seems that it's not. I don't think I've seen any of the investigations claiming that the quality of trylinear is horrible or anything like that. But ATi claimed that they were doing full trilinear, and they told review sites to enable full trilinear on NVIDIA's cards so that the workload would be the same. And that is clearly wrong.

    What has happened now is that some review sites have compared trylinear, brilinear, and full trilinear and are claiming that the difference is minimal and that most users won't notice it (I don't know if that's true for the FX series, though). And that's fine by me, but it definitely seems that some games have problems with all the filtering optimizations the IHVs are using. So options that can disable all of them would be really helpful (as the NV40 seems to have with the newest drivers), and also good for reviewers checking the raw performance of the cards. I don't understand why anyone wouldn't like to see options like that.
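    The "difference is minimal" claim can be put in rough numbers with a toy blend-weight comparison. The band width of 0.5 below is an illustrative assumption, not a measured driver value:

```python
# Compare blend weights across one mip transition: full trilinear
# blends linearly everywhere, while a reduced-blend filter only blends
# inside a central band. The band width is a made-up number.

def tri_weight(frac):
    # Full trilinear: blend weight grows linearly across the transition.
    return frac

def reduced_weight(frac, band=0.5):
    # Blend only inside a central band of width `band`; clamp to pure
    # bilinear (weight 0 or 1) outside it.
    lo_edge = (1.0 - band) / 2.0
    return min(max((frac - lo_edge) / band, 0.0), 1.0)

# Largest disagreement in blend weight over the whole transition.
max_gap = max(abs(tri_weight(i / 100) - reduced_weight(i / 100))
              for i in range(101))
# With band=0.5 the weights never differ by more than a quarter of a
# level, and agree exactly at the transition midpoint.
```

    The gap peaks right where the reduced filter stops blending; whether a quarter-level weight difference is visible in a running game is exactly what the screenshot comparisons were arguing about.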
     
  12. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
  13. hstewarth

    Newcomer

    Joined:
    Apr 13, 2004
    Messages:
    99
    Likes Received:
    0
    All this stuff really doesn't matter to most game players, only the ones on the leading edge. I have both a GF4 4600 and a GFX 5700 Ultra, and both play most games well enough. I want leading-edge games and the best OpenGL card for pro graphics, and I'm planning to get a 6800 Ultra.

    It was really fun all the time people were complaining about the FX series; I got the FX 5700 for one of my rendering nodes. I thought from the comments it would be worse than my trusty 4600, but I actually use it currently on my main machine. Of course, for 3D graphics work the 59xx series is recommended, so the 6800 Ultra would be even better. By the way, I'm still running an original GF3 on one of my rendering nodes; the only machine I have rendering problems on is my HP 3010 Centrino notebook, which has an ATI 9200, but that's only with OpenGL and dual monitors.

    All in all, for most uses this stuff doesn't matter much.
     
  14. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Thanks, will try after dinner.
     
  15. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    And those are the points that should be complained about, Bjorn. And those are the points that need widespread examination, rather than the usual hopping on one circumstance and shouting it from the rooftops as proof (as happened far too much to both nVidia and ATi last generation, especially with synthetic benchmarks). And those are the points ("choice") we should be pushing on ATi and nVidia both, as well as reminding anyone else who's trying to play the game.

    There are easy and instant comparisons between filtering methods, AA modes, AF options, shader choices... It's always been, and is always going to be, a trade-off between "best quality" and "best performance." Yet there are an immense number of people who will label one thing as "deplorable" and others as "ignorable," and basically pick and choose through them as they will. (And as suits company preference, for the most part.) Talking about AA in a discussion about filtering may seem like trying to distract, but it's all a part of the big equation: IQ/performance.

    We see many examples of tech that doesn't fit its marketing definitions--trilinear that's not, DX9 chips that can't run DX9 games worth a damn or do so at worse quality than OTHER DX9 cards, AA modes which carry the same labels (such as "4xMS") but are distinctly different... Where are we drawing our arbitrary lines? Or should we continue as we always have, examining each situation on its merits with proper analysis, and sharing the results with a community that should--hopefully--be building on top?

    On this issue, the analysis has been notably poor so far, and the community has mostly just wanted to vent its frustrations. We've certainly seen THAT before, too, with equally little point. From a performance standpoint, does a 10% increase from a new driver automatically count as "great"? In IQ, does a shift downward automatically count as a "cheat" in a world we know to be filled with bugs? How do we tell for sure, and how come so many people are not content with doing the WHOLE examination procedure before proclaiming their decisions?

    ATi and nVidia both seek eternally to one-up each other, existing within accepted bounds while trying to push them. They've both experimented with new methods, many times improving on the tried-and-true, and at times even admitted by Microsoft to be superior to what it has laid out. Do we really see either of them sitting still? They'll be playing the IQ/performance game forever and constantly looking for new moves... Heck, it's what we WANT them to do.

    Frankly, I think the telling points are in what the companies do during the fallouts of issues, and in the severity of them--after they're properly looked at, of course. (Since otherwise how can we tell?) And we should always be keeping track and keeping perspective. No one "gets off" if another company does something similar--there are no simple checkboxes and tally lists. No one gets "excused"--though we may filter reaction through how long and by what method they fix a perceived issue. And we shouldn't have a list of "obvious wrongs" and "things to shrug about" that fits our preferences, when at the core they're all connected.
     
  16. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    I get an error: CrySystem.dll loading failed.

    Something about the default font from the XML file?
     
  17. Sabastian

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    991
    Likes Received:
    2
    Location:
    Canada
    I didn't miss the point; I got it quite clearly. NV gets caught cheating ages ago... what comes of it? They simply keep doing it. Now ATi gets caught doing something... you think I'm going to huff and puff about it forever? I got tired of the endless accusations about NV cheating: app-specific benchmarks, lower FP precision, brilinear, on and on; it seemed it would never end. I still condemn them for a number of things, but the worst of it was that they were caught cheating and never backed down. I'm not going to get into a diatribe about which corporation is the lesser of two evils, but I'll drop most of the culpability for all this cheating in NV's lap and not think twice about it.

    Never mind that the IQ disparities are at best rare to non-existent in in-game IQ comparisons, and never mind that it isn't app-specific. Mostly all it does is increase performance. I don't give a damn about a driver switch that lowers performance and does nothing to increase IQ. Why should ATi provide it? So reviewers can show reduced AF performance on ATi cards compared to NV's cards? I can't see any other reason, really, and if that's the case I wouldn't oblige them to provide it. There's no noticeable IQ benefit to it. ATi's trilinear filtering seems to do a fine job by my standards.
     
  18. radar1200gs

    Regular

    Joined:
    Nov 30, 2002
    Messages:
    900
    Likes Received:
    0
    You are confusing compliance with performance, I believe.

    The NV3x series is DX9 compliant enough to run Ruby with no shader changes, which was ATi's demonstration of its "new big thing".
     
  19. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    The "mostly" part is the problem. I have yet to see where it increases IQ, only where it lowers it, even though these "problems" might be rare in an actual game.

    And the "reduced performance in AF on ATi cards" would hardly mean lower performance than on the NV cards, just lower in comparison. And the "other reason" for enabling it would of course be to get a more apples-to-apples comparison of raw performance; it would also get rid of any supposed corner cases where trylinear doesn't achieve optimal quality, and of course get the community to shut up about this. I'm also guessing that we'll see brilinear vs. trylinear comparisons in coming reviews anyway (just look at the ExtremeTech article, where they stated that the differences between all the methods were minimal and not noticeable when playing the game), so I don't think it would hurt at this point. Make it a checkbox like it is in NVIDIA's drivers.

    I'm going to quote Dave B here from the NV40 preview:

    And for a high-end board like the X800, I want the option to enable full trilinear filtering; it's a high-end board, and no quality compromises should be forced at this price point :)
     
  20. PatrickL

    Veteran

    Joined:
    Mar 3, 2003
    Messages:
    1,315
    Likes Received:
    13
    Bjorn, buy an X800 Pro and play with it; then you will see how little sense your post makes in a real gaming situation :lol:
     