X800 texture shimmering: FarCry video

I'd be interested in an investigation into the "it's the engine, not the driver" point brought up... somewhere. Has anyone got a non-Crytek-engine example?
 
PatrickL said:
Oh, I read that; that's why I used my saves to go through the games. Those artifacts must be so rare that I found no problem. Maybe try again with the 4.6, but unless you are really looking for a problem you've been warned about, I doubt any player will ever see anything. Well, my personal experience with the game, not through videos, is that I saw nothing, and I tried hard all evening.

While I don't have one of these cards and likely won't, your experience sums up pretty much what I expected to come out of this matter. ATi has filtering optimizations; they don't affect IQ in noticeable ways, they increase performance, and they aren't app specific. The only reason the affair is being blown out of proportion is that we have a seething mass of NV fans frothing at the mouth because NV has gone through insufferable amounts of cheating allegations. Now these characters are looking for disparities, and what they come up with is a few pixels here and there. Suddenly ATi is "cheating" and they're screaming bloody murder!

The lower-IQ accusations are blown way out of proportion, and when confronted by people such as yourself, the only thing that can be said is that the instances are extremely rare and that there are few to no in-game examples. Anyone who actually decides this issue is a worthwhile complaint to go on and on about ad nauseam needs their head examined. Before I actually saw any comparisons I was expecting massive aliasing or blurring, the likes of which we haven't seen since the introduction of Quincunx blur filtering.

Sure, the issue was noteworthy, and even a few threads related to it would have been interesting, but this has turned into idiocy. So much so that I'm beginning to wonder about my enthusiast status... maybe I'm not as much of one as I thought, or maybe the people being overly critical of the optimization are... over-enthusiastic. At any rate, your experience really does sum up what I thought would come out of this. I think the over-analyzing of the filtering method is simply an attempt to smear ATi's reputation and spread FUD among potential consumers.
 
PatrickL said:
Because 100% did not see it for a year?
Yes, but everyone was told that the card was doing full trilinear and there was no way to run the card in full trilinear to see the IQ change ;)
 
Sandwich said:
What competitors? XGI?
Nvidia? They're selling hardware that is even behind Ati's "last year's tech": the FX5900, FX5600, FX5200. Even when the FX6800 does finally arrive in stores, it's the older FXs they'll be selling.
The GeForce 6800U is not an FX; there is no such thing as an FX6800.

Sandwich said:
As for Nvidia's new tech: game developers have to sell games to a market that is still being flooded with FX cards that are barely even DX9 capable. Why even bother with PS3.0 now?
Why bother? It's in the DX9c spec, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9c compliant. So don't diss NV on DX; ATI doesn't even follow it.
 
chavvdarrr said:
On the ixbt forums there were examples with Halo (already removed, so no link).
That's because those pictures showed another issue, not this filtering issue, and it's seen on all ATI cards and on some Nvidia cards, I believe.
 
Dutch Guy said:
PatrickL said:
Because 100% did not see it for a year?
Yes, but everyone was told that the card was doing full trilinear and there was no way to run the card in full trilinear to see the IQ change ;)
What does it matter? I'm sure people running the 9600s knew what trilinear was. I'm sure if there was an image quality problem they would have said, "Hey, this looks like crap compared to my GeForce 4/3/2/1," or "This looks like crap compared to my Radeon 8500 or 7000."

I can spot the difference between bilinear and trilinear. I can spot the difference with aniso on and off. I can't spot a difference between trylinear and trilinear.
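For anyone not following the jargon, the whole bilinear/trilinear/trylinear argument comes down to how wide the blend band between two adjacent mipmap levels is. Here's a minimal sketch in C of that idea; it's purely illustrative, since the actual LOD-weight curves behind ATI's "trylinear" and Nvidia's "brilinear" aren't public, and the narrowed-band function below is just a generic stand-in:

```c
/* Illustration only: generic mip-blend weights, not any vendor's real curve. */
#include <math.h>

/* Full trilinear: blend the two nearest mip levels by the fractional LOD.
 * The return value is the weight given to the coarser level (0.0 .. 1.0). */
double trilinear_weight(double lod)
{
    return lod - floor(lod);
}

/* Narrowed-blend ("brilinear"/"trylinear"-style) approximation: most of the
 * band is treated as pure bilinear, with a linear blend only in a window
 * around the mip transition. band = 1.0 reduces to full trilinear,
 * band = 0.0 reduces to plain bilinear. */
double reduced_blend_weight(double lod, double band)
{
    double f  = lod - floor(lod);
    double lo = 0.5 - band * 0.5;
    double hi = 0.5 + band * 0.5;
    if (f <= lo) return 0.0;   /* sample only the finer mip level   */
    if (f >= hi) return 1.0;   /* sample only the coarser mip level */
    return (f - lo) / (hi - lo);
}
```

The narrower that window gets, the more of the screen is effectively bilinear, which is where any visible mip banding or shimmering would come from.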
 
HaLDoL said:
Why bother? It's in the DX9c spec, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9c compliant. So don't diss NV on DX; ATI doesn't even follow it.

And? The GF3 wasn't DX8.1 compliant either; the Radeon 8500 was. It didn't matter then, and DX9c doesn't mean much now.
ATI cards have been perfectly DX9 compliant for two years, unlike anything Nvidia has had up until now, and that's just the 6800.
 
Dutch Guy said:
PatrickL said:
Because 100% did not see it for a year?
Yes, but everyone was told that the card was doing full trilinear and there was no way to run the card in full trilinear to see the IQ change ;)
When was I told, as an ordinary consumer Joe? And then you get into the word game of what trilinear is... and degrees of trilinear... As a consumer it's the picture that counts; as a graphics geek it's fun to talk about real, old, true IQ.
 
Sandwich said:
HaLDoL said:
Why bother? It's in the DX9c spec, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9c compliant. So don't diss NV on DX; ATI doesn't even follow it.

And? The GF3 wasn't DX8.1 compliant either; the Radeon 8500 was. It didn't matter then, and DX9c doesn't mean much now.
ATI cards have been perfectly DX9 compliant for two years, unlike anything Nvidia has had up until now, and that's just the 6800.

FX was not DX9 compliant?
 
Sabastian said:
The lower-IQ accusations are blown way out of proportion, and when confronted by people such as yourself, the only thing that can be said is that the instances are extremely rare and that there are few to no in-game examples.

You're missing the point, IMO. ATI claims that it has the same IQ as full trilinear, or even better. Most investigations have been trying to find out if this is the case, and it seems that it's not. I don't think I've seen any of the investigations claiming that the quality of trylinear is horrible or anything like that. But ATI claimed that they were doing full trilinear, and they told review sites to enable full trilinear on Nvidia's cards so that the workload would be the same. And that is clearly wrong.

What has happened now is that some review sites have compared trylinear, brilinear, and full trilinear and are claiming that the difference is minimal and that most users won't notice it (I don't know if that's true for the FX series, though). And that's fine by me, but it definitely seems that some games have problems with all the filtering optimizations the IHVs are using. So options that can disable all of them would be really helpful (like the NV40 seems to have with the newest drivers), and they would also be good for reviewers checking the raw performance of the cards. I don't understand why anyone wouldn't like to see options like that.
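For reference, the usual tool in those investigations is the colored-mipmap test: give every mip level a flat, distinct color so the width of the blend band between levels becomes visible on screen. A rough sketch of the idea in C with OpenGL (assuming a GL context and a bound 2D texture already exist; the sizes and colors are arbitrary):

```c
/* Colored-mipmap sketch: each mip level gets a flat, distinct color, so the
 * trilinear blend band shows up as a visible gradient between color zones.
 * Assumes a GL context exists and a 2D texture is already bound. */
#include <GL/gl.h>
#include <string.h>

static void upload_colored_mipmaps(void)
{
    static const unsigned char colors[9][3] = {
        {255, 0, 0}, {0, 255, 0}, {0, 0, 255},
        {255, 255, 0}, {0, 255, 255}, {255, 0, 255},
        {255, 255, 255}, {128, 128, 128}, {0, 0, 0}
    };
    static unsigned char buf[256 * 256 * 3];

    /* 256x256 down to 1x1 gives 9 mip levels, one color per level. */
    for (int level = 0, size = 256; size >= 1; ++level, size /= 2) {
        for (int i = 0; i < size * size; ++i)
            memcpy(&buf[i * 3], colors[level], 3);
        glTexImage2D(GL_TEXTURE_2D, level, GL_RGB, size, size, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, buf);
    }

    /* Request full trilinear; what the driver actually does is the question. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
}
```

With full trilinear the color zones fade smoothly into each other; with a reduced blend you mostly see hard color borders with only thin transition strips.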
 
Geeforcer said:
Sandwich said:
HaLDoL said:
Why bother? It's in the DX9c spec, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9c compliant. So don't diss NV on DX; ATI doesn't even follow it.

And? The GF3 wasn't DX8.1 compliant either; the Radeon 8500 was. It didn't matter then, and DX9c doesn't mean much now.
ATI cards have been perfectly DX9 compliant for two years, unlike anything Nvidia has had up until now, and that's just the 6800.

FX was not DX9 compliant?

All this stuff really does not matter to most game players, only the ones on the leading edge. I have both a GF4 4600 and an FX 5700 Ultra, and both play most games well enough. What I'm after is leading-edge games and the best OpenGL card for pro graphics, and I'm planning to get a 6800 Ultra.

It was really fun hearing people complain about the FX series all that time; I got the FX 5700 just as a non-OpenGL card for one of my rendering nodes. From the comments I thought it would be worse than my trusty 4600, but I actually use it on my main machine now. Of course, for 3D graphics work the 59xx series is what's recommended, so the 6800 Ultra would be even better. By the way, I'm still running an original GF3 on one of my rendering nodes; the only machine I have rendering problems on is my HP 3010 Centrino notebook, which has an ATI 9200, but that's only with OpenGL and dual monitors.

All in all, for most uses this stuff doesn't matter much.
 
Grestorn said:
Since so many seem to have problems finding the spot I recorded in the demo, I've prepared a .zip file with the save game.

Instructions:
  • Unpack the zip file into your FarCry installation. It uses its own Profile, so your other save games should be safe. Still, it's a good idea to create a backup of your Profile directory first.
  • Start the game. From the main menu, choose "Profile" and select the profile "ShimmerCheck".
  • Then select "Campaign" and load the only save game present in that profile.
  • Turn on your flashlight, look at the left wall (also visible on the right wall), and move forward and backwards.

Here's the link: http://grestorn.webinit.de/FC_ShimmerSave.zip

And, yes, the shimmering is also visible with Cat 4.6.

Thanks, will try after dinner.
 
And those are the points that should be complained about, Bjorn. And those are the points that need widespread examination, rather than the usual hopping on one circumstance and shouting it from the rooftops as proof (as happened far too much to both nVidia and ATi last generation, especially with synthetic benchmarks). And those are the points ("choice") we should be pushing on both ATi and nVidia, as well as reminding anyone else who's trying to play this game about.

There are easy and instant comparisons to be made between filtering methods, AA modes, AF options, shader choices... It has always been, and is always GOING to be, a trade-off between "best quality" and "best performance." Yet there is an immense number of people who will label one thing "deplorable" and another "ignorable," and basically pick and choose through them as they will. (And as suits company preference, for the most part.) Talking about AA in a discussion about filtering may seem like a distraction, but it's all part of the big equation: IQ/performance.

We see many examples of tech that doesn't fit its marketing definitions--trilinear that isn't, DX9 chips that can't run DX9 games worth a damn or do so at worse quality than OTHER DX9 cards, AA modes which carry the same labels (such as "4xMS") yet are distinctly different... Where are we drawing our arbitrary lines? Or should we continue as we always have, examining each situation on its merits with proper analysis, and sharing the results with a community that should--hopefully--be building on top of them?

On this issue, the analysis has been notably poor so far, and the community has mostly just wanted to vent its frustrations. We've certainly seen THAT before, too, with equally little point. In a performance comparison, does a 10% increase from a new driver automatically count as "great"? In IQ, does a shift downward automatically count as a "cheat" in a world we know to be filled with bugs? How do we tell for sure, and why are so many people not content to do the WHOLE examination procedure before proclaiming their verdicts?

ATi and nVidia eternally seek to one-up each other, living both within the accepted bounds and trying to push them. They've both experimented with new methods, many times improving on the tried-and-true, and at times Microsoft has even admitted those methods are superior to what it laid out. Do we really see either of them sitting still? They'll be playing the IQ/performance game forever, constantly looking for new moves... Heck, it's what we WANT them to do.

Frankly, I think the telling points are in what the companies do during the fallout from these issues, and in the severity of the issues themselves--after they're properly looked at, of course. (Since otherwise, how can we tell?) And we should always be keeping track and keeping perspective. No one "gets off" just because another company does something similar--there are no simple checkboxes and tally lists. No one gets "excused"--though we may filter our reaction through how long it takes them to fix a perceived issue and by what method. And we shouldn't have a list of "obvious wrongs" and "things to shrug about" that fits our preferences, when at the core they're all connected.
 
Bjorn said:
You're missing the point, IMO. ATI claims that it has the same IQ as full trilinear, or even better. Most investigations have been trying to find out if this is the case, and it seems that it's not. I don't think I've seen any of the investigations claiming that the quality of trylinear is horrible or anything like that. But ATI claimed that they were doing full trilinear, and they told review sites to enable full trilinear on Nvidia's cards so that the workload would be the same. And that is clearly wrong.

I didn't miss the point; I got it quite clearly. NV got caught cheating ages ago... and what came of it? They simply kept doing it. Now ATi gets caught doing something... you think I'm going to huff and puff about it forever? I got tired of the endless accusations of NV cheating: app-specific benchmarks, lowered FP precision, brilinear, on and on; it seemed it would never end. I still condemn them for a number of things, but the worst of it was that they were caught cheating and never backed down. I'm not going to get into a diatribe about which corporation is the lesser of two evils, but I'll drop most of the culpability for all this cheating in NV's lap and not think twice about it.

Never mind that the IQ disparities are at best "rare" to nonexistent in in-game comparisons, and never mind that it isn't app specific. Mostly all it does is increase performance. I don't give a damn about a driver switch that lowers performance and does nothing to increase IQ. Why should ATi provide it? So reviewers can show reduced AF performance on ATi cards compared to NV's cards? I can't see any other reason, really, and if that's the case I wouldn't oblige them to provide it. There's no noticeable IQ benefit to it. ATi's trilinear filtering seems to do a fine job by my standards.
 
Sandwich said:
HaLDoL said:
Why bother? It's in the DX9c spec, which comes with SP2 very soon. This means that every ATI card that does not support PS3.0 is not DX9c compliant. So don't diss NV on DX; ATI doesn't even follow it.

And? The GF3 wasn't DX8.1 compliant either; the Radeon 8500 was. It didn't matter then, and DX9c doesn't mean much now.
ATI cards have been perfectly DX9 compliant for two years, unlike anything Nvidia has had up until now, and that's just the 6800.

You are confusing compliance with performance, I do believe.

The NV3x series is DX9 compliant enough to run Ruby with no shader changes, and Ruby was ATi's demonstration of their "new big thing".
 
Sabastian said:
Never mind that the IQ disparities are at best "rare" to nonexistent in in-game comparisons, and never mind that it isn't app specific. Mostly all it does is increase performance. I don't give a damn about a driver switch that lowers performance and does nothing to increase IQ. Why should ATi provide it? So reviewers can show reduced AF performance on ATi cards compared to NV's cards? I can't see any other reason, really, and if that's the case I wouldn't oblige them to provide it. There's no noticeable IQ benefit to it. ATi's trilinear filtering seems to do a fine job by my standards.

The "mostly" part is the problem. I have yet to see a case where it increases IQ, only cases where it lowers it, even though these "problems" might be rare in an actual game.

And the "reduced performance in AF on Ati cards" would hardly mean lower performance then on the NV cards. Just lower in comparision. And the "other reason" for enabling it would of course be to get a more "apples to apples" comparision of raw performance and would get rid of any supposed corner cases where trylinear wouldn't get optimal quality. And of course get the community to shut up about this. I'm also guessing that we'll see brilinear vs trylinear (just look at the Extremetech article where they stated that the difference between all methods were minimal and not noticeable when playing the game) in coming reviews anyway so i don't think it would hurt at this point in time. Make it a checkbox like it is in Nvidias drivers.

I'm going to quote Dave B here from the NV40 preview:

Note: For the purposes of benchmarking we will enable full Trilinear filtering on the 6800 Ultra as this is a high end board and quality compromises shouldn't be forced at this price point.

And for a high-end board like the X800, I want the option to enable full trilinear filtering, as it's a high-end board and no quality compromises should be forced at this price point :)
 