ATi is ch**t**g in Filtering

christoph said:
@Suspicious
i think we already have a consensus here about the 'ethical' part of this 'issue' so i don't see much sense in arguing that further.....
any news on the technical side of things?

How can you be so damned demanding, man?
That would need, like, actual knowledge! :oops:
 
digitalwanderer said:
Yo Suspicious, if'n it makes you feel any better I'm extremely miffed about how ATi dealt with choosing to keep this secret and their FUD over trilinear and reviewers using colored mipmaps to see their full trilinear in action. :?

But even though they screwed up in the way it was presented, that doesn't necessarily make it a bad thing.

I can sympathize with your angst over being lied to, I don't like it either....but try to weigh the new technique on its technical merits rather than your emotions about it to give it a fair chance, it really isn't all that bad and I still can't see a difference.

Granted they didn't tell us and that is without a doubt bad and wrong, but the technique itself is actually quite clever and good.
And you called Dave a Hopeless fanATIc :rolleyes:

:p
Anyway, I think this is the majority consensus among people who really care about the mechanics of the issue. We all agree the way they did it was wrong. But until it is proven to be a bad thing, we will wait until it is thoroughly tested before drawing any conclusions about the optimization itself. But I guess since we wait for proof instead of going off of "Suspicions" that makes us hopeless fanATIcs :rolleyes:
 
karlotta said:
Most of what you're going on about was discussed already; you could read the threads and do a search.

I read the whole thread, but what provoked me to join and post is that, as time passes and the number of pages and posts increases, I see more people who are willing to accept this "optimization" as an actual improvement instead of plain deception of the public. That is why I reiterated some points -- I am afraid they will be lost in the noise.

Come on people, if that "new" filtering was that good, why weren't they ringing all the bells with it?

karlotta said:
I have been a 9600 user and the IQ is better than my 9700; only my 9800XT seemed better due to the higher res and speed I can run it at.

IQ is a subjective thing. How do you measure it? With your own eyes? Every industry has standards it has to adhere to when producing things; only computer hardware seems to get away with anything, and I have a problem with that.

I am not trying to blow this issue out of proportion, but just try to imagine what would happen if car safety were judged by visual inspection and not by crash tests and whatnot.

This old article deals with a lot of the terms discussed here, so I suggest everyone read it:
http://www.digit-life.com/articles2/digest3d/1002/itogi-video-ani.html
 
Suspicious said:
Every industry has standards it has to adhere to when producing things; only computer hardware seems to get away with anything, and I have a problem with that.
And the way to solve your problem is to denounce one specific member of the industry as loudly and obnoxiously as possible? Puh-leeez. :rolleyes:

I wonder whether this ATI statement still holds after we have learned about their "optimization"?
FFS man, that's about the original Radeon and holds no relevance to the R420 whatsoever.

So ATI scr*wed us for a year and you forgive them because it was good f*ck?
Uh... yeah. That's kinda the whole point of gaming, right?
 
Suspicious said:
FUDie said:
Take your FUD elsewhere.

You can take your hostility up your a$$ as far as I am concerned -- I have every right to comment on ATI's IQ since I have their card. You don't have to agree with me, though.

FUDie said:
Bad image quality all the time? Funny, not a single person pointed it out until a year later.

So ATI scr*wed us for a year and you forgive them because it was good f*ck?

If I understand correctly you actually give them credit for scr*wing us so good that we haven't even noticed it?

And you don't want people calling you "< f a n b o y >" as stated in your first post? :rolleyes:

Chill, dude, most of the arguments you are making have been discussed in this thread (some several times).

Get some snacks, something nice to drink, make sure no one will bother you (hint, disable messenger programs) and read it all over again. ;)
 
Chalnoth said:
croc_mak said:
AFAIK there is no exact equation that needs to be executed to any bit of accuracy with LOD calculation either -- in DirectX or OpenGL
There is. It's in the spec. There are acceptable LOD calculations, and there are unacceptable ones. It's all very well-defined, though I'm a bit tired to dig up the spec myself. If you're curious, you can download the OpenGL spec from http://www.opengl.org

and there are well-defined RANGES OF PRECISION, and nothing more, because opengl is platform- and hardware-independent. it only defines minimum quality requirements, and THAT'S IT. it does NOT force, in ANY form, bitwise accuracy. the trylinear is an acceptable, allowed optimisation, according to opengl and dx specs. brilinear is not (if you want to count it as a trilinear replacement).

how long does it take for people to actually understand this? how f***ing long? this thread is 50 pages, and still people drop in claiming ati is cheating while they can never see it and can't even measure it themselves. all they can do is look at some pictures, see no difference, but of course they know better.
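The trylinear/brilinear distinction being argued over can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual filter code, and the transition-band width is my assumption: full trilinear blends two mip levels by the fractional LOD, while "brilinear" only blends inside a narrow band around the mip transition and falls back to plain bilinear elsewhere.

```python
# Hypothetical sketch of per-pixel mip blend weights (not real driver code).

def trilinear_weight(lod: float) -> float:
    """Full trilinear: the blend fraction is the fractional part of the LOD."""
    return lod - int(lod)

def brilinear_weight(lod: float, band: float = 0.25) -> float:
    """'Brilinear': blend only inside a band around the mip transition;
    outside it, snap to pure bilinear from a single mip level."""
    f = lod - int(lod)
    lo, hi = 0.5 - band, 0.5 + band
    if f <= lo:
        return 0.0                   # pure bilinear from the nearer mip
    if f >= hi:
        return 1.0                   # pure bilinear from the farther mip
    return (f - lo) / (hi - lo)      # steeper blend inside the band

# At the mip midpoint both filters agree (weight 0.5); away from the
# transition brilinear has already snapped to one mip level.
```

Colored-mipmap tests make the difference obvious precisely because the snapped regions show a hard tint instead of a gradual blend.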
 
davepermen said:
Chalnoth said:
croc_mak said:
AFAIK there is no exact equation that needs to be executed to any bit of accuracy with LOD calculation either -- in DirectX or OpenGL
There is. It's in the spec. There are acceptable LOD calculations, and there are unacceptable ones. It's all very well-defined, though I'm a bit tired to dig up the spec myself. If you're curious, you can download the OpenGL spec from http://www.opengl.org

and there are well-defined RANGES OF PRECISION, and nothing more, because opengl is platform- and hardware-independent. it only defines minimum quality requirements, and THAT'S IT. it does NOT force, in ANY form, bitwise accuracy. the trylinear is an acceptable, allowed optimisation, according to opengl and dx specs. brilinear is not (if you want to count it as a trilinear replacement).

how long does it take for people to actually understand this? how f***ing long? this thread is 50 pages, and still people drop in claiming ati is cheating while they can never see it and can't even measure it themselves. all they can do is look at some pictures, see no difference, but of course they know better.

I seriously doubt ATI's drivers would meet any serious CAD certification because of these optimisations (since some of them require bit-accurate rendering).


I don't see how you can think that brilinear is not acceptable but trylinear is, as there are no precision ranges for this in the OpenGL spec; it's maths. It's formulaic; there are no error ranges or anything in what they put up.
 
davepermen said:
and there are well-defined RANGES OF PRECISION, and nothing more, because opengl is platform- and hardware-independent. it only defines minimum quality requirements, and THAT'S IT. it does NOT force, in ANY form, bitwise accuracy. the trylinear is an acceptable, allowed optimisation, according to opengl and dx specs. brilinear is not (if you want to count it as a trilinear replacement).
One thing:
How can we define ATI's trilinear optimizations as "within spec" if we don't know what those optimizations are?
 
because you can measure the variance and see it falls within spec?!? because you don't know SHIT ABOUT WHAT'S REALLY GOING ON IN ANY HW OR DRIVER ANYWAY?

you have to trust them. you always had to. the output is valid, and fine. they describe in some detail what they do and how they do it. you can say they all lie. in the same breath, i can say all the cheat-accusing dudes out there lie, those images are fake, and all that. we cannot base our discussion simply on your disbelief. you either trust them, or not. there is no, simply NO thing that shows otherwise.

you have to trust the shader optimizer of nvidia as well. it can (and will) change output, due to its optimisations. this, now, is a filtering optimizer. so what?
 
davepermen said:
you have to trust the shader optimizer of nvidia as well. it can (and will) change output, due to its optimisations.
It shouldn't. It should simply recompile the code into a form that is more amenable to the hardware. Well, I suppose this may cause single-bit errors, but the likelihood of these making it to the output is pretty slim.

this, now, is a filtering optimizer. so what?
Fine. What is it doing? No, ATI has not described in any reasonable detail what they are doing. They have only stated that they examine textures at load time to see if they can apply their optimization. They have not stated what that optimization is.
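Since ATI has only said they examine textures at load time, one can only guess at the test. A purely hypothetical version (the function names, tolerance, and box filter are my assumptions, not ATI's published algorithm): check whether each uploaded mip level matches a downsampled copy of the level above it; hand-authored mips, such as colored mipmaps, would fail the check and get full trilinear.

```python
# Hypothetical load-time mip-chain check (grayscale images as nested lists).

def box_downsample(img):
    """2x2 box filter, halving each dimension."""
    h, w = len(img) // 2, len(img[0]) // 2
    return [[(img[2*y][2*x] + img[2*y][2*x+1] +
              img[2*y+1][2*x] + img[2*y+1][2*x+1]) / 4.0
             for x in range(w)] for y in range(h)]

def mip_matches(upper, lower, tol=2.0):
    """True if `lower` is within `tol` of a box-downsampled `upper`."""
    ref = box_downsample(upper)
    return all(abs(ref[y][x] - lower[y][x]) <= tol
               for y in range(len(lower)) for x in range(len(lower[0])))

base = [[10, 10, 20, 20],
        [10, 10, 20, 20],
        [30, 30, 40, 40],
        [30, 30, 40, 40]]
good_mip = [[10, 20], [30, 40]]   # derived from base -> optimization kicks in
colored  = [[255, 0], [0, 255]]   # colored mipmap -> fails, full trilinear

assert mip_matches(base, good_mip)
assert not mip_matches(base, colored)
```

A test like this would explain the reviewer complaint earlier in the thread: colored mipmaps would always be filtered with full trilinear, hiding the optimization from exactly the tool meant to reveal it.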
 
davepermen said:
because you can measure the variance and see it falls within spec?!? because you don't know SHIT ABOUT WHAT'S REALLY GOING ON IN ANY HW OR DRIVER ANYWAY?

you have to trust them. you always had to. the output is valid, and fine. they describe in some detail what they do and how they do it. you can say they all lie. in the same breath, i can say all the cheat-accusing dudes out there lie, those images are fake, and all that. we cannot base our discussion simply on your disbelief. you either trust them, or not. there is no, simply NO thing that shows otherwise.

you have to trust the shader optimizer of nvidia as well. it can (and will) change output, due to its optimisations. this, now, is a filtering optimizer. so what?

Can you please point out which page of the OpenGL 1.5 spec shows the allowed variance, for trilinear filtering or for anything else?
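Whether or not the spec actually writes down a tolerance, the "measure the variance" step itself is straightforward. A minimal sketch (pure Python, grayscale images as nested lists, my own naming) of comparing an optimized filter's output against a reference trilinear render:

```python
# Sketch: worst-case per-pixel deviation between a reference render and an
# optimized one. This only shows how you'd measure; whether any threshold
# for the result is spec-mandated is exactly what's disputed in the thread.

def max_abs_error(reference, optimized):
    """Largest absolute per-pixel difference between two same-sized images."""
    return max(abs(r - o) for row_r, row_o in zip(reference, optimized)
               for r, o in zip(row_r, row_o))

ref = [[100, 101], [102, 103]]
opt = [[100, 100], [103, 103]]
print(max_abs_error(ref, opt))  # -> 1
```

Screenshot-diff tools like the one in the iXBT article linked later in the thread do essentially this over full frames.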
 
Suspicious said:
You don't have a point. Just because I haven't noticed something doesn't mean it is OK to do it behind my back.

What you are basically saying is "go on people, steal from me, it is ok as long as I don't catch you".

ha ha ha, that's so funny it's unreal.

It's all about viewing, about seeing image quality differences. So yep, if you don't see it, it doesn't matter, because it's not real, it's not tangible; nothing is being stolen -- if you can't see it!

Unlike having something real stolen from you.

People please, stop with the world ending dramatics.
 
Suspicious said:
Unlike having something real stolen from you.

People please, stop with the world ending dramatics.
Hey, if you pay for something and it doesn't do what they advertise it does, I'd call that stealing.
 
if i steal something from you and give you back, for free, something equally valuable that looks the same, what did i take from you?

if i take a one-dollar note from you and give you another one, does it hurt you?

there is nothing stolen. only replaced. by an equally performing thing. but i guess this is all just flaming, because normal people understood it pages ago
 
bloodbob said:
Suspicious said:
Unlike having something real stolen from you.

People please, stop with the world ending dramatics.
Hey, if you pay for something and it doesn't do what they advertise it does, I'd call that stealing.

prove to me that it doesn't do what they advertise.
 
www.ixbt.com/video2/nv40-rx800-3.shtml

It's in Russian, but screenshots are universal and the utility itself is in English.

The main idea is: the rv3x0/r420 use full trilinear for static and dynamic DX textures, while optimized trilinear is used for managed textures. And there certainly IS a difference in quality.

(The article comparing trilinear in real world games on r360 and r420 should be ready soon too.)
 
DegustatoR said:
www.ixbt.com/video2/nv40-rx800-3.shtml

It's in Russian, but screenshots are universal and the utility itself is in English.

The main idea is: the rv3x0/r420 use full trilinear for static and dynamic DX textures, while optimized trilinear is used for managed textures. And there certainly IS a difference in quality.

(The article comparing trilinear in real world games on r360 and r420 should be ready soon too.)

Thanks, it's looking really good :p
 