Good explanation of filtering (must read for beginners)

No one here can show any reduced IQ with ATI's version. Yet they did with NVIDIA. That is a fact.

You seem to only be seeing what you want to see. First of all, NV's new filtering method with trilinear optimizations on in the 60.72/61.11 drivers is apparently much improved over their older techniques that were used with the FX. Second of all, some reviewers have noted some slight differences in filtering quality between the 6800 and X800. And as I have mentioned many times before, some reviewers have also noted in their initial set of reviews that NV's aniso filtering algorithm in the 6800 appears at times to be clearer and more distinct than ATI's aniso filtering algorithm.

This is not an issue about cheat or not cheat, but rather an issue about what settings to use when benching new NV cards against new ATI cards. Realistically, the X800 cards should be benched against the 6800 cards with trilinear optimizations on, especially since NV's new brilinear filtering method seems to be vastly improved this go-around.
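For anyone unsure what these "trilinear optimizations" actually do: the rough idea behind brilinear / adaptive trilinear is that the blend between two mip levels is squeezed into a narrow band around the transition instead of spanning the whole fractional LOD range, so most texels effectively get plain bilinear filtering. A minimal sketch of that idea in Python (not vendor code; the band width and function names are made up):

```python
def trilinear_weight(lod_frac):
    """Full trilinear: the blend weight is simply the fractional LOD."""
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    """Reduced ("brilinear") trilinear: blend only within +/- band of the
    mip transition; elsewhere use a single mip level (pure bilinear).
    The band value is an arbitrary placeholder, not a real driver setting."""
    lo, hi = 0.5 - band, 0.5 + band
    if lod_frac <= lo:
        return 0.0                      # lower mip only
    if lod_frac >= hi:
        return 1.0                      # upper mip only
    return (lod_frac - lo) / (hi - lo)  # short blend region

def filtered_sample(sample_mip_n, sample_mip_n1, lod_frac, weight_fn):
    """Combine two already bilinearly filtered samples from adjacent mips."""
    w = weight_fn(lod_frac)
    return (1.0 - w) * sample_mip_n + w * sample_mip_n1
```

The smaller the band, the cheaper the filtering and the more it resembles bilinear when you view coloured mipmaps; how visible that is on real textures is exactly what is being argued about in this thread.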

Also, I don't see FP24 being full precision; that was MS's decision. It is not my fault that NVIDIA supports 16/32-bit and not 24-bit. That is NVIDIA's fault. For them, full precision under the DX9 spec is FP32.

A dev never asks for 32-bit or 24-bit. They ask for full precision or give partial precision (_pp) hints. Full is 24/32-bit, and _pp hints are 16-bit.

Once again, it is not my fault that NVIDIA supports FP32 but cannot run it at a fast enough speed, so they have to force 16-bit, which is not full precision.
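As a rough numerical illustration of what the full-precision vs. _pp distinction means (numpy used purely for demonstration, not shader code, and the values are arbitrary): FP16 keeps about 10 mantissa bits, DX9's FP24 keeps 16, and FP32 keeps 23, so forcing FP16 where full precision was requested can throw away exactly the bits a long shader or a large texture coordinate needs.

```python
import numpy as np

# A texture coordinate scaled across a 4096-texel surface:
coord = 0.73062134
print(np.float32(coord) * np.float32(4096.0))  # ~2992.62  (FP32)
print(np.float16(coord) * np.float16(4096.0))  # 2992.0    (FP16: the fractional
                                               # texel offset is lost)

# At larger magnitudes FP16 cannot even register small increments:
print(np.float16(2048.0) + np.float16(0.5))    # still 2048.0 -- 0.5 is below
                                               # FP16's resolution at this size
```

Whether any of that is visible in a given shader is exactly why the spec leaves the choice to the developer via the _pp hint.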

NV's support for FP16 and FP32 is arguably a good thing, especially in this new generation. The entire industry is moving towards FP32, SM 3.0 stipulates that full precision is FP32, and FP16 is useful in any situation where lower precision will not result in any visual anomalies or errors. Also, to my understanding, the FP32 performance on the 6800U should move more closely in line with FP16 performance as the driver matures. The 6800U is very fast using FP32 anyway.
 
You seem to only be seeing what you want to see. First of all, NV's new filtering method with trilinear optimizations on in the 60.72/61.11 drivers is apparently much improved over their older techniques that were used with the FX
No, I am not talking about that. No websites have done anything about the new tech in the drivers. I am talking about when the stink was made about NVIDIA, as is now being done with ATI.

Second of all, some reviewers have noted some slight differences in filtering quality between the 6800 and X800
So are these slight differences worse or better? They haven't said.

And as I have mentioned many times before, some reviewers have also noted in their initial set of reviews that NV's aniso filtering algorithm in the 6800 appears at times to be clearer and more distinct than ATI's aniso filtering algorithm.

And the other times? Are they the same? Is ATI's better?

Very vague. Once again, my point still stands.

This is not an issue about cheat or not cheat, but rather an issue about what settings to use when benching new NV cards against new ATI cards. Realistically, the X800 cards should be benched against the 6800 cards with trilinear optimizations on.
Which is what is happening, as you can't turn off the optimizations on either card with the latest drivers.

NV's support for FP16 and FP32 is arguably a good thing, especially in this new generation
:oops:



The entire industry is moving towards FP32
Well, the whole industry is using FP24 already. NVIDIA is still using FX12, FP16, and FP32, but very little FP32. Do you think that suddenly changes because NVIDIA has one part which may or may not be capable of decent speeds with FP32 and SM 3.0?




and FP16 is useful in any situation where lower precision will not result in any visual anomalies or errors
Only when a dev requests it, though. Not when NVIDIA decides to change it to FP16 or lower so they can run the shader at speed.

Also, to my understanding, the FP32 performance on the 6800U should move more closely in line with FP16 performance as the driver matures.
Your understanding? Should? As drivers mature?

Well, basically you're saying you don't know. What happens if it doesn't?



The 6800U is very fast using FP32 anyway.
Really? In what game?
 
:rolleyes:

No, I am not talking about that. No websites have done anything about the new tech in the drivers. I am talking about when the stink was made about NVIDIA, as is now being done with ATI.

Reviewers and consumers are upset at ATI for very obvious reasons: they informed reviewers to bench NV cards using non-optimized trilinear filtering, while their own cards had a secret optimized filtering method that no one knew about until it was recently discovered. There is some good reason to be upset at ATI about this, just as there was good reason to be upset at NV in the past, so give it a rest already.

So are these slight differences worse or better? They haven't said.

Why don't you email the reviewers for clarification?

And the other times? Are they the same? Is ATI's better?

Again, this is an issue you should discuss with the reviewers. Email them for clarification. As of now, I have yet to see any review that claims that ATI's filtering methods on the X800 are more effective than that on the 6800U, FWIW.

Which is what is happening, as you can't turn off the optimizations on either card with the latest drivers.

You mean the latest "beta", non-WHQL drivers? This situation realistically should not be seen moving forwards. In fact, NV has stated that users will actually get more filtering options moving forwards.

Well, the whole industry is using FP24 already.

LOL, sure, other than NVIDIA and any company working on next gen hardware ;)

NVIDIA is still using FX12, FP16, and FP32, but very little FP32.

What do you mean by "very little of FP32"? Did you program the new gaming engines? That doesn't really make any sense. It is always advisable to use the least precision that does not result in any visual anomalies or errors. FP16 is perfectly fine for some situations, and FP32 is used for the rest.

Do you think that suddenly changes because NVIDIA has one part which may or may not be capable of decent speeds with FP32 and SM 3.0?

It is not really a question of "may or may not", the 6800U does have very fast FP32 performance, and the Shadermark 2.0 tests show exactly this. Probably just the tip of the iceberg too, considering how raw the drivers are.
 
Well I am sick of nvidia fanboys...

That is just because NV cards cost more for the same performance in the high end, why is that? Because they have too many blasted fans...

Oh, and yeah, I was just thinking maybe I would like to try the 6800, but if it costs >$100 more than the X800 Pro, which seems about the same, well, forget it, I'll stick with ATI.
 
Reviewers and consumers are upset at ATI for very obvious reasons: they informed reviewers to bench NV cards using non-optimized trilinear filtering, while their own cards had a secret optimized filtering method that no one knew about until it was recently discovered. There is some good reason to be upset at ATI about this, just as there was good reason to be upset at NV in the past, so give it a rest already.

Really? Can you post where they said this with regards to the X800s?

Yes, there would be a reason if anyone could prove that there is an image quality problem, which no one has done yet. There is one video so far, which I cannot reproduce, and he has yet to answer me on what drivers were used.

Why don't you email the reviewers for clarification?

Right, I would, but I don't know which reviewers you're talking about. Point out the reviews and I will contact them.

Again, this is an issue you should discuss with the reviewers. Email them for clarification. As of now, I have yet to see any review that claims that ATI's filtering methods on the X800 are more effective than that on the 6800U, FWIW.
Well, considering that no one can see any problems with ATI's method short of subtracting one image from another, I don't see any problem with this. And again, what reviewers? Post links and I will contact them.

You mean the latest "beta", non-WHQL drivers? This situation realistically should not be seen moving forwards. In fact, NV has stated that users will actually get more filtering options moving forwards.
No, I mean the betas given to reviewers to use for benchmarking. If there was such an obvious problem, then surely NVIDIA shouldn't have let people benchmark with these drivers? Not to mention, if 6800 Ultras are actually shipping, these are the only drivers end users can use!



LOL, sure, other than NVIDIA and any company working on next gen hardware

Let me explain, since you don't seem to understand DX9.

When a developer asks for full precision, they get 24-bit or higher. So every dev making a DX9 game is using FP24.

That simple enough for you?

What do you mean by "very little of FP32"? Did you program the new gaming engines? That doesn't really make any sense. It is always advisable to use the least precision that does not result in any visual anomalies or errors. FP16 is perfectly fine for some situations, and FP32 is used for the rest.

See Carmack's .plan (yes, they took out the NV3x path recently), see Half-Life 2, see Far Cry.

FP16 may be fine for all situations. But if a dev wants full precision (i.e. FP24 or 32-bit), then they should get it. It is not up to NVIDIA to change it. A dev should not have to put in a new code path because hardware can't run at full precision and has to run it at FP16.

Again, this has been discussed to death. Go and read some threads on it.

It is not really a question of "may or may not", the 6800U does have very fast FP32 performance, and the Shadermark 2.0 tests show exactly this. Probably just the tip of the iceberg too, considering how raw the drivers are.

Show me a game.

Hell, even 3DMark hasn't approved one driver for the 6800 Ultra. Far Cry forces it to render at lower precision like the NV3x at this time, too.

There is no proof that this will run fast in games.


Right now we don't know how SM 3.0 is going to run. We don't know how FP32 is going to run.
 
jvd said:
Sxotty said:
jvd said:
But NVIDIA did worse than what ATI has done. They had noticeably lowered IQ. Yet their fans defended them.

You are defending ATI now as a fan; it seems kind of silly to bash others for what you are doing yourself. Furthermore, looking at Dave's polls on methods of filtering, it certainly seems like a wash all around until people are told what to vote for.
jvd said:
If the dev asks for 32-bit FP, then they should get it. It's not up to NVIDIA to choose when a dev gets what he asks for.

Now, when does a dev ask for FP32? If they ask for that, are you suggesting that ATI cards should just have a pop-up window saying "oops, sorry, we can't do it"?

If you are really trying to imply that when they ask for full precision, which you view as FP24 or 32, they should get one or the other, then that is what you should say, instead of implying that they actually are asking for FP32, or full trilinear *cough cough*.

You don't know how to read, do you?

No one here can show any reduced IQ with ATI's version. Yet they did with NVIDIA. That is a fact. Unless you have proof otherwise; and no, subtracting one picture from another is not proof, as no one can see the image quality difference. With the first few versions of NVIDIA's method you can see the horrible difference.


Also, I don't see FP24 being full precision; that was MS's decision. It is not my fault that NVIDIA supports 16/32-bit and not 24-bit. That is NVIDIA's fault. For them, full precision under the DX9 spec is FP32.

A dev never asks for 32-bit or 24-bit. They ask for full precision or give partial precision (_pp) hints. Full is 24/32-bit, and _pp hints are 16-bit.

Once again, it is not my fault that NVIDIA supports FP32 but cannot run it at a fast enough speed, so they have to force 16-bit, which is not full precision.

Stop living in the past. That won't affect anybody using the latest drivers. Anyway IIRC the whole IQ issue started with a Tech Report article that showed the differences by image subtraction, so I think you exaggerate just a tad.
 
Stop living in the past. That won't affect anybody using the latest drivers. Anyway IIRC the whole IQ issue started with a Tech Report article that showed the differences by image subtraction, so I think you exaggerate just a tad.

Stop living in the past? It's still going on with regards to NVIDIA. How is that the past? Because they suddenly introduced a new card they are going to stop cheating?

As for the IQ issue: well, the original NVIDIA issues were easily seen with blown-up images, and if you were unlucky enough to see it in motion there was no question.

The ATI problems are with image subtraction.
 
jvd said:
Stop living in the past. That won't affect anybody using the latest drivers. Anyway IIRC the whole IQ issue started with a Tech Report article that showed the differences by image subtraction, so I think you exaggerate just a tad.

Stop living in the past? It's still going on with regards to NVIDIA. How is that the past? Because they suddenly introduced a new card they are going to stop cheating?

As for the IQ issue: well, the original NVIDIA issues were easily seen with blown-up images, and if you were unlucky enough to see it in motion there was no question.

The ATI problems are with image subtraction.

It isn't still going on with nVidia. Brilinear has evolved and can be disabled altogether.

And the same thing with your shader replacement claims. Prove them. nVidia no longer does this.
 
It isn't still going on with nVidia. Brilinear has evolved and can be disabled altogether.

And the same thing with your shader replacement claims. Prove them. nVidia no longer does this.

On the 6800 series you cannot disable them.


As for shaders, it's already been proven they do it. Prove that they don't still do it.
 
radar1200gs said:
It isn't still going on with nVidia. Brilinear has evolved and can be disabled altogether.
Yes, it is still going on. Under no circumstances has anyone been able to disable it on an FX board, so it is still going on. Presently they have only disabled it in one set of beta drivers for a barely available board, so it is still going on.

And the same thing with your shader replacement claims. Prove them. nVidia no longer does this.
Why do you think the drivers that were released don't get Futuremark's approval? In fact, is there even a set of FX drivers that Futuremark has approved for the PS 2.0 test?
 
Point out the reviews and I will contact them.

I've spoken of these reviews many times before. Do a search.

No, I mean the betas given to reviewers to use for benchmarking. If there was such an obvious problem, then surely NVIDIA shouldn't have let people benchmark with these drivers? Not to mention, if 6800 Ultras are actually shipping, these are the only drivers end users can use!

You are not being very coherent here. Users have at least two or three drivers to choose from (including the FM-approved 60.72 drivers that many reviewers benched with, which have the option of turning trilinear optimizations off).

When a developer asks for full precision, they get 24-bit or higher. So every dev making a DX9 game is using FP24.

That simple enough for you?

Read my comment again. I said next generation hardware for good reason.

FP16 may be fine for all situations. But if a dev wants full precision (i.e. FP24 or 32-bit), then they should get it. It is not up to NVIDIA to change it. A dev should not have to put in a new code path because hardware can't run at full precision and has to run it at FP16.

NVIDIA works hand in hand with the developers to help ensure that the games run as smoothly as possible on their hardware. With the FX series, there was much to gain using FP16 partial precision because of architectural limitations. With the 6800U, full FP32 precision can be used much more often without bringing down performance. Like I said before, FP16 precision is supposed to be used in situations where there are no visual anomalies or errors. This makes sense for efficiency purposes. Anyway, for DirectX 9.0c and future versions of DirectX, FP32 is full precision, and everything less than that is considered partial precision.

Show me a game.

I don't need to show you anything. It's a no-brainer really to anyone who is willing to open their eyes.

Far Cry forces it to render at lower precision like the NV3x at this time, too.

Precision is not the major issue with FarCry. The major issue is that the 6800U is running on the sub-optimal NV3x path that uses lower quality shaders. Let's revisit this issue in one or two months when the SM 3.0 add-on comes out. I suspect that it will be a real eye-opener.

Right now we don't know how SM 3.0 is going to run. We don't know how FP32 is going to run.

We know a lot more than you think. We know that SM 3.0 is designed to make certain effects run more efficiently than SM 2.0. We know that SM 3.0 will be available in several titles coming out this year. We know that the 6800U performs very well in shadermark when running at full FP32 precision. We also know that the drivers are very raw, and that the compiler is really in its infancy. Again, let's revisit this issue in one or two months ;)
 
Nvidia has been demonstrated to at best stretch the truth and at worst outright lie in the past.

After one driver release for an as-yet-unavailable card, the ability to disable brilinear has itself been disabled. How about we reserve judgement on whether NVIDIA has truly disabled brilinear until there are shipping cards and drivers? So far it's looking 50/50: one NV4x driver did allow it, one didn't. No way to tell what the final driver will do.
 
I think the screenshots posted at… THG … are bogus. It seems to me the whole article's premise that … NV-Brilinear = ATI-Adaptive-Tri … is based on this one screenshot only.

I would like to know how, where, and under what circumstances these screenshots were obtained. Was ATI's Adaptive-Tri forced on the coloured mipmaps like another article did? Did they take slight bit differences and magnify them 100x to contrast differences that aren't really visible to the naked eye? These screenshots just don't jibe with the excellent filtering displayed by the X800 in the xbit article.
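For what it's worth, the "subtract and magnify" comparison being argued about here is easy to reproduce at home. A rough sketch with numpy and Pillow (the file names and the 16x gain are placeholders, not what THG or xbit actually used):

```python
import numpy as np
from PIL import Image

# Two screenshots taken at the identical camera position, one with full
# trilinear and one with the optimized/adaptive filtering enabled.
a = np.asarray(Image.open("shot_full_trilinear.png").convert("RGB"), dtype=np.int16)
b = np.asarray(Image.open("shot_optimized.png").convert("RGB"), dtype=np.int16)

diff = np.abs(a - b)                             # per-pixel, per-channel delta
print("max difference:", diff.max())             # often only a few LSBs
print("pixels that differ:", np.count_nonzero(diff.any(axis=2)))

# Multiply the tiny differences so they become visible to the naked eye:
amplified = np.clip(diff * 16, 0, 255).astype(np.uint8)
Image.fromarray(amplified).save("diff_x16.png")
```

Whether such a diff corresponds to anything actually visible depends entirely on how much it has been amplified, which is the question being asked above.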
 
I've spoken of these reviews many times before. Do a search.
I did. I ran a search for your name; in not one of your posts in the last month have you posted links to reviews. So post them here.

You are not being very coherent here. Users have at least two or three drivers to choose from (including the FM-approved 60.72 drivers that many reviewers benched with, which have the option of turning trilinear optimizations off).
Right, which is the lowest-performing out of them all, and I only see three sites that benchmarked with them, and those are also the oldest benchmarks.

So we have one version of the driver that lets you turn them off, only on 6800 Ultras, and two versions that don't allow you to turn them off and are used in the majority of benchmarks.



Read my comment again. I said next generation hardware for good reason.

Do we know this? Far Cry, which NVIDIA mentions at every turn for SM 3.0, still treats the 6800s as an FX board and limits their shader usage. Doesn't sound too promising. You would think that with the biggest DX9 game out right now they'd want to show off their speed in it. Or is it because when you trick the game into thinking it's a Radeon, it becomes extremely slow at rendering the full precision and PS 2.0 paths?



NVIDIA works hand in hand with the developers to help ensure that the games run as smoothly as possible on their hardware
Never said they didn't



With the FX series, there was much to gain using FP16 partial precision because of architectural limitations

With the 6800U, full FP32 precision can be used much more often without bringing down performance.
Incorrect. At this time we do not know if full precision can be used much more often without bringing down performance.

Which is still a far cry (get the pun?) from the R3x0 and R420 series of cards, which run full precision all the time, and very quickly at that.



This makes sense for efficiency purposes
It's funny that ATI doesn't need to do this for efficiency purposes. NVIDIA must be making crippled hardware.

Anyway, for DirectX 9.0c and future versions of DirectX, FP32 is full precision, and everything less than that is considered partial precision

Perhaps. I have to read more about that. But as it stands, ATI doesn't support SM 3.0, and so the shader model it does support, SM 2.0b (or whatever), will run all shaders at full precision at all times.

don't need to show you anything. It's a no-brainer really to anyone who is willing to open their eyes.

I have my eyes open. How else could I know what you're writing?

As I said, forcing the 6800 Ultras to render Far Cry in full precision SM 2.0 drastically reduces performance. If this is true, how will they be able to use SM 3.0 at full precision?

Precision is not the major issue with FarCry. The major issue is that the 6800U is running on the sub-optimal NV3x path that uses lower quality shaders. Let's revisit this issue in one or two months when the SM 3.0 add-on comes out. I suspect that it will be a real eye-opener.

People have forced it to run the standard path by making the game treat it like a Radeon 9700 Pro. Its performance drops drastically.

Perhaps the NV3x path is the optimal path.

Maybe SM 3.0 will be an eye-opener. But for whom? You or me?

We know that SM 3.0 is designed to make certain effects run more efficiently than SM 2.0.
Yes, we know that SM 3.0 can increase speed, but some of its features can also decrease speed.

We know that SM 3.0 will be available in several titles coming out this year
For speed increases, not IQ increases.

We know that the 6800U performs very well in shadermark when running at full FP32 precision
And there are some issues you can read about in this very forum with regard to that.



We also know that the drivers are very raw, and that the compiler is really in its infancy

Broken record. This is what they said about the NV3x.

SM 3.0 should increase speed, but if the speed already sucks, what's it going to do, suck a little less?
 
<yawn>

Keep living in the stone ages if you wish, it is pointless trying to reason with someone who is irrational :D

As for references to reviews, do the research, it is simple enough.

:D
 
jimmyjames123 said:
<yawn>

Keep living in the stone ages if you wish, it is pointless trying to reason with someone who is irrational :D

As for references to reviews, do the research, it is simple enough.

:D

So it goes from you speaking of them often enough, to me having to go out and do the research?

Why is it that every time you get proven wrong, you change your tune?

I tried pointing out where your bias becomes apparent, but it seems you can't see your own bias.

To me it doesn't matter which card ends up the fastest in the future. I'm lucky enough to have both of them in my household.
 
Would you guys mind bringing the noise-signal ratio down a bit? I feel like I'm reading a fansite's forum plowing through this constant bickering (though, yes, of course nothing is making me read this thread).
 
Blastman said:
I think the screenshots posted at… THG … are bogus. It seems to me the whole article's premise that … NV-Brilinear = ATI-Adaptive-Tri … is based on this one screenshot only.

There are other screenshots on other sites. How many screenshots do they need, 2? 4? 23914792847921374908?

I would like to know how, where, and under what circumstances these screenshots were obtained. Was ATI's Adaptive-Tri forced on the coloured mipmaps like another article did? Did they take slight bit differences and magnify them 100x to contrast differences that aren't really visible to the naked eye? These screenshots just don't jibe with the excellent filtering displayed by the X800 in the xbit article.

On the ixbt article, exactly which screenshot is the "excellent filtering"? Because I can see faults in the line screenshots. If you're talking about the coloured mipmaps, it's already been stated around the place that ATI are deliberately disabling their "excellent filtering" when coloured mipmaps are used and using trilinear filtering instead.

If you actually read what Tom did: he took a screenshot of in-game bilinear and a screenshot at the same place in the game with it set to trilinear (which is "trylinear" on these cards) and compared the difference. Though he may have increased the contrast or something, I'm not sure; I don't really have an X800 to do testing with, so feel free to send me one.
 