ATI - Full Tri Performance Hit

It doesn't look like anyone's posted this yet, so I'll go ahead. Computerbase has done some more work and they've managed to disable ATI's optimizations:

The Goods

Courtesy the Inq.
 
http://www.beyond3d.com/forum/viewtopic.php?p=302916#302916


As it looks, we were somewhat too hasty. Full trilinear AF is only applied across all texture stages if the application requests it or an external tool is used to force it. The AF forced through the driver applies trilinear filtering only to the first texture stage; beyond that, only bilinear AF is offered - similar to what the ATi driver offers if one does without external tools.

From how it reads, it seems they thought the new NVIDIA drivers gave the option to use trilinear filtering across all texture stages on the GeForce 6800 Ultra and FX series.

This is not the case. It just allows you to turn off brilinear; it still only uses trilinear on one texture stage, like ATI.
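
To make the distinction concrete, here is a rough Direct3D 9-style sketch - purely illustrative, the function name and stage count are my own placeholders and nothing here is taken from the article or the drivers - of what "the application requests it" means: the game asks for trilinear per sampler/stage itself, whereas the driver-forced path being discussed only upgrades the first stage.

[code]
#include <d3d9.h>

// Illustrative only: how a D3D9 application would itself request trilinear
// filtering on every texture stage, instead of relying on the driver's
// control-panel AF (which, per the article, upgrades only the first stage).
void RequestTrilinearOnAllStages(IDirect3DDevice9* device, DWORD stageCount)
{
    for (DWORD stage = 0; stage < stageCount; ++stage)
    {
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        // Linear filtering *between* mip levels is what makes it trilinear;
        // D3DTEXF_POINT here would leave the stage at plain bilinear.
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}
[/code]

Whether a driver honours such a per-stage request, or quietly downgrades some stages anyway, is exactly what the Computerbase retesting is trying to pin down.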

They went on to force trilinear on all texture stages on the Radeons, which increased the load on them.

Then they compared that to what they thought were equivalent NVIDIA benchmarks (I assume they had benchmarked with these drivers somewhere else).

That is why the update is there.

That is what I gather from the broken English.
 
Ardrid said:
Ahh...gotcha. Thanks for the link.

No problem, and don't forget I could be wrong, as it's poorly translated. But that's how it reads, and no one except radar (who stopped when I explained) has claimed differently.
 
Ardrid said:
It doesn't look like anyone's posted this yet, so I'll go ahead. Computerbase has done some more work and they've managed to disable ATI's optimizations:

The Goods

Courtesy the Inq.

Very interesting article. From what I can tell through the poor translation, the X800 XT without optimizations is about the same as a default 9600. This might confirm that the X800 is not much more than a slightly upgraded 9800.

I am wondering if anybody has tried the same type of thing on the 9800s and also the 6800s (if possible).
 
hstewarth said:
Ardrid said:
It doesn't look like anyone's posted this yet, so I'll go ahead. Computerbase has done some more work and they've managed to disable ATI's optimizations:

The Goods

Courtesy the Inq.

Very interesting article. From what I can tell through the poor translation, the X800 XT without optimizations is about the same as a default 9600. This might confirm that the X800 is not much more than a slightly upgraded 9800.

I am wondering if anybody has tried the same type of thing on the 9800s and also the 6800s (if possible).

You do know that the X800s were run at 1600x1200 and the 9600s at 800x600,

don't you?
 
Until someone can properly translate the article, I'm not entirely sure.

But the graphs indicate they covered lots of modes on the ATi hardware.
 
hstewarth said:
Very interesting article. From what I can tell through the poor translation, the X800 XT without optimizations is about the same as a default 9600. This might confirm that the X800 is not much more than a slightly upgraded 9800.

The X800 runs at 1600x1200; the 9600 XT only runs at 800x600. The X800 pushes four times as many pixels per frame.
 
Ardrid said:
It doesn't look like anyone's posted this yet, so I'll go ahead. Computerbase has done some more work and they've managed to disable ATI's optimizations:

The Goods

Courtesy the Inq.

My primary question is: what's the purpose of disabling optimizations which you cannot demonstrate to cause IQ degradation? Doing so would seem to me to be tantamount to crippling the drivers, as the only reason to want to disable an optimization in the first place is because it clearly degrades IQ. Apart from that, I can find no reason to desire to disable them.

Also, what on earth might be the conceivable point of this:

force the full trilinear filtering, which - according to first impressions - seems to work across all texture stages and with all levels of anisotropic filtering
(emphasis mine)

...? Unless all texture stages are *visible,* which would depend on the specific game and engine used, the only thing that forcing treatment on the invisible stages would provide would be a performance degradation with no IQ benefit whatever.
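
As a concrete illustration of what those texture stages typically carry - this is my own generic sketch, not something taken from the article or any particular game - a common fixed-function multitexture setup puts the detail-rich base map on stage 0, where trilinear is most visible, and low-frequency data such as a lightmap on a later stage, where forcing trilinear mostly costs fill rate:

[code]
#include <d3d9.h>

// Illustrative fixed-function multitexture setup (generic example): stage 0
// carries the detail-rich base texture where trilinear visibly matters,
// stage 1 modulates in a low-frequency lightmap where it barely shows.
void SetupBasePlusLightmap(IDirect3DDevice9* device,
                           IDirect3DTexture9* baseTexture,
                           IDirect3DTexture9* lightmap)
{
    // Stage 0: base texture * vertex diffuse
    device->SetTexture(0, baseTexture);
    device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(0, D3DTSS_COLORARG2, D3DTA_DIFFUSE);

    // Stage 1: previous result * lightmap (typically low-resolution and blurry)
    device->SetTexture(1, lightmap);
    device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_MODULATE);
    device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
    device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

    // No further stages in use
    device->SetTextureStageState(2, D3DTSS_COLOROP, D3DTOP_DISABLE);
}
[/code]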

If an IHV's performance optimizations do not visibly degrade IQ, then adding a control panel switch to defeat them strikes me as about as sensible as adding a cp switch labeled "Slow down your gaming here!", which I think I'm safe in saying would not be viewed as a popular or desired option.

Therefore, the first order of business is to demonstrate conclusive proof of IQ degradation caused by driver optimization, then to go about defeating such optimizations. If the first cannot be accomplished, there simply is no point to the second.

It strikes me that people who think that disabling driver optimizations on general principle will automatically improve IQ simply don't know what they are talking about. Some optimizations improve performance without sacrificing IQ, some do not. Without knowing the difference, a person is blind.
 
Tim said:
hstewarth said:
Very interesting article. From what I can tell through the poor translation, the X800 XT without optimizations is about the same as a default 9600. This might confirm that the X800 is not much more than a slightly upgraded 9800.

The X800 runs at 1600x1200; the 9600 XT only runs at 800x600. The X800 pushes four times as many pixels per frame.

I think what he meant is that, given the theoretical fill rates (number of pipes, clocks, etc.) of both cards, the efficiency of each chip is about the same, i.e. the X800 is a slightly upgraded 9800. I don't think anyone can really deny that from what we've seen so far.
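
To spell out the back-of-the-envelope comparison being made, here is a small illustrative sketch. The pipe and clock figures are quoted from memory of the published specs and the conclusion in the closing comment is an assumption, so treat every number as approximate rather than a measurement.

[code]
#include <cstdio>

// Back-of-the-envelope theoretical pixel fill rate: pipelines x core clock.
// Pipe/clock figures are approximate, from memory of the published specs.
struct Gpu {
    const char* name;
    int         pixelPipes;
    double      coreClockMHz;
};

// Theoretical fill rate in megapixels per second.
double TheoreticalFillrateMPix(const Gpu& g) {
    return g.pixelPipes * g.coreClockMHz;
}

int main() {
    const Gpu cards[] = {
        { "Radeon 9800 XT", 8,  412.0 },   // roughly 3300 MPix/s
        { "Radeon X800 XT", 16, 500.0 },   // roughly 8000 MPix/s
    };
    for (const Gpu& g : cards) {
        std::printf("%-15s %6.0f MPix/s theoretical\n",
                    g.name, TheoreticalFillrateMPix(g));
    }
    // If fill-limited benchmark scores scale roughly in line with these
    // theoretical numbers, per-pipe and per-clock efficiency is about the
    // same, i.e. the X800 behaves like a widened, higher-clocked 9800
    // rather than a fundamentally new design, which is the point above.
    return 0;
}
[/code]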
 
jvd said:
This is not the case. It just allows you to turn off brilinear; it still only uses trilinear on one texture stage, like ATI.

Not true. With the new drivers it's fully possible to get full trilinear on all texture stages.

I'm using 61.41 right now and it has an option called anisotropic optimization as well as trilinear optimization.
If you turn on High Quality both these options are set to Off and you get full trilinear on all texture stages both with and without aniso.

(Still no possibility of turning off angle dependency, though.)
 
WaltC said:
My primary question is: what's the purpose of disabling optimizations which you cannot demonstrate to cause IQ degradation? Doing so would seem to me to be tantamount to crippling the drivers, as the only reason to want to disable an optimization in the first place is because it clearly degrades IQ. Apart from that, I can find no reason to desire to disable them.

It strikes me that people who think that disabling driver optimizations on general principle will automatically improve IQ simply don't know what they are talking about. Some optimizations improve performance without sacrificing IQ, some do not. Without knowing the difference, a person is blind.

Well in answer to your question there is another very compelling reason to disable the optimization given that IQ is equal. Scientific curiosity. Disable them so we can see for ourselves how much performance impact full tri has on the X800 series since ATI sure as hell won't tell us. I would think that kind of effort would be appreciated in the hardware enthusiast (geek) community. Guess I was wrong. :?
 
Ante P said:
jvd said:
This is not the case. It just allows you to turn off brilinear; it still only uses trilinear on one texture stage, like ATI.

Not true. With the new drivers it's fully possible to get full trilinear on all texture stages.

I'm using 61.41 right now and it has an option called anisotropic optimization as well as trilinear optimization.
If you turn on High Quality both these options are set to Off and you get full trilinear on all texture stages both with and without aniso.

(Still no possibility of turning off angle dependency, though.)

Are you sure?

Update:
As it looks, we were somewhat too hasty. Full trilinear AF is only applied across all texture stages if the application requests it or an external tool is used to force it. The AF forced through the driver applies trilinear filtering only to the first texture stage; beyond that, only bilinear AF is offered - similar to what the ATi driver offers if one does without external tools.
It sounds like it's saying they were too hasty: full trilinear AF is only applied across all texture stages if the application requests it or an external tool forces it.

So are you sure that they actually enable it when you ask for it, or only when the application asks for it?

It then goes on to say it's similar to what the ATI drivers do. We all know ATI only does it on the first texture stage, which would imply, if translated correctly, that the new NVIDIA drivers do the same thing.

So I ask one more time: are you sure that at high or maxed-out settings it doesn't just allow the application to choose whether it wants full trilinear?

Which is what the ATI drivers do (UT2k3 and UT2k4 don't seem to want it; otherwise the drivers would let them have it).
 
A slightly different translation:
As it looks, we were a little too rash. There is full trilinear AF across all texture levels only if the application requests it or an external tool helps out. The AF enforced through the driver filters trilinear AF only on the first texture level; beyond that, only bilinear AF is offered - analogous to what is offered in the ATi driver if one forgoes external tools.

So from what I gather, the GF6800 is only doing trilinear on the first level, whereas the X800 is doing it on all levels, hence the performance hit.
 
jvd said:
WaltC, do you know why they changed the LOD setting?

The kind of approach they've consistently demonstrated seems to me an inductive process: they first saw a difference in DX rasterizer results and improperly concluded it meant a degradation in IQ which they have improperly concluded was caused by ATi's automatic trilinear optimizing algorithms (when we have an unknown employee quoted verbatim on Tech Report and THG saying M$ hasn't yet updated its DX rasterizer software for the newest generation of 3d hardware.)

So, they started with the idea that IQ was degraded by the optimizations before proving such IQ degradation actually existed, and they have been working from the same inductive premise ever since--namely, that a Trilinear optimization must cause IQ degradation. As with all optimizations, it is not true that they must all create IQ degradation. I much favor the deductive approach, which is that you see the IQ degradation first, and then you start going about isolating its cause--that's exactly the track on which the nVidia driver investigations for nV3x proceeded last year, and imo it is the only correct track to take. Indeed, nVidia is still optimizing for trilinear, but from accounts I read the IQ degradation seen in the beginning is greatly ameliorated in recent drivers.

What I fear happening is that after discovering their initial inductive premise was in error they will go about manipulating driver settings, test conditions, and visible "proofs," in an attempt to create an illusion of IQ degradation in order to validate their original premise. CYA seems to be a universal human foible...;)
 
It still implies that it only does trilinear on the first texture stage, like the ATI drivers.

analogously

Similar or alike in such a way as to permit the drawing of an analogy
 
WaltC said:
jvd said:
WaltC, do you know why they changed the LOD setting?

The kind of approach they've consistently demonstrated seems to me an inductive process: they first saw a difference in DX rasterizer results and improperly concluded it meant a degradation in IQ which they have improperly concluded was caused by ATi's automatic trilinear optimizing algorithms (when we have an unknown employee quoted verbatim on Tech Report and THG saying M$ hasn't yet updated its DX rasterizer software for the newest generation of 3d hardware.)

So, they started with the idea that IQ was degraded by the optimizations before proving such IQ degradation actually existed, and they have been working from the same inductive premise ever since--namely, that a Trilinear optimization must cause IQ degradation. As with all optimizations, it is not true that they must all create IQ degradation. I much favor the deductive approach, which is that you see the IQ degradation first, and then you start going about isolating its cause--that's exactly the track on which the nVidia driver investigations for nV3x proceeded last year, and imo it is the only correct track to take. Indeed, nVidia is still optimizing for trilinear, but from accounts I read the IQ degradation seen in the beginning is greatly ameliorated in recent drivers.

What I fear happening is that after discovering their initial inductive premise was in error they will go about manipulating driver settings, test conditions, and visible "proofs," in an attempt to create an illusion of IQ degradation in order to validate their original premise. CYA seems to be a universal human foible...;)

Thank you, WaltC. Very interesting.
 
trinibwoy said:
Well in answer to your question there is another very compelling reason to disable the optimization given that IQ is equal. Scientific curiosity. Disable them so we can see for ourselves how much performance impact full tri has on the X800 series since ATI sure as hell won't tell us. I would think that kind of effort would be appreciated in the hardware enthusiast (geek) community. Guess I was wrong. :?

But the point you overlook--scientifically--is that nothing about the compared hardware products is equal in the first place. The hardware is unequal and different, the drivers are unequal and different. What is the same between them are the Windows APIs and the games or applications used in such a comparison--that's it.

Do you really think it might be an object of "scientific curiosity" to discover how unnecessarily slow we can *force* each product to run those APIs and games? Heh....;) I'm not the least bit interested in that...;)
 