ATI - Full Tri Performance Hit

radar1200gs said:
Read the graphs, jvd. They tested the ATI cards in several configurations. One of those configurations was app preference, which will yield the same filtering you claim NV40 was doing (trilinear on the first stage with AF, bilinear thereafter) unless the app specifies otherwise.
What a funny way of showing your bias. App preference mode on ATI means just that: whatever min/mag/mip filter is specified by the app is what you get. For UT2003/2004, for example, that means trilinear on four texture stages. If NVIDIA is only doing trilinear on the first stage and forcing bi- (or bri-)linear on the rest, that would be a huge performance benefit.
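As a rough illustration of what "app preference" honors (a minimal sketch in Direct3D 9 terms, not UT2003/2004's actual renderer code; the function name and stage count are just for the example):

Code:
#include <d3d9.h>

// Sketch only: an app asking for trilinear filtering on each texture stage
// it uses. Under "app preference" the driver honors these sampler states
// rather than overriding them from the control panel.
void RequestTrilinear(IDirect3DDevice9 *device, DWORD stageCount)
{
    for (DWORD stage = 0; stage < stageCount; ++stage) {
        device->SetSamplerState(stage, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
        device->SetSamplerState(stage, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
        // The LINEAR mip filter is what makes it trilinear; POINT or NONE
        // here would drop the stage back to bilinear.
        device->SetSamplerState(stage, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);
    }
}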

-FUDie
 
If you do not like that, then please show me where in that article it states that the image quality was affected due to ATI's method of optimizing, and I will gladly take the second-to-last results and compare them to NVIDIA's trilinear scores. If not, the third-from-last still stands as valid.
Oh come on, dude, we've seen numerous articles from several sites showing that brilinear/trylinear doesn't produce the same results as full trilinear. It's not an optimisation; a true optimisation results in less work but an identical end result. Brilinear/trylinear does less work, but the end result isn't identical to full trilinear, so they shouldn't be regarded as optimisations, but as quality/performance trade-offs.
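A rough sketch of the difference (my own illustration, not either vendor's actual hardware algorithm; the band width is a made-up parameter): full trilinear blends the two nearest mip levels across the whole fractional LOD, while a brilinear-style shortcut shrinks that blend band and samples only one mip level (plain bilinear) outside it, so it does less work but the output isn't bit-identical.

Code:
// Illustration only; not either IHV's real algorithm.
// lodFraction (0..1) is the fractional distance between two adjacent mip levels;
// the returned weight is how much of the farther mip gets blended in.
float trilinearWeight(float lodFraction)
{
    return lodFraction;                     // blend across the entire range
}

float brilinearWeight(float lodFraction, float band /* e.g. 0.25f, assumed */)
{
    const float lo = 0.5f - band * 0.5f;
    const float hi = 0.5f + band * 0.5f;
    if (lodFraction <= lo) return 0.0f;     // pure bilinear from the nearer mip
    if (lodFraction >= hi) return 1.0f;     // pure bilinear from the farther mip
    return (lodFraction - lo) / (hi - lo);  // narrow blend band near the transition
}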

brilinear/trylinear != full trilinear.

simple, proven and discussed quite a bit.

Brilinear should only be compared to trylinear.

Full trilinear should only be compared to full trilinear.

ATI fans, be fair and realise that we should be comparing apples to apples. Apples to oranges don't count and don't mean much, and if you don't agree with that, then you're a bunch of hypocrites.
 
mozmo said:
ATI fans, be fair and realise that we should be comparing apples to apples. Apples to oranges don't count and don't mean much, and if you don't agree with that, then you're a bunch of hypocrites.

I'm not saying that I prefer one or the other, but I don't think your "black or white" view is the best approach.
You're leaving out the fact that these trade-offs can be more or less aggressive. From what we've seen so far, ATI's method certainly seems to be less aggressive.

I have both boards, and personally I can spot the "brilinear" on nVidia's boards while I can't on ATI's.
On the other hand, on nVidia's boards I have the ability to switch these optimizations off completely, so I'm not saying one is worse/better than the other.
 
Martillo1 said:
jvd said:
Actually, very few complained about the GeForce 4. Most said PS 1.4 wasn't much of a jump or would never be used.

Much like now with SM 3.0.

I think history is going to repeat itself, with NVIDIA the one with the barely supported shader model.

I recall something like this from the notes of an ATI presentation: "try to steer people away from dynamic branching at least until R500 shows up with decent performance", or so; I am writing from memory.

So IMHO your statement about graphics manufacturers' planned SM 3.0 support does not accurately reflect reality.

I recall NVIDIA saying the same thing: not to use dynamic branching with the 6800s. I will try to bring up the quote.

Read the graphs, jvd. They tested the ATI cards in several configurations. One of those configurations was app preference, which will yield the same filtering you claim NV40 was doing (trilinear on the first stage with AF, bilinear thereafter) unless the app specifies otherwise.

If the app specifies otherwise then there is no forced slowdown going on; the driver is merely obeying the app.

Except nowhere in the article did they say trilinear optimizations affected image quality. So why would you force it off if it doesn't affect IQ? Makes no sense. As I said, if you find where in the article it says that the optimized mode produces a lower-quality image, then I will accept that the second score from the bottom is correct. But since you still have not, I must say the default option from ATI has the same quality as full trilinear on both cards and is an acceptable optimization.

The only reason anyone would argue that is because they want the NV40 to appear faster than it currently is by gimping the R420s.




Oh come on, dude, we've seen numerous articles from several sites showing that brilinear/trylinear doesn't produce the same results as full trilinear.

I have yet to see any article show games affected. Please, if I am wrong, correct me. Not only that, but the article does not mention any image degradation.

I agree that if the videos of MP2 are real and have not been messed with, then that game has an IQ problem.

But there are no proven IQ problems with Unreal 2K4, and this article does not provide them either.

brilinear/trylinear != full trilinear.

simple, proven and discussed quite a bit.

Come on, dude, this has yet to be proven in game. Brilinear != trylinear ?!= full trilinear.

We have some sites saying there are no image quality problems; we have some saying that there are, but they can only show it using apps meant to break it. Then we have some posters claiming there are problems. We also have some that claim there are none.

So if you think it's proven, then I've got a bridge to sell ya.



ATI fans, be fair and realise that we should be comparing apples to apples. Apples to oranges don't count and don't mean much, and if you don't agree with that, then you're a bunch of hypocrites.

We should. If Thief 3 with trylinear on looks just like trilinear on NVIDIA cards, to me that is apples to apples. It doesn't matter how they get there as long as the image quality is the same. Correct?

I have both boards, and personally I can spot the "brilinear" on nVidia's boards while I can't on ATI's.
On the other hand, on nVidia's boards I have the ability to switch these optimizations off completely, so I'm not saying one is worse/better than the other.

Right, and even brilinear on NVIDIA has become much better than it was when first found. As Walt said, you could see it in screenshots of actual games, not with special programs.

Whoever said the X800XT was cheap was wrong; it's $860 US.

Yet here at Best Buy I have a confirmed order for MSRP.

I can go to Pricewatch and find some $800 GeForce 6800s.

What's your point? When something is hard to get and there is demand, people will abuse it and charge more.





I happen to agree with Walt. Until they are able to find problems in games that are reproducible, trylinear scores should stand. If and when we find examples that are repeatable in actual games, we should force full trilinear on for benchmarks. Until such a time, the problem is a non-issue, if it is even there at all.
 
azndragon04 said:
Whoever said the X800XT was cheap was wrong; it's $860 US.

http://www.xbitlabs.com/news/video/display/20040609120006.html
screwed? =P

Oh, and ouch, ATI got pwned, just like NVIDIA did =P

The default configuration of Alienware Area-51 system includes the RADEON X800 PRO graphics card. In order to upgrade to NVIDIA GeForce 6800 Ultra graphics card customers have to pay extra $265, in order to change the graphics card to NVIDIA GeForce FX 5950 Extreme, end-users have to bid extra $30, while installing ATI RADEON X800 XT graphics card cost customers extra $108.


Looks like the X800XT is quite a bit cheaper outside of Japan.
 
jvd, by chance, do you work for ATI, or do you have some personal connection with them? :D

Arguing with jvd is a fruitless exercise, because he is not rational. The guy always has his blinders on, he only hears what he wants to hear when it comes to ATI.

The whole "issue" overall is really not very difficult to comprehend. If one is going to benchmark ATI cards with their optimized filtering techniques, one should bench them against NV cards using all of their optimized techniques. Of course, since ATI does not give the user the option of turning off optimizations on the X800 cards, most people will not have a frame of reference to compare to, obviously.

I'm sure you ignored the Extremetech article, where they claimed that ATI's optimized technique overall looks very similar "in game" vs NV's optimized technique. There have been other reviews that have shown differences in quality between NV and ATI's filtering methods. Of course, I am not going to do the research for you. You can dig it up yourself, it is easy enough to do. ;)
 
Quote:
The default configuration of Alienware Area-51 system includes the RADEON X800 PRO graphics card. In order to upgrade to NVIDIA GeForce 6800 Ultra graphics card customers have to pay extra $265, in order to change the graphics card to NVIDIA GeForce FX 5950 Extreme, end-users have to bid extra $30, while installing ATI RADEON X800 XT graphics card cost customers extra $108.



Looks like the X800XT is quite a bit cheaper outside of Japan.

Relatively cheaper? Not necessarily. Look at Falcon Northwest computers, where the 6800 Ultra is $120 cheaper than the X800XT, and the 6800UE is the same price as the X800XT.
 
jvd, by chance, do you work for ATI, or do you have some personal connection with them?
Nope. Wish I had some stock, though. My birthday is coming in September; want to buy me some?

Arguing with jvd is a fruitless exercise, because he is not rational. The guy always has his blinders on, he only hears what he wants to hear when it comes to ATI.

I asked radar a simple question. If he can show me where in the article it states that the image quality is worse with trylinear on than off, then I will accept the forced trilinear as the correct score. Until then, since it does not affect the image quality, trylinear scores are what I will use to compare to the NV40s.

Is that an unreasonable request?

Image quality would have to be reduced for someone to want to turn off something that gives you, what, a 20 fps speed increase?

Don't you agree with that?

If one is going to benchmark ATI cards with their optimized filtering techniques, one should bench them against NV cards using all of their optimized techniques

Why is brilinear on NV cards equal to trylinear on ATI cards? If it is, please provide me with a link. I would like to read about that, as it seems that trylinear in 99.9% of cases looks as good as trilinear on NVIDIA cards.

Once again, these are not unreasonable requests.

If you find a game like Max Payne 2 or Far Cry and they actually have a problem, I will accept the forced trilinear scores as correct. Until then, why handicap a feature that isn't affecting IQ?

I dunno, is that having blinders on? I think that is a very reasonable outlook.



I'm sure you ignored the Extremetech article, where they claimed that ATI's optimized technique overall looks very similar "in game" vs NV's optimized technique

Actually, I haven't read that one. Can you give me a link to it?

I have read Tom's, HardOCP and other reviews that state there are no image quality problems with trylinear and that it looks as good as NVIDIA's trilinear. Have you ignored those?

There have been other reviews that have shown differences in quality between NV and ATI's filtering methods.
Right, using apps to break the optimizations or to find their faults. There are also articles that compare ATI's trylinear to NVIDIA's trilinear and claim there is no difference.

Which ones to believe?



Of course, I am not going to do the research for you. You can dig it up yourself, it is easy enough to do.
:rolleyes:
 
You talking about this ExtremeTech image quality article?

In this close-up of the floor at 4X magnification, we can see what the filtering looks like on the GeForce 6800 Ultra (with and without trilinear optimizations) and on the Radeon X800 XT. There are two important things to note. First is that there is virtually no visible difference between enabling and disabling the trilinear filtering optimizations on the 6800. Second, ATI's texture filtering is noticeably better
Pretty sure in these drivers you can't disable it for the GeForce 6800 Ultra.

But at least they prove that in Painkiller trylinear is better than brilinear.



There has been so much noise lately in the 3D graphics community over the trilinear optimizations by ATI and nVidia that you'd think it had some huge impact on image quality. Honestly, we can't see what all the commotion is about. If you look at a still image from a game at 4x magnification and can't really see the difference, it's a non-issue

I agree.
 
[quote="jvd
read the graphs jvd. They tested the ATi cards in several configurations. One of those configuarations was app preference, which will yield the same filtering you claim NV40 was doing (tri first stage AF, Bi thereafter) unless the app specifies otherwise.

If the app specifies otherwise then there is no forced slowdown going on - the driver is merely obeying the app..

except no where in the article did they say trilinear optimizations affected image quality. So why would u force it off it it doesn't affect iq ? Makes no sense . As i said if you find where in the article it says that the optimized mode produces a lower quality image then i will accept that the second score from the bottom is correct. But since you still have not then i must say the default option from ati has the same quality as full trilinear on both cards and is an acceptable optimization.

THe only reason why any would argue that is because they want the nv40 to apear faster than it currently is by gimping the r420s.

[/quote]
No, JVD. Testing both chips without driver optimizations gives an insight into the efficiency of the underlying hardware and should allow us to predict roughly how much performance can legitimately be increased in the future. What the unoptimised scores say to me is that the NV40 is a better raw number cruncher than the R420.
 
radar1200gs said:
No, JVD. Testing both chips without driver optimizations gives an insight into the efficiency of the underlying hardware and should allow us to predict roughly how much performance can legitimately be increased in the future. What the unoptimised scores say to me is that the NV40 is a better raw number cruncher than the R420.
As soon as all the games you play crunch raw numbers, let me know.

ALL cards are about a hardware/software interaction. The silicon is useless without the drivers, and the drivers always seek to improve the use of the silicon. THEN there are all the further software layers above it: the OS, the individual software... We've seen the FX cards increase notably in efficiency since they were launched; what can you tell me about just the hardware now? How about a year ago? How about when the 5800 launched? Video cards are a "package" and, as we've all seen, projects eternally in motion.

But please, go ahead and list what "legitimate increases" we're allowed to see, since you can recognize them instantly and classify them flawlessly. Oooh, we'll make a flowchart! I'm sure there can't be more than a few dozen categories, can there?
 
I'll bet no end-user here can say they know someone who has a 6800U in their system right now that they bought retail for $450.

This guy can say that of his X800XT.

I don't remember a single 6800 selling for under retail, whereas I've seen a surprising number of X800s selling for well under retail. This continues a trend of ATi cards undercutting retail at launch, but I don't know what it signifies (is ATi still the underdog, or are they just cutting better deals for their OEMs and AIBs?).

I may be asking the impossible, but can we at least try to keep the fanboying factual? TIA. :p
 
It's the exact same argument fanATIcs used against nVidia and their optimizations. You can't have your cake and eat it too, you know.
 
radar1200gs said:
It's the exact same argument fanATIcs used against nVidia and their optimizations. You can't have your cake and eat it too, you know.

Radar, you keep saying this, but it's simply not true.

With NVIDIA's brilinear you did not need to use special programs to see the difference in image quality. It was there as soon as you started playing games and was extremely visible.

This is not the case with the R420 optimizations.

As I said, show me where in that article it says trylinear affected image quality negatively to make me want to turn off the optimizations. If image quality is the same with and without the optimization, why would you wish to turn it off?
 
azndragon04 said:
Ouch if forced trilinear is accepted for ATI, 'cause that brings ATI's benches down a lot, across the board. Ouch, ouch.
Yeah, keep spreading that FUD around. It's already been pointed out that the testing was flawed.

-FUDie
 
AlphaWolf said:
The default configuration of Alienware Area-51 system includes the RADEON X800 PRO graphics card. In order to upgrade to NVIDIA GeForce 6800 Ultra graphics card customers have to pay extra $265, in order to change the graphics card to NVIDIA GeForce FX 5950 Extreme, end-users have to bid extra $30, while installing ATI RADEON X800 XT graphics card cost customers extra $108.


Looks like the X800XT is quite a bit cheaper outside of Japan.
Hmm, downgrade to a 5950 from an X800 Pro for $30.00. Alienware should have some mercy on those who would do such a thing, for they will need the money for a lobotomy. :oops:
 