When enough is enough (AF quality on G70)

tEd said:
To make them look better (faster) compared to the competition and their own little brother? Just a thought.

Or maybe G70 just can't do better than that; I wouldn't rule that out either.
I highly doubt that such a thing gives G70 any performance improvement.

As for "can't" -- i'm pretty sure that g70 can do anything that nv4x can...
 
tEd, a big THANKS! I was just about to pull the trigger on a 7800GTX SLI setup - I already got an SLI MB. The texture aliasing on the 6xxx series always drove me crazy. To spend $1000 to find myself back in the same situation would have driven me insane (some here might say I already am!). Guess I'll wait for R520/CrossFire...
 
DegustatoR said:
I highly doubt that such a thing gives G70 any performance improvement.

As for "can't" -- I'm pretty sure that G70 can do anything that NV4x can...

Indeed, the filtering logic is the same.

Without giving the game away, various people in the driver team are having a look to see what's up. Hopefully they'll come back with something soon.
 
Dunno, ATI's year-old recommendation that NVidia's optimisations be turned off for a fair image-quality and benchmark comparison still stands.

It's rather convenient, right now, that it seems NVidia drivers won't let you turn off the optimisations, even when you thought you had.

Comparison movies are the only correct way to assess AF quality in my view. If NVidia's cards continue to produce worse AF IQ, then you're barking up the wrong tree saying that ATI needs to improve.

On the other hand if you want to argue that ATI should include transparent AA modes in future drivers/hardware (because NVidia supports them) then you have my full support.

Has NVidia fixed the transparency-AA bugs in HL-2 yet?...

Jawed
 
DSC said:
ATI's AF is always inferior: 5-bit, it doesn't even comply with OpenGL's 8-bit minimum requirement. That is a fact, and they've always used the inferior angle-dependent AF to gain speed.
Only the bilinear filter uses 5 bits; its influence on AF quality is very insignificant. And I don't think OpenGL has a minimum requirement for this, just standards set by SGI hardware.
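
To put that in perspective, here's a toy sketch (mine, in Python; I'm assuming the 5 bits refer to the fractional weight used to blend two adjacent mip levels, which is the usual reading of that claim):

```python
# Toy sketch of the "5 bit" claim. Assumption (mine, not from any spec):
# the 5 bits quantize the fractional LOD weight used to blend the two
# mip levels during trilinear filtering.

def quantize(frac, bits):
    """Snap a 0..1 blend fraction to 2**bits - 1 equal steps."""
    steps = (1 << bits) - 1
    return round(frac * steps) / steps

def blend(finer, coarser, lod_frac, bits):
    """Trilinear blend of two mip samples with a quantized weight."""
    f = quantize(lod_frac, bits)
    return finer * (1.0 - f) + coarser * f

# Worst-case difference between 5-bit and 8-bit weighting over a sweep,
# using maximally contrasting samples (0.0 vs 1.0):
worst = max(abs(blend(0.0, 1.0, i / 1000.0, 5) -
                blend(0.0, 1.0, i / 1000.0, 8))
            for i in range(1001))
print(f"worst-case blend error: {worst:.4f}")  # just under 0.02
```

Even on maximally contrasting texels the error stays under 2%, which is why you'd struggle to see it outside synthetic tests.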
 
I just agree with tEd that there is a problem in the G70. Anyway, a problem is a problem; you cannot lead people along with the simple word "bug" like M$ does with its OS. I accept that the G70 is a great card, but what do you people expect after spending so much on it? Put simply: what if you bought a sports car that could only excel when fed jet fuel? It would be great for NV to step forward and say something about this in public, not just let someone close to them ask and steer people here ;P I get that nothing is perfect, but ignoring it takes you further from the perfection you're claiming!

Also, to the reviewers: I don't get the point of rushing to be the first website to review a product, since people visit almost every review site to read up anyway. I respect that you devote your time and effort to this, but you are not a NEWSPAPER or MAGAZINE! You can take some time over the article (the one limitation being that you're only allowed to hold the sample for a short period, I guess?).
 
One other thing popping up in my mind... when NV does this, people call it OPTIMIZATION, but if ATI does it... it's simply a mistake...

Keep it simple... nothing is perfect.
 
btw

David Kirk says it is a reaction to user demand, which for the most part just looks at the achieved fps, and that they are adjusting the quality to match the competition.

Original quote from a German PC hardware magazine, translated below (maybe somebody can translate it more accurately):

Nvidia chief developer David Kirk says that they are merely reacting to the demands of customers, who for the most part look at the achievable frame rates. In addition, with regard to the quality offered, they are matching their competitor.
 
tEd said:
Actually, with the Quality setting, which the reviewers benched, there is no doubt that it shimmers; everything else is simply a lie. On High Quality it probably depends a lot on the game and situation.

Not sure I follow - what exactly is a lie?
 
caboosemoose said:
Not sure I follow - what exactly is a lie?

If someone says that with Quality mode he doesn't see it shimmer, then I call that a lie - unless he's blind, of course.
 
DSC said:
http://www.techreport.com/etc/2004q2/filtering/index.x?pg=1

Don't forget this, people. Last year ATI was caught red-handed doing stuff they claimed they were not doing.

ATI's AF is always inferior: 5-bit, it doesn't even comply with OpenGL's 8-bit minimum requirement. That is a fact, and they've always used the inferior angle-dependent AF to gain speed.
That article by TechReport almost reads like a press release by NV. THG published the same type of rubbish to justify running lower filtering quality with the NV40 cards.

It never came down to whether ATI was doing optimizations with filtering; it was a question of what type of IQ that optimization routine produces. The Trylinear on the R420 cards is so close in quality to the "full" Trilinear (or old Tri) of the 9800s that it is basically equivalent for IQ purposes. The Brilinear that NV is using on the NV40s at the Q setting (where they are benched) is much farther over to the Bilinear side of IQ. The problem is that most of the hardware sites weren't willing to take the time to do proper comparisons. So they said "ATI is doing optimizations, so we'll run the NV cards with optimizations too."

I’ve always thought that ATI should play the same game as NV (since hardware sites don’t take the time to compare IQ) and produce a Brilinear equivalent (for the Q setting) to NV’s Q setting and set the CP default at Q -- so the cards are benched with equivalent filtering quality.
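
For anyone who hasn't seen the Trilinear/Brilinear difference spelled out, a toy sketch (the 0.25 band width is a number I made up for illustration; the real driver thresholds aren't public): Brilinear compresses the blend between mip levels into a narrow band around the transition and falls back to plain bilinear everywhere else.

```python
# Sketch of "Brilinear" vs full Trilinear blend weights across one
# fractional-LOD interval. band=0.25 is an illustrative guess, not a
# measured driver value.

def trilinear_weight(lod_frac):
    """Full trilinear: the blend weight ramps linearly across the
    whole fractional-LOD range between two mip levels."""
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    """Brilinear: pure bilinear (weight 0 or 1) outside a narrow
    blend band around the mip transition; a steep ramp inside it."""
    lo = 0.5 - band / 2
    hi = 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0                 # only the finer mip is sampled
    if lod_frac >= hi:
        return 1.0                 # only the coarser mip is sampled
    return (lod_frac - lo) / band  # compressed transition

for f in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"lod_frac={f:.1f}  tri={trilinear_weight(f):.2f}  "
          f"bri={brilinear_weight(f):.2f}")
```

The narrower the band, the closer the result sits to plain bilinear - which is the whole IQ argument above.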
 
I think the consensus is that the 9600XT and 9800XT and newer cores (with the new AF capability) look better than the 9800Pro (R350).

Maybe my memory's foggy...

Jawed
 
Jawed said:
I think the consensus is that the 9600XT and 9800XT and newer cores (with the new AF capability) look better than the 9800Pro (R350).

Maybe my memory's foggy...

Jawed

What do you mean by new AF capabilities? There is more programmability, but the max AF quality achieved, if no optimizations are applied, is the same on R420/RV350 and R300/R350.
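
As an aside, for anyone unclear what angle dependence means here: both generations share the same "flower" pattern, where the effective max anisotropy peaks at multiples of 45 degrees and dips in between. A toy model (my own approximation, not the actual hardware formula):

```python
# Toy model of angle-dependent AF (my approximation, not ATI's real
# formula): effective max anisotropy peaks at multiples of 45 degrees
# and dips in between -- the "flower" seen in AF-tester tools.

def effective_max_af(surface_angle_deg, max_af=16):
    """Degree of anisotropy actually applied at a given surface angle."""
    # How far (0..22.5 deg) this angle sits from the nearest 45-degree peak.
    off_peak = 22.5 - abs((surface_angle_deg % 45.0) - 22.5)
    t = off_peak / 22.5  # 0 at the peaks, 1 at the worst angles
    return max(2, round(max_af * (1.0 - 0.875 * t)))

for angle in (0, 10, 22.5, 45, 60, 90):
    print(f"{angle:5.1f} deg -> {effective_max_af(angle):2d}x AF")
```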
 
I think the conclusion was that "optimised" filtering tends to look better than normal trilinear. A minor difference anyway; the whole thing came to nothing, as the IQ more than stood up to close inspection. Storm in a teacup.

Jawed
 
Jawed said:
I think the conclusion was that "optimised" filtering tends to look better than normal trilinear. A minor difference anyway; the whole thing came to nothing, as the IQ more than stood up to close inspection. Storm in a teacup.

Jawed

Was that your conclusion or just the general one? Because I can disagree with it :)
 
It was the conclusion I drew from what I read - or at least that's my memory of the conclusion I drew.

I've never had the variety of cards to be able to play with this stuff personally. Just a 9800Pro, which I don't have any more because the fan stopped and I RMA'd it.

Pats Radeon 32MB SDR...

Jawed
 
ATI's particular trilinear optimization doesn't really decrease quality in games. It does if you use the right tools and a worst-case scenario, but my experience is that it doesn't increase IQ either.
 