Anand Compares NV38/R360 IQ

OpenGL guy said:
I found his treatment of TRAOD to be in rather poor taste. For all the reader knows, none of the cards could give a playable game because he only shows how much performance drops when enabling PS 2.0. I mean, are we going from 100 to 40 or 10 to 4? Also, you can't really compare the results between the 9800XT and the 5950 because you have no idea where they started without PS 2.0.

That is my single biggest gripe with this review. WTF was up with the TR performance difference without the absolute numbers? :?:

And while most of the gross rendering issues with the Dets (on these tests) appear to be fixed, there is no image quality analysis on filtering, which is a huge question mark.

I certainly don't expect scrutiny of each of the tests...but at least pick one or two and go into some detail. (Particularly one or two tests that had large performance increases.)

A lot of time and effort went into that, and it's appreciated...but there was hardly any "image quality comparisons" to speak of.

Not even a lick about relative AA quality (when nVidia was doing AA, that is.)

The overall tone seemed to be almost apologetic to nVidia in many cases...though I'll re-read the article with a more open mind before commenting on some of it. ;)

I mean, based on the following:

1) ATI led the majority of tests, and some by a significant amount
2) ATI had far fewer "driver issues" encountered

ATI is the clear winner....and while he states his preference for ATI, it comes with the "you should really wait" disclaimer. "Wait till next year" to decide the DX9 performance winner?

How about decide the performance winner now....and if it happens to change next year, so be it?
 
I thought the R3x0 architecture already had almost-optimal drivers as-is? That's what I've gathered from these forums, anyway. I recall a recent thread where ATi devs said not to expect 20%+ performance jumps in shader-heavy games simply from optimized drivers.

I think it's more a case of "don't expect it yet". AFAIK, the 3.7s were the first release with any kind of shader optimiser, but there are still gains to be had: think of the drop in GT4 of 3DMark03 when going from hand-coded shaders to unoptimised ones; that type of gain is still on the table. Eventually the automatic optimiser should get close to the performance of the hand-coded shader. They still have some work to do to automatically reorder code to make best use of co-issue, make better use of the mini and full ALUs, and hide texture ops within ALU ops (as these can happen in parallel on R300).
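To make the co-issue point concrete, here's a toy Python sketch of that kind of reordering pass. This is not ATI's actual optimiser; the instruction model (dest/srcs strings, a "vec" rgb unit and a "scl" alpha unit) is invented for illustration. It just pulls an independent scalar op forward so it can pair with a vector op and both issue in the same cycle:

```python
# Toy model only: pair an independent alpha (scalar) op with an rgb
# (vector) op so both issue in one cycle, as R300-style co-issue allows.
from dataclasses import dataclass

@dataclass
class Op:
    dest: str     # register written, e.g. "r0.rgb"
    srcs: tuple   # registers read
    unit: str     # "vec" (rgb ALU) or "scl" (alpha ALU)

def conflict(a: Op, b: Op) -> bool:
    """True if reordering a and b could change the result."""
    return a.dest in b.srcs or b.dest in a.srcs or a.dest == b.dest

def co_issue(ops):
    """Greedy pass: for each vec op, pull forward the first scl op
    that conflicts with nothing it would have to move across."""
    scheduled, pending = [], list(ops)
    while pending:
        op = pending.pop(0)
        if op.unit == "vec":
            for i, cand in enumerate(pending):
                if cand.unit == "scl" and not any(
                        conflict(x, cand) for x in [op] + pending[:i]):
                    pending.pop(i)
                    scheduled.append((op, cand))  # dual-issue slot
                    break
            else:
                scheduled.append((op, None))      # vec op issues alone
        else:
            scheduled.append((op, None))
    return scheduled

prog = [
    Op("r0.rgb", ("t0.rgb", "c0.rgb"), "vec"),
    Op("r1.rgb", ("t1.rgb", "r0.rgb"), "vec"),
    Op("r2.a",   ("t0.a",   "c1.a"),   "scl"),  # independent: moves up
]
for slot in co_issue(prog):
    print(slot)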
 
I suggest everyone save the images and go through them one by one. F1 Challenge AF is noticeably better on the Radeon, for example.
The naming conventions used for the images need to be changed though, because they create a mixed-up order...
 
Joe DeFuria said:
OpenGL guy said:
I found his treatment of TRAOD to be in rather poor taste. For all the reader knows, none of the cards could give a playable game because he only shows how much performance drops when enabling PS 2.0. I mean, are we going from 100 to 40 or 10 to 4? Also, you can't really compare the results between the 9800XT and the 5950 because you have no idea where they started without PS 2.0.

That is my single biggest gripe with this review. WTF was up with the TR performance difference without the absolute numbers? :?:

C'mon now Joe, I pegged you as smarter than that. Must you ask this question? ;)

Joe DeFuria said:
And while most of the gross rendering issues with the Dets (on these tests) appear to be fixed, there is no image quality analysis on filtering, which is a huge question mark.

I really wanted to hear about the screen-cap problems Valve brought up to see whether or not those apply.

Joe DeFuria said:
ATI is the clear winner....and while he states his preference for ATI, it comes with the "you should really wait" disclaimer. "Wait till next year" to decide the DX9 performance winner?

Because TR:AOD is not a "true" DX9 game and HL2 has been delayed conveniently, maybe, into next year.

*cue conspiracy music

Dunh Dunh Dunnnnnnhhhhhhhhh

Joe DeFuria said:
How about decide the performance winner now....and if it happens to change next year, so be it?

You expect far too much my child.......
 
Laa-Yosh said:
I suggest everyone save the images and go through them one by one. F1 Challenge AF is noticeably better on the Radeon, for example.
The naming conventions used for the images need to be changed though, because they create a mixed-up order...

Indeed. What happened to the nice rollover images Anand used to use in his articles?
 
Copy-paste of my post from anandtech:

Note: the AA/AF and noAA/AF images of Warcraft3 have been mixed up for the NV52.14.

It says a lot about the value of the screenshots that it takes careful inspection to find this error. I have played a lot of War3 recently, and the difference is very noticeable in game, even with this GF4.
 
Natoma said:
That sniffs very strongly of the screen-cap probs Valve reported. IQ problems in game that can't be detected in screenshots.

Natoma said:
I really wanted to hear about the screen-cap problems Valve brought up to see whether or not those apply.

Natoma said:
no mention whatsoever of that very damning point from Valve, in an image quality test of those very drivers, strikes me as "curious" at best.

Natoma said:
could this be the result of that screen-cap detect stuff Valve complained about?

You wouldn't be obsessing would you? (And these are just from this thread!)

reviewer said:
Many of the image quality issues from part 1 were due to rendering problems that couldn't be captured in a screen shot (like jerkiness in X2 and F1), or a lack of AA.

OMG! NVIDIA is hiding jerkiness by purposely capturing a still image! OMG! ;)
 
I am obsessing about that when it comes to an IQ comparison chock-full-o-screenshots. Those problems were enough for Valve to publicly state to everyone "Do not use these drivers!"

You're not interested in the least bit about that issue?

p.s. I've read the comments on the AT comments board. I did not make the comment about Valve, surprise surprise. For the past two reviews, someone has taken my comments and posted them verbatim to their forum, and some AP also insulted my intelligence thinking it was me posting. :LOL:

Didn't realize I was that special. :oops:
 
http://www.anandtech.com/talkarticle.html?i=1896&ATVAR_START=41&p=3

The post at the top of that thread (not by me...I did not post in that thread at all) pretty much sums up my feelings on the "apologetic tone" toward nVidia, with examples.

Now back to some more productive and pointed observations:

Aquamark, AA/Aniso image quality.

Clearly, there is some "blurring" going on in the nVidia shots...including alpha textures (grass). There are only two possible explanations for this:

1) The AA mode used by the nVidia card is one of the XS / partial supersampling modes.

2) The AA mode used by nvidia is Quincunx.

Since the whole image appears blurred to me, not just "filtered" relative to the ATI shot (I mean every texture if you compare them), I'm inclined to believe it's Quincunx.
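For what it's worth, here's a quick sketch of why Quincunx would blur everything: the commonly cited description of its downfilter is a 5-tap kernel weighting the centre sample 1/2 and four surrounding taps 1/8 each. The tap positions below are simplified to whole-pixel offsets (the real pattern uses half-pixel offsets), so treat this as an illustration, not the exact hardware filter:

```python
# Illustration only: a quincunx-style 5-tap low-pass (centre 1/2,
# four diagonal taps 1/8 each) flattens texture detail everywhere,
# not just at polygon edges.
import numpy as np

def quincunx_filter(img):
    out = img * 0.5
    for dy, dx in [(-1, -1), (-1, 1), (1, -1), (1, 1)]:
        out += 0.125 * np.roll(np.roll(img, dy, axis=0), dx, axis=1)
    return out

# High-frequency stripe "texture": no geometric edges at all.
tex = np.tile([0.0, 1.0], (8, 4))
print(tex[0, :4])                   # [0. 1. 0. 1.]  crisp detail
print(quincunx_filter(tex)[0, :4])  # [0.5 0.5 0.5 0.5]  flattened
```

Every texel gets pulled toward its neighbours, which is exactly the across-the-board softening I'm seeing, as opposed to multisampling that only touches edge pixels.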

Is there more than one Quincunx / blur filter AA mode on NV cards? Or just the 2X AA + blur?
 
I presume the question of whether the FX59xx is using AA or not in some images comes down to it only having vertically offset AA samples. Areas like the Michelin / lights bar in F1 and the X2 animated logo will therefore look as though AA isn't on, and circular graphics such as the stop sign in SimCity etc. won't look smooth, as only half the circle has AA applied (more like HSAA!).
I suppose it's another NV performance saver when comparing AA against the competition.
 
You know, I hate to say it, but I am rather impressed. I thought the FX line was pretty much milked out completely, so getting gains like these with that little drop in IQ is actually an achievement of note. It still doesn't make them competitive, though, and nothing short of a new architecture will, because even though (despite what many claim) nVidia is still decent for most current games, the cards cost too darn much.
 
I'm beginning to think that a lot of the hardware community's hatred towards NVIDIA has become so blinding that they will never appreciate anything that comes from the company now, or in the future. Not that I think NVIDIA hasn't made mistakes, but I think a lot of people have taken things so far that even if the companies' positions were reversed a year from now, nothing could change the ATI love fest.

I mean, look at this objectively; the 52.xx performance gains are massive. The IQ compared to 4x.xx is again a huge improvement. Yes, ATI is still slightly ahead, but being behind never caused anyone to 'hate'/witch-hunt ATI back in the DX8 days.

It's just pretty messed up. The 52.xx drivers show large improvements. Anandtech praises these improvements (while still giving the final nod to ATI), yet gets accused of being on the NVIDIA payroll. Pretty scary times these days.

This isn't so much an observation on Anand's article, but rather on the responses I've read around the net regarding it.

*insert joke about people taking cheating/lying to heart so seriously yet overlooking the current American administration, meh (or something)*

(except Natoma, that crazy DemocrATIc :LOL:)
 
Joe,

There's also 4xOGMS + 9-tap; by the way, the blurriness of Quincunx or the 4x / 9-tap mode never really came across in all its glory in screenshots. The latter mode is just a tad more apparent in screenshots, but it's rather mild compared to what you experience in real-time. If you really have time to waste, re-read both the first and the second article; the contradiction between them is quite obvious.

Personally, I'll start objecting to any review from now on that doesn't use custom timedemos. Unless I've missed something, I can't see any at Anand's.

Wavey,

Wouldn't it be safe to assume, though, that the performance gain from shader optimisers is quite a bit higher on NV3x than on R3xx?
 
I mean, look at this objectively; the 52.xx performance gains are massive. The IQ compared to 4x.xx is again a huge improvement. Yes, ATI is still slightly ahead, but being behind never caused anyone to 'hate'/witch-hunt ATI back in the DX8 days.

If you really want me to look at it objectively, I'll give you my personal opinion about the specific driver set when I can download it myself and see its behaviour with my own eyes.

The NVIDIA "hatred" you're addressing has been an ongoing phenomenon for years now; nothing new there either.
 
I mean, look at this objectively; the 52.xx performance gains are massive. The IQ compared to 4x.xx is again a huge improvement. Yes, ATI is still slightly ahead, but being behind never caused anyone to 'hate'/witch-hunt ATI back in the DX8 days.

You've got a short memory then. I remember a time when every ATI card review came with the disclaimer that their drivers weren't as good as Nvidia's, plus the disclaimers about "Nvidia has something better just around the corner". Nothing has changed here except that the roles have been reversed.
 
DaveBaumann said:
I'd also like to know why they made the assumption that NVIDIA would be the only ones to gain from a shader optimiser - ATI are only beginning this as well.

It's simple. They're genuine twits.
 
I'll have to add myself to the list of skeptics about the supposed improvements offered by the Det52.14 driver that Anand tested. If this driver is really so fantastic, why isn't it available for download? Maybe because it still has too many issues? Maybe because it isn't passing the WHQL certification tests because it's messing with IQ and/or DirectX instructions? Maybe because it doesn't yet provide enough performance to match ATI's latest products, even with Nvidia's own unreleased future hardware?

I think the last question is particularly interesting. How many people would actually spend $500 on a graphics card that's not the fastest AND doesn't have the best IQ? Would it even make sense for Nvidia to announce the NV38 and Det50 until they've found a way to somehow get higher performance than the 9800 XT in a significant number of games? Perhaps convincing gullible press like Anand to spend countless hours reviewing unannounced hardware and software is their new "paper launch" strategy... this way, you can start generating hype about your upcoming products without actually having to commit to anything like clock speeds, availability, features, etc.
 
Joe DeFuria said:
ATI is the clear winner....and while he states his preference for ATI, it comes with the "you should really wait" disclaimer. "Wait till next year" to decide the DX9 performance winner?

How about decide the performance winner now....and if it happens to change next year, so be it?

:LOL: I remember the exact quote somewhere else.... (hmm.. was it in the Anand's article comment section? )
 
Sixty pages of Sizzle. Unfortunately, no Steak. I got the impression that as long as the IQ was close enough, it was considered fair to compare. Personally, I would rather see ten games done thoroughly than twenty glossed over.
 
DaveBaumann said:
I'd also like to know why they made the assumption that NVIDIA would be the only ones to gain from a shader optimiser - ATI are only beginning this as well.
Not that I agree with it, but ATI's hardware has been around a lot longer, so I guess they're assuming ATI has fewer things left to optimize for.
But I don't really understand this part:
If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code as they have done with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop

Nvidia's hardware runs at a 50 MHz clock advantage (9800 XT vs 5950) and also uses >25 million more transistors, and they're still not able to beat ATI, with much crappier FSAA to boot. So I don't know how the Det 52.xx drivers can lead anyone to believe that they're on their way to taking back the performance crown. I would say there's now a possibility that they can get it back, though (again, assuming these performance gains are legit).

Joe DeFuria said:
How about decide the performance winner now....and if it happens to change next year, so be it?
He does say:
NVIDIA has a long road ahead of them in order to improve their compilers to the point where game developers won't have to hand-code special NV3x codepaths, but for now ATI seems to have won the battle.

And the conclusion seems ok to me:

ATI is still the recommendation, but NVIDIA is not a bad card to have by any stretch of the imagination. We still urge our readers not to buy a card until the game they want to play shows up on the street. For those of you who need a card now, we'll be doing a value card round up as part of this series as well.

I'm very interested in the value card round up, since I want to know if these new enhancements will carry over to the entire FX line. Hopefully we'll soon see some Det 52.xx reviews that include the 9600 and 5200-5600 series. Perhaps from B3D?
 