Anand Compares NV38/R360 IQ

I don't see any serious image faults in the Det 52.14 screenshots, but with the picture sizes and the way they're cropped, I can't really make out much of anything. The links to the full-sized pictures don't work for me, and not all pictures had that option anyway.

The grass in Aquamark did seem pretty fuzzy, though. There was some evidence of inferior AA, but it was pretty hard to see in the samples they showed.

It's also unfortunate that a number of the programs used don't seem to be able to capture the same frame for each sample.

It's possible the Det 52.14 drivers don't significantly degrade image quality, but I guess it will take more time to find out whether they really hold up once they're released.
 
Finished reading... well, Anand certainly had a busy week.

Unfortunately, many screenshots are cropped and/or taken from camera angles that make it impossible to really examine various IQ-related things (like texture filtering and AF). There aren't any obvious differences between the ATI cards and the NV cards on the Det 5x.x drivers. Also, neither the Halo nor the TR images showed anything that looked like PS 2.0 use to me.

I don't know what to think about the NV38 being unreleased, though... or about using an Athlon FX for the tests.
And then there's the question of where Nvidia has managed to find its performance gains, or how Half-Life 2 would perform. Also, I know this will sound paranoid, but I truly, honestly hope that other websites will be able to reproduce Anand's benchmark results.
 
IMHO this article isn't that bad, though I'm now wondering why Anand even did Part 1, since this article includes all the benchmarks mentioned there.
The article also failed to mention some of the "minor" issues; I'd really have liked some analysis of the trilinear/AF (quality vs. application preference) behaviour of the different drivers.

And I don't agree with (some of) his concluding remarks. Well, of course, if Nvidia can get a 30% performance increase with every driver, at some point they will beat the R360! But looking at the hardware, that just doesn't seem possible...

The X2 benchmarks also seem pretty useless (Anand mentions it himself), so why put them up in the first place? Especially since the problems encountered are likely to change the performance numbers (and just dropping frames, as Anand suggested, certainly would change the numbers).
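To put some made-up numbers on that (purely illustrative, nothing from Anand's data): average fps is just frames rendered over elapsed time, so missing frames shift the score even when the run takes exactly as long.

```python
# Toy illustration (hypothetical numbers, not anything from the article):
# average fps is frame intervals divided by elapsed time, so dropping
# frames moves the score even though the run itself takes just as long.

def average_fps(timestamps):
    """Frame intervals divided by elapsed wall-clock time (seconds)."""
    return (len(timestamps) - 1) / (timestamps[-1] - timestamps[0])

# A steady run: one frame every 20 ms for two seconds (50 fps).
steady = [i * 0.020 for i in range(101)]

# The same two seconds, but every fourth frame was never rendered.
# First and last timestamps are kept so elapsed time stays identical.
dropped = [t for i, t in enumerate(steady) if i % 4 != 0 or i in (0, 100)]

print(f"steady run : {average_fps(steady):.1f} fps")   # 50.0
print(f"with drops : {average_fps(dropped):.1f} fps")  # 38.0
```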

Anand also somehow manages to blame ATI for the missing shiny water in NWN. Since he was obviously browsing around some forums, he should have known that BioWare just used Nvidia's proprietary extensions and didn't bother to also code for ATI's. (For the record, shiny water in NWN still doesn't work with Linux and ATI cards (and ATI's drivers). A BioWare person said in the NWN forum that the ATI shiny-water code isn't in the Linux build, since nobody at BioWare bothered to test it - maybe it would just work...)

Will there be some analysis of this driver on Beyond3D (it should be WHQL'd sometime, IIRC)? I'm sure Anand missed some things. 8)
 
Hmmm... Given that image anomalies were mentioned in almost every test in Part 1, yet his screenshots show almost nothing at all, could this be the result of that screen-capture detection stuff Valve complained about?

I wonder.....
 
Heh, I found this part from page 2 to be ironic: "Our previous Flight Simulator benchmark just didn't push the game far enough, and we are hard at work trying to find a benchmark that better reflects gameplay and is completely repeatable".

Someone really needs to tell Anand that using benchmarks that rely on game cutscenes (Halo's are actually letterbox cutscenes!) isn't reflective of real gameplay.
 
Natoma said:
Given that image anomalies were mentioned in almost every test in Part 1,

Wow, I'd completely forgotten about that...

yet his screenshots show almost nothing at all, could this be the result of that screen-capture detection stuff Valve complained about?

That would be truly shocking... but he did look at the tests (at least he says so), so he should have noticed it...
 
Laa-Yosh,

I put nothing past Anand at this point. IMO the burden of proof is on him to do a thorough analysis.

At the very least, no mention whatsoever of that very damning point from Valve, in an image quality test of those very drivers, strikes me as "curious" at best.
 
:oops: Fuck me, that was a lot!

Generally I thought it pretty good.

They didn't want to delve into any test apps, or even use mip colours, so they didn't spot the trilinear issues. I thought it odd that they didn't wait and just use 3.8 straight away (don't ask how this could be done...). I'd also like to know why they assumed NVIDIA would be the only ones to gain from a shader optimiser - ATI are only beginning this as well.
 
Here's what Anand wrote in Part 1 of the article:
The 52.14 drivers apparently have issues in two games, neither of which are featured in our test suite (Half Life 2 & Gunmetal).

The Gunmetal screenshots in Part 2 do have some anomalies; actually, there are artifacts in all three...

Moving on with Part 1:
AA and AF didn't really seem to work as well on the NVIDIA cards as it did on the ATI cards. There was some difference between the two, but we will have to do more research into this area before we can bring forth anything conclusive.

No mention of it in Part 2.

Homeworld 2:
This is another test where it was not apparent that NVIDIA's AA was doing as much as needed for the scene, so we will take a closer look at this benchmark when we do our image quality comparison.

Now he simply does not include AA scores in Part 2.


So it looks like he has "covered" most of the IQ issues mentioned in the first part of the article. Sim City 4 is an interesting question, though...
 
I found his treatment of TRAOD to be in rather poor taste. For all the reader knows, none of the cards could deliver a playable game, because he only shows how much performance drops when enabling PS 2.0. I mean, are we going from 100 fps to 40, or from 10 to 4? Also, you can't really compare the results between the 9800 XT and the 5950, because you have no idea where they started without PS 2.0.
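Quick arithmetic to show why the relative numbers alone are useless (hypothetical figures, obviously):

```python
# Identical relative drop, completely different playability (made-up figures).
for before, after in ((100.0, 40.0), (10.0, 4.0)):
    drop = 100.0 * (1 - after / before)
    print(f"{before:.0f} -> {after:.0f} fps: {drop:.0f}% drop")
# Both print a 60% drop, yet only one of the two ends up playable.
```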
 
Now that's what bothers me. Those two pictures are supposed to have AA enabled, and yet in the 52.14 shots it doesn't look to me like AA is being applied at all. No mention of it in the blurb above the images.

And the difference in texture blurriness between the two images is quite noticeable...
 
mczak said:
IMHO this article isn't that bad, though I'm now wondering why Anand even did Part 1, since this article includes all the benchmarks mentioned there.
The article also failed to mention some of the "minor" issues; I'd really have liked some analysis of the trilinear/AF (quality vs. application preference) behaviour of the different drivers.

~

The X2 benchmarks also seem pretty useless (Anand mentions it himself), so why put them up in the first place? Especially since the problems encountered are likely to change the performance numbers (and just dropping frames, as Anand suggested, certainly would change the numbers).

I thought this was going to be an IQ comparison. It seemed to be mostly benchmarks with a few comments about IQ here and there.

I know Anand mentioned the lack of AA in Homeworld and X2, but it looks like it's not working in F1 either. Just look at the car. And what's with the tiny images?

http://www.anandtech.com/video/showdoc.html?i=1896&p=12
 
Natoma said:
Now that's what bothers me. Those two pictures are supposed to have AA enabled, and yet in the 52.14 shots it doesn't look to me like AA is being applied at all. No mention of it in the blurb above the images.
There's AA in the 52.14 shots; it's just very poor quality (lack of gamma correction, and an ordered-grid sample pattern).
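For those wondering what gamma correction has to do with AA: the samples along an edge should be blended in linear light, not on the gamma-encoded values stored in the framebuffer. A rough sketch of the difference, using a plain 2.2 power curve rather than any particular hardware's transfer function:

```python
# Rough sketch: why blending AA samples without gamma correction
# makes edges look wrong. A plain 2.2 power curve stands in for the
# real transfer function, which differs per display/hardware.

GAMMA = 2.2

def resolve_naive(samples):
    """Average the gamma-encoded sample values directly."""
    return sum(samples) / len(samples)

def resolve_gamma_correct(samples):
    """Decode to linear light, average, then re-encode."""
    linear = [s ** GAMMA for s in samples]
    return (sum(linear) / len(linear)) ** (1.0 / GAMMA)

# A 4x AA edge pixel: half the samples hit white, half hit black.
edge = [1.0, 1.0, 0.0, 0.0]

print(f"naive   : {resolve_naive(edge):.3f}")         # 0.500 - displays too dark
print(f"correct : {resolve_gamma_correct(edge):.3f}") # ~0.730 - matches 50% coverage
```

The naive average comes out at 0.5, which a gamma-2.2 monitor displays at roughly 22% brightness - far darker than the 50% coverage the edge actually has, so edge gradients look harsh.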
 
DaveBaumann said:
:oops: Fuck me, that was a lot!

Generally I thought it pretty good.

They didn't want to delve into any test apps, or even use mip colours, so they didn't spot the trilinear issues. I thought it odd that they didn't wait and just use 3.8 straight away (don't ask how this could be done...). I'd also like to know why they assumed NVIDIA would be the only ones to gain from a shader optimiser - ATI are only beginning this as well.
I thought the R3x0 architecture already had near-optimal drivers as-is? That's what I've gathered from these forums, anyway. I recall a recent thread where ATi devs said not to expect 20%+ performance jumps in shader-heavy games simply from optimized drivers.

Yes, I agree that Derek (who was the main author) didn't spend enough time remarking on general AA/AF quality, and I didn't like his screenshot treatment. I noted this in his Ars thread, so hopefully he'll give me his reasons for that. I hope they're nothing more than time pressure, but I still prefer quality over timeliness.
 
OpenGL guy said:
Natoma said:
Now that's what bothers me. Those two pictures are supposed to have AA enabled, and yet in the 52.14 shots it doesn't look to me like AA is being applied at all. No mention of it in the blurb above the images.
There's AA in the 52.14 shots; it's just very poor quality (lack of gamma correction, and an ordered-grid sample pattern).

Hmm.. Definitely poor quality then. I guess I'm spoiled by my 9800 Pro. :LOL:

Interesting comment from Derek Wilson in the comments section for AT:

Many of the image quality issues from part 1 were due to rendering problems that couldn't be captured in a screen shot (like jerkiness in X2 and F1), or a lack of AA. For some of the tests, we just didn't do AA performance benchmarks if one driver or the other didn't do what it was supposed to.

That smells very strongly of the screen-capture problems Valve reported: IQ problems in-game that can't be detected in screenshots.
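It's also exactly the kind of thing a per-frame timing log would catch. A trivial made-up example - two runs with the same average fps, completely different feel, and identical screenshots:

```python
# Two made-up runs with identical average fps but very different feel.
# A screenshot can't distinguish them; per-frame timings can.

smooth = [20.0] * 10     # frame times in ms: every frame takes 20 ms
jerky = [5.0, 35.0] * 5  # alternating fast and slow frames

for name, times_ms in (("smooth", smooth), ("jerky", jerky)):
    avg_fps = 1000.0 * len(times_ms) / sum(times_ms)
    print(f"{name}: {avg_fps:.0f} fps average, worst frame {max(times_ms):.0f} ms")
# Both print 50 fps, but the jerky run stalls for 35 ms every other frame.
```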
 
Pete said:
DaveBaumann said:
:oops: Fuck me, that was a lot!

Generally I thought it pretty good.

They didn't want to delve into any test apps, or even use mip colours, so they didn't spot the trilinear issues. I thought it odd that they didn't wait and just use 3.8 straight away (don't ask how this could be done...). I'd also like to know why they assumed NVIDIA would be the only ones to gain from a shader optimiser - ATI are only beginning this as well.
I thought the R3x0 architecture already had near-optimal drivers as-is? That's what I've gathered from these forums, anyway. I recall a recent thread where ATi devs said not to expect 20%+ performance jumps in shader-heavy games simply from optimized drivers.

Yes, I agree that Derek (who was the main author) didn't spend enough time remarking on general AA/AF quality, and I didn't like his screenshot treatment. I noted this in his Ars thread, so hopefully he'll give me his reasons for that. I hope they're nothing more than time pressure, but I still prefer quality over timeliness.

Sireric noted in another thread that they've only begun scratching the surface with the shader compiler. Basically, shader rendering up to this point has been brute-force use of the hardware, unoptimized.
 