Anand Compares NV38/R360 IQ

THe_KELRaTH said:
A nice guy at the AT forum, hominid skull, has put together an animated GIF of the F1 images.

AT's conclusion should have included that if you don't want line crawling even with AA on, then of the reviewed cards ATI is your ONLY option right now.
I can't see any AA at all on two of those three images.
 
Looking at the original Anandtech 4xAA JPG shots from F1 Challenge (GIFs really aren't good for image quality comparisons, and JPGs suck as well), either he did something wrong or there's something strange going on, as there appears to be no AA at all on the horizontal edge of the banner over the track in either the 45.23 or 52.14 shots. I would expect to see one intermediate shade even in nVidia's 2x mode (it should stretch about halfway across the stairstep between one y coordinate and the next), and in the 4x mode there should be mostly one intermediate shade stretching about the same distance, with the odd pixel of other shades here and there - but there seem to be pretty much no intermediate shades at all.

There are a few pixels dotted around some of the other edges in the image that look like they do have some level of AA applied though.

Very curious. :?
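Out of interest, here's a quick sketch of why those intermediate shades should be there (C++, with assumed sample positions - 2x as a rotated grid, 4x as an ordered grid - the real NV3x patterns may differ). For a near-horizontal edge only the distinct y offsets of the samples matter, so both modes should produce a single intermediate shade covering roughly half of each stairstep:

Code:
// Per-pixel coverage along a shallow edge under assumed AA sample patterns.
// For a near-horizontal edge only the distinct sample y offsets matter, so
// both 2x and 4x should show one intermediate shade (coverage 0.5) across
// about half of each stairstep.
#include <cstdio>

struct Sample { double x, y; };

double coverage(double edgeY, const Sample* s, int n) {
    int covered = 0;
    for (int i = 0; i < n; ++i)
        if (s[i].y < edgeY) ++covered;        // sample falls under the edge
    return double(covered) / n;
}

int main() {
    const Sample aa2[] = {{0.25, 0.25}, {0.75, 0.75}};   // assumed rotated grid
    const Sample aa4[] = {{0.25, 0.25}, {0.75, 0.25},
                          {0.25, 0.75}, {0.75, 0.75}};   // assumed ordered grid
    const double slope = 1.0 / 16;            // edge climbs one pixel every 16
    for (int x = 0; x < 16; ++x) {
        double edgeY = 0.5 + slope * (x - 8); // edge height inside this pixel
        printf("x=%2d  2x=%.2f  4x=%.2f\n", x,
               coverage(edgeY, aa2, 2), coverage(edgeY, aa4, 4));
    }
}

Both columns sit at 0.50 for about half the pixels in the run - that's the long strip of intermediate shade that seems to be missing from the banner.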
 
digitalwanderer said:
THe_KELRaTH said:
I'm surprised AT didn't do this, as it's a lot easier to view any differences - even with those tiny images.
I don't think Anand's goal was really to show the differences, I think he was trying to hide them...thus the tiny screenshots. ;)

Taken with the in-game view carefully angled to minimise the problem. I guess it's a case of "Nvidia tells us it doesn't impact image quality" :rolleyes:
 
The one thing that stood out for me on the 52.14s was the last fence panel on the right. If you look at the 3.7s and the 45s, they both show the trees through the fence. On the 52s you get grey.
 
THe_KELRaTH said:
If you grab the F1 FX images and zoom in, there is one intermediate shade on the banner as well as on other horizontally offset areas (like the car's front window); areas that are near the vertical axis have 4 shades.
Not on the horizontal edges - as far as I can see there are a couple of pixels (at most) at each stairstep on the banner that might be an intermediate colour. In either 2x or 4x AA mode there should be a long strip of an intermediate shade running about halfway between each stairstep and the next (because of the sampling patterns used). There might be a similar problem with the horizontal edges of the tires as well, but because of the underlying noise from the road surface it's difficult to tell.

It's very strange.

Looking at it some more, there also doesn't seem to be much detail in the road surface in the 52.14 driver's AF shot (in the area of track nearest to the car, the track 'noise' looks to be at a much lower frequency).

[Edit - the lower-frequency track detail appears to be new in the 52.14 driver - the 45.23 screenshot doesn't have this effect.]
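One way that kind of lower-frequency detail can appear is a positive mip LOD bias, since each extra mip level halves the spatial frequency of the texture detail. Purely a hypothetical illustration - nothing here proves that's what the 52.14s are doing:

Code:
// Hypothetical: how a positive LOD bias maps to blurrier mip levels.
// level = log2(texels per pixel) + bias; each +1 halves detail frequency.
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    const double texelsPerPixel = 4.0;        // a moderately minified surface
    const double biases[] = {0.0, 0.5, 1.0};
    for (double bias : biases) {
        double level = std::max(0.0, std::log2(texelsPerPixel) + bias);
        printf("bias %+.1f -> mip level %.1f\n", bias, level);
    }
}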
 
Hey andy while you're here, quick question. :)

Does ATI lap the R3x00 core before applying it to the circuit board? I noticed that the core on my 9800 Pro was unbelievably reflective when I took off the stock heatsink to apply Arctic Ceramique; almost a perfect mirror. Never seen that before on any core, processor or VGA alike.
 
Oh dear... Looks like CatalystMaker's gonna piss off AT. All that designer TR-AOD benchmarking's just gone out the window with Cat 3.8 :LOL:

" Tomb Raider frame rates improve as much as 20%"
 
Natoma said:
Hey andy while you're here, quick question. :)

Does ATI lap the R3x00 core before applying it to the circuit board? I noticed that the core on my 9800 Pro was unbelievably reflective when I took off the stock heatsink to apply Arctic Ceramique; almost a perfect mirror. Never seen that before on any core, processor or VGA alike.
The back of a die/wafer is generally that shiny. At least all the wafers I've seen.
 
Natoma said:
Does ATI lap the R3x00 core before applying it to the circuit board? I noticed that the core on my 9800 Pro was unbelievably reflective when I took off the stock heatsink to apply Arctic Ceramique; almost a perfect mirror. Never seen that before on any core, processor or VGA alike.

I don't know so I can't tell you I'm afraid - you could try sireric.
 
The Dets show much more blurriness than the Cat 3.7 if you look into the distance... there are also very apparent 'boundaries' where the filtering is being applied, it seems.

I agree that in the centre of the image the road and the overhead signs looked much better in the Cats... but I couldn't really tell which was giving the better image with the side buildings (since I don't know what the building SHOULD look like). The only thing I can say is the new Dets were pretty (and obviously) bad.

Looking at it like that though, the cat is the obvious winner.
 
zurich said:
I'm beginning to think that a lot of the hardware community's hatred towards NVIDIA has become so blinding that they will never appreciate anything that comes from the company now, or in the future. Not that I think NVIDIA hasn't made mistakes, but I think a lot of people have taken things so far that even if the companies' positions were reversed a year from now, nothing could change the ATI love fest.

I mean, look at this objectively; the 52.xx performance gains are massive. The IQ compared to 4x.xx is again a huge improvement. Yes, ATI is still ahead slightly, but that never caused anyone to 'hate'/witch-hunt them when they were behind in the DX8 days.

It's just pretty messed up. 52.xx shows large improvements. Anandtech praises these improvements (while still giving the final nod to ATI), yet gets accused of being on the NVIDIA payroll. Pretty scary times these days.

This isn't so much an observation on Anand's article, but rather on the responses I've read around the net regarding it.

*insert joke about people taking cheating/lying to heart so seriously yet overlooking the current American administration, meh (or something)*

(except Natoma, that crazy DemocrATIc :LOL:)

Just from the AA aspect of it, I don't think Nvidia is competing at all. Why even turn it on for an FX card? I'm not blinded by anything. If anything, I see more clearly than ever.

I guess the questions you should ask yourself are these: 1. Does AA quality matter to me, even on the horizontal? 2. Do I want my card to perform the best it can out of the box, or to wait six months to a year to see good performance?

If the answers to both of those questions are yes, then you have your answer.

If you answer NO to both questions, then you, my fellow, are the one that is not seeing clearly. :rolleyes:
 
zurich said:
I'm beginning to think that a lot of the hardware community's hatred towards NVIDIA has become so blinding that they will never appreciate anything that comes from the company now, or in the future. Not that I think NVIDIA hasn't made mistakes, but I think a lot of people have taken things so far that even if the companies' positions were reversed a year from now, nothing could change the ATI love fest.

I mean, look at this objectively; the 52.xx performance gains are massive. The IQ compared to 4x.xx is again a huge improvement. Yes, ATI is still ahead slightly, but that never caused anyone to 'hate'/witch-hunt them when they were behind in the DX8 days.

It's just pretty messed up. 52.xx shows large improvements. Anandtech praises these improvements (while still giving the final nod to ATI), yet gets accused of being on the NVIDIA payroll. Pretty scary times these days.

This isn't so much an observation on Anand's article, but rather on the responses I've read around the net regarding it.

*insert joke about people taking cheating/lying to heart so seriously yet overlooking the current American administration, meh (or something)*

(except Natoma, that crazy DemocrATIc :LOL:)

I think what you have struck on here could be regarded as the heart of nVidia's problems now - after about twelve months of deception, cheats, and image quality reductions, people expect that and nothing more from them, and because of that are naturally wary and skeptical of anything that comes out of nVidia. nVidia need to start earning back the respect they had in the past - the Detonator 50s might be the beginning of that long road, or they might not. Just remember how long it took ATi to regain trust and respect after their reputation for poor quality drivers and the whole Quack incident.

Personally, I think there will be a lot of genuine improvement in the Detonator 50s, but that is going to be tempered by more image quality reductions in other areas - From what I've read it's going to be very much a mixed bag.
 
Leonidas said:
English translation is ready:
http://www.3dcenter.org/artikel/detonator_52.14/index_e.php


Please keep in mind, ATi has a similar optimization under Direct3D AF: texture stages 1-7 are only bilinearly filtered. IMO this is not corrected in Catalyst 3.8.
This is only true if you've forced AF from the control panel. If you use application enabled AF, then you get what the application asks for.
2006: We have the GeForceFX III / Radeon 20000 and play Doom 7 with a nice bilinear filter everywhere :)
It's funnier if you get your facts straight.
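For anyone unclear on the distinction: "application enabled" means the game sets its own filtering per texture stage through the API, rather than the control panel forcing one setting everywhere. A hypothetical D3D9 fragment (sampler indices and settings are just examples):

Code:
#include <d3d9.h>

// Hypothetical: the application requests filtering per texture stage itself,
// so the base texture gets trilinear + AF while a lightmap stays bilinear.
void SetAppControlledFiltering(IDirect3DDevice9* device) {
    // Stage 0: base texture - trilinear plus 8x anisotropic.
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);  // trilinear
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);

    // Stage 1: e.g. a low-frequency lightmap - plain bilinear is fine here.
    device->SetSamplerState(1, D3DSAMP_MINFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(1, D3DSAMP_MAGFILTER, D3DTEXF_LINEAR);
    device->SetSamplerState(1, D3DSAMP_MIPFILTER, D3DTEXF_POINT);   // bilinear
}

When AF is forced from the control panel instead, the driver has to decide which stages matter - which is where the "stages 1-7 bilinear only" shortcut comes in.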
 
That reminds me, GL Guy - those ideas you told us about for a CP option to force trilinear on all texture stages; have they taken off yet? :D
 
Is it reasonable to assume that the jump in DX9 shader performance is due to some form of wrapper, so calls for full precision automatically drop to partial and calls for partial automatically drop to 16Int?
 
OpenGL guy said:
This is only true if you've forced AF from the control panel. If you use application enabled AF, then you get what the application asks for.
That's a bad excuse IMO. The number of games that take control of the AF themselves is quite small. Luckily aTuner can disable that "optimization". But something isn't automatically good if something else is worse.

At least OpenGL has been exempted from any filtering "optimizations" so far.

It's funnier if you get your facts straight.
Maybe his comment was badly worded, but I don't think Leonidas has to get his facts straight.
 
THe_KELRaTH said:
Is it reasonable to assume that the jump in DX9 shader performance is due to some form of wrapper, so calls for full precision automatically drop to partial and calls for partial automatically drop to 16Int?
No, that would be easy to spot. If the driver drops to lower precision at all, it needs to do a more thorough analysis of how the values are used to decide where a drop in precision would be "acceptable".

I don't think performance increases of 30% with complex shaders in comparison to early drivers are particularly surprising. Many shaders leave room for optimization, especially those written in HLSL.

btw, it's FX12, not 16.
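To see why a blanket wrapper would be easy to spot, here's a rough sketch of FX12 quantization (assuming the usual description of NV3x's register combiner format: 12-bit fixed point, sign + 1 integer bit + 10 fraction bits, range [-2, 2)). Anything outside that range simply saturates, which would visibly wreck things like texture coordinates:

Code:
// Rough sketch: quantize a value the way an assumed FX12 fixed-point
// register would (10 fraction bits, representable range [-2, 2)).
#include <cmath>
#include <cstdio>

double toFx12(double v) {
    const double step = 1.0 / 1024;           // 10 fraction bits
    double q = std::round(v / step) * step;   // snap to the fixed-point grid
    if (q >  2.0 - step) q =  2.0 - step;     // saturate at the top...
    if (q < -2.0)        q = -2.0;            // ...and at the bottom
    return q;
}

int main() {
    const double values[] = {0.123456, 1.9999, 3.5, 100.0};
    for (double v : values)
        printf("%10.6f -> %10.6f\n", v, toFx12(v));
}

3.5 and 100.0 both come out as ~1.999, so a naive auto-demotion would produce glaring artifacts rather than a clean speedup - hence the need for per-shader analysis.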
 
Xmas said:
OpenGL guy said:
This is only true if you've forced AF from the control panel. If you use application enabled AF, then you get what the application asks for.
That's a bad excuse IMO. The number of games that take control of the AF themselves is quite small.
And the number of games that benefit from trilinear on every stage is small.
But something isn't automatically good if something else is worse.
I have no idea what this means.
It's funnier if you get your facts straight.
Maybe his comment was badly worded, but I don't think Leonidas has to get his facts straight.
No, I believe he thought that ATI never did full trilinear, which is not the case. Many people have gotten confused on this issue.
 