Recommended review settings

Nvidia apparently instructs reviewers to use the Balanced setting in reviews - is that correct? I was wondering if ATI have a similar recommendation. If so, is it the "Balanced" option in the overall OGL & D3D quality super-sliders? How about Matrox et al.?
 
ya right, Nvidia sacrificing FPS for IQ?? please....

"Per NVIDIA’s recommendation, all benchmarks for this review were initially run with the “Aggressiveâ€￾ image quality slider selection within the drivers"

taken from the [H] review of the 5200 and 5600
 
Maybe they give different instructions for the 5800 vs. the others; I thought it was Balanced for the former. Anyway, the Balanced option (or whatever it's called depending on which driver you use) doesn't do "proper" trilinear filtering, so that's sacrificing IQ for FPS too.
 
Myrmecophagavir said:
Nvidia apparently instructs reviewers to use the Balanced setting in reviews - is that correct? I was wondering if ATI have a similar recommendation. If so, is it the "Balanced" option in the overall OGL & D3D quality super-sliders? How about Matrox et al.?

I haven't seen a single reference to that general quality slider in ATI's guides.

nVidia recommends 1280x1024 with 2xAA and 8x Aggressive AF as a sweet spot for their card.
They recommend using 8x Aggressive AF @ 1280x1024 (or higher) for "default" testing.

ATi recommends 1600x1200 with 4-6x AA and 8-16x Performance and Quality AF for the 9800 Pro.

And BTW, nVidia's guide claims that Balanced does trilinear filtering of a quality that's superior to their competitors'.
 
Thanks for the info. I suppose if they're suggesting exact AA/AF modes the super-sliders are disabled, i.e. set to Custom Settings.

My main concern is the Texture Preference and Mipmap Detail Level settings for ATI. You never see these sliders mentioned in reviews, but presumably they have some impact.

If you check the Balanced super-slider setting, you get different settings for D3D vs. OGL - under OGL the texture preference is set to Quality rather than High Quality and textures come out looking bad - hardly fitting the description "quality", and worrying that that should be the default...
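For what it's worth, my guess (and it's only a guess - ATI's guide doesn't say) is that the Mipmap Detail Level slider is basically a global LOD bias. An app can apply the same kind of bias itself through the GL_EXT_texture_lod_bias extension, which makes the trade-off easy to see:

/* Sketch only - assumes the Mipmap Detail Level slider acts like a global LOD
   bias, and that a GL context exists with a texture bound. Needs the
   GL_EXT_texture_lod_bias extension. */
#include <GL/gl.h>

#ifndef GL_TEXTURE_FILTER_CONTROL_EXT
#define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
#define GL_TEXTURE_LOD_BIAS_EXT       0x8501
#endif

void set_mip_bias(float bias)
{
    /* bias > 0 switches to smaller mip levels sooner (blurrier but cheaper),
       bias < 0 holds on to the larger levels longer (sharper but more shimmer) */
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, bias);
}

If the slider really is just this, then nudging it towards "Performance" is trading texture sharpness for fillrate without it ever showing up in the review text.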
 
Myrmecophagavir said:
If you check the Balanced super-slider setting, you get different settings for D3D vs. OGL - under OGL the texture preference is set to Quality rather than High Quality and textures come out looking bad - hardly fitting the description "quality", and worrying that that should be the default...

I'd rather say that OpenGL's "High Quality" corresponds to a slightly more aggressive setting than Direct3D's "High Quality".

Since there's virtually no performance difference at all between those two settings in OpenGL, it's not much to worry about.
 
DaveBaumann said:
cho said:
NVIDIA will recommend using "Application" with the new driver.

Where did this come from?
It sounds odd, as I've just read elsewhere that nV changed the settings names again, from App/Qual/Perf to Qual/Bal/Perf.
 
Correct, the new drivers are now Qual / Bal / Perf

What is quite likely, though, is that nVidia would encourage reviewers who insist on apples-to-apples comparisons to do ATI's Qual vs. nVidia's Qual, since they both use Trilinear everywhere and are "roughly" similar...
While it might give ATI a slight performance advantage, it's still nowhere near as big as Balanced vs. Performance, for example - and remember that's what Anandtech was doing...


Uttar
 
Ante P said:
Myrmecophagavir said:
If you check the Balanced super-slider setting, you get different settings for D3D vs. OGL - under OGL the texture preference is set to Quality rather than High Quality and textures come out looking bad - hardly fitting the description "quality", and worrying that that should be the default...
I'd rather say that OpenGL's "High Quality" corresponds to a slightly more aggressive setting than Direct3D's "High Quality".

Since there's virtually no performance difference at all between those two settings in OpenGL, it's not much to worry about.
A negligible performance difference would be all the more reason to increase the default setting and bring it in line with D3D's Balanced setting :)

I see in 3DMark03's help file that it is "required" you set all driver settings to maximum quality to produce a score regarded as "default". Hence review sites should really be maximising this slider if they're doing this benchmark. But this doesn't apply to other tests like UT2003. And even here at Beyond3D, from the 9600 Pro review:
For the purposes of this review the default driver settings will be used for normal rendering, as this produces standard Trilinear Filtering as represented above. 16x 'Quality' filtering will be used for more detailed game results on the Radeon 9600 PRO.
So that's not in line with 3DMark's "requirements".

I took some screenshots from a simple OpenGL program to illustrate the difference and posted them in this Rage3D thread. You can see there's a marked difference between the texture quality of "Quality" and "High Quality" mode, which led me to believe this is significant for benchmarking purposes.
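Something along these lines is all it takes - a minimal sketch, not the exact program: it draws a checkerboard-textured plane receding into the distance with trilinear filtering requested, so the mip transitions - and whatever the driver's quality setting does to them - are easy to spot. Assumes GLUT and legacy OpenGL:

#include <GL/glut.h>

#define TEX_SIZE 256

static GLuint tex;

static void init(void)
{
    static GLubyte pixels[TEX_SIZE][TEX_SIZE][3];
    int x, y;

    /* 8x8-texel checkerboard: hard edges make the mip blending easy to judge */
    for (y = 0; y < TEX_SIZE; ++y)
        for (x = 0; x < TEX_SIZE; ++x) {
            GLubyte c = (((x / 8) + (y / 8)) & 1) ? 255 : 0;
            pixels[y][x][0] = pixels[y][x][1] = pixels[y][x][2] = c;
        }

    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    /* Request full trilinear filtering; what you actually get depends on the
       driver's quality / texture preference settings */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, TEX_SIZE, TEX_SIZE,
                      GL_RGB, GL_UNSIGNED_BYTE, pixels);

    glEnable(GL_TEXTURE_2D);
    glClearColor(0.2f, 0.2f, 0.3f, 1.0f);
}

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0);
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    /* A ground plane stretching towards the horizon; the mip transitions in
       the distance are where "Quality" and "High Quality" differ */
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f,  0.0f);  glVertex3f(-10.0f, -1.0f,  -1.0f);
        glTexCoord2f(32.0f, 0.0f);  glVertex3f( 10.0f, -1.0f,  -1.0f);
        glTexCoord2f(32.0f, 64.0f); glVertex3f( 10.0f, -1.0f, -80.0f);
        glTexCoord2f(0.0f,  64.0f); glVertex3f(-10.0f, -1.0f, -80.0f);
    glEnd();

    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutInitWindowSize(800, 600);
    glutCreateWindow("Texture filtering quality test");
    init();
    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}

Run it once with the slider on "Quality" and once on "High Quality" and compare the banding on the distant part of the plane.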

Doesn't anyone else find it funny that no-one considers these sliders when giving results?
 
Myrmecophagavir said:
I see in 3DMark03's help file that it is "required" you set all driver settings to maximum quality to produce a score regarded as "default". Hence review sites should really be maximising this slider if they're doing this benchmark. But this doesn't apply to other tests like UT2003. And even here at Beyond3D, from the 9600 Pro review:
For the purposes of this review the default driver settings will be used for normal rendering, as this produces standard Trilinear Filtering as represented above. 16x 'Quality' filtering will be used for more detailed game results on the Radeon 9600 PRO.
So that's not in line with 3DMark's "requirements".
Considering that 3D Mark 2003 is a D3D app, what difference does it make what settings are being used for OpenGL?!
 