more evidence that anandtech is in nvidia's pocket

Derek Wilson - Anandtech Comments said:
We could only use a couple benchmarks, and the couple we chose are standardish (UT2K3), based on very common engines (JKJA), or one of the few available (Halo having PS2.0 support). These were not the games with huge performance gaps between them (like Tomb Raider or Tron). Also, since we were including 5700 and 9600 parts, we wanted to stick with the standard-but-lowish 1024x768 resolution rather than bump up a 1280 flavor.
Why is there an aversion to using a DX9 benchmark that shows something other than parity?
 
Beyond the gaming experience, we will be looking at fan noise and heat dissipation as well as which memory modules the vendor decided to use. Some of the cards are even pre-overclocked for our gaming pleasure.

I find it rather interesting that they've gone through all this trouble and still didn't mention anything about image quality in the whole article (unless I'm missing something).
 
Bjorn said:
Beyond the gaming experience, we will be looking at fan noise and heat dissipation as well as which memory modules the vendor decided to use. Some of the cards are even pre-overclocked for our gaming pleasure.

I find it rather interesting that they've gone through all this trouble and still didn't mention anything about image quality in the whole article (unless I'm missing something).
Sayeth Mr. "TR:AOD benchmark isn't a good indicator of nVidia's performance"... :rolleyes:
 
Derek Wilson - Anandtech Comments said:
We could only use a couple benchmarks, and the couple we chose are standardish (UT2K3), based on very common engines (JKJA), or one of the few available (Halo having PS2.0 support). These were not the games with huge performance gaps between them (like Tomb Raider or Tron).


Wow...what a novel idea! Let's do our best to highlight similarities in cards, not differences. This simplifies our decision making immensely!! "Eh...just pick any card....they all perform about the same!"

:rolleyes:

Also, since we were including 5700 and 9600 parts, we wanted to stick with the standard-but-lowish 1024x768 resolution rather than bump up a 1280 flavor.

Now we're talking!!

Let's round up all these graphics cards, and run all these tests....such that we don't actually test the GPUs!! Pure Genius!

I don't think I've ever witnessed a more asinine testing methodology...
 
Meh, they went through all that trouble overclocking and they didn't even include an HIS 9800 with IceQ. Given how unbelievably well the HIS 9600XT performed, what was the holdup? And don't tell me AT lacks the funds to purchase these cards themselves.
 
cthellis42 said:
Oooh, recap! Recap! It's fun. You might have spotted one that I missed. ;)

lol, fair enough. Actually, others have pointed out most of my gripes, but I suppose I should add to the list. First and foremost, he goes through the trouble of displaying core temps but never bothers to mention the very pertinent fact that nVidia underclocks and undervolts their cards when running in 2D mode, where all the temperature readings were taken. Also, he speaks about his overclocking tests as if the results for the individual cards he tested are directly indicative of the products as a whole.

add it all up and the article is nothing more than a travesty of a mockery of a sham of a mockery of a travesty of two mockeries of a sham. :cry:
 
digitalwanderer said:
Entropy said:
Quitch said:
It's the way [H] was last year, without the wild mood swings :)
Brent, to my knowledge, never produced anything as bad as that.
*COUGH-COUGH*"Brilinear is just as good as trilinear" POS/BS article*COUGH-COUGH*
speaking of brilinear/trilinear, anyone know if you can get full trilinear in UT2003 with an ATI card? yup, kinda random, but I need a full trilinear image. NV is making me crazy with the differences between app and CP AF that make no sense.
 
kyleb, can you PLEASE change the spelling of "evedence" to "evidence". It is driving me crazy every time I look at this topic.
 
The Baron said:
digitalwanderer said:
Entropy said:
Quitch said:
It's the way [H] was last year, without the wild mood swings :)
Brent, to my knowledge, never produced anything as bad as that.
*COUGH-COUGH*"Brilinear is just as good as trilinear" POS/BS article*COUGH-COUGH*
speaking of brilinear/trilinear, anyone know if you can get full trilinear in UT2003 with an ATI card? yup, kinda random, but I need a full trilinear image. NV is making me crazy with the differences between app and CP AF that make no sense.

Usually leaving it to the application, if it provides tri and aniso settings, should take care of it. If you're still not getting there, you can force trilinear on all texturing stages with rTool:

http://www.3dcenter.de/downloads/rtool.php
 
The Baron said:
speaking of brilinear/trilinear, anyone know if you can get full trilinear in UT2003 with an ATI card? yup, kinda random, but I need a full trilinear image. NV is making me crazy with the differences between app and CP AF that make no sense.

Yeah, set the ATI control panel to "application preference" and then (IIRC) in your system\ut2003.ini file set "UseTrilinear=True"
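For reference, here's a sketch of the relevant UT2003.ini fragment. The section name is from memory (the Direct3D renderer settings normally live under the D3DDrv device section), so double-check it against your own file:

```ini
; system\UT2003.ini -- Direct3D renderer settings (section name from memory)
[D3DDrv.D3DRenderDevice]
UseTrilinear=True
; With the ATI control panel set to "application preference",
; the game's own filtering settings take effect.
```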
 
Or aTuner. :)

Ailuros said:
The Baron said:
digitalwanderer said:
Entropy said:
Quitch said:
It's the way [H] was last year, without the wild mood swings :)
Brent, to my knowledge, never produced anything as bad as that.
*COUGH-COUGH*"Brilinear is just as good as trilinear" POS/BS article*COUGH-COUGH*
speaking of brilinear/trilinear, anyone know if you can get full trilinear in UT2003 with an ATI card? yup, kinda random, but I need a full trilinear image. NV is making me crazy with the differences between app and CP AF that make no sense.

Usually leaving it to the application, if it provides tri and aniso settings, should take care of it. If you're still not getting there, you can force trilinear on all texturing stages with rTool:

http://www.3dcenter.de/downloads/rtool.php
 
I like this gem in the Final Words...

" The fact that the 9800XT doesn't benefit as much from overclocking is interesting, especially since the 9600XT seems to benefit so much from it."

Duh!!! It wouldn't have anything to do with the fact that the 9600XT's core is manufactured on a low-k 0.13µm process, as opposed to the older 0.15µm process used by the 9800XT, would it?


and this...

"Core processing power is becoming more and more important, and with shader intensive DX9 games on their way, enthusiasts are going to want more and more power from their graphics cards."

So maybe it would be nice to actually test those DX9 functions somehow?
Halo, JK, and UT are sort of poor examples to use to show that future potential.

Anand's article's educational value surely sucks, but the entertainment factor is off the scale. What a joke.
 