ET article confirms ATi/Nvidia cheats

John Reynolds

http://www.extremetech.com/article2/0,3973,1105259,00.asp

According to an nVidia official with whom we spoke after Futuremark's disclosure, the company continues to maintain that these "discoveries" are just driver bugs that nVidia will work to resolve. The nVidia official reiterated nVidia's preferred emphasis on real-world games, since that's why people buy 3D hardware in the first place. This official also complained about driver developers wasting time chasing down 3DMark03 bugs instead of making games run better. (Of course, it must have taken substantial time to optimize the driver for 3Dmark2003 as well, as it appears the company has done.)
 
Interesting, they claim that ATI is 'replacing shader code', while ATI states it is just re-ordering.

The quote above is expected; the Red Phone has been ringing off the hook at Nvidia. Call in the Damage Control Team.
 
Doomtrooper said:
Interesting, they claim that ATI is 'replacing shader code', while ATI states it is just re-ordering.

They are doing both. Taking the original shader, reordering it, and replacing the original shader. Although I can see how some might think 'replacing shader code' means using different functions, etc.
 
They found a more noticeable change for the 9800 in game tests 2 & 3 than I did, though I tested with different res, AF/AA and CPU. However, I didn't think those tests were considered suspect for ATI at the moment?

Doomtrooper said:
Interesting, they claim that ATI is 'replacing shader code', while ATI states it is just re-ordering.
I don't think ATI intended to imply they were applying a general re-ordering algorithm, since that would still work when FM changed the code trivially. Rather, they re-jigged the code by hand, then detect FM's exact shader and replace it with their own, as Nvidia did.
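Purely to illustrate the distinction being drawn here (the shader text, table, and function names below are invented, not anyone's actual driver code), an exact-match detect-and-replace scheme can be sketched like this. A general reordering pass would survive a trivial edit by FM; the hash lookup does not:

```python
import hashlib

# Hypothetical driver-side substitution table keyed on an exact hash of the
# incoming shader text. Both shader strings are made up for illustration.
ORIGINAL_SHADER = "texld r0, t0\ntexld r1, t1\nmad r0, r0, r1, c0"
REPLACEMENT_SHADER = "texld r1, t1\ntexld r0, t0\nmad r0, r0, r1, c0"

HAND_TUNED = {
    hashlib.md5(ORIGINAL_SHADER.encode()).hexdigest(): REPLACEMENT_SHADER,
}

def compile_shader(source: str) -> str:
    """Return the shader the driver actually runs."""
    key = hashlib.md5(source.encode()).hexdigest()
    # Exact-match substitution: any trivial edit to the benchmark's shader
    # changes the hash and the replacement no longer fires, unlike a general
    # reordering optimiser, which would still process the modified shader.
    return HAND_TUNED.get(key, source)

print(compile_shader(ORIGINAL_SHADER) is REPLACEMENT_SHADER)   # detection fires
print(compile_shader(ORIGINAL_SHADER + " "))                   # passes through unchanged
```

This is exactly why patch 330's trivial shader reshuffling exposed the behaviour: the detection keys stopped matching.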
 
ET is testing at high resolutions, not the 3DMark default. The net result is that the parts that are pixel-shader optimised will show a greater difference, but the buffer clears will show a lesser one (the more frames being drawn, the more not clearing buffers gains you).
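The scaling argument can be put in rough numbers. A back-of-envelope sketch, with entirely assumed per-frame costs (a fixed 0.5 ms clear, shader work growing with resolution):

```python
# Assumed costs, purely illustrative: the buffer clear takes a roughly fixed
# time per frame, while shader work scales with resolution.
CLEAR_MS = 0.5  # assumed fixed per-frame clear cost

def fps(frame_ms: float) -> float:
    return 1000.0 / frame_ms

for label, shader_ms in [("low res (shader-light)", 10.0),
                         ("high res (shader-heavy)", 60.0)]:
    with_clear = fps(shader_ms + CLEAR_MS)
    without_clear = fps(shader_ms)
    gain = 100.0 * (without_clear - with_clear) / with_clear
    print(f"{label}: skipping the clear gains {gain:.1f}%")
```

The relative gain works out to clear_time / shader_time, so the fixed clear cost matters far more at high frame rates (low res) and nearly vanishes once the frame is dominated by shader work, which is the point being made above.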
 
I thought the preliminary accusation against ATI didn't include buffer clearing, just replacement shaders in GT4 (and I suspect GT1 too)? Then it would scale with resolution & AA/AF, agreed, but not in tests 2 & 3, which is where ET noticed a reasonable difference. I mean, I saw literally no change in 2, and 0.3% in 3.
 
Looks like Extremetech screwed up the 9800 Pro numbers with the 330 patch... I bet he has something like aniso on for 330 and off for 320. My testing here shows that the overall 3DMark03 number does not vary more than 3% between 330 and 320 at any resolution. Not sure how ET got over 10% variance.

Not the first time that ET has screwed up the numbers. The least any website that portrays itself as, and makes money as, a hardware review site should do is make sure the numbers it prints are accurate. I had the impression before that ET was usually a reliable site... looks like they are getting old and jittery.
 
You guys should change the name of this forum to "the bash nvidia forum".

That would seem to match 90% of the posts. Rarely do I see anything 3D tech related here anymore, except posts about how nvidia sux0res.
 
That'd only be a bad thing if NVIDIA didn't suck so much right now.

Oh well, can always hope NV40 will Magically Fix Everything. And that they'll fire their idiot PR department.
 
ET's numbers show a much bigger difference in 9800P scores between v320 and 330 than I'd have thought we'd see, especially in games that were not supposed to have been affected (GT1-3). Strange.
 
I just downloaded the patch and ran some before and after tests. I didn't want to bench all night so I only used 1024x768 and 1600x1200. This is on a 9700pro with cat 3.4 drivers.

Here are my numbers:

1024x768
        320  | 330

Test 1. 95.6 | 95.6
Test 2. 32.9 | 33.0
Test 3. 28.4 | 28.4
Test 4. 30.9 | 28.5

1600x1200
        320  | 330

Test 1. 78.2 | 77.9
Test 2. 16.0 | 16.0
Test 3. 15.2 | 15.2
Test 4. 21.4 | 19.0


Nothing like Extremetech's numbers. The only tests that changed besides the Nature numbers were test 2 at the default res, which went up by 0.1 fps, and game test 1 at 1600x1200, which went down by 0.3 fps. I'd say aside from the Nature scores the other tests seem to be within the proper range.
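For what it's worth, the percent deltas implied by the numbers posted above can be computed directly (a quick sketch using the figures exactly as posted):

```python
# Build 320 vs 330 results as posted: (v320, v330) fps pairs.
results = {
    "1024x768":  {"test 1": (95.6, 95.6), "test 2": (32.9, 33.0),
                  "test 3": (28.4, 28.4), "test 4": (30.9, 28.5)},
    "1600x1200": {"test 1": (78.2, 77.9), "test 2": (16.0, 16.0),
                  "test 3": (15.2, 15.2), "test 4": (21.4, 19.0)},
}

for res, tests in results.items():
    for name, (v320, v330) in tests.items():
        delta = 100.0 * (v330 - v320) / v320
        print(f"{res} {name}: {delta:+.1f}%")
```

That works out to roughly -7.8% and -11.2% for Nature (test 4) at the two resolutions, and no more than 0.4% in any other test, consistent with the reading that only Nature moved.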

Not sure what ET did but their results don't match mine at all.
 
Aye. My results are similar to jjyab's.
ET made some mistakes, I guess.

PS) I just tested again to check the AA/AF situation. But no difference.
 
croc_mak said:
Looks like Extremetech screwed up the 9800 Pro numbers with the 330 patch... I bet he has something like aniso on for 330 and off for 320. My testing here shows that the overall 3DMark03 number does not vary more than 3% between 330 and 320 at any resolution. Not sure how ET got over 10% variance.

Not the first time that ET has screwed up the numbers. The least any website that portrays itself as, and makes money as, a hardware review site should do is make sure the numbers it prints are accurate. I had the impression before that ET was usually a reliable site... looks like they are getting old and jittery.

I've benchmarked a couple of times now and I'm still under a 2% difference (1.8% to be exact) between 320 and 330 with ATi boards.
With my nVidia FX boards the difference is 28%.
 
Evildeus said:
binmaze said:
PS) I just tested again to check the AA/AF situation. But no difference.

Well then I see three possibilities:
- The issue is only on the 9800
- The 3GHz CPU of ET increases the gaps
- ET screwed up

I have 7 9800 boards (including the 256 MB version) at home right now. I can vouch for counting out no. 1 on that list.
I also have a 1700+ clocked to almost 2800+, so I personally tend to believe that no. 2 is unlikely too.

Which leaves us with... ;)
 