Another Site reviewing with 3DMARK and 53.03

DriverHeaven said:
For the benchmark junkies out there, I don't think you can get a better bang-for-the-buck card in 3DMark03, and this really is the Ti4200 of its generation. This is the card that Nvidia, and the graphics industry, have really needed.
DriverHeaven said:
So which should you buy? Well, the 5900XT really seems to be the end for the 5700 Ultra in the mainstream market, so that leaves the 9600XT and the 5900XT. Looking at these two, it really depends on your budget. Whichever you can afford is the honest answer; you won't be disappointed with either.
I could post some more choice quotes from throughout the article, but honestly I don't want to read it again since it was such a piece of crap. Not only does the author not comprehend the whole 3DMark fiasco, he also has no clue as to what should be compared with what, or the architectural differences between the cards.
 
If you read between the lines, all the review really says is that the FX5900XT benches a fair bit better but the 9600XT plays games a tiny bit better and looks nicer.

Funny when you think about how the two architectures were originally marketed.
 
There goes another site down the drain. Illogical conclusions and breaking rules must be the new fad in the hardware review industry.
 
They even mentioned that the 53.03 drivers weren't FM-approved, and then went ahead and used 'em anyway. :rolleyes:

You're screwed, FM, just screwed. :(
 
Considering they're breaking the EULA by doing so, it's hardly their right to. Not to mention the only reason to do so is to make nVidia cards look better than they really are.
 
The only thing that gets my goat is when major hardware sites use drivers the public never sees, year after year. Especially the ones who proclaim they'll never fall for that trick again, and then freakin' fall for it again.
 
It was just funny that he said review sites use whatever driver they want.

Of course we can use whatever driver we want. It is up to no one but the reviewer which driver he uses. Hopefully the latest ones, since they indicate what end users are experiencing in games.
 
Tim said:
Brent said:
{Sniping}Waste said:
Looks like review sites will use whatever driver they want.

How dare they! ;)

Any reason why you think it is OK that reviewers support nVidia’s cheating?

I think it is up to the end user to support or not support those kinds of actions.

I think it is up to the reviewer to simply report everything objectively, with facts, so that the information is out there and the end users can make informed decisions.

And of course the IHVs will also see the reviews and what we have found, and will hopefully take that information and better their products.
 
Quitch said:
Considering they're breaking the EULA by doing so, it's hardly their right to. Not to mention the only reason to do so is to make nVidia cards look better than they really are.

Here is an option: don't use 3DMark03. Oh noes, I said it!

There are plenty of other good synthetic benchmarks out there if you want to use one, like ShaderMark for example; heck, it's got an anti-cheat option.
 
And Brent, Nvidia gets by the anti-cheats in ShaderMark with a driver set, just like in 3DMark, so by your logic you can't use ShaderMark either. If Nvidia would not cheat in the drivers then there would be no problems, so stop supporting the cheats by Nvidia. (You were once a fighter against it, but now you're a Kyle puppet.) With Nvidia putting cheats in the drivers, even the almighty benchmarking with games is not safe either. The drivers will just change the load, no matter what you want the load to be, to gain speed and inflate scores.
FM is one of the handful out there that is fighting this, like ShaderMark with its update patches to stop the cheats in the driver, but Nvidia puts out a new driver that defeats the anti-cheat.

What happened to the Brent we knew a year ago? :(
 
This cheating talk is way out of whack. Sure, it is unjust to make one product do better than another in a synthetic benchmark; however, when playing a game, I can't notice a difference between an FP16 shader and an FP32 shader. Maybe when you take screenshots and observe each of the pixels they look different, but who really cares? Games are about having fun, and honestly, when you start talking about precision it gets ridiculous. Like it or not, FP16 is part of the DX9 spec, and using it to speed up the hardware seems like a very viable option to me, if for no other purpose than to make games run smoother, and therefore more enjoyable.

Certainly, the early stages of nVidia's brilinear were an eyesore because you could see the mip transitions, but now the method has been perfected. At the end of the day, sure, nVidia runs its trilinear mode at lower quality, runs some shaders at lower quality, and has subpar AA implementations, but if you are playing a game, play the game. These things are noteworthy as objections to a new card, but everyone knows about them now. Even though I am a happy ATI owner, I must say the nVidia bashing has gone too far, and while nVidia should stop implementing cheats for synthetic benchmarks, who really cares anymore? :rolleyes:
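
For anyone wondering what brilinear actually does, here's a rough Python sketch of the general idea as it's commonly described (the band width is a made-up parameter for illustration, not nVidia's actual value). Trilinear blends the two nearest mip levels everywhere; brilinear only blends in a narrow band around each transition and otherwise samples a single mip, skipping half the texture fetches.

Code:
def trilinear_weight(lod: float) -> float:
    # Full-width linear blend between mip floor(lod) and floor(lod) + 1.
    return lod - int(lod)

def brilinear_weight(lod: float, band: float = 0.3) -> float:
    # Blend only inside a narrow band around the transition (frac = 0.5);
    # outside it, sample a single mip level and skip half the fetches.
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac <= lo:
        return 0.0              # sharper mip only
    if frac >= hi:
        return 1.0              # coarser mip only
    return (frac - lo) / band   # steep ramp; too narrow a band shows seams

The visible mip transitions people complained about early on correspond to too narrow a band; widening the ramp is how the seams get hidden.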
 
{Sniping}Waste said:
And Brent, Nvidia gets by the anti-cheats in ShaderMark with a driver set, just like in 3DMark, so by your logic you can't use ShaderMark either. If Nvidia would not cheat in the drivers then there would be no problems, so stop supporting the cheats by Nvidia. (You were once a fighter against it, but now you're a Kyle puppet.) With Nvidia putting cheats in the drivers, even the almighty benchmarking with games is not safe either. The drivers will just change the load, no matter what you want the load to be, to gain speed and inflate scores.
FM is one of the handful out there that is fighting this, like ShaderMark with its update patches to stop the cheats in the driver, but Nvidia puts out a new driver that defeats the anti-cheat.

What happened to the Brent we knew a year ago? :(

I still stand by what I said above.

It is the reviewer's job to report everything objectively, with facts, so that the information is out there and the end users can make informed decisions. As long as we are reporting our findings in games as we play them on each card with the drivers, we are giving the end user the info they need to choose which video card they want to buy. That includes any image quality concerns, performance concerns, problems, etc.

I've never not shown something; even if people don't agree with me, the least I can do is put the info out there. For example, the weird lighting I have seen on NVIDIA cards with two different driver sets in NFS: U. I reported that problem in the review. I added to the UT2k3 sections the fact that NVIDIA is doing the lessened trilinear, and included my own personal experiences playing the game at that level of quality. We put our concerns out there about wanting the option for full trilinear in the control panel.

As long as we report on these things that we experience in games, the information is out there, and end users can decide for themselves which video cards to buy.

That is what I've always done and will continue to do. I'm as objective as they come, really. I look at myself like a reporter, reporting facts and my own experiences with the cards so people can read it and go from there.

And also, hopefully, IHVs will read it and see where they can make improvements on their products ;)
 
Nvidia gets by the anti-cheats in ShaderMark with a driver set, just like in 3DMark, so by your logic you can't use ShaderMark either

Are you sure they've got past the anti-detect mode in ShaderMark 2.0? :?

I know they detected and rendered ShaderMark 1.0 useless, but I was under the impression ShaderMark 2.0 still gives valid results. The 9600XT is definitely very competitive with the FX 5950 in it ;)
 
lost said:
This cheating talk is way out of whack. Sure, it is unjust to make one product do better than another in a synthetic benchmark; however, when playing a game, I can't notice a difference between an FP16 shader and an FP32 shader. Maybe when you take screenshots and observe each of the pixels they look different, but who really cares? Games are about having fun, and honestly, when you start talking about precision it gets ridiculous. Like it or not, FP16 is part of the DX9 spec, and using it to speed up the hardware seems like a very viable option to me, if for no other purpose than to make games run smoother, and therefore more enjoyable.

Certainly, the early stages of nVidia's brilinear were an eyesore because you could see the mip transitions, but now the method has been perfected. At the end of the day, sure, nVidia runs its trilinear mode at lower quality, runs some shaders at lower quality, and has subpar AA implementations, but if you are playing a game, play the game. These things are noteworthy as objections to a new card, but everyone knows about them now. Even though I am a happy ATI owner, I must say the nVidia bashing has gone too far, and while nVidia should stop implementing cheats for synthetic benchmarks, who really cares anymore? :rolleyes:

I see nothing wrong with FP16 if FP32 is used where FP16 is too low. The thing is that ATI's R3XX is still faster at pixel shaders than the 5XXX even with FP16. I see your point about games, but not benchmarks. Things have to be as even as possible in a benchmark, and changing the load in the drivers in a benchmark is wrong. Look at 3DMark03 and you will see that in game tests 2 and 3 the FPS is up with the new driver, but they're PS 1.1, so FP16 is not the issue; the drivers change other things to speed it up, like shader replacement of the PS and VS. It's wrong to do this in a benchmark.
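
The precision gap itself is easy to show outside of any game. Here's a minimal sketch in Python using NumPy's float16, which is the same s10e5 format as DX9's FP16 partial precision (the step size is just picked for the demo):

Code:
import numpy as np

# FP16 (10-bit mantissa) resolves ~3 decimal digits; FP32 resolves ~7.
# Near 1.0 the spacing between adjacent FP16 values is 2**-10 ~= 0.000977,
# so any increment under half that rounds away to nothing.
x32 = np.float32(1.0)
x16 = np.float16(1.0)
step = 0.0004  # smaller than half of FP16's spacing at 1.0

for _ in range(1000):
    x32 = np.float32(x32 + step)
    x16 = np.float16(x16 + step)

print(f"FP32: {x32:.6f}")  # about 1.4, as expected
print(f"FP16: {x16:.6f}")  # 1.000000: every single step rounded away

Whether that rounding is ever visible depends on what the shader computes, which is exactly why FP16 is fine in some places and too low in others.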
 
lost said:
This cheating talk is way out of whack. Sure, it is unjust to make one product do better than another in a synthetic benchmark; however, when playing a game, I can't notice a difference between an FP16 shader and an FP32 shader. Maybe when you take screenshots and observe each of the pixels they look different, but who really cares? Games are about having fun, and honestly, when you start talking about precision it gets ridiculous. Like it or not, FP16 is part of the DX9 spec, and using it to speed up the hardware seems like a very viable option to me, if for no other purpose than to make games run smoother, and therefore more enjoyable.

Certainly, the early stages of nVidia's brilinear were an eyesore because you could see the mip transitions, but now the method has been perfected. At the end of the day, sure, nVidia runs its trilinear mode at lower quality, runs some shaders at lower quality, and has subpar AA implementations, but if you are playing a game, play the game. These things are noteworthy as objections to a new card, but everyone knows about them now. Even though I am a happy ATI owner, I must say the nVidia bashing has gone too far, and while nVidia should stop implementing cheats for synthetic benchmarks, who really cares anymore? :rolleyes:

No, Nvidia's cheating, or "optimizations", is an issue, and it's still a big and important one. Their reliance on application-specific code makes any kind of benchmarking almost impossible. We need un-optimized benchmarks if we are to have any hope of extrapolating performance to games other than the ones that are optimized. The idea behind a synthetic test like 3DMark03 is to try to show how the card will perform under that kind of rendering workload, which can then be used for approximate predictions of relative performance in un-tested games.

If Nvidia optimizes for games that are being benchmarked, that's great for the people who play those games, but it doesn't do much good for those of us who play games that aren't often benchmarked. If Nvidia optimizes only for those games, then the performance isn't going to reflect the performance in similar games (even ones using the same rendering engine) that haven't been optimized. If I can see a wide variety of games benchmarked, using different engines, I can get a feeling for roughly how the card would perform in other games, the ones I would play. But if all those games are optimized, it doesn't help me.
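
To put that in concrete terms, here is a deliberately simplified, entirely hypothetical Python sketch of the kind of application-specific dispatch I'm objecting to (the names, table, and placeholder hash are made up, not real driver code):

Code:
import hashlib

# Hypothetical driver-side lookup: recognized benchmark shaders get
# silently swapped for hand-tuned replacements; everything else, i.e.
# the games you actually play, runs the generic (slower) path.
HAND_TUNED = {
    "<md5 of a benchmark's shader>": "cheap hand-tuned replacement",
}

def compile_shader(source: str) -> str:
    digest = hashlib.md5(source.encode()).hexdigest()
    if digest in HAND_TUNED:
        return HAND_TUNED[digest]  # inflated score for this one workload
    return source                  # no speedup for anything unrecognized

A benchmark that hits the table posts inflated numbers, while a game submitting a similar but byte-for-byte different shader gets nothing, which is exactly why the score stops predicting performance in un-tested games.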

I need to see raw data, real data, to make my choices, and Nvidia doesn't want me to have it. That sets off all sorts of warning flags.

In the long run, optimizing for popularly benchmarked games hurts the consumer who doesn't limit their gameplay to those games.
 