Aquamark3 Preview at 3DGPU

L233 said:
They do it because they don't really have a choice. It's either being like 50% slower when running DX9 shaders, or less image quality. Since most reviews don't look at image quality, it's a no-brainer. If I were Nvidia, I'd do the same: mask your own weakness as well as possible, create a PR smoke screen, and hope most customers won't look too closely.

I mean... what do you expect? Nvidia issuing a press statement like "Ok guys, our hardware isn't really competitive in new DX9 games. Sorry for that. You'd be better off just buying an ATI board."?

I really have to disagree. nVidia has had plenty of choices over the last couple of years, just as ATi has. ATi has changed itself seemingly from top to bottom out of a recognition that it needed to do so. nVidia is obviously going to have to face up to the same thing, IMO.

Masking your own weaknesses and hoping your customers don't notice is fine--provided they don't notice. But lots and lots of things have been noticed about nVidia's nV3x products this year and widely circulated--so much so that nVidia withdrew nv30 and officially declared it a failure.

Now, if that's not catalyst enough to make them turn introspective to ferret out what's being done wrong--then nothing is, in my view. The whole point is that their former strategy of masking weaknesses and overcompensating through PR isn't working very well for them currently. Yet they seem almost oblivious to it and stuck in old patterns that once worked, but are no longer effective.

A further catalyst for change at nVidia is the recognition that it is facing intense competition from a determined competitor which has already proven it can produce and market superior products. nVidia ought to be abuzz with internal change right now, to the degree that it becomes obvious in nVidia's approach to the markets. But what's happening?

Response to R3x0: Just wait for our next driver set

Response to losing xBox contract: Ho-hummm...Yawn...

I mean, somebody there seems asleep at the switch to me...
 
Dave,

There's a distinction between, say, a new product release, in which case you have to accept what you are given (new/beta drivers), and utter craparama drivers whose sole mission in life is to dumb things down in order to give the perception that performance has increased by a substantial amount.

I feel that at this point in time, pretty much everybody is just sick and tired of these games that a certain outfit has made a ridiculous habit out of playing, and it's time for people to finally say, "Enough is enough. This will not be tolerated any longer."

I firmly believe that this could work if the framework for such a declaration could be agreed upon by the major players.
 
g__day said:
BTW Dave, when you say shader replacement, are you referring to a driver-level shader alteration by NVidia, not Aquamark 3 itself downgrading a PS 2.0 shader to a PS 1.4 or lower shader, which Ingo Frick pointed out is built in?


There is no need to lower the shaders in Aquamark3. According to c't, Aquamark3 has the following shaders inside:

Code:
30 PS 1.1  Shaders
 5 PS 1.4  Shaders
 4 PS 2.0  Shaders

Even TombRaider has 14 PS2.0 shaders. Aquamark3 is nearly a pure DX8 / DX8.1 benchmark with only 4 PS2.0 shaders, yet it gets called a DX9 benchmark (like Gunmetal). WHAT a JOKE !!!


Link (in German): http://www.heise.de/newsticker/data/uma-13.09.03-000/
 
Look at this statement made by Massive about AQ3, which confirms my suspicions about the built-in *optimizations* they were including after their little meeting with Nvidia several months ago.

Nevertheless, although we try to create the same screen content with every technique, we face minor differences which arise from the internal accuracy, which is lower when we select a multi-pass technique instead of a multi-texture technique. The fallback mechanisms are optimized as heavily as possible, so we can ensure that the way to achieve the defined result is a near-optimal way for all ps/vs versions. For that reason, the AM3 score is comparable, because the only fact that counts is the user's benefit (ignoring image quality losses, which are negligible in AM3).
IMO it is simply ridiculous to suggest that you can implement *maximum optimizations* for specific vendors who need fallback routines... and still claim that the results are comparable, unless the image is completely and totally unaffected.

Case in point: how can one card be allowed to run HQ mode against another card's HQ mode when one of the two, by internal design, is never actually running the same level of testing? I'm not talking about FP16/FP24 at all, which I personally think is an acceptable comparison. I'm talking about one card getting to use VASTLY lower precision and specially optimized code, like shaders dynamically recompiled into PS 1.4 routines vs. full PS 2.0.
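To make the precision point concrete, here's a rough sketch (purely illustrative; the file and entry-point names are invented, and this is just the standard D3DX compiler, not anything from AM3 or the drivers) of the same HLSL function built for the two profiles:

Code:
// Purely illustrative: build one HLSL entry point ("main" in a
// hypothetical file "lighting.hlsl") for two shader profiles with
// the standard D3DX compiler.
#include <d3dx9.h>

void CompileBothPaths()
{
    ID3DXBuffer *ps20 = NULL, *ps14 = NULL;

    // Full DX9 path: floating-point intermediates throughout.
    D3DXCompileShaderFromFile("lighting.hlsl", NULL, NULL, "main",
                              "ps_2_0", 0, &ps20, NULL, NULL);

    // DX8.1 fallback path: intermediates limited to the narrow
    // low-precision register range PS 1.4 hardware provides.
    D3DXCompileShaderFromFile("lighting.hlsl", NULL, NULL, "main",
                              "ps_1_4", 0, &ps14, NULL, NULL);
}

Run both against the same scene and the outputs can legitimately differ, which is exactly why calling the two scores comparable is a stretch.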

I just don't think it works out to legit results at all.
 
mboeller said:
Even TombRaider has 14 PS2.0 shaders. Aquamark3 is nearly a pure DX8 / DX8.1 benchmark with only 4 PS2.0 shaders, yet it gets called a DX9 benchmark (like Gunmetal). WHAT a JOKE !!!

Link (in German): http://www.heise.de/newsticker/data/uma-13.09.03-000/

c't makes a valid point, however: why would any game use PS2.0 just to use PS2.0? If the effect you're looking for can be done with previous shader versions, doesn't it make more sense to do that? It's faster, and many more cards can run those shaders natively.

Aside from that, they point out that the 4 PS2.0 shaders affect no less than 30% of all pixels in this benchmark, so they still have a substantial impact on the overall benchmark.
 
-edit- picture removed because 3dgpu stopped allowing linking to pics directly
 
How can one card be allowed to run HQ mode against another card's HQ mode when one of the two, by internal design, is never actually running the same level of testing? I'm not talking about FP16/FP24 at all, which I personally think is an acceptable comparison. I'm talking about one card getting to use VASTLY lower precision and specially optimized code.

Interesting, now this reminds me of 3dfx a few years back and how I argued the same point. The entire 16/22-bit vs. 24-bit color arguments could apply to this argument now. I once argued that the 3dfx cards (mainly the Voodoo 3) had a vastly lower color precision and had an advantage from specifically optimized code (in certain games, of course). Also, didn't the 3dfx drivers take textures in a higher quality and convert and do something to them before they were output to the screen?

I remember so many people saying they couldn't tell the difference between color modes, or that the difference was so small it didn't matter. Still, how fair was it back then to compare quality/performance from the highest quality on 3dfx to the highest quality on nVidia cards when they were doing totally different things? IMO it wasn't fair back then, and it's not fair now.

It's never been an apples-to-apples comparison when it comes to video hardware, so why start now? :(
 
Typedef Enum said:
This is what I think: I believe that representatives from all the major hardware review sites should get together and draft a document that essentially states that they will no longer accept this kind of nonsense in their previews/reviews. In other words, as long as you continue to kick out this kind of crap, we will _refuse_ to use those drivers when talking about numbers.

If all the major sites out there agree to this basic concept, simply refuse to use these drivers, and forward the declaration to nVidia, I think nVidia will actually think very hard about providing some sort of checkbox in their drivers that allows the end user to enable/disable the optimizations so _they_ can choose how they want to play the game.

I have been thinking about this for a long time. I truly believe that nVidia would actually cave if the Anandtechs/HARDOCP's/Toms/B3D's of the world simply flat out refused to use these drivers.

I totally agree, well said.
 
Qroach said:
It's never been an apples-to-apples comparison when it comes to video hardware, so why start now? :(

This is not a matter of just not being apples-to-apples.

It's a matter of not being apples-to-apples... while one IHV is trying to convince others that it is.
 
Typedef Enum said:
Dave,

There's a distinction between, say, a new product release, in which case you have to accept what you are given (new/beta drivers), and utter craparama drivers whose sole mission in life is to dumb things down in order to give the perception that performance has increased by a substantial amount.

I feel that at this point in time, pretty much everybody is just sick and tired of these games that a certain outfit has made a ridiculous habit out of playing, and it's time for people to finally say, "Enough is enough. This will not be tolerated any longer."

I firmly believe that this could work if the framework for such a declaration could be agreed upon by the major players.

I'll third that. In fact, I wrote this earlier as a reaction to the screen grab doctoring revelation:

Me said:
IMO, it's coming to the point where the hardware review press needs to get together and take a stand. If Valve (or anyone else) were to positively identify such things with proof, web review sites should just REFUSE to give nVidia ANY press. No reviews... act like they don't exist.

nVidia is basically making hardware reviewers' jobs nearly impossible. So the attitude should be: "If we can't trust your drivers, we won't touch your card. We can't spend countless hours trying to figure out how you're trying to cheat. That's not our job, nor should it be... so we won't bother."
 
Welcome to the Dawn of cinematic computing!!!!

Where the most important thing is how long your product's benchmark bar is, and image quality is meaningless.

It seriously looks as if Nvidia takes ATI's benchmark scores and then lowers its own IQ until its card is slightly faster.

Absolutely sickening....
 
Three things

1) Dave, can you please clarify, as I asked before, what you meant by shader substitutions: are you referring to Massive substituting shaders when they see a card is lacking a DX9 featureset component, are you implying NVidia's drivers are doing something naughty, or are your concerns related to both practices?

2) EliteBastards had a bit to say about it: "Hey NVidia, where did the Dawn of Cinematic Rendering go?" http://www.elitebastards.com/page.p...3ee4e3b8e7f4724a140fe15dbd3&head=1&comments=1

3) I have asked Massive to comment on the concerns raised here, as their MD previously stated that shaders are assigned solely by the DX9 featureset detected, not by performance. It's tricky reconciling what Alexander (their MD) has said with what Ingo Frick (their Chief Technician) posted on guru3d, and working out how we therefore compare apples vs. apples, if that is what we are doing!

http://arc.aquamark3.com/forum/showthread.php?s=&postid=1888#post1888
 
Florin said:
c't makes a valid point, however: why would any game use PS2.0 just to use PS2.0? If the effect you're looking for can be done with previous shader versions, doesn't it make more sense to do that? It's faster, and many more cards can run those shaders natively.

Aside from that, they point out that the 4 PS2.0 shaders affect no less than 30% of all pixels in this benchmark, so they still have a substantial impact on the overall benchmark.

OTOH, if the purpose of a benchmark is to test particular hardware feature support and try to characterize it, like, for instance, the power and efficiency of the hardware's ps2.0 shaders, FSAA, AF, etc., I think it's entirely proper to do so. I would assume that there are some things ps2.0 does better than 1.4--otherwise, what would be the point of the feature in the first place?

As far as a game goes, if the hardware supports all the shaders it claims to support, why should the developer care which shader version he uses? The decision as to which one is "better" is entirely up to the developer and would very much be determined by what the developer was trying to do. In short, while there's no reason to exclusively use 2.0, there's no reason not to, either--it's a matter of developer preference, IMO, isn't it?

I can't see how it'd be a problem for a developer to support a DX9 code path as well as a DX8.1 or lower code path in his software, and assign the DX9 path to the hardware which can handle it while assigning the DX8 code path to the hardware which does a poor job with DX9 but a decent job with DX8. Seems that should work with a minimum of fuss, something like the sketch below.
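A minimal sketch of that kind of featureset-based path assignment, assuming a plain D3D9 device (the enum and function names are invented for illustration):

Code:
// Rough sketch, not any shipping engine's code: choose a shader code
// path from the capabilities the hardware itself reports.
#include <d3d9.h>

enum ShaderPath { PATH_PS_1_1, PATH_PS_1_4, PATH_PS_2_0 };

ShaderPath SelectShaderPath(IDirect3DDevice9 *device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    // PixelShaderVersion encodes the highest pixel shader version
    // the driver claims the hardware supports.
    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_PS_2_0;   // full DX9 path
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
        return PATH_PS_1_4;   // DX8.1 path
    return PATH_PS_1_1;       // baseline DX8 path
}

Note the assignment is driven purely by the reported featureset, not by how fast the card happens to be, which is exactly the distinction Massive's MD drew.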
 
DaveBaumann said:
I've wanted to do a similar thing for a long time, but the issue comes with preview reference hardware - nobody would agree to waiting for official WHQL'ed drivers before reviewing. Unfortunately, though, it's these previews that can give the largest misrepresentation of a product, as they are conducted with limited time and it's very difficult to catch anything.

That's good to hear, Dave. While all these 'previews' are welcome, they aren't taken with the grain of salt I believe they should be. Beta drivers, pre-release benches, and/or ES/reference silicon aren't indicative of what the end user will experience. Seems many ppl have forgotten that & are more interested in website hits than in supplying the public they claim to be doing this for with 'end user' results. Det 50s that are only in the hands of 'previewers', pfft! What does that mean to me, the end user? Nothing.

Take your time, B3D, & give us the real goods when the truth is known on end-user drivers/hardware. 8)

.02,
 
mboeller said:
g__day said:
BTW Dave, when you say shader replacement, are you referring to a driver-level shader alteration by NVidia, not Aquamark 3 itself downgrading a PS 2.0 shader to a PS 1.4 or lower shader, which Ingo Frick pointed out is built in?


There is no need to lower the shaders in Aquamark3. According to c't, Aquamark3 has the following shaders inside:

Code:
30 PS 1.1  Shaders
 5 PS 1.4  Shaders
 4 PS 2.0  Shaders

Even TombRaider has 14 PS2.0 shaders. Aquamark3 is nearly a pure DX8 / DX8.1 benchmark with only 4 PS2.0 shaders, yet it gets called a DX9 benchmark (like Gunmetal). WHAT a JOKE !!!


Link (in German): http://www.heise.de/newsticker/data/uma-13.09.03-000/


Based on your link, so far all I can say is: Aquamark 3 IS NOT DX9 benchmark software. It's DX8.0, maybe DX8.1 - but that's all.

I think that pretty much sums up everything regarding GFFX 'comparable' scores...
 
Joe DeFuria said:
IMO, it's coming to the point where the hardware review press needs to get together and take a stand. If Valve (or anyone else) were to positively identify such things with proof, web review sites should just REFUSE to give nVidia ANY press. No reviews... act like they don't exist.

nVidia is basically making hardware reviewers' jobs nearly impossible. So the attitude should be: "If we can't trust your drivers, we won't touch your card. We can't spend countless hours trying to figure out how you're trying to cheat. That's not our job, nor should it be... so we won't bother."

Well said, though we probably aren’t going to go as far as denying coverage altogether (that’s silly, really). Still, it’s no secret that it’s becoming more and more difficult to make fair comparisons with so many different factors (like app detection) becoming an issue when these things obviously shouldn’t be an issue in the first place (well, in a perfect world). Kyle at the [H] won't be hard to convince about an industry-wide video benchmarking standards practice of some kind; he's very open-minded (despite what some here think :)). I'm sure Dave is probably more than willing to agree on some type of standard, despite the fact that I've never had the pleasure of meeting him (though Anand has, I believe). THG may be another issue altogether; we’ll see. :)

Take care,

Evan
 