My response to the latest HardOCP editorial on benchmarks...

Lars,

You really think we get the correct answers on the issues if we ask without making the discussion public before?!

Did you read what I wrote?

Would "public discussion" not also occur with an "article" that simply stated nVidia's positions...and NOT YOUR OPINIONS along with it? You seem to think that "public discussion" means not only reporting on the PR nVidia fed you, but ALSO giving your own opinions on 3DMark before you've had the chance to hear "the other viewpoint."

One other web site (the name escapes me at the moment) did exactly that. They basically "published the literature" that nVidia sent them, but held off on giving THEIR thoughts on the matter until they could do proper research FIRST.

About the "why"... just think about where NVIDIA is making their money: Mainstream! This means GF4 Ti 4200 / MX440 (or below). And those cards suddenly look very bad compared to the R8500/9000 in 3DM. Real games can get optimized by NV devs. 3DM cannot!

Right...now WHERE IS THAT stated anywhere in your "column" as a possible motivation for nVidia's all-of-a-sudden flip-flop stance on 3DMark? Don't you think that might be a wee bit important for readers to be clued into?

EDIT (addition):

I mean, your last paragraph starts with "NVIDIA's sudden change of mind is baffling - "

And somehow, the fact that their current lineup looks like CRAP compared to their competitor in 3DMark...never made it into your "list of possibilities" to explain that "baffling" change of mind.
 
Borsti said:
How many DX9 cards have been sold yet?

Including the preordered FXes I'd say we're soon close to 1.5 million.

On the incoming side we've got S3 and yet even more products from nVidia and ATi, as well as SiS etc.

And since when did "Futuremark" mean "Presentmark"?
 
Sure, we are not at DX9 but who cares? How many DX9 cards have been sold yet? Do you really expect many DX9 games in a short time? We have now reached the point where a huge number of DX8 cards are used by gamers. And the Xbox is important because many games will also be launched on the MS console. Apart from some "big" titles, it doesn't make sense to develop a game for PS 1.4 and then port it back to 1.1.

Omg, what kind of crap is that? There is also a significant number of PS 1.4 cards on the market, and a great lineup of lower-cost DX9 parts (9500 NP).
This benchmark is about the Future, like they always have been.
Xbox titles should never affect the progression of a PC title; the developer can easily have fallback options like Futuremark did.
With DX9 HLSL here, that part is even easier now. (NOT Cg, which is a non-standard HLSL.)
IT IS ONLY the fault of Nvidia that they chose not to support the faster, superior Pixel Shader version, nobody else.

I get the sense from your post that you feel if Nvidia, who is very far behind ATI in technology, doesn't preach that PS 1.4 or PS 2.0 is good for PC graphics, then they are automatically correct...wrong.

There are other card companies, you know. Biased drivel has no place in supposedly unbiased reviews, i.e. 'Pretender to the Crown', 'GeForce FX Ultra the new king', with inferior IQ, rendering errors, and a cancelled product :rolleyes:


There are a lot of developers on this forum; ask them (as with the claim in your VERY questionable article that a developer would choose PS 1.1 over 1.4) and see what happens.
 
Here is an example of what I would call a REASONABLE analysis of the whole situation:

http://www.extremetech.com/article2/0,3973,888225,00.asp

I don't agree with everything that is said, and how everything is done, but it is clear that the author has an open mind, sought feedback from non-nVidia sources, and then ran multiple (and relevant) tests and applied some of them to nVidia's claims.

In any case their PLANS for 3DMark are spot on:

ExtremeTech's 3DMark03 Plans: To that end, we will continue to use our current 3.0 version of 3D GameGauge as our primary performance measurement tool, and 3DMark 2003 as a secondary test. As DX8/DX9 games become available, we will evaluate those and add to the mix those high-profile games that are doing interesting shader work.
 
Xbox titles should never affect the progression of a PC title; the developer can easily have fallback options like Futuremark did.

Actually, the Xbox could have a positive impact on PC development of games, as developers could focus on DX8-level hardware and port the title to the Xbox to offset reduced potential sales. DX9? I'd love to see it next week, but then again I'm anxiously awaiting the first real DX8 PC game and have been for a couple of years now ;)
 
Joe, I see your point. But I have an opinion on 3DMark. So why shouldn't I write about it? I _had_ to post my opinion after running the tests with the new NVIDIA driver, regardless of whether NVIDIA's criticism is legitimate or not. They prove that it's easily possible to "cheat" in the benchmark. The main question is: WHAT did they do exactly to get the better results?

The first time my opinion comes in is on page 1. I say that running different cards with different shader code makes comparison almost impossible. Everything else on pages one and two only describes NVIDIA's opinion. On page 3 I show the results of the new driver. The conclusion, for sure, is my opinion.

My criticisms of 3DMark 2003 are:

- no PS1.1 comparison possible
- no polygon count/light tests anymore
- how is it possible to create a "standard" shader that runs well on all cards?
- what about driver "cheats"?
- image quality tests could be better

Lars
 
But I have an opinion on 3DMark. So why shouldn't I write about it?

Because as I said, your opinion cannot possibly be objective due to the fact that you have not heard both sides of the story.

As you admit, you're a journalist and not a "tech guy" with enough knowledge to really ascertain the validity of nVidia's claims. Of course nVidia's claims can appear fully reasonable to you. There is no way this cannot impact your opinion if you don't have someone else, equally as technically inclined as nVidia, giving a different point of view.

They prove that it's easily possible to "cheat" in the benchmark.

No, they proved that the driver can severely impact benchmark scores. Is this not common knowledge? Are you not aware that this occurs on ALL benchmarks, including games, and has since the dawn of time, not just 3DMark?

I say that running different cards with different shader code makes comparison almost impossible.

Then I suppose you won't be benchmarking Doom3 when it comes out? (Which, ironically, is something nVidia suggests you use...) Because that's exactly what it does. In fact, in a VERY SIMILAR way.

GeForce3/4 cards will use OpenGL's equivalent of "Pixel Shader 1.1". The Radeon 8500 will use OpenGL's equivalent of "Pixel Shader 1.4". As a direct consequence, GeForce3/4 cards will be forced to render the scene with multiple passes per light, whereas the Radeon 8500 will be able to do it in a single pass per light.
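To put rough numbers on the difference described above: with per-light multipass rendering, the total number of passes scales with the number of lights times the passes each light needs. Here's a minimal sketch; the pass-per-light counts are illustrative assumptions, not measured figures for any specific card.

```python
# Illustrative sketch: total render passes for a multi-light scene.
# The passes_per_light values below are assumptions for illustration,
# not measured figures for GeForce3/4 or Radeon 8500.

def total_passes(num_lights: int, passes_per_light: int) -> int:
    """Each light contributes its lighting in one or more rendering passes."""
    return num_lights * passes_per_light

# Hypothetical scene with 4 lights:
ps11_total = total_passes(num_lights=4, passes_per_light=3)  # multipass path
ps14_total = total_passes(num_lights=4, passes_per_light=1)  # one pass per light

print(ps11_total, ps14_total)  # prints: 12 4
```

The point the post makes falls out of the arithmetic: every extra pass per light is re-transformed and re-rasterized work, so a single-pass-per-light path has a built-in advantage that grows with the light count.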

My critics on 3D Mark 2003 are:

- no PS1.1 comparison possible

Then RUN 3D MARK 2001, and you can get that if you want.

- no polygon count/light tests anymore

That's really only relevant for a "fixed function T&L" test. Again, use 3DMark2001 if you want such things. 3DMark 01 did not disappear. '03 is forward-looking, hence the emphasis on VERTEX SHADERS, not old-school fixed-function T&L.

- how is it possible to create a "standard" shader that runs well on all cards?

You can't. The best you can do is what some people like Carmack do: write different shader paths, where the RESULTS are essentially the same, yet they take different routes to get there.
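The multi-path approach boils down to capability detection with fallbacks. A sketch of the idea, with hypothetical path names (the version numbers match the shader models discussed in this thread, but the dispatch logic itself is just an illustration):

```python
# Sketch of the "different shader paths, same result" idea: pick the best
# path the hardware supports and fall back otherwise. Path names are
# hypothetical, chosen only to mirror the shader versions in this thread.

def choose_shader_path(max_ps_version: float) -> str:
    """Return the rendering path for a card's highest pixel shader version."""
    if max_ps_version >= 2.0:
        return "ps_2_0 path"
    if max_ps_version >= 1.4:
        return "ps_1_4 single-pass path"
    if max_ps_version >= 1.1:
        return "ps_1_1 multipass fallback"
    return "fixed-function fallback"

print(choose_shader_path(1.4))  # prints: ps_1_4 single-pass path
print(choose_shader_path(1.1))  # prints: ps_1_1 multipass fallback
```

Each path is authored so the final image is essentially identical; only the amount of work per frame differs, which is exactly why benchmarking across paths gets contentious.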

- what about driver "cheats"?

What about them? You'll have to bring that up with nVidia. Why don't you just say "if you're so concerned about cheating, then, uh, don't cheat."

- image quality tests could be better

I know you suggested some synthetic AA test, but beyond that, how do you suggest image quality tests be done?
 
Borsti said:
- no PS1.1 comparison possible
- no polygon count/light tests anymore
- how is it possible to create a "standard" shader that runs well on all cards?
- what about driver "cheats"?
- image quality tests could be better

Lars

You can test PS 1.1 on ATI cards quite easily.

This compare link is from some initial testing I did with PS 1.4 vs PS 1.1 on a 9700Pro. The same test on a 8500 or 9000 would be even more informative. See this thread.
 
I think I'm starting to see the problem of why review sites are having problems with 3DMark03. They see the release of 3DMark03 as a replacement of previous versions. However, I don't see it that way and I'm sure Futuremark doesn't either. 3DMark03 does NOT make 3DMark 2001 obsolete!!

With such a wide DirectX generation gap in current games, 3DMark 2001 should STILL be used for comparing graphics in DirectX 7/8 type games. Now, had developers actually kept up with the pace of DirectX and hardware, then I think we might not be having this problem.

Also, in the past, 3DMark was used and thought of, though wrongly, as a benchmark for testing performance in the type of games used today. I think it's finally come to the point where people are just now starting to realize that was never its intention. They seem to want a synthetic benchmark that tests performance in the type of games used today and don't realize that they still have one in 3DMark 2001.

It's almost as if Futuremark shot themselves in the foot by releasing a DirectX 9 benchmark so early. I mean, most games today don't even use many DirectX 8 features as it is. Now, that doesn't mean I don't like the benchmark. I haven't played with it yet, but I believe from what I've read so far here that it is a fair benchmark, but maybe one that is a little too early and whose intentions are not understood.

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Tommy McClain
 
Nice summary there, Tommy. ;)

3DMark03 does NOT make 3DMark 2001 obsolete!!

Exactly. The RELEASE of 3DMark03 does not make 2001 obsolete. The only thing that makes 2001 obsolete, is when actual games catch up to the technology that 2001 uses. Games right now are just starting to "max out" DX7 technology and are branching into DX8, so we're almost there, but not quite.

Also, in the past, 3DMark was used and thought of, though wrongly, as a benchmark for testing performance in the type of games used today. I think it's finally come to the point where people are just now starting to realize that is not it's intention and never was.

Yup. And instead of admitting that they've been "using it wrong" this whole time, they see the new benchmark as "different and wrong" instead.

That being said, I do feel that some of the blame lies with FutureMark, because I don't think the benchmark is publicized well enough. They could be more evangelical and proactive about the correct usage of their benchmark.

It's almost as if Futuremark shot themselves in the foot by releasing a DirectX 9 benchmark so early.

That's an interesting way to look at it, and you may be right...
 
Borsti said:
My criticisms of 3DMark 2003 are:

- no PS1.1 comparison possible
- no polygon count/light tests anymore
- how is it possible to create a "standard" shader that runs well on all cards?
- what about driver "cheats"?
- image quality tests could be better

Lars

1) I don't see what you want to achieve with this? You will favor one IHV over another if you force a specific PS level.
2) No, because fixed function T&L isn't that interesting anymore. (at least not fixed function L)
3) You could use HLSL (or Cg!) but Futuremark apparently didn't.
4) Yes, what about them? If a driver can do something faster without IQ degrading/changing, I don't see the point.
5) Dunno. Probably.
 
Joe DeFuria said:
Nice summary there, Tommy. ;)

Thanks Joe. :)

Joe DeFuria said:
Exactly. The RELEASE of 3DMark03 does not make 2001 obsolete. The only thing that makes 2001 obsolete, is when actual games catch up to the technology that 2001 uses. Games right now are just starting to "max out" DX7 technology and are branching into DX8, so we're almost there, but not quite.

Agreed. I wish reviewers would post results from both 3DMark03 and 3DMark 2001. Yes, it's a lot of work, but it would show how a card does in both today's games and future games. It would definitely be interesting to see the comparisons between the two. However, I think the reviewers are too lazy to do the extra work. I've always said, the more data points the better.

Joe DeFuria said:
Yup. And instead of admitting that they've been "using it wrong" this whole time, they see the new benchmark as "different and wrong" instead.

Agreed. Sad isn't it?

Joe DeFuria said:
That being said, I do feel that some of the blame lies with FutureMark, because I don't think the benchmark is publicized correctly enough. They could be more evangelical and proactive in the correct usage of their benchmark.

True. Especially ever since ZDLabs quit updating 3DWinbench. Now that they are the only synthetic 3D benchmark in town, they're starting to get soft. ;)

Joe DeFuria said:
It's almost as if Futuremark shot themselves in the foot by releasing a DirectX 9 benchmark so early.

That's an interesting way to look at it, and you may be right...

Hey, I'm all about looking at things in interesting ways. LOL :)

Tommy McClain
 
Lars, it's a pity you argue more eloquently and elaborate more clearly in your first response in this thread than you did in your column, which will reach (and confuse) a far larger audience.

Borsti said:
Real games can get optimized by NV devs. 3DM cannot!
You said yourself in your "column" that IHVs can optimize for certain applications (games/benchmarks) without the application knowing.

The issue is not to test who can multiply X matrices faster, the issue is who can produce X image faster. 3D game coding is, as everyone and their mother has said, the art of cheating without getting caught--creating the illusion you want with as little processing done as possible. So if PS1.4 can produce the same image as PS1.1/1.3 in less time (with less work), what's the problem? It's nV's fault they don't support it, not ATi's fault that they provide their customers with better hardware in that respect.

And please don't trot out the "more ppl have PS1.1 h/w" argument, as if nV is doing everyone a favor by denouncing 3DM03. nV is holding the industry back by selling all their PS0.0 GF4MX's. They're a generation behind ATi in two of three markets, low and mid-end, and merely overall comparable in the high end. Don't use the Xbox as a minimum spec example, as nV's lineup bottoms out much lower.

Borsti said:
But I have an opinion on 3DMark. So why shouldn't I write about it? I _had_ to post my opinion after running the tests with the new NVIDIA driver. Regardless of whether NVIDIA's criticism is legitimate or not.
This is just irresponsible journalism on your part, if I can even call it that. I realize I shouldn't have any expectations for reporting I don't pay for, but for a site that reaches as large an audience as THG does, you have a greater responsibility to make sure you present a fair picture.

Borsti said:
They prove that it´s easily possible to "cheat" in the benchmark. The main question is: WHAT did they do exactly to get the better results?
While that question may be interesting to the dev-heads here, the only issue you should concern yourself with should be, "Do the new drivers provide that dramatically-improved performance with the same IQ?" I believe your responsibility was to show that nV's complaints were irrelevant. Their lack of developmental input (for only the final few months, you should add) didn't prevent them from releasing drivers that hugely improved performance pretty much simultaneously with 3DM03's release.

Basically, rather than parroting nV's company line (which is, like any publicly traded company's, "More Profit"), you should have investigated the issue further. If nV produced equivalent IQ with their new drivers, then consider it proof that nV has good driver optimizers, something to be commended. Futuremark is not beholden to nVidia, and they should not be faulted if they fail to portray nV in the best light.

ATi has the most future-proof hardware ATM, and 3DM03, as a benchmark of future performance, shows it. So what's the problem all of a sudden?

Edit: On review, my post seems too harsh. Lars, I apologize if I come off as unduly critical. Your column left me wanting more in certain parts, and I don't fully agree with some of your conclusions, but you're entitled to your opinion. I'll leave my post as is, so ppl can see how not to overreact. ;)
 
Lars,

Love this ending quote from your article:
"...it's difficult to imagine that actual "neutral" code would be useful for performance evaluation. "

:)

Anyway, the sad thing here is that whatever the facts are, after this mess is over the situation is so muddled that the consumer is left with thoughts of "hmm, 3DMark03, I've heard something odd about it, I better ignore it".

Futuremark loses. And in the long term, NVIDIA will lose too.
 
It's a sad, sad day when the most accurate--err, better make that least inaccurate--editorial/article on this whole 3DMark03 "controversy" comes from The Inquirer. Nothing new or insightful to read there, but it's just such a nice change of pace to read something on the subject which is not directly...uninsightful (what's the opposite of "insightful"? "Ass-backwards", perhaps); something that actually expresses doubts about the complete accuracy of Nvidia's desperate mudslinging attack on a third-party benchmark, rather than sententiously paraphrasing (usually incorrectly) its conclusions with the authority and EE expertise that can only come from a long career of rebooting, changing the FSB clock in the BIOS, and typing "timedemo 1 <enter>, demo demo001 <enter>" a few hundred thousand times.

Actually that's not quite true; there was something new: this fine quotation, attributed to one Tony Tomasi, senior director of desktop product marketing for Nvidia:

Tony Tomasi said:
Being a beta partner at all required us to pay money to Mad Onion and that seemed really wrong. Don't get me wrong - that's a tough business. I totally appreciate the quandary that MadOnion is in… From our perspective, it didn't make sense for Nvidia to give them money for something we didn't believe in.
:oops: :oops: :oops:

Does Nvidia think no one else knows that every respected industry-standard benchmark is beta-tested and steered to a significant degree by a consortium of paying IHVs and ISVs?? How do they think SPEC and TPC are run??? And to act as if it's normal behavior to withdraw and stop paying membership fees because you don't like the results your product gets??!?! :oops: :oops: :!: :?: :rolleyes:

Of course, as no one has pointed out how utterly inane and transcendently unprofessional this is, I guess Nvidia is right.

Unbelievable.
 
Ichneumon said:
Borsti said:
- no PS1.1 comparison possible
- no polygon count/light tests anymore
- how is it possible to create a "standard" shader that runs well on all cards?
- what about driver "cheats"?
- image quality tests could be better

Lars

You can test PS 1.1 on ATI cards quite easily.

This compare link is from some initial testing I did with PS 1.4 vs PS 1.1 on a 9700Pro. The same test on a 8500 or 9000 would be even more informative. See this thread.

If someone tells me the registry switch then I'll test it out on my 8500 :) (I'm on the 3.0a drivers, btw)
 
nm, I installed Rage3D Tweak.

Troll's Lair dies on the 8500 with PS1.1; it freezes after the loading screen, and after rebooting I get the serious error message plus one saying the display adapter couldn't finish a drawing operation.

Athlon XP 12.5x141, 512MB DDR333, 8500 at 310/310.
With PS1.4 and VS1.1:

GT2: 7.3fps
Ragtroll: 5.0fps

PS1.1 and VS1.1:

GT2: 6.2fps
Ragtroll: 4.7fps

So there is a speed increase, but it's not a huge one.
PS1.4 performance seems to have gone up in recent drivers, though. In the 3DMark2001 APS test I used to get ~75fps with PS1.1 and ~85fps with PS1.4; I just got 102fps with PS1.4, with 1.1 still at 75fps.
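For what it's worth, the relative gains in the numbers quoted above work out as follows (a quick back-of-the-envelope calculation, using only the fps figures reported in this post):

```python
# Relative PS 1.4 vs PS 1.1 gains from the fps figures quoted above.

def speedup_pct(fast_fps: float, slow_fps: float) -> float:
    """Percentage improvement of fast_fps over slow_fps."""
    return (fast_fps / slow_fps - 1.0) * 100.0

print(round(speedup_pct(7.3, 6.2), 1))     # GT2: prints 17.7
print(round(speedup_pct(5.0, 4.7), 1))     # Ragtroll: prints 6.4
print(round(speedup_pct(102.0, 75.0), 1))  # 3DMark2001 APS test: prints 36.0
```

So "not huge" is fair for the 3DMark03 game tests (roughly 6-18%), while the synthetic APS test shows a much larger 36% gap.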
 
I wonder what the point of discrediting 3DMark03 really is if you have (as Nvidia says) a whole suite of DX9-capable cards ranging from $99 to $399 ready to go.

Surely the more software you have to show them off, the better.

The technology push is on, so why not join it lads (Nvidia), on terms other than your own.

Nvidia's DX9 range is struggling to get on stage because it's taking too much time to put on way too much makeup.

Meanwhile the 3DMark03 show has been very entertaining so far, with applause for the ATi cast members, while the Nvidia diva sits backstage throwing a tantrum.

Sometimes less is more.
 