My response to the latest HardOCP editorial on benchmarks...

this may have been posted already, but here is some food for thought

the new nvidia drivers make the gffx outperform the 9700 pro in 3dmark03, yet nvidia still hates it

if there were really a conspiracy or something wouldn't you think we would just show the 'New' drivers, showing the GFFX is better, ignoring the whole fact NVIDIA purposely optimized drivers to make it better in 3dmark03 only, and then say w00h00 3dmark03 rox0rs lets all use it :p

Instead, even though the GFFX came out on top, our point is to say look what they can do in 3DMark03, do we really want to use a benchmark any longer that can really be optimized like this but doesn't actually affect any game performance?

We are simply saying no, we aren't going to use it in that way anymore
 
Brent,

And some other food for thought

When you look at SHIPPING PRODUCTS today that compete in similar price brackets:

GeForce4MX vs. Radeon 9000
GeForce4Ti vs. Radeon 9500/Pro

nVidia's cards get SPANKED in 3DMark03. 3DMark shows a BIG 3DMark score difference between the competing products.

Yes, the GeForceFX Ultra currently beats the 9700 Pro. That doesn't help much if all the Ultras that can be bought have already been ordered, or if by the time the Ultra actually does appear, the R350 starts getting reviewed....

Instead, even though the GFFX came out on top, our point is to say look what they can do in 3DMark03, do we really want to use a benchmark any longer that can really be optimized like this but doesn't actually affect any game performance?

Again, have you read my commentary on what 3DMark actually REPRESENTS? It's not all about performance. It's about "the ability to play future games". I see ZERO PROBLEM with using 3DMark in that context.
 
Russ,

The 3X figure was HYPOTHETICAL. In reality the 3DMark score shows the 9700 Pro as "2.5X Better" than the GeForce4Ti.

Do you not expect the 9700 Pro to be roughly 2X the performance of the GeForce4 Ti, when running, say, Doom3? I know I am.

Combine that with the fact that the 9700 will have better image quality, and supports DX9 in general...I'd say that qualifies for a card that's 2.5X "better" in forward looking games.

In any case, "exact numbers" are really reading into things too much. The fact that there is a "step change" in score is what is significant, and telling.

Or do you not think there will even be a "step change" difference between the 9700 Pro and GeForce4 with Doom3?
 
Brent said:
this may have been posted already, but here is some food for thought

the new nvidia drivers make the gffx outperform the 9700 pro in 3dmark03, yet nvidia still hates it

if there were really a conspiracy or something wouldn't you think we would just show the 'New' drivers, showing the GFFX is better, ignoring the whole fact NVIDIA purposely optimized drivers to make it better in 3dmark03 only, and then say w00h00 3dmark03 rox0rs lets all use it :p

Instead, even though the GFFX came out on top, our point is to say look what they can do in 3DMark03, do we really want to use a benchmark any longer that can really be optimized like this but doesn't actually affect any game performance?

We are simply saying no, we aren't going to use it in that way anymore

They were part of the BETA team up until December, Brent; you could download it right off their FTP. If you don't find it ironic that they publicly protest the benchmark they praised a year earlier when they were dominating, then you are smoking something real good.
3DMark2001 is in no way a better benchmark than this version; in fact it's a lot worse in many ways, and [H] used it all the time.

Yes... the new drivers beat the 9700 slightly; that's nice for a card that you can't even buy :rolleyes:
Reports from the Futuremark forums also show rendering errors on the latest drivers (Det 41.09 is the only driver that shows the proper effects), like no explosions or bullets rendered:

[attached screenshot: shot0000.jpg]
 
Joe DeFuria said:
Brent,

And some other food for thought

When you look at SHIPPING PRODUCTS today that compete in similar price brackets:

GeForce4MX vs. Radeon 9000
GeForce4Ti vs. Radeon 9500/Pro

nVidia's cards get SPANKED in 3DMark03. 3DMark shows a BIG 3DMark score difference between the competing products.

Yes, the GeForceFX Ultra currently beats the 9700 Pro. That doesn't help much if all the Ultras that can be bought have already been ordered, or if by the time the Ultra actually does appear, the R350 starts getting reviewed....

Instead, even though the GFFX came out on top, our point is to say look what they can do in 3DMark03, do we really want to use a benchmark any longer that can really be optimized like this but doesn't actually affect any game performance?

Again, have you read my commentary on what 3DMark actually REPRESENTS? It's not all about performance. It's about "the ability to play future games". I see ZERO PROBLEM with using 3DMark in that context.

Ok, that's your opinion, and ours is ours :p

You can continue to use it of course however you like.

But we know where we stand now on the issue, and hopefully others do as well. We'll use the benchmarks we use, games + synthetic benchmarks for specific features, and you use the benchmarks you want to use :D

Here is a question for you now

Would you like to see more benchmarking abilities/utilities included in Games?
 
Find a DX9 game title that makes use of the feature sets being demonstrated in this latest synthetic release and then, sure. But using Epic's DX7-class engine on DX9 hardware isn't testing the proper DX9 feature sets, and there are no DX9 games out there.. So at this time it's not correct to benchmark future hardware on old engines.

That is where [H] is wrong in their assumption: testing recent titles is always the best way, but it certainly isn't testing the advanced features of the cards.
The only title coming soon that could properly test some of the DX9 cards' features would be Doom 3, and even that is more DX8-class than DX9.
 
Ok, that's your opinion, and ours is ours :p

OK, then you'll have to consider me a non-visitor to your site. And I will suggest to others to do the same.

Because in my opinion, you are doing a disservice to the community by IGNORING 3DMark.

You would be doing everyone a service by using 3DMark in the proper context, rather than continuing to assert the 3DMark score serves no purpose at all.

Would you like to see more benchmarking abilities/utilities included in Games?

As I said in my very first post in this thread, yes, "that is a good thing." There's no reason why you can't push for that and use 3DMark at the same time.
 
Joe DeFuria said:
Did you read ANY OF MY POSTS about what the 3DMark score MEANS?! HINT: IT'S NOT SIMPLY ABOUT RELATIVE PERFORMANCE.

BIGGER HINT: Look a few posts up...I said, and I'll add EMPHASIS so that you don't THINK I'M CRAZY:

It's not necessarily 3X faster...but 3X "better." Combination of higher performance and/or better ability to see higher quality visuals.


Well, perhaps you should give me the raison d'être of a benchmark that uses FPS for its purpose.

Yeah, that's it: 3 times "better", but in what sense? How does it calculate this number? Oh yeah, on "better" things. It doesn't mean much to say "better ability to see higher quality visuals". At 1024x768, does the GF Ti 4600 look 2.5 times slower or worse than the 9700 Pro? And how do you separate the difference between performance and the higher ability to produce eye candy?

Sorry, but a bench is about relative comparison; otherwise it is not very useful.

So yes, when I see 2.5 times for a Radeon 9700 Pro over a 4600, or +50% for a Ti 4200 over a 9000, I wonder about the usefulness of the benchmark.

I do not need a benchmark for absolute numbers but for relative numbers, or at least that is how I understand the meaning of a benchmark.
 
Doomtrooper said:
ATI and Nvidia claiming the release cycle will start to widen for video cards is a good thing; six-month product cycles are not realistic IMO. The hardware has advanced so far beyond what developers use, what's the point?

The point is making advanced hardware as widely available as possible. It's true that developers aren't taking advantage of the hardware, but IMO it's mostly because they have so many generations of hardware to write for, and because writing for them is inconvenient. Developers would love to use shaders 2.0 or better for everything, and not have to be bothered trying to do things in shaders 1.1-1.3. Unfortunately, most users still have shaders 1.1-1.3 hardware, or even no shaders. Slowing the product cycle will probably only make this problem worse.
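The fallback burden described here is, in practice, a capability check at load time: every effect must be authored (or approximated) once per shader tier the game supports. A minimal sketch, with purely illustrative effect names and tiers (not from any real engine):

```python
# Hypothetical sketch of shader-path fallback selection. A renderer picks
# the most advanced authored variant the hardware supports, which is why
# developers must write ps_1_1 and fixed-function versions of everything.

EFFECT_PATHS = {
    "water": {"ps_2_0": "water_refract.ps20",
              "ps_1_1": "water_envmap.ps11",
              "fixed":  "water_scroll_tex"},
    "skin":  {"ps_2_0": "skin_sss.ps20",
              "ps_1_1": "skin_diffuse.ps11",
              "fixed":  "skin_vertex_lit"},
}

# Tiers from most to least capable.
TIER_ORDER = ["ps_2_0", "ps_1_1", "fixed"]

def pick_path(effect, max_shader_model):
    """Return the best authored variant the card can run."""
    usable_tiers = TIER_ORDER[TIER_ORDER.index(max_shader_model):]
    variants = EFFECT_PATHS[effect]
    for tier in usable_tiers:
        if tier in variants:
            return variants[tier]
    raise ValueError(f"no variant authored for {effect!r} on this hardware")
```

The table is the cost: slowing the hardware cycle doesn't shrink it, because the old tiers stay in the installed base either way.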
 
Brent,

How come you guys didn't raise this "issue" when Nvidia released the Detonator 40 series that boosted scores across Nvidia products by up to ~1000 points in 3DMark2001 and did NOTHING for gaming performance?

It's ok to drop it (3DMark03) now when nvidia is at a disadvantage? But before, when it closed the gap between the Ti4600 and 9700 Pro, it was OK...

That is not what I call journalistic integrity. If you were going to bring it up, it should have been a long time ago. The timing of this "revelation" is very fishy indeed.
 
I suspect that the overall relative performance of Doom3 will end up being approximately the same relative performance of quake3.

I.e. anything that runs Q3 fast now will still run Doom3 "fast" (comparatively), but will dial down prettiness on older cards like the GF4MX (i.e. GF2-Ultra), and even the GF3. It might not even look much worse than the super whizbang ones.

But not all (or even many) games will be as taxing as Doom3, or require DX9-esque features. While I'd love to see shaders get tons of support (and that's really only going to happen if lots of people buy shader-capable cards, and Futuremark is taking a bold step by giving all shader-incapable cards really crappy "goodness" numbers), the fact of the matter is that in today's games, and most games that come out tomorrow and for the foreseeable future, none will prove the 9500 to be 2x "as good" as a GF4.

I think (personally) the XBOX will be the bar set for games for the next 2 years or so, but an "XBOX"-esque card gets 1/3 the score of a 9700.

I like the idea of pushing technology, but as a benchmark used to direct lay people as to what is good to buy now for the foreseeable future, I don't think that Futuremark is necessarily providing that service. While you won't go wrong buying a 9500 instead of a GF4 or 9000/9100, I think Futuremark is upselling the vast majority of users into buying something they don't need.

By the way, on that graph, where does the 8500 sit?
 
Well, perhaps you should give me the raison d'être of a benchmark that uses FPS for its purpose.

Um....because FPS relates to performance, and performance is ONE ASPECT of the ability to play future games?

Perhaps you should explain to me how much "faster" a 9700 is, over a GeForce4ti, when running PS 2.0 shaders?

Yeah that's it 3 times better, but in what sense? How does it calculate this number?

Read the whitepaper...it's all there in black and white. (And some other colors too.)

Like I said originally... we can always debate "how" scores are weighted, etc. That can be debated forever. But the principle behind the scoring system is sound, and their weighting system and tests certainly seem reasonable to me. (Though even I would have tweaked them a bit differently.)

1) More points are scored for better performance
2) More points are scored also just based on the very fact that a card supports certain DX features.

Or do you think Performance and DX feature support both do not contribute to the ability to play future games?

At 1024x768, does the GF Ti 4600 look 2.5 times slower or worse than the 9700 Pro?

With future games? Who's to say? I'd be willing to bet it will be around twice as slow...
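For what it's worth, the two-point scoring principle described above can be sketched as a toy function. The weights and test names below are hypothetical, not Futuremark's actual formula:

```python
# Toy sketch of a 3DMark-style overall score: each game test contributes
# weight * fps, but a card only earns points on tests its feature set can
# run at all. Weights and test names are made up for illustration.

WEIGHTS = {"gt1_dx7": 7.0, "gt2_dx8": 35.0, "gt3_dx8": 45.0, "gt4_dx9": 40.0}

def overall_score(fps_by_test, supported_tests):
    """Sum weighted fps over the tests the card actually supports."""
    return round(sum(w * fps_by_test.get(t, 0.0)
                     for t, w in WEIGHTS.items() if t in supported_tests))

# A fast DX8 card never scores on the DX9 test, no matter its fps:
dx8_score = overall_score({"gt1_dx7": 150, "gt2_dx8": 30, "gt3_dx8": 25},
                          {"gt1_dx7", "gt2_dx8", "gt3_dx8"})
dx9_score = overall_score({"gt1_dx7": 180, "gt2_dx8": 55, "gt3_dx8": 45,
                           "gt4_dx9": 25},
                          {"gt1_dx7", "gt2_dx8", "gt3_dx8", "gt4_dx9"})
```

This is why feature support alone moves the score: a DX9-capable card earns points on tests that a faster DX8-only card cannot run at all.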
 
How come you guys didn't raise this "issue" when Nvidia released the Detonator 40 series that boosted scores across Nvidia products by up to ~1000 points in 3DMark2001 and did NOTHING for gaming performance?

That's precisely what HardOCP/Kyle have refused to say...All the reasons they outlined were nearly verbatim with those pointed out by nVidia.

I really wish I had my old GeForce3 here, because I would like to do a little investigative benchmarking to see just how _few_ titles got performance increases from that generation's magic driver (whichever version supposedly yielded an X% increase). You can pretty much write it down that, by and large, most titles didn't improve a heck of a lot, while others like Quake3/3DMark saw massive performance increases... and if you were to go back to the websites that promoted those drivers, and/or their own documentation, you would see claims of HUGE performance increases...
 
Joe DeFuria said:
Um....because FPS relates to performance, and performance is ONE ASPECT of the ability to play future games?

Perhaps you should explain to me how much "faster" a 9700 is, over a GeForce4ti, when running PS 2.0 shaders?

Like I said originally... we can always debate "how" scores are weighted, etc. That can be debated forever. But the principle behind the scoring system is sound, and their weighting system and tests certainly seem reasonable to me. (Though even I would have tweaked them a bit differently.)

1) More points are scored for better performance
2) More points are scored also just based on the very fact that a card supports certain DX features.

Or do you think Performance and DX feature support both do not contribute to the ability to play future games?

With future games? Who's to say? I'd be willing to bet it will be around twice as slow...
Well, I think we can agree that we disagree.
Just because it's written down doesn't mean it isn't a drawback. And I see it as one.

Just because in the future (well, when? In 1 year? 2? 5? 10?) we could see a game where the R300 is 2.5 times faster doesn't mean it isn't giving a false average gap today. When that time comes true, I'll remind you of it and we will see.

Concerning the way it's done: well, you can say eye candy; I can say that if you put some tricks in the drivers you could get some beautiful numbers, and then I'll tell you that it's "high performance and/or high ability to produce some eye candy".

I'm sorry, but I don't see this benchmark as useful, since it doesn't give me a relative performance gap in current games at a single resolution. I see that you think otherwise. Then you will put some weight on this particular bench; I'll dismiss it.
 
But not all (or even many) games will even be as taxing as Doom3, or requiring of DX9-esque features.

I agree.

Again, this is not for FutureMark to decide ("how many games will look like this), or try to work into their benchmarks. At least, not IMO.

It is the responsibility of the web reviewer to look at the 3DMark score and put it in context with other benchmarks, etc. Comment on how much the reader should consider the 3DMark score vs. other "current game" benchmarks. (Based on the individual reviewer's vision of "the future", etc.)

We all agree that just throwing up a 3DMark score (or any other single benchmark) is, in a nutshell, "wrong." That being said, if I were to pick ONE, SINGLE benchmark to use in an effort to rank 3D cards for their "goodness"?

I would use 3DMark03. I would also explain what the score means... (and that it does not mean that "Card X is y% faster than Card Y").

By the way, on that graph, where does the 8500 sit?

Dunno...they didn't test one! Mine sits at around 800...but I'm on a PIII-700, and had lots of crap running in the background. ;)
 
Typedef Enum said:
That's precisely what HardOCP/Kyle have refused to say...All the reasons they outlined were nearly verbatim with those pointed out by nVidia.

Agreed. The semantics are too similar for comfort (at least mine). The timing is too suspicious, also. How much time did [H] spend with 3DMark03 before somehow reaching the exact same conclusions as a certain IHV? Wow, quite the coincidence there.

That said, using only games works for this gamer. If Dave Barron reads this he might remember an IM argument he and I had a few years back over his use of 3DMark for B3D reviews. Just use as wide a variety of titles as possible, lest any IHVs be tempted to optimize their drivers for one or two engines. And we all know that won't happen when Doom 3 is released. :rolleyes:
 
RussSchultz said:
I like the idea of pushing technology, but as a benchmark used to direct lay people as to what is good to buy now for the foreseeable future, I don't think that Futuremark is necessarily providing that service.

Who are you calling lay people? People who read hardware sites are IMO sophisticated enough to know that this is just one benchmark. And when it appears, it will appear alongside game scores, which makes it pretty easy to ignore the 3DMark score if all you care about is the here and now (or a couple-of-years-old Quake3 :eek:).
 
Joe and Doom, you guys seem so hellbent on ripping [H] a new asshole, that you're blinding yourself to their message.

Brent said:
Instead, even though the GFFX came out on top, our point is to say look what they can do in 3DMark03, do we really want to use a benchmark any longer that can really be optimized like this but doesn't actually affect any game performance?

We are simply saying no, we aren't going to use it in that way anymore

Doomtrooper, you quoted this same phrase, then went on to bitch about how NVIDIA is altering their drivers to increase their 3DMark score while at the same time not rendering everything in the scene. Please read Brent's words carefully, as that is exactly the kind of thing he's complaining about.

Joe DeFuria said:
Because in my opinion, you are doing the disservice to the community by IGNORING 3DMark.

You would be doing everyone a service by using 3DMark in the proper context, rather than continuing to assert the 3DMark score serves no purpose at all.


I thought that's exactly what they were saying they were going to do...

Brent (taken from HardOCP's 3DMark03 Preview article) said:
In closing, Kyle has informed me that [H]ard|OCP will not be using the overall 3DMark03 score to evaluate video cards.

Kyle (taken from HardOCP's Benchmarking Right article) said:
I am not going to dwell on that anymore except to say that we will not be using the overall scoring data from 3DMark03 in the evaluation of video cards at this time

I see nowhere that they state they will never touch the program, just that they won't be using the final 3DMark score as a valid comparison of video cards as it relates to actual games.
 
Crusher,

Joe and Doom, you guys seem so hellbent on ripping [H] a new asshole, that you're blinding yourself to their message.

No I'm not. I disagree with their message. I KNOW they have said they might use individual tests, etc.

I DISAGREE with their decision to not ALSO use the 3DMARK Score. When have I said anything different? Is their message not that they will ignore the score?

You appear so hell-bent on arguing against me (for whatever reason) that you've blinded yourself to my message.

I thought [ignoring the score is] exactly what they were saying they were going to do...

Ummmm... right. And I DISAGREE WITH THAT.

Have I really not been clear on that point?

This is how I sum up [H]'s message:

* Using the 3DMark "Score" does a disservice to the community. So we won't use it.

This is the message I'd PREFER that they have:

* Using the 3DMark score without proper context does a disservice to the community. Therefore, we will use the score, but not without proper commentary and context with other benchmarks.

Clear?
 