HardOCP and Doom 3 benchmarks

I'm not complaining one way or another regarding the differences in speed. What I'm stating is that I think the best comparison between two separate cards in a game is to run them on the "standard" path at the application level.

ARB2 is an OpenGL standard, is it not? I don't think you can compare ATI's path to Nvidia's path because frankly we don't know that they were optimized evenly, or if that's even possible. However, if you code directly to ARB2, then it's up to the hardware manufacturers to adhere to the standards set forth by the OpenGL Architecture Review Board. Now correct me if I'm wrong here, but is that not what a review is supposed to do?
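
For what it's worth, "coding to ARB2" versus a vendor path basically comes down to which fragment-program extension the engine asks the driver for at startup. Here's a minimal, hypothetical sketch of that kind of check (the extension names are real OpenGL ones, but the function names, the forceARB2 switch and the selection logic are my own illustration, not id's actual code):

/* Hypothetical sketch, not id's code: picking a render path based on which
   fragment-program extensions the driver advertises. Assumes a GL context
   is already current. */
#include <string.h>
#include <GL/gl.h>

typedef enum { PATH_ARB, PATH_ARB2, PATH_NV30, PATH_R200 } renderPath_t;

static int HasExtension(const char *name) {
    /* naive substring check against the driver's extension string */
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts && strstr(exts, name) != NULL;
}

renderPath_t SelectRenderPath(int forceARB2) {
    if (forceARB2 && HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;               /* reviewer forcing the standard path */
    if (HasExtension("GL_NV_fragment_program"))
        return PATH_NV30;               /* NV3x default: vendor-specific path */
    if (HasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;               /* R300-class default: the ARB2 standard path */
    if (HasExtension("GL_ATI_fragment_shader"))
        return PATH_R200;               /* older R200-class path */
    return PATH_ARB;                    /* basic fallback */
}

The point being: the ARB2 branch is the only one every vendor is obliged to support, which is exactly why I'd want reviews to stress it.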

I read this forum and see people bitch about how it's so difficult to compare apples-to-apples image quality when it comes to AA and AF, due to the names and whatnot. What's the difference here in comparing an apples-to-apples rendering path? If you want to show the differences between the Nvidia and the ATI paths specifically, then I think that should be a separate portion of the review.

But the bulk of the reviews, imo, should be run on the standard path. Unless reviewers stress the standard path, do you honestly think card manufacturers will see a need to support those standards if they think that developers will support their proprietary extensions?

I don't buy a card to play one game. I buy a card that will support every game that I want to purchase, and run them without needing uber optimizations for them. Frankly, not every developer will necessarily do that.

I don't see what the problem is with asking for adherence to standards, or why the rolleyes and the attitudes about it. I think it's a fair statement and expectation.
 
I believe it's a fair expectation to want ARB2 comparisons in a final review. However, this is just a Doom3 preview, and I believe comparing the paths used by default for each architecture is more important in such a preview.

Also, Carmack traditionally was quite good at making the IQ of his paths comparable IIRC. So, for now, I say we should give him the benefit of the doubt and say that he maybe managed to use FX12 correctly and get practically identical IQ.


Uttar
 
Ahem, what I find mildly surprising is that the benchmarks (Tom's Hardware) showed a very clear advantage to the NV30 over the R300/R350.

1024x768, medium quality (fps):
NV30: 81.1
NV35: 83.0
R350: 68.0
R300: 60.8

http://www.tomshardware.com/graphic/20030512/geforce_fx_5900-11.html

This is not in line with John Carmack's previous performance estimate:

Jan 29, 2003
------------
NV30 vs R300, current developments, etc

At the moment, the NV30 is slightly faster on most scenes in Doom than the R300, but I can still find some scenes where the R300 pulls a little bit ahead.

This was with R300 defaulting to using the ARB2 path, while the NV30 used the vendor-specific NV30 path.

I would wait for John Carmack to update his .plan.
 
All that says to me is that Carmack's internal rendering precision for the engine maybe doesn't exceed FX12...;) Whoopee....;) Says not a whole lot, in other words.

Frankly, I think running pre-release demos of software when such demos are not publicly available for inspection, and the game itself is months away and won't be shipping until both companies have released or will be about to release new hardware--is, well, bunk...;) Both the hardware and software are going to change significantly in that time frame--and so it is much ado about nothing, IMO.

Personally speaking, HL2 weighs much more heavily for me than DoomIII at this point in time--although I will probably get both--when they ship, that is.

Isn't it great--we get NV35 previews along with Doom III previews...? Will be nice when both are shipping and we can drop the "pre".
 
Uttar said:
I believe it's a fair expectation to want ARB2 comparisons in a final review. However, this is just a Doom3 preview, and I believe comparing the paths used by default for each architecture is more important in such a preview.

Also, Carmack traditionally was quite good at making the IQ of his paths comparable IIRC. So, for now, I say we should give him the benefit of the doubt and say that he maybe managed to use FX12 correctly and get practically identical IQ.


Uttar

I thought the R3xx defaults to the ARB2 path? Am I mistaken in this belief?
 
One issue I noted with Anandtech's inclusion of Carmack's comments, and with the Doom 3 comparisons as I recall them: no discussion of the R200/NV30/ARB paths was provided for context of the results (equivalency was implied at Anandtech, at least, and I recall no details comparing the ARB paths on the new NV35 and the R350).

Carmack's commentary gives the opportunity for a clear picture, but the benchmark graphs give: 1) an accurate picture, for "gamers only", of the situation right now, and 2) insufficient information about the applicability of that picture for the relatively uninformed (compared to this board in general, for example).

Doom 3 obsessed gamers are more concerned with 1, so I think that HardOCP's separation of that and emphasis on it in a dedicated piece is less of a problem (and prone to natural updates as Doom 3 approaches). In both cases, further information and analysis based on Carmack's prior .plan comments would have been very suitable.

In the case of the HardOCP Doom 3 coverage, that could have been handled with just one or two additional sentences, but there seem to have been issues preventing them from testing ARB2 path performance to briefly provide such context and address 2) above, so it seems fair given the limits they specified.
 
I just want to know: what is the purpose of the test? Is it to determine expected Doom 3 performance for end-users?
Ask yourselves this question.
Do you think this test accurately portrays the performance of the final build of the game with available drivers at the time the game is released?
Here is my estimation:
For ATI, there is a 99% chance that this testing procedure will not reflect final performance and could mislead purchase decisions. Publishing it shows little editorial wisdom. Then taking into account the circumstances of the testing, i.e. Nvidia initiated it, throws the integrity of the whole process out of whack.
There is a 75% chance that the tests give an accurate result for Nvidia cards (surprise, surprise), and those results are most likely not misleading. I think it is okay to publish them.
If [H] threw four more graphics chip manufacturers into the mix for the testing, and everyone's drivers were broken except Nvidia's, and Nvidia instigated the testing, I think there would be a different view of the whole article.
 
It seems funny to me how, at the moment, the community is very anti-nVidia. If this D3 benchmark had been done in the 9700 or 9800 reviews, I doubt there would have been nearly the same uproar. It's a sad reflection on the community that they can't just look at something and appreciate it for what it is. All it is, is a simple comparison of how things stand right now. Right Now. No one is saying that the ATi offerings are going to run at 10fps in the final version, and hell - no one is saying the NV35 is going to run at 100fps in the final version. Stop looking for conspiracies in every single thing.

"Oh god, nVidia got a D3 benchmark, JC is being bought off by nVidia, omg what a sell out."

"NVIDIA WON A BENCHMARK, DAMN CHEATERS, STUPID REVIEWERS, NO DOUBT BEING PAID OFF, I HATE xy SITE"


Seriously, get a god damn grip. This isn't necessarily only relevant to this thread, but to the GPU forums in general. People refuse to believe anything anymore, and it truly makes me sad. But then again, OBVIOUSLY, nVidia is cheating at everything, and ATi is cheating at nothing, and they are in fact the saviours of the industry.

Sorry for the rant, but damn - It gets on my nerves.
 
Natoma said:
I thought the R3xx defaults to the ARB2 path? Am I mistaken in this belief?

IIRC Carmack identified an "R200" vendor path but stated that it wasn't much if any faster than ARB2 for R300.
 
PaulS said:
It seems funny to me how, at the moment, the community is very anti-nVidia. If this D3 benchmark had been done in the 9700 or 9800 reviews, I doubt there would have been nearly the same uproar. It's a sad reflection on the community that they can't just look at something and appreciate it for what it is. All it is, is a simple comparison of how things stand right now. Right Now. No one is saying that the ATi offerings are going to run at 10fps in the final version, and hell - no one is saying the NV35 is going to run at 100fps in the final version. Stop looking for conspiracies in every single thing.

"Oh god, nVidia got a D3 benchmark, JC is being bought off by nVidia, omg what a sell out."

"NVIDIA WON A BENCHMARK, DAMN CHEATERS, STUPID REVIEWERS, NO DOUBT BEING PAID OFF, I HATE xy SITE"


Seriously, get a god damn grip. This isn't necessarily only relevant to this thread, but to the GPU forums in general. People refuse to believe anything anymore, and it truly makes me sad. But then again, OBVIOUSLY, nVidia is cheating at everything, and ATi is cheating at nothing, and they are in fact the saviours of the industry.

Sorry for the rant, but damn - It gets on my nerves.

I can't speak for the "community," however I can speak for myself. Personally I am tired of the reviews that bench one card in its vendor-specific path, and another card in the "standard" path, then declare at the end the vendor-specific card to be the victor.

Or the reviews last year and earlier that would compare 8x AF to 16x AF and declare the 8x AF to be the winner even though the 16x AF was only slightly slower, but had vastly superior image quality.

That's my opinion on the matter. I take an interest in this because I'm waiting until the winter to build a new machine. Either around clawhammer or prescott, and either an NV35/NV40 or R390(??)/R400. Whatever it is. But I want competent reviews so that I can make an informed purchase.

That is where I stand.
 
Natoma said:
<snip>
I read this forum and see people bitch about how it's so difficult to compare apples-to-apples image quality when it comes to AA and AF, due to the names and whatnot. What's the difference here in comparing an apples-to-apples rendering path? If you want to show the differences between the Nvidia and the ATI paths specifically, then I think that should be a separate portion of the review.
You've reminded me of my own "bitching" about video card shootouts.

If you want to compare "rendering paths", then you have a point (I'm not going to argue with you that ARB2 on Doom3 is slower on a NV3x than a R3x0). But if you are to view and judge video cards based on what each can offer, through their own "ways", as a final result for games, then you don't have a point.

Which "point" is more important in your opinion? Are you going to take "Doom3"'s (note my putting Doom3 in quotes!) performance as evidenced by these NV35/R350-256mb reviews as representative of all games?

These are previews of Doom3's state as-is running on the different cards+drivers. Don't read too much more into this than it is.

But the bulk of the reviews, imo, should be run on the standard path.
No, they should use what each card is best run on, unless there are huge IQ sacrifices involved. And even if there are huge differences (which would run contrary to what Carmack told me), reviewers should simply report it and not "equalize".
Unless reviewers stress the standard path, do you honestly think card manufacturers will see a need to support those standards if they think that developers will support their proprietary extensions?
So you're saying reviewers determine the way IHVs make their products?

I'm so honored because I thought it was the developers, not the reviewers.

:)

I don't buy a card to play one game. I buy a card that will support every game that I want to purchase, and run them without needing uber optimizations for them. Frankly, not every developer will necessarily do that.
Huh? You don't have anything to do with the optimizations, only the developer of the game you may buy does!

If you buy a card to play all the games that you buy, why do you care if a developer bitches about the amount of work he has to put in? You should only care about your card and the games you buy... and if the developers put in the amount of work Carmack puts in, well, I'm lost... why do you care???

I don't see what the problem is with asking for adherence to standards, or why the rolleyes and the attitudes about it. I think it's a fair statement and expectation.
Again, what is the "standard"?

If your love is playing games and you don't want this "mess", get a console and enjoy.

PS. Sorry if I sound aggressive.
 
PaulS said:
It seems funny to me how, at the moment, the community is very anti-nVidia. If this D3 benchmark had been done in the 9700 or 9800 reviews, I doubt there would have been nearly the same uproar. It's a sad reflection on the community that they can't just look at something and appreciate it for what it is. All it is, is a simple comparison of how things stand right now. Right Now. No one is saying that the ATi offerings are going to run at 10fps in the final version, and hell - no one is saying the NV35 is going to run at 100fps in the final version. Stop looking for conspiracies in every single thing.

"Oh god, nVidia got a D3 benchmark, JC is being bought off by nVidia, omg what a sell out."

"NVIDIA WON A BENCHMARK, DAMN CHEATERS, STUPID REVIEWERS, NO DOUBT BEING PAID OFF, I HATE xy SITE"


Seriously, get a god damn grip. This isn't necessarily only relevant to this thread, but to the GPU forums in general. People refuse to believe anything anymore, and it truly makes me sad. But then again, OBVIOUSLY, nVidia is cheating at everything, and ATi is cheating at nothing, and they are in fact the saviours of the industry.

Sorry for the rant, but damn - It gets on my nerves.

I don't remember anyone saying Nvidia cheated in this thread. If they did, please point it out.
No one is saying that the ATi offerings are going to run at 10fps in the final version
Then what is the purpose for a NEWS site to post results when the results have no tangible accuracy for the readers? Especially when the test initiator is an interested party. I want better standards in journalism, not less. Stop trying to turn this into an ATI vs. Nvidia thing. It's not. It's a PR vs. NEWS dept thing.

edited for grammar
 
Regardless of the disclaimers involved, many people are going to accept the fps graphs at face value, and that's most unfortunate.

* [H]'s own review of the 5900/9800-256 showed what a difference the 256MB of ram can make at hi-res with AA, etc.

* ATI states quite clearly that drivers prior to Cat3.4 only recognize 128MB of ram, thus potentially crippling their 256MB card.

* The Cat3.4 drivers seem to have a major issue with Doom ]I[ at the moment.

This leaves ATI's card with no way to perform up to the level it should be at for this test. But rather than wait until a fair comparison can be made, the results are shown anyway. I only hope that id allows a new benchmark run after ATI is given a chance to fix the issue in their drivers (which they can't really be faulted much for, since the game isn't actually done yet).
 
Reverend said:
But the bulk of the reviews, imo, should be run on the standard path.
No, they should use what each card is best run on, unless there are huge IQ sacrifices involved.
IMHO: or unless there are perceptible (less than huge) differences. And to be fair to the gamer/buyer/consumer we should have a look at the IQ differences :)

Eventually I may end up buying an nvidia card but I like to know what I am buying :)

Many people will buy new cards without knowing what they're getting.
 
Frankly, I think running pre-release demos of software when such demos are not publicly available for inspection, and the game itself is months away and won't be shipping until both companies have released or will be about to release new hardware--is, well, bunk... Both the hardware and software are going to change significantly in that time frame--and so it is much ado about nothing, IMO.

Carmack specifically mentioned that he thought that this demo would be indicative of final performance. I don't think he would have said that if he didn't think it was true.

Now, drivers and new hardware are another matter, though, and I agree. We shouldn't make any final judgements on this as it is. But speculation is fun, and we seem to have a full-fledged war at the moment, with ATI set to release a new card within a couple of months, so I'm happy that they released the Doom3 benchmarks :)
 
I know a lot of you are interested in the IQ differences between the different paths but please, please bear in mind what JC had to say in my interview with him. We all like "apples-to-apples" comparisons but I think we should just calm down and be a little bit realistic. I mean, if you want "apples-to-apples" for the NV35 and R3x0 using the ARB2 path, you'll never get it anyway!! (ARB2 = fp32 on NV3x, fp24 on R3x0).

PS. Don't jump on me and say "Rev.. the NVIDIOT". I'm just pleading for you guys to be a bit more realistic. Like I said before, shootouts the way they are done now don't really tell the whole picture!
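
Just to put rough numbers on how far apart those precisions actually sit, here's a back-of-the-envelope sketch (the mantissa widths, 10 bits for fp16, 16 for ATI's fp24 and 23 for fp32, are my understanding of the formats, so treat them as assumptions rather than gospel):

/* Toy comparison of per-component precision for fp16 / fp24 / fp32.
   Prints one ULP at 1.0 for each assumed mantissa width. */
#include <stdio.h>
#include <math.h>

static double ulp_at_one(int mantissa_bits) {
    return pow(2.0, -mantissa_bits);
}

int main(void) {
    printf("fp16 (10-bit mantissa): step %g near 1.0\n", ulp_at_one(10));
    printf("fp24 (16-bit mantissa): step %g near 1.0\n", ulp_at_one(16));
    printf("fp32 (23-bit mantissa): step %g near 1.0\n", ulp_at_one(23));
    /* An 8-bit framebuffer step is 1/255 ~= 0.0039, so even fp16 is within a
       factor of four of that, and fp24 vs fp32 is far below it. */
    return 0;
}

Error accumulates over a long shader, of course, so this isn't the whole story, but it gives a feel for why fp24 vs fp32 output can look identical once it lands in an 8-bit framebuffer.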
 
I guess this part is the important one:

What about the difference between NV30+NV30-path and R300+R200-path in terms of performance and quality?

Very close. The quality differences on the ARB2 path are really not all that significant, most people won't be able to tell the difference without having it pointed out to them.

But the interesting thing is still: what does "most people won't..." mean?

Do we have to look at magnified screenshots to see that a pixel here or there has the "wrong" color, or is it something that might not be easy to notice at first but is easy once you know about it, or ...?
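
For what it's worth, here's a toy example of the FX12 case: quantizing a made-up diffuse * lightcolour * attenuation chain to 12-bit fixed point and then down to an 8-bit framebuffer value (the input numbers and the 1/1024-step quantizer are my own assumptions for illustration, not Doom3's actual shader math):

/* Toy illustration: does FX12 intermediate precision change the final 8-bit pixel? */
#include <stdio.h>
#include <math.h>

static float quantize_fx12(float x) {
    /* assumed 12-bit fixed point, step 1/1024 */
    return floorf(x * 1024.0f + 0.5f) / 1024.0f;
}

static unsigned char to_8bit(float x) {
    /* clamp and round to what ends up in the framebuffer */
    if (x < 0.0f) x = 0.0f;
    if (x > 1.0f) x = 1.0f;
    return (unsigned char)(x * 255.0f + 0.5f);
}

int main(void) {
    /* made-up diffuse * lightcolour * attenuation chain */
    float full = 0.3137f * 0.8421f * 0.9063f;

    float fx12 = quantize_fx12(0.3137f) * quantize_fx12(0.8421f);
    fx12 = quantize_fx12(fx12) * quantize_fx12(0.9063f);
    fx12 = quantize_fx12(fx12);

    printf("full float : %f -> %d\n", full, to_8bit(full));
    printf("FX12 chain : %f -> %d\n", fx12, to_8bit(fx12));
    return 0;
}

In this particular chain both land on 61 out of 255, i.e. the identical pixel; with longer shaders or values pushed through specular math the error grows, which is presumably where the "look closely and you can spot it" cases come from.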
 