HardOCP and Doom 3 benchmarks

I would hope that if B3D ever uses DOOM III for benchmarking it will force a common path. This is a hardware site, is it not? ;)
 
Lezmaka said:
Natoma said:
No. What I'm saying is that games should be benched in the standard paths for the engine. So if the standard path is ARB2 for DOOM3, then I think that the reviews should be benched for ARB2.

Who determines which path is the standard path for a game? The reviewer? The developer? You?

Or by standard, do you mean one that uses no proprietary extensions? Then how would you go about comparing cards like the GF4MX, GF4Ti and Radeon 9700? Would you even be able to use ARB2, since there are things the GF4MX doesn't support that the other cards do?

I would assume the developer would determine what the standard path is. For instance, the R300 runs the ARB2 path, and from what I understand, ARB2 extensions are standard OGL2 extensions, meaning that any OGL2 card should be able to run them just fine.

It's the same as if DOOM3 were a DX9 title: you would expect any card capable of running PS2.0 to run the shaders equally well, without having to rely on any proprietary extensions from the card manufacturers.
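
A minimal sketch of what that startup decision looks like in practice. The extension strings are the real ones the standard and NV-specific paths key off; the helper and the path names are purely illustrative:

    // Pick a rendering path from the extensions the driver advertises.
    // Assumes a current OpenGL context.
    #include <GL/gl.h>
    #include <cstring>

    static bool hasExtension(const char* name) {
        const char* ext = (const char*)glGetString(GL_EXTENSIONS);
        return ext && std::strstr(ext, name) != nullptr;
    }

    const char* pickRenderPath() {
        if (hasExtension("GL_ARB_fragment_program"))
            return "ARB2"; // standard path: any conforming card qualifies
        if (hasExtension("GL_NV_register_combiners"))
            return "NV20"; // vendor-specific fallback
        return "ARB";      // lowest common denominator
    }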
 
Natoma said:
I would assume the developer would determine what the standard path is. For instance, the R300 runs the ARB2 path, and from what I understand, ARB2 extensions are standard OGL2 extensions, meaning that any OGL2 card should be able to run them just fine.

It's the same as if DOOM3 were a DX9 title: you would expect any card capable of running PS2.0 to run the shaders equally well.

And in this case, the developer decided not to have a single standard path, but instead (almost) one per card. Thus we have to compare the different paths on image quality and speed, and then let users decide what they feel is the best option for them.
 
How did this thread get hijacked into an ARB2 vs. NV30 path debate?

I was talking about the newsworthiness of the article and the fairness of even including ATI in the testing. Anyone with half a brain can see what was at play here. Anyone can tell the drivers were borked and that the test results could not impart any more knowledge or wisdom as to the 9800 Pro's Doom 3 performance than is already in the public record. No reviewer to my knowledge came right out and said something like the following:
"We cannot reach any logical conclusion from the 9800 Pro's test results; they have as much value in discerning performance as random numbers between 1 and 500. Furthermore, every software test component is a variable, with the exception of the OS."
 
Well, I was finally able to look at Tom's "Doom3" numbers (site's been inaccessible most of the day for me.)

Here are some things that I would like to have "investigated", but due to the "one shot deal" of these benchmarks, won't get done. This is my major beef with how this was carried out - lack of any follow-up.

At first, we see the ATi cards lead the pack. The FX 5900 Ultra only overtakes the competition at 1600 x 1200. According to NVIDIA, this may be a driver problem. The NVIDIA-optimized anisotropic filtering may have trouble with the anisotropic levels Doom III uses.

DoomIII in "High Quality" mode, which presumably uses DOOM 3 calls for anisotropic filtering, and not forced settings on the control panel, isn't any faster on the 5900 than on the 8900.

Nvidia cites "driver bug". I'm willing to bet that when using the in-game settings, the aniso quality is DIFFERENT than when quality is forced via the control panel. I guess we won't know for a few months?

And then there's AA:

Here's a case where the 5900 seems to dominate the 9800 in Doom. However, a few sites have noticed that, at least in OpenGL, the 5900 appears to be blurring instead of doing straight AA in 4X mode. Is there really 4X AA going on in OpenGL and Doom3? Or is it more like Quincunx, 2X AA with a post filter?
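
That question is checkable in principle: the classic trick is to render a slightly tilted white-on-black edge, screenshot it, and count the distinct intermediate grey levels along a pixel column. True 4x sampling yields up to three intermediate shades (25/50/75%), while 2x-plus-blur yields fewer distinct edge levels but softened texture detail. A sketch of the counting step, assuming you have already extracted one column of 8-bit pixel values:

    #include <set>
    #include <vector>

    // Count the intermediate grey levels along a column crossing the edge.
    // One level suggests 2 samples; three levels suggest a real 4-sample grid.
    int countEdgeLevels(const std::vector<unsigned char>& column) {
        std::set<unsigned char> levels;
        for (unsigned char v : column)
            if (v != 0 && v != 255) // keep only the blended shades
                levels.insert(v);
        return (int)levels.size();
    }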

Again, these are things that, now that the Doom3 "benchmark" sessions are over, cannot be checked and verified. I think iD just exercised bad form with the way they handled this.
 
IMHO when looking at hardware performance one should always try to compare like with like. This means: same code paths, same features, same quality (even if that means dropping to the lowest common denominator). When judging game performance and playability, you enable the game defaults, which might differ across hardware (GPUs, CPUs, sound cards), and you check whether the game manages to deliver an enjoyable experience no matter the system (assuming it's at or above the defined minimum specs). The (p)reviews we saw today were hardware, not gaming (they were not allowed to discuss the gameplay or what they saw in terms of "DoomIII the game"), so they should have compared apples with apples and not apples with oranges. We do not know which code path was used, we do not know how "official" and "general purpose" the NVIDIA driver release was, nor do we know exactly what AA methods they enabled (blur versus actual samples etc).

So IMHO these benchmarks, while a nice marketing stunt, have little real meaning until they can be done in a clear and independent way which completely specifies the actual settings and conditions without one hardware vendor hanging around and organising this event.

What we have learned is that both ATI R300 and NV35 will deliver playable framerates in DoomIII, and that's about all we can say.

K-
 
People complaining about the legitimacy of these benchmarks need to get a grip. This is not a horse race. The average gamer who reads these things doesn't want to find out if Ati fanboys have a bigger tool than Nvidia fanboys. They are not anal about meticulously leveling every detail on the playing field. The question on everyone's mind, other than the vocal few on this board, is: which card will give me the best performance on DOOM III and games based on DOOM III? Am I putting down $500 for a card that will have satisfactory performance on the games I'm really looking forward to? Today the question was answered. If the situation changes in the future, I'm sure that Ati will be the first to let you know.

Ati has had years to optimize their hardware and software. Have they not been aware that people have wanted to know about performance on DOOM III? Did they not know that DOOM III is running faster on NV35 and NV30? You think the extra two weeks' notice they would have gotten would have allowed them to pull a functioning driver out of their asses? No, what would have happened is that their marketing department would have started spewing preemptive FUD of the variety that a few on this board are doing now and that all video card companies have done plenty of times in the past.

What happened today was that one video card was shown to be faster than another on an upcoming game. People are still starving in Ethiopia. The troops are still in Iraq. There are still loved ones to care for, attractive women to chase after, et cetera. Get a life.
 
So it is OK for one bench to utilize a specific code path that favors one vendor (3DMark2K3), but not another bench that favors the other vendor?
 
Joe DeFuria said:
DoomIII in "High Quality" mode, which presumably uses DOOM 3 calls for anisotropic filtering, and not forced settings on the control panel, isn't any faster on the 5900 than on the 8900.

Nvidia cites "driver bug". I'm willing to bet that when using the in-game settings, the aniso quality is DIFFERENT than when quality is forced via the control panel. I guess we won't know for a few months?

Xbit Labs has an interesting take on AF with the new drivers. It seems Nvidia is playing around with LOD selection. This is apparently evident only in OGL and includes the Quality mode.
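
For context on what "playing around with LOD selection" would mean: the mip level is derived from each pixel's texel footprint, and a hidden positive bias shifts sampling to blurrier but cheaper levels. This is just the textbook formula, not a claim about Nvidia's actual driver logic:

    #include <algorithm>
    #include <cmath>

    // Standard mip LOD: log2 of the larger screen-space texel footprint,
    // plus whatever bias the driver (visibly or not) applies.
    float mipLevel(float dudx, float dvdx, float dudy, float dvdy, float bias) {
        float rho = std::max(std::sqrt(dudx*dudx + dvdx*dvdx),
                             std::sqrt(dudy*dudy + dvdy*dvdy));
        return std::log2(rho) + bias; // bias > 0 selects blurrier mips sooner
    }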
 
I don't trust any benchmark application that is not publicly available - whether it is Aquamark, Unreal Performance Test or this new DOOM III build. You simply can't verify what the experts at [H], THG and Anand benchmarked, especially in such a time-constrained environment.
Further, do all IHVs have access to these closed benchmark applications, including Trident, SiS, 3D Labs, PowerVR and Matrox? Kinda unfair if not. Either you give it out to everybody or to nobody at all.
Did the editors have enough time to verify that AA and other settings actually work as they should? Why does only THG benchmark the high quality mode, while benchmarking all other games at maximum quality?
Can today's results be representative of release-time performance for [any] game? I don't think so at all. Assuming the game hits the street in Q4'03, average and peak CPU performance will be quite a bit higher than it is right now. Therefore, fillrate- and bandwidth-limited situations will occur more often than today, which means that a card A with a higher CPU load than card B will benefit more from future generations of CPUs.
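
A back-of-envelope illustration of that scaling argument, treating a frame as limited by whichever of the CPU or GPU work takes longer (all numbers invented):

    #include <algorithm>
    #include <cstdio>

    // Simple bottleneck model: frame rate is set by the slower stage.
    double fps(double cpuMs, double gpuMs) {
        return 1000.0 / std::max(cpuMs, gpuMs);
    }

    int main() {
        // Card A: heavier CPU/driver load. Card B: lighter CPU load.
        std::printf("today : A %.0f fps, B %.0f fps\n", fps(20, 12), fps(14, 14));
        // A CPU twice as fast halves only the CPU-side cost...
        std::printf("future: A %.0f fps, B %.0f fps\n", fps(10, 12), fps(7, 14));
        // ...so card A, the more CPU-bound one, gains the most.
    }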
 
When I write an IHV path I do it because of either better image quality or better speed or both. Some write to ARB because of time constraints, or don't see any advantage in using an IHV path. It varies and everyone has different reasons, it seems. But it is not a conspiracy, I don't think. Since today's articles were named 'previews', people should realize that that doesn't mean 'reviews' or 'final'. Drivers always improve, so that is one thing to keep in mind. But I agree that people usually pronounce the winner based on previews anyway. So in that respect it was not fair to ATI for reviewers to be running Doom3 benchies. Then again, you might hear people say that ATI was standing still and it's their own fault for not writing better drivers faster. I think ATI was hard at work and the NV35 release just didn't line up with new Catalyst drivers.
 
BenSkywalker said:
So it is OK for one bench to utilize a specific code path that favors one vendor (3DMark2K3), but not another bench that favors the other vendor?

Which non-standard vendor-specific code path in 3DMark03 are you talking about?

Please don't say PS1.4 - if you do I'll be very disappointed. Not only do nVidia support it but they knew well ahead of time that they would have to support it because it has always been a requirement in DirectX that cards that support higher numbered shaders must also support all lower numbered shaders.
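
In code terms that requirement is just a caps comparison; a minimal DX9-style sketch (D3DCAPS9 and D3DPS_VERSION are the real DirectX names, the helper is mine):

    #include <d3d9.h>

    // A card reporting PS2.0 must, per the API rules, also pass this
    // check for 1.1 through 1.4 - PS1.4 is not vendor-specific.
    bool supportsPS(const D3DCAPS9& caps, UINT major, UINT minor) {
        return caps.PixelShaderVersion >= D3DPS_VERSION(major, minor);
    }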

There is no requirement in OpenGL to support any vendor-specific paths; in fact, if a specific vendor owns an extension you can't support it without their permission, so I really don't see how you're making a relevant argument. The two cases are in no way comparable.
 
BenSkywalker said:
So it is OK for one bench to utilize a specific code path that favors one vendor (3DMark2K3), but not another bench that favors the other vendor?

That depends on the purpose of running the benchmark. If the purpose is to show how cards perform using the same settings, then different paths don't make sense. But if the purpose is to show how a game performs when you play it, then why should it matter if it uses 2 different paths? I seriously doubt the average gamer who has an NV3x would buy Doom 3 and run it using the ARB2 path.

Maybe I didn't read well enough, but I thought the point of this demo was to see how well the cards run Doom 3 as it is today, not how well the cards run Doom 3 without using proprietary extensions.
 
Kristof said:
IMHO when looking at hardware performance one should always try to compare like with like. This means: same code paths, same features, same quality (even if that means dropping to the lowest common denominator). When judging game performance and playability, you enable the game defaults, which might differ across hardware (GPUs, CPUs, sound cards), and you check whether the game manages to deliver an enjoyable experience no matter the system (assuming it's at or above the defined minimum specs). The (p)reviews we saw today were hardware, not gaming (they were not allowed to discuss the gameplay or what they saw in terms of "DoomIII the game"), so they should have compared apples with apples and not apples with oranges. We do not know which code path was used, we do not know how "official" and "general purpose" the NVIDIA driver release was, nor do we know exactly what AA methods they enabled (blur versus actual samples etc).

So IMHO these benchmarks, while a nice marketing stunt, have little real meaning until they can be done in a clear and independent way which completely specifies the actual settings and conditions without one hardware vendor hanging around and organising this event.

What we have learned is that both ATI R300 and NV35 will deliver playable framerates in DoomIII, and that's about all we can say.

K-
If I ever get the chance to do a shootout involving R3x0 and NV3x products, I will run DOOM3 benchmarks on an NV3x and an R3x0 using all the possible rendering paths, while providing screenshots showing possible (and probable) IQ differences without needing to zoom in 4x (or other such high analytical zooms) to see them. I will also run both pieces of hardware using the "common" (=ARB2) path just to try and investigate possible (and probable) differences, whether in terms of IQ or performance.

BUT I'm not going to make a big deal out of the possible differences unless the IQ differences demand it. I will try to simply report the facts as I see them and won't try to have an ego and pronounce which hardware is better. I might say which hardware is my preference based on my experiences, but that will be in bold and stated as a matter of personal opinion, and I will try not to come out sounding like an authoritative figure (even though it's the Internet... "It's on the Internet... it must be true" :) ). Many reviewers nowadays tend to state their opinion as undeniable Fact, when it is nothing more than, well, their opinion. Hey, you think beards are cool, I don't :).

You don't make a piece of hardware for the sake of making it; you make it so that it can run to the best of its abilities in as wide a variety of apps/games as possible, taking whatever advantages can be taken. Hardware reviewers should explore and report all possible hardware-related and gaming-related angles.

The bottom line, however, is that hardware is made for running apps, not for being compared with other (competitive) hardware, and to run those apps well you take advantage of its architecture.

PS. More to say but can't, since I'm still frustrated that B3D went down for a while earlier.

PPS. I also understand you're a hardware guy while I lean more towards what hardware is for.

PPPS. I absolutely agree with your second para, K. We should take all these D3 benchies at face value.
 
I was hoping B3D would have its own preview. Guess you guys haven't made it onto the list?

You are quickly moving there though. :D
 
Please don't say PS1.4 - if you do I'll be very disappointed. Not only do nVidia support it but they knew well ahead of time that they would have to support it because it has always been a requirement in DirectX that cards that support higher numbered shaders must also support all lower numbered shaders.

What does support have to do with using a rendering path that favors one board over another?

Splinter Cell benches are another example. They are becoming increasingly common, but which site has spelled out exactly how the boards differentiate themselves from one another? Why wasn't a major issue made of that? There is a case where there is a fairly enormous difference in rendering quality between ATi and nVidia, but the sites can get away with that.... why? Instead we have people speculating that there will be noticeable differences between the rendering paths in Doom3, and that is being made into a large issue, while we know there are major differences in Splinter Cell, there isn't much commentary from the reviewers, and that is OK. Why aren't people flaming [H], Anand and Tom for ignoring the superior IQ of nV boards running SC? We know that's real; it requires no speculation.

You can check my post history if you would like: I have no issues with sites using 3DMark2K3 or Splinter Cell, and I don't with Doom3 either. If an issue is to be made of Doom3, then the same issue should be made of the others as well.
 
boobs said:
People complaining about the legitimacy of these benchmarks need to get a grip. This is not a horse race. The average gamer who reads these things doesn't want to find out if Ati fanboys have a bigger tool than Nvidia fanboys. They are not anal about meticulously leveling every detail on the playing field.

The argument is that the reviewers should be. I do agree. What happened was that a site with an enthusiasm-based readership prioritized enthusiasm over leveling the playing field. People here don't need to get a grip; you're just visiting a forum where (for many) the priorities are different from that.

The question on everyone's mind, other than the vocal few on this board, is: which card will give me the best performance on DOOM III and games based on DOOM III?

Well, you're trying to say "everyone" like you have the exclusive right to speak for them. :oops:

Am I putting down $500 for a card that will have satisfactory performance on the games I'm really looking forward to? Today the question was answered.

You see, that's exactly the question that was not answered. The only thing that was answered was today's Doom 3 performance on the fastest path for each card; the game itself is months from release.

ATI did not have the involvement nVidia did, nor, quite obviously, an opportunity to prepare a driver set with Doom 3 in mind.

Note that the NV35 has been stated to be unable to run the ARB path in the Doom 3 build tested, though the NV30 could in the past. A problem in Doom 3? Or nVidia precluding an equivalent comparison for Doom 3, engineering away the context I mentioned earlier? It seems evident that the 44.03 drivers and the Doom 3 showing were something nVidia planned in a linked fashion.

Come to think of it, it might be a very significant issue that the Cat 3.4 drivers were used as they were (even given the priorities of HardOCP) to run the ARB path, and that the R200 path wasn't tried. Did they try that method of overcoming the Cat 3.4 issue? I don't recall any mention of it.

If the ARB2 path didn't work on the NV35 and gave low performance with the Cat 3.4 drivers, this introduces the possibility that the R200 path might have performed well, and might have shown significantly better performance with all 256MB of RAM utilized on the 9800 256MB. That possibility opens up an inherent unfairness, as I don't recall HardOCP mentioning that the ARB2 path didn't work for the NV35 (I think I saw that mentioned at Tom's?), which would certainly belong with the observations about the Cat 3.4 drivers running the ARB2 path at 10 fps. It also lends the R200 path a new possibility beyond a minor performance increase with image quality a bit closer to the NV30 custom path: enabling an extra 128 MB of RAM and the driver changes geared towards taking advantage of it.

If the situation changes in the future, I'm sure that Ati will be the first to let you know.

Actually, it should be the job of the reviewers who tested Doom III now to let you know then, as that is the avenue for addressing the inherent imbalance of the testing situation that was presented. Aside from the Cat 3.4 issue that just occurred to me, doing that is what would allow the Doom 3 preview to be an objective data point rather than a misrepresentation. Until this Cat 3.4/R200 question occurred to me, I considered the unfairness a matter of circumstance rather than of HardOCP's approach within those circumstances, but now I'm not so sure.

Ati has had years to optimize their hardware and software.

Hmm...this comment doesn't make sense to me. Are you under the impression that nVidia drivers are perfect, or do you recognize that driver development is an ongoing process? What about Doom 3 development...do you recognize it is still changing?

Have they not been aware that people have wanted to know about performance on DOOM III? Did they not know that DOOM III is running faster on NV35 and NV30? You think the extra two weeks' notice they would have gotten would have allowed them to pull a functioning driver out of their asses?

Well, nVidia arranged this showing of Doom 3; what do you think that says about their focus for the 44.03 drivers used? ATI personnel seem to indicate that they have no current builds for which they could have performed optimizations. In a sense that's good (they seem to focus on games people are actually playing, and game-specific hand tuning was hindered), but it is bad because there are obvious issues they could have addressed in their latest drivers, and nVidia was advantaged by only having to successfully implement the direct expression of their hardware functionality via their own OpenGL extension.

Why are you bringing up "years" when talking about Doom 3, anyway? I tend to think that yes, an extra 2 weeks with a recent build would have made a more than slight difference to Cat 3.4 performance in Doom 3; those drivers don't seem to have been made with Doom 3 in mind at all.

No, what would have happened is that their marketing department would have started spewing preemptive FUD of the variety that a few on this board are doing now and that all video card companies have done plenty of times in the past.

I think you're confusing ATI with nVidia, as there remain some rather distinctive differences in their marketing. I also don't think simply labelling people's reactions as FUD is a very useful way to discuss opposing viewpoints.

What happened today was that one video card was shown to be faster than another on an upcoming game.

Now you have stated it in a way that makes sense to me, and indeed this is why I wasn't condemning HardOCP. Frankly, I think the site's priorities are inferior to those of Beyond3D (EDIT: to be clear, I'm talking about objective hardware analysis, as opposed to non-technical gaming-evaluation exclusives, which rank higher there), but as it isn't Beyond3D, and knowing those priorities, I think separating the Doom 3 comparison from the hardware shootout, along with the disclaimers made, does a good job of providing context for the results within those priorities... except that now I'm thinking there may have been some significant details unacceptably glossed over with regard to the NV35 ARB2 path (I believe it was Tom's who said it didn't work with the new nVidia drivers... is this true?) and the lack of information about trying both the R200 and ARB2 paths on the R350 (a second-hand comment that "the most optimized path was used for each card" is hardly a definitive answer in that regard).

Even aside from that, it doesn't change the fact that those priorities led to a comparison that was unfair, and not even as good as it could have been with a minimum of extra analysis, so of course people are going to complain, especially people who prefer this site to HardOCP for a reason. That's why it is good that not every site is about gamer enthusiasm; some are about hardware-details enthusiasm.

People are still starving in Ethiopia. The troops are still in Iraq. There are still loved ones to care for, attractive women to chase after, et cetera. Get a life.

OK, do you ever consider applying your standards to yourself, or is it only other people who should shut up about their opinions?

Geeze... "You guys are wrong, need to get a grip, and should get a life instead of stating your opinions, but listen to me while I tell you mine." A bit one-sided, don't you think? The "people are starving, so don't complain about anything less" tactic is a bit worn, don't you think? And what are you doing posting here yourself? Or is your entire post intended as sarcasm?
 
BenSkywalker said:
Please don't say PS1.4 - if you do I'll be very disappointed. Not only do nVidia support it but they knew well ahead of time that they would have to support it because it has always been a requirement in DirectX that cards that support higher numbered shaders must also support all lower numbered shaders.

What does support have to do with using a rendering path that favors one board over another?

Because an equivalent argument would be that anything using PS1.1->1.3 favours nVidia cards, since this is their native path. The reason that this is not really much of an argument is that since other vendors know they must support PS1.1->1.3, they can design their hardware to execute these quickly. If instead you knowingly ignore the existence of a standard feature of an API and then complain because you can't execute it as quickly as someone else, you really have no one to blame but yourself.

On the other hand if someone specifically writes a path to a non-standard extension that you do not (or are not allowed to) support then there's really very little that you can do about it.

I am not stating that it is wrong to optimise as far as possible - a developer naturally should try to get the best performance in all cases. I am simply saying that the example that you chose was not particularly relevant.

Splinter Cell benches are another example. They are becoming increasingly common, but which site has spelled out exactly how the boards differentiate themselves from one another? Why wasn't a major issue made of that? There is a case where there is a fairly enormous difference in rendering quality between ATi and nVidia, but the sites can get away with that.... why?

If the sites are benchmarking Splinter Cell correctly, there should be no particular difference in rendering quality. As I understand it, the problem is with the different shadow rendering paths, so the choice of path should be equalised (I think shadow projectors are then the preferred path).

I know that HardOCP at least did this in their review (although they really should have stated this clearly somewhere - they later confirmed it on their message board).

Funnily enough, if I remember correctly, the rendering path used for the 'better' nVidia shadow buffering is actually non-standard. It requires declaring a special depth buffer format as a texture, which wasn't even supported in the reference rasterizer in DX8 (you couldn't create a depth buffer and use it as a texture on refrast). I'm open to correction here if anyone remembers more clearly. If I'm right, it just goes to show that you can sneak non-standard vendor-specific stuff even into DX if you try hard enough...
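
A sketch of the trick being described, as I remember it: on NVIDIA DX8 hardware you could create a depth-stencil surface as a texture and bind it for shadow-buffer lookups. D3DUSAGE_DEPTHSTENCIL and D3DFMT_D24S8 are the real D3D8 names; the rest is illustrative:

    #include <d3d8.h>

    // Works on NVIDIA parts, fails on refrast and most other hardware -
    // which is what makes it a vendor path smuggled through a standard API.
    IDirect3DTexture8* createShadowDepthTexture(IDirect3DDevice8* dev,
                                                UINT w, UINT h) {
        IDirect3DTexture8* tex = 0;
        dev->CreateTexture(w, h, 1, D3DUSAGE_DEPTHSTENCIL,
                           D3DFMT_D24S8, D3DPOOL_DEFAULT, &tex);
        return tex; // null here means the vendor path is unavailable
    }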
 
I think we should wait a bit before passing final judgement here, as you might suspect that nVidia have, at this point, been putting more effort into their drivers with Doom III in mind.

While it's only semi-official, CATALYST maker (from ATI) made this remark at Rage3D:

Anyways.... Doom III.
Interesting little game; even more interesting that reviews would appear on an unreleased game. All I can say at this point is that we have not had that particular benchmark before the review sites started using it. What a shame that people are getting to look at something that we haven't had a chance to play around with a bit.

Anyways, please don't pay any attention to that benchmark. It is on an unfinished product that we have not looked at yet. Wait till it really comes out and we will be ready to rock.

Source:
http://www.rage3d.com/board/showthread.php?s=&threadid=33685071&perpage=20&pagenumber=2
 