HardOCP and Doom 3 benchmarks

Well, I hope what he says is not true, because then it's still an issue, isn't it? Why does Nvidia have more time to play with this "little game"? In the end, won't it be the same result, i.e. Nvidia being favoured?
 
Well, I don't have a problem with an apples-to-oranges comparison as long as I know it is an apples-to-oranges comparison (most people don't know), fully understand what it means, can see both the apples and the oranges (where are the IQ evaluations?) and can see the apples-to-apples too.

Then as a gamer/consumer I will be able to decide what is best for me. For example, Doom III is about a next level in gaming realism, so I want a next level in image quality. Probably the best way to play it is with AA, aniso and high quality image settings at lower resolution.
 
One question I have not seen asked in this thread that is of great importance to the debate on D3 benchmarks: why were the Cat 3.4s completely hosed when running D3? All three groups who had D3 noticed the same thing, abysmal performance with Cat 3.4, and all switched back to the Cat 3.2s (instead of automatically declaring nV the winner, as I am sure nV would have liked). That is a fact. Now the real question becomes: if Cat 3.2 worked well for D3, what did they "break" in Cat 3.4 that killed performance on "one application"?
 
It was quite interesting to see that some of the lower cards were OK with the game, I've been thinking that the R350/nv35 would be the only thing playable at high quality.

One assumes it will get faster rather than slower from this point, so that is even better, though they have probably finished the engine and are now just adding the other bits: visuals, story and a few pops, crackles and booms.

When 3DMark03 came out people kept saying that they would wait for game benchmarks, as those would show what the buyer could actually expect when playing games rather than benching 3DMark. Now people are arguing the complete opposite: use standard paths so it's all equal!

Regards

Andy
 
DadUM said:
One question I have not seen asked in this thread that is of great importance to the debate on D3 benchmarks: why were the Cat 3.4s completely hosed when running D3? All three groups who had D3 noticed the same thing, abysmal performance with Cat 3.4, and all switched back to the Cat 3.2s (instead of automatically declaring nV the winner, as I am sure nV would have liked). That is a fact. Now the real question becomes: if Cat 3.2 worked well for D3, what did they "break" in Cat 3.4 that killed performance on "one application"?

I was thinking this as well. Doom3 was shown earlier using older ATI drivers and it worked fine; now the 3.4s come along and they don't work ...

Can one of the experts also tell me why earlier versions of the drivers do not like 256MB of RAM? For instance, with 64MB and 128MB cards you don't need a new set of drivers. Why does 3.2 not like 256MB?

Regards

Andy
 
The argument is that the reviewers should be. I do agree. What happened was that a site with an enthusiasm based readership prioritized enthusiasm over leveling the playing field. People here don't need to get a grip, you're just visiting a forum where (for many) the priorities are different than that.

If that was the truth, then this thread would have taken on a completely different tone. Instead of discussing various conspiracy theories, people might have taken the time to discuss what optimizations Nvidia's hardware/software might have, what Ati might have done to optimize their software, etc, which, given the membership of this forum, might have been very interesting and informative. Instead, it degenerated into a completely academic debate about the morality of something that has no clear grounds for establishing a moral bearing.

Well, you're trying to say "everyone" like you have the exclusive right to speak for them. :oops:

Since this has turned into a morality debate, there's a need to establish a moral rubric. The obvious choice here is the expectations of the readership community versus what was being served. I didn't say that "everyone" has the exclusive right to speak for them, but if one were to establish "morality" for video card tests, then the consumer has priority.

You see, that's exactly the question that was not answered. The only thing that was answered was the performance of Doom 3 today for the fastest path for each card, the game itself is months from release.

No, that question WAS answered. You are holding the answer to a standard that no answer could fulfill. John Carmack indicated that this test would be indicative of the performance of the final products. What do you want these people to do? Sign contractual guarantees that things would never change?


Note that the NV35 has been stated not to be able to run the ARB path for the Doom 3 build tested, though the NV30 did in the past. Problem in Doom 3? nVidia preventing equivalent comparison for Doom 3, so engineering the preclusion of the context I mentioned earlier? It seems evident that the 44.03 and the Doom 3 showing were something nVidia were planning in a linked fashion.

Planned from day one that product was designed.


Come to think of it, it might be a very significant issue that the Cat 3.4 were used as they were (even given the priorities of HardOCP) to run the ARB path, and the R200 path wasn't tried. Did they try that method of overcoming the Cat 3.4 issue? I don't recall mention of it.

Do you think that none of the 3 sites that got this were smart enough to go into a menu and try a different setting? Even if they made that horrible oversight, do you think that Carmack himself would then give his stamp of approval?

Actually, it should be the job of reviewers who tested Doom III now to let you know then, as that is the avenue to address the inherent imbalance of the testing situation that was presented.

I'm talking about reality, not who should or who shouldn't. In this case, Ati would likely find out first. I'm assuming that their marketing department would then jump all over it. Am I wrong on this?

Hmm...this comment doesn't make sense to me. Are you under the impression that nVidia drivers are perfect, or do you recognize that driver development is an ongoing process? What about Doom 3 development...do you recognize it is still changing?

Drivers are continuously evolving, and so is DOOM III, but the basic concepts were nailed down at least over a year ago, otherwise the development progress on DOOM III and R9800 would look like the progress on Daikatana and Rampage.

Well, nVidia arranged this showing of Doom 3, what do you think that says about their focus for the 44.03 drivers used? ATI personnel seem to indicate that they have no current builds for which they could have performed optimizations. In a sense, that's good, they seem to focus on games people are playing and game specific hand tuning was hindered, but it is bad because there are obvious issues they could have addressed for their latest drivers, and nVidia was advantaged by only having to successfully implement the direct expression of their hardware functionality via their own OpenGL extension.

No, actually, one of their focuses from day one, which is likely over a year ago, has been DOOM III. Now, if people can show where they could have specifically optimized drivers for DOOMIII to the detriment of everything else, THAT would be interesting, but all this talk is worthless speculation.

Why are you bringing up "years" when talking about Doom 3 anyway? I tend to think that yes, an extra 2 weeks with a recent build would have made a more than slight difference to Cat 3.4 performance with Doom 3; the Cat 3.4s don't seem to have been made with Doom 3 in mind at all.

And why would you think that? If you can come up with specific examples, I'm all ears because I think such discussion would be very interesting and I'm eager to learn more about these things.

I think you're confusing ATI with nVidia, as there do remain some rather distinctive differences to their marketing. I also don't think simply labelling people's reaction as FUD is a very useful way to discuss opposing viewpoints.

I think you're confusing business with charity, and calling people's reactions FUD is a useful way of prodding things in the right direction, sometimes.

OK, do you ever consider applying your standards to yourself, or is it only other people who should shut up about their own opinions?

Geeze..."You guys are wrong, need to get a grip, and get a life instead of stating your opinions, but listen to me while I tell you mine". A bit one-sided, don't you think? The "people are starving, so don't complain about anything less" is a bit worn as a conversational tactic, don't you think? What are you doing posting here yourself? Or are you intending sarcasm by your entire post?

I spoke up for a specific purpose. Frankly, I was disappointed when I was finally able to load up B3D only to find the same kind of crap that usually goes on at HardForums, namely, a discussion with no objectivity, no technical insight. It seems that if people were really enthusiastic about hardware and graphics, then they'd take the time and effort to learn the important details, and not make idle speculation.

Other people are entitled to their opinions. I'm entitled to thinking their opinions are worthless. Frankly, I wrote the post to try to move debate in a different direction and express my frustrations. What I should have done was stick to the first part and forget about the second, and in that, I'm guilty of being as lame as all the other people expressing worthless opinions. :oops:

There I've said it, can we move on to a more interesting discussion now? ;)
 
boobs said:
If that was the truth, then this thread would have taken on a completely different tone. Instead of discussing various conspiracy theories, people might have taken the time to discuss what optimizations Nvidia's hardware/software might have, what Ati might have done to optimize their software, etc, which, given the membership of this forum, might have been very interesting and informative.

Well, the problem with how Doom3 was handled, is that we have no idea what optimizations may or may not have been done. Specifically:

1) The reviewers had little time with the demo
2) They could not even post screenshots for scrutiny.

Instead, it degenerated into a completely academic debate about the morality of something that has no clear grounds for establishing a moral bearing.

There is room for both kinds of debate here. This Doom3 demo situation touches on both.

Since this has turned into a morality debate, there's a need to establish a moral rubric. The obvious choice here is the expectations of the readership community versus what was being served.

I think everyone, consumers and IHVs alike, expects at least a "fair" comparison. Given that ATI (or other vendors, for that matter) had no indication that there would be a public benchmarking of Doom3, and nVidia was well aware of this, I would say that is grounds for it not being fair, either to all IHVs or to consumers.

No, that question WAS answered. You are holding the answer to a standard that no answer could fulfill. John Carmack indicated that this test would be indicative of the performance of the final products. What do you want these people to do? Sign contractual guarantees that things would never change?

No, I want Carmack to give ATI (or any other hardware vendor on which the tests will be run) the opportunity to agree or disagree with Carmack's assessment.

Carmack can only know one thing: how close his own code is to being complete and optimized. He really doesn't know what else ATI can do to improve performance on his code with their drivers, etc. And to release the benchmark in this fashion is just "wrong", IMO.

Note that the NV35 has been stated not to be able to run the ARB path for the Doom 3 build tested, though the NV30 did in the past. Problem in Doom 3? nVidia preventing equivalent comparison for Doom 3, so engineering the preclusion of the context I mentioned earlier? It seems evident that the 44.03 and the Doom 3 showing were something nVidia were planning in a linked fashion.

Planned from day one that product was designed.

What was planned from day one? That NV3x would not run the ARB2 path in benchmark mode? What has been planned from day one is that the NV3x path would be optimal for NV3x hardware, but that NV3x hardware would also run the complete ARB2 path as well.

It is very obvious to me that the ARB2 path was disabled by nvidia on purpose, precisely to prevent a direct apples to apples comparison.

To be clear, I do agree that ATI running ARB2 vs. nVidia running NV3x paths (with commentary on any quality difference) is a valid comparison. However, having both run the ARB2 path is also a valid comparison...one that was not able to be performed.



Do you think that none of the 3 sites that got this were smart enough to go into a menu and try a different setting? Even if they made that horrible oversight, do you think that Carmack himself would then give his stamp of approval?

I don't understand your question. At least one of the 3 sites definitely tried, and documented trying, the ARB2 path on the NV35. Why would that same site, if they tried the R200 path, not document that as well?

And why shouldn't Carmack give his stamp of approval?

Actually, it should be the job of reviewers who tested Doom III now to let you know then, as that is the avenue to address the inherent imbalance of the testing situation that was presented.

I'm talking about reality, not who should or who shouldn't. In this case, Ati would likely find out first. I'm assuming that their marketing department would then jump all over it. Am I wrong on this?

ATI was not aware that this was going to be tested and publicized until it was already done. That's the point.

Drivers are continuously evolving, and so is DOOM III, but the basic concepts were nailed down at least over a year ago, otherwise the development progress on DOOM III and R9800 would look like the progress on Daikatana and Rampage.

Yes, but the R300 core series of products have been shipping in quantity for a long time now. ATI doesn't have quite the same luxury as nVidia to dedicate lots of resources for the driver for that core to Doom3 which won't be shipping for many months yet. They have other real shipping games out there that need tweaks and bug fixes. And given that Doom3 is certainly up and working on R30x in at least a very solid fashion, that's good enough for Id's development purposes until the game is getting closer to shipping, at which time ATI can start getting heavy into serious tweaking / optimizing for that game / engine.

No, actually, one of their focuses from day one, which is likely over a year ago, has been DOOM III. Now, if people can show where they could have specifically optimized drivers for DOOMIII to the detriment of everything else, THAT would be interesting, but all this talk is worthless speculation.

Again, the point is, with the way this benchmark was released and controlled, we can't do any such thing. That's the problem. No screenshots, no follow-up, nada.

All we have is some indications from OTHER GL apps, that aniso and AA might have been fiddled with. I do hope there is further investigation into that, but as for how that impacts Doom3? WE WON'T KNOW, because it won't be re-tested any time soon, nor will we have screen-shots to look at and judge.
 
Because an equivalent argument would be that anything using PS1.1->1.3 favours nVidia cards since this is their native path.

Which could be relatively fairly deduced. If 3DM2K3 had chosen to have one 1.1 test and one 1.4 test it would have been a decent leveling.

If instead you knowingly ignore the existence of a standard feature of an API and then complain because you then can't execute it as quickly as someone else then you really have no-one to blame but yourself.

You mean, ignore a feature that ends up shipping in a version of DX other than the one you are targeting? Should we criticize/penalize DX9-level boards because they don't support DX9.1? PS 1.4 favors ATi hardware, which is what I have been talking about all along. You know it, I know it, and pretty much everyone else knows it (that posts here anyway).

I am not stating that it is wrong to optimise as far as possible - a developer naturally should try to get the best performance in all cases. I am simply saying that the example that you chose was not particularly relevant.

It is entirely relevant. Given the context of 3DM2K3 it is supposed to be a bench that stresses vid cards and gives an indicator of how boards will stack up in games now and in the future. For their DX8 level test they utilize a DX8.1 specific feature, one that is nigh MIA in games, that favors ATi hardware. This tilts the field in favor of ATi. Is it their right to do so? Absolutely. Just as it is Carmack's right to optimize his game however he sees fit and as it is fine for UbiSoft to optimize SC however they see fit.

On the other hand if someone specifically writes a path to a non-standard extension that you do not (or are not allowed to) support then there's really very little that you can do about it.

So would you say then that any benches of games that request WBuffer should be invalid for R3x0 boards?

If the sites are benchmarking correctly with Splinter Cell there should be no special difference in rendering quality.

Run the game with low quality settings to compensate for ATi's missing features? If anything the game should be run with both high quality settings and then benched again with low quality settings. Why not run Doom3 benches at the lowest quality setting possible? After all, the ARB2 path is not entirely equal even between nV and ATi, so do you think it is valid in the least to consider running it at the lowest settings to level the playing field?

It requires declaring a special depth buffer format as a texture, which wasn't even supported in the reference rasterizer in DX8

The DX8 refrast doesn't support PS 1.4 either. It should work fine using the DX 8.1 refrast.

Joe-

I think everyone, consumers and IHVs alike, expects at least a "fair" comparison. Given that ATI (or other vendors, for that matter) had no indication that there would be a public benchmarking of Doom3, and nVidia was well aware of this, I would say that is grounds for it not being fair, either to all IHVs or to consumers.

Was it fair for ATi to release the Doom3 alpha? If nVidia pulls the same sort of stunt, I would expect they would get the same treatment from id.
 
Was it fair for ATi to release the Doom3 alpha? If nVidia pulls the same sort of stunt, I would expect they would get the same treatment from id.

Ah, so now you are also assuming that ATI did leak it. (Which is my suspicion as well.) However, you seem to take it further that ATI purposely leaked it (not some dolt employee)? Do you also know what action ATI may have taken against that employee? If ATI fired that employee, is that not enough?

Has Id publicly stated who leaked the demo? Id did more damage to their own credibility with this stunt than anyone else.

Is it fair for nVidia to do this after ATI bent over backwards to get the game running for them on a card that could actually RUN THE DAMN THING at a decent clip at high quality for last year's E3? You think that in all that rush to get the demo up and running (for the benefit of ID as well as ATI), in all that confusion and late nights of work, passing between who knows how many different PCs and engineers, the demo managed to get into the wrong hands at one point, as the ultimate source of the leak?

To be clear, if someone at ATI did leak it, it is ultimately ATI's responsibility. But when both ATI and ID are gunning to get it up and running for the big show, it's easy to see how such a thing could happen.

Id handled this with very bad form. They like to portray themselves as being "fair" and "for the gamer" etc. That's not the way I see it. Id, at the moment, is nVidia's bitch.

At least you seem to agree that this isn't fair....you just think that it's legitimate "pay-back".
 
Ah, so now you are also assuming that ATI did leak it.

Is there anyone who doesn't?

However, you seem to take it further that ATI purposely leaked it (not some dolt employee)?

ATi isn't an individual, it is a group of people. When a BS PR statement is released from nVidia that some dolt came up with who gets hammered for it? Same thing.

Do you also know what action ATI may have taken against that employee? If ATI fired that employee, is that not enough?

Where I work, if we are contracting another company and one employee that works for them leaks some of our confidential information that company no longer has our business period. We expect the same from our customers (we screw up, we lose them).

To be clear, if someone at ATI did leak it, it is ultimately ATI's responsibility. But when both ATI and ID are gunning to get it up and running for the big show, it's easy to see how such a thing could happen.

There are dozens/hundreds of games shown off at E3 each year, and almost all of them have the same hectic schedule to get it ready. If ATi couldn't handle it for last year's E3 without losing control over it, why should id expect them to pull it off this time?

Id handled this with very bad form. They like to portray themselves as being "fair" and "for the gamer" etc. That's not the way I see it. Id, at the moment, is nVidia's bitch.

The question is whether being for the gamer is exclusive of working closely with nVidia. Looking at market share, there is good reason to see why it would be the case. It was the same situation when 3dfx was the big dog in the gaming market. I have a version of GLQuake that won't run under OpenGL without a Glide wrapper by default. Were they 3Dfx's bitch back then? Yes. And they were for the gamer too.

At least you seem to agree that this isn't fair....you just think that it's legitimate "pay-back".

Running Doom3 benches isn't fair, running SC benches isn't fair, using 3DM2K3 isn't fair, running Quake3 benches isn't fair, running UT benches isn't fair. Life isn't fair. I don't live in a world that's fair, I don't play games from a world that's fair, and I don't buy hardware from a world that's fair. What I am far more interested in is seeing representative information. If Half-Life2 comes out and runs five times faster on S3 hardware than it does on anything else, obviously it isn't fair, but it may get me to buy an S3 video card anyway.

When I read B3D's reviews I expect them to be as fair as possible as the goal is one of 3D technology. When I read Anand's or Tom's (etc.) I want to see data representative of what I would use the board for.

I'll ask this, can anyone tell me of a single time when they, as a consumer and not a reviewer, changed the settings they have determined to be optimal for their rig in a game they are playing because it wasn't fair in relation to another board? Think about it.
 
BenSkywalker said:
You mean, ignore a feature that ends up shipping in a version of DX other than the one you are targeting? Should we criticize/penalize DX9-level boards because they don't support DX9.1? PS 1.4 favors ATi hardware, which is what I have been talking about all along. You know it, I know it, and pretty much everyone else knows it (that posts here anyway).

Of course a DX9 board will be penalized in a DX9.1 benchmark if it doesn't support a feature required for that benchmark. Just as a DX8 board may be penalised in benchmarks written after the release of DX8.1 that support features that they do not possess. Just what is your point here?

PS1.4 should be able to favour all advanced hardware created after its specification was finalised, and writing apps that use it should give you better performance. Why not?
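(For what it's worth, supporting both approaches is just a capability check at startup. A rough sketch of how an application might prefer a single-pass PS 1.4 path and fall back to a multi-pass PS 1.1 path; the two render functions are hypothetical, the caps check itself is the standard DX8.1 one:)

// Sketch only: choose a single-pass PS 1.4 lighting path when the hardware
// exposes it, otherwise fall back to a multi-pass PS 1.1 path.
// RenderSinglePassPS14 / RenderMultiPassPS11 are hypothetical engine functions.
#include <windows.h>
#include <d3d8.h>

void RenderSinglePassPS14(IDirect3DDevice8* device);  // hypothetical
void RenderMultiPassPS11(IDirect3DDevice8* device);   // hypothetical

void RenderLighting(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 4))
    {
        // PS 1.4 allows more texture reads and a second arithmetic phase,
        // so the same lighting model can fit into one pass.
        RenderSinglePassPS14(device);
    }
    else if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
    {
        // PS 1.1-1.3 hardware gets the same visual result spread over
        // more passes, at the cost of extra geometry and fill rate.
        RenderMultiPassPS11(device);
    }
}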

It is entirely relevant. Given the context of 3DM2K3 it is supposed to be a bench that stresses vid cards and gives an indicator of how boards will stack up in games now and in the future. For their DX8 level test they utilize a DX8.1 specific feature, one that is nigh MIA in games, that favors ATi hardware. This tilts the field in favor of ATi. Is it their right to do so? Absolutely. Just as it is Carmack's right to optimize his game however he sees fit and as it is fine for UbiSoft to optimize SC however they see fit.

The PS1.4 test tilts the field in favour of any vendor who chose to create hardware that implements the specification in an efficient manner, and it excludes nobody since anyone is free to create a piece of hardware supporting PS1.4, and the specifications of PS1.4 have been around for a long time. Whether other informed vendors chose to take the advantages offered by providing fast support for this shader version, or simply chose to ignore such support and provide a minimal level of useability was entirely up to them. Futuremark are not the only developer that uses PS1.4.

On the other hand if someone specifically writes a path to a non-standard extension that you do not (or are not allowed to) support then there's really very little that you can do about it.
So would you say then that any benches of games that request WBuffer should be invalid for R3x0 boards?

Where did you get that idea from? W-buffers are a completely optional feature of the API, not a compulsory one like PS1.4, and they always have been. Developers are free to ignore this fact (perhaps misunderstanding the difference between optional and compulsory).

If someone writes a bench that requires a W-buffer then it simply won't function correctly on a lot of hardware. As such it would not produce valid results on an R3x0 board, so yes - it would be an invalid benchmark for that hardware.

If at some future time W-buffers become compulsory then I might expect to see new benchmarks using them, and perhaps having some fallback for earlier cards that do not have W-buffers to show how they perform. It seems to me that there should be no problem with that.
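To illustrate the optional-versus-compulsory point, this is roughly all an application has to do to use a W-buffer only where the caps report it, falling back to a normal Z-buffer everywhere else (standard DX8 calls; the function is just a sketch):

// Sketch: enable W-buffering only if the rasterizer advertises it;
// otherwise use an ordinary Z-buffer. Hardware without the cap is not
// "broken" - it simply never promised the optional feature.
#include <windows.h>
#include <d3d8.h>

void SetupDepthBuffering(IDirect3DDevice8* device)
{
    D3DCAPS8 caps;
    device->GetDeviceCaps(&caps);

    if (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER)
        device->SetRenderState(D3DRS_ZENABLE, D3DZB_USEW);  // optional W-buffer
    else
        device->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);  // standard Z-buffer
}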

Run the game with low quality settings to compensate for ATi's missing features? If anything the game should be run with both high quality settings and then benched again with low quality settings. Why not run Doom3 benches at the lowest quality setting possible? After all, the ARB2 path is not entirely equal even between nV and ATi, so do you think it is valid in the least to consider running it at the lowest settings to level the playing field?

There is no missing feature here. R3x0 can do shadow buffers - we even have demonstrations of shadow buffers running on R200s using PS1.4. It is only the fact that AFAIK in Splinter Cell the shadow buffers are implemented through a non-standard, unsupported and undocumented extension to the API behaviour that prevents them from being used.

I have already said that I have no issue with developers trying to write the fastest possible code, and my comments had nothing to do with Doom3 per se. It was what I perceived as your poor choice of a comparative example that prompted my original comments.

It requires declaring a special depth buffer format as a texture, which wasn't even supported in the reference rasterizer in DX8
The DX8 refrast doesn't support PS 1.4 either. It should work fine using the DX 8.1 refrast.

To the best of my knowledge it still doesn't work through that path using the DX8.1 refrast or even the DX9 refrast (although I admit I could be mistaken).

The only shadow buffer support officially exposed up to and including in DX9 AFAIK requires the shadow buffering code to be explicitly written using appropriate pixel shaders, not using implicit behaviour on one vendor's hardware.
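For reference, the "special depth buffer format as a texture" behaviour is something an application can only probe for with a format check, since it isn't expressed as a capability bit. A rough sketch of that probe; the function name is hypothetical and the format/usage flags are the commonly cited ones, not anything taken from Splinter Cell's code:

// Sketch: probe whether a depth format can be created as a texture
// (the basis of the vendor-specific hardware shadow-buffer technique).
// If the check fails, the application has to fall back to a shader-based
// shadow method instead of relying on implicit depth-compare behaviour.
#include <windows.h>
#include <d3d8.h>

bool SupportsDepthTextureShadows(IDirect3D8* d3d, D3DFORMAT adapterFormat)
{
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT,
                                        D3DDEVTYPE_HAL,
                                        adapterFormat,
                                        D3DUSAGE_DEPTHSTENCIL,
                                        D3DRTYPE_TEXTURE,
                                        D3DFMT_D24S8);
    return SUCCEEDED(hr);
}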

[edit]Toned down a tad as it seemed unnecessarily provocative on further reading. [/edit]
 
BenSkywalker said:
You mean, ignore a feature that ends up shipping in a version of DX other than the one you are targeting? Should we criticize/penalize DX9-level boards because they don't support DX9.1? PS 1.4 favors ATi hardware, which is what I have been talking about all along. You know it, I know it, and pretty much everyone else knows it (that posts here anyway).

Funny you should mention DX9; if this was a DX9 title this conversation wouldn't be happening, there would be no 'NV30 path'. Period.
We are talking about DX9 hardware, and DX9 hardware should be running at least at the minimum amount of precision for compliance.

I am not stating that it is wrong to optimise as far as possible - a developer naturally should try to get the best performance in all cases. I am simply saying that the example that you chose was not particularly relevant.

Whoa, this has nothing to do with optimizing; this NV30 code path was forced on them. John Carmack even states it:

The NV30 can run DOOM in five different modes: ARB, NV10 (full featured, five rendering passes, no vertex programs), NV20 (full featured, two or three rendering passes), NV30 (full featured, single pass), and ARB2.

The R200 path has a slight speed advantage over the ARB2 path on the R300, but only by a small margin, so it defaults to using the ARB2 path for the quality improvements. The NV30 runs the ARB2 path MUCH slower than the NV30 path. Half the speed at the moment. This is unfortunate, because when you do an exact, apples-to-apples comparison using exactly the same API, the R300 looks twice as fast, but when you use the vendor-specific paths, the NV30 wins.
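To make the "five modes" concrete: the paths differ mainly in which OpenGL extensions they rely on, and an engine typically just inspects the extension string at startup to decide what the hardware can run. A very rough, hypothetical sketch of that kind of selection (not id's actual code; the path names simply mirror Carmack's list):

// Hypothetical sketch of extension-driven path selection, loosely mirroring
// the back-ends Carmack lists. Not id's code; a real engine also checks
// vendor strings, driver quirks and user overrides, and (as the quote
// explains) may still prefer a vendor path such as NV30 over ARB2 for speed.
#include <cstring>

enum RenderPath { PATH_ARB, PATH_NV10, PATH_NV20, PATH_NV30, PATH_R200, PATH_ARB2 };

// 'extensions' is the string returned by glGetString(GL_EXTENSIONS).
RenderPath ChooseRenderPath(const char* extensions)
{
    // Naive substring check - good enough for a sketch.
    auto has = [extensions](const char* name) {
        return std::strstr(extensions, name) != nullptr;
    };

    if (has("GL_ARB_fragment_program"))   return PATH_ARB2;  // full precision, single pass
    if (has("GL_NV_fragment_program"))    return PATH_NV30;  // vendor path, single pass
    if (has("GL_ATI_fragment_shader"))    return PATH_R200;  // vendor path, fewer passes
    if (has("GL_NV_register_combiners2")) return PATH_NV20;  // two or three passes
    if (has("GL_NV_register_combiners"))  return PATH_NV10;  // five passes, no vertex programs
    return PATH_ARB;                                          // lowest common denominator
}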

Run the game with low quality settings to compensate for ATi's missing features? If anything the game should be run with both high quality settings and then benched again with low quality settings. Why not run Doom3 benches at the lowest quality setting possible? After all, the ARB2 path is not entirely equal even between nV and ATi, so do you think it is valid in the least to consider running it at the lowest settings to level the playing field?

It was Nvidia's design decision not to support the minimum required spec, so now a developer uses the excuse that FP32 is too slow and drops down to lower than the minimum to gain speed.
The ARB2 path is the standard, non-proprietary code path, so yes, it is the standard all ARB members agreed on, including Nvidia.

The only thing these vendor-specific code paths are allowing in this case is 'lower precision', which in turn is increasing 'speed', to the point that the NV30 code path is now faster than the standard ARB path.
As a consumer, one product should not be put in a better light using non-standard code paths, nor should they be used for benchmarking purposes, which is all that is being done here, when all things are not equal and the minimum precision is not being met.
Pixel Shader 1.4 is a 'scraping the bottom of the barrel' comparison. It is part of DX 8.1, which Nvidia again chose NOT to support until recently, and it has nothing to do with this example at all, as ATI or any other IHV could never run the non-standard, proprietary NV30 code path, while any IHV can and does support Pixel Shader 1.4, since it is not proprietary.
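On the precision point: the standard ARB2 path even has an officially sanctioned way to ask the driver for reduced precision, the ARB_fragment_program precision hint, which is quite different from dropping below the minimum via a proprietary path. A trivial illustrative sketch, not Doom 3's actual shader, just the standard mechanism:

// Illustrative only: a minimal ARB_fragment_program with the standard
// precision hint. The hint merely *allows* the driver to trade precision
// for speed within the spec; it is not a vendor-specific low-precision path.
const char* kTintFragmentProgram =
    "!!ARBfp1.0\n"
    "OPTION ARB_precision_hint_fastest;\n"                 // or ARB_precision_hint_nicest
    "TEMP base;\n"
    "TEX base, fragment.texcoord[0], texture[0], 2D;\n"    // sample the base map
    "MUL result.color, base, fragment.color;\n"            // modulate by vertex colour
    "END\n";
// The string would be handed to the driver with glProgramStringARB(
// GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB, ...).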

http://www.theinquirer.net/?article=7083

http://www.evilavatar.com/redirected.asp?fromurl=http://www.evilavatar.com/EA/News/M42853/
 
BenSkywalker said:
ATi isn't an individual, it is a group of people. When a BS PR statement is released from nVidia that some dolt came up with who gets hammered for it? Same thing.

Like I said, ATI as a company is ultimately responsible if one of their employees is responsible. But you have to consider all circumstances surrounding what happened when determining the course of action.

Where I work, if we are contracting another company and one employee that works for them leaks some of our confidential information that company no longer has our business period.

A blanket statement like that is pure bullshit. It depends on the company, and it depends on what was leaked, and what impact it has. At my company, when stuff like that happens, we consider those factors, among others, when deciding a course of action.

I bet if your company is, say, "Abit" and the company that "leaked" your information was "Intel", you wouldn't stop doing business with Intel.

There are dozens/hundreds of games shown off at E3 each year, and almost all of them have the same hectic schedule to get it ready.

Wrong. There is ONE game "like Doom3" each year, that has the expectations of Doom3, the display of which was compounded by the fact that there was only really one brand-new-fresh-back-from-the-fab part that will run it good if-they-can-get-it-up-and-running-in-time. That is the combination of Doom3 and R300 at last year's E3.

If ATi couldn't handle it for last year's E3 without losing control over it, why should id expect them to pull it off this time?

Because the circumstances are completely different, and ATI made whatever changes were necessary to prevent it from happening again? (Like what all companies like mine do when we "do something wrong": we admit the mistake and explain how we addressed it.)


The question is whether being for the gamer is exclusive of working closely with nVidia. Looking at market share, there is good reason to see why it would be the case.

Market share is a good reason to work closely with a vendor. It's not a good reason to put other vendors at an artificial disadvantage.

Were they 3Dfx's bitch back then? Yes. And they were for the gamer too.

Yes, they were. 3dfx was the only one pursuing GL development with id at the time. 3dfx is the reason why GLQuake existed, because Carmack said "I don't want to use D3D, and I don't really want another proprietary API, what can we do?" And 3dfx said "hey, we'll make enough of a GL driver so you can code in GL."

At least you seem to agree that this isn't fair....you just think that it's legitimate "pay-back".

Running Doom3 benches isn't fair, running SC benches isn't fair, using 3DM2K3 isn't fair, running Quake3 benches isn't fair, running UT benches isn't fair. Life isn't fair.

So that means one shouldn't attempt to address the things that aren't?

I don't live in a world that's fair,

The question is...do you care about doing anything about it.

I don't play games from a world that's fair, and I don't buy hardware from a world that's fair. What I am far more interested in is seeing representative information. If Half-Life2 comes out and runs five times faster on S3 hardware than it does on anything else, obviously it isn't fair, but it may get me to buy an S3 video card anyway.

And your point?

When I read B3D's reviews I expect them to be as fair as possible as the goal is one of 3D technology.

Why? Life isn't fair. Why should assessment of 3D Technology be any different? :rolleyes:

I'll ask this, can anyone tell me of a single time when they, as a consumer and not a reviewer, changed the settings they have determined to be optimal for their rig in a game they are playing because it wasn't fair in relation to another board? Think about it.

I thought about it, and I'm trying to figure out what this has to do with Id not giving ATI advance notice about releasing a Doom3 benchmark, and the opportunity to do some specific optimizations, in the interest of fairness.
 
I am using the Cat 3.3 drivers for my 9800pro and I noticed a fair performance boost when playing the Doom 3 alpha that I have, as well as small gains in game tests 2 and 3 of 3DMark03. Seems they scrapped optimizations in lieu of some other performance gains.
 
boobs said:
...
If that was the truth, then this thread would have taken on a completely different tone.

I don't think you should presume to judge the thread and treat it with contempt according to your evaluation. That doesn't improve it, and you're just persisting in making it deviate from what you propose you desire. Your contempt is your own conceit, please don't inflict it on the forum as the only truth worth stating.

Instead of discussing various conspiracy theories, people might have taken the time to discuss what optimizations Nvidia's hardware/software might have, what Ati might have done to optimize their software, etc, which, given the membership of this forum, might have been very interesting and informative. Instead, it degenerated into a completely academic debate about the morality of something that has no clear grounds for establishing a moral bearing.

So you decree. Does that decree do anything to address that problem you think is there? Does it change that some people disagree with your evaluation and have reason to do so? I propose to you that starting a discussion about what you wanted with comments to discuss would have worked towards achieving what you wanted, and that complaining and doing the same exact thing, except with your own opinion on "morality" instead merely prompts, at best, people to tell you exactly why they disagree.

The opinion that what you term "morality" cannot be established is just an opinion, not an edict that entitles you to condemn everyone else accordingly without your standards being applied to yourself. Are you "listening" to what I say, by the way? If not, my own standards will make this my last reply at such length on this discussion you started.

Well, you're trying to say "everyone" like you have the exclusive right to speak for them. :oops:

Since this has turned into a morality debate, there's a need to establish a moral rubric.

OK, and why are you right about where you establish that criteria? You are just proposing your own decision as right, and everyone who disagrees as wrong. Are you more than a bit egotistical, or can you recognize how unproductive that is...i.e., that people won't just line up and obey your telling them to "get a grip" as you define they should?

The obvious choice here is the expectations of the readership community versus what was being served.

You are substituting your own expectations in place of people who specifically disagree, and then telling them their opinions do not matter. Singularly and consistently useless, it seems to me. "The obvious choice here"? Ack!

I didn't say that "everyone" has the exclusive right to speak for them, but if one were to establish "morality" for video card tests, then the consumer has priority.

What I said was that you maintained that you presumed to speak for "everyone", and though you just said you disagree, it is my observation that this is exactly what you are continuing to do.

You see, that's exactly the question that was not answered. The only thing that was answered was the performance of Doom 3 today for the fastest path for each card, the game itself is months from release.

No, that question WAS answered. You are holding the answer to a standard that no answer could fulfill. John Carmack indicated that this test would be indicative of the performance of the final products.

Your commentary is improperly selective. Simplest description of the flaw in it: Doom 3 is close to its final form, outside of bug squashing, but the actual hardware and drivers are rather far removed from that which will be available at its launch. All of these products have bearing on your question, not just the first, as does the person's own opinion of "satisfactory". With each person determining what is "satisfactory" for a $500 purchase, and the issues that I brought up, that question was not answered.

However, perhaps it was answered for you and people who share your opinion at least that far, but I've already covered why I think your application of that evaluation universally is a problem.

What do you want these people to do? Sign contractual guarantees that things would never change?

Eh? No, I'm simply maintaining that that question was not answered. I'd think my stating what seems to have been answered instead would have avoided the need for such a silly proposition as to what I wanted. :-?
...
Planned from day one that product was designed.

OK, I'm supposed to guess at your particular meaning? Which product? Doom 3 designed for NV35? 44.03 designed for Doom 3? NV35 designed for Doom 3? None dispute my conclusion, so were you just agreeing?

Come to think of it, it might be a very significant issue that the Cat 3.4 were used as they were (even given the priorities of HardOCP) to run the ARB path, and the R200 path wasn't tried. Did they try that method of overcoming the Cat 3.4 issue? I don't recall mention of it.

Do you think that none of the 3 sites that got this were smart enough to go into a menu and try a different setting?

Yes.

No mention was made of changing the render path settings, and if you saw an indication, it would have been useful to just point it out to me instead of something as ridiculous as implying that I should implicitly trust reviewer thoroughness. :oops:

In any case, from Anandtech, since my reference to HardOCP's commentary wasn't sufficient: "The only options that we could change in game were the quality settings which we could set to low, medium or high."

Even if they made that horrible oversight, do you think that Carmack himself would then give his stamp of approval?

For the HardOCP article in particular, I ask you to note again that Carmack provided the demo, he did not dictate the testing methodology or provide his "stamp of approval" beyond replacing the nVidia made demo.

Why are you making up things at random and proposing them as actuality, and since when are reviewers above reproach? :oops:

Actually, it should be the job of reviewers who tested Doom III now to let you know then, as that is the avenue to address the inherent imbalance of the testing situation that was presented.

I'm talking about reality, not who should or who shouldn't.

I'm talking about reality too. You seem to be implying that should or shouldn't don't matter at all, only what actually happens. That is pretty circular.
Is the criteria for should and shouldn't that you continue to propose "reality" rather than your own opinion? Or are you simply maintaining that once something has happened "in reality", criticism of it is not valid?

In this case, Ati would likely find out first. I'm assuming that their marketing department would then jump all over it. Am I wrong on this?

You mean wrong in your assumptions? I think so, and I don't recall having seen them jump all over any game or benchmark when their card was shown to ill favor. I have, on the other hand, seen this specifically and at great lengths from nVidia, and also seen this from them in technical criticism of competing hardware. Therefore, it is my opinion that your assumption is based on ATI acting like nVidia when I personally don't see indication of justification for that.

You are free to correct me, or even simply disagree, though I will consider that opinion unfounded without such correction.

Hmm...this comment doesn't make sense to me. Are you under the impression that nVidia drivers are perfect, or do you recognize that driver development is an ongoing process? What about Doom 3 development...do you recognize it is still changing?

Drivers are continuously evolving, and so is DOOM III, but the basic concepts were nailed down at least over a year ago, otherwise the development progress on DOOM III and R9800 would look like the progress on Daikatana and Rampage.

What is with the consistent introduction of these silly comments, like this one about Daikatana and Rampage? The comment in question was "Ati has had years to optimize their hardware and software", and it remains a comment that doesn't make sense to me when talking about Doom 3.

What do you think the Cat 3.4 results showed, exactly, if not that ATI didn't have the opportunity to do some testing? What about the game issues for NV3x products which were conceived at the same time as the R300 and R350? Are you instead proposing that ATI's R300 drivers last year were just as optimized for Doom 3? What about the performance changes since the E3 presentation that seem to directly contradict this?

Well, nVidia arranged this showing of Doom 3, what do you think that says about their focus for the 44.03 drivers used? ATI personnel seem to indicate that they have no current builds for which they could have performed optimizations. In a sense, that's good, they seem to focus on games people are playing and game specific hand tuning was hindered, but it is bad because there are obvious issues they could have addressed for their latest drivers, and nVidia was advantaged by only having to successfully implement the direct expression of their hardware functionality via their own OpenGL extension.

No, actually, one of their focuses from day one, which is likely over a year ago, has been DOOM III. Now, if people can show where they could have specifically optimized drivers for DOOMIII to the detriment of everything else, THAT would be interesting, but all this talk is worthless speculation.

Who said detriment of everything else? Please step back and stop making up the viewpoint you are attacking. I consider the NV35 Doom 3 results valid and representative of final performance with the game for that path. What I was proposing as invalid was the presentation of Cat 3.2 (recognizing only 128MB according to ATI) versus Cat 3.4 (driver performance improvement + recognition of 256MB, but having issues with the Doom 3 beta build selected), and no discussion or apparent attempt to use the R200 path for the R300.
As the NV30 path is a direct mapping to nVidia hardware, I consider ATI unfairly disadvantaged with the Cat 3.4 performance issues displayed, with no mention of the apparent issues the NV3x has with that same path (according to what I recall from elsewhere). Doom 3's bugs, and both ATI's and nVidia's driver bugs for the ARB fragment shader extension, were not fairly discussed, nor was it clarified or explored, apparently, that the R200 path might not have had the issues with Cat 3.4.

Now, I think the NV35 might have shown similar performance leadership anyway, and I'm not surprised, given Doom 3 is directly targeted at "PS1.3 + better texture access" level shaders (the ideal situation for the NV35). My complaint is about what I regard as some significant failings in the information provided, failings which it occurs to me wouldn't have conflicted with their focus on getting the gaming exclusive.

Why are you bringing up "years" when talking about Doom 3 anyway? I tend to think that yes, an extra 2 weeks with a recent build would have made a more than slight difference to Cat 3.4 performance with Doom 3; the Cat 3.4s don't seem to have been made with Doom 3 in mind at all.

And why would you think that? If you can come up with specific examples, I'm all ears because I think such discussion would be very interesting and I'm eager to learn more about these things.

Well, if you can see why I have a problem with the other parts, you can maybe see how your goal would have been served by focusing on this discussion a bit more.

To answer: Consider that the E3 results last year were shown with lower settings than used in these benchmarks, and were achieving around 30 fps (IIRC). Consider that with later drivers the leaked demo used for that presentation delivers better performance for the R300 family. Finally, consider that one reason Doom 3 is still not released and seems to perform even better than that leaked demo is that there are significant changes and optimizations since even one year ago, and that the interaction between the latest drivers, hardware, and the latest Doom 3 build is not something that was or could be determined "years" ago.

If you want to continue with a discussion of this nature, that would be a good thing, but other people will still have problems with details of the benchmarking and remain singularly unconvinced by your telling them to "get a grip". :-? Next time, try just asking for such a discussion?

I think you're confusing ATI with nVidia, as there do remain some rather distinctive differences to their marketing. I also don't think simply labelling people's reaction as FUD is a very useful way to discuss opposing viewpoints.

I think you're confusing business with charity,

No, ATI has done and continues to do things in their self interest, those things just don't seem to have included what you propose in recent history, and in the same time period where they have rather prominently for nVidia. The problem here is that you aren't just stating that they are a business, you are proposing that they would have done a specific action, and you don't seem to have a basis for it. What fits their pattern is to address this type of issue with driver updates, not to start a FUD campaign of the nature you just decided randomly to propose that they would.

and calling people's reactions FUD is a useful way of prodding things in the right direction, sometimes.

You seem singularly unwilling to base your expectations on actual observations. At least, that is my opinion; you are free to correct me.

...

Other people are entitled to their opinions. I'm entitled to thinking their opinions are worthless. Frankly, I wrote the post to try to move debate in a different direction and express my frustrations.

That was a remarkably fruitless way to go about it. People react adversely to being told to get a grip. However, some people react the same way to detailed reasoning as to why you disagree with them as they would to posts like yours, but I happen to think it is better if their reaction is their fault, rather than mine. How about yourself?

What I should have done was stick to the first part and forget about the second, and in that, I'm guilty of being as lame as all the other people expressing worthless opinions. :oops:

Didn't stop you from repeating those opinions again, while again labelling differing opinions as useless, I note. :(

There I've said it, can we move on to a more interesting discussion now? ;)

Sure, just stop making the comments you've already established as useless. I've tried to illustrate why I think they are, instead of just flaming you. It is up to you how you react to that.
 
Joe-

A blanket statement like that is pure bullshit. It depends on the company, and it depends on what was leaked, and what impact it has. At my company, when stuff like that happens, we consider those factors, among others, when deciding a course of action.

"Where I work" was the statement that began the quote. The information that we handle for our customers is financial in nature. Not only will leaking it get you fired, it could easily land you in jail. Why don't you try working where I do and leak some information, then come back with your rhetoric about a BS statement, although that could be in five to ten years :)

Wrong. There is ONE game "like Doom3" each year, that has the expectations of Doom3,

I'm sure Rockstar was terrified that Doom3 would outsell GTA Vice City. Do you follow E3 at all? Overall, PC gaming is pushed into the corner as its market share tends to dictate. Why don't you ask some of the console developers what the weeks leading up to E3 are like. Look at the mainstream gaming press, Doom3 isn't nearly the topic that GTA, Zelda, MGS and the FF titles are. For the PC gaming industry Doom3 was huge because of the visuals, the same way Quake3 was huge. Quake3 sold, in relative terms compared to the real big titles in the gaming industry, poorly.

Yes, they were. 3dfx was the only one persuing Gl development with iD at the time. 3dfx is the reason why GL QUake existed, because carmack said "I don't want to use D3D, and I don't really want another proprietery API, what can we do". And 3dfx said "hey, we'll make enough of a GL driver so you can code in GL."

Which was a Glide wrapper that wouldn't work under OpenGL for some time.

So that means one shouldn't attempt to address the things that aren't?

I never said id wasn't responsible for making it less fair than it could have been.

And your point?

Politics are always at play. Is there some reason to believe that nV will not continue to get special attention from Carmack and crew?

Why? Life isn't fair. Why should assessment of 3D Technology be any different?

As fair as possible, not actually fair. There is a big difference.

Andy-

Of course a DX9 board will be penalized in a DX9.1 benchmark if it doesn't support a feature required for that benchmark. Just as a DX8 board may be penalised in benchmarks written after the release of DX8.1 that support features that they do not possess. Just what is your point here?

I'm talking about DX8 parts here, apologies for not being clear on that point. Most of the rest of your points regarding this aspect appear to revolve around DX9-class boards. When 3DM2K3 launched nV had no DX9-level part.

Where did you get that idea from? W-buffers are a completely optional feature of the API, not a compulsory one like PS1.4

And proprietary extensions are an option for OpenGL. ATi is free to utilize their own extensions offering comparable functionality to nV's. Their hardware lacking some of those features may be an issue, but like WBuffer it is up to the vendor.

The only shadow buffer support officially exposed up to and including in DX9 AFAIK requires the shadow buffering code to be explicitly written using appropriate pixel shaders, not using implicit behaviour on one vendor's hardware.

Looking through the old DX8 docs a bit and comparing them to the information on what they are doing with SC's shadows for the NV boards, I'm not seeing what isn't supported even under 8.0. Actually, it appears that the NV-specific version is much simpler; I don't see why ATi doesn't have support...?

Doom-

We are talking about DX9 hardware, and DX9 hardware should be running at least at the minimum amount of precision for compliance.

It should run the precision required to get the end results needed. If people can't see any differences (and when I say any I mean any; if there are any visible differences then they should up their precision), then it is good enough.

Whoa, this has nothing to do with optimizing; this NV30 code path was forced on them. John Carmack even states it:

He also states-

NV30 (full featured, single pass)

so yes, it is the standard all ARB members agreed on, including Nvidia

You think that everything the ARB decides is without disagreement? :oops:
 
BenSkywalker said:
"Where I work" was the statement that began the quote. The information that we handle for our customers is financial in nature. Not only will leaking it get you fired, it could easily land you in jail. Why don't you try working where I do and leak some information, then come back with your rhetoric about a BS statement, although that could be in five to ten years :)

Right...the point being that you cannot apply this logic to the relationship between Id and ATI. It's completely different. You can no more assert that someone at ATI should go to jail than you can assert it's "fair" for id to do what it did to ATI. It all depends on the specific circumstances of the situation, to which neither you nor I are privy.

I'm sure Rockstar was terrified that Doom3 would outsell GTA Vice City. Do you follow E3 at all? Overall, PC gaming is pushed into the corner as its market share tends to dictate.

We are talking about PC games, with different hardware platforms to play games on. You think any console developer is worried about an executable demo being leaked for a console, of which the hardware was just made available a week or so earlier? Of course everyone is frenzied to get things up and running for an E3 "demo", PC and console developers alike.

The point is, the Doom3 / R300 situation at E3 was unique.

Look at the mainstream gaming press, Doom3 isn't nearly the topic that GTA, Zelda, MGS and the FF titles are. For the PC gaming industry Doom3 was huge because of the visuals, the same way Quake3 was huge. Quake3 sold, in relative terms compared to the real big titles in the gaming industry, poorly.

What is your point in all of this?

In terms of a game selling PC HARDWARE, which single upcoming game do you believe is most important? Isn't it obvious?

Which was a Glide wrapper that wouldn't work under OpenGL for some time.

Right...id was 3dfx's bitch at the time.

I never said id wasn't responsible for making it less fair than it could have been.

Then what exactly are you arguing against?

Politics are always at play. Is there some reason to believe that nV will not continue to get special attention from Carmack and crew?

Yes, if ATI's hardware is seen by the public as the superior platform for Doom3. If that happens, Carmack may cater more to ATI. If Carmack puts up artificial road-blocks to help prevent that from happening, then it's "not fair."
 
demalion said:

I've made my points. I stand by them.

Does anyone care to take a guess at what Ati will be able to do in their next driver to improve performance over the 3.2 drivers?
 