HardOCP and Doom 3 benchmarks

Discussion in 'Architecture and Products' started by indio, May 12, 2003.

  1. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
Well, I hope what he says is not true, because then it's still an issue, isn't it? Why does Nvidia have more time to play with this "little game"? In the end, won't the result be the same, i.e. Nvidia being favored?
     
  2. pascal

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,968
    Likes Received:
    221
    Location:
    Brasil
Well, I don't have a problem with an apples-to-oranges comparison as long as I know it is an apples-to-oranges comparison (most people don't know), fully understand what it means, can see both the apples and the oranges (where are the IQ evaluations?), and can see the apples-to-apples too.

Then, as a gamer/consumer, I will be able to decide what is best for me. For example, Doom III is about the next level in gaming realism, so I want the next level in image quality. Probably the best way to play it is with AA, aniso and high quality image settings at a lower resolution.
     
  3. DadUM

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    55
    Likes Received:
    0
One question I have not seen asked in this thread that is of great importance to the debate on D3 benchmarks: why were the Cat 3.4 drivers completely hosed when running D3? All three groups who had D3 noticed the same thing, abysmal performance with Cat 3.4, and all switched back to the Cat 3.2s (instead of auto-declaring nV the winner, as I am sure nV would have liked). That is a fact. Now the real question becomes: if Cat 3.2 worked well for D3, what did they "break" in Cat 3.4 that killed performance in "one application"?
     
  4. Morris Ital

    Newcomer

    Joined:
    Jan 22, 2003
    Messages:
    28
    Likes Received:
    0
    Location:
    UK
    It was quite interesting to see that some of the lower cards were OK with the game, I've been thinking that the R350/nv35 would be the only thing playable at high quality.

One assumes it will get faster rather than slower from this point, so that is even better, though they have probably finished the engine and are now just adding the other bits: visuals and story and a few pops and crackles and booms.

When 3DMark03 came out people kept saying that they would wait for game benchmarks, as those would show what the buyer could actually expect when playing games rather than benching 3DMark. Now people are arguing the complete opposite: use standard paths so it's all equal!

    Regards

    Andy
     
  5. Morris Ital

    Newcomer

    Joined:
    Jan 22, 2003
    Messages:
    28
    Likes Received:
    0
    Location:
    UK
I was thinking this as well. Doom3 was shown earlier using older ATI drivers and it worked fine; now the 3.4s come along and they don't work ...

Can one of the experts also tell me why earlier versions of the drivers do not like 256MB of RAM? For instance, with 64MB and 128MB cards you don't need a new set of drivers. Why does 3.2 not like 256MB?

    Regards

    Andy
     
  6. boobs

    Newcomer

    Joined:
    Jan 27, 2003
    Messages:
    66
    Likes Received:
    0
If that were the truth, then this thread would have taken on a completely different tone. Instead of discussing various conspiracy theories, people might have taken the time to discuss what optimizations Nvidia's hardware/software might have, what ATI might have done to optimize their software, etc., which, given the membership of this forum, might have been very interesting and informative. Instead, it degenerated into a completely academic debate about the morality of something that has no clear grounds for establishing a moral bearing.

Since this has turned into a morality debate, there's a need to establish a moral rubric. The obvious choice here is the expectations of the readership community versus what was being served. I didn't say that "everyone" has the exclusive right to speak for them, but if one were to establish "morality" for video card tests, then the consumer has priority.

No, that question WAS answered. You are holding the answer to a standard that no answer could fulfill. John Carmack indicated that this test would be indicative of the performance of the final products. What do you want these people to do? Sign contractual guarantees that things would never change?


    Planned from day one that product was designed.


Do you think that none of the 3 sites that got this were smart enough to go into a menu and try a different setting? Even if they made that horrible oversight, do you think that Carmack himself would then have given his stamp of approval?

    I'm talking about reality, not who should or who shouldn't. In this case, Ati would likely find out first. I'm assuming that their marketing department would then jump all over it. Am I wrong on this?

    Drivers are continuously evolving, and so is DOOM III, but the basic concepts were nailed down at least over a year ago, otherwise the development progress on DOOM III and R9800 would look like the progress on Daikatana and Rampage.

    No, actually, one of their focuses from day one, which is likely over a year ago, has been DOOM III. Now, if people can show where they could have specifically optimized drivers for DOOMIII to the detriment of everything else, THAT would be interesting, but all this talk is worthless speculation.

    And why would you think that? If you can come up with specific examples, I'm all ears because I think such discussion would be very interesting and I'm eager to learn more about these things.

    I think you're confusing business with charity, and calling people's reactions FUD is a useful way of prodding things in the right direction, sometimes.

    I spoke up for a specific purpose. Frankly, I was disappointed when I was finally able to load up B3D only to find the same kind of crap that usually goes on at HardForums, namely, a discussion with no objectivity, no technical insight. It seems that if people were really enthusiastic about hardware and graphics, then they'd take the time and effort to learn the important details, and not make idle speculation.

    Other people are entitled to their opinions. I'm entitled to thinking their opinions are worthless. Frankly, I wrote the post to try to move debate in a different direction and express my frustrations. What I should have done was stick to the first part and forget about the second, and in that, I'm guilty of being as lame as all the other people expressing worthless opinions. :oops:

    There I've said it, can we move on to a more interesting discussion now? :wink:
     
  7. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Well, the problem with how Doom3 was handled, is that we have no idea what optimizations may or may not have been done. Specifically:

    1) The reviewers had little time with the demo
    2) They could not even post screenshots for scrutiny.

    There is room for both kinds of debate here. This Doom3 demo situation touches on both.

I think everyone, consumers and IHVs, expects at least a "fair" comparison. Given that ATI (or other vendors, for that matter) had no indication that there would be a public benchmarking of Doom3, while nVidia was well aware of it, I would say that is grounds for calling it unfair, both to the other IHVs and to consumers.

No, I want Carmack to give ATI (or any other hardware vendor on which the tests will be run) the opportunity to agree or disagree with Carmack's assessment.

    Carmack can only know one thing: how close his own code is to being complete and optimized. He really doesn't know what else ATI can do to improve performance on his code with their drivers, etc. And to release the benchmark in this fashion is just "wrong", IMO.

What was planned from day one? That NV3x would not run the ARB2 path in benchmark mode? What has been planned from day one is that the NV3x path would be optimal for NV3x hardware, but that NV3x hardware would also run the complete ARB2 path as well.

    It is very obvious to me that the ARB2 path was disabled by nvidia on purpose, precisely to prevent a direct apples to apples comparison.

    To be clear, I do agree that ATI running ARB2 vs. nVidia running NV3x paths (with commentary on any quality difference) is a valid comparison. However, having both run the ARB2 path is also a valid comparison...one that was not able to be performed.



    I don't understand your question. At least one of the 3 sites definitely tried and documented that try of a separate path for NV35 on the ARB2 path. Why would that same site, if they tried the R200 path, not document that as well?

    And why shouldn't Carmack give his stamp of approval?

    ATI was not aware that this was going to be tested and publicized until it was already done. That's the point.

    Yes, but the R300 core series of products have been shipping in quantity for a long time now. ATI doesn't have quite the same luxury as nVidia to dedicate lots of resources for the driver for that core to Doom3 which won't be shipping for many months yet. They have other real shipping games out there that need tweaks and bug fixes. And given that Doom3 is certainly up and working on R30x in at least a very solid fashion, that's good enough for Id's development purposes until the game is getting closer to shipping, at which time ATI can start getting heavy into serious tweaking / optimizing for that game / engine.

Again, the point is, with the way this benchmark was released and controlled, we can't do any such thing. That's the problem. No screenshots, no follow-up, nada.

All we have are some indications from OTHER GL apps that aniso and AA might have been fiddled with. I do hope there is further investigation into that, but as for how that impacts Doom3? WE WON'T KNOW, because it won't be re-tested any time soon, nor will we have screenshots to look at and judge.
     
  8. BenSkywalker

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    823
    Likes Received:
    5
Which could be relatively fairly deduced. If 3DM2K3 had chosen to have one 1.1 and one 1.4 test it would have been a decent leveling.

You mean, ignore a feature that ends up shipping in a version of DX other than the one you are targeting? Should we criticize/penalize DX9 level boards because they don't support DX9.1? PS 1.4 favors ATi hardware, which is what I have been talking about all along. You know it, I know it, and pretty much everyone else knows it (that posts here, anyway).

It is entirely relevant. Given the context of 3DM2K3, it is supposed to be a bench that stresses vid cards and gives an indicator of how boards will stack up in games now and in the future. For their DX8 level test they utilize a DX8.1 specific feature, one that is nigh MIA in games, that favors ATi hardware. This tilts the field in favor of ATi. Is it their right to do so? Absolutely. Just as it is Carmack's right to optimize his game however he sees fit, and as it is fine for UbiSoft to optimize SC however they see fit.

    So would you say then that any benches of games that request WBuffer should be invalid for R3x0 boards?

    Run the game with low quality settings to compensate for ATi's missing features? If anything the game should be run with both high quality settings and then benched again with low quality settings. Why not run Doom3 benches at the lowest quality setting possible? After all, the ARB2 path is not entirely equal even between nV and ATi, so do you think it is valid in the least to consider running it at the lowest settings to level the playing field?

    The DX8 refrast doesn't support PS 1.4 either. It should work fine using the DX 8.1 refrast.

    Joe-

    Was it fair for ATi to release the Doom3 alpha? If nVidia pulls the same sort of stunt, I would expect they would get the same treatment from id.
     
  9. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Ah, so now you are also assuming that ATI did leak it. (Which is my suspicion as well.) However, you seem to take it further that ATI purposely leaked it (not some dolt employee)? Do you also know what action ATI may have taken against that employee? If ATI fired that employee, is that not enough?

    Has Id publically stated who leaked the demo? Id did more damage to their own credibility with this stunt than anyone else.

Is it fair for nVidia to do this after ATI bent over backwards to get the game running for them on a card that could actually RUN THE DAMN THING at a decent clip at high quality for last year's E3? Don't you think that in all that rush to get the demo up and running (for the benefit of id as well as ATI), in all that confusion and late nights of work, passing between who knows how many different PCs and engineers, the demo could have gotten into the wrong hands at some point, as the ultimate source of the leak?

To be clear, if someone at ATI did leak it, it is ultimately ATI's responsibility. But when both ATI and id are gunning to get it up and running for the big show, it's easy to see how such a thing could happen.

    Id handled this with very bad form. They like to portray themselves as being "fair" and "for the gamer" etc. That's not the way I see it. Id, at the moment, is nVidia's bitch.

    At least you seem to agree that this isn't fair....you just think that it's legitimate "pay-back".
     
  10. BenSkywalker

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    823
    Likes Received:
    5
    Is there anyone who doesn't?

    ATi isn't an individual, it is a group of people. When a BS PR statement is released from nVidia that some dolt came up with who gets hammered for it? Same thing.

Where I work, if we are contracting another company and one employee who works for them leaks some of our confidential information, that company no longer has our business, period. We expect the same from our customers (we screw up, we lose them).

There are dozens/hundreds of games shown off at E3 each year, and almost all of them have the same hectic schedule to get ready. If ATi couldn't handle it for last year's E3 without losing control over it, why should id expect them to pull it off this time?

The question is whether being for the gamer is exclusive of working closely with nVidia. Looking at marketshare, there is good reason to see why it would be the case. It was the same situation when 3dfx was the big dog in the gaming market. I have a version of GLQuake that won't run under OpenGL without a Glide wrapper by default. Were they 3dfx's bitch back then? Yes. And they were for the gamer too.

Running Doom3 benches isn't fair, running SC benches isn't fair, using 3DM2K3 isn't fair, running Quake3 benches isn't fair, running UT benches isn't fair. Life isn't fair. I don't live in a world that's fair, I don't play games from a world that's fair, and I don't buy hardware from a world that's fair. What I am far more interested in is seeing representative information. If Half-Life2 comes out and runs five times faster on S3 hardware than it does on anything else, obviously it isn't fair, but it may get me to buy an S3 video card anyway.

When I read B3D's reviews I expect them to be as fair as possible, as the goal is one of 3D technology. When I read Anand's or Tom's (etc.) I want to see data representative of what I would use the board for.

I'll ask this: can anyone tell me of a single time when they, as a consumer and not a reviewer, changed the settings they had determined to be optimal for their rig in a game they were playing because it wasn't fair in relation to another board? Think about it.
     
  11. andypski

    Regular

    Joined:
    May 20, 2002
    Messages:
    584
    Likes Received:
    28
    Location:
    Santa Clara
    Of course a DX9 board will be penalized in a DX9.1 benchmark if it doesn't support a feature required for that benchmark. Just as a DX8 board may be penalised in benchmarks written after the release of DX8.1 that support features that they do not possess. Just what is your point here?

    PS1.4 should be able to favour all advanced hardware created after its specification was finalised, and writing apps that use it should give you better performance. Why not?

    The PS1.4 test tilts the field in favour of any vendor who chose to create hardware that implements the specification in an efficient manner, and it excludes nobody since anyone is free to create a piece of hardware supporting PS1.4, and the specifications of PS1.4 have been around for a long time. Whether other informed vendors chose to take the advantages offered by providing fast support for this shader version, or simply chose to ignore such support and provide a minimal level of useability was entirely up to them. Futuremark are not the only developer that uses PS1.4.

    Where did you get that idea from? W-buffers are a completely optional feature of the API, not a compulsory one like PS1.4, and they always have been. Developers are free to ignore this fact (perhaps misunderstanding the difference between optional and compulsory).

    If someone writes a bench that requires a W-buffer then it simply won't function correctly on a lot of hardware. As such it would not produce valid results on an R3x0 board, so yes - it would be an invalid benchmark for that hardware.

    If at some future time W-buffers become compulsory then I might expect to see new benchmarks using them, and perhaps having some fallback for earlier cards that do not have W-buffers to show how they perform. It seems to me that there should be no problem with that.

    There is no missing feature here. R3x0 can do shadow buffers - we even have demonstrations of shadow buffers running on R200s using PS1.4. It is only the fact that AFAIK in Splinter Cell the shadow buffers are implemented through a non-standard, unsupported and undocumented extension to the API behaviour that prevents them from being used.

    I have already said that I have no issue with developers trying to write the fastest possible code, and my comments had nothing to do with Doom3 per se. It was what I perceived as your poor choice of a comparative example that prompted my original comments.

    To the best of my knowledge it still doesn't work through that path using the DX8.1 refrast or even the DX9 refrast (although I admit I could be mistaken).

    The only shadow buffer support officially exposed up to and including in DX9 AFAIK requires the shadow buffering code to be explicitly written using appropriate pixel shaders, not using implicit behaviour on one vendor's hardware.

    [edit]Toned down a tad as it seemed unnecessarily provocative on further reading. [/edit]
     
  12. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
     
  13. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Like I said, ATI as a company is ultimately responsible if one of their employees is responsible. But you have to consider all circumstances surrounding what happened when determining the course of action.

    A blanket statement like that is pure bullshit. It depends on the company, and it depends on what was leaked, and what impact it has. At my company, when stuff like that happens, we consider those factors, among others, when deciding a course of action.

    I bet if your company is, say, "Abit" and the company that "leaked" your information was "Intel", you wouldn't stop doing business with Intel.

Wrong. There is ONE game "like Doom3" each year, one that carries the expectations of Doom3, and its showing was compounded by the fact that there was really only one brand-new, fresh-back-from-the-fab part that would run it well, if they could get it up and running in time. That is the combination of Doom3 and R300 at last year's E3.

Because the circumstances are completely different, and ATI made whatever changes were necessary to prevent it from happening again? (Like what companies such as mine do when we "do something wrong": we admit the mistake and explain how we addressed it.)


Market share is a good reason to work closely with a vendor. It's not a good reason to put other vendors at an artificial disadvantage.

Yes, they were. 3dfx was the only one pursuing GL development with id at the time. 3dfx is the reason why GLQuake existed, because Carmack said "I don't want to use D3D, and I don't really want another proprietary API, what can we do?" And 3dfx said "hey, we'll make enough of a GL driver so you can code in GL."

    So that means one shouldn't attempt to address the things that aren't?

    The question is...do you care about doing anything about it.

    And your point?

Why? Life isn't fair. Why should assessment of 3D technology be any different? :roll:

    I thought about it, and I'm trying to figure out what this has to do with Id not giving ATI advance notice about releasing a Doom3 benchmark, and the opportunity to do some specific optimizations, in the interest of fairness.
     
  14. Luminescent

    Veteran

    Joined:
    Aug 4, 2002
    Messages:
    1,036
    Likes Received:
    0
    Location:
    Miami, Fl
I am using the Cat 3.3 drivers for my 9800pro and I noticed a fair performance boost when playing the Doom 3 alpha that I have, as well as small gains in game tests 2 and 3 of 3DMark03. Seems they swapped some optimizations for other performance gains.
     
  15. demalion

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,024
    Likes Received:
    1
    Location:
    CT
    I don't think you should presume to judge the thread and treat it with contempt according to your evaluation. That doesn't improve it, and you're just persisting in making it deviate from what you propose you desire. Your contempt is your own conceit, please don't inflict it on the forum as the only truth worth stating.

    So you decree. Does that decree do anything to address that problem you think is there? Does it change that some people disagree with your evaluation and have reason to do so? I propose to you that starting a discussion about what you wanted with comments to discuss would have worked towards achieving what you wanted, and that complaining and doing the same exact thing, except with your own opinion on "morality" instead merely prompts, at best, people to tell you exactly why they disagree.

    The opinion that what you term "morality" cannot be established is just an opinion, not an edict that entitles you to condemn everyone else accordingly without your standards being applied to yourself. Are you "listening" to what I say, by the way? If not, my own standards will make this my last reply at such length on this discussion you started.

    OK, and why are you right in where you establish that criteria? You are just proposing your own decision as right, and everyone who disagrees as wrong. Are you more than a bit egotistical, or can you recognize how unproductive that is...i.e., that people won't just line up and obey your telling them to "get a grip" as you define they should?

    You are substituting your own expectations in place of people who specifically disagree, and then telling them their opinions do not matter. Singularly and consistently useless, it seems to me. "The obvious choice here"? Ack!
What I said was that you presumed to speak for "everyone", and though you just said you disagree, it is my observation that this is exactly what you are continuing to do.

    Your commentary is improperly selective. Simplest description of the flaw in it: Doom 3 is close to its final form, outside of bug squashing, but the actual hardware and drivers are rather far removed from that which will be available at its launch. All of these products have bearing on your question, not just the first, as does the person's own opinion of "satisfactory". With each person determining what is "satisfactory" for a $500 purchase, and the issues that I brought up, that question was not answered.

However, perhaps it was answered for you and people who share your opinion at least that far, but I've already covered why I think your applying that evaluation universally is a problem.

    Eh? No, I'm simply maintaining that that question was not answered. I'd think my stating what seems to have been answered instead would have avoided the need for such a silly proposition as to what I wanted. :-?
OK, I'm supposed to guess at your particular meaning? Which product? Doom 3 designed for NV35? 44.03 designed for Doom 3? NV35 designed for Doom 3? None dispute my conclusion, so were you just agreeing?

    Yes.

No mention was made of changing the render path settings, and if you saw an indication, it would have been useful to just point it out to me instead of something as ridiculous as implying that I should implicitly trust reviewer thoroughness. :shock:

    In any case, from Anandtech, since my reference to HardOCP's commentary wasn't sufficient: "The only options that we could change in game were the quality settings which we could set to low, medium or high."

    For the HardOCP article in particular, I ask you to note again that Carmack provided the demo, he did not dictate the testing methodology or provide his "stamp of approval" beyond replacing the nVidia made demo.

    Why are you making up things at random and proposing them as actuality, and since when are reviewers above reproach? :shock:

    I'm talking about reality too. You seem to be implying that should or shouldn't don't matter at all, only what actually happens. That is pretty circular.
Are the criteria for should and shouldn't that you continue to propose "reality" rather than your own opinion? Or are you simply maintaining that once something has happened "in reality", criticism of it is not valid?

You mean wrong in your assumptions? I think so, and I don't recall having seen them jump all over any game or benchmark when their card was shown in an unfavorable light. I have, on the other hand, seen this specifically and at great length from nVidia, and also seen it from them in technical criticism of competing hardware. Therefore, it is my opinion that your assumption is based on ATI acting like nVidia, when I personally don't see justification for that.

    You are free to correct me, or even simply disagree, though I will consider that opinion unfounded without such correction.

    What is with the consistent introduction of these silly comments, like this one about Daikatana and Rampage? The comment in question was "Ati has had years to optimize their hardware and software", and it remains a comment that doesn't make sense to me when talking about Doom 3.

    What do you think the Cat 3.4 results showed, exactly, if not that ATI didn't have the opportunity to do some testing? What about the game issues for NV3x products which were conceived at the same time as the R300 and R350? Are you instead proposing that ATI's R300 drivers last year were just as optimized for Doom 3? What about the performance changes since the E3 presentation that seem to directly contradict this?

    Who said detriment of everything else? Please step back and stop making up the viewpoint you are attacking. I consider the NV35 Doom 3 results valid and representative of final performance with the game for that path. What I was proposing as invalid was the presentation of Cat 3.2 (recognizing only 128MB according to ATI) versus Cat 3.4 (driver performance improvement + recognition of 256MB, but having issues with the Doom 3 beta build selected), and no discussion or apparent attempt to use the R200 path for the R300.
    As the NV30 path is a direct mapping to nVidia hardware, I consider ATI unfairly disadvantaged with the Cat 3.4 performance issues displayed, with no mention of the apparent issues the NV3x has with that same path (according to what I recall from elsewhere). Doom 3's bugs, and both ATI's and nVidia's driver bugs for the ARB fragment shader extension, were not fairly discussed, nor was it clarified or explored, apparently, that the R200 path might not have had the issues with Cat 3.4.

Now, I think the NV35 might have shown similar performance leadership anyway, and I'm not surprised, given Doom 3 is directly targeted at "PS1.3 + better texture access" level shaders (the ideal situation for the NV35). My complaint is about what I regard as some significant failings in the information provided, which it occurs to me wouldn't have conflicted with their focus on getting the gaming exclusive.

    Well, if you can see why I have a problem with the other parts, you can maybe see how your goal would have been served by focusing on this discussion a bit more.

    To answer: Consider that the E3 results last year were shown with lower settings than used in these benchmarks, and were achieving around 30 fps (IIRC). Consider that with later drivers the leaked demo used for that presentation delivers better performance for the R300 family. Finally, consider that one reason Doom 3 is still not released and seems to perform even better than that leaked demo is that there are significant changes and optimizations since even one year ago, and that the interaction between the latest drivers, hardware, and the latest Doom 3 build is not something that was or could be determined "years" ago.

    If you want to continue with a discussion of this nature, that would be a good thing, but other people will still have problems with details of the benchmarking and remain singularly unconvinced by your telling them to "get a grip". :-? Next time, try just asking for such a discussion?

No, ATI has done and continues to do things in their self-interest; those things just don't seem to have included what you propose in recent history, in the same time period where nVidia has done them rather prominently. The problem here is that you aren't just stating that they are a business, you are proposing that they would have taken a specific action, and you don't seem to have a basis for it. What fits their pattern is to address this type of issue with driver updates, not to start a FUD campaign of the nature you just randomly decided to propose.

You seem singularly unwilling to base your expectations on actual observations. At least, that is my opinion; you are free to correct me.

    That was a remarkably fruitless way to go about it. People react adversely to being told to get a grip. However, some people react the same way to detailed reasoning as to why you disagree with them as they would to posts like yours, but I happen to think it is better if their reaction is their fault, rather than mine. How about yourself?
    Didn't stop you from repeating those opinions again, while again labelling differing opinions as useless, I note. :(

    Sure, just stop making the comments you've already established as useless. I've tried to illustrate why I think they are, instead of just flaming you. It is up to you how you react to that.
     
  16. BenSkywalker

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    823
    Likes Received:
    5
    Joe-

"Where I work" was the statement that began the quote. The information that we handle for our customers is financial in nature. Not only will leaking it get you fired, it could easily land you in jail. Why don't you try working where I do and leak some information, then come back with your rhetoric about a BS statement, although that could be in five to ten years :)

    I'm sure Rockstar was terrified that Doom3 would outsell GTA Vice City. Do you follow E3 at all? Overall, PC gaming is pushed in to the corner as its marketshare tends to dictate. Why don't you ask some of the console developers what the weeks leading up to E3 are like. Look at the mainstream gaming press, Doom3 isn't nearly the topic that GTA, Zelda, MGS and the FF titles are. For the PC gaming industry Doom3 was huge because of the visuals, the same way Quake3 was huge. Quake3 sold, in relative terms compared to the real big titles in the gaming industry, poorly.

    Which was a Glide wrapper that wouldn't work under OpenGL for some time.

    I never said id wasn't responsible for making it less fair than it could have been.

    Politics are always at play. Is there some reason to believe that nV will not continue to get special attention from Carmack and crew?

    As fair as possible, not actually fair. There is a big difference.

    Andy-

    I'm talking about DX8 parts here; apologies for not being clear on that point. Most of the rest of your points regarding this aspect revolve around DX9-class boards, it appears. When 3DM2K3 launched, nV had no DX9-level part.

    And proprietary extensions are an option for OpenGL. ATi is free to utilize their own extensions offering comparable functionality to nV's. Their hardware lacking some of those features may be an issue, but like WBuffer it is up to the vendor.

    Looking through the old DX8 docs a bit and comparing them to the information on what they are doing with SC's shadows for the NV boards, I'm not seeing what isn't supported even under 8.0. Actually, it appears that the NV-specific version is much simpler. I don't see why ATi doesn't have support...?

    Doom-

    It should run at the precision required to get the end results needed. If people can't see any differences (and when I say any I mean any; if there are any visible differences then they should up their precision), then it is good enough.
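
    The point about running only at the precision the end result needs can be sketched with a quick illustration (a hypothetical Python example, not anything from the drivers or shaders under discussion): storing a fragment colour component at 16-bit half precision introduces a quantisation error orders of magnitude larger than at 32-bit single precision, which is exactly the kind of difference that may or may not become visible on screen.

    ```python
    import struct

    def roundtrip(value: float, fmt: str) -> float:
        """Pack a float at the given precision and read it back."""
        return struct.unpack(fmt, struct.pack(fmt, value))[0]

    color = 0.2  # a colour component in [0, 1]
    err_half = abs(color - roundtrip(color, "<e"))    # 16-bit half precision
    err_single = abs(color - roundtrip(color, "<f"))  # 32-bit single precision

    # The half-precision quantisation error is far larger than the
    # single-precision one; whether it is *visible* depends on the scene.
    print(err_half, err_single)
    ```

    If the half-precision error stays below what the display and the viewer can resolve, the lower precision is, by the argument above, good enough.
    
    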

    He also states-

    You think that everything the ARB decides is without disagreement? :shock:
     
  17. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    E3 Doom 3 video and screenshots up: http://www.beyond3d.com/forum/viewforum.php?f=7

    The video is very impressive - after seeing that, I don't think it can be stressed enough just how important it is for IHVs to produce products that perform well in this title!

    MuFu.
     
  18. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    71
    Right...the point being that you cannot apply this logic to the relationship between id and ATI. It's completely different. You can no more assert that someone at ATI should go to jail than you can assert it's "fair" for id to do what it did to ATI. It all depends on the specific circumstances of the situation, of which neither you nor I are privy to.

    We are talking about PC games, with different hardware platforms to play games on. You think any console developer is worried about an executable demo being leaked for a console whose hardware was just made available a week or so earlier? Of course everyone is frenzied to get things up and running for an E3 "demo", PC and console developers alike.

    The point is, the Doom3 / R300 situation at E3 was unique.

    What is your point in all of this?

    In terms of a game selling PC HARDWARE, which single upcoming game do you believe is most important? Isn't it obvious?

    Right...id was 3dfx's bitch at the time.

    Then what exactly are you arguing against?

    Yes, if ATI's hardware is seen by the public as the superior platform for Doom3. If that happens, Carmack may cater more to ATI. If Carmack puts up artificial road-blocks to help prevent that from happening, then it's "not fair."
     
  19. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Not that I've followed this discussion, but how does this differ from 3DMark2001, just with differing vendors?
     
  20. boobs

    Newcomer

    Joined:
    Jan 27, 2003
    Messages:
    66
    Likes Received:
    0
    I've made my points. I stand by them.

    Does anyone care to take a guess at what Ati will be able to do in their next driver to improve performance over the 3.2 drivers?
     