boobs said:
...
If that was the truth, then this thread would have taken on a completely different tone.
I don't think you should presume to judge the thread and treat it with contempt according to your own evaluation. That doesn't improve it, and you're only persisting in making it deviate from what you say you want. Your contempt is your own conceit; please don't inflict it on the forum as if it were the only truth worth stating.
Instead of discussing various conspiracy theories, people might have taken the time to discuss what optimizations Nvidia's hardware/software might have, what ATI might have done to optimize their software, etc., which, given the membership of this forum, might have been very interesting and informative. Instead, it degenerated into a completely academic debate about the morality of something that has no clear grounds for establishing a moral bearing.
So you decree. Does that decree do anything to address the problem you think is there? Does it change the fact that some people disagree with your evaluation and have reason to do so? I put it to you that starting a discussion about what you wanted, with concrete points to discuss, would have worked towards achieving it, and that complaining while doing the exact same thing, only with your own opinion on "morality" substituted, merely prompts people, at best, to tell you exactly why they disagree.
The opinion that what you term "morality" cannot be established is just an opinion, not an edict that entitles you to condemn everyone else accordingly without your standards being applied to yourself. Are you "listening" to what I say, by the way? If not, my own standards will make this my last reply at such length on this discussion you started.
Well, you're trying to say "everyone" like you have the exclusive right to speak for them.
Since this has turned into a morality debate, there's a need to establish a moral rubric.
OK, and why is your choice of criteria the right one? You are just proposing your own decision as right, and everyone who disagrees as wrong. Are you more than a bit egotistical, or can you recognize how unproductive that is...i.e., that people won't just line up and obey when you tell them to "get a grip" as you define they should?
The obvious choice here is the expectations of the readership community versus what was being served.
You are substituting your own expectations for those of people who specifically disagree, and then telling them their opinions do not matter. Singularly and consistently useless, it seems to me. "The obvious choice here"? Ack!
I didn't say that "everyone" has the exclusive right to speak for them, but if one were to establish "morality" for video card tests, then the consumer has priority.
What I said was that you presumed to speak for "everyone", and though you just said you disagree, it is my observation that this is exactly what you are continuing to do.
You see, that's exactly the question that was not answered. The only thing that was answered was the performance of Doom 3 today for the fastest path for each card; the game itself is months from release.
No, that question WAS answered. You are holding the answer to a standard that no answer could fulfill. John Carmack indicated that this test would be indicative of the performance of the final products.
Your commentary is improperly selective. The simplest description of the flaw in it: Doom 3 is close to its final form, outside of bug squashing, but the actual hardware and drivers are rather far removed from what will be available at its launch. All of these products have a bearing on your question, not just the first, as does each person's own opinion of "satisfactory". With each person determining what is "satisfactory" for a $500 purchase, and the issues that I brought up, that question was not answered.
However, perhaps it was answered for you and people who share your opinion at least that far, but I've already covered why I think applying that evaluation universally is a problem.
What do you want these people to do? Sign contractual guarantees that things would never change?
Eh? No, I'm simply maintaining that that question was not answered. I'd have thought that my stating what seems to have been answered instead would have avoided the need for such a silly guess at what I wanted.
...
Planned from day one that product was designed.
OK, so I'm supposed to guess at your particular meaning? Which product? Doom 3 designed for NV35? 44.03 designed for Doom 3? NV35 designed for Doom 3? None of these disputes my conclusion, so were you just agreeing?
Come to think of it, it might be a very significant issue that the Cat 3.4 drivers were used as they were (even given the priorities of HardOCP) to run the ARB path, and the R200 path wasn't tried. Did they try that method of overcoming the Cat 3.4 issue? I don't recall any mention of it.
Do you think that none of the 3 sites that got this were smart enough to go into a menu and try a different setting?
Yes.
No mention was made of changing the render path settings, and if you saw any indication of it, it would have been useful to simply point it out to me instead of something as ridiculous as implying that I should implicitly trust reviewer thoroughness.
In any case, from Anandtech, since my reference to HardOCP's commentary wasn't sufficient: "The only options that we could change in game were the quality settings which we could set to low, medium or high."
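As an aside on how such a check could have been done: in the public builds of Doom 3 that followed, the render path can be forced from the console or a config file rather than any in-game menu. I can't confirm that the beta build given to the reviewers exposed the same cvars, but assuming it did, the test would have been roughly:

  r_renderer r200    // force the R200 path instead of ARB2 (other values include arb2 and nv30)
  vid_restart        // restart the renderer so the change takes effect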
Even if they made that horrible oversight, do you think that Carmack himself would then give his stamp of approval?
For the HardOCP article in particular, I ask you to note again that Carmack provided the demo; he did not dictate the testing methodology or provide his "stamp of approval" beyond replacing the nVidia-made demo.
Why are you making up things at random and proposing them as actuality, and since when are reviewers above reproach?
Actually, it should be the job of the reviewers who tested Doom III now to let you know, as that is the avenue for addressing the inherent imbalance of the testing situation that was presented.
I'm talking about reality, not who should or who shouldn't.
I'm talking about reality too. You seem to be implying that "should" or "shouldn't" don't matter at all, only what actually happens. That is pretty circular. Is the criterion for should and shouldn't that you keep proposing "reality" rather than your own opinion? Or are you simply maintaining that once something has happened "in reality", criticism of it is not valid?
In this case, ATI would likely find out first. I'm assuming that their marketing department would then jump all over it. Am I wrong on this?
You mean wrong in your assumptions? I think so, and I don't recall having seen them jump all over any game or benchmark when their card was shown unfavorably. I have, on the other hand, seen this specifically and at great length from nVidia, and also seen this from them in technical criticism of competing hardware. Therefore, it is my opinion that your assumption is based on ATI acting like nVidia, when I personally see no indication of justification for that.
You are free to correct me, or even simply disagree, though I will consider that opinion unfounded without such correction.
Hmm...this comment doesn't make sense to me. Are you under the impression that nVidia drivers are perfect, or do you recognize that driver development is an ongoing process? What about Doom 3 development...do you recognize it is still changing?
Drivers are continuously evolving, and so is DOOM III, but the basic concepts were nailed down at least over a year ago; otherwise, the development progress on DOOM III and the R9800 would look like the progress on Daikatana and Rampage.
What is with the consistent introduction of these silly comments, like this one about Daikatana and Rampage? The comment in question was "Ati has had years to optimize their hardware and software", and it remains a comment that doesn't make sense to me when talking about Doom 3.
What do you think the Cat 3.4 results showed, exactly, if not that ATI didn't have the opportunity to do some testing? What about the game issues for NV3x products which were conceived at the same time as the R300 and R350? Are you instead proposing that ATI's R300 drivers last year were just as optimized for Doom 3? What about the performance changes since the E3 presentation that seem to directly contradict this?
Well, nVidia arranged this showing of Doom 3; what do you think that says about their focus for the 44.03 drivers used? ATI personnel seem to indicate that they have no current builds for which they could have performed optimizations. In a sense, that's good: they seem to focus on games people are playing, and game-specific hand tuning was hindered. But it is bad because there are obvious issues they could have addressed in their latest drivers, and nVidia was advantaged by only having to successfully implement the direct expression of their hardware functionality via their own OpenGL extension.
No, actually, one of their focuses from day one, which is likely over a year ago, has been DOOM III. Now, if people can show where they could have specifically optimized drivers for DOOM III to the detriment of everything else, THAT would be interesting, but all this talk is worthless speculation.
Who said detriment of everything else? Please step back and stop making up the viewpoint you are attacking. I consider the NV35 Doom 3 results valid and representative of final performance with the game for that path. What I was proposing as invalid was the presentation of Cat 3.2 (recognizing only 128MB according to ATI) versus Cat 3.4 (driver performance improvement + recognition of 256MB, but having issues with the Doom 3 beta build selected), and no discussion or apparent attempt to use the R200 path for the R300.
As the NV30 path is a direct mapping to nVidia hardware, I consider ATI unfairly disadvantaged by the Cat 3.4 performance issues displayed, with no mention of the apparent issues the NV3x itself has with the ARB path ATI was run on (according to what I recall from elsewhere). Doom 3's bugs, and both ATI's and nVidia's driver bugs for the ARB fragment shader extension, were not fairly discussed; nor was it clarified or explored, apparently, whether the R200 path might have avoided the issues with Cat 3.4.
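To illustrate what I mean by "direct mapping", here is a minimal sketch, assuming the standard syntax of the two OpenGL extensions involved; these are illustrative fragment programs, not actual Doom 3 shader code. The ARB path's ARB_fragment_program is vendor-neutral and runs everything at one precision (fp24 on the R300, fp32 on the NV3x, which is where its trouble starts), while the NV30 path's NV_fragment_program lets each instruction pick the fp32/fp16/fx12 precision the nVidia hardware natively prefers:

  !!ARBfp1.0
  # ARB path: one instruction set, one precision for all cards
  TEMP base;
  TEX base, fragment.texcoord[0], texture[0], 2D;
  MUL result.color, base, fragment.color;
  END

  !!FP1.0
  # NV30 path: per-instruction precision, here the cheap half type
  TEX H0, f[TEX0], TEX0, 2D;
  MULH o[COLH], H0, f[COL0];
  END

Seen that way, the asymmetry in the comparison is easier to spot: the NV30 path only has to express what the hardware already does, while the ARB path leans on each vendor's still-maturing fragment shader support.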
Now, I think the NV35 might have shown similar performance leadership anyway, and I'm not surprised, given that Doom 3 directly targets "PS 1.3 + better texture access" level shaders (the ideal situation for the NV35). My complaint is about what I regard as some significant failings in the information provided, failings that it occurs to me wouldn't have conflicted with their focus on getting the gaming exclusive.
Why are you bringing up "years" when talking about Doom 3 anyway? I tend to think that yes, an extra 2 weeks with a recent build would have made a more than slight difference to Doom 3 performance for the Cat 3.4 drivers, which don't seem to have been made with Doom 3 in mind at all.
And why would you think that? If you can come up with specific examples, I'm all ears because I think such discussion would be very interesting and I'm eager to learn more about these things.
Well, if you can see why I have a problem with the other parts, you can maybe see how your goal would have been served by focusing on this discussion a bit more.
To answer: Consider that the E3 results last year were shown at lower settings than were used in these benchmarks, and were achieving around 30 fps (IIRC). Consider that, with later drivers, the leaked demo used for that presentation delivers better performance for the R300 family. Finally, consider that one reason Doom 3 is still not released, and seems to perform even better than that leaked demo, is that there have been significant changes and optimizations since even one year ago, and that the interaction between the latest drivers, hardware, and the latest Doom 3 build is not something that was or could have been determined "years" ago.
If you want to continue with a discussion of this nature, that would be a good thing, but other people will still have problems with details of the benchmarking and remain singularly unconvinced by your telling them to "get a grip".
Next time, try just asking for such a discussion?
I think you're confusing ATI with nVidia, as there do remain some rather distinctive differences in their marketing. I also don't think simply labelling people's reactions as FUD is a very useful way to discuss opposing viewpoints.
I think you're confusing business with charity,
No, ATI has done and continues to do things in their self-interest; those things just don't seem to have included what you propose, at least in recent history, while in the same time period they have been rather prominent for nVidia. The problem here is that you aren't just stating that they are a business; you are proposing that they would have taken a specific action, and you don't seem to have a basis for it. What fits their pattern is to address this type of issue with driver updates, not to start a FUD campaign of the nature you just randomly decided to propose that they would.
and calling people's reactions FUD is a useful way of prodding things in the right direction, sometimes.
You seem singularly unwilling to base your expectations on actual observations. At least, that is my opinion; you are free to correct me.
...
Other people are entitled to their opinions. I'm entitled to think their opinions are worthless. Frankly, I wrote the post to try to move the debate in a different direction and express my frustrations.
That was a remarkably fruitless way to go about it. People react adversely to being told to get a grip. However, some people react to detailed reasoning about why you disagree with them the same way they would to posts like yours, but I happen to think it is better if their reaction is their fault rather than mine. How about yourself?
What I should have done was stick to the first part and forget about the second, and in that, I'm guilty of being as lame as all the other people expressing worthless opinions.
That didn't stop you from repeating those opinions again, while again labelling differing opinions as useless, I note.
There, I've said it. Can we move on to a more interesting discussion now?
Sure, just stop making the comments you've already established as useless. I've tried to illustrate why I think they are, instead of just flaming you. It is up to you how you react to that.