Another ATI vs NVIDIA review from GamersDepot

DaveBaumann said:
Bouncing Zabaglione Bros. said:
The Nvidia D3 benchmark is another matter - IMO, that's just JC being childish and getting his own back on ATI for leaking the D3 beta.

I think people are forgetting there is a publisher involved here as well.
I don't know much about the business end of games... how much say/pull/power does the publisher have over the game maker? :?:
 
DaveBaumann said:
Since the leak the D3 builds were changed such that a hardware dongle is required to allow the current beta builds to run - since the leak ATI have neither had the dongle nor received any internal builds.


So is Carmack going to do all the optimising, testing and bug finding for the ATI owners who make up 97 percent of the $300+ card market and will play D3 better than everything else? Or is JC going to be childish by saying "Screw the ATI owners" at the same time as everyone is congratulating him for going out of his way to code a special path so NV3x owners can still play the game at more than 3 fps?

DaveBaumann said:
However, I think JC may have been fairly unhappy about the manner in which the benchmarks were conducted and the types of comments its generated subsequently, which could be one of the contributing factors why he might not include a benchmark.

Looked like he got shafted to me. Those demos were all about how great the NV3x was, not about D3. Even then, the Nvidia cheats were in evidence, even down to Nvidia wanting to use their own demo (static clip planes in the drivers anyone?), up against unoptimised 9800 Pros with the wrong drivers installed.

Carmack should be unhappy - by tying himself closely to Nvidia, Carmack is just getting tarred with the same brush, which seems to be happening a lot to everyone around Nvidia at the moment.
 
Bouncing Zabaglione Bros. said:
Carmack should be unhappy - by tying himself closely to Nvidia, Carmack is just getting tarred with the same brush, which seems to be happening a lot to everyone around Nvidia at the moment.
I don't see him expressing ANY signs of unhappiness about his choice of allegiances....that's pretty much the whole root of me angst against him at the moment. :(
 
John Reynolds said:
Why don't we all wait to see just how noticeable the visual differences are between FX12 and FP16/24/32 before going on the attack?

Exactly. I'm waiting expectantly and with bated breath to see the first game engine requiring fp24 accuracy, and I have a feeling that I might be waiting for some time, so holding my breath wouldn't be wise... :) But it's a shame, really, since the current crop of game engines is designed to do little more than provide 32-bit integer color ranges through the fp pipelines. When we see a game actually providing 1,000+ shades of color on screen versus 32-bit integer's 256, the difference, I would think, would be staggering. But I don't expect to see this until fp is well and truly driven down into most market segments, developers get their tools, etc.
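To put rough numbers on that (a back-of-the-envelope sketch of my own, using the commonly quoted 10-bit and 16-bit mantissas for FP16 and FP24; the exact counts are illustrative, not from any spec):

// Sketch: distinct intensity levels per colour channel for each format.
// A float format with an m-bit mantissa hits 2^m distinct values in the
// interval [0.5, 1.0) alone, and the same count again in every lower
// power-of-two interval, so precision near black is finer still.
#include <cstdio>

long levelsInTopInterval(int mantissaBits) { return 1L << mantissaBits; }

int main() {
    std::printf("8-bit integer          : 256 uniform levels per channel\n");
    std::printf("FP16 (10-bit mantissa) : %ld levels in [0.5, 1.0) alone\n",
                levelsInTopInterval(10));
    std::printf("FP24 (16-bit mantissa) : %ld levels in [0.5, 1.0) alone\n",
                levelsInTopInterval(16));
    return 0;
}

Even counting only that top interval, FP16 gives 1024 steps and FP24 gives 65536 against integer's 256, which is where the staggering difference would come from once art paths and tools actually use the range.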
 
digitalwanderer said:
Bouncing Zabaglione Bros. said:
Carmack should be unhappy - by tying himself closely to Nvidia, Carmack is just getting tarred with the same brush, which seems to be happening a lot to everyone around Nvidia at the moment.
I don't see him expressing ANY signs of unhappiness about his choice of allegiances....that's pretty much the whole root of me angst against him at the moment. :(

I think he's expressing that unhappiness by not providing the traditional ID benchmark. If it's true, he's simply withdrawing from the whole issue, while keeping his mouth shut in order to not have to justify anything to anyone.

I wouldn't be surprised if Activision is getting the same answer of "I'm just not doing a benchmark for D3" with no explanation or justification.
 
Bouncing Zabaglione Bros. said:
digitalwanderer said:
Bouncing Zabaglione Bros. said:
Carmack should be unhappy - by tying himself closely to Nvidia, Carmack is just getting tarred with the same brush, which seems to be happening a lot to everyone around Nvidia at the moment.
I don't see him expressing ANY signs of unhappiness about his choice of allegiances....that's pretty much the whole root of me angst against him at the moment. :(

I think he's expressing that unhappiness by not providing the traditional ID benchmark. If it's true, he's simply withdrawing from the whole issue, while keeping his mouth shut in order to not have to justify anything to anyone.

I wouldn't be surprised if Activision is getting the same answer of "I'm just not doing a benchmark for D3" with no explanation or justification.
Inaction is a decision too. :(
 
DeanoC said:
Was JC biased at E3 last year when it was shown on ATI hardware only?
Having a 5800 Ultra in his machine is probably a sign he is having to do extra work to get to a similar quality. I had a Kyro2 in my machine while on SH2 for quite a while; it wasn't actually a recommendation, quite the opposite. I spent much longer on the Kyro2 and Matrox G400 than I did on the ATI 9700 for SH2, so I guess I'm biased for writing all those 'custom' pipelines.

As for the higher precision argument, what? He has stated that he thinks higher precision is a good thing, but shock horror, his CURRENT engine doesn't need it. If you had actually listened to developers, we've clearly stated that higher precision only comes into play when the art path and lighting algorithms NEED it. He doesn't use it because his engine doesn't NEED it, because it's a fixed-point fragment processing engine, and by doing so millions of his customers get a better deal! It just so happens that one card doesn't do fixed-point fragment processing, so it has to run at a higher precision. That doesn't mean all the other cards should be forced to when it's NOT necessary for HIS artwork and lighting calculations. That's one of the reasons it would make a crappy benchmark: it has half a dozen different pipelines, all using the same basic art path and lighting model.

I prefer the ATI 9700, both as a developer and a gamer, but I don't feel the need to insult somebody who is making a BETTER game for his customers. It's a game developer's job to make the best game for whatever hardware is out there. JC should be praised for going the extra mile for all those GFFX owners, not insulted. Just the same as he did for those of us with hardware accelerators back in the Quake days, or releasing the source code for newbie devs to look at, etc.

Dean, I appreciate your insight on this, as well as Hyp-X's... I'm not saying there is bias here, although evidence has shown in the past that Carmack definitely had a preference when selecting his primary development IHV.
He also refers only to Nvidia cards when talking about future engines, and I have the original Doom 3 leak: there was already an NV30 custom path in the .ini file (so Nvidia wasn't relying on the ARB path in the initial engine design), six months before any FX cards were available.
Of late his .plan updates have been more politically correct, not just bantering about one IHV, but note the comments in his latest .plan update, where he states:

I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality (ATI continues to improve, though). For a typical consumer, I don't think the decision is at all clear cut at the moment.

So what I read from that is that he is stating there is no clear-cut winner between a 9700 and a 5800, when we already know the 5800 was cancelled by Nvidia (they did his thinking for him)... I could post some threads from other forums where DOOM 3 benchmarks from [H] and Anand are used as the Bible.
Nvidia felt it was so important they 'sponsored' the benchmarking for [H], even had their own 'demo' :LOL:

I've always stated that it is in the developer's best interest to ensure the title runs well on all hardware, but benchmarking (which is the part I am referring to) with the Doom 3 engine doesn't make logical sense. The workload must be the same for all cards to determine which card has the superior architecture.
I have no problem with developers ensuring it runs well on all hardware, in fact I totally AGREE... my problem is 'benchmark modes'.
As I said before, why does the ARB exist in OpenGL? I always thought the idea was to streamline the work for the developer... one code path for all cards... Nvidia is part of that process along with all the other IHVs.
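Just to make concrete what "one code path for all cards" versus vendor paths looks like in practice, here is a rough sketch of the kind of extension check an OpenGL engine ends up doing (the extension strings are real OpenGL extensions; the path names and helper are my own illustration, not id's code, and it assumes a current GL context):

// Probe the extension string and pick a fragment path, vendor by vendor.
#include <cstring>
#include <GL/gl.h>

enum FragmentPath { PATH_ARB2, PATH_NV30, PATH_R200, PATH_FIXED };

static bool hasExtension(const char* name) {
    // Classic (if simplistic) substring check against the extension string.
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != nullptr && std::strstr(ext, name) != nullptr;
}

FragmentPath chooseFragmentPath(bool preferVendorPath) {
    // NVIDIA-specific fragment programs (what an NV30-style path would target).
    if (preferVendorPath && hasExtension("GL_NV_fragment_program"))
        return PATH_NV30;
    // ATI's pre-ARB fragment shader extension (an R200-style path).
    if (preferVendorPath && hasExtension("GL_ATI_fragment_shader"))
        return PATH_R200;
    // The common path the ARB standardised: one code path for all DX9-class cards.
    if (hasExtension("GL_ARB_fragment_program"))
        return PATH_ARB2;
    return PATH_FIXED;  // last-resort fixed-function fallback
}

The point of contention is the preferVendorPath flag: fine for playing the game, but for an apples-to-apples benchmark you'd want every card forced down PATH_ARB2.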
 
Doomtrooper said:
...but benchmarking (which is the part I am referring to) with the Doom 3 engine doesn't make logical sense. The workload must be the same for all cards to determine which card has the superior architecture.

Well, I have to disagree in part here.

Mostly, because when you perform a "benchmark", the first question you have to ask is "what are you trying to benchmark?"

Benchmarking to determine the "superior architecture", as you put it, is one valid goal for any benchmark. Benchmarking to determine the "superior Doom3 card" is a different, though equally valid, goal for a Doom3 benchmark.

The problem, of course, comes with the typical web reviewer who is unable to distinguish between these two drastically different goals. So often you get "this card runs game X better, therefore it is the superior architecture / card."

So the long and short of it is, when Doom3 arrives the best head-to-head card reviews will show all cards running a variety of applicable code paths. NV3x cards should be tested on BOTH the NV30 and ARB2 paths. R3xx cards should be tested in ARB2, and with R200 if it turns out that the R200 path gives any significant performance increase.

Perhaps NV3x cards will be a bit faster on Doom3 with the NV30 path, than the competing R3xx cards will be on Doom3 with the ARB2 path. That is something I want to know, and is a valid comparison.

However, I ALSO want to know how NV30 cards perform using the ARB2 path, especially if there is a large discrepancy. Doing both tests / comparisons is the only way to get an overall picture of how each card performs with the game, AND how each card would perform when asked to do the same workload.
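In practice that boils down to a small benchmark matrix, something like this sketch (my own illustration; the console variable and timedemo names are placeholders, not id's actual commands):

// Run every card on every path it supports: best-case numbers AND same-workload numbers.
#include <cstdio>
#include <string>
#include <vector>

struct Card {
    std::string name;
    std::vector<std::string> paths;  // every code path this card can run
};

int main() {
    const std::vector<Card> cards = {
        { "GeForce FX 5800 Ultra", { "nv30", "arb2" } },
        { "Radeon 9800 Pro",       { "r200", "arb2" } },
    };
    for (const Card& card : cards)
        for (const std::string& path : card.paths)
            std::printf("%s  ->  +set r_renderpath %s +timedemo demo1\n",
                        card.name.c_str(), path.c_str());
    return 0;
}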

On a related note, for a synthetic benchmark like 3DMark that is designed to "stress the GPU to the max", I don't see much of any wiggle room at all for allowing different "code paths" that perform a different workload.
 
Nvidia cards will be faster in Doom3, there is no denying that. I questioned Kyle on his Doom 3 benchmarks, and from his replies he stated that when the game releases it will be benchmarked the way id Software 'auto detects' when it runs the game. The NV30 path is the default for all FX cards according to Carmack.
 
Doomtrooper said:
Nvidia cards will be faster in Doom3, there is no denying that. I questioned Kyle on his Doom 3 benchmarks, and from his replies he stated that when the game releases it will be benchmarked the way id Software 'auto detects' when it runs the game. The NV30 path is the default for all FX cards according to Carmack.

nVidia cards may in fact be faster. There's no reason why this shouldn't be known via benchmarking. There's also no reason why Doom3 shouldn't also be benchmarked with all cards using the ARB2 path, and if R3xx is much faster than NV3x, that should be known as well.

The "default" rendering path should be whatever rundering path the developer feels is the best for that architecture. That being said, the developer should also make any other relevant rendering paths available as options for the game, and for benchmarking.
 
Doomtrooper said:
Nvidia cards will be faster in Doom3, there is no denying that. I questioned Kyle on his Doom 3 benchmarks, and from his replies he stated that when the game releases it will be benchmarked the way id Software 'auto detects' when it runs the game. The NV30 path is the default for all FX cards according to Carmack.

You know, if true you could take that logic back a few years ago when games were either Glide or software and force 3dfx's competitors to run in 'software' mode and post those scores because that's what the game detects when it's installed. Why is apples to apples (or as close as possible) so hard for so many reviewers to grasp?
 
John Reynolds said:
Doomtrooper said:
Nvidia cards will be faster in Doom3, there is no denying that. I questioned Kyle on his Doom 3 benchmarks, and from his replies he stated that when the game releases it will be benchmarked the way id Software 'auto detects' when it runs the game. The NV30 path is the default for all FX cards according to Carmack.

You know, if true you could take that logic back a few years ago when games were either Glide or software and force 3dfx's competitors to run in 'software' mode and post those scores because that's what the game detects when it's installed. Why is apples to apples (or as close as possible) so hard for so many reviewers to grasp?
Because it tends to give them results that they don't like... ;)
 
digitalwanderer said:
Reverend said:
If you'll read what JC has said thus far, you'll know what video card he recommends without saying it.
Dumb question: why doesn't he just say it? I'm really glad you can read between the lines to know what JC means, but I'm used to dealing with people who have a hard time understanding what drivers even are... all they know about Doom3 and performance is that stupidly unbalanced benchmark, and from that they've concluded that the FX is a better card for D3.
Why doesn't he "just say it"? Because there are many GFFX owners/gamers out there and I doubt he wants to get into any "trouble" by recommending a non-GFFX card -- I have to imagine he'll be thinking "If I say buy a DX9 ATI card now, gamers are going to jump on me for not saying so earlier".

JC continues to refine DOOM3... but when he started out he had no idea the GFFX would have the limitations it has vis-a-vis ATI's DX9 offerings. It would be impossible for him to guesstimate future hardware's performance as he goes along in his development work. At any time during his development work, he will tell it like it is... and his recent .plan updates have been exactly that, telling folks what problems he's encountering in a matter-of-fact way... instead of the "just say it" way that video card enthusiasts like, or the way Gabe Newell says it of late. JC's a bit more into talking details than Gabe's "just say it" way.

I do agree that the old NVIDIA-arranged DOOM3 benchmarking is a little odd, not just from the POV that JC agreed to it but that id Software as a whole agreed to it as well, although many have speculated on the behind-the-scenes-get-back-at-ATI reason why this is so. At least I thought JC was right to reject the demo NVIDIA recorded themselves (that reviewers were given by NVIDIA). Surely that says something about JC.

digitalwanderer said:
Mebbe it is, mebbe it ain't; but JC is most definitely helping out nVidia right now with his silence.
That's the way you (and presumably others) read it... but, again, his .plan updates have been filled with important details about which card is giving him more work to do as well as where GFFX cards are falling behind performance-wise. That's not staying silent in my books.

digitalwanderer said:
Reverend said:
As well as not knowing precisely what JC's involvement with id Software is about.
Hmmmmm....I do believe he owns it and is the driving factor behind it, or something silly like that.

C'mon Anthony, what are you trying to say? That JC doesn't have the power of decision making at iD? Puh-lease!
I'm trying to say John Carmack is at id Software, a game development house. He and his fellow id Software colleagues make games in the hope that their games sell well. In order to do that, he, his id Software colleagues, as well as Activision, have to ensure that many gamers out there can play their games well, in terms of performance as well as available features. As for the decision making, what decision making would that be? That he has to make games according to the wishes expressed in his .plan updates made many, many months ago? That's inconsiderate thinking IMO. Surely it would be inconsiderate of John Carmack to think that "This is the way my engine works, it must not offer any IHV-specific codepaths because I don't want it this way, so all you potential engine licensees have to think the same way as I do".

I have been a great proponent of FP32 versus FP24 in some of my postings here.

That doesn't mean that I will make a game that will not run on FP24-only hardware, a game that I envision will be released in six months' time.

Tim Sweeney has said that his next-gen engine will not run on anything less than FP32. Some have ridiculed him for this, others have said it's a good thing. If, by the time the first game featuring this next-gen engine of his ships, you personally are still "stuck" with an R3x0, what will you be thinking? Come on, be honest... will it be "Shit, Tim Sweeney sucks!"... or will it be "Oh well, Tim Sweeney did say that my card won't be able to run this game... I have absolutely nothing to complain about."... ?

You want some cut-and-dry comments and decisions from developers. I think it is best that developers provide as many options as possible regardless of whatever "musings" they may have made months ago about "pushing the envelope". It would be no different if I were a hardware reviewer with many of the latest gee-whiz video cards lying around to play with, or a guy who has to decide which video card to buy. Providing many options is a Good Thing. Informing the public about how 3D technology should evolve is also a Good Thing. Don't make the mistake of thinking the two should be absolutely separated.

It amazes me just how simple-minded the public can be with regards to the many considerations involved in the game and 3D industry.

Doomtrooper said:
... I'm not saying there is bias here, although evidence has shown in the past that Carmack definitely had a preference when selecting his primary development IHV.
He also refers only to Nvidia cards when talking about future engines, and I have the original Doom 3 leak: there was already an NV30 custom path in the .ini file (so Nvidia wasn't relying on the ARB path in the initial engine design), six months before any FX cards were available.
I think a developer can choose what his primary work card is. And if he has a preference (and demonstrably so, by stating this time and again in public postings), then it could be down to a number of valid reasons, which that developer should hopefully also express in public postings. I don't think we should fault John Carmack for saying his experience has shown that NVIDIA has had better/best drivers. He said it, not many other developers have argued it. Experience and association history help. Carmack has had more good experiences developing with NVIDIA cards/drivers than with most others. I don't think we should fault Carmack for having a preference due to his experience.

Doomtrooper said:
John Carmack said:
I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality (ATI continues to improve, though). For a typical consumer, I don't think the decision is at all clear cut at the moment.
So what I read from that is that he is stating there is no clear-cut winner between a 9700 and a 5800, when we already know the 5800 was cancelled by Nvidia (they did his thinking for him)... I could post some threads from other forums where DOOM 3 benchmarks from [H] and Anand are used as the Bible.
I'm sorry but I have forgotten the dates of Carmack's posting of that, and the 5800's cancellation. Which came first?

Doomtrooper said:
I've always stated that it is in the developer's best interest to ensure the title runs well on all hardware, but benchmarking (which is the part I am referring to) with the Doom 3 engine doesn't make logical sense. The workload must be the same for all cards to determine which card has the superior architecture.
I have no problem with developers ensuring it runs well on all hardware, in fact I totally AGREE... my problem is 'benchmark modes'.
As I said before, why does the ARB exist in OpenGL? I always thought the idea was to streamline the work for the developer... one code path for all cards... Nvidia is part of that process along with all the other IHVs.
Your concern is a valid one, and it may go beyond that too. I'd hate to be an IHV that does not have as good a relationship with Carmack as NVIDIA apparently does... I mean, why have NVIDIA-specific codepaths and none for, say, PowerVR (provided PVR has the specific extensions)?

As for the "benchmarking modes", it comes down to a game ensuring all the different codepaths can be specified when benchmarking, and that reviewers benchmark using all the relevant codepaths as well as informing the public what the (image output) differences are.
 
Reverend said:
If, by the time the first game featuring this next-gen engine of his ships, you personally are still "stuck" with an R3x0, what will you be thinking? Come on, be honest... will it be "Shit, Tim Sweeney sucks!"... or will it be "Oh well, Tim Sweeney did say that my card won't be able to run this game... I have absolutely nothing to complain about."... ?
If I still have an R3xx card at that point, I won't get as offended if you call me a simpleton. ;)
 
Reverend said:
Doomtrooper said:
John Carmack said:
I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality (ATI continues to improve, though). For a typical consumer, I don't think the decision is at all clear cut at the moment.
So what I read from that is that he is stating there is no clear-cut winner between a 9700 and a 5800, when we already know the 5800 was cancelled by Nvidia (they did his thinking for him)... I could post some threads from other forums where DOOM 3 benchmarks from [H] and Anand are used as the Bible.
I'm sorry but I have forgotten the dates of Carmack's posting of that, and the 5800's cancellation. Which came first?

When someone says the decision between a 9700 and a 5800 isn't clear cut I have to question their objectivity (regardless of when the 5800 was cancelled). I don't think there is any question about which one of these two cards is better, and JC can't claim ignorance for his comment.
 
Okay, Carmack's .plan update that has what DT quoted was dated Jan 29, when the 5800 wasn't yet cancelled. In fact, that Jan 29 .plan update of his was, IIRC, not very long after the 5800 NDA expired.

As for questioning Carmack's objectivity, I don't think we know what cases (quality alone? performance alone? both being considered? for whom, the ultimate 3D enthusiasts or those who don't even know what floating point means and just want to play games?) he was referring to at the time of his development/experimentation, and especially if he meant only within DOOM3 and not any other game.
 
Reverend said:
Okay, Carmack's .plan update that has what DT quoted was dated Jan 29, when the 5800 wasn't yet cancelled. In fact, that Jan 29 .plan update of his was, IIRC, not very long after the 5800 NDA expired.

As for questioning Carmack's objectivity, I don't think we know what cases (quality alone? performance alone? both being considered? for whom, the ultimate 3D enthusiasts or those who don't even know what floating point means and just want to play games?) he was referring to at the time of his development/experimentation, and especially if he meant only within DOOM3 and not any other game.

So you are arguing that even though JC said he wanted FP and knew the NV30 path used FX to be competitive and was aware that NV30 PS 2.0 performance was terrible, his statement was a reasonable one. When you consider that he likes to push the envelope, I find that argument a strange one. And to make that statement even remotely reasonable you have to assume he is referring to Doom 3 performance exclusively.
 
Fred da Roza said:
Reverend said:
Okay, Carmack's .plan update that has what DT quoted was dated Jan 29, when the 5800 wasn't yet cancelled. In fact, that Jan 29 .plan update of his was, IIRC, not very long after the 5800 NDA expired.

As for questioning Carmack's objectivity, I don't think we know what cases (quality alone? performance alone? both being considered? for whom, the ultimate 3D enthusiasts or those who don't even know what floating point means and just want to play games?) he was referring to at the time of his development/experimentation, and especially if he meant only within DOOM3 and not any other game.

So you are arguing that even though JC said he wanted FP and knew the NV30 path used FX to be competitive and was aware that NV30 PS 2.0 performance was terrible, his statement was a reasonable one. When you consider that he likes to push the envelope, I find that argument a strange one.
It's not so strange when JC makes money from selling games he develops and he knows there are many gamers who own GFFX cards.

Never assume that someone who likes to push the envelope will never choose to hold back, in absolute terms, in the things he does as a means of making money.

He has implemented FP in DOOM3. He has "pushed the envelope" insofar as implementing what he wished for many months ago (higher precision). He has found the GFFX sucks at FP performance. He is considerate enough to GFFX owners by providing additional options.

Again, he implemented FP support in DOOM3 and has "pushed the envelope". He has told all of us what the problems with the GFFX are. Make your decision.

If you meant "pushing the envelope" to be "don't do anything extra for GFFX", then that is not being considerate to GFFX owners, of which I'm sure you're not one. If you were, you wouldn't've posted what you posted.

And to make that statement even remotely reasonable you have to assume he is referring to Doom 3 performance exclusively.
This is ridiculous... there is no attempt to make my statement remotely reasonable because it is a reasonable assumption already. He said "typical" folks... we can read that in many different ways and I expressed one possible way of reading it.
 
Reverend said:
Fred da Roza said:
Reverend said:
Okay, Carmack's .plan update that has what DT quoted was dated Jan 29, when the 5800 wasn't yet cancelled. In fact, that Jan 29 .plan update of his was, IIRC, not very long after the 5800 NDA expired.

As for questioning Carmack's objectivity, I don't think we know what cases (quality alone? performance alone? both being considered? for whom, the ultimate 3D enthusiasts or those who don't even know what floating point means and just want to play games?) he was referring to at the time of his development/experimentation, and especially if he meant only within DOOM3 and not any other game.

So you are arguing that even though JC said he wanted FP and knew the NV30 path used FX to be competitive and was aware that NV30 PS 2.0 performance was terrible, his statement was a reasonable one. When you consider that he likes to push the envelope, I find that argument a strange one.
It's not so strange when JC makes money from selling games he develops and he knows there are many gamers who own GFFX cards.

Never assume that someone who likes to push the envelope will never choose to hold back, in absolute terms, in the things he does as a means of making money.

He has implemented FP in DOOM3. He has "pushed the envelope" insofar as implementing what he wished for many months ago (higher precision). He has found the GFFX sucks at FP performance. He is considerate enough to GFFX owners by providing additional options.

Again, he implemented FP support in DOOM3 and has "pushed the envelope". He has told all of us what the problems with the GFFX are. Make your decision.

If you meant "pushing the envelope" to be "don't do anything extra for GFFX", then that is not being considerate to GFFX owners, of which I'm sure you're not one. If you were, you wouldn't've posted what you posted.

And to make that statement even remotely reasonable you have to assume he is referring to Doom 3 performance exclusively.
This is ridiculous... there is no attempt to make my statement remotely reasonable because it is a reasonable assumption already. He said "typical" folks... we can read that in many different ways and I expressed one possible way of reading it.

I am not an FX owner nor a Radeon owner. I do presently own an nVidia card.

Your first statement is irrelevant. Objectivity does not take money into consideration. If he is being influenced by money he clearly isn't being objective.

The point is the 9700 is clearly the better card even by JC's criteria (wanting FP and pushing the capabilities and performance of future games). I wouldn't be surprised if he was also aware of the 9700's better AA and AF performance.

Edit

Finally, where are these typical objective folks who believe the 5800 is on par with the 9700? Are you saying you don't believe the 9700 is clearly superior to the 5800? Even Kyle admitted it.

By typical folks did you mean those who do not know better? If so, then thank you very much, JC, for clouding the issue. Clearly nVidia didn't even think the 5800 was on par with the 9700, otherwise they wouldn't have discontinued it so quickly.
 
Fred da Roza said:
I am not an FX owner nor a Radeon owner. I do presently own an nVidia card.

Your first statement is irrelevant. Objectivity does not take money into consideration. If he is being influenced by money he clearly isn't being objective.

The point is the 9700 is clearly the better card even by JC's criteria (wanting FP and pushing the capabilities and performance of future games). I wouldn't be surprised if he was also aware of the 9700's better AA and AF performance.

Edit

Finally, where are these typical objective folks who believe the 5800 is on par with the 9700? Are you saying you don't believe the 9700 is clearly superior to the 5800? Even Kyle admitted it.

By typical folks did you mean those who do not know better? If so, then thank you very much, JC, for clouding the issue. Clearly nVidia didn't even think the 5800 was on par with the 9700, otherwise they wouldn't have discontinued it so quickly.
Well, I think a developer can be entirely objective while having to consider making money. I do not think the influence of money makes it impossible to be objective in both work and public comments. I'm no game developer, but someone like DeanoC (who is one) can probably back me up in this opinion of mine.

I think the provision of the standard ARB2 path as well as specific NV path is being both objective as well as being considerate. Carmack has said that he couldn't provide specific ATI paths because there are no available ATI specific paths. I would take that to mean he would've done it (i.e. provide specific ATI paths) if it was possible.

Personally, I kind of marvel at JC's ability to express what he has expressed in almost all his .plan updates (and interviews) -- to-the-point stuff, some details of his work, all the while never really giving the impression that he gives undeserved care to a particular IHV and/or their products. The fact that he has spoken at length about what he'd been doing wrt NVIDIA's GFFX range, instead of ATI's DX9 range, doesn't mean he isn't being objective (i.e. some people think he's paying more attention to NVIDIA/GFFX... but they forget that this is simply what needs to be done) -- he's just telling us things many, many developers don't tell the public. And that is describing the shortcomings of a particular type of product.

Folks rarely talk about things they have no problems with (ATI's DX9 offerings wrt DOOM3) but it is probably important to talk about things you have problems with (GFFX wrt DOOM3) as well as talking about what fixes you intend to do.

For being purely objective as well as pushing the envelope, I think hardware reviewers have a part to play as well. If a game ships with different rendering paths (be it IHV specific, or different pixel shader versions), then a hardware reviewer can choose to not test a video card with anything less than the "ultimate" configuration. If a game/software ships with ps_2_0 as well as ps_1_4 paths, and it also comes with different IHV extensions as well as a standard path, then a hardware reviewer would be "pushing the envelope" if he only tests with ps_2_0 and standard paths, instead of "dumbing down" to ps_1_4 and non-standard paths.

The point is that if a developer had previously stated his wishes for exciting new technologies, he shouldn't be faulted for providing options in his game later (while also implementing such exciting new technologies) in order for it to sell better -- objectivity can still be practiced while providing these options in order to sell more copies of the game... it is just common sense. "Pushing the envelope" then falls on the shoulder of hardware reviewers and how they choose to perform tests/conduct reviews.

I can understand what you mean, however (i.e. Carmack said he wanted higher precision so he should do nothing but stick by what he said and not care if his game runs lousy on certain video cards), but we do not live in a perfect world where everyone has the same video card. Nor in a world where a developer doesn't give a damn. What Valve has done wrt HL2 is another example -- Valve is "pushing the envelope" while also being considerate (and complaining about being considerate, by stating how much additional time is spent "caring for" GFFX owners) in addition to being, well, objective.

We want objectivity practiced with common sense, not objectivity to the point of being inconsiderate.

As for your edited additions, I have always maintained that I appreciate the 9700 more than any GFFX cards (in fact, I'm so impressed by it that I have become frustrated by its lack of FP32, since I want it to be my primary research card because of its speed but I have to go back and forth between it and a GFFX due to my precision research... very frustrating when you have only one machine!). Carmack has not, IMO and on the contrary, clouded the issue. He has stated the problems he has been encountering with GFFX cards.
 