digitalwanderer said:
Reverend said:
If you'll read what JC has said thus far, you'll know what video card he recommends without saying it.
Dumb question: why doesn't he just say it? I'm really glad you can read between the lines to know what JC means, but I'm used to dealing with people who have a hard time understanding what drivers even are....all they know about Doom3 and performance is that stupidly unbalanced benchmark, and from that they've concluded that the FX is a better card for D3.
Why doesn't he "just say it"? Because there are many GFFX owners/gamers out there and I doubt he wants to get into any "trouble" by recommending a non-GFFX card -- I have to imagine he'll be thinking "If I say buy a DX9 ATI card now, gamers are going to jump on me for not saying so earlier".
JC continues to refine DOOM3... but when he started out he had no idea the GFFX would have the limitations it has vis-a-vis ATI's DX9 offerings. It would be impossible for him to guesstimate future hardware's performance as he goes along in his development work. At any time during his development work, he will "tell you like it is"... and his recent .plan updates have been exactly that, telling folks what problems he's encountering in a matter-of-fact way... instead of the "just say it" way that video card enthusiasts like, or the way Gabe Newell says it of late. JC's a bit more into talking details than Gabe's "just say it" style.
I do agree that the old NVIDIA-arranged DOOM3 benchmarking is a little odd, not just from the POV that JC agreed to it but that id Software as a whole agreed to it as well, although many have speculated on the behind-the-scenes-get-back-at-ATI reason why this is so. At least I thought JC was right to reject the demo NVIDIA recorded themselves (that reviewers were given by NVIDIA). Surely that says something about JC.
digitalwanderer said:
Mebbe it is, mebbe it ain't; but JC is most definitely helping out nVidia right now with his silence.
That's the way you (and presumably others) read it... but, again, his .plan updates have been filled with important details about which card is giving him more work to do as well as where GFFX cards are falling behind performance-wise. That's not staying silent in my books.
digitalwanderer said:
Reverend said:
As well as not knowing precisely what JC's involvement with id Software is about.
Hmmmmm....I do believe he owns it and is the driving factor behind it, or something silly like that.
C'mon Anthony, what are you trying to say? That JC doesn't have the power of decision making at id? Puh-lease!
I'm trying to say John Carmack is at id Software, a game development house. He and his fellow id Software colleagues make games in the hope that their games sell well. In order to do that, he, his id Software colleagues, as well as Activision, have to ensure that many gamers out there can play their games well, in terms of performance as well as available features. As for the decision making, what decision making would that be? That he has to make games according to the wishes expressed in his .plan updates made many, many months ago? That's inconsiderate thinking IMO. Surely it would be inconsiderate of John Carmack to think "This is the way my engine works, it must not offer any IHV-specific codepaths because I don't want it this way, so all you potential engine licensees have to think the same way as I do".
I have been a great proponent of FP32 versus FP24 in some of my postings here.
That doesn't mean I will make a game that will not run on FP24-only hardware, a game that I envision being released in six months' time.
Tim Sweeney has said that his next-gen engine will not run on anything less than FP32. Some have ridiculed him for this, others have said it's a good thing. If, by the time the first game featuring this next-gen engine of his ships, you personally are still "stuck" with an R3x0, what will you be thinking? Come on, be honest... will it be "Shit, Tim Sweeney sucks!"... or will it be "Oh well, Tim Sweeney did say that my card won't be able to run this game... I have absolutely nothing to complain about."... ?
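Slight tangent, but since FP24-vs-FP32 keeps coming up: here's a quick back-of-the-envelope sketch of what the difference actually amounts to numerically. It assumes the commonly cited bit layouts (FP24 on R3x0 with a 16-bit mantissa, IEEE FP32 on NV30 with a 23-bit mantissa); it's purely illustrative and has nothing to do with anyone's actual shader code.

```c
#include <stdio.h>
#include <math.h>

/* Back-of-the-envelope precision comparison for the two fragment
 * pipeline formats discussed above (assumed bit layouts: FP24 as on
 * R3x0 = 1 sign / 7 exponent / 16 mantissa bits; IEEE FP32 as on
 * NV30 = 1 sign / 8 exponent / 23 mantissa bits). */
int main(void)
{
    double eps_fp24 = pow(2.0, -16.0); /* smallest relative step, 16-bit mantissa */
    double eps_fp32 = pow(2.0, -23.0); /* smallest relative step, 23-bit mantissa */

    printf("FP24 relative precision: ~%g\n", eps_fp24);           /* ~1.5e-5 */
    printf("FP32 relative precision: ~%g\n", eps_fp32);           /* ~1.2e-7 */
    printf("FP32 is %.0fx finer-grained\n", eps_fp24 / eps_fp32); /* 128x */
    return 0;
}
```

Roughly a 128x gap in relative precision; whether that's ever visible on screen depends on how long your dependent calculation chains get.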
You want some cut-and-dried comments and decisions from developers. I think it is best that developers provide as many options as possible regardless of whatever "musings" they may have made months ago about "pushing the envelope". It is no different whether I am a hardware reviewer with many of the latest gee-whiz video cards lying around to play with, or a guy who has to decide what video card to buy. Providing many options is a Good Thing. Informing the public about how 3D technology should evolve is also a Good Thing. Don't make the mistake of thinking the two should be absolutely separated.
It amazes me just how simple-minded the public can be with regards to the many considerations involved in the game and 3D industry.
Doomtrooper said:
... I'm not saying there is Bias here, although evidence has shown in the past that there definitely was a preference that Carmack had for primary development IHV selection.
He also refers only to Nvidia cards when talking about future engines. I have the original Doom 3 leak, and there was already an NV30 custom path in the .ini file (so Nvidia wasn't going through the ARB path even in the initial engine design), this being six months prior to any FX cards being available.
I think a developer can choose what his primary work card is. And if he has a preference (demonstrably so, by stating it time and again in public postings), then it could be down to a number of valid reasons, which that developer should hopefully also express in public postings. I don't think we should fault John Carmack for saying his experience has shown that NVIDIA has had better/best drivers. He said it; not many other developers have argued against it. Experience and association history help. Carmack has had more good experiences developing with NVIDIA cards/drivers than with most others. I don't think we should fault Carmack for having a preference grounded in that experience.
Doomtrooper said:
John Carmack said:
I am using an NV30 in my primary work system now, largely so I can test more of the rendering paths on one system, and because I feel Nvidia still has somewhat better driver quality (ATI continues to improve, though). For a typical consumer, I don't think the decision is at all clear cut at the moment.
So what I read from that is he's stating there is no clear-cut winner between a 9700 and 5800, when we already know the 5800 was cancelled by Nvidia (they did his thinking for him)... I could post some threads from other forums where DOOM 3 benchmarks from [H] and Anand are used as the Bible.
I'm sorry but I have forgotten the dates of Carmack's posting of that, and the 5800's cancellation. Which came first?
Doomtrooper said:
I've always stated that it is in the developer's best interest to ensure the title runs well on all hardware, but benchmarking (which is the part I am referring to) with the Doom 3 engine doesn't make logical sense. Workload must be the same for all cards to determine which card has the superior architecture.
I have no problem with developers ensuring it runs well on all hardware, in fact I totally AGREE...my problem is 'benchmark modes'.
As I said before, why does the ARB exist in OpenGL? I always thought the idea was to streamline the work for the developer... one code path for all cards... Nvidia is part of that process along with all the other IHVs.
Your concern is a valid one, and it may go beyond that too. I'd hate to be an IHV that does not have as good a relationship with Carmack as NVIDIA apparently does... I mean, why have NVIDIA-specific codepaths and none for, say, PowerVR (provided PVR has the specific extensions)?
As for the "benchmarking modes", it comes down to a game ensuring all the different codepaths can be specified when benchmarking, and that reviewers benchmark using all the relevant codepaths as well as informing the public what the (image output) differences are.