Doom III thread revisited

Gollum

Dave closed the 12-page-long D3 thread about 5 seconds before I could post my reply, and while I agree with the closing, I'd still like to not let my 15 minutes of typing go to waste. ;)

What I was about to reply:

Pity, this thread [R300 the fastest for DoomIII, John Carmack Speaks Again] started out interesting; now it's mostly Nvidia vs. ATI. Why does this have to keep happening? This is Beyond 3D, not some fansite!
If you think ATI drivers suck, then don't buy their cards; if you think all Nvidia products are inferior, then good for you. Everybody has his reasons to prefer one product over another (and not "liking" a company is sadly among them, although IMHO it shouldn't be important).

What's worse is that people constantly dodge each other's arguments. Here's how this thread looks to someone not participating in the "discussion":

- first several pages of interesting Doom3 and 3D talk
- arguments about which card JC is developing Doom3 for begin - IMHO he doesn't develop it "for" any card, but his development platform is apparently a GF4 Ti, which he has said countless times. That doesn't make an R8500 an inferior Doom3 card in any significant way, and none of us will be playing Doom3 on either of these cards when it comes out anyway!
- somebody suggests the reason for this [GF4 Ti as development platform] might be that Nvidia has put out the superior [unwise choice of word, but not mine] products for each timeframe for years now
- somebody else claims "no, the R8500 was superior to GF3 offerings at that time"
- reviews and benchmarks are brought forth that demonstrate rather clearly that at the time of release, and in the six months that followed, the R8500 had both more stability problems/driver issues and sometimes (close to 50/50, though) weaker performance than a GF3 Ti500
- people complain that newer ATI drivers are better and that the proof shown is thus worthless, which is definitely true but has nothing to do with the original argument [at release and the following few months]!
- flaming as to how GF3/4 suck and R8500 have bad drivers commences

Why can't both sides just listen to each other more carefully and be less defensive? Nobody is attacking your belief system by saying he *thinks* a GF might be better than an R or the other way around. Settle down for once and be happy the future looks so interesting! To sum this and many other threads up:

-Nvidia Supporters:
ATI drivers have improved to the point where, at the time of this writing, it's clear an R8500 is a very good card to own and can compete well, even with the much newer GF4 Ti line, in compatibility and speed...

-ATI Supporters:
Nvidia still has the more mature drivers, and despite what you think, the GF3/4 series are powerful and well-featured cards that in reality (= games) are not in any significant way inferior just because they lack a bit in certain specs, and no matter what some people claim, the IQ isn't that terrible either...

Currently you can't go wrong purchasing either an R8500 or a GF4 Ti. Then again, this is all just my opinion, so feel free to disagree...
 
Excellent post. Very true. Can't go wrong with either card. I've always liked this place as a break from the flame wars at rage3d. Everyone here is more knowledgeable and less likely to get involved in "my card is better than your card" "my dad can beat up your dad" discussions. Hope it stays that way.
 
Gollum, your post is excellent, but what you wrote will never happen. I was born in the '60s and have been gaming and browsing forums on the net since my Amiga 500 days (artillery on the old Amiga 500 with a 9600 baud modem... upgraded from a 1200 baud modem... woohoo)... The truth is nobody ever agrees with everyone else all the time, and different opinions are what makes the world go around.
I try to be open-minded and have no problem saying I was wrong; with some people, on the other hand, it's like pulling teeth :)
It's bold statements that boil my blood, especially when they are made without experience with the product - statements like the ones being posted in the closed thread....
 
It's one thing to have an opinion...Everybody is entitled to one. However, there are certain people who write their opinions as "I'm telling you guys...you don't know what you're talking about" or "Listen to me...I know everything there is to know about next-generation products....blah blah blah."

To a man, I don't think there's a single reasonable person who visits this forum who will disagree with the notion that we simply _don't_ know what will happen come this Fall.

Nobody knows whether the R300 will launch before the NV30... or how Parhelia will end up after all is said and done... or whether the NV30 will really be a radically different approach... or whether DX9 will end up playing a pivotal role in when these architectures are launched... or how NV30 vs. R300 will end up performing in Doom III... etc.

As I see it, the problem usually begins when individuals try to turn a bunch of statements into fact... and then extrapolate a whole bunch of other statements and turn those into fact as well.

I love discussing the future, as I'm sure most others do as well...But I'm also realistic enough to know that anything is possible when all of these things happen this year.
 
Come on, while such things are not said here.... in some dark corners of the internet some people are still convinced there will be a 'second' coming.

Let that be the end of all 3dfx discussion in this topic.

-Colourless
 
Colourless said:
Come on, while such things are not said here.... in some dark corners of the internet some people are still convinced there will be a 'second' coming.

I hope so! Maybe then I'll be able to sell my 3dfx shares for a profit! (I bought 200 shares at $2.25 a week before they went under.)
 
Gollum, I agree. I am sorry I even entered the thread; I was just sick of people ragging on ATI drivers for the state they USED to be in.
No card is perfect.
No drivers are perfect - all have bugs, issues, and problems.
The number of issues/bugs in ATI's drivers for the 8500 has decreased since the release of the 8500.
nVidia has fixed some bugs too, but as their number of issues was lower, there has been less stability improvement in their cards. Hence the driver quality is approaching equality. It may not be there yet, but avid nVidia fans' constant parade of old results is irrelevant to a board that is supposed to be about current events. Start a "Six Months Ago" thread if you want to talk about the past - who cares, we all know the score. What is interesting is the here and now.
ATI fans need to admit that there USED to be problems with the 8500 drivers.
nVidia fans need to admit that NOW, it's not really an issue.
But there is enough bias in enough people here that when someone makes a comment that violates one of the two above "fan" statements, it sets off a nasty chain reaction.
 
I have to thank you for closing that thread. This forum is SO different from every other one that I've seen, especially in the calibre of 3D knowledge. To see people trash-talking two of the finest video cards ever to hit the streets is pathetic. The GF3 was a HUGE leap when it came out, and the 8500 was very cool too, especially for programmers who wanted some real pixel programmability rather than the mess of PS 1.0-1.3 (which was still a huge leap from the DX7 days, no disrespect to GF3/4).

Either way, I wouldn't trade one for the other (though my money would buy the 8500 in the first place). If you got a GF3 early on, you got a card that will still hold up against the most graphically intensive games two years later. If you got an 8500, you got some nifty features, plus you saved a good deal of money (here in TO you've got to buy ATI if you have any concern for your wallet, unless you buy online).

Both cards took around 6 months to get the drivers up to snuff (OK, ATI took a bit longer). Both cards did a lot for these companies. NVidia finally made a balanced architecture (the GF2 series had some horribly inefficient silicon), and ATI finally had a card that performed with the best of NVidia (whereas the Radeon had held a performance lead for only a month before the Detonator 3s came out).

Overall, we've finally got a game using the potential of these chips, and even that is only some of the potential. We need something like Doom3 to show people what can really be done with the newest generation of video cards, and to push these lazy-ass game developers to use their features.
 
Doomtrooper said:
Gollum, your post is excellent, but what you wrote will never happen. I was born in the '60s and have been gaming and browsing forums on the net since my Amiga 500 days (artillery on the old Amiga 500 with a 9600 baud modem... upgraded from a 1200 baud modem... woohoo)... The truth is nobody ever agrees with everyone else all the time, and different opinions are what makes the world go around.
I try to be open-minded and have no problem saying I was wrong; with some people, on the other hand, it's like pulling teeth
Doomtrooper, I know my post won't prevent any flame wars from happening; I've been in too many forums in my 8 years online to expect that. I just wanted to express my dislike of them getting out of hand like at the end of that thread, especially at Beyond 3D. I don't think different opinions are a bad thing, quite the contrary - I LOVE a good discussion between people with different opinions, it's what makes things interesting. It's just when people start to misinterpret others' statements and take them out of context - sometimes on purpose, thinking they can't argue against something otherwise - that I call it unnecessary, and a thread ceases to be a discussion IMO. Beyond 3D has always managed to stay rather calm and is certainly a less-biased place than most; let's try to keep it that way and not let every thread turn into an ATI vs. Nvidia Misinterpretation Deathmatch... ;)

Althornin said:
Gollum, I agree. I am sorry I even entered the thread; I was just sick of people ragging on ATI drivers for the state they USED to be in.
Quite understandable; I used to say that all the time myself. In fact, I used to despise ATI for their past cards and drivers, but ever since the Radeon they've been doing increasingly well, so I started following them more and could see for myself how much has changed. It's just that in that thread the argument started out being about the past, so it was valid to talk about it IMO.
As for ATI's current driver state, for the end user/gamer it has improved dramatically and is almost on par with the Detonators. Several developers still say that they're not yet ready to become their choice of primary development platform (JC being the most influential among them), so I think there must still be some minor issues left, while most show-stoppers must be gone. ATI has learned a lot from the past years and is only going to get better and better, I think.

Talking about compatibility, my WinXP won't let me install anything but the default VGA drivers for my brand new GF4 Ti4200 (98SE works; it might be an issue with the graphics card/mainboard combination in some way too, although a fresh install of XP supposedly solves the problem, so I blame XP the most, hmpf). What do I care whether the drivers are more mature if they can't even be installed... :LOL:
 
JC update

Some new John Carmack quotes to take apart, ripped from nV News:

John Carmack said:
* Misrepresented


This batch of comments from me have let people draw conclusions that leave me scratching my head wondering how they managed to get from what I said to what they heard.

Other people have outlined the issues in detail in comments already, but the crux is that, even with driver quality removed from the discussion (not counting conformance issues, running at fill limited resolutions), GF4 hardware is still faster than 8500 hardware on basically everything I tested. The 8500 SHOULD have been faster on paper, but isn't in real life.

The hardware we used at E3 was not an 8500, and while the drivers were still a bit raw, the performance was very good indeed.

Take with a grain of salt any comment from me that has been paraphrased, but if it is an actual in-context quote from email, I try very hard to be precise in my statements. Read carefully.

* High End Hardware Reasoning

We know for sure that we will be excluding some of the game buying public with fairly stiff hardware requirements, but we still think it is the right thing to do.

The requirement for GF1/Radeon 7500 as an absolute minimum is fundamental to the way the technology works, and was non-negotiable for the advances that I wanted to make. At the very beginning of development, I worked a bit on elaborate schemes to try and get some level of compatibility with Voodoo / TNT / Rage128 class hardware, but it would have looked like crap, and I decided it wasn't worth it.

The comfortable minimum performance level on this class of hardware is determined by what the artists and level designers produce. It would be possible to carefully craft a DOOM engine game that ran at good speed on an original SDR GF1, but it would cramp the artistic freedom of the designers a lot as they worried more about performance than aesthetics and gameplay.

Our "full impact" platform from the beginning has been targeted at GF3/Xbox level hardware. Slower hardware can disable features, and faster hardware gets higher frame rates and rendering quality. Even at this target, designers need to be more cognizant of performance than they were with Q3, and we expect some licensee to take an even more aggressive performance stance for games shipping in following years.

Games using the new engine will be on shelves FIVE YEARS (or more) after the initial design decisions were made. We had a couple licensees make two generations of products with the Q3 engine, and we expect that to hold true for DOOM as well. The hardware-only decision for Q3 was controversial at the time, but I feel it clearly turned out to be correct. I am confident the target for DOOM will also be seen as correct once there is a little perspective on it.

Unrelated linux note: yes, there will almost certainly be a linux binary for the game. It will probably only work on the nvidia drivers initially, but I will assist any project attempting to get the necessary driver support on other cards.
 
John Carmack said:
The hardware we used at E3 was not an 8500, and while the drivers were still a bit raw, the performance was very good indeed.

I had a feeling the drivers were still green; I'm sure the hardware was not final either... be it clock speed or chip stepping.
 
The Inquirer is reporting that the RV250 will be called a Radeon 9xxx. So will the R300 assume a new name, or are we heading into quintuple-digits? :)
 
When the Inquirer reports something, it means it is not going to happen. Besides, ATi stated that the first digit in the model number corresponds to the supported version of DirectX, so unless the RV250 supports DX9, they are not going to call it 9xxx.
 