Carmack on best current video cards for Doom3

SteveG

Newcomer
From http://www.gamespy.com/e32002/pc/carmack/

Carmack confirms it: ATI drivers suck.

"In order from best to worst for Doom:

I still think that overall, the GeForce 4 Ti is the best card you can buy. It has high speed and excellent driver quality.

Based on the feature set, the Radeon 8500 should be a faster card for Doom than the GF4, because it can do the seven texture accesses that I need in a single pass, while it takes two or three passes (depending on details) on the GF4. However, in practice, the GF4 consistently runs faster due to a highly efficient implementation. For programmers, the 8500 has a much nicer fragment path than the GF4, with more general features and increased precision, but the driver quality is still quite a ways from Nvidia's, so I would be a little hesitant to use it as a primary research platform. "
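As a rough illustration of the pass arithmetic in that quote, here is a tiny C sketch -- the per-pass texture limits used are assumptions for the example, not exact hardware specs:

/* Toy sketch of the pass-count arithmetic in Carmack's quote: Doom wants
 * seven texture accesses per surface, and the number of passes depends on
 * how many accesses a card can do in one pass.  The per-pass limits below
 * are illustrative assumptions, not exact hardware specs. */
#include <stdio.h>

static int passes_needed(int lookups, int per_pass)
{
    return (lookups + per_pass - 1) / per_pass;  /* round up */
}

int main(void)
{
    const int lookups = 7;  /* the seven texture accesses Carmack mentions */

    printf("7 per pass (8500-style): %d pass(es)\n", passes_needed(lookups, 7));
    printf("4 per pass (GF4-style) : %d pass(es)\n", passes_needed(lookups, 4));
    printf("3 per pass             : %d pass(es)\n", passes_needed(lookups, 3));
    return 0;
}

With seven lookups, a card that can handle them all in one pass needs exactly one; limits of four or three per pass give the "two or three passes (depending on details)" Carmack describes.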
 
Notice that this time around, JC is comparing the 8500 to the GF4, while relegating the GF3 to behind-the-pack status...

They certainly have their flaws, but the 8500 drivers have come an awful long way in the last few months...
 
Yep. I'll give this to ATI: the 8500 architecture had great potential, but as usual the driver team couldn't fully back it up. (Assuming the 8500's failure to perform to Carmack's expectations really is down to the drivers.)

It was still a big leap for ATI (much like the Parhelia is for Matrox). This competition is probably the main reason nV-based cards have dropped in price so quickly after release, and why we're seeing newer/faster hardware and features from nVidia so often. The GF3Ti series was expected, at least by me, but a move to GF4Ti wasn't, as it's basically a second refresh (speed and quality boost primarily) over the original GF3 and GF3Ti cards.

I get the feeling that without ATI, and now Matrox, we wouldn't have seen the same performance from whatever nVidia would've had out right now, and we'd probably be paying quite a bit more for it (even compared to the $300 - $400 for Ti4600). :D
 
Lol, I was wondering when this post was going to show up. I have no problems with my 8500; I have never seen any game on my hard drive give me any grief, and I have lots.
I guess Carmack has a better handle on them since he's a dev and must still be seeing something, but in the last couple of leaks the drivers have been excellent.
For every bad driver issue posted about ATI I could pull up the same on Nvidia. How about texture compression.. can Nvidia finally fix that after two chip revisions??
I like the higher-precision rendering giving a superior image. Humus showed this on the old forums, and it does show what ATI hardware can do for image quality, and that's what is important to me... the graphics. If it gets 75 fps and looks great vs. 105 and looks inferior, I'll take the 75 any day. It's always frames, frames, frames.

[image: 7.jpg]


Humus's game engine is shown here. In the first shot he is showing what the Geforce 3 and 4 support with a [0,1] range, while the Radeon 8500 supports [-8,8]. You can see the difference that higher-precision rendering makes for image quality.

Geforce 3-4
[image: range1.jpg]


Radeon 8500
[image: range8.jpg]
 
You guys never let up..

He ONLY says he would *hesitate* to use it as his primary research platform.

It still is running a close second (apparently) to the GF4.. Yet you yahoos find a way to turn it into a negative.

It still does not change the fact that they are demoing DoomIII at E3 using an ATI card.
 
DoomTrooper...


If the GF4's IQ deficit you show in those screenshots was not the fault of the developer, don't you think we would have heard about it from Carmack or somebody like that by now? I really doubt the difference would be that big, if there is any..

As far as John Carmack and Doom3 are concerned, it's obvious NV20 (from previous statements) and NV25 are doing better than R200. It's pretty clear to me..
 
Livecoma said:
DoomTrooper...


If the GF4's IQ deficit you show in those screenshots was not the fault of the developer, don't you think we would have heard about it from Carmack or somebody like that by now? I really doubt the difference would be that big, if there is any..

As far as John Carmack and Doom3 are concerned, it's obvious NV20 (from previous statements) and NV25 are doing better than R200. It's pretty clear to me..

How would it be the developer's fault??? :LOL:

Each piece of hardware has its limitations; in fact Carmack has already stated as much, since that is what this thread is about...

"The Radeon 8500 has far and away the most flexible pixel shading pipeline of current boards, allowing me to do in one pass what requires two or three passes on competitor's boards," said John Carmack, Co-owner, Technical Director, Id Softwareâ„¢ "The increased internal precision also makes possible a quality improvement in high dynamic range scenes."

From my understanding the Geforce 3 and 4 support [0,1], although I think negative numbers can be used, while the 8500 supports [-8,8].

How does the developer overcome hardware restrictions if one card renders at a higher precision, as in Carmack's quote??
 
I am pretty sure that GF3-4TI support [-1,1].

Scott, what are you actually trying to do, prove Carmack wrong?
 
Geeforcer said:
I am pretty sure that GF3-4TI support [-1,1].

Scott, what are you actually trying to do, prove Carmack wrong?

No, I agree with him :LOL:

"The Radeon 8500 has far and away the most flexible pixel shading pipeline of current boards, allowing me to do in one pass what requires two or three passes on competitor's boards," said John Carmack, Co-owner, Technical Director, Id Softwareâ„¢ "The increased internal precision also makes possible a quality improvement in high dynamic range scenes."
;)

Do ATI's drivers need work? Sure they do.. but with ATI being an OGL ARB member and with the acquisition of FireGL, ATI's drivers can only get better, and they have....
 
Another thing most gamers seem to forget is that driver quality to a gamer and to a developer can be two completely different things. To a gamer a driver set is bad if it crashes your computer after 8 hours of Q3 or your fps drop by 5%. To a developer a driver set is bad if feeding an esoteric ogl/d3d call some weird parameters suddenly makes his pet feature/algorithm puke junk all over the screen.
 
You're missing the point. Obviously there are differences between the chips, but if real games actually showed, or will show, a difference as large as those screenshots, we would have heard all kinds of stuff about it by now...

Those shots look much more like theoretical best/worst-case scenarios.


Considering how current-generation cards will run Doom3 (640*480*32 @ 30 FPS on a GF3 with features turned on?), if I had to make a choice I would take the speed-advantaged GF3/4, but I'll be running something much faster by that time, so it matters little, IMO...
 
Livecoma / Geeforcer,

If the GF4's IQ deficit you show in those screenshots was not the fault of the developer, don't you think we would have heard about it from Carmack or somebody like that by now?

Carmack says, in relation to the 8500, "with more general features and increased precision" – DM's pictures are illustrations of this point, showing where the higher precision is giving benefits.

AFAIK the first of those two images is from the DX8 SDK, so I believe it is pretty much considered the 'reference' for this particular pixel shading technique.
 
Livecoma said:
You're missing the point. Obviously there are differences between the chips, but if real games actually showed, or will show, a difference as large as those screenshots, we would have heard all kinds of stuff about it by now...

Those shots look much more like theoretical best/worst-case scenarios.


Considering how current-generation cards will run Doom3 (640*480*32 @ 30 FPS on a GF3 with features turned on?), if I had to make a choice I would take the speed-advantaged GF3/4, but I'll be running something much faster by that time, so it matters little, IMO...

It's been around for years already, Coma; the old reviews always stated the Radeon 1's IQ was superior in every way. As for what delivered that superior IQ, I'm sure rendering precision had a lot to do with it.

Geforce 2 No Texture Compression
[image: q3_1_geforce.png]


Radeon DDR No Texture Compression
[image: q3_1_radeon.png]
 
Geeforcer said:
Doomtrooper said:
No, I agree with him :LOL:

Alright then...:D

Carmack said:
I still think that overall, the GeForce 4 Ti is the best card you can buy.

I am a selective reader and refuse to read that part :D

Seriously, I'm quite happy with the comments coming from JC. Of course I will be running an R300/Matrox 512 setup when Doom 3 is released anyway; it's just good to see the 8500's advanced features at work.
 
As far as John Carmack and Doom3 are concerned, it's obvious NV20 (from previous statements) and NV25 are doing better than R200. It's pretty clear to me..

The previous comments were on older drivers, and even then the R200 was about the same as the NV20 under Doom 3. Carmack said it was up to 30% faster in some tests (texturing, I think). An ATI card was also used at E3 to show the game, presumably because of its much better precision.

For stability the drivers on the 8500 have been equal to the ones on my GeForce 256 SDR; I had no problems that couldn't be sorted by a driver change on either. For speed the ATI drivers are lacking, although they have improved noticeably since the card's release. Going from about the same speed as a GF3 to beating the GF4 Ti 4200 in some tests is pretty good, IMO.

Carmack confirms it: ATI drivers suck.

They certainly don't suck, however, and Carmack also never said that.
 
It's been around for years already, Coma; the old reviews always stated the Radeon 1's IQ was superior in every way. As for what delivered that superior IQ, I'm sure rendering precision had a lot to do with it.


Yup but R200 vs Nv20/25 does not equal GF2 vs Radeon1.

What games show an IQ difference anywhere close to what you are implying with those demo shots? I've been running an 8500 for a few months now and I have not witnessed one game yet that looks different from a GF3.
 
Livecoma said:
It's been around for years already, Coma; the old reviews always stated the Radeon 1's IQ was superior in every way. As for what delivered that superior IQ, I'm sure rendering precision had a lot to do with it.


Yup but R200 vs Nv20/25 does not equal GF2 vs Radeon1.

What games show an IQ difference anywhere close to what you are implying with those demo shots? I've been running an 8500 for a few months now and I have not witnessed one game yet that looks different from a GF3.

The shots from Humus's engine showing the difference require the engine to be coded that way. From what I read on his page, you would have to code for that precision in the game and also have hardware that supports it. Does any game today support that level of precision? Probably not, but it COULD; with Doom 3 it looks like it will.

Edit: The diffuse bumpmapping shot from the DX SDK must be adaptive, as there is no coding difference between the two shots. I guess DX can possibly detect the level of precision and adjust the output??
 
You don't need to code it differently. It's just that the GF3 clamps the results when some part of the calculation goes over 1.0, even intermediate results. Due to the higher internal range of up to 8.0, the Radeon 8500 can keep higher numbers in the registers when combining, which allows for better lighting dynamics, as shown in both examples.
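To make that concrete, here is a minimal sketch in plain C -- toy arithmetic, not real shader code, and the clamp ranges are taken from the numbers thrown around in this thread ([0,1] vs. [-8,8]):

/* Minimal sketch of the clamping difference described above.  Assumes
 * intermediate results clamp to [0,1] on the GF3/4-style path (the thread
 * also mentions [-1,1]) and to [-8,8] on the 8500-style path. */
#include <stdio.h>

static float clamp_range(float x, float lo, float hi)
{
    return x < lo ? lo : (x > hi ? hi : x);
}

/* Overbright lighting term: boost a diffuse value by 4x, then attenuate.
 * The math is the same on both paths; only the intermediate clamp range
 * differs. */
static float shade(float diffuse, float atten, float lo, float hi)
{
    float boosted = clamp_range(diffuse * 4.0f, lo, hi); /* intermediate */
    return clamp_range(boosted * atten, 0.0f, 1.0f);     /* final output */
}

int main(void)
{
    const float diffuse = 0.9f, atten = 0.25f;

    /* [0,1] intermediates: the 4x boost is clamped away (prints 0.25). */
    printf("clamped path : %f\n", shade(diffuse, atten, 0.0f, 1.0f));

    /* [-8,8] intermediates: the boost survives (prints 0.90). */
    printf("extended path: %f\n", shade(diffuse, atten, -8.0f, 8.0f));
    return 0;
}

The wider intermediate range is what lets overbright terms survive into the final output, which is the lighting-dynamics difference the screenshots are showing.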
 