R300 the fastest for DoomIII, John Carmack Speaks Again

What I can tell you is that John is a long-time supporter of NVIDIA and Doom III is being designed with NVIDIA graphics in mind.

I don't see it like that - I would say it's designed with what John wants in mind, and with an eye on all hardware. It's safe to say that John has liked NVIDIA cards because of the quality of their drivers, and probably because they are first to market with the fastest cards, but I wouldn't say that equates to him designing the game exclusively with NVIDIA's feature set in mind. He has, after all, been fairly vocal about the GF3/4's limited fragment abilities (and compared them unfavourably to ATi's in the past), and if he were designing Doom III to NVIDIA's hardware rather than to his own needs / specifications, I doubt it would have to operate in 2 passes on the GF3/4.

While it's almost certainly true he's designed the game with NVIDIA graphics in mind, I doubt that's to the exclusion of other graphics hardware.
 
Chalnoth said:
Application detection for benchmarks is cheating. Benchmarks are meant to gauge relative performance across more programs than those benchmarked. Application detection skews the results, meaning that the Radeon 8500 will almost certainly fare worse against other cards in general than it does in the major benchmark programs.

So, if a company sees that an application is doing something suboptimally, they shouldn't take action to make it work better? Sorry, that's not going to happen - otherwise the people who didn't do these sorts of optimizations would be at a disadvantage versus those who did.

Secondly, you seem to think ATI is the only company doing application detection... :LOL:
 
Goddammit! How come we're having this utter crap 8500 vs GF3/4 argument over and over and over again? :cry:

A few of you guys are really dense in my book (none mentioned, none forgotten).
 
Pete said:
but I must reply when you overlook facts. The fact is, with the original driver, the 8500 showed a marked difference in benchmark scores between a quake and quack executable. So there was an "issue" with the drivers, and, cheat or mistake, it did affect scores.

I'm not sure what to say--I fault you (tho that was mostly directed toward Hellbinder) for overlooking truths, and you fault me for actually remembering them. I guess I can't win. I admire ATi's driver improvements, but I won't overlook problems, especially those that look like deliberate attempts to mislead consumers...

...Unless ATi's later drivers showed the same speed difference between quake/quack, but with the exact same image quality for both (specifically, no messed up MIP-map levels). If that happened, and I just missed that part of [H]'s follow-up, then I apologize for my ignorance.

I'll try one last time:
- ATI had an optimization for Quake 3
- There was a bug in the optimization that affected image quality
- Changing the name of the Quake 3 executable disabled the optimization
- ATI released newer drivers with the optimization fixed
- Quake 3 scores didn't appreciably change
- No more image quality problem in Quake 3 and changing the name of the executable has no effect

I figured this out months before I joined ATI. It's not that hard.

Tell me again what I am overlooking?
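
For anyone unclear on what "application detection" means at a mechanical level, here's a rough sketch of executable-name detection - purely illustrative, not ATI's actual driver code; the routine and the Quake 3 target are just for the example. The point is that the check is keyed on the process name, which is exactly why renaming quake3.exe to quack3.exe disabled the optimization:

Code:
#include <windows.h>
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch only -- NOT ATI's real driver code.
   A driver DLL can ask Windows for the name of the process that loaded
   it and switch on an app-specific code path if the name matches.
   Rename the executable and the match fails, so the special path
   (and any bug in it) is skipped. */
static int process_is(const char *exe_name)
{
    char path[MAX_PATH];
    char *p;

    if (GetModuleFileNameA(NULL, path, sizeof(path)) == 0)
        return 0;

    for (p = path; *p; ++p)                 /* case-insensitive compare */
        *p = (char)tolower((unsigned char)*p);

    return strstr(path, exe_name) != NULL;
}

int main(void)
{
    if (process_is("quake3.exe"))
        printf("match: enable the Quake 3 specific texture path\n");
    else
        printf("no match: use the default path\n");
    return 0;
}

Whether flipping a switch like that is a legitimate optimization or a benchmark cheat is the argument above; the mechanism itself really is that simple.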
 
...and probably because they are first to market with the fastest cards
Personally, I don't think speed has been his primary concern, not since he started on the Q3 engine. More like feature set. It helps that the original GeForce was first with T&L and was the speediest card at the time.

Let's hope he replies to my latest questionnaire, especially regarding IHV-specific OGL calls.
 
More like feature set. It helps that the original GeForce was first with T&L and was the speediest card at the time.

Well, I was going to include that, but it's not strictly true that NVIDIA are always first to market with the highest featureset - i.e. the Radeon's featureset eclipsed the GF2's for the time until the GF3 became available, and the Radeon 8500's features eclipse both GF3 & GF4 in areas, specifically its fragment abilities, which JC has highlighted as being more flexible / capable and slightly closer to what he's been calling for.

Another issue with featureset is the OpenGL lag factor - while I don't doubt JC has played with all the various vendor-specific extensions, he has demonstrated a reluctance in the past to support them within the actual game engine unless numerous vendors support them (i.e. S3TC). Given that OpenGL has currently fallen behind hardware development, I'm not sure how much featureset can play into his engine design in this situation. I think OpenGL2 development is very important in this respect, because it should move the API beyond the capabilities of the hardware for a little while, meaning there is a larger blanket coverage of features that are supported, and the compilers should be able to figure out the capabilities of the hardware. It's for this reason I'm most keen to hear his thoughts on OpenGL2 development.
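
To illustrate what "supporting an extension" boils down to on the engine side - a rough sketch, with the upload routines' behaviour made up for the example - the engine queries the extension string at startup and only takes the compressed-texture path if it's advertised. That's cheap for a multi-vendor extension like S3TC, but it multiplies into one branch (plus fallback) per IHV for vendor-specific ones:

Code:
#include <GL/gl.h>
#include <stdio.h>
#include <string.h>

/* Illustrative sketch of the usual OpenGL extension check (assumes a
   current GL context). GL_EXT_texture_compression_s3tc is the sort of
   multi-vendor extension an engine can reasonably rely on; a
   vendor-specific extension would need a branch like this per IHV,
   plus a fallback path, which is the maintenance cost being discussed. */
static int has_extension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext && strstr(ext, name) != NULL;   /* naive substring match */
}

void choose_texture_upload_path(void)
{
    if (has_extension("GL_EXT_texture_compression_s3tc"))
        printf("uploading S3TC compressed textures\n");
    else
        printf("falling back to uncompressed textures\n");
}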
 
Chalnoth said:
Doomtrooper said:
Alpha Textures are not going away anytime soon.

No, they're not. The problem with multisampling goes away once an alpha blend is used. As I said before, Morrowind uses alpha blends, and UT2k3 will as well. Those games are very important for me. I've also modified UT's OpenGL renderer to support alpha blends (It was very, very easy...).

But as I've pointed out before, in the general case it's not possible to just rip alpha testing out and replace it with alpha blending. In a UT renderer it's easy, as the engine takes care of depth sorting for you, but in the general case it's much harder to do. Alpha testing discards fragments, which means depth buffer values will be correct after drawing; alpha blending does not, thus putting "incorrect" depth values in the depth buffer so that you can't see through the surface if stuff is drawn behind it later on.

btw, Doom, it's not alpha textures that are the problem, it's alpha testing. Textures with alpha channels are used for both techniques.
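
To make the difference concrete, here's roughly what the two setups look like in classic OpenGL - just a sketch, with the actual draw calls left as placeholder comments:

Code:
#include <GL/gl.h>

/* Alpha TEST: fragments below the threshold are discarded outright, so
   the "holes" write neither color nor depth. Depth stays correct no
   matter what order things are drawn in -- but the edges are hard, and
   multisampling doesn't smooth them. */
void draw_foliage_alpha_test(void)
{
    glEnable(GL_ALPHA_TEST);
    glAlphaFunc(GL_GREATER, 0.5f);
    /* draw_foliage();   <- placeholder for the actual geometry */
    glDisable(GL_ALPHA_TEST);
}

/* Alpha BLEND: every fragment is blended, and by default it also writes
   depth, so anything drawn later behind the "transparent" parts gets
   rejected by the depth test. That's why blending wants back-to-front
   sorting -- which UT's engine does for you, but a general engine may not.
   The common workaround of disabling depth writes has its own cost:
   the surface then can't occlude anything drawn after it. */
void draw_foliage_alpha_blend(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glDepthMask(GL_FALSE);
    /* draw_foliage_sorted_back_to_front();   <- placeholder */
    glDepthMask(GL_TRUE);
    glDisable(GL_BLEND);
}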
 
ben6 said:
Hehe, John, not only are we expecting Nvidia to talk about NV30 in more than general terms in the middle of a product cycle, but we expect them to comment on the refresh of the NV30 or the part after that? Just a silly thought.

Heh. Nah, I was being facetious. With Doom 3 not shipping until sometime next year, Brian really should've just said next-gen NVIDIA hardware rather than a specific product.

Hey, us English majors like to nit-pick. 8)
 
Application detection for benchmarks is cheating. Benchmarks are meant to gauge relative performance across more programs than those benchmarked. Application detection skews the results, meaning that the Radeon 8500 will almost certainly fare worse against other cards in general than it does in the major benchmark programs.


Notice that in Q3 the 8500 pretty much loses to the GF3 Ti500 by a decent margin in most reviews, yet in all the other Q3-based games the 8500 is dead on, if not slightly ahead of, the GF3 Ti500. Kind of an interesting tidbit to consider. Also, if your point were true, we would only see a high spike in the scores for the game that is optimized. We are not seeing that, as the 8500's scores are consistent throughout a lot of other games (consistent meaning it's scoring where we think it should given its specs and the scores of other cards). That's just not the case here at all, as the 8500 has not been generally worse than other cards in the other benchmark scores.
 
jb said:
That's just not the case here at all, as the 8500 has not been generally worse than other cards in the other benchmark scores.

Here's an interesting read:
http://www.simhq.com/simhq3/hardware/reviews/atiradeon8500/index5.shtml

Notice the failure to run Janes F/A-18 or Ghost Recon properly...and don't forget, this was in March (i.e. long after the release of the 8500 and 3Ti's).

It's also interesting to note that after throwing out the games that didn't run correctly, the GeForce3 (not Ti) still won half the benches...

Oh, and if you're going to look at just specs, the 8500 should win far more than it does...after all, my GeForce4 Ti 4200 beats out an 8500 in most situations, even when it's clocked a fair bit lower (250/512).
 
The GF3 really shone performance-wise when BOTH FSAA and aniso were enabled.

Taken separately, it was less impressive.

So, ignoring image quality:

Regular tests: both cards were roughly a tie
Aniso only: ATI by a long shot
FSAA only: Nvidia by a long shot
Aniso + FSAA: Nvidia with a good edge

Of course, image quality was always and will always be a subject of debate. The combination of aniso + FSAA (the highest quality setting at a playable rate) is probably where one would want to look.

R8500 high quality settings (everything maxed) seemed to have better texture clarity, more vibrant colors, as well as having no issues with alpha tests. The downside was you could see mipmap issues due to their weird LOD.

Nvidia's high quality setting otoh had much better texture antialiasing (motion is much more agreeable with Nvidia's offering) and generally slightly higher resolution (for an equivalent framerate with everything maxed). Jaggies were about equal between both cards from what I saw (slight edge maybe to the R8500), alpha textures aside. The downside of Nvidia's IQ was noticeable blurring in some distant textures (lack of 128-tap aniso), as well as annoying issues with texture compression in some games.

That's my opinion, at least.
 
Fred said:
R8500 high quality settings (everything maxed) seemed to have better texture clarity, more vibrant colors, as well as having no issues with alpha tests. The downside was you could see mipmap issues due to their weird LOD.

Nvidia's high quality setting otoh had much better texture antialiasing

Higher texture LOD always makes the image look like it has "more vibrant colors." I think it's sad that ATI will call a setting where the texture LOD is set so high as to cause significant texture aliasing "high quality." The only reason I can think of for this is that texture aliasing is usually only apparent when the scene is in motion...and most review sites look at/post screenshots for comparison...

Btw, for those of you with GeForces, you can also adjust the texture LOD through a tweak utility...but I've always thought that it looks quite bad, even if the aliasing is only increased slightly.
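
For the curious, what those tweak utilities are ultimately adjusting is the mipmap LOD bias. In OpenGL an application can do the same thing itself via the EXT_texture_lod_bias extension - a rough sketch, assuming that extension is present and a GL context is current:

Code:
#include <GL/gl.h>

/* Tokens from the EXT_texture_lod_bias extension (normally in glext.h). */
#ifndef GL_TEXTURE_FILTER_CONTROL_EXT
#define GL_TEXTURE_FILTER_CONTROL_EXT 0x8500
#define GL_TEXTURE_LOD_BIAS_EXT       0x8501
#endif

/* A negative bias pushes mipmap selection toward the sharper levels,
   which is what makes textures look "crisper"/"more vibrant" in stills --
   and also what causes texture aliasing once the scene is in motion.
   0.0 is the default; positive values blur. Illustrative sketch only. */
void set_texture_lod_bias(float bias)
{
    glTexEnvf(GL_TEXTURE_FILTER_CONTROL_EXT, GL_TEXTURE_LOD_BIAS_EXT, bias);
}

/* e.g. set_texture_lod_bias(-0.5f) sharpens slightly. */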
 
Here's an interesting read:
http://www.simhq.com/simhq3/hardware/reviews/atiradeon8500/index5.shtml

Notice the failure to run Janes F/A-18 or Ghost Recon properly...and don't forget, this was in March (i.e. long after the release of the 8500 and 3Ti's).

An old review on old drivers – not really that interesting, seeing as most users will be running newer drivers and such issues are unlikely to be present.

Perhaps you might like to look at some reviews that use current drivers; it might update your knowledge a little.

http://www.sharkyextreme.com/hardware/videocards/article.php/3211_1143561__5

Higher texture LOD always makes the image look like it has "more vibrant colors." I think it's sad that ATI will call a setting where the texture LOD is set so high as to cause significant texture aliasing "high quality."

Higher internal precision can also make colours look more vibrant. I think it’s a bit ‘sad’ that NVIDIA has chosen not to implement something like this to attain high quality.

fanboys - gotta love em :rolleyes:
 
Chalnoth said:
jb said:
That's just not the case here at all, as the 8500 has not been generally worse than other cards in the other benchmark scores.

Here's an interesting read:
http://www.simhq.com/simhq3/hardware/reviews/atiradeon8500/index5.shtml

Notice the failure to run Janes F/A-18 or Ghost Recon properly...and don't forget, this was in March (i.e. long after the release of the 8500 and 3Ti's).

It's also interesting to note that after throwing out the games that didn't run correctly, the GeForce3 (not Ti) still won half the benches...

Oh, and if you're going to look at just specs, the 8500 should win far more than it does...after all, my GeForce4 Ti 4200 beats out an 8500 in most situations, even when it's clocked a fair bit lower (250/512).

Yes, the drivers should have been better than that at the time; however, current drivers are very, very nice.

pulled from an nvnews.net news post:
http://www.gamepc.com/labs/view_content.asp?id=vt4200&page=7

As you can see from this, the 8500 actually BEATS a Ti4600 at 1600x1200.

ATI's drivers are far, far better than everyone gives them credit for. Care to go on about DXTC and nVidia? I know you can hack it to make it work, but does it work straight with current drivers???

Not flaming, not dropping to fanboyism (although some might think otherwise - I've owned too many different cards). I can see that older drivers did not perform as expected. That is obvious, but every time someone mentions an 8500 beating a Ti500, OLDER benchmarks are always broken out. Doomtrooper and myself have shown a few links of current drivers; they keep toe to toe with the Ti4xxx cards in current games, not older games and tests (who the hell still plays Q3 at 800x600@16bpp, as some sites still test?).

Another thing: every company optimizes for something. nVidia does it just as well/badly as ATI. Think about this: in Quake 3 (long used as a game benchmark), the GF3/GF4 cards blow by the ATI cards, but in games such as Jedi Knight 2 (a Quake 3 engine game - yes, I know it's modified) the 8500 stands on even ground, playing king of the hill with the Ti4xxx cards.

I like the Ti4xxx cards - they are fast, but they have flaws (just like the 8500 cards). I've gone so far as to challenge friends to image comparison shots, and have played side by side on Trinitron monitors against a Visiontek Ti4600. Stop at the same point and you can see differences. The problem with that is that image comparison by eye is very subjective and prone to failure.
 
Oh, and if you're going to look at just specs, the 8500 should win far more than it does...after all, my GeForce4 Ti 4200 beats out an 8500 in most situations, even when it's clocked a fair bit lower (250/512).

My dear good sir, one constant law here in the 3D world is that specs mean nothing (wasn't that Dave Barron's old sig?). A simple look at the K2 shows us that different technologies can provide great results with lower specs (the K2's specs are about those of a TNT2 Ultra, yet it performs near the level of a GF2). The GF4 has many more enhancements in its rendering engine that help it be more efficient (its new memory controller for one, lossless Z-compression, etc.). Really, it's a no-brainer to see why the GF4 Ti 4200 was/is faster. Besides, I would hope a new product is faster than an older one :)


Back on topic: isn't it strange how, as we get more answers from JC, we get yet more questions? I'm not so sure I know much more than before we had Rev and Chalnoth ask him......
 
jb said:
Back on topic: isn't it strange how, as we get more answers from JC, we get yet more questions? I'm not so sure I know much more than before we had Rev and Chalnoth ask him......

That's his secret...

let everyone else make the news for him.
 
Can you guys look at what each other writes?

Yes, today, in most cases, the R8500 beats the GF3 Ti 500. That's a great improvement.

Yes, in the 6-7 months after its release, the R8500 was not the best card, and the GF3 Ti 500 was.

Issue closed.
 
Evildeus said:
Can you guys look at what each other writes?

Yes, today, in most cases, the R8500 beats the GF3 Ti 500. That's a great improvement.

Yes, in the 6-7 months after its release, the R8500 was not the best card, and the GF3 Ti 500 was.

Issue closed.

not fair, you summed it all up in less than 5 sentences...

;)
 