R300 the fastest for DoomIII, John Carmack Speaks Again

Discussion in 'Architecture and Products' started by multigl2, May 28, 2002.

Thread Status:
Not open for further replies.
  1. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I don’t see it like that - I would say it’s designed with what John wants in mind, and with an eye on all hardware. It’s safe to say that John has liked NVIDIA cards because of the quality of their drivers and probably because they are first to market with the fastest cards, but I would not say that equates to him designing the game exclusively with NVIDIA’s feature set in mind. He has, after all, been fairly vocal about the GF3/4’s limited fragment abilities (and compared them negatively to ATi’s in the past), and if he were designing Doom III to NVIDIA’s hardware rather than to his own needs / specifications, I doubt it would have operated in 2 passes on the GF3/4.

    While it’s almost certainly true he’s designed the game with NVIDIA graphics in mind, I doubt that’s to the exclusion of other graphics hardware.
     
  2. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    So, if a company sees that an application is doing something suboptimally, they shouldn't take action to make it work better? Sorry, that's not going to happen, or else the people who didn't do these sorts of optimizations would be at a disadvantage versus those who did.

    Secondly, you seem to think ATI is the only company doing application detection... :lol:
     
  3. LeStoffer

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,262
    Likes Received:
    22
    Location:
    Land of the 25% VAT
    Goddammit! How come we're having this utter crap 8500 vs. GF3/4 argument over and over and over again? :cry:

    A few of you guys are really dense in my book (none mentioned, none forgotten).
     
  4. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    I'll try one last time:
    - ATI had an optimization for Quake 3
    - There was a bug in the optimization that affected image quality
    - Changing the name of the Quake 3 executable disabled the optimization
    - ATI released newer drivers with the optimization fixed
    - Quake 3 scores didn't appreciably change
    - No more image quality problem in Quake 3 and changing the name of the executable has no effect

    I figured this out months before I joined ATI. It's not that hard.

    Tell me again what I am overlooking?
     
  5. Nappe1

    Nappe1 lp0 On Fire!
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,532
    Likes Received:
    11
    Location:
    South east finland
    Le Stoffer: well, this is what you get when a few people forget what objectivity is. :(
     
  6. Randell

    Randell Senior Daddy
    Veteran

    Joined:
    Feb 14, 2002
    Messages:
    1,869
    Likes Received:
    3
    Location:
    London
    What I really, really like is one side accusing the other of 'fanboyism' in these arguments :roll:
     
  7. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Personally, I don't think speed has been his primary concern, not since he started on the Q3 engine. It's more about feature set. It helped that the original GeForce was first with T&L and was the speediest card at the time.

    Let's hope he replies to my latest questionnaire, especially regarding IHV-specific OGL calls.
     
  8. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Well, I was going to include that, but it’s not strictly true that NVIDIA are first to market with the highest feature set – i.e. the Radeon’s feature set eclipsed the GF2’s for the time until the GF3 became available, and the Radeon 8500’s features eclipse both the GF3 & 4 in areas, specifically its fragment abilities, which JC has highlighted as being more flexible / capable and slightly closer to what he’s been calling for.

    Another issue with feature set is the OpenGL lag factor – while I don’t doubt JC has played with all the various vendor-specific extensions, he has demonstrated a reluctance to support them within the actual game engine unless numerous vendors support them (e.g. S3TC). Given that OpenGL has currently fallen behind hardware development, I’m not sure how much feature set can play into his engine design. I think OpenGL2 development is very important in this respect, because it should move the API beyond the capabilities of the hardware for a little while, meaning there is larger blanket coverage of supported features, and the compilers should be able to figure out the capabilities of the hardware; it’s for this reason I’m most keen to hear his thoughts on OpenGL2 development.
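    The multi-vendor extension check Dave describes is the gate an engine runs before relying on something like S3TC. A minimal sketch of that check, with the extension string hardcoded for illustration (in a real engine it would come from glGetString(GL_EXTENSIONS)):

```python
# Hypothetical sketch: deciding whether to use a widely-supported OpenGL
# extension. The string below is an illustrative stand-in for the
# space-separated list a driver actually reports.
def has_extension(ext_string: str, name: str) -> bool:
    """Token-wise lookup. A plain substring test would be a bug: it would
    wrongly report 'GL_EXT_texture' as present whenever a longer name
    like 'GL_EXT_texture_compression_s3tc' is in the list."""
    return name in ext_string.split()

extensions = ("GL_ARB_multitexture "
              "GL_EXT_texture_compression_s3tc "
              "GL_NV_register_combiners")

# Enable compressed textures only when the cross-vendor extension exists.
use_s3tc = has_extension(extensions, "GL_EXT_texture_compression_s3tc")
```

    The same gate, applied per vendor-specific extension, is why an engine that targets many cards tends to stick to extensions several IHVs expose.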
     
  9. Humus

    Humus Crazy coder
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,217
    Likes Received:
    77
    Location:
    Stockholm, Sweden
    But as I've pointed out before, in the general case it's not possible to just rip alpha testing out and replace it with alpha blending. In a UT renderer it's easy, as the engine takes care of depth sorting for you, but in the general case it's much harder to do. Alpha testing discards fragments, which means depth buffer values will be correct after drawing; alpha blending does not, putting "incorrect" depth values in the depth buffer so that you can't see through the surface if anything is drawn behind it later on.

    btw, Doom, it's not alpha textures that are the problem, it's alpha testing. Textures with alpha channels are used for both techniques.
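    Humus's depth-buffer point can be shown with a single-pixel software sketch (hypothetical code, not from any engine): an alpha-tested transparent texel is discarded entirely and leaves the depth buffer alone, while a blended one writes its depth and then occludes geometry drawn behind it later.

```python
# One-pixel model of the alpha-test vs. alpha-blend difference.
# pixel = {'color': (r, g, b), 'depth': z}; smaller depth = closer.
def draw(pixel, frag_depth, frag_color, frag_alpha, mode, threshold=0.5):
    if frag_depth >= pixel['depth']:           # standard depth test
        return
    if mode == 'alpha_test':
        if frag_alpha < threshold:             # transparent texel: discard,
            return                             # no color OR depth write
        pixel['color'] = frag_color
        pixel['depth'] = frag_depth
    elif mode == 'alpha_blend':
        a = frag_alpha                         # blend color, but the depth
        pixel['color'] = tuple(a * f + (1 - a) * b
                               for f, b in zip(frag_color, pixel['color']))
        pixel['depth'] = frag_depth            # write happens regardless

# Fully transparent fence texel at depth 0.4, then a green wall at 0.6.
tested = {'color': (0, 0, 0), 'depth': 1.0}
draw(tested, 0.4, (1, 0, 0), 0.0, 'alpha_test')    # discarded
draw(tested, 0.6, (0, 1, 0), 1.0, 'alpha_test')    # wall shows through hole

blended = {'color': (0, 0, 0), 'depth': 1.0}
draw(blended, 0.4, (1, 0, 0), 0.0, 'alpha_blend')  # writes depth 0.4 anyway
draw(blended, 0.6, (0, 1, 0), 1.0, 'alpha_blend')  # wall fails depth test: lost
```

    Without the depth sorting Humus mentions (drawing the wall before the fence), the blended path never shows the wall, which is exactly why the substitution isn't safe in the general case.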
     
  10. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    Heh. Nah, I was being facetious. With Doom 3 not shipping until sometime next year, Brian really should've just said next-gen Nvidia hardware rather than a specific product.

    Hey, us English majors like to nit-pick. 8)
     
  11. jb

    jb
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,636
    Likes Received:
    7

    Notice that in Q3 the 8500 pretty much loses to the GF3 Ti 500 by a decent margin in most reviews. However, in all other Q3-based games, the 8500 is dead even with, if not slightly ahead of, the GF3 Ti 500. Kind of an interesting tidbit to consider. Also, if your point were true, we would only see a high spike in the scores for the one game that is optimized. We are not seeing that, as the 8500's scores are consistent throughout a lot of other games (consistent meaning it scores where we'd expect given its specs and the scores of other cards). The 8500 has not been generally worse than other cards in the other benchmark scores.
     
  12. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Here's an interesting read:
    http://www.simhq.com/simhq3/hardware/reviews/atiradeon8500/index5.shtml

    Notice the failure to run Janes F/A-18 or Ghost Recon properly...and don't forget, this was in March (i.e. long after the release of the 8500 and 3Ti's).

    It's also interesting to note that after throwing out the games that didn't run correctly, the GeForce3 (not Ti) still won half the benches...

    Oh, and if you're going to look at just specs, the 8500 should win far more than it does...after all, my GeForce4 Ti 4200 beats out an 8500 in most situations, even when it's clocked a fair bit lower (250/512).
     
  13. Fred

    Newcomer

    Joined:
    Feb 18, 2002
    Messages:
    210
    Likes Received:
    15
    The GF3 really shone performance-wise when BOTH FSAA and aniso were enabled.

    Taken separately, it was less impressive.

    So, ignoring image quality:

    - Regular tests: both cards roughly a tie
    - Aniso only: ATI by a long shot
    - FSAA only: Nvidia by a long shot
    - Aniso + FSAA: Nvidia with a good edge

    Of course, image quality was always and will always be a subject of debate. The combination of aniso + FSAA (the highest-quality setting at a playable rate) is probably where one would want to look.

    The R8500's high-quality settings (everything maxed) seemed to have better texture clarity and more vibrant colors, as well as no issues with alpha tests. The downside was that you could see mipmap issues due to their weird LOD.

    Nvidia's high-quality setting, on the other hand, had much better texture antialiasing (motion is much more agreeable with Nvidia's offering) and generally slightly higher resolution for an equivalent framerate with everything maxed. Jaggies were about equal between both cards from what I saw (slight edge maybe to the R8500), alpha textures aside. The downside of Nvidia's IQ was noticeable blurring in some distant textures (lack of 128-tap aniso), as well as annoying issues with texture compression in some games.

    That's my opinion, at least.
     
  14. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Higher texture LOD always makes the image look like it has "more vibrant colors." I think it's sad that ATI will call a setting where the texture LOD is set so high as to cause significant texture aliasing "high quality." The only reason I can think of for this is that texture aliasing is usually only apparent when the scene is in motion...and most review sites look at/post screenshots for comparison...

    Btw, for those of you with GeForces, you can also adjust the texture LOD through a tweak utility...but I've always thought it looks quite bad, even if the aliasing is only increased slightly.
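    The LOD knob those tweak utilities expose is a bias added to mipmap selection. A rough sketch of the idea (illustrative numbers, not any driver's actual code): the base LOD is the log2 of the texture footprint per screen pixel, the bias shifts it, and a negative bias selects a sharper mip than the footprint warrants, which is where the aliasing comes from.

```python
import math

# Hypothetical model of mip selection with a user LOD bias.
def mip_level(texels_per_pixel: float, lod_bias: float, max_level: int) -> int:
    """Base LOD = log2(texels covered per pixel); the bias is added
    before clamping to the available mip chain. Negative bias picks a
    larger (sharper) mip -> more detail, but also texture aliasing."""
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return max(0, min(max_level, round(lod)))

# A distant surface covering 8 texels per pixel normally samples mip 3;
# a -1.0 bias forces mip 2, i.e. 2x the texel frequency on screen.
unbiased = mip_level(8.0, 0.0, 10)
biased = mip_level(8.0, -1.0, 10)
```

    That extra texel frequency is mostly invisible in a still screenshot and only shows up as shimmer in motion, which matches KimB's point about why it survives screenshot-based comparisons.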
     
  15. iRC

    iRC
    Newcomer

    Joined:
    May 29, 2002
    Messages:
    12
    Likes Received:
    0
    An old review on old drivers – not really that interesting, seeing as most users will be running newer drivers, and such issues are likely no longer present.

    Perhaps you might like to look at some reviews that use current drivers; it might update your knowledge a little.

    http://www.sharkyextreme.com/hardware/videocards/article.php/3211_1143561__5

    Higher internal precision can also make colours look more vibrant. I think it’s a bit ‘sad’ that NVIDIA has chosen not to implement something like this to attain high quality.

    fanboys - gotta love em :roll:
     
  16. jandar

    Newcomer

    Joined:
    May 27, 2002
    Messages:
    225
    Likes Received:
    1
    Location:
    JVille, FLA
    Yes, the drivers should have been better than that at that time, however, current drivers are very very nice.

    Pulled from an nvnews.net news post:
    http://www.gamepc.com/labs/view_content.asp?id=vt4200&page=7

    As you can see from this, the 8500 actually BEATS a Ti 4600 at 1600x1200.

    ATI's drivers are far, far better than everyone gives them credit for. Care to go on about DXTC and nVidia? I know you can hack it to make it work, but does it work straight off with current drivers?

    Not flaming, not dropping into fanboyism (although some might think otherwise; I've owned too many different cards). I can see that older drivers did not perform as expected. This is obvious, but every time someone mentions an 8500 beating a Ti 500, OLDER benchmarks are always broken out. Doomtrooper and I have shown a few links with current drivers; they keep toe to toe with the Ti 4XXX cards in current games, not older games and tests (who the hell still plays Q3 at 800x600@16bpp, as some sites still test?).

    Another thing: every company optimizes for something. nVidia does it just as much as ATI. Think about this: in Quake 3 (long used as a game benchmark), the GF3/GF4 cards blow by the ATI cards. But in games such as Jedi Knight 2 (a Quake 3 engine game – yes, I know it's modified), the 8500 stands on even ground, playing king of the hill with the Ti 4XXX cards.

    I like the Ti 4XXX cards; they are fast, but they have flaws (just like the 8500 cards). I've gone so far as to challenge friends to image comparison shots, and have played side by side on Trinitron monitors against a Visiontek Ti 4600. Stop at the same point and you can see differences. The problem is that image comparison by eye is very subjective and prone to failure.
     
  17. jb

    jb
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,636
    Likes Received:
    7
    My dear good sir, one constant law here in the 3D world is that specs mean nothing (wasn't that Dave Barron's old sig?). A simple look at the K2 shows us that different technologies can provide great results with lesser specs (the K2's specs are those of a TNT2 Ultra, yet it performs near the level of a GF2). The GF4 has many more enhancements in its rendering engine that help it be more efficient (its new memory controller for one, lossless Z-compression, etc.). Really, it's a no-brainer to see why the GF4 Ti 4200 was/is faster. Besides, I would hope a new product is faster than an older one :)


    Back on topic, isn't it strange how as we get more answers from JC we get yet more questions? I'm not sure I know much more than before we had Rev and Chalnoth ask him......
     
  18. jandar

    Newcomer

    Joined:
    May 27, 2002
    Messages:
    225
    Likes Received:
    1
    Location:
    JVille, FLA
    That's his secret...

    let everyone else make the news for him.
     
  19. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Can you guys look at what each other writes?

    Yes, today, in most cases, the R8500 beats the GF3 Ti 500. That's a great improvement.

    Yes, in the 6-7 months after its release, the R8500 was not the best card, and the GF3 Ti 500 was.

    Issue Closed.
     
  20. jandar

    Newcomer

    Joined:
    May 27, 2002
    Messages:
    225
    Likes Received:
    1
    Location:
    JVille, FLA
    Not fair, you summed it all up in fewer than 5 sentences...

    ;)
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.