riddick performance comparison

Discussion in '3D Hardware, Software & Output Devices' started by hovz, Feb 26, 2005.

  1. hovz

    Regular

    Joined:
    May 10, 2004
    Messages:
    920
    Likes Received:
    0
  2. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    or get their hardware in the console with the most games ported to the PC... oh wait, they did that this time :)
     
  3. Mendel

    Mendel Mr. Upgrade
    Veteran

    Joined:
    Nov 28, 2003
    Messages:
    1,350
    Likes Received:
    17
    Location:
    Finland
    I cry foul, it's just way too TWIMTBP-tastic a game, and it even originated on the Xbox (Nvidia graphics there too)

    So it's not like they spent even amounts of time optimizing for both Ati and Nvidia.
     
  4. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    Of course. Everything should be improved :D Priorities are priorities, however, and if you look at the numbers the difference is not that vast. At the very top it is probably unplayable on any hardware anyway, and the X800 has a knack for levelling up once FSAA and AF come into play.

    1280x1024 with 4xFSAA & 8xAF looks like a nice setting and it might feel playable at around 40 FPS. Not sure how different the experience would be between 48 and 42 FPS, on the 6800 Ultra and X800 XT PE respectively.
     
  5. Mark

    Mark aka Ratchet
    Regular

    Joined:
    Apr 12, 2002
    Messages:
    604
    Likes Received:
    33
    Location:
    Newfoundland, Canada
    Good article, but Riddick does have benchmark commands. timedemo, record, etc. all work as expected.

    I used them in my last two video card reviews, here and here, and they worked fine.
     
  6. ondaedg

    Regular

    Joined:
    Oct 5, 2003
    Messages:
    350
    Likes Received:
    1
    Welcome to the 21st century! Just remember, HL2 was developed on ATI hardware, yet it is considered a good benchmark. Doom 3 may be more suitable for Nvidia, and it too is benchmarked heavily. I don't know how we can demand a "level playing field" for a benchmark. Take the numbers as a result and live with it. That's my motto!

    The good news is that all the cards performed well with AA and AF. GeForce owners may get to up the res a notch or two. I don't see it as a demolishing of ATI, though. If the ATI cards had trouble turning the eye candy on and the GeForce didn't, then there would definitely be a big advantage to Nvidia. I just don't see a clear-cut winner or loser. I would call it a "slight" advantage to Nvidia in that game.

    Also worth noting: it is an OpenGL game, and we know how Nvidia has typically been better under that API.
     
  7. SsP45

    Newcomer

    Joined:
    Feb 28, 2003
    Messages:
    141
    Likes Received:
    3
    Location:
    Canada
    I find it interesting how turning on PS 2.0++ mode drops performance by a factor of 3 on GeForce 6800 cards. What extra effects are being used that kill performance?
     
    #7 SsP45, Feb 26, 2005
    Last edited by a moderator: Mar 13, 2011
  8. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    I have no idea, but I assumed I should be running it in 2.0++, and it ran terribly. I quickly put the game away until I read this article. Now that I have set it to 2.0 it runs very well, and I haven't been able to tell what the difference is (I didn't look very closely).

    I had a very interesting experience when I switched from 2.0++ to 2.0. The increased frame rate greatly increased the perceived quality. It had been a long time since I saw and felt that effect. Fluid motion makes all the difference.

    Whatever it is 2.0++ does over 2.0, it is not conducive to a good game-playing experience on current hardware.

    It would be interesting to see a 2.0 and 2.0++ comparison on x800 hardware. (or is x800 not permitted to use 2.0++ mode at all?)
     
  9. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    IIRC, that path is only available to NV4x cards and the benefit is soft shadows.
     
  10. mustrum

    Regular

    Joined:
    Dec 26, 2002
    Messages:
    288
    Likes Received:
    0
    Used driver: Driver version 67.66 (obtained off nZone website)

    Riddick gets a major performance boost with the new 75.90 FW, so the difference is even bigger.
     
  11. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    I just want to point out that I use 75.90 and the "major boost" is not big enough to play 2.0++ mode. Perhaps this mode is even unaffected by the boost in performance, who knows.

    This game is also showing very strong shimmering/moire everywhere. Some parts are fine and then there are areas that are complete eyesores.
     
  12. Mendel

    Mendel Mr. Upgrade
    Veteran

    Joined:
    Nov 28, 2003
    Messages:
    1,350
    Likes Received:
    17
    Location:
    Finland
    I find the major drawback in the game to be the mediocre polygonal detail.

    This game is not the only one in this respect, but when you see the intersection of any two walls, for example, the line between them is a bit too obvious for my liking.
     
  13. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    It is interesting to note that Riddick has much smoother character models than Doom 3. This is readily observable in third-person view when you enter a medical station, and in cut-scenes. What is interesting is to look for shadow polygon popping. It doesn't always happen, and when it does it is not always a major problem. However, I find this interesting in relation to Doom 3 as a design decision, because sometimes when lower-poly models are used it makes more sense when we also see the less smooth shadows. It makes geometrical sense to the eyes and brain, whereas a very smooth human head with sudden, erratic, sharp polygonal shadow popping is quickly picked up as an aberration.

    Highly off topic, but I found this interesting.

    PS. Is anyone else seeing moire in this? I am getting sick and tired of all the moire I am seeing in some games (way too many) and am considering starting a new thread about it. We had a "shimmering with 6800 GT even with High Quality" thread before, but it went nowhere. This time I am determined to grab as many screens as possible. I would make videos documenting it as well if I had some place to host them.
     
  14. DudeMiester

    Regular

    Joined:
    Aug 10, 2004
    Messages:
    636
    Likes Received:
    10
    Location:
    San Francisco, CA
    The moire is caused by aliasing within the surface itself; the only real solution is super-sampling. Although this is kind of redundant as long as you can just run at a higher actual resolution. Also, super-sampling on the NV40 is limited to 1024x768, which can be problematic. I must say, though, that when you combine 4xMSAA and 4xSSAA for an effective 16xAA, it looks damn good! BTW, you'll need RivaTuner or something to enable this mode.
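    For the curious, the way the mixed modes multiply can be sketched with a little arithmetic (illustrative only; the function names are made up, and note that texture content only benefits from the SSAA part, since MSAA resolves geometry edges):

```python
import math

def effective_samples(msaa, ssaa):
    # MSAA runs per pixel of the supersampled image, so the two
    # modes multiply: 4xMSAA over 4xSSAA -> 16 samples per final
    # pixel on geometry edges.
    return msaa * ssaa

def supersampled_size(width, height, ssaa):
    # 4x ordered-grid SSAA renders at 2x the width and 2x the
    # height, then downsamples to the target resolution.
    scale = math.isqrt(ssaa)
    return width * scale, height * scale

print(effective_samples(4, 4))          # 16
print(supersampled_size(1024, 768, 4))  # (2048, 1536)
```

    This also shows why the 1024x768 cap bites: 4xSSAA at that setting is already shading a 2048x1536 image.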
     
  15. Mendel

    Mendel Mr. Upgrade
    Veteran

    Joined:
    Nov 28, 2003
    Messages:
    1,350
    Likes Received:
    17
    Location:
    Finland
    The only other solution:

    On older drivers I get the moire; on newer drivers there is a checkbox in the advanced quality options that seems to work for me: the LOD bias option.
     
  16. jb

    jb
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,636
    Likes Received:
    7
    Wasn't the port an OpenGL version? If so, why was it not D3D? Technical reasons? Just wondering, as other console ports to the PC were kept at D3D (Halo, Red Faction, RedMercury, etc.).
     
  17. DudeMiester

    Regular

    Joined:
    Aug 10, 2004
    Messages:
    636
    Likes Received:
    10
    Location:
    San Francisco, CA
    LOD bias just scales the mip-map levels the driver selects, biasing toward lower or higher resolution. Setting it to positive values gives you lower resolution, and negative values give you higher. Probably what you are doing is causing it to use lower-resolution normal maps and such, thus smoothing out the aliasing issue. However, this is not really a good solution, because it puts a limit on your effective texture resolution.
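    A rough sketch of what that bias does to mip selection (a simplified mental model, not actual driver code; the function name and the rounding/clamp are illustrative):

```python
import math

def select_mip_level(texel_to_pixel_ratio, lod_bias, max_level):
    """Pick a mip level from the screen-space texel footprint.

    texel_to_pixel_ratio: how many texels map to one pixel (>1 means
    the texture is minified, so a smaller mip should be used).
    lod_bias: positive pushes toward lower-resolution (blurrier) mips,
    negative toward higher-resolution mips (sharper, more shimmer).
    """
    base_lod = math.log2(max(texel_to_pixel_ratio, 1.0))
    lod = base_lod + lod_bias
    return min(max(round(lod), 0), max_level)

# Minified surface (4 texels per pixel): no bias picks mip 2,
# a +1 bias picks the blurrier mip 3, and a -1 bias picks the
# sharper mip 1 -- which is where moire tends to appear.
print(select_mip_level(4.0, 0.0, 10))   # 2
print(select_mip_level(4.0, 1.0, 10))   # 3
print(select_mip_level(4.0, -1.0, 10))  # 1
```

    This is why the checkbox Mendel mentions can hide the shimmer: nudging the bias positive trades texture sharpness for stability.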
     
  18. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,180
    Likes Received:
    2,791
    Location:
    Winfield, IN USA
    You mean it gets too blurry. ;)
     
  19. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    OK, what I wanted to begin a new thread about was whether this moire is seen on other hardware, or whether even my 6800 was somehow defective (which seems unlikely, because it seems to be functioning fine otherwise). So you are saying this moire is seen on X800 hardware as well, and the only choice available to them is upping the resolution? For some reason I was thinking this might be some issue specific to the 6000 series. Let's just say my level of trust in Nvidia is not where it once was.

    Upping the resolution helps a bit, but some titles are not too happy running at 1600x1200, and I can imagine this will only get worse with more demanding titles coming out. I also think it's safe to say that SSAA will be out of the question.
     
  20. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    For some reason I think the 6800 renders things too sharp. More recent drivers seem to help quite a bit in this regard, but with the first revision of drivers it was really like the hardware was trying to show way more detail than was available/possible.

    This moire is also becoming more apparent in newer titles, but some older ones have it too, albeit perhaps in a more limited form. I am having a hard time understanding why one texture out of 20 on the screen would exhibit this behavior and no other. For a while I was ready to throw my 6800 in the trash and get an X800, but I am not sure that would help with the problem. Furthermore, as I already said, newer drivers seem to help, but this also seems to be on a game-to-game basis. This makes very little sense to me, because the problem would seem to indicate a general issue and not something that requires a specific fix on a per-title basis. Anyway.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.