Tomb Raider AOD with Cg

Discussion in 'Graphics and Semiconductor Industry' started by cho, Aug 15, 2003.

  1. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    416
    Likes Received:
    2
    Tomb Raider AOD with Cg

    [screenshot]

    [screenshot]

    You need to install "CgSetup.exe" (downloaded from the NVIDIA site) to make the "Cg compiler" option active.
     
    #1 cho, Aug 15, 2003
    Last edited by a moderator: Jul 16, 2007
  2. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    Doesn't look any different to me, and performance is basically the same. . . Am I missing something?
     
  3. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    416
    Likes Received:
    2
    Yes, they are almost the same, but... :D

    RADEON 9800 PRO 256MB:

    [screenshot]

    [screenshot]
     
  4. StealthHawk

    Regular

    Joined:
    May 27, 2003
    Messages:
    459
    Likes Received:
    0
    Location:
    I exist
    Ok, I have a question. Were TR:AoD's shaders hand-written, or was the DX9 HLSL used :?:

    Performance difference = :shock: Good thing NVIDIA is letting up on pushing Cg. If we had lazy developers who only used Cg, it would kill everyone else's performance.
     
  5. kipper67

    Newcomer

    Joined:
    Jul 18, 2003
    Messages:
    119
    Likes Received:
    1
    Location:
    UK
    Cho - is there any chance you could put up a couple of screenshots or details of the settings you are using? I'm getting nowhere near the same results as you with my 9800 Pro.

    Thanks

    Mark
     
  6. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    Ah. . . A clear indication that Cg prioritizes lower register usage over lower instruction counts. Heh. . . It doesn't even appear to have any positive effect for the GeForce FX in this case, but it screws over other video cards.
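    For illustration, here's a made-up Cg fragment (nothing to do with the actual TR:AOD shaders) where a backend genuinely has to choose between the two goals:

        // Hypothetical example only. 'invLen' is needed both before and
        // after the blending work in the middle. A backend can keep it
        // live in a register across that stretch (fewer instructions,
        // more register pressure) or discard it and recompute the
        // DP3/RSQ pair later (more instructions, fewer live registers).
        // The presumption here is that Cg's backend picks the second,
        // which suits the GeForce FX and does nothing for the R300.
        float4 two_uses(float3 N, float3 L, float4 base, float4 detail) : COLOR
        {
            float invLen = rsqrt(dot(N, N));             // first use
            float4 lit   = base * saturate(dot(N, L) * invLen);

            float4 mixed = lerp(lit, detail, detail.w);  // unrelated work with
                                                         // its own temporaries

            float rim = 1 - saturate(dot(N, float3(0, 0, 1)) * invLen);  // second use
            return mixed + rim * detail;
        }

    Neither goal is wrong in the abstract; the question is simply which one the only available backend chases.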
     
  7. MuFu

    MuFu Chief Spastic Baboon
    Veteran

    Joined:
    Jun 12, 2002
    Messages:
    2,258
    Likes Received:
    51
    Location:
    Location, Location with Kirstie Allsopp
    Cool, they wrote a shader to emulate memory errors on a manhole cover. :D

    MuFu.
     
  8. Chris123234

    Regular

    Joined:
    Jan 22, 2003
    Messages:
    306
    Likes Received:
    0
  9. ZoinKs!

    Regular

    Joined:
    Nov 23, 2002
    Messages:
    782
    Likes Received:
    13
    Location:
    Waiting for Oblivion
    This clinches it... Cg is evil. :evil:

    Of course, some of us already knew that... :wink:

    edit: I've thought of an alternate explanation in which evil is not required... maybe Cg simply doesn't work. nVidia may be backing away from it because it's not worth the effort to try to fix it. :?:
     
  10. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    This is exactly the type of example Russ needs to look at, where people who questioned Cg's use were told they were looking at conspiracy theories. :lol:
     
  11. nooneyouknow

    Newcomer

    Joined:
    Feb 8, 2002
    Messages:
    87
    Likes Received:
    0
    Cg was used for NVIDIA boards and HLSL was used for ATI boards.
     
  12. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    You never quit, do you?

    I mean, this is OBVIOUS PROOF that NVIDIA was out to screw competitors.

    It couldn't be that the Cg backend optimizes for something different than what the R300 finds optimal.

    Nope, proof that NVIDIA is evil.

    It also couldn't be that the Cg compiler is somewhat sub-optimal in general?

    Nope, it's proof that NVIDIA is evil.

    :roll:
     
  13. gkar1

    Regular

    Joined:
    Jul 20, 2002
    Messages:
    614
    Likes Received:
    7
    Wait wait!
    You forgot to argue that they were using FRAPS. According to NVIDIA this application is defective, right? :lol:
     
  14. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    So what's the benefit of using Cg? If the compiler generates inferior code, then there seems to be little reason to select it over HLSL.

    P.S. I'd call the results on R300 a bit more than "somewhat" sub-optimal.
     
  15. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    You know me better than that :lol:
     
  16. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    Russ, wasn't it you who started this thread:

    http://www.beyond3d.com/forum/viewtopic.php?t=1764&highlight=

    And I'll quote from that thread:


    Funny how you have no problem using "those little smiley faces with eyes that roll upward" that you so detest when someone actually shows you an example of Cg being a detriment to a card other than NVIDIA's.

    :roll: :roll: :roll: :roll: :roll: :roll: :roll: :roll: :roll: :roll:
     
  17. RussSchultz

    RussSchultz Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Sigh. Are we going to have the same discussion every time?

    1) No, this example, or the last one, or the next one, does not show the _language_ is somehow geared toward screwing anybody.
    2) No, this example, or the last one, or the next one, does not show that the _idea_ of pluggable backends is geared toward screwing anybody.
    3) No, this example, or the last one, or the next one, is not a technical argument as to how Cg (the language or the idea) is only good for NVIDIA and a detriment toward others.

    It does show that the implementation (i.e. the backend) is geared toward NVIDIA hardware. How? We can presume they optimize for the fewest registers required, rather than the smallest number of assembly instructions.

    If ATI would (though they won't) develop their own backend, then this problem wouldn't exist. Their backend would optimize for whatever is best for them. But they won't, so it does.

    So, please, just let it rest. It's apparent Cg is dead on the vine, but that doesn't mean it was:
    a) A bad technical idea
    b) An idea borne to control the market by putting others at a disadvantage
    c) Inherently evil
     
  18. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    If you're speaking from the developer's point of view, this is wrong.

    D3DX is default. Cg is an option for ALL boards. Core Design said (this is from the benchmarking readme of this game that I wrote):

    I am at a loss as to why users need to download and install the Cg runtime package to enable the use of the Cg compiler (which is bundled with the game, although it is an outdated one). This shouldn't be the way it works. Looks like NVIDIA doesn't want this to work as intended, for a reason.
     
  19. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    As you like to say: Prove to me that it wasn't any of the things you've said above. You've asserted lots of things here, now prove them. ;)

    The only thing I see from this particular example is that it does nothing but lower ATI's FPS. It doesn't change the FPS on the GFFX at all. I would think that the GFFX card would have at least seen some benefit from using Cg, yet that's not the case. So what we have here in this example is that using Cg does nothing to help the GFFX card yet dramatically lowers the competition's cards' FPS. **Cues the X-Files theme music*** :p
     
  20. Ostsol

    Veteran

    Joined:
    Nov 19, 2002
    Messages:
    1,765
    Likes Received:
    0
    Location:
    Edmonton, Alberta, Canada
    Well, as Russ said: the problem is simply that there is no profile for ATI cards. Of course, he definitely knows that there's no way in hell ATI will provide any support for Cg unless it becomes a standard accepted by a bunch of other vendors. Given that ATI and NVIDIA are the only ones with DX9 cards out, that's unlikely. The only other reason ATI might support Cg is if either Microsoft or the ARB officially endorses it as a standard HLSL. Given that both APIs now have their own, though. . . Again: support isn't ever likely.

    Didn't someone say that DX HLSL will soon have the capability to prioritize lowering register usage for NVIDIA cards? If that happens, then the only purpose Cg will serve is to provide functionality to explicitly make use of the GeForce FX's multiple precisions -- specifically FX12.
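    For reference, that precision control sits right in Cg's type system. A toy fragment, purely illustrative (not from the game or any real codebase):

        // Toy Cg code showing the explicit precision types. Under NVIDIA's
        // own fragment profiles, 'fixed' maps to FX12 fixed-point and
        // 'half' to FP16, while 'float' stays FP32. DX9 HLSL has half and
        // float, but nothing that reaches FX12.
        fixed4 modulate(half2 uv : TEXCOORD0,
                        uniform sampler2D baseMap,
                        uniform fixed4 tint) : COLOR
        {
            fixed4 texel = tex2D(baseMap, uv);  // FX12 is plenty for an 8-bit texture
            return texel * tint;                // the whole path stays low precision
        }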
     