Tomb Raider AOD with Cg

Discussion in 'Graphics and Semiconductor Industry' started by cho, Aug 15, 2003.

  1. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Whoops, a combination of my misunderstanding and Core Design's mistake.

    The game bundles the Cg compiler. Two DLLs are supplied - cg.dll and cgD3D8.dll ... cgD3D8.dll was bundled by mistake; it should've been cgD3D9.dll. The next patch should fix this. No need to download/install NVIDIA's compiler/runtime package (after the next patch, of course).
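
    For reference, this is roughly what the D3D9 path looks like once the right DLL ships - a minimal sketch assuming the stock Cg 1.x runtime entry points (cgCreateContext, cgD3D9SetDevice, cgD3D9GetLatestPixelProfile, cgD3D9LoadProgram); the helper name, device pointer and shader source are placeholders:

        // Sketch: binding the Cg runtime to D3D9. The core compiler lives in
        // cg.dll; the API-specific runtime below resolves against cgD3D9.dll,
        // which is exactly the DLL the game shipped the D3D8 flavour of.
        #include <d3d9.h>
        #include <Cg/cg.h>
        #include <Cg/cgD3D9.h>

        CGprogram initCgForD3D9(IDirect3DDevice9* device, const char* source)
        {
            CGcontext ctx = cgCreateContext();
            cgD3D9SetDevice(device);                       // bind the runtime to this D3D9 device
            CGprofile ps = cgD3D9GetLatestPixelProfile();  // e.g. CG_PROFILE_PS_2_0 on DX9 parts
            CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, source, ps, "main", NULL);
            cgD3D9LoadProgram(prog, CG_FALSE, 0);          // assemble + create the D3D9 shader
            return prog;                                   // error checks elided for brevity
        }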
     
  2. Ante P

    Veteran

    Joined:
    Mar 24, 2002
    Messages:
    1,448
    Likes Received:
    0
    I don't know if it's the lack of 256 MB on my board, but I just tried the game with my 9800 Pro 128 MB and there's no chance in hell I ever reached 85 fps. I checked with fraps and it was running at 35-55 fps; it never went above 60 fps.

    Running the PS2.0 settings, 1024x768, no AA/AF, vsync off. Catalyst 3.6, nForce2, Athlon XP 2600+, 512 MB PC2700 etc.
     
  3. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Presently, with the current DX9 HLSL, FX cards are FASTER running that vs. Cg, as shown by the new Tenebrae mod author AND now this example.

    I might add that Russ was wrong, but he is too stubborn to admit it, and that is a fact :D
     
  4. RussSchultz

    Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Wrong about what, exactly?

    That Cg is some sort of conspiracy? Well, no, I haven't seen any PROOF of that.

    That Cg, _the language_, is designed to hurt competitors? Nope, no proof of that either.

    That Cg, _the idea_, is designed to hurt competitors? Nope, no proof of that either.

    Like I said, you can come up with a whole bunch of examples that show how, by using the Cg system, competitors are disadvantaged, but it won't prove anything. They're disadvantaged by their own lack of work, not by the design, the language, or the idea. Until somebody comes up with their own backend that is tuned for their own piece of hardware and shows that they're being screwed, all the examples in the world won't show what you're asserting they do. (A sketch of where such a backend would plug in follows below.)

    But it's obvious that'll never happen, so it's all a complete circle of mental masturbation.
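
    On the "own backend" point: in the Cg runtime a backend surfaces to the application as nothing more than a profile choice at compile time. A minimal sketch - CG_PROFILE_PS_2_0 is a real profile enum, while the vendor-tuned alternative named in the comment is purely hypothetical:

        #include <Cg/cg.h>

        // Compiling a shader where the "backend" is just a profile selection.
        CGprogram compileShader(CGcontext ctx, const char* src)
        {
            // The generic profile everyone gets today.
            CGprofile profile = CG_PROFILE_PS_2_0;

            // If ATI shipped an R3xx-tuned backend, picking it would be one line;
            // CG_PROFILE_R3XX is hypothetical - no such backend exists.
            // profile = CG_PROFILE_R3XX;

            return cgCreateProgram(ctx, CG_SOURCE, src, profile, "main", NULL);
        }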
     
  5. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    From the example above, Cg does not benefit the GFFX card. Does it not have its own backend? What makes you think an ATI backend would help ATI cards if Cg doesn't even help the GFFX with its own backend?
     
  6. RussSchultz

    Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Why don't you just say "I hate NVIDIA", rather than coming up with these examples that don't prove what you're trying to assert?

    Yes, the compiler seems to stink.

    No, that doesn't show what you're asserting it does.
     
  7. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada

    Why don't you just say "I like to argue" rather than coming up with these lame-ass excuses??

    Now let's stop playing this game and just say "hey, I was wrong" - it isn't that hard :!:
     
  8. RussSchultz

    Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    Never let logic get in the way...

    You haven't shown through any technical argument or empirical evidence that Cg (the language, or the idea behind it) is somehow only good for NVIDIA hardware, and is a detriment to others.

    You also haven't outlined how Cg (the language, or the idea behind it) favors NVIDIA products while putting others at a disadvantage.

    These examples don't do that either.

    All these examples have shown that some part of the Cg compiler isn't very good, and/or the standard profile isn't optimal for the R3xx core. Both of which have nothing to do with the language, or the idea behind it favoring NVIDIA and handicapping competitors.

    It does not offer any proof of your assertion, nor does it somehow make me wrong.
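
    One way to actually test those two claims is to push the same source through both compilers and compare the generated ps_2_0 assembly - instruction count and register pressure are where "tight code" shows up on R3xx. A rough sketch using the stock D3DX9 and Cg runtime calls (error handling elided; it assumes a shader written in the common HLSL/Cg subset with a "main" entry point):

        #include <d3dx9.h>
        #include <Cg/cg.h>
        #include <cstdio>
        #include <cstring>

        void dumpBothCompilers(CGcontext ctx, const char* src)
        {
            // Path 1: Microsoft's HLSL compiler -> ps_2_0 bytecode -> disassembly.
            LPD3DXBUFFER code = NULL, listing = NULL;
            D3DXCompileShader(src, (UINT)strlen(src), NULL, NULL, "main",
                              "ps_2_0", 0, &code, NULL, NULL);
            D3DXDisassembleShader((const DWORD*)code->GetBufferPointer(),
                                  FALSE, NULL, &listing);
            printf("--- HLSL output ---\n%s\n",
                   (const char*)listing->GetBufferPointer());

            // Path 2: the Cg compiler targeting the very same ps_2_0 profile.
            CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, src,
                                             CG_PROFILE_PS_2_0, "main", NULL);
            printf("--- Cg output ---\n%s\n",
                   cgGetProgramString(prog, CG_COMPILED_PROGRAM));
        }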
     
  9. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    70
    Why don't I just step in here and end this.

    1) There's nothing "theoretical" that would make Cg a bad thing. In a non-competitive, altruistic environment, Cg could even have been "the best thing for everyone."

    2) Practically speaking, in a competitive landscape - including one with 2 other "standards" for HLSL development frameworks - Cg is just a bad, stinking pile of dog poo of an idea. To make matters worse, the state of the compiler suggests that the bad idea has been compounded with crap execution.

    Everyone happy?
     
  10. nelg

    Veteran

    Joined:
    Jan 26, 2003
    Messages:
    1,557
    Likes Received:
    42
    Location:
    Toronto
    Does it not just plain suck when the practical gets in the way of the theoretical?
     
  11. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    People in glass houses shouldn't throw stones.
     
  12. DaWizeGuy

    Newcomer

    Joined:
    Aug 19, 2003
    Messages:
    24
    Likes Received:
    0
    Same here. Frame rate never goes above 60. On my system Cg only reduces frame rates by 3-5 FPS on average. Running 35-50 FPS average, 60 FPS indoors at 1024x768x32, 4x AA and 8x AF, vsync on (same with off). Something's fishy...

    System:
    P4 2.53@3.02 GHz
    Asus P4P800
    512 MB DDR 400 Dual Channel
    Radeon 9800 Pro 128 MB, Omega 3.6a
     
  13. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    What's really fishy is people making up conspiracies. Bored, I guess?

    P.S. Maybe people should wait until someone with a 256 MB board posts results before jumping to conclusions?
    P.P.S. Do you even know for certain that you're running the same settings as the reviewer?

    Edited to fix typo.
     
  14. DaWizeGuy

    Newcomer

    Joined:
    Aug 19, 2003
    Messages:
    24
    Likes Received:
    0
    OK, I found out why the FPS were limited to 60. It is because "frame rate compensation" is enabled by default. When you turn it off, frame rates go beyond 60. My apologies to the original poster for any innuendo :oops: I reckon it is an option similar to the frame rate limiter in Vice City, there to eliminate or alleviate visual artifacts. It is still rather odd that the Radeon 9800 Pro is over twice as fast as the 5900 Ultra. I guess NVIDIA's PS 2.0 pixel shaders are even weaker than I thought.
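
    For what it's worth, a cap like that is usually just a sleep against a fixed frame budget. A guess at what the "frame rate compensation" toggle does - the Win32 calls are real, the surrounding loop is hypothetical:

        #include <windows.h>

        // One frame of a capped game loop; renderFrame stands in for the
        // game's per-frame work.
        void tick(void (*renderFrame)(), bool frameRateCompensation)
        {
            const DWORD budgetMs = 1000 / 60;      // one 60 Hz frame, ~16 ms
            DWORD start = GetTickCount();
            renderFrame();
            DWORD elapsed = GetTickCount() - start;
            if (frameRateCompensation && elapsed < budgetMs)
                Sleep(budgetMs - elapsed);         // why fraps never reads above 60
        }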
     
  15. jjayb

    Regular

    Joined:
    Feb 13, 2002
    Messages:
    358
    Likes Received:
    1
    Which is what all of those "unreliable" "useless" synthetic benchmarks have been showing us for some time.
     
  16. StealthHawk

    Regular

    Joined:
    May 27, 2003
    Messages:
    459
    Likes Received:
    0
    Location:
    I exist
    Where is PS1.4 support?
     
  17. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Joe, what do you know about the "state of the compiler" that leads you to think it suggests these things?
     
  18. Joe DeFuria

    Legend

    Joined:
    Feb 6, 2002
    Messages:
    5,994
    Likes Received:
    70
    1) Doesn't support PS 1.4 (see the profile sketch below)
    2) Based on the evidence in this thread...doesn't improve FX performance over the HLSL compiler in some cases, while severely crippling performance of other architectures.
    3) General "bugginess" and "twitchiness" of the compiler I've read about (on Cg web boards) since Cg's release, especially compared to HLSL.
    4) Russ said "it seems to stink". ;)
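
    The PS 1.4 gap in item 1 is easy to see in concrete terms. A sketch of mapping D3D9 caps to the best available Cg profile, circa Cg 1.1 - the profile enums and the D3DPS_VERSION macro are real, the mapping function is illustrative:

        #include <d3d9.h>
        #include <Cg/cg.h>

        CGprofile bestPixelProfile(const D3DCAPS9& caps)
        {
            if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
                return CG_PROFILE_PS_2_0;
            // A CG_PROFILE_PS_1_4 would belong here, but Cg never shipped one,
            // so an R200-class part (PS 1.4 hardware) falls through...
            if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 3))
                return CG_PROFILE_PS_1_3;          // ...to the 1.1-1.3 tier
            return CG_PROFILE_PS_1_1;
        }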
     
  19. RussSchultz

    Professional Malcontent
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,855
    Likes Received:
    55
    Location:
    HTTP 404
    I'd have more faith in the compiler iff:
    1) It actually improved the FX benchmarks over HLSL (like it's supposed to)
    2) It didn't degrade the R300 performance to such a degree (like it's not supposed to).

    From those two facts, it APPEARS that the compiler isn't doing a terribly good job of generating tight code.

    The PS1.4 thing I'm not sure is a compiler-being-bleh thing so much as a lack of impetus on the part of NVIDIA (no desire to support it in the backend because none of their parts use it very well).
     
  20. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    High-level shading languages in general are not perceived by the developers using them to provide any performance advantages.

    High-level shading languages are generally perceived as a time-saving "feature" that allows developers to flex their creative skills without spending as much time as they would writing assembly.
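
    The time saving is easy to illustrate. A made-up diffuse-lighting shader (not from the game), embedded as a C++ string the Cg runtime could compile at load time:

        // One readable line of shader math...
        const char* diffuseShader =
            "float4 main(float3 N : TEXCOORD0,\n"
            "            float3 L : TEXCOORD1,\n"
            "            uniform float4 diffuseColor) : COLOR\n"
            "{\n"
            "    return diffuseColor * saturate(dot(normalize(N), L));\n"
            "}\n";
        // ...versus hand-writing and hand-scheduling the equivalent ps_2_0
        // assembly (nrm, dp3, mul across specific registers) every time the
        // math changes. The win is author time, not runtime speed.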
     