Using Cg for benchmarking

Discussion in 'Architecture and Products' started by Arun, Jun 23, 2003.


Would using Cg shaders be a reasonable option to benchmark shading hardware?

  1. Yes, this will enable us to benchmark shaders easily without worrying about what hand-optimized code could do.

    100.0%
  2. Yes, but don't rely on it too much, because it isn't representative of future games.

    0 vote(s)
    0.0%
  3. No, it is way too nVidia biased.

    0 vote(s)
    0.0%
  4. No, there are better ways to benchmark shading capabilities (please specify)

    0 vote(s)
    0.0%
  1. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    422
    Likes Received:
    16
    Yes, I know Cg cannot support PS 1.4 right now, although they have promised there would be a PS 1.4 profile in the next version of the compiler.

    But Cg has been used to write shader code in real games, which means the relationship between some (many? :roll: ) ISVs and NVIDIA is fine. If NVIDIA is fine with this, I (we?) think ATI should be fine with it too, no matter whether the game developer uses Cg, HLSL or GL shading. :roll:
     
  2. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I think what needs to be questioned is what proportion of the game developer community will actually be using Cg of their own accord. Futuremark did a survey on which HLSLs developers would be using, and very few came back with Cg as a reply, which is one of the reasons why they didn't use it for 3DMark.
     
  3. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    422
    Likes Received:
    16
    So can you tell me who chose Cg in their survey? :lol:
     
  4. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    They didn't use Cg or HLSL ;)
     
  5. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    HLSL is used for the PS2.0 feature test.
     
  6. Evildeus

    Veteran

    Joined:
    May 24, 2002
    Messages:
    2,657
    Likes Received:
    2
    Really? Good to know. I was under the impression FM didn't use it.
     
  7. Arun

    Arun Unknown.
    Legend

    Joined:
    Aug 28, 2002
    Messages:
    5,023
    Likes Received:
    302
    Location:
    UK
    Hmm, I too was under the impression FM only used assembly language. Eh...

    Anyway, the idea of this benchmark was more to PROVE that the NV3x is inferior to the R3xx even when put in its best light, but since you people don't seem to like it, I guess I won't try it :)

    Thanks for the feedback,


    Uttar
     
  8. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
  9. pocketmoon_

    Newcomer

    Joined:
    Nov 15, 2002
    Messages:
    117
    Likes Received:
    0
    The biggest issue with both Cg and HLSL at the time was the optimisation the compilers were capable of. The compiled output varied greatly in terms of instruction/register counts, and in turn performance. IMO it's far better to test raw performance using low-level shaders - which I can't be arsed to write anymore :)

    NB none of those shaders were particularly long or complex. I'd love to try my volumetric fog shader on ATI hardware though!

     
  10. elchuppa

    Newcomer

    Joined:
    Jan 20, 2003
    Messages:
    56
    Likes Received:
    0
    Sorry to repeat myself, but I think people have failed to fully take in the statement I made earlier: HLSL and Cg are the same thing; porting a shader to HLSL involves doing nothing. They are the same language. (Just wanted to drill that in :) ) If a developer chooses to write an HLSL shader, in effect they have written a Cg shader; the obvious benefit for them is that on an nVidia card that shader will be OpenGL-compatible as well.
    There is nothing I see that prevents ATI from developing a compiler for it; what the hell, they can just call it an HLSL-to-OpenGL compiler (assuming M$ is cool with that) and be done with it.
    I'll be perfectly honest and mention that I like Cg/HLSL; it is logical and very easy to learn because it is basically C, and very similar to RenderMan.
     
  11. nooneyouknow

    Newcomer

    Joined:
    Feb 8, 2002
    Messages:
    87
    Likes Received:
    0
    I only read two pages of this and stopped because it was too funny.

    Here is the bottom line folks:

    The HLSL compiler was written with input from all IHVs (NVIDIA and ATI at the very least). The Cg compiler was written with input ONLY from NVIDIA.

    MS compiler guys are FAR better than NVIDIA compiler guys.

    MS is an independent controller of HLSL; NVIDIA, a competing IHV, is the SOLE controller of Cg.

    HLSL supports ALL versions of pixel and vertex shaders. Cg does not.

    HLSL will produce better shader code than Cg.

    AND FOR THE LOVE OF GOD, HLSL is NOT the same as Cg!!!!! Microsoft will tell you that any time you ask!!
     
  12. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    See, I don't know much, but I can pretty much agree with everything you've said here.
     
  13. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    So? I don't see how this means much of anything. Other companies can write their own back-ends, if they choose to do so.

    Why? Do you know who was on both teams? Are you aware of how many resources Microsoft and nVidia dedicated to their compilers? Do you know how much experience with compilers the relevant engineers had?

    Just because Microsoft also develops other compilers doesn't mean that they'd do better at this one.

    Cg is still a language, and any other IHV can make their own back-end. The front end is also open-source. Another IHV could add anything they wanted to.

    Cg supports OpenGL.

    Where did you get this from?

    The languages are close to identical.
     
  14. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    If Chalnoth only knew who he was talking to :lol:
     
  15. pocketmoon_

    Newcomer

    Joined:
    Nov 15, 2002
    Messages:
    117
    Likes Received:
    0
    Don't you just hate it when people say that?

    Cg and HLSL are, if not twins, close cousins. Cg was developed to be as close to HLSL as possible. Cg is capable of supporting OpenGL on all platforms that support the required ARB (note - not Nvidia) extensions. It gives you the added ability to squeeze the best out of Nvidia hardware, and the ability to carry your shaders, unchanged, from OpenGL to DX and back again.

    My finding when comparing the compiled output is that Cg and HLSL are very close in terms of optimisation. On some shaders HLSL nudged ahead, on others Cg. There were one or two cases where Cg blew HLSL out of the water, and vice versa! Personally I'd like to see Intel write a back-end for Cg; they really know how to write a compiler :)
     
  16. elchuppa

    Newcomer

    Joined:
    Jan 20, 2003
    Messages:
    56
    Likes Received:
    0
    Oh, I forgot to respond to the "open" thing. I use "open" in the sense that other people can implement compilers for it, i.e. NVIDIA has open-sourced the compiler technology.

    As for HLSL and Cg not being the same language: for my purposes they are. My Cg shaders work as HLSL shaders. If my shaders compile on the HLSL compiler, forgive me for thinking they are written in HLSL. Perhaps one is a subset... but how many differences are there, really? Semantically and syntactically they are very hard to differentiate.
     
  17. cho

    cho
    Regular

    Joined:
    Feb 9, 2002
    Messages:
    422
    Likes Received:
    16
    :roll: "Tomb Raider: The Angel of Darkness" seems to support Cg too.
     
  18. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    You are doing a disservice to your clients by using an HLSL that is not optimized for all cards; it is as simple as that.
    How many times has this been pointed out? No other IHV supports Cg, and glslang is here... so again, it is not needed.

    http://www.3dlabs.com/support/developer/ogl2/index.htm
     
  19. Reverend

    Banned

    Joined:
    Jan 31, 2002
    Messages:
    3,266
    Likes Received:
    24
    Can we stop this from starting all over AGAIN?

    With DX9 HLSL available, no developer will want to use Cg unless they are genuinely interested in exploring Cg, or NVIDIA DevRel/PR did a good job. No developer in their right mind would stick with Cg if, while working with it on their title, they discover results or behaviour that is not what they expect.

    If a developer uses Cg, the only things they must worry about are that it works, that it works on non-NVIDIA hardware, and that the IQ results are the same regardless of hardware. If using Cg results in noticeably better performance on NVIDIA hardware than on other hardware that is reasonably the same spec-for-spec, then it is up to the developer to decide whether it is more important that using DX9 HLSL gives almost the same performance on all spec-for-spec hardware, regardless of company. If they decide it isn't, that means they don't care - just as they assume most users don't care either, except for the minority that sticks to "ethics" no matter what.
     
  20. Demirug

    Veteran

    Joined:
    Dec 8, 2002
    Messages:
    1,326
    Likes Received:
    69
    We already have a big problem with shader code on DX9 hardware.

    NV3X chips do not run well with code that is built with the *s_2_0 profiles.
    NV3X chips prefer code that is built with the *s_2_A profiles or Cg NV3X targets.
    R(V)3XX chips prefer the *s_2_0 profiles.
    R(V)3XX chips cannot run every piece of code that is built with the *s_2_A profiles or Cg NV3X targets.

    There will be other chips and other profiles (*s_2_B) right around the corner.

    We will end up with a different shader set for each chip family. If you already have to compile a high-level shader for each chip family, there is no big difference between using one compiler with different profiles and using different compilers to do the job.

    Even if you want to compile at runtime, you can implement the use of different compilers in less than 100 lines of code.
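
    That per-chip-family dispatch really is tiny. A minimal C++ sketch, assuming hypothetical names (the `ChipFamily` enum and `PickPixelProfile` function are illustrative, not from any real engine; a real engine would detect the family from the device at startup):

```cpp
#include <string>

// Illustrative chip families from the discussion above.
enum class ChipFamily { NV3X, R3XX, Other };

// Pick the pixel-shader profile each family runs best with:
// NV3X prefers the 2_A profiles (or Cg's NV3X targets),
// R(V)3XX prefers plain 2_0.
std::string PickPixelProfile(ChipFamily family) {
    switch (family) {
        case ChipFamily::NV3X: return "ps_2_a";
        case ChipFamily::R3XX: return "ps_2_0";
        default:               return "ps_2_0";  // safe baseline profile
    }
}
```

    The same switch could just as easily select between whole compiler back-ends (HLSL vs. cgc) instead of profiles; either way, the per-chip dispatch stays a handful of lines.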

    Last but not least: Cg is at the moment the only possible solution for writing an engine that supports DX and OpenGL and high-level shaders at the same time, without the need to write every shader twice.

    PS: Excuse my bad English; it is not my first language. I am better with programming languages.
     
