Using Cg for benchmarking

Would using Cg shaders be a reasonable option to benchmark shading hardware?

  • Yes, but don't use it too much, because it isn't very representative of future games.

  • No, it is way too nVidia biased.

  • No, there are better ways to benchmark shading capabilities (please specify).


  • Total voters
    296
pocketmoon_ said:
nooneyouknow said:
I only read 2 pages of this and stopped reading because it was too funny.

Here is the bottom line folks:

Don't you just hate it when people say that ?

Cg and HLSL are, if not twins, close cousins. Cg was developed to be as close to HLSL as possible. Cg is capable of supporting OpenGL on all platforms that support the required ARB (note - not Nvidia) extensions. It gives you the added ability to squeeze the best out of Nvidia hardware and the ability to carry your shaders, unchanged, from OpenGL to DX and back again.

My finding when comparing the compiled output is that Cg and HLSL are very close in terms of optimisation. On some shaders HLSL nudged ahead, on others Cg did. There were one or two cases where Cg blew HLSL out of the water, and vice versa! Personally I'd like to see Intel write a back-end for Cg; they really know how to write a compiler :)

1 - I stopped reading simply because there was still the usual misinformation out there. Gets tiring after a while.
2 - I would suspect that on Nvidia hardware, the output should be similar. On ATI hardware, I would suspect a far different picture, obviously dependent on the length/complexity of the shader.
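
To make the comparison concrete, here is a minimal sketch (the shader and its parameter names are made up for illustration) of the kind of source that goes through both toolchains unchanged, since Cg and HLSL share their syntax, types and intrinsics. Compiling it with fxc (HLSL) and cgc (Cg) against the same ps_2_0 target and diffing the generated assembly is essentially the comparison pocketmoon_ describes.

    // Hypothetical test shader: the identical source is accepted by both the
    // Cg and HLSL front ends (e.g. for a ps_2_0 or arbfp1 target).
    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D baseMap,   // made-up texture parameter
                uniform float4 tint) : COLOR // made-up constant parameter
    {
        // One texture fetch modulated by a constant colour.
        return tex2D(baseMap, uv) * tint;
    }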
 
Chalnoth said:
nooneyouknow said:
The HLSL compiler was written with input from all IHVs (Nvidia and ATI at the very least). The CG compiler was written with input ONLY from Nvidia.
So? I don't see how this means much of anything. Other companies can write their own back-ends, if they choose to do so.

Hmmm. Lemme think. Oh yeah, with HLSL, no back-end needs to be written. And if you are another IHV, your competitor doesn't control its future. Oh yeah.... So, I guess that means something to someone.

Chalnoth said:
nooneyouknow said:
MS compiler guys are FAR better than Nvidia compiler guys.
Why? Do you know who was on both teams? Are you aware of how many resources Microsoft and nVidia dedicated to their compilers? Do you know how much experience with compilers the relevant engineers had?

Just because Microsoft also develops other compilers doesn't mean that they'd do better at this one.

I knew someone would nitpick this but I really don't care. I will retract this comment only because I am not going to spend the time defending it. I know the MS guys that did the compiler work, and I believe my statement is correct.

Chalnoth said:
nooneyouknow said:
MS is an independent controller of HLSL; Nvidia, a competing IHV, is the SOLE controller of CG.
Cg is still a language, and any other IHV can make their own back-end. The front end is also open-source. Another IHV could add anything they wanted to.

Hmm.. Let me ask you a question. How would ATI get features added to CG? Because if it is simply telling Nvidia what they want, or submitting code to Nvidia, I doubt ANY IHV would do so. Giving away feature sets of future hardware is not wise.


Chalnoth said:
nooneyouknow said:
HLSL supports ALL versions of Pixel and Vertex Shaders. CG does not.
Cg supports OpenGL.

Okay, I don't argue that. But does it support only ARB functionality, or vendor extensions as well (both Nvidia and ATI)?
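
For reference, Cg reaches hardware through named compiler profiles. As a rough sketch (from memory, so treat the exact list as an assumption rather than a reference), the OpenGL side covers both the vendor-neutral ARB targets and NVIDIA-only extension targets, with nothing built on ATI-specific extensions:

    // Rough illustration of Cg compilation profiles (availability depends on
    // the Cg release, so treat this list as an assumption):
    //
    //   arbvp1 / arbfp1       - ARB_vertex_program / ARB_fragment_program,
    //                           the vendor-neutral OpenGL targets
    //   vp20/vp30, fp20/fp30  - NVIDIA-only OpenGL extension targets
    //   vs_1_1, vs_2_0, ps_1_1 to ps_1_3, ps_2_0 - Direct3D targets
    //
    // Notably absent: a ps_1_4 target and any profile built on ATI-specific
    // OpenGL extensions, which is what the question above is getting at.
    float4 main(float4 position : POSITION,
                uniform float4x4 modelViewProj) : POSITION
    {
        // Trivial transform, compilable under both arbvp1 and vs_1_1.
        return mul(modelViewProj, position);
    }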

Chalnoth said:
nooneyouknow said:
HLSL will supply better shader code than CG.
Where did you get this from?

Again, this is from the viewpoint of PS 2.0 code on ATI hardware. I could say 1.4 code on ATI hardware, but CG doesn't support that ;)

Chalnoth said:
nooneyouknow said:
AND FOR THE LOVE OF GOD, HLSL is NOT the same as CG!!!!! Microsoft will tell you that any time you ask!!
The languages are close to identical.

As I said before, ask Microsoft, I am sure you will not get that answer.
 
Doomtrooper said:
I always get flamed on this subject, and have since the original thread about CG was posted a year ago, but the first DX9 title that used CG didn't even run on a 9700 Pro in December, yet it ran on a GeForce 3.

Gunmetal Demo.

Funny you bring that up.

The game failed to run on ATI hardware because ATI's cards failed the hardware check, not due to some language incompatibility (and don't all nVidia demos have this check? If so, why the surprise that they included the check on this demo?). Soon after the release of the demo, the check for the card was bypassed and the game ran fine. Its failing to run on ATI hardware had nothing to do with Cg.

If anything, the game shows that Cg works on ATI's hardware.
 
Since when is Gunmetal an Nvidia demo? I downloaded it off FilePlanet, not Nvidia. :rolleyes:

A little refresher on the game demo's initial release.

http://www.beyond3d.com/forum/viewtopic.php?t=3662&start=0

Now, the demo can run on a GF3 and GF4 but not on a DX9 card like a Radeon 9700, and this is 'supposed' to be a DX9 game demo. :LOL:

From tb, the author of 3DAnalyze, who got it working on 9700s back then:

I've released a new version of 3DA, if someone wants to try the demo on his 9500/9700. It's a DX8 game, so how can they use DX9 features? Maybe in the full version...

It basically checks some supported texture format/frame buffer combinations, which the GeForce and the reference rasterizer support but which it never actually uses, because it runs without errors on the Radeon 9500/9700; I just had to return different results in the "check if" phase.

Regards,
Thomas
_________________
http://www.tommti-systems.com

No one said CG wouldn't run on a 9700. Since CG is limited to two-year-old PS 1.1 technology, a game designed around CG is not using the strengths of the ATI cards' feature set, PS 1.4 being one of them.

FYI, before I removed 3DMark03 I tested PS 1.1 vs PS 1.4 on a 9700 using the force PS 1.1 option in the panel: almost a 20% speed improvement using PS 1.4.

That alone is worth some developer support, and CG still doesn't support it.
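
As a hedged sketch of why that matters (the shader below is made up, and the profile behaviour described is an assumption about typical compilers of the time): a dependent texture read expressed at a high level generally fits in a single ps_1_4 pass, while ps_1_1 has no general way to express it, so the same effect needs fixed texture-addressing ops or extra passes.

    // Made-up example of a dependent texture read.
    float4 main(float2 uv : TEXCOORD0,
                uniform sampler2D offsetMap,   // perturbation texture (made up)
                uniform sampler2D baseMap) : COLOR
    {
        // First fetch: read a small UV perturbation.
        float2 offset = tex2D(offsetMap, uv).rg * 0.05;
        // Dependent fetch: sample with arithmetically derived coordinates.
        // ps_1_4 can do this in its second texture phase; a ps_1_1 target
        // typically cannot compile it at all.
        return tex2D(baseMap, uv + offset);
    }

Fewer passes and texture fetches for the same effect would be consistent with the roughly 20% difference measured with 3DMark03's forced PS 1.1 mode.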
 
Hmm.. Let me ask you a question. How would ATI get features added to CG? Because if it is simply telling Nvidia what they want, or submitting code to Nvidia, I doubt ANY IHV would do so. Giving away feature sets of future hardware is not wise.

The most sensible post on this subject in a long time.
 
Chalnoth said:
nooneyouknow said:
AND FOR THE LOVE OF GOD, HLSL is NOT the same as CG!!!!! Microsoft will tell you that any time you ask!!
The languages are close to identical.

You could argue that British English and US English are close to identical too. But all you have to do is ask an American and a Brit how they spell "realize" ("realise") or "color" ("colour") to see that they are not the same.
 
And if they are soooo identical, then why did Nvidia write it? They had to have known MS was doing HLSL. Heck, what about vice versa?

Also, why do the couple of developers that support CG also support HLSL? Hmmm... Oh yeah, they want something that works on everyone's card ;)
 
nooneyouknow said:
Also, why do the couple of developers that support CG also support HLSL? Hmmm... Oh yeah, they want something that works on everyone's card ;)
It simply shows that they both have their strengths.
 
nooneyouknow said:
And if they are soooo identical, then why did Nvidia write it? They had to have known MS was doing HLSL. Heck, what about vice versa?

Maybe they wanted it to be the industry standard instead of MS's HLSL, as it would allow developers to include extra features for Nvidia's cards without doing much extra work? [EDIT] Forgot to add: And it would let Nvidia dictate where the industry is going to a much greater extent than they do today. [/EDIT]

As for vice versa, maybe because MS didn't like the fact that it would/could allow IHVs more freedom when implementing DX9/DX10/DXn...
(i.e. not following the spec, but using a lower precision instead...)

Oh, and maybe because MS isn't too keen on OpenGL, as they don't have very much to say about where it's going?
 
nooneyouknow said:
Also, why do the couple of developers that support CG also support HLSL?
Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.
 
Reverend said:
nooneyouknow said:
Also, why do the couple of developers that support CG also support HLSL?
Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.

Sure, but this coin has two sides, as with most features. If you want good high-level shading language support, you have to support the high-level shading languages. If nobody uses HLSL or Cg, it will die a fast death.
 
Reverend said:
nooneyouknow said:
Also, why do the couple of developers that support CG also support HLSL?
Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.

Rev, I was merely trying to point out that CG must not be that great, since developers who have been using CG have also decided to use HLSL. My interpretation is that CG is obviously great for Nvidia hardware and HLSL is great for everyone else. Again, my opinion.
 
nooneyouknow said:
Reverend said:
nooneyouknow said:
Also, why do the couple of developers that support CG also support HLSL?
Hmm... all of a sudden I found this to be an odd statement... I'd always looked at high-level shading languages as supporting developers, not the other way round.

Rev, I was merely trying to point out that CG must not be that great, since developers who have been using CG have also decided to use HLSL. My interpretation is that CG is obviously great for Nvidia hardware and HLSL is great for everyone else. Again, my opinion.

CG may have been conceived as a generic HLSL for all cards, but as the NV30 debacle unfolded over the last year, it became clear that CG turned into a software solution to save NV from their badly misjudged (or else badly executed) NV30 performance... thereby turning it into a way for developers to at least get decent shading performance on NV cards and screw everyone else, whereas the generic HLSLs (DX/OGL) are good for everyone but NV.

It seems to me that any smart developer that supports CG to get decent NV card performance at this point has to support MS HLSL as well to get good performance out of everything else.
 
The instruction scheduler on R300 is pretty effective as it is, so using Cg will not exactly "screw" ATI's hardware as its performance will still be pretty good.
 
How about Beyond3D doing an R3xx shader profile comparison using HLSL vs CG, Dave?

I keep seeing this come up; it would be nice to see a performance comparison.
 
DaveBaumann said:
The instruction scheduler on R300 is pretty effective as it is, so using Cg will not exactly "screw" ATI's hardware as its performance will still be pretty good.

Are you suggesting it wouldn't be better using standard HLSL? Or just that if a dev used CG it would be "good enough" on the R300, simply because it is a great chip with good drivers?
 
An article on this would be very interesting: find some common shader programs, compile them with both Cg and HLSL, and compare the output.

:?: :?:
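
If someone does take that up, here is a made-up candidate for such a "common shader program": per-pixel diffuse plus specular lighting, written only with constructs shared by Cg and HLSL so the identical source can be fed to both compilers, then compared on generated instruction count for the same target profile and on measured frame rate on an R3xx.

    // Made-up "common shader" candidate for a compiler comparison: per-pixel
    // diffuse + specular lighting, using only constructs shared by Cg and
    // HLSL so the same source compiles under both.
    float4 main(float2 uv       : TEXCOORD0,
                float3 normal   : TEXCOORD1,
                float3 lightDir : TEXCOORD2,
                float3 viewDir  : TEXCOORD3,
                uniform sampler2D baseMap,      // assumed diffuse texture
                uniform float4    lightColor,   // assumed light colour
                uniform float     shininess) : COLOR
    {
        float3 N = normalize(normal);
        float3 L = normalize(lightDir);
        float3 V = normalize(viewDir);
        float3 H = normalize(L + V);                  // half vector

        float diffuse  = saturate(dot(N, L));
        float specular = pow(saturate(dot(N, H)), shininess);

        float4 base = tex2D(baseMap, uv);
        return base * lightColor * diffuse + lightColor * specular;
    }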
 