Using Cg for benchmarking

Would using Cg shaders be a reasonable option to benchmark shading hardware?

  • Yes, but don't use it too much, because it isn't well representative of future games.

    Votes: 0 0.0%
  • No, it is way too nVidia biased.

    Votes: 0 0.0%
  • No, there are better ways to benchmark shading capabilities ( please specify )

    Votes: 0 0.0%

  • Total voters
    296

Arun

Hello everyone,

I've had this idea of using nVidia's Cg tools and FRAPS to benchmark hardware shading speed. Seems insane? Well, maybe it isn't...
By using Cg for benchmarking, we *are* doing the same thing as a few games are doing in the next few months.

And it also makes it impossible for nVidia to claim their hardware isn't shown in a favorable light. Sure, ATI might complain - but you could always do an HLSL port if required. Although I believe that's pointless, because nVidia's DX9 profile output is probably VERY close to DX9 HLSL compilation output; nVidia even claims to have developed it with Microsoft, eh...

Using Cg for benchmarking, we'd really be able to prove (or disprove, or whatever) that nVidia's NV3x shading power is inferior. I'm not saying we should use it forever either - but it'd be a nice, 100% GPU-limited test.

I don't know which tool to use for benchmarking yet; I'm considering using the Cg book's "Tutorial Examples" framework and simply swapping out the Cg programs it uses. I don't think that's public, so using something else might be required...
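For concreteness, the benchmark workload could be as simple as a per-pixel lighting fragment program like the sketch below (written in Cg; the struct, names, and parameters are all illustrative, not taken from any actual game or tool):

```cg
// Hypothetical Cg fragment program for a shading benchmark:
// per-pixel diffuse + specular (Blinn) lighting.
struct FragIn {
    float3 normal   : TEXCOORD0;
    float3 lightDir : TEXCOORD1;
    float3 viewDir  : TEXCOORD2;
};

float4 main(FragIn IN,
            uniform float4 diffuseColor,
            uniform float4 specularColor,
            uniform float  shininess) : COLOR
{
    float3 N = normalize(IN.normal);
    float3 L = normalize(IN.lightDir);
    float3 H = normalize(L + normalize(IN.viewDir));

    float diffuse  = max(dot(N, L), 0);
    float specular = pow(max(dot(N, H), 0), shininess);

    return diffuseColor * diffuse + specularColor * specular;
}
```

Compiling the same source once with Cg's DX9 profile and once through DX9 HLSL would let the two compilers be compared on identical input.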


Feedback, comments?


Utar
 
The idea of a port to HLSL seems specious to me - why not just use HLSL in the first place?

Both are designed to solve the same problem. One of them isn't controlled by one of the manufacturers whose hardware will be benchmarked. Isn't that therefore inherently a far more level playing field?
 
Actually, no. One of the ideas of this system is to create a situation where there's no way nVidia can complain, and in fact to show their hardware in the best light possible - the same light it will be shown in by a few upcoming Cg games. It obviously isn't a solution to everything - but I believe it'd still be an interesting benchmark.

It is MEANT to be biased - the idea being to see how games which will, if you want, "be biased" react on today's hardware.


Uttar
 
I would like to say that end users shouldn't really concern themselves with Cg. It's a tool for programmers. I know that nVidia's marketing of Cg is widespread.
 
HLSL and Cg are really the same thing for all intents and purposes. The only difference is one is marketed as the microsoft DX9 shading language, the other is nVidia's Open shading language that can compile to OpenGL or DX9. I really don't see why people have a problem with Cg...
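The multi-target point above can be made concrete with nVidia's cgc command-line compiler, which emits different assembly from the same source depending on the chosen profile (the file name and entry-point name below are illustrative):

```
# Hypothetical cgc invocations; shader.cg and the entry name are illustrative.
cgc -profile ps_2_0 -entry main shader.cg   # DirectX 9 pixel shader output
cgc -profile arbfp1 -entry main shader.cg   # OpenGL ARB_fragment_program output
```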
 
Uttar said:
By using Cg for benchmarking, we *are* doing the same thing as a few games are doing in the next few months.

The key word here being few. I wouldn't assume that the Cg compiler is ready to be used as a fire-and-forget shader builder just yet. Game developers can't rely on it working well enough on all hardware, and I would think they probably still have to inspect the generated code and make some corrections.

In other words, I think you would just be benchmarking the state and/or progress of the Cg compiler.
 
AndrewM said:
I would like to say that end users shouldn't really concern themselves with Cg. It's a tool for programmers. I know that nVidia's marketing of Cg is widespread.

The end user pays the bills, doesn't he? We are the ones who buy the products, and if the product in this case is a game designed using an inferior HLSL - and it is inferior, as shown by pocketmoon_, with still no PS 1.4 support - then there is no reason for the consumer to buy that title, as the developer has not earned my money.

Games developed with Cg will not be installed on this PC; sorry, there is no need for it when DX9 HLSL supports all PS versions.
Not to mention the complaints I see regularly on Cgshaders.org about issues with cards from IHVs other than nVidia.

http://www.pocketmoon.com/Cg/Cg.html
 
elchuppa said:
HLSL and Cg are really the same thing for all intents and purposes. The only difference is one is marketed as the microsoft DX9 shading language, the other is nVidia's Open shading language that can compile to OpenGL or DX9. I really don't see why people have a problem with Cg...

Sure... name some IHVs that support it and you get a cookie :LOL:
 
There's no real point in trying to argue with you, Doomtrooper.

Missing PS 1.4 support isn't a real biggie, and having an HLSL for OpenGL is important to some people, especially because it allows them to take their code to DX if they choose to.

I think your decision to be totally against anything to do with Cg is naive.
 
Cg absolutely should not be used for benchmarking as it creates apples vs. oranges comparisons.

Right now, there is only one (1) IHV that has an optimized back-end for Cg... and you only get two guesses who that is.
 
Doomtrooper said:
it is inferior as shown by pocketmoon_
...
http://www.pocketmoon.com/Cg/Cg.html
Umm, did you read his conclusions? He doesn't come to the same one you do.

What is also clear is that currently HLSL does a better job at optimising some shaders than Cg (shader 4), while Cg has a slight edge in others (shader 5). ... Both Cg and HLSL are relatively young tools and I would expect to see more consistent and better optimised results as both compilers mature.
 
AndrewM said:
There's no real point in trying to argue with you, Doomtrooper.

Missing PS 1.4 support isn't a real biggie, and having an HLSL for OpenGL is important to some people, especially because it allows them to take their code to DX if they choose to.

I think your decision to be totally against anything to do with Cg is naive.

There is an HLSL in the works for OpenGL, and it is governed by the ARB, not a single body - and to make matters worse, in Cg's case that single body is an IHV.
It is no different than Ford designing the reference for all future automotive engines and expecting GM and Chrysler to support it.
It is a conflict of interest, simple as that. I hope Cg dies a horrible death.
 
Doomtrooper said:
There is an HLSL in the works for OpenGL, and it is governed by the ARB, not a single body - and to make matters worse, in Cg's case that single body is an IHV.

A lot of people know that it will be good for OpenGL to have its own shader language, but it's also known that not everything the ARB does is absolutely the right way to do things. This is getting off topic.
Edit: I'd also like to point out that Cg has been available for developers to use for about a year now. We won't have an OpenGL shader language for a little while yet.

Doomtrooper said:
It is no different than Ford designing the reference for all future automotive engines and expecting GM and Chrysler to support it.
It is a conflict of interest, simple as that. I hope Cg dies a horrible death.

I don't like that analogy :LOL:.
 
AndrewM said:
A lot of people know that it will be good for OpenGL to have its own shader language, but it's also known that not everything the ARB does is absolutely the right way to do things. This is getting off topic.

All the other IHVs are supporting HLSL and GLslang; that is what we consumers need to support as well.
PS 1.4 support is important to users of the hundreds of thousands of 8500/9000/9000M/9100/9200 cards, and now the integrated ATI chipset.

Sorry, it is better and it should be used - not to mention it is part of the DX 8.1 spec.

A lot of people know that it will be good for OpenGL to have its own shader language, but it's also known that not everything the ARB does is absolutely the right way to do things

Maybe not, but it also is not controlled by a single voice, which makes it far better than a single dictatorial voice concerned only with cornering the market :rolleyes:
Cg is far from perfect.
 
*ponder* We've gone far afield. You contended pocketmoon said Cg was inferior, whereas if you read what he said, he said nothing of the sort.

Then you switch gears to a speculative statement that MS's compiler technology will be better, well... because.

I've offered at least one situation where they don't. I'm not saying that will bear out for Cg vs. HLSL on every piece of hardware, but regardless: pocketmoon did not come to the conclusion you said he did. (Just wanted that to be known to all the folks who are too lazy to go read what pocketmoon actually said)
 