Ah... A clear indication that Cg prioritized lower register usage over lower instruction counts. Heh... It doesn't even appear to have any positive effect for the GeForce FX in this case, but it screws over other video cards. (A toy illustration of that register-versus-instruction tradeoff follows cho's screenshots below.)

cho said: Yes, they are almost the same, but...
RADEON 9800 PRO 256MB:
<pic>
<pic>
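A minimal sketch of the tradeoff being alleged, written in C purely for illustration (nothing here is code from Cg, the game, or this thread): caching a repeated subexpression in a temporary costs fewer instructions but keeps an extra value live, while recomputing it costs more instructions but less register pressure. A compiler tuned for hardware that slows down as more temporaries are live (as the GeForce FX reportedly did) would lean toward the second form; hardware that mostly cares about instruction count (such as the R300) prefers the first.

/* Illustrative only: same math, two different cost profiles. */
#include <stdio.h>

/* Version A: cache (a + b) -- three adds, but 't' stays live across both uses. */
static float blend_cached(float a, float b, float c, float d)
{
    float t = a + b;              /* extra live temporary */
    return (t + c) + (t + d);
}

/* Version B: recompute (a + b) -- four adds, no long-lived temporary. */
static float blend_recomputed(float a, float b, float c, float d)
{
    return ((a + b) + c) + ((a + b) + d);
}

int main(void)
{
    /* Both versions return the same value; only the instruction/register mix differs. */
    printf("%f %f\n",
           blend_cached(1.0f, 2.0f, 3.0f, 4.0f),
           blend_recomputed(1.0f, 2.0f, 3.0f, 4.0f));
    return 0;
}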
You never quit, do you?

Doomtrooper said: This is exactly the type of example Russ needs to look at, where people who questioned Cg's use were told they were looking at conspiracy theories.
So what's the benefit of using Cg? If the compiler generates inferior code, there seems to be little reason to select it over HLSL.

RussSchultz said: It also couldn't be that the Cg compiler is somewhat sub-optimal in general?
You never quit, do you?
RussSchultz said: You never quit, do you?
I mean, this is OBVIOUS PROOF that NVIDIA was out to screw competitors.
It couldn't be that the Cg backend optimizes for something different than what the R300 finds optimal.
Nope, proof that NVIDIA is evil.
It also couldn't be that the Cg compiler is somewhat sub-optimal in general?
Nope, it's proof that NVIDIA is evil.
RussSchultz said: OK, put up or shut up. I'm tired of hearing the incessant bleating of "Cg is optimized for NVIDIA hardware" without any more proof than little smiley faces with eyes that roll upward.
Let's hear some good TECHNICAL arguments as to how Cg is somehow only good for NVIDIA hardware and is a detriment to others.
Moderators, please use a heavy hand in this thread and immediately delete any posts that are off topic. I don't want this thread turned into NV30 vs. R300, NVIDIA vs. ATI, my penis vs. yours. I want to discuss the merits or demerits of Cg as it relates to the field as a whole.
So, given that: concisely outline how Cg favors NVIDIA products while putting other products at a disadvantage.
If you're speaking from the developer's point of view, this is wrong.

nooneyouknow said: Cg was used for NVIDIA boards and HLSL was used for ATI boards.
Core Design said: The Cg compiler, as supplied with the game, is not as good as the D3DX one. But when nVidia release a new compiler, you can just drop the DLLs into the \bin\ directory and you get an instant upgrade. The D3DX compiler can only be upgraded with a new version of DirectX and a game patch.
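The drop-in upgrade Core Design describes presumably works because the game compiles its Cg source through the Cg runtime at load time, so code generation lives in the redistributed DLLs rather than in the game executable or in DirectX. A minimal sketch under that assumption, using the standard Cg runtime API (cgCreateContext/cgCreateProgram); the shader string and the ps_2_0 profile choice here are placeholders, not taken from the game:

/* Sketch only -- not code from the game. Compiles a Cg fragment shader at run
 * time, so the generated assembly depends on whichever cg.dll ships next to
 * the executable; swapping in a newer DLL swaps the compiler. */
#include <stdio.h>
#include <Cg/cg.h>

static const char *shader_source =
    "float4 main(float4 color : COLOR) : COLOR { return color * 0.5; }";

int main(void)
{
    CGcontext ctx = cgCreateContext();

    /* CG_PROFILE_PS_2_0 targets D3D9 pixel shader 2.0 output. */
    CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, shader_source,
                                     CG_PROFILE_PS_2_0, "main", NULL);
    if (prog == NULL) {
        fprintf(stderr, "Cg compile failed: %s\n",
                cgGetErrorString(cgGetError()));
        cgDestroyContext(ctx);
        return 1;
    }

    /* The generated ps_2_0 assembly: this is the output a newer drop-in
     * compiler DLL could improve without a game patch. */
    printf("%s\n", cgGetProgramString(prog, CG_COMPILED_PROGRAM));

    cgDestroyContext(ctx);
    return 0;
}

A shader compiled through D3DX (e.g. D3DXCompileShader), by contrast, is generated by the DirectX runtime, which is why Core Design says that path only improves with a new DirectX release plus a game patch.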
So, please, just let it rest. It's apparent Cg is dead on the vine, but that doesn't mean it was:
a) A bad technical idea
b) An idea born to control the market by putting others at a disadvantage
c) Inherently evil
Well, as Russ said: the problem is simply that there is no profile for ATI cards. Of course he definitely knows that there's no way in hell that ATI will provide any support for Cg unless it becomes a standard accepted by a bunch of other vendors. Given that ATI and NVIDIA are the only ones who have DX9 cards out, that's unlikely. The only other reason ATI might support Cg is if either Microsoft or the ARB officially endorses it as a standard HLSL. Given that both APIs now have their own, though... Again: support isn't ever likely.

jjayb said: As you like to say: prove to me that it wasn't any of the things you've said above. You've asserted lots of things here, now prove them.
The only thing I see from this particular example is that it does nothing but lower ATI's FPS. It doesn't change the FPS on the GFFX at all. I would think that the GFFX card would have at least seen some benefit from using Cg, yet that's not the case. So what we have here in this example is that using Cg does nothing to help the GFFX card yet dramatically lowers the competition's cards' FPS. *Cues the X-Files theme music*