Cg released

Discussion in 'Architecture and Products' started by DemoCoder, Jun 13, 2002.

  1. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Well, that would be because ATI, Matrox, and 3Dlabs are working on OpenGL 2.0 and DX9 HLSL, wouldn't it? :roll:
     
  2. gking

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    130
    Likes Received:
    0
    As is NVIDIA. There is a need *right now* for a high-level shading language (and a direct path for content creation in 3D modeling packages).

    You're putting a lot of stock on OpenGL 2.0 being finished (and ready for use) sometime soon, when I think that isn't going to be the case at all. The "standard" isn't finalized (even the shading language is just a proposal at this point), and once it is, it will take at least a year before OpenGL 2.0 ICDs are stable and ready for everyday use, let alone development.

    And I think OpenGL 2.0's shading language is deficient in the way it handles texture fetching, as it will require far more resources to be spent on compiler development than a model such as Cg's (where the texture fetch instructions are bundled with the output profile), resulting in either worse performance or higher cost, without any tangible benefits.
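    To make the model gking describes concrete: in Cg a texture fetch is written once at the source level, and the compiler lowers `tex2D` to whatever fetch instruction the selected output profile provides. A minimal sketch (the entry point and sampler names are illustrative, not from any shipping shader):

    ```cg
    // Minimal Cg fragment program: modulate a texture sample by the
    // interpolated color. tex2D is lowered to the fetch instruction of
    // whichever output profile is selected at compile time.
    float4 main(float2 uv          : TEXCOORD0,
                float4 color       : COLOR0,
                uniform sampler2D decal) : COLOR
    {
        return color * tex2D(decal, uv);
    }
    ```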
     
  3. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    I'm putting a lot of stock in DX9 HLSL also; with DX9 coming in approximately the fall, that isn't very far away...
    If another graphics vendor were to start working on a profile for their GPU right now (where are the profile docs??), by the time it was ready, DX9 HLSL would be here.
     
  4. gking

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    130
    Likes Received:
    0
    And Cg will have a DX9 profile, so all DX9 compliant cards will be able to take advantage of it as soon as DX9 is ready.
     
  5. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    Why have a profile for another HLSL if Nvidia is working on DX9 HLSL?
     
  6. Docwiz

    Newcomer

    Joined:
    May 29, 2002
    Messages:
    162
    Likes Received:
    3
    Are you high, or did you just make that up? Read any interview and it clearly states that ATI or anyone else can make their own compilers.
    Don't be such a tool just because you don't like Nvidia. 3DFX was dead before Nvidia bought them, so don't hate Nvidia for buying 3DFX. It's like you have a grudge against anything that's positive about Nvidia. I have been reading this forum for quite some time, and you are never bullish when it comes to anything positive about Nvidia.

    I think telling you to get a girlfriend or a wife is a contribution to your social life and social skills.

    As for 3D-related items, I told you that Carmack was not using an NV30 chip, and I was right. I told you that the chips might be taped out but that the silicon wasn't in developers' hands, and I was also right.

    I also told you that Carmack was using a GeForce 4 chipset and comparing that to the new ATI chipset, and I was right again. What do you not get about me not contributing?

    I see you contribute theories on how Nvidia sucks at architecture and how you don't care for them, but that's all I see you post. Big whoop.
     
  7. Doomtrooper

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    3,328
    Likes Received:
    0
    Location:
    Ontario, Canada
    What do you post then, ace?
     
  8. Saem

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,532
    Likes Received:
    6
    Take a course in critical thinking. Your entire argument is based on a fallacy of black-and-white thinking: "nVidia already rules the 3D arena, why stop now?" That's great there, buddy, but a monopoly would be far worse than a competitive/collaborative atmosphere in this marketplace.
     
  9. Guest

    Newcomer

    Joined:
    May 17, 2002
    Messages:
    18
    Likes Received:
    0
    Assume for a moment that NVIDIA does not have displacement mapping in NV30. Do you then think that the functionality to use the displacement mapping value in the vertex shader will go into Cg? There will be a DX9-As-NVIDIA-Sees-Fit profile when DX9 is ready and NV30 is ready.

    Quite possibly because NVIDIA told no competitor about this? Quite possibly ATI and others only got access to the syntax and functionality on the same day as (or even later than) the developers? Or do you want yet another language? The more languages we get, the more it goes against what we are trying to reach: write the program once, use it on all hardware and all platforms, and reuse it in the future without too much work. If there are 10 languages, each with different advantages, then the whole idea is up in the air.

    G~
     
  10. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Grow up. This has nothing to do with 3dfx or a hatred of nVIDIA. This comes from the fact that I sat face to face in a nice little meeting with David Kirk on Friday, and when I asked whether it would be OK by nVIDIA for others to make compilers for their own hardware, the answer I got was certainly not a straight 'Yes'. Now, admittedly I was tired when I got there and I haven't listened to the tape again, but it seemed to me as though he was basically saying "Why fragment things again?"

    Interestingly the only other interview I’ve read with Kirk he skirts around the issue of other vendors creating their own compilers:

    http://www.hothardware.com/hh_files/S&V/nvcg(2).shtml

    Of course, the other issue is that NVIDIA own all the syntax so other vendors cannot create syntax for features that are not currently exposed in Cg.

    If you really think that then you’re obviously the one troubled.

    Well read again, because you obviously read incorrectly.

    Consider this a warning.
     
  11. pikkachu

    Newcomer

    Joined:
    May 9, 2002
    Messages:
    17
    Likes Received:
    0
    Oh great, so they updated nvparse, called it Cg, and say it's an "open industry standard"? :roll: :evil: :roll: sheesh
     
  12. Sharkfood

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    702
    Likes Received:
    11
    Location:
    Bay Area, California
    Democoder-
    On the surface, this is exactly how I see it. Nothing "earth-shattering" about it, and I think discussions of all this hoopla are way out of proportion. Speaks volumes for the effectiveness of PR and marketing. :)

    On the surface, it seems innocuous enough. The real test will be whether a hypothetical ATI or 3DLabs or *whoever* comes out with their own compiler that can be fed NVIDIA-designed Cg code and pop out a working result... and what proprietary, legal or IP wars would surround such a translation project.

    If this can be done fairly painlessly from a legal standpoint, the only last vestige of debate will have to do with the level of the initial framework. Obviously, if the framework level is dictated by NVIDIA, they will hold the pieces for the highest common denominator. In the case of GF3 vs. 8500, this would not be the case, as PS/VS code levels would be intentionally capped at the maximum capabilities of an NVxx chipset and would likely not have the framework for higher levels of API compliance.

    Hopefully NVIDIA will be the highest common denominator with DX9 going forward. The debate begins in the hypothetical situation where they are a notch or more below that level, as they couldn't possibly be accountable for a framework that would not support their own chipsets... it would be ludicrous for NV to make a framework that produces code their own VS/PS couldn't handle, and thus begins the controlling aspect. Cg quickly becomes useful from the NV level downwards, never exploiting anything new or innovative if such ever occurs via other chipsets and the APIs in use.

    Luckily, I don't see this being earth-shattering enough to have such impact in the realm of games/3D entertainment. As said previously, it's a simple CASE tool, likely of only partial interest to game coders. Most pros are going to prefer back-of-the-envelope shaders over a tool and will tinker at the lowest level. The John Carmacks of the world aren't "wizard" users; they are slide-rule and Polish-notation calculator users. It's of great significance to plug-in designers, demos, 2D/3D application designers and the like, but I just don't see its game impact being more than a novelty for the "lower half" of developers who pretty much pawn off pretty demos with horrible models, textures and gameplay now. They will finally be able to have nice water and shader effects in their buggy, two-star games. :)

    Just my $0.02,
    -Shark
     
  13. Geeforcer

    Geeforcer Harmlessly Evil
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,320
    Likes Received:
    525
    Ultimately, it will be up to game developers to try Cg and decide whether it's a good or a bad thing. Only "field testing" by the people it was intended for will determine Cg's usefulness (or lack thereof); no amount of armchair salivating about the "great potential" or moaning about "unfair advantage" will change that.
     
  14. Matt Burris

    Newcomer

    Joined:
    May 22, 2002
    Messages:
    110
    Likes Received:
    0
    I agree wholeheartedly. The developers will be the ones to decide whether this becomes the industry standard; no amount of hyping by NVIDIA or Microsoft will help if developers don't adopt it.
     
  15. MikeC

    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    194
    Likes Received:
    0
    NVIDIA's Nick Triantos mentions that other vendors are at liberty to create their own Cg compilers.

    http://www.cgshaders.org/forums/viewtopic.php?t=18

    This approach could cause nightmares for developers wanting to make use of high-level shading languages. In fact, the industry might want to go ahead and adopt Microsoft's popular slogan:

    "Which Cg compiler do you want to use today?" :)
     
  16. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    Then why didn't David just say that, if that's the case?

    However, this still doesn't get around the fact that NVIDIA owns the IP and controls the Cg language, so if there's no support in the language for something, then other vendors still can't expose that functionality. Hopefully all DX9 hardware will equally support DX9 functionality, in which case the playing field will be levelled somewhat, as the DX9 Cg profile will cater for all hardware that supports DX9.

    Well, supporting separate compilers is not as difficult as supporting separate languages; it is similar to me taking some C code and compiling it on HP-UX and Sun to allow the application to run on the respective hardware and OS.

    However, it does raise the question of how they would do this. Do you ship the code in Cg on the game CD and use a runtime compiler, hoping that all the other vendors either have their own runtime compiler in their drivers or are shipping nVIDIA's generic one? It doesn't seem likely that that will happen anytime soon. Alternatively, do you generate the assembly code through each compiler and possibly have hardware-specific executables (errr, no), or attempt to piece each one together and produce code branches for the particular hardware that's being supported? Ouch, although probably not too dissimilar to what may actually be occurring for those that wish to use PS1.1/1.3/1.4 (even though they have to do it longhand).

    This approach is cumbersome and annoying, which is why it should be transparent for the developer to support the features they want.

    As an aside, I wonder why nVIDIA haven't created a code path for PS1.3 yet.
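    The "one language, separate per-vendor compilers" model Dave describes can be sketched as a toy dispatch in C: one source string, with a back end chosen at run time for the installed hardware. This is entirely illustrative; the vendor names and output strings are made up and bear no relation to any real compiler.

    ```c
    #include <stdio.h>
    #include <string.h>

    /* One high-level "shader" source, compiled by whichever vendor
       back end matches the installed hardware. All names here are
       hypothetical, for the sketch only. */

    typedef const char *(*compile_fn)(const char *src);

    static const char *compile_nv(const char *src)  { (void)src; return "NV vertex program"; }
    static const char *compile_ati(const char *src) { (void)src; return "ATI vertex program"; }

    struct backend { const char *vendor; compile_fn compile; };

    static const struct backend backends[] = {
        { "nvidia", compile_nv  },
        { "ati",    compile_ati },
    };

    /* Pick the back end for the detected vendor; NULL if none ships one. */
    const char *compile_for(const char *vendor, const char *src)
    {
        for (size_t i = 0; i < sizeof backends / sizeof backends[0]; i++)
            if (strcmp(backends[i].vendor, vendor) == 0)
                return backends[i].compile(src);
        return NULL;
    }

    int main(void)
    {
        /* The same source text is handed to every back end. */
        const char *src = "pos = mul(modelview, position);";
        printf("%s\n", compile_for("nvidia", src));
        printf("%s\n", compile_for("ati", src));
        return 0;
    }
    ```

    The point of the sketch is Dave's analogy: the hard part is writing each back end, not agreeing on the front-end language, which is why one language with many compilers is more tractable than many languages.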
     
  17. pascal

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    1,968
    Likes Received:
    221
    Location:
    Brasil
    My guess is Cg will be good for some work but not all work.

    High speed great looking FPS probably will be hand coded and optimized.

    Some people (any guess?) will continue to do "graphics acrobatics" that are not in the books 8)
     
  18. Zap

    Zap
    Newcomer

    Joined:
    Jun 17, 2002
    Messages:
    12
    Likes Received:
    0
    Location:
    Melbourne, FL
    I was watching my friend Gump use the Cg compiler on his PC, running an Athlon and a Radeon 8500. Some of the shader effects did not work properly. It's got to make you wonder if the whole thing is just biased towards Nvidia...
     
  19. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    That's not exactly a conclusive, exhaustive test; after all, that could just be a driver issue.
     
  20. Sharkfood

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    702
    Likes Received:
    11
    Location:
    Bay Area, California
    Bad example. If you've ever ported ANSI C code between the GNU, HP-UX and Solaris C compilers, you'd know what a freaking nightmare this is. :)
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.