New nvidia graphics language

SteveOh

New member
Hi guys,

I've been thinking of getting into graphics coding, but was put off by the huge learning curve. However, this morning, I see nvidia has a brand new, high level, graphics programming language called Cg. It's here at http://developer.nvidia.com/view.asp?PAGE=cg_main.

My question is, since I have a Radeon 8500 card, can I use this method of coding for my card, or is it only for nvidiots? This could be the answer to my coding dreams, but I'm not willing to "change sides".
 
Re: New nvidia graphics language

SteveOh said:
Hi guys,

I've been thinking of getting into graphics coding, but was put off by the huge learning curve. However, this morning, I see nvidia has a brand new, high level, graphics programming language called Cg. It's here at http://developer.nvidia.com/view.asp?PAGE=cg_main.

My question is, since I have a Radeon 8500 card, can I use this method of coding for my card, or is it only for nvidiots? This could be the answer to my coding dreams, but I'm not willing to "change sides".

You sure can! Just as long as you use D3D (OpenGL uses nVIDIA's shader extensions).
 
Thanks for the reply, NitroGL. Think I'll dl it this weekend and see what we have. Should be quite a learning experience!
 
Nitro, what's your opinion on this? To me, it seems like a higher-level shading language becomes a necessity as shader complexity increases, especially if you want to get this kind of functionality "to the masses"... but I don't agree with Nvidia's statements about how they feel PS 1.4 and the like should be handled. As far as I can tell, they seem to want everyone's code to compile to 1.1-1.3 compliance (dumbed down), with the driver left to "smarten it up" for 1.4 compliance. Wouldn't this introduce a lot of unnecessary overhead when performing these operations on an 8500? To me, this seems detrimental to ATI, at least until ATI decides it likes the language itself and writes its own compiler (it didn't seem as if ATI or any other company was given access to the compiler code; is this the case?)

Either way, by making it infinitely more convenient for developers to code for their hardware, Nvidia seems to be pushing developers toward the idea that PS 1.4 is something they shouldn't have to worry about. I just don't like the idea of Nvidia's compiler sitting between developer code and the hardware. What Nvidia is doing is definitely a step in the right direction, but having Nvidia be the one to take this step is not the optimal scenario.

While people like Carmack couldn't care less about Nvidia's compiler, I think it may become the de facto standard should a lot of lower-level developers decide to use it. In my opinion, ATI should respond to this and let developers know where they stand on the language itself, and what direction they are taking to make sure that their hardware is used, and used easily.

Do you take all this seriously? Do you think Nvidia's "standard" will reach the level of de facto standard? Are we going to discover that Cg is simply a subset of HLSL?

What is the timeframe for HLSL?
 
I think it's a good idea, but I don't think it will ever become a standard for doing graphics (maybe for making demos, but not games), not until they support hardware other than their own. Either that, or release all the source code and let companies add the support themselves (or maybe someone else *cough* *cough* ;)).
 
hithere said:
but I don't agree with Nvidia's statements about how they feel PS 1.4 and the like should be handled. As far as I can tell, they seem to want everyone's code to compile to 1.1-1.3 compliance (dumbed down), with the driver left to "smarten it up" for 1.4 compliance. Wouldn't this introduce a lot of unnecessary overhead when performing these operations on an 8500? To me, this seems detrimental to ATI, at least until ATI decides it likes the language itself and writes its own compiler (it didn't seem as if ATI or any other company was given access to the compiler code; is this the case?)

Either way, by making it infinitely more convenient for developers to code for their hardware, Nvidia seems to be pushing developers toward the idea that PS 1.4 is something they shouldn't have to worry about. I just don't like the idea of Nvidia's compiler sitting between developer code and the hardware. What Nvidia is doing is definitely a step in the right direction, but having Nvidia be the one to take this step is not the optimal scenario.

I thought most of the compiler was going to be open source? Not the code generation bits that contain Nvidia IP, obviously, but enough for ATI to produce their own compiler to go straight from Cg to PS 1.4. It's like C: just a spec for the language, and anyone can make a compiler for their hardware.
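The "spec plus vendor compilers" model described above can be sketched roughly like this: one shared front end produces an intermediate form, and each vendor writes its own back end targeting its own shader profile. This is a toy illustration only; all the function names and the one-line "language" here are invented, and nothing in it comes from the actual Cg toolkit.

```python
# Toy sketch of one language spec with multiple vendor back ends.
# The front end is shared; each (hypothetical) vendor back end maps
# the same intermediate form to its own shader target.

def front_end(source):
    """Parse a one-line 'dest = a op b' statement into an intermediate op."""
    dest, expr = [s.strip() for s in source.split("=")]
    a, op, b = expr.split()
    return {"dest": dest, "op": op, "args": (a, b)}

def ps11_backend(ir):
    """Hypothetical back end targeting a PS 1.1-style profile."""
    mnemonic = {"+": "add", "*": "mul"}[ir["op"]]
    return f"ps.1.1\n{mnemonic} {ir['dest']}, {ir['args'][0]}, {ir['args'][1]}"

def ps14_backend(ir):
    """Hypothetical vendor-written back end going straight to PS 1.4."""
    mnemonic = {"+": "add", "*": "mul"}[ir["op"]]
    return f"ps.1.4\n{mnemonic} {ir['dest']}, {ir['args'][0]}, {ir['args'][1]}"

# Same source, same front end, two different targets:
ir = front_end("r0 = v0 * t0")
print(ps11_backend(ir))
print(ps14_backend(ir))
```

The point of the sketch: nothing stops a second vendor from plugging its own back end under the shared front end, which is exactly the "ATI writes its own compiler" scenario.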

Originally posted by NitroGL
I think it's a good idea, but I don't think it will ever become a standard for doing graphics (maybe for making demos, but not games), not until they support hardware other than their own. Either that, or release all the source code and let companies add the support themselves

Buh? It produces standard DX8 shaders, which run fine on ATI hardware. As for OpenGL, well, shader extensions are a mess there anyway at the moment; the next release of the toolkit should add more functionality.

Judging by the commitment level from developers (you did watch the video?), I don't think it's fair to say it's going to be restricted to a few demo makers... and, as above, companies can support their own hardware with their own compilers.

Personally, I had a paroxysm of ecstasy when I read about it. Don't forget it was jointly specified with MS as well; it's not just an Nvidia-only thing.
 
Myrmecophagavir said:
I thought most of the compiler was going to be open source? Not the code generation bits that contain Nvidia IP, obviously, but enough for ATI to produce their own compiler to go straight from Cg to PS 1.4. It's like C: just a spec for the language, and anyone can make a compiler for their hardware.

Well, the open-source parts of the Cg compiler are the lexical analysis, syntax analysis, and type-checking phases. Making them open source is not as generous as nVidia would like people to think: those phases are relatively straightforward to code, since they are deterministic processes.
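To see why those three phases are the "mechanical" part, here is a deliberately tiny sketch of them for an invented one-declaration grammar. Nothing here is taken from the real Cg front end (which is vastly larger), but the phases are equally deterministic: fix the grammar and the type rules, and the code writes itself.

```python
# Toy illustration of the three open-sourced phases: lexical analysis,
# syntax analysis, and type checking, for an invented grammar of the
# form "<type> <name> ;". Purely illustrative, not the Cg front end.
import re

TYPES = {"float", "float2", "float3", "float4"}

def lex(src):
    """Lexical analysis: split source text into tokens."""
    return re.findall(r"[A-Za-z_]\w*|[=;]", src)

def parse(tokens):
    """Syntax analysis: expect '<type> <name> ;' and build a tiny AST."""
    if len(tokens) != 3 or tokens[2] != ";":
        raise SyntaxError("expected: <type> <name> ;")
    return {"kind": "decl", "type": tokens[0], "name": tokens[1]}

def typecheck(ast):
    """Type checking: the declared type must be a known type."""
    if ast["type"] not in TYPES:
        raise TypeError(f"unknown type {ast['type']!r}")
    return ast

print(typecheck(parse(lex("float4 color;"))))
```

The hard, vendor-specific part (and the part kept closed) is what comes *after* this: turning the checked AST into good code for a particular piece of hardware.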

Remember that nVidia would probably get a lot of criticism if they made it completely closed, and their success would be limited. Now they can advertise their product as open source and supportive of other 3D vendors. But they do have supreme executive power over the top-level syntax, so I don't think Cg is appealing to any 3D vendor other than themselves.

Now, I haven't looked at the language itself in any depth, but from what I do know, I don't expect any other 3D vendors to support it in its current state.
 
Onde Pik said:
Remember that nVidia would probably get a lot of criticism if they made it completely closed, and their success would be limited. Now they can advertise their product as open source and supportive of other 3D vendors. But they do have supreme executive power over the top-level syntax, so I don't think Cg is appealing to any 3D vendor other than themselves.

Now, I haven't looked at the language itself in any depth, but from what I do know, I don't expect any other 3D vendors to support it in its current state.
Nvidia can't force anyone to use syntax they don't like... if, say, ATI wanted some changes that Nvidia wouldn't include in the spec (which is unlikely, given how much the people involved said they *want* it to be open to everyone), they could just produce their own compiler which recognised the new syntax, the same way MSVC++ recognises extensions to the ANSI syntax. Of course, you'd have to stick with precompiled shaders then. But it wouldn't be in Nvidia's interest to add new features which only favoured themselves, for exactly the same reason: run-time compilers included with competitors' drivers wouldn't be able to accept the syntax.
 
While I think the idea behind Cg is a good one for the future, it's not really very important until we start seeing much larger PS programs. But once that time comes, we don't want to be scrambling to come up with something. Also, while their implementation is decent, it is missing features that should have been included (obviously because current Nvidia hardware wouldn't support them). From this I deduce that the language's feature set will probably mirror Nvidia's hardware in the future as well.

Also, there are similar languages coming from OpenGL 2.0 and Microsoft. I don't see the need for one from a biased source like Nvidia. Sure, they say theirs is better (because it's written with their hardware in mind).
 
Myrmecophagavir said:
Nvidia can't force anyone to use syntax they don't like... if, say, ATI wanted some changes that Nvidia wouldn't include in the spec (which is unlikely, given how much the people involved said they *want* it to be open to everyone), they could just produce their own compiler which recognised the new syntax, the same way MSVC++ recognises extensions to the ANSI syntax. Of course, you'd have to stick with precompiled shaders then. But it wouldn't be in Nvidia's interest to add new features which only favoured themselves, for exactly the same reason: run-time compilers included with competitors' drivers wouldn't be able to accept the syntax.

If ATI did that, then it wouldn't be Cg anymore.
 