Consensus on NV3x Floating-Point Pixel Shaders?

Which is closest to your opinion about NV3x FP pixel shaders, specifically relative to R3xx?

  • B) Apps that still show shaders slow are broken, or will be fixed by driver improvements

    Votes: 0 0.0%
  • C) Slow due to architecture even on NV35, NVidia is cheating benchmarks to cover it up

    Votes: 0 0.0%
  • D) Doesn't matter, since real games will be developed with NV3x shader architecture in mind

    Votes: 0 0.0%
  • E) Doesn't matter, since all current cards will be obsolete when FP-shader dependent games appear

    Votes: 0 0.0%

  • Total voters
    201
Xmas said:
Dave H said:
Suddenly it becomes crystal clear (to me) why ATI refuses to develop a runtime Cg compiler. Without 3rd-party support, games aren't going to use runtime-compiled Cg, so Nvidia loses their chance at evading DX9's specs when it comes to shader precision. Or could a game easily use runtime Cg on Nvidia cards but pre-compiled (or hand-coded) PS2.0 for ATI et al.?
A game could easily use runtime Cg on cards from any vendor.

How does that work? Is there a Cg runtime built into the game which compiles it into an intermediate representation e.g. PS 2.0?

Eh, I suppose I should just go to nvidia's website and read up on Cg, although making incorrect assertions and getting corrections is so much easier. :)
 
Dave H said:
Xmas said:
Dave H said:
Suddenly it becomes crystal clear (to me) why ATI refuses to develop a runtime Cg compiler. Without 3rd-party support, games aren't going to use runtime-compiled Cg, so Nvidia loses their chance at evading DX9's specs when it comes to shader precision. Or could a game easily use runtime Cg on Nvidia cards but pre-compiled (or hand-coded) PS2.0 for ATI et al.?
A game could easily use runtime Cg on cards from any vendor.

How does that work? Is there a Cg runtime built into the game which compiles it into an intermediate representation e.g. PS 2.0?

Eh, I suppose I should just go to nvidia's website and read up on Cg, although making incorrect assertions and getting corrections is so much easier. :)
The Cg runtime is layered on top of either D3D or OpenGL (part is common to both APIs). There is a theoretical possibility that the Cg runtime circumvents either API and communicates directly with the driver when binding a shader, but the documentation says it doesn't. And it would be simple to prove.
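For the curious, the mechanism described above can be sketched with the Cg runtime's C API. This is an illustrative fragment, not a complete program: it assumes the Cg toolkit headers, a valid D3D device, and a hypothetical `shaderSource` string holding Cg source text; function names are from Nvidia's Cg 1.x runtime documentation.

```c
#include <Cg/cg.h>       /* core runtime, common to both APIs */
#include <Cg/cgD3D9.h>   /* the D3D-specific layer */

/* Compile Cg source at run time to whatever pixel shader profile
   the current device supports. The output is ordinary shader
   assembly handed to D3D through the API, which is why it would
   be "simple to prove" the runtime isn't bypassing it. */
CGcontext ctx = cgCreateContext();
CGprofile profile = cgD3D9GetLatestPixelProfile();  /* e.g. ps_2_0 */
CGprogram prog = cgCreateProgram(ctx, CG_SOURCE, shaderSource,
                                 profile, "main", NULL);

/* Inspect the compiled intermediate representation (PS 2.0 asm, etc.) */
const char* asmText = cgGetProgramString(prog, CG_COMPILED_PROGRAM);
```

Since the compiled assembly is visible via `cgGetProgramString`, anyone can check what actually gets submitted to the API.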
 
Doomtrooper said:
Also keep in mind that this is, apparently, the direction that OpenGL 2.0 is going, only it will be even more extreme.
Difference is that the HLSL is not controlled by an IHV; some day you will wake up and that little light bulb will come on above your head.
DX9 HLSL is a disappointment. It doesn't offer runtime compiling, so it's really only useful as a tool for prototyping.
 
Xmas said:
Dave H said:
Xmas said:
Dave H said:
Suddenly it becomes crystal clear (to me) why ATI refuses to develop a runtime Cg compiler. Without 3rd-party support, games aren't going to use runtime-compiled Cg, so Nvidia loses their chance at evading DX9's specs when it comes to shader precision. Or could a game easily use runtime Cg on Nvidia cards but pre-compiled (or hand-coded) PS2.0 for ATI et al.?
A game could easily use runtime Cg on cards from any vendor.

How does that work? Is there a Cg runtime built into the game which compiles it into an intermediate representation e.g. PS 2.0?

Eh, I suppose I should just go to nvidia's website and read up on Cg, although making incorrect assertions and getting corrections is so much easier. :)
The Cg runtime is layered on top of either D3D or OpenGL (part is common to both APIs). There is a theoretical possibility that the Cg runtime circumvents either API and communicates directly with the driver when binding a shader, but the documentation says it doesn't. And it would be simple to prove.

Still confused. :?

How does the Cg runtime get on my computer? Is it part of a game I buy, or is it built into my video card drivers? I thought Cg is not being supported by anyone but Nvidia, which would tend toward the first answer.

Your answer seems to indicate I've got that right. But then the only way for the Cg runtime to get FX12 into a DX9 application would be, as you say, by sneaking around the API and talking directly to the drivers. Of course non-Nvidia drivers will expect everything the Cg runtime tells them to come through the API.

So if that's the only way to get FX12 in DX9, and the documentation claims the runtime doesn't do that, how come the documentation is also said to claim you can get FX12 in DX9 by means of the runtime?

Or was I misinformed about that?
 
Chalnoth said:
DX9 HLSL is a disappointment. It doesn't offer runtime compiling, so it's really only useful as a tool for prototyping.

Ehm ... it does. Some ATI demos are using runtime compilation already.
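For reference, the DX9 runtime path mentioned here goes through D3DX: the library compiles HLSL text to shader bytecode at run time, and the app feeds that to the device like any precompiled shader. A hedged, incomplete fragment (it assumes the DX9 SDK headers and a hypothetical `hlslSource` string; not a full program):

```c
#include <d3dx9shader.h>   /* D3DX, ships with the DX9 SDK */

LPD3DXBUFFER code = NULL, errors = NULL;

/* Compile HLSL source text to ps_2_0 bytecode at run time. */
HRESULT hr = D3DXCompileShader(hlslSource, strlen(hlslSource),
                               NULL, NULL,         /* no macros/includes */
                               "main", "ps_2_0",   /* entry point, profile */
                               0, &code, &errors, NULL);

/* On success, the buffer's contents are passed to
   IDirect3DDevice9::CreatePixelShader() as usual. */
```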
 
Dave H said:
Still confused. :?

How does the Cg runtime get on my computer? Is it part of a game I buy, or is it built into my video card drivers? I thought Cg is not being supported by anyone but Nvidia, which would tend toward the first answer.

Your answer seems to indicate I've got that right. But then the only way for the Cg runtime to get FX12 into a DX9 application would be, as you say, by sneaking around the API and talking directly to the drivers. Of course non-Nvidia drivers will expect everything the Cg runtime tells them to come through the API.

So if that's the only way to get FX12 in DX9, and the documentation claims the runtime doesn't do that, how come the documentation is also said to claim you can get FX12 in DX9 by means of the runtime?

Or was I misinformed about that?
The Cg runtime comes with the application that uses it.

The documentation doesn't say you can get FX12 in DX9. It says you can use the datatype "fixed" in Cg shaders, which doesn't even exist in DX9 HLSL. And the part of the Cg spec I quoted says that "fixed" will be treated like "float" when using PS2.x profiles.
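To make the `fixed` point concrete, here is a hypothetical Cg fragment shader (Cg syntax, not DX9 HLSL, which has no `fixed` type):

```cg
// "fixed" maps to FX12 fixed-point math only on profiles that have it
// (NV3x's fp30 profile); per the Cg spec passage quoted above, a PS2.x
// profile must treat it as "float", so FX12 cannot reach DX9 this way.
fixed4 main(float2 uv : TEXCOORD0,
            uniform sampler2D tex) : COLOR
{
    fixed4 c = tex2D(tex, uv);   // promoted to full float under ps_2_x
    return c * 0.5;
}
```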
 
Dave H said:
How does the Cg runtime get on my computer? Is it part of a game I buy, or is it built into my video card drivers? I thought Cg is not being supported by anyone but Nvidia, which would tend toward the first answer.

It will ship with the game. Not sure if static linking is supported, but if it is, then it can be linked directly into your application. Otherwise as a .dll file. The bad thing then is of course that you will be stuck with an old .dll and support for old shader targets only in the game executable. With the OpenGL 2 approach that directly targets the underlying hardware you'll always get the most out of your hardware, and there's no need to patch the game to use better hardware. Just upgrade your drivers, as you should on a regular basis anyway.
 
tEd said:
Doomtrooper said:
You really believe that games will be coded for NV3x in mind?

No..not DX9 titles.

It's already happening: Stalker, Breed.

Yeah, I don't really see a lot of GeForce FX 5600, 5800, 5900 cards being sold in light of all Nvidia's goof-ups lately. The GeForce FX 5900 is really not worth the money (nor is the 256MB Radeon 9800, IMO). I don't think that OEMs are looking on Nvidia favorably at this point, along with their DX9 architecture.

While the 5200 will enjoy more sales as it is cheap as snot, there won't be a lot of development with that core in mind because it is really a poor DX9 card. To put things as simply as possible... right now I think Cg is more popular than the NV3x series will ever be, in light of how good the Radeon 3XX series is and how long that architecture has been shipping.

I don't understand the drive to use Cg for software development. I mean, really, Microsoft's HLSL covers all DX9 architecture, so why should developers have to bother with Cg at all? Particularly if the installed DX9 hardware base is mostly not Nvidia-based products at all. I guess the assumption is that Nvidia hardware will be a larger installed base, even though they are way behind in DX9 hardware sales.

AFAIK ATi owns the current installed base of DX9 hardware (nearly 100%, AFAIK), and given what I have seen from Nvidia lately I don't see why that will be changing anytime soon. In fact I would wager that Nvidia loses a large portion of market share to ATi as a result of its poor DX9 hardware.
 
There is someone that visits this forum occasionally that could give DX9 marketshare numbers, but last time I heard..there was really no one else even in the picture.

It makes sense, you still can't buy FX cards in Canada in numbers...and we are into June.
Development of DX9 titles started back last September, at least Stalker did on a 9700.
 
Doomtrooper said:
There is someone that visits this forum occasionally that could give DX9 marketshare numbers, but last time I heard..there was really no one else even in the picture.

It makes sense, you still can't buy FX cards in Canada in numbers...and we are into June.
Development of DX9 titles started back last September, at least Stalker did on a 9700.

There is no such study AFAIK. All I know is that the Radeon 9700 Pro, Radeon 9700 non-Pro, Radeon 9500 Pro and Radeon 9500 non-Pro have been shipping for approximately two-thirds of a year with fairly good acceptance from OEMs. The Radeon 9800 Pro has been shipping for a couple of months; even the Radeon 9800 Pro 256 is shipping. Now ATi is about to start shipping the Radeon 9600 Pro and non-Pro versions, and this entire lineup, if you consider price to performance, is way better. I expect to see a wide variety of mid- to low-range design wins with the RV350. Further, the M10 mobile part will be the best DX9 mobile hardware by far. I can't see any good reason for the continuance of Nvidia's market share dominance carried over from DX8-level hardware at this point.

EDIT: I also see something for ATi in the integrated market this summer with their new integrated chipset that uses the 9200 core (but this is not DX9 hardware).

On Nvidia's hardware front: the GeForce FX 5800 is canceled and the GeForce FX 5900 is not even shipping yet. Margins on the GeForce FX 5600 will be poor at best, and Nvidia's third parties are going to be looking for ATi hardware to be in their products. The GeForce FX 5200 is cheap, but Nvidia is no longer the undisputed market leader, and I don't think, with all the negative publicity Nvidia has been getting, that their high-end hardware will win many OEMs over; as a result I don't see the GeForce FX 5200 being all too popular with OEMs. This is all speculative, but it isn't without reason that I say Nvidia has a lot of market share to lose, particularly with the advent of DX9.
 
Sabastian said:
Doomtrooper said:
There is someone that visits this forum occasionally that could give DX9 marketshare numbers, but last time I heard..there was really no one else even in the picture.

It makes sense, you still can't buy FX cards in Canada in numbers...and we are into June.
Development of DX9 titles started back last September, at least Stalker did on a 9700.

There is no such study AFAIK.

Actually there are. They're just not free. :)

Tommy McClain
 
AzBat said:
Sabastian said:
Doomtrooper said:
There is someone that visits this forum occasionally that could give DX9 marketshare numbers, but last time I heard..there was really no one else even in the picture.

It makes sense, you still can't buy FX cards in Canada in numbers...and we are into June.
Development of DX9 titles started back last September, at least Stalker did on a 9700.

There is no such study AFAIK.

Actually there are. They're just not free. :)

Tommy McClain

I know that there are studies covering overall market share from over 3 months ago, but those include DX7 and DX8 hardware as well... I don't know of any that specifically address DX9 hardware market share. Please share where this more specific study is.
 
Sabastian said:
I know that there are studies covering overall market share from over 3 months ago, but those include DX7 and DX8 hardware as well... I don't know of any that specifically address DX9 hardware market share. Please share where this more specific study is.

My former employer, Jon Peddie Associates, used to break down market share to different market segments. Their Quarterly Report study would have included that data. Unfortunately JPA is no more, but Jon Peddie started another company called Jon Peddie Research(www.jonpeddie.com). From what I understand his Market Watch report would include this type of data.

Tommy McClain
 
AzBat said:
Sabastian said:
I know that there are studies covering overall market share from over 3 months ago, but those include DX7 and DX8 hardware as well... I don't know of any that specifically address DX9 hardware market share. Please share where this more specific study is.

My former employer, Jon Peddie Associates, used to break down market share to different market segments. Their Quarterly Report study would have included that data. Unfortunately JPA is no more, but Jon Peddie started another company called Jon Peddie Research(www.jonpeddie.com). From what I understand his Market Watch report would include this type of data.

Tommy McClain

Thanks for that. I know Mercury does a lot of these sorts of studies; the last one of theirs I saw didn't break out the percentage of DX9 hardware market share. Anyhow, I will look into this, thanks again.
 
Sabastian said:
AzBat said:
Sabastian said:
I know that there are studies covering overall market share from over 3 months ago, but those include DX7 and DX8 hardware as well... I don't know of any that specifically address DX9 hardware market share. Please share where this more specific study is.

My former employer, Jon Peddie Associates, used to break down market share to different market segments. Their Quarterly Report study would have included that data. Unfortunately JPA is no more, but Jon Peddie started another company called Jon Peddie Research(www.jonpeddie.com). From what I understand his Market Watch report would include this type of data.

Tommy McClain

Thanks for that. I know Mercury does a lot of these sorts of studies; the last one of theirs I saw didn't break out the percentage of DX9 hardware market share. Anyhow, I will look into this, thanks again.

Hi Tommy, they don't have the data I was talking about.(possibly some expensive estimates.) Sorry for the belated response. Maybe someone else does?
 
Sabastian said:
Hi Tommy, they don't have the data I was talking about.(possibly some expensive estimates.) Sorry for the belated response. Maybe someone else does?

Hmm. Figured Jon would have had that. It's possible he's not breaking it down anymore due to lowering the prices on all his products. I do know that Jon does like to do custom studies, but it would cost you. Other than Jon and Mercury, possibly some of the bigger market researchers would have it. Can't remember them off the top of my head though.

Tommy McClain
 
AzBat said:
Sabastian said:
Hi Tommy, they don't have the data I was talking about.(possibly some expensive estimates.) Sorry for the belated response. Maybe someone else does?

Hmm. Figured Jon would have had that. It's possible he's not breaking it down anymore due to lowering the prices on all his products. I do know that Jon does like to do custom studies, but it would cost you. Other than Jon and Mercury, possibly some of the bigger market researchers would have it. Can't remember them off the top of my head though.

Tommy McClain

No prob, my response is a bit cryptic because the info is not intended for others. My best estimate is that ATi has a virtual monopoly on DX9 chips (tongue in cheek: 98% ;), *cough* $1000 please *cough*), though after Nvidia moves its products into the market a little quicker that ought to change, but by how much? It seems the average end users are not clever enough to realize whether their graphics hardware is DX9 or not. (I have been surprised at how many people who should know actually don't.)
 
I wouldn't be surprised if the 5200 has sold more units than the R300.

But I wouldn't be surprised the other way either.

(P.S. don't believe any of those market reports. The ones that report on consumer electronics do a piss-poor job, from my experience.)
 