NVIDIA shader compiler details

Bjorn said:
Evildeus said:
Well, it would be interesting to see what Futuremark does if future drivers include cheats for the NV3* family and not for the NV4* family :D
They're not doing any "per chip" approving for the drivers as it is now, so I guess they will continue with that. And I think that's the way to go; it'll be a mess otherwise.
Sure they are, Bjorn. Just look at their approved drivers webpage:
Notes: The ATI Catalyst 3.9, Catalyst 3.10, Catalyst 4.1, Catalyst 4.2 & Catalyst 4.3 Drivers have been tested with the 9x00 series and 8x00 series.
 
Evildeus said:
Well, I don't think they would, because they have nothing to gain from that, contrary to Futuremark and benchmark companies.

And that goes to my point: look at the mindshare beating FM took over the entire situation, probably receiving more blame and losing more market share for how they handled nVidia's cheating than the very company that consciously chose to cheat not only in synthetics but also in specific games (those used as benchmarks). We routinely see certain editors and reviewers lambasting FM for their business model, labelling their latest test suite as worthless while obviously lacking the technical knowledge to accurately determine whether or not it really is, and yet I can't help but stop and think about the absolute silence on the developer-front and the contrast their silence and market position paint to that of FM's.
 
John Reynolds said:
We routinely see certain editors and reviewers lambasting FM for their business model, labelling their latest test suite as worthless while obviously lacking the technical knowledge to accurately determine whether or not it really is, and yet I can't help but stop and think about the absolute silence on the developer-front and the contrast their silence and market position paint to that of FM's.

Hey, share some of them thoughts you're stopping to think about with us...you got me all interested now! :|
 
digitalwanderer said:
Hey, share some of them thoughts you're stopping to think about with us...you got me all interested now! :|

No thx. I don't want to come off sounding like an apologist for FM (which I'm not).
 
Inq?

My real question is how long this will take to show up on the Inq and will they credit jsea and/or B3D? ;)

Regards, Chris.
 
My thoughts after reading the text:

Hash values are being used to identify shaders and apps. It is a convenient way to automate the naming of shaders and apps. Just as importantly, hashing makes it difficult for a third party to determine which apps and shaders are being replaced.
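To make that concrete, here is a minimal sketch of the kind of lookup I have in mind (purely my own illustration in C++ with made-up names, not anything taken from the leaked text or NVIDIA's actual code). The driver hashes the incoming shader bytecode and, if the hash matches a table entry, silently substitutes a hand-tuned version:

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// 64-bit FNV-1a over the shader bytecode/text; any stable hash would do.
uint64_t HashShader(const std::vector<uint8_t>& bytes) {
    uint64_t h = 1469598103934665603ULL;   // FNV offset basis
    for (uint8_t b : bytes) {
        h ^= b;
        h *= 1099511628211ULL;             // FNV prime
    }
    return h;
}

// Replacement table keyed only by hash (hypothetical). Someone inspecting
// this table can't tell which game or shader an entry belongs to unless
// they already have the original shader to hash and compare against.
std::unordered_map<uint64_t, std::string> g_replacements = {
    // { 0x123456789abcdef0ULL, "...hand-tuned shader..." },  // made-up entry
};

// Called at shader-creation time: hand back the tuned version if the hash
// matches, otherwise compile the developer's shader untouched.
std::string SelectShader(const std::vector<uint8_t>& original) {
    auto it = g_replacements.find(HashShader(original));
    if (it != g_replacements.end())
        return it->second;                                      // replaced path
    return std::string(original.begin(), original.end());      // untouched path
}
```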

There is an internal 'Verdict' team. This team's responsibility is to test hand-tuned shaders against developer shaders, hence the passthrough hash code that returns the original shader when requested by this team. I am guessing that their main function is to make sure the public and reviewers arrive at the correct 'Verdict' (from NVIDIA's point of view).
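Extending the hypothetical sketch above (again, my own guess at how it could look, with invented names), a pass-through would just be a switch that skips the lookup so internal testers can get the developer's original shader back for comparison:

```cpp
// Hypothetical extension of the sketch above (same HashShader/SelectShader):
// a pass-through switch that internal testers could flip to always get the
// developer's original shader back, so it can be compared against the
// hand-tuned replacement.
bool g_passThrough = false;   // e.g. toggled by some private setting

std::string SelectShaderForTesting(const std::vector<uint8_t>& original) {
    if (g_passThrough)
        return std::string(original.begin(), original.end());  // skip replacements
    return SelectShader(original);                              // normal path
}
```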

The looping and branching flow-control structures are very interesting to me. I am still curious as to whether they are handled completely in hardware.

I am also curious as to whether others on this board were able to download the original file containing this text, in order to absolutely establish the origin of the information. I mean no disrespect to jsea; this would simply verify the source. Anaqer, did you extract the text yourself?

Regards, Chris.
 
Chalnoth said:
Bjorn said:
Evildeus said:
Well, it would be interesting to see what Futuremark does if future drivers include cheats for the NV3* family and not for the NV4* family :D
They're not doing any "per chip" approving for the drivers as it is now, so I guess they will continue with that. And I think that's the way to go; it'll be a mess otherwise.
Sure they are, Bjorn. Just look at their approved drivers webpage:
Notes: The ATI Catalyst 3.9, Catalyst 3.10, Catalyst 4.1, Catalyst 4.2 & Catalyst 4.3 Drivers have been tested with the 9x00 series and 8x00 series.

The 7500 series is rather old though and not really relevant for 3DMark 2003. It would be a completely different thing to approve drivers for NV4X and not for NV3X.
 
Bjorn said:
The 7500 series is rather old though and not really relevant for 3DMark 2003. It would be a completely different thing to approve drivers for NV4X and not for NV3X.
That's not what I was trying to say. I was trying to show that Futuremark does indeed test drivers on different hardware, so there is nothing keeping them from approving drivers for the NV4x and not for the NV3x. Just because they haven't in the past (except for older chips) doesn't mean they won't in the future.
 
Chalnoth said:
That's not what I was trying to say. I was trying to show that Futuremark does indeed test drivers on different hardware, so there is nothing keeping them from approving drivers for the NV4x and not for the NV3x. Just because they haven't in the past (except for older chips) doesn't mean they won't in the future.

No, of course not. It would be a very simple thing to approve drivers for the NV4x only. I just think that it could easily be misleading. But I guess an "ONLY FOR THE NV4X" in bold on the approved drivers page would do the trick.
 
nelg said:
If these shader replacements are innocuous then why hide them? Save for 3Dmark, where it is against the EULA, no reasonable person would object to having code run faster with the same I.Q.

There are very good reasons to oppose hand-tuned shaders even if they maintain the same IQ.

Hand-tuned shaders effectively hold game developers hostage to IHVs' driver teams. Developers cannot change their own programs without adverse effects.

An IHV codes hand-tuned, app-detection-based shader replacements for a game. The game runs well. Nice. Customers are happy. Then the developer has to patch the game for bug fixes or enhancements. This breaks the app detection built into the drivers. Oops. People download the patch and game performance is halved. Who do you think they blame for this? The publisher's support forums are flooded with angry customers, and they get bad publicity for a messed-up patch even though it is not their fault. The developer can't do anything about it.
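To make the failure mode concrete, here is a toy illustration (my own hypothetical example, borrowing the hash-lookup idea discussed earlier in the thread): a one-byte change from a patch is enough to change the hash, so the replacement silently stops matching and the game drops back to the slow path.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

// Same FNV-1a idea sketched earlier in the thread.
uint64_t HashShader(const std::vector<uint8_t>& bytes) {
    uint64_t h = 1469598103934665603ULL;
    for (uint8_t b : bytes) { h ^= b; h *= 1099511628211ULL; }
    return h;
}

int main() {
    std::vector<uint8_t> v10 = {0x10, 0x22, 0x37, 0x41};  // shader shipped in v1.0
    std::vector<uint8_t> v11 = v10;
    v11[2] = 0x38;   // the 1.1 patch recompiles the shader with a tiny change

    // The driver's replacement table was built against the v1.0 hash, so the
    // patched shader no longer matches and the hand-tuned path silently goes away.
    assert(HashShader(v10) != HashShader(v11));
    return 0;
}
```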

Not a viable solution at all.

Additionally, the shader detections might not carry over to mods or custom maps, again holding their developers hostage to the driver teams and leaving them to hope that the IHV will bother to app-detect their work at some undetermined point in the future.

And need I tell you that shader replacements built into the drivers are bloat? The code should be in the application, not in the drivers.

Application detection is only acceptable when it is the only way to fix a problem in a borked application. It is not a viable strategy for performance enhancements.
 
What I don't get is why people talk about optimisations in drivers as if they're bad things. Yeah, we have some sort of fine line here where at some stages IQ is dropped, even if only slightly, and yes, doing so while not letting advanced users have the option to turn it off is a bit restrictive.

But this is the way I see it: if I select 4xAF, for instance, it's going to run at a certain speed and give me a certain quality. If that quality is slightly lower than expected to give a boost to FPS, then OK, fine. If I have frame rate left over when configuring my game, I'll simply whack the AF up another level.

At the end of the day, performance vs quality is a balance, and the NV3X range had some hardware bottlenecks which stopped it from really competing with the Radeons. But everyone sees the words "optimisations" and "cheats", and no one ever seems to realise there's a benefit side to this, otherwise Nvidia wouldn't be doing it.

Great, so we add the "take away optimisations" button and now I'm left with 100% pure image quality, and now I have to drop my driver and game settings because the frame rate isn't what I'd call acceptable.

EITHER WAY, the hardware can only run so fast, so as long as the trade-offs are reasonable I don't see the problem.

Yeah, so maybe they attempt to manipulate common benchmarks and games used frequently for benchmarking, but any good reviewer will tell you if there's an IQ difference when running a benchmark; you can compare IQ shots. THAT'S what the benchmark is for. There's no such thing as apples to apples, and if you're under the impression there is then please, for the love of god, drop that idea now. It's not as if benchmarks have ever been a particularly accurate way to compare performance in complex things; it's the nature of benchmarking.

But it's nice to finger-point and cause a load of trouble while we're bored and waiting for the next line of cards, isn't it? Tbh, image quality goes out the window when I'm having fun blasting people online in UT2004 at speeds so fast you're not even sure what colours you're looking at, let alone whether they're smoothly AA'd.

-Princess_Frosty
 
Princess_Frosty said:
What I don't get is why people talk about optimisations in drivers as if they're bad things. Yeah, we have some sort of fine line here where at some stages IQ is dropped, even if only slightly, and yes, doing so while not letting advanced users have the option to turn it off is a bit restrictive.

But this is the way I see it: if I select 4xAF, for instance, it's going to run at a certain speed and give me a certain quality. If that quality is slightly lower than expected to give a boost to FPS, then OK, fine. If I have frame rate left over when configuring my game, I'll simply whack the AF up another level.

At the end of the day, performance vs quality is a balance, and the NV3X range had some hardware bottlenecks which stopped it from really competing with the Radeons. But everyone sees the words "optimisations" and "cheats", and no one ever seems to realise there's a benefit side to this, otherwise Nvidia wouldn't be doing it.

Great, so we add the "take away optimisations" button and now I'm left with 100% pure image quality, and now I have to drop my driver and game settings because the frame rate isn't what I'd call acceptable.

EITHER WAY, the hardware can only run so fast, so as long as the trade-offs are reasonable I don't see the problem.

Yeah, so maybe they attempt to manipulate common benchmarks and games used frequently for benchmarking, but any good reviewer will tell you if there's an IQ difference when running a benchmark; you can compare IQ shots. THAT'S what the benchmark is for. There's no such thing as apples to apples, and if you're under the impression there is then please, for the love of god, drop that idea now. It's not as if benchmarks have ever been a particularly accurate way to compare performance in complex things; it's the nature of benchmarking.

But it's nice to finger-point and cause a load of trouble while we're bored and waiting for the next line of cards, isn't it? Tbh, image quality goes out the window when I'm having fun blasting people online in UT2004 at speeds so fast you're not even sure what colours you're looking at, let alone whether they're smoothly AA'd.

-Princess_Frosty

Wow, this thread even pulled you over from [H]? Color me unimpressed. :rolleyes:

It shouldn't be the IHV's decision to replace things like trilinear with brilinear and say, "it's good enough"...it's a decision that should be up to the end user.
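For anyone not up on the jargon, "brilinear" basically means shrinking the band where two mip levels actually get blended and using plain bilinear everywhere else. A rough sketch of the idea (my own simplification with a made-up band width, not anybody's actual hardware or driver code):

```cpp
#include <cstdio>

// Full trilinear: blend between the two nearest mip levels across the whole
// fractional LOD range.
float TrilinearWeight(float lodFrac) {
    return lodFrac;
}

// "Brilinear": only blend inside a narrow band around the mip transition
// (the 0.25 band width here is a made-up number, purely illustrative);
// outside the band you effectively get bilinear from a single mip level,
// which is cheaper but can show mip banding.
float BrilinearWeight(float lodFrac, float band = 0.25f) {
    float lo = 0.5f - band * 0.5f;
    float hi = 0.5f + band * 0.5f;
    if (lodFrac <= lo) return 0.0f;         // lower mip only
    if (lodFrac >= hi) return 1.0f;         // upper mip only
    return (lodFrac - lo) / (hi - lo);      // blend inside the band
}

int main() {
    for (float f = 0.0f; f <= 1.001f; f += 0.125f)
        std::printf("frac %.3f  trilinear %.3f  brilinear %.3f\n",
                    f, TrilinearWeight(f), BrilinearWeight(f));
    return 0;
}
```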

All your arguments don't change the fact that nVidia ain't delivering what they say they're delivering, and that ain't right. :(
 
DemoCoder said:
If your hardware can run at superior quality or precision to other hardware, I think you're justified in doing apples to apples comparisons.
You seem to be taking a lot of flak for your comments, but I have a question to ask you.

We have seen that the 9600 can do "brilinear" here. Do you think ATI should make this the standard mode? It's the only way ATI can get an "apples to apples comparison" with NVidia from every review site (even HardOCP hasn't taken this into account since that article). I think ATI is trying to be the bigger man here and hoping the gaming community will reward them for it, but IMO it's a lost cause.

What should ATI do?
 
Princess_Frosty said:
But it's nice to finger-point and cause a load of trouble while we're bored and waiting for the next line of cards, isn't it? Tbh, image quality goes out the window when I'm having fun blasting people online in UT2004 at speeds so fast you're not even sure what colours you're looking at, let alone whether they're smoothly AA'd.

-Princess_Frosty

And that example is supposed to be somehow representative of all PC gaming? In fact, under that logic let's all just drop back to 640x480x16 with point sampling for our texture filtering since all gaming looks like a Monet.
 
Princess_Frosty said:
What I don't get is why people talk about optimisations in drivers as if they're bad things. Yeah, we have some sort of fine line here where at some stages IQ is dropped, even if only slightly, and yes, doing so while not letting advanced users have the option to turn it off is a bit restrictive.

Blah, blah, blah

Yes, let's justify Nvidia's tactics, and in fact, encourage them to do it more. Why even try to make hardware that can perform at the highest quality when you can just cheat your way to high frames per second? Let's not even think about hardware with good speed at doing quality AF.

Brilinear for all!

Because THAT'S ALL YOU NEED. Why? BECAUSE NVIDIA SAID SO. Know your role.
 
I'm sure it wasn't NVIDIA's intention to release such a poorly performing product line, but that was all they had. One can't blame them for this; they tried their best and failed. I imagine if the previous posters actually owned one of these products and couldn't afford to replace it with something better, their view of the situation would change. Of course the option should be given, but it still doesn't change the reality that the majority of their cards need the optimizations to perform well. It isn't an excuse for NVIDIA's behavior; it is an acceptance of how their products actually perform. It doesn't make it right; it is just a reflection of the fact that the majority of the cards being sold are not high end.

In fact, under that logic let's all just drop back to 640x480x16 with point sampling for our texture filtering since all gaming looks like a Monet.

Huh? The reason the optimizations exist is to ensure that the user can run at higher resolutions with acceptable frame rates. Would you rather choose 640x480 with trilinear, full-precision shaders, and correct AF, or 1280x1024 with brilinear, partial-precision shaders, and optimized AF? This situation wouldn't occur, of course, if the initial NV3x products didn't suck, but they do. If you're unfortunate enough to be stuck with one of these cards, you're probably happy with NVIDIA's lie, cheat, steal tactics with regard to performance.
 