Legitimizing shader replacement.

Me

I was just wandering through some of the driver cheating threads, and it strikes me that there are probably a lot of games that could legitimately benefit from shader replacement by IHVs. Nvidia has obviously put a large amount of effort into making sure that specific popular titles run as fast as possible on its hardware.

There was a thread about the Nature benchmark in 3DMark 2K1 gaining ~50% from shader replacement. That is the kind of performance gain that many people would consider making tradeoffs for. Also, with shader replacement it's fairly easy to judge the IQ tradeoffs for yourself.

So what I was thinking is that it would be an interesting idea if the IHVs offered "Acceleration Packs" that users could download, in which they would be free to replace shaders to their heart's content. Reviewers could then have shootouts based on either the fully optimized versions or "clean" drivers.

Not that I think it's likely to happen, but it sure would be nice.
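To make it concrete, such a pack wouldn't need to be anything fancier than a table keyed on shader hashes, with an off switch for reviewers. A rough sketch (everything here is hypothetical, nothing like this actually ships):

```cpp
#include <cstdint>
#include <string>
#include <vector>

// Hypothetical "Acceleration Pack" entry: maps a shader the driver
// recognizes (by a hash of the app-supplied bytecode) to a hand-tuned
// replacement, with notes so the user can judge the tradeoff.
struct ShaderOverride {
    std::uint64_t originalHash;  // hash of the game's original shader
    std::string   replacement;   // hand-optimized shader for this hardware
    std::string   notes;         // e.g. "water shader, ~50% faster,
                                 //       lower-precision reflections"
};

struct AccelerationPack {
    std::string appName;                    // e.g. "3DMark2001 - Nature"
    std::vector<ShaderOverride> overrides;  // one entry per replaced shader
    bool userEnabled;                       // off by default; reviewers can
                                            // benchmark "clean" with it off
};
```

The userEnabled flag is the whole point: reviewers benchmark with it off for the "clean" numbers, and users flip it on once they've judged the IQ tradeoff for themselves.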
 
Shader replacement is useful if your hardware is poorly utilized by games and you don't have a shader compiler.
 
I'm all for it as long as there is a direct performance increase and it doesn't change any output in any way, shape or form. Meh, it's only my opinion.
 
It's up to the people that write the program..... and NO one else. You think John Carmack would want - or let - anyone rewrite parts of any program he's written..... I don't think so.
 
I'll be quite intrigued to see what ATI does if games start coming out with only ps1.1/3.0 & no 2.0 as suggested by Dany Lepage
I'm expecting that Splinter Cell - X will only support SM 1.1 and SM 3.0 when it comes out.
Strikes me he won't get very big sales for his game :?
 
Then a user works out how to edit the shader pack and puts cheats into online games (e.g. transparent walls, etc.) and PunkBuster and the like won't pick it up.
 
arrrse said:
I'll be quite intrigued to see what ATI does if games start coming out with only ps1.1/3.0 & no 2.0 as suggested by Dany Lepage
I'm expecting that Splinter Cell - X will only support SM 1.1 and SM 3.0 when it comes out.
Strikes me he won't get very big sales for his game :?

Well, SC: PT looks pretty good on the Xbox's SM1.1, so SC-X should still look pretty good; SM3.0 cards will just get something nicer.

Anyway, by the time SC-X comes out, the R500 will probably be released, so I think they are doing the right thing, targeting a future baseline.
 
If anyone wants to replace part of my work with theirs, I would want them to ask me first. Getting my endorsement and publishing it isn't as important, however.

For the first part, I'm not sure if it's legally OK if they don't ask me, although in the general scheme of things it's not something I'd pursue legally or bug that person/IHV about. It would be annoying nonetheless if they didn't ask me first. MHO of course.
 
DemoCoder said:
Anyway, by the time SC-X comes out, the R500 will probably be released, so I think they are doing the right thing, targeting a future baseline.

Especially since the X-Box 2 seems to be an SM3.0 part (R500-based).
 
Reverend said:
If anyone wants to replace part of my work with theirs, I would want them to ask me first. Getting my endorsement and publishing it isn't as important, however.

For the first part, I'm not sure if it's legally OK if they don't ask me, although in the general scheme of things it's not something I'd pursue legally or bug that person/IHV about. It would be annoying nonetheless if they didn't ask me first. MHO of course.

Wow! Now there's a tactic that never occurred to me as a way to fight it. Bring an IP suit based on creating a "derivative work" without permission!
 
If an IHV is so dead set on putting in its own shaders, couldn't it just submit them to the developer to be included in a patch, if the substitution is found to be acceptable? Then everyone could do it, and the guys making the games would still have the final say.

If a developer finds that the replacement is too fragile for possible modifications to the game or user customization, then they can write back to the video guys with their concerns.

The alternative of sneaking the shaders into the drivers just reeks of impropriety. Not only could they screw up the behavior of the affected application, but now any hope for communication is in the form of confrontation and stonewalling. It's unhealthy for any relationship to be based on miscommunication and deceit, which is what would be the result.

If the driver guys are working outside of the organization and authority of the developer company, then they have no say in messing with the guts of the product.
 
Think about it from this viewpoint:

Why would it be necessary for the driver to replace the shaders in the game? :?:

Are its developers too stupid to write correct shaders? Doubtful...
And if they are, why not just give them the better shaders and let them patch the game? :?

Wouldn't this mean there's a different reason to replace the shaders in the driver? For example, because it does NOT do exactly what the programmer intended. Or that it does increase speed, but ONLY in the part that is used in the benchmark, not in the rest of the game?

Let's not kid ourselves... replacing shaders is done for one reason only: to raise the (apparent) speed in benchmarks.
Benchmarks are performed to give an idea of how video cards perform in general, NOT just in the benchmarked games... Increasing speed in the benchmark, even with true optimizations, gives a wrong indication of how the video card performs in other games. I'm not so naive as to think non-benchmark games would receive the same optimizations. ;)

Actually, you can see this illustrated when reviewers use a less well-known game for benchmarking. Some cards (we all know which ones :devilish: ) suddenly perform a whole lot worse compared to the competition.

Guess what... I don't play UT2k3 or Halo or Quake...

So... a generic recompiler that really would work on most games, without breaking them, is OK by me (not that I believe for one second that these so-called generic compilers really are generic...). Benchmark-specific optimizations are not OK, because they give the wrong information about the games I play. And probably also the wrong information if you DO play the typical benchmark games. (Again... why is the game programmed differently?)
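To illustrate the difference, a genuinely generic optimization is a mechanical rewrite the compiler can apply to any shader it's handed, with no idea which game it came from. A toy sketch, with a made-up instruction set (real drivers obviously don't look like this):

```cpp
#include <cstddef>
#include <vector>

// Toy shader IR, invented purely for illustration.
enum class Op { MUL, ADD, MAD };
struct Instr { Op op; int dst, src0, src1, src2; };

// Generic peephole pass: fuse "mul r,a,b" followed by "add d,r,c" into a
// single "mad d,a,b,c". Applies to ANY shader fed to the driver; no app
// detection involved. (A real pass would also check that the MUL result
// isn't read anywhere else before dropping it.)
void FuseMulAdd(std::vector<Instr>& code) {
    std::vector<Instr> out;
    for (std::size_t i = 0; i < code.size(); ++i) {
        if (i + 1 < code.size() && code[i].op == Op::MUL &&
            code[i + 1].op == Op::ADD && code[i + 1].src0 == code[i].dst) {
            out.push_back({Op::MAD, code[i + 1].dst, code[i].src0,
                           code[i].src1, code[i + 1].src1});
            ++i;  // the ADD was consumed by the fused MAD
        } else {
            out.push_back(code[i]);
        }
    }
    code = out;
}
```

Keying on one particular game's shaders, by contrast, is app detection, however the driver release notes dress it up.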
 
Ylandro said:
Are its developers too stupid to write correct shaders? Doubtful...
And if they are, why not just give them the better shaders and let them patch the game? :?
Developers have a limited time to develop and may not have time to write specific paths for all of your chips, and I would guess that the driver developers WOULD have better knowledge of performance tuning than just about any developer.
Ylandro said:
Or that it does increase speed, but ONLY in the part that is used in the benchmark, not in the rest of the game?

Let's not kid ourselves... replacing shaders is done for one reason only: to raise the (apparent) speed in benchmarks.
Benchmarks are performed to give an idea of how video cards perform in general, NOT just in the benchmarked games... Increasing speed in the benchmark, even with true optimizations, gives a wrong indication of how the video card performs in other games. I'm not so naive as to think non-benchmark games would receive the same optimizations. ;)
The thing that brought this on was the fact that Nvidia was doing shader replacement in timedemos. They are obviously doing shader replacement for the entire game at that point, which means users are getting the high-speed shaders whenever they play the game, which means it is speeding up their gameplay experience.
Also, the games used for benchmarks tend to be the most popular ones, which means that the ones optimized are the ones users are most likely to be playing.
Note that this does not diminish the need for "Pure" benchmarks without the specific optimizations, because people do play the less popular games that will never get optimized.
 
Let's make excuses for what is, with little argument, a poor implementation of DX9 in hardware. IF the hardware in question had been designed with DX9 compliance in mind, this would be a non-discussion, period. And if you think not, just wait till the new cards from said manufacturer come out, and see just how much support these "old" cards get - they will be considered DX8 compliant only...
 
Me said:
Developers have a limited time to develop and may not have time to write specific paths for all of your chips, and I would guess that the driver developers WOULD have better knowledge of performance tuning than just about any developer.
That still does not explain why you put the replacement shader in the driver rather than simply patching the game. What do they have to hide?

And we're not talking about your average developers here... It's software from Id and Epic that's being manipulated... Are even they not capable of writing efficient shaders? Maybe the problem, then, is not the developers but the hardware? :rolleyes:

Me said:
The thing that brought this on was the fact that Nvidia was doing shader replacement in timedemos. They are obviously doing shader replacement for the entire game at that point, which means users are getting the high-speed shaders whenever they play the game, which means it is speeding up their gameplay experience.
Does it really speed up the gameplay experience? Obviously it does for the small part that's in the timedemo. But does that also hold for the rest of the game? :?: :!:

Some parts will not be used anywhere else in the game... thus useless. And who says those shader replacements have a positive effect on the rest of the game? They might just as well give less performance in the rest of the game. Or serious artefacts and image quality issues... Who is to know?

Again... why not give the code to the developers and let them patch the game? Maybe because they would see the problems associated with the replacement shaders? :devilish:

Me said:
Also, the games used for benchmarks tend to be the most popular ones, which means that the ones optimized are the ones users are most likely to be playing.

People really do play more than the 8 games out there that are being used for benchmarking. And they certainly play more levels than just the ones used in the timedemos...
 
Ylandro said:
That still does not explain why you put the replacement shader in the driver rather than simply patching the game. What do they have to hide?
As I said above, the developer may not even be supporting the chip as a separate code path.
Ylandro said:
And we're not talking about your average developers here... It's software from Id and Epic that's being manipulated... Are even they not capable of writing efficient shaders? Maybe the problem, then, is not the developers but the hardware? :rolleyes:
There was a thread about Nvidia's drivers claiming there was a 73-entry table of applications. It isn't just UT & Quake.
Ylandro said:
Does it really speed up the gameplay experience? Obviously it does for the small part that's in the timedemo. But does that also hold for the rest of the game? :?: :!:
Are there really that many shaders that are only used in one area of a game? I honestly don't know what the shader/game and shader/area ratios are.
Ylandro said:
They might just as well give less performance in the rest of the game. Or serious artefacts and image quality issues... Who is to know?
I seriously doubt performance for the replaced shaders will ever be lower than the original shader's. For that to be true, there would have to be changes in the input which changed performance, which would require dynamic branching.
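For anyone wondering why that follows: without dynamic branching a shader executes the same instruction sequence for every input, so its cost per pixel is fixed. A toy C++ analogy (not real shader code, just the shape of the argument):

```cpp
#include <cmath>

// No branching: every invocation runs the same instructions,
// so the cost cannot depend on the input values.
float ShadeFixedCost(float x) {
    float a = std::sin(x);   // always executed
    float b = std::cos(x);   // always executed
    return a * a + b * b;
}

// With a data-dependent branch/loop, the amount of work varies with the
// input. Only then could a replacement be faster on the benchmarked
// inputs but slower elsewhere in the game.
float ShadeInputDependent(float x, int bounces) {
    float acc = x;
    for (int i = 0; i < bounces; ++i)  // input decides how much work runs
        acc = std::sin(acc);
    return acc;
}
```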

Ylandro said:
Again... why not give the code to the developers and let them patch the game? Maybe because they would see the problems associated with the replacement shaders? :devilish:
In general I agree with you; having the original developer make the changes is best.
Playing devil's advocate, I will bring up the following points:

  • The developer may not want to support the chip as a separate code path
  • The developer may not want to support separate quality levels
  • The developer may no longer be supporting the code/game
Ylandro said:
People really do play more than the 8 games out there that are being used for benchmarking. And they certainly play more levels than just the ones used in the timedemos...
Me said:
Note that this does not diminish the need for "Pure" benchmarks without the specific optimizations, because people do play the less popular games that will never get optimized.
 
Me said:
There was a thread about the Nature benchmark in 3DMark 2K1 gaining ~50% from shader replacement. That is the kind of performance gain that many people would consider making tradeoffs for. Also, with shader replacement it's fairly easy to judge the IQ tradeoffs for yourself.

And these are the kinds of algorithms that should never be used in games when a much faster one would suffice.

IHVs at this point should get in contact with the ISV and suggest that they use a less intensive algorithm.

App detection is not an elegant solution. What happens if the ISV patches their game? Say we have a situation where Algorithm X is in some program, and an IHV detects it and replaces it with Algorithm Y, which is 50% faster but decreases IQ by 10%. Then the ISV patches the game and replaces Algorithm X with Algorithm Z, which is 25% faster than Algorithm X but results in no IQ loss. The end user runs benchmarks, finds that they lost ~20% performance, and runs off to his favorite message board haunts yelling and screaming that the developers are idiots.
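To put rough numbers on that scenario: relative to Algorithm X, the driver's replacement Y runs at 1.50x and the patched Algorithm Z at 1.25x. Once the patch ships, the driver's detection no longer matches the shader it was looking for, so the user drops from 1.50x to 1.25x of the original speed: 1.25/1.50 ≈ 0.83, which is where that roughly 20% loss comes from, even though the game itself genuinely got faster.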

Likewise, every single IHV would have to waste time doing its own algorithm replacement, versus the ISV doing the work once via a patch (or, hopefully, during development).

Me said:
The developer may no longer be supporting the code/game

Chances are such games are already fast enough, or are buggy enough that they aren't worth playing.
 
Driver-level shader replacement by an IHV is never legitimate. Never. Ever. Period. They can talk to the developer or improve their compiler if they want better performance.
 
Right, this thread is annoying me.

1) I don't care what IHVs replace shaders with, as long as the output is the SAME!
2) Only when IHVs replace shaders with ones that DO NOT give the same output as the input shader is it a bad thing to do

I believe that JC said something similar about this issue.

To make an analogy: if you write C code and compile it with GCC, would you be cheating if you compiled it with MSVC instead? No, because they both produce the same result.

Face it, no IHV's hardware has a 1:1 mapping with the VS and PS opcodes. Just like AMD or Intel, they have microcode that the VS/PS code gets compiled to. Sometimes the driver's VS/PS->microcode compiler just won't produce the most optimised microcode, and for critical games that's not good enough. Therefore the relevant shaders get replaced with hand-optimised microcode shaders that perform much better than anything the game developer could ever manage on that hardware. This is a good thing.
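To make that last part concrete, here's a minimal sketch of how the detect-and-replace path is generally assumed to work; every name and detail below is hypothetical, since no vendor publishes its driver internals:

```cpp
#include <cstddef>
#include <cstdint>
#include <string>
#include <unordered_map>

using Microcode = std::string;  // stand-in for a real microcode blob

// Stand-in for the driver's normal VS/PS -> microcode compiler.
Microcode RunGenericCompiler(const std::uint8_t* code, std::size_t len) {
    return Microcode(reinterpret_cast<const char*>(code), len);  // placeholder
}

// Hand-tuned replacements, keyed on a hash of the app's shader bytecode.
static const std::unordered_map<std::uint64_t, Microcode> kHandTuned = {
    { 0x0123456789abcdefULL, "..." },  // placeholder entry
};

// FNV-1a 64-bit hash of the incoming shader bytecode.
std::uint64_t HashBytecode(const std::uint8_t* code, std::size_t len) {
    std::uint64_t h = 1469598103934665603ULL;
    for (std::size_t i = 0; i < len; ++i) {
        h ^= code[i];
        h *= 1099511628211ULL;
    }
    return h;
}

Microcode CompileShader(const std::uint8_t* code, std::size_t len) {
    // 1. If this exact shader is in the hand-tuned table, substitute it.
    auto it = kHandTuned.find(HashBytecode(code, len));
    if (it != kHandTuned.end())
        return it->second;
    // 2. Otherwise fall back to the generic compiler.
    return RunGenericCompiler(code, len);
}
```

Note the flip side, which ties back to the patching scenario earlier in the thread: change a single byte of the game's shader and the hash no longer matches, so the hand-tuned path silently stops firing.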
 