Killer-Kris
Regular
I was wondering, do GPUs have a carry bit much like CPUs do? If they did, could the drivers use this bit to detect whether or not they can legitimately use partial precision?
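To make that concrete, here's a rough toy sketch (Python/numpy only, with float16 standing in for the hardware's half format; overflows_at_half is a made-up name, not anything a driver actually exposes) of the sort of event a sticky carry/overflow-style flag would report:

[code]
import numpy as np

# Rough illustration only: numpy's float16 standing in for half precision.
# An overflow here is the kind of event a sticky "carry"/overflow flag
# could report back, telling the driver full precision was really needed.
def overflows_at_half(a, b):
    x = np.array([a], dtype=np.float16)
    y = np.array([b], dtype=np.float16)
    try:
        with np.errstate(over='raise'):
            x * y                      # stand-in for one shader instruction
        return False
    except FloatingPointError:
        return True

print(overflows_at_half(100.0, 100.0))  # 10000 fits in fp16       -> False
print(overflows_at_half(300.0, 300.0))  # 90000 > fp16 max (~65504) -> True
[/code]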
I'd imagine this might cause problems, or be difficult, if it were done on the fly. But perhaps if Nvidia created a tool for developers that profiled their shaders, textures, etc. and either optimized them automatically, or rather showed them where they could use PP, it might be of some use. It would highlight the areas that could legitimately use the lower precision (for the given source material).
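Something along these lines is what I have in mind (again just a toy sketch in Python/numpy; pp_is_safe and modulate are made-up names, and float16 stands in for the hardware half format): run the shader maths over the actual source art at both precisions and flag the cases where the difference would never show up in an 8-bit frame buffer.

[code]
import numpy as np

def pp_is_safe(texels, shader_fn, tolerance=1.0 / 255.0):
    # Run the same arithmetic at fp32 and fp16 and compare. If the worst
    # case difference stays below one 8-bit colour step, partial precision
    # would be indistinguishable for this shader + source art combination.
    full = shader_fn(texels.astype(np.float32))
    half = shader_fn(texels.astype(np.float16)).astype(np.float32)
    return float(np.max(np.abs(full - half))) < tolerance

def modulate(t):
    # Toy "shader": scale a texture sample and square it.
    return (t * t.dtype.type(0.8)) ** t.dtype.type(2.0)

# Hypothetical source art: an 8-bit texture normalised to [0, 1].
texture = np.random.randint(0, 256, size=(256, 256)) / 255.0
print(pp_is_safe(texture, modulate))
[/code]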
The only real problem I can come up with is: what happens if a mod developer uses the game's shaders but provides new source art which, when combined, requires higher precision? This would require the mod developer to run the profiler as well. That likely wouldn't be a big problem, but it's the only downside I could think of.
Anyone else have any thoughts on the matter?