JF_Aidan_Pryde
Regular
As I understand it, shaders are descriptions of material properties - how light and surface interact. Normally they are written in HLSL, which at run time is compiled into assembly code for whatever graphics card is installed in the system. This is known as compiling shaders 'dynamically'. But there's also building shaders dynamically (on the fly), which I'm failing to understand.
Here's a quote from a developer who I can't name:
"Our solution is to procedurally make only the shaders that are needed for rendering the next frame. If a new material appears in the next frame, we make a shader for it, but no sooner than that. This way the code and content is easier to manage, and the rendering engine is more flexible."
The context to the above is that in normal game development, shaders are written by hand, and with thousands of materials to code for, this becomes impractical. The answer is to automate shader generation.
What I don't understand is - how is this possible? One still has to sit down and write out the light-to-surface interaction in one way or another. How can this just be generated at 'run time'? How is the automated process described above possible?
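My best guess at how this might work is that the engine doesn't invent new lighting math at run time - it stitches together prewritten snippets of shader code based on which features a material uses, then hands the resulting source string to the normal shader compiler. Here's a rough sketch of what I imagine (the function and feature names are purely made up, not from any real engine):

```python
# Hypothetical sketch: building HLSL-like pixel shader source at run time
# by concatenating prewritten snippets, one per material feature.
def generate_pixel_shader(features):
    """Return shader source for a material described by a set of feature flags."""
    lines = ["float4 main(PixelInput input) : SV_Target {",
             "    float4 color = float4(1, 1, 1, 1);"]
    if "diffuse_map" in features:
        lines.append("    color *= diffuseTex.Sample(samp, input.uv);")
    if "vertex_color" in features:
        lines.append("    color *= input.vertexColor;")
    if "alpha_test" in features:
        lines.append("    clip(color.a - 0.5);")
    lines.append("    return color;")
    lines.append("}")
    return "\n".join(lines)

# A material with a diffuse texture and alpha testing gets only those snippets:
src = generate_pixel_shader({"diffuse_map", "alpha_test"})
```

So a "new material appearing in the next frame" would just mean a feature combination the engine hasn't compiled yet - is that the idea?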