Render states and shaders are high-level abstractions, introduced 25-30 years ago by software frameworks (specifically OpenGL and PhotoRealistic RenderMan) for 'workstation' computers that had a few dozen megabytes of memory, a single CPU capable of a few MFLOPS, and a simple 'framebuffer' graphics card.
During the GPU revolution, abstractions were necessary to make life simpler for both developers and end-users: with many different GPU architectures on the market, proprietary 'direct-to-metal' APIs from 3dfx (Glide), Rendition (RRedline), and PowerVR (PowerSGL) were ultimately displaced by MiniGL / OpenGL.
But today the hardware landscape is quite different. As more fixed-function blocks are replaced and emulated with compute shaders - and as meshlet rasterization and programmable BVH ray tracing displace the traditional triangle rasterization pipeline - legacy state-management APIs and shader languages will be supplanted by parallel processing frameworks written in C/C++ and Python. IMHO this will happen overnight once AMD brings their ROCm compilers and runtime to the Windows platform, since ROCm's HIP layer is source-code compatible with CUDA.
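To give a sense of what that source-level compatibility means in practice, here is a minimal SAXPY sketch written against ROCm's HIP runtime (the file name and kernel are purely illustrative). The same source builds as CUDA by swapping the hip* prefixes for cuda* and the header for cuda_runtime.h - the mechanical translation that AMD's hipify tools automate:

Code:
// saxpy_hip.cpp - build with: hipcc saxpy_hip.cpp -o saxpy
#include <hip/hip_runtime.h>
#include <cstdio>
#include <vector>

// Same __global__ kernel syntax as CUDA.
__global__ void saxpy(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;
    std::vector<float> hx(n, 1.0f), hy(n, 2.0f);

    // hipMalloc/hipMemcpy mirror cudaMalloc/cudaMemcpy one-for-one.
    float *dx = nullptr, *dy = nullptr;
    hipMalloc(&dx, n * sizeof(float));
    hipMalloc(&dy, n * sizeof(float));
    hipMemcpy(dx, hx.data(), n * sizeof(float), hipMemcpyHostToDevice);
    hipMemcpy(dy, hy.data(), n * sizeof(float), hipMemcpyHostToDevice);

    // Triple-chevron launch syntax works under hipcc, just like nvcc.
    saxpy<<<(n + 255) / 256, 256>>>(n, 2.0f, dx, dy);

    hipMemcpy(hy.data(), dy, n * sizeof(float), hipMemcpyDeviceToHost);
    printf("y[0] = %f\n", hy[0]);  // expect 4.0 = 2*1 + 2

    hipFree(dx);
    hipFree(dy);
    return 0;
}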