We've had the simple and straightforward, yet flawed and limited, S3TC for a good long while now. While it fulfils its purpose, it isn't stellar from an image quality standpoint, nor in compression ratio.
Replacing it, however, isn't easily accomplished with the way the APIs are designed. Rather than doing what Microsoft did and deciding upon one compression standard, why didn't they introduce function calls to compress textures, either at runtime or when installing a game, using algorithms supplied by the graphics driver?
As we move along, more and more texture types appear that the S3TC algorithm is poorly equipped to handle: 3D textures, normal maps etc. ATi is now introducing their own "standard", just like 3dfx did before they went bust, and even though they have probably made it open it won't help much, because there is no caps bit in DX for either 3dfx's or ATi's compression format.
So why not decouple the algorithm completely from the API? The application could indicate what type of texture it is when it calls the compress function, or at least hint whether lossy compression is permitted; if it is some type of texture there is no ready-made definition for, the driver could do some autodetection to find out which algorithm works best for that particular texture. It would take more time, but today's CPUs are fast, and if compression was done at install time it would only have to be done once, or at least infrequently, say when the user switches graphics cards and the game has to recompress the textures because the old format is no longer supported.
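Just to make the idea concrete, here is a rough sketch of what such a driver-supplied interface could look like. None of these names exist in DX or OpenGL; the enums, structs and functions are all hypothetical, purely to illustrate the "type plus lossiness hint in, opaque driver-chosen format out" flow.

```cpp
#include <cstddef>

// Hypothetical texture classes the app could pass to the driver.
enum TextureClass {
    TEX_DIFFUSE,     // ordinary colour map
    TEX_NORMAL_MAP,  // needs a format that preserves vector precision
    TEX_VOLUME,      // 3D texture
    TEX_UNKNOWN      // no ready-made definition; let the driver auto-detect
};

// Hint about whether image quality may be traded for size.
enum CompressionHint {
    COMPRESS_LOSSY_OK,
    COMPRESS_LOSSLESS   // e.g. lookup tables that must stay bit-exact
};

// Opaque result: the driver picks whatever algorithm its hardware decodes
// best and tags the blob with a format id only it needs to understand.
struct CompressedTexture {
    void*        data;
    std::size_t  size;
    unsigned int driverFormatId;
};

// Called at install time (or on first run after a card swap).
CompressedTexture CompressTexture(const void* pixels,
                                  unsigned int width,
                                  unsigned int height,
                                  unsigned int depth,
                                  TextureClass type,
                                  CompressionHint hint);

// At load time the app just hands the blob back. If driverFormatId is no
// longer supported (new card, new driver), it recompresses from the source.
bool CreateTextureFromCompressed(const CompressedTexture& tex);
```

The point is that the game never names a specific compression scheme; it only describes the data, and the driver is free to use S3TC, 3dfx's or ATi's format, or something newer entirely.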
Surely this would be the smarter approach, rather than setting one or a few standards in stone and then being locked into an aging format that gains support only slowly, yet has to be supported for all time to come so that legacy apps continue to function?