Mar 2, 2004
"ATI's chips process all texture data in a 24-bit floating-point format, while the nVidia GeForce FX line processes data in either 16-bit or 32-bit floating-point. This difference has created some headaches for software developers. As a result, software written for ATI cards often runs slower on nVidia hardware."
Damn that ATI for sabotaging NV cards.
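(For what it's worth, the precision gap the article describes is easy to see in code. NumPy has no 24-bit float, so fp16 and fp32 bracket ATI's format here; this is just an illustration of why fewer mantissa bits mean more rounding error, not of anything specific to either vendor's hardware.)

```python
import numpy as np

# Round the same value at half (fp16) and single (fp32) precision.
# fp16 has a 10-bit mantissa, fp32 a 23-bit one, so fp16 rounds
# much more coarsely -- the kind of difference that shows up when
# a shader written for one precision runs at another.
x = 1.0 / 3.0
half = np.float16(x)
single = np.float32(x)

err_half = abs(float(half) - x)
err_single = abs(float(single) - x)
print(err_half, err_single)        # fp16 error is orders of magnitude larger
print(err_half > err_single)       # True
```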
Commenting on NV's compiler drivers:
"Additionally, tweaks like these can be somewhat fragile. If a game is patched, or a mod created for a game, it might not work properly with this type of on-the-fly optimization. The result may be incorrectly rendered images. This would require the release of a new driver to patch the problem. There have been reports of compatibility issues with a few game modifications."
Really, I missed those reports. What are they? (Not saying they don't exist; I just hadn't heard of any.)
"ATI's chips process all texture data in a 24-bit floating-point format, while the nVidia GeForce FX line processes data in either 16-bit or 32-bit floating-point. This difference has created some headaches for software developers. As a result, software written for ATI cards often runs slower on nVidia hardware."
Damn that ATI for sabotaging NV cards.
Commenting on NV's compiler drivers:
"Additionally, tweaks like these can be somewhat fragile. If a game is patched, or a mod created for a game, it might not work properly with this type of on-the-fly optimization. The result may be incorrectly rendered images. This would require the release of a new driver to patch the problem. There have been reports of compatibility issues with a few game modifications."
Really, I missed those reports --what are they? (Not saying they don't exist; just hadn't heard).