These questions are for anyone with knowledge in this area. I am tired of wondering about it myself and decided to just ask.
I have noticed in many games you may net massive FPS gains by turning off shadows, or turning them to the lowest settings. Two examples that jump readily to mind are EQ II and Tomb Raider Legends. I can actually run EQ II at 1600x with 4xAA|16xAF and extreme quality if I degrade shadows to min. One EQ II dev has stated on the record, "All shadow work is done on the CPU in EQ II".
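For context (and I'm only guessing at EQ II's internals here), the expensive CPU shadow path in engines of that era was usually stencil shadow volumes: every frame, the CPU re-walks each shadow caster's edge list, finds the silhouette with respect to the light, and extrudes it into geometry that gets re-uploaded to the GPU. A minimal sketch of that per-frame work, with made-up Mesh/Edge structures, just to show why the cost scales with triangle count:

```cpp
// Hypothetical sketch of per-frame CPU shadow work in a stencil-shadow
// engine. The Mesh/Edge layout below is illustrative, not any real
// engine's data structures.
#include <vector>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Edge { int v0, v1;          // endpoint vertex indices
              int faceA, faceB; }; // the two triangles sharing this edge

struct Mesh {
    std::vector<Vec3> positions;   // object-space vertex positions
    std::vector<Vec3> faceNormals; // one normal per triangle
    std::vector<Vec3> faceCenters; // one representative point per triangle
    std::vector<Edge> edges;       // precomputed adjacency
};

// Extrude each silhouette edge away from a point light out to 'farDist'.
// This buffer is rebuilt on the CPU every frame the light or mesh moves,
// which is why shadow quality settings hit the CPU so hard.
std::vector<Vec3> buildShadowVolume(const Mesh& m, Vec3 light, float farDist)
{
    std::vector<Vec3> volume;
    for (const Edge& e : m.edges) {
        // A silhouette edge separates a light-facing triangle
        // from a back-facing one.
        bool aFacing = dot(m.faceNormals[e.faceA],
                           sub(light, m.faceCenters[e.faceA])) > 0.0f;
        bool bFacing = dot(m.faceNormals[e.faceB],
                           sub(light, m.faceCenters[e.faceB])) > 0.0f;
        if (aFacing == bFacing)
            continue;

        Vec3 p0 = m.positions[e.v0];
        Vec3 p1 = m.positions[e.v1];
        // Push the edge endpoints directly away from the light.
        Vec3 d0 = sub(p0, light), d1 = sub(p1, light);
        float l0 = std::sqrt(dot(d0, d0)), l1 = std::sqrt(dot(d1, d1));
        Vec3 q0 = { p0.x + d0.x / l0 * farDist,
                    p0.y + d0.y / l0 * farDist,
                    p0.z + d0.z / l0 * farDist };
        Vec3 q1 = { p1.x + d1.x / l1 * farDist,
                    p1.y + d1.y / l1 * farDist,
                    p1.z + d1.z / l1 * farDist };
        // Two triangles forming the extruded quad.
        volume.insert(volume.end(), { p0, p1, q1,  p0, q1, q0 });
    }
    return volume; // copied into a dynamic vertex buffer each frame
}
```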
I suspect many other games handle shadows the same way, alongside other CPU-side work like creature AI, skinning, and tangent-space transformations (a rough sketch of the skinning case follows below).
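Skinning is a similar story. Here is a rough sketch of a CPU skinning loop (the structure names are mine, not from any particular engine); note that the identical blend math fits comfortably in a Shader Model 2.0 vertex shader, which is exactly the kind of work that could be offloaded:

```cpp
// Hypothetical CPU skinning loop: every vertex of every animated
// character is blended against its bone matrices on the CPU each frame,
// then re-uploaded to the vertex buffer.
#include <vector>
#include <cstddef>

struct Vec3   { float x, y, z; };
struct Mat3x4 { float m[3][4]; }; // bone transform: rotation + translation

static Vec3 transform(const Mat3x4& b, Vec3 p) {
    return { b.m[0][0]*p.x + b.m[0][1]*p.y + b.m[0][2]*p.z + b.m[0][3],
             b.m[1][0]*p.x + b.m[1][1]*p.y + b.m[1][2]*p.z + b.m[1][3],
             b.m[2][0]*p.x + b.m[2][1]*p.y + b.m[2][2]*p.z + b.m[2][3] };
}

struct SkinnedVertex {
    Vec3  position;   // bind-pose position
    int   bone[4];    // influencing bone indices
    float weight[4];  // blend weights, summing to 1
};

// Runs once per vertex, per character, per frame on the CPU.
void skinVertices(const std::vector<SkinnedVertex>& in,
                  const std::vector<Mat3x4>& bones,
                  std::vector<Vec3>& out)
{
    out.resize(in.size());
    for (std::size_t i = 0; i < in.size(); ++i) {
        Vec3 p = {0, 0, 0};
        for (int j = 0; j < 4; ++j) {
            Vec3 t = transform(bones[in[i].bone[j]], in[i].position);
            p.x += t.x * in[i].weight[j];
            p.y += t.y * in[i].weight[j];
            p.z += t.z * in[i].weight[j];
        }
        out[i] = p; // re-uploaded to the GPU afterwards
    }
}
```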
The performance hit on my X2 4400+, even running at 2.84 GHz, is massive. I can gain a good 25+ FPS in both games I pointed out by reducing shadow work.
Now, after all this babbling, my question is: why is the choice so often made to do this work solely on the CPU, particularly when even the strongest chip out there can't put up playable frame rates at the highest, or near-highest, quality settings? Does poor coding also factor into this? Limitations of current D3D?
My uneducated guess is that these choices are made because most folks don't own flagship video cards, or cards capable of taking over the duty. It's frustrating when you have something like CrossFire at your disposal and it's put to poor use or crippled by an overtaxed CPU.
Any thoughts or enlightenment would be appreciated.