Reading the post 'Brent latest review' I checked out the article http://www.hardocp.com/article.html?art=NTk5, and I started to wonder about the lighting artefacts the GeForce cards show in Far Cry.
A simple question: Why do they exist?
I mean... this generation of video cards has been around for more than a year now. Why do we still see this kind of problem?
Is it something in the game itself? Then why isn't the ATI card affected? And why does it show up in Need for Speed as well? So you'd think it's the drivers... But shouldn't the drivers be mature by now? And why are only these two games affected? Are games so completely different from each other that they always stumble upon a new driver bug?
I don't see this kind of behaviour with CPUs. It's not like a game suddenly acts up when you run it on an AMD instead of an Intel. Why does it happen with video cards? Do video cards have that much room for their own 'interpretation' of what they should do?
Or is it that the video card manufacturers are so busy 'optimizing' performance by second-guessing the game developers that they introduce artefacts themselves?
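To make my suspicion concrete: I imagine something like the following going on inside the driver. This is a purely hypothetical sketch in C; none of these function names are real driver internals, and I have no idea whether any driver actually works this way.

/* Hypothetical: a driver recognizing a specific game and swapping
 * in its own 'optimized' shader behind the game's back. */
#include <stdio.h>
#include <string.h>

/* Stand-in stubs; a real driver would have its own machinery here. */
static const char *running_exe(void) { return "FarCry.exe"; }
static void use_shader(const char *desc) { printf("using: %s\n", desc); }

static void driver_load_shader(const char *game_shader)
{
    /* The driver matches on the executable name... */
    if (strcmp(running_exe(), "FarCry.exe") == 0) {
        /* ...and silently substitutes a hand-tuned shader. If that
         * replacement cuts precision or drops a lighting term to
         * gain speed, the game shows artefacts it never asked for. */
        use_shader("hand-tuned replacement (lower precision)");
        return;
    }
    /* Every other game gets exactly the shader it supplied. */
    use_shader(game_shader);
}

int main(void)
{
    driver_load_shader("the game's own lighting shader");
    return 0;
}

If something like that is going on, it would also explain why only certain games are affected: only the games the driver specifically recognizes get the substituted code path.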
Anyone care to shed some light on the matter?
NB: before people think I'm attacking NVIDIA specifically: it's not just Far Cry. You see this kind of problem very often, with both NVIDIA and ATI cards. I'm just using this case as an example.