SeeNoWeevil
Newcomer
I really can't get my head round this: why does every game insist on showing me a (usually awful) calibration image and a brightness/gamma slider? Why don't consoles just put a calibration image in the OS settings and prompt users to adjust their TV with it? Adjusting video levels in software seems pointless; get the screen right once and that's the end of it. Or am I wrong to assume developers are all sticking to the same standard for video output?
Another question: can the video levels setting (i.e. Full/Limited) give correct results for one game but incorrect results for another?
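For context, here's a rough sketch of what Full vs Limited levels mean numerically. This is just my own illustration of the standard 0-255 vs 16-235 ranges, not anything from a console's actual pipeline, and the function names are made up for the example:

```python
# Illustrative only: how "Full" (0-255) and "Limited" (16-235) RGB levels
# relate. The scale factors come from the standard video-range definition.

def full_to_limited(value):
    """Map a full-range (0-255) value into limited range (16-235)."""
    return round(16 + value * (235 - 16) / 255)

def limited_to_full(value):
    """Expand a limited-range (16-235) value back to full range (0-255),
    clamping anything outside the nominal range."""
    expanded = round((value - 16) * 255 / (235 - 16))
    return max(0, min(255, expanded))

# Matched settings round-trip cleanly:
print(limited_to_full(full_to_limited(0)))    # 0   -- black stays black
print(limited_to_full(full_to_limited(255)))  # 255 -- white stays white

# A mismatch applies the squeeze twice, so blacks get raised and
# contrast is lost -- the washed-out picture people complain about:
print(full_to_limited(full_to_limited(0)))    # ~30 instead of 16
```

So if the console, the game and the TV don't all agree on which range is in use, you end up with either raised blacks (double squeeze, as above) or crushed shadows/clipped whites (double expansion), which is part of why I'm asking whether one setting can ever suit one game and not another.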