Generally speaking, on most Samsungs sharpness set to 0 is the correct setting to avoid any edge enhancement or unnatural softening. You can use test patterns to confirm this on your set.
I think the issue here is that Samsung like to have some kind of noise reduction permanently enabled on their TVs - at least on most of their European models. Film grain and other high-frequency detail gets smoothed over as a result, which is a real pain if you can't get around the problem.
Try using Game Mode and then switching to Standard mode (i.e. using the Standard picture mode with Game Mode activated). This works as a fix for most of the 2010 - 2012 models, although it can create auto-dimming issues with the backlight on some sets / floating blacks on the plasmas.
The above works a treat on my C580 though, and the APL fluctuations aren't visible by eye either, only when using a meter. Scaling is just as good too when looking at the SMPTE and HQV test patterns.
Thanks for the input, KyoDash. I have taken some of the opinions in this thread into consideration and I have definitely switched to Game Mode, and now I am trying to find the sharpness setting I am most comfortable with. My Samsung TV is European as you mention, from 2008; it's relatively small for an HDTV (22"), and I have a larger one (46") in the living room, but I always play in my bedroom.
Alas, I don't have the same model as you - the C580 seems to be a more modern HDTV than mine - but activating Game Mode/Standard picture might produce similar results.
After some more experimenting I find sharpness to be a tricky feature.
I jacked it up the other day when playing Red Dead Redemption and it looked way better than the setting I was using (Sharpness at 20).
I find that everything looks better, more embossed, but fellow forumer TheWretched here mentioned halos and stuff, which might mean that Sharpness adds noise, and that by adding more sharpness you are altering the original picture coming from the console in weird ways too.
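To get my head around what TheWretched described, I put together a tiny numpy sketch of generic unsharp masking, which is roughly how these Sharpness controls tend to work (I have no idea what Samsung actually does internally, so the kernel and numbers here are made up):

```python
# Rough 1-D illustration of why edge sharpening creates halos.
# Generic unsharp masking on a simple dark-to-bright edge; the kernel
# and the "amount" are arbitrary, not Samsung's actual processing.
import numpy as np

signal = np.array([0.2] * 8 + [0.8] * 8)                    # a clean edge
blurred = np.convolve(signal, np.ones(3) / 3, mode="same")  # low-pass copy
amount = 1.5                                                # "Sharpness" strength
sharpened = signal + amount * (signal - blurred)            # boost the difference

print(np.round(sharpened[5:11], 2))
# [ 0.2  0.2 -0.1  1.1  0.8  0.8] -- undershoot/overshoot either side of the edge,
# which shows up on screen as the dark/bright halo around contours.
```

The overshoot can't add detail that isn't in the source; it just exaggerates edges that are already there, which would explain why fonts and contours start to glow at high settings.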
That must be especially true in the case of fonts, but games do look a bit crisper to me. Oddly enough, I can't recall exactly whether it was in the manual of the TV in my bedroom or the one in the living room, but it recommended not using the sharpness setting on HDMI inputs, which is the input I use.
Yesterday I did a little experiment decreasing Sharpness to 0, and at some point lowering it seems to soften the image rather than just remove noise.
You suggest putting it at 0, TheWretched mentioned that 50 can be the real 0 on some TVs, and the manual says you're not supposed to set it above 0 when using HDMI.
The problem is what the true "0" is versus what the TV *says* "0" is. Finding that out will take some time and effort, I think. Your C580 has its true "0" at 0, others seem to have it at 50, and maybe others have it at some other odd number.
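Since test patterns were mentioned, one rough way to hunt for the neutral point yourself is a homemade single-pixel line pattern - a minimal Pillow/numpy sketch below, nothing as good as the SMPTE/HQV discs. Displayed 1:1, softening blurs the lines into grey mush and edge enhancement puts bright/dark halos around them; the neutral setting is where neither happens. The 1680x1050 size is just my panel's resolution.

```python
# Generate a crude sharpness test pattern: 1-pixel black/white vertical lines
# on a mid-grey field. Requires numpy and Pillow (pip install numpy pillow).
import numpy as np
from PIL import Image

w, h = 1680, 1050                                 # set this to your panel's native res
pattern = np.full((h, w), 128, dtype=np.uint8)    # mid-grey background
cols = slice(w // 4, 3 * w // 4)                  # put the lines in the middle
rows = slice(h // 3, 2 * h // 3)
pattern[rows, cols] = np.where(np.arange(w)[cols] % 2 == 0, 0, 255)

Image.fromarray(pattern, mode="L").save("sharpness_pattern.png")
```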
I had read a while ago that a low Sharpness setting produces the best image quality with HD input content.
However, on my HDTV it doesn't seem that simple, because when I actually do this I have the feeling it makes the picture really soft, sometimes even out of focus.
Maybe sharpness was meant for CRT TVs, and doesn't work the same way, or doesn't make sense, on modern displays like LCDs, DLPs, OLEDs, etc.
My experience with it is that there has to be a value that balances the noise added by sharpness against the extra crispness; that should be the correct setting.
So in order to get it *right* I think the only way is to adjust it until I find something I like.
My interpretation is that for a non-CRT television, given how complex these sets are, perhaps there is no such thing as 'correct', only what looks good to you.
As I said, yesterday I set sharpness to 0 (Game Mode/Standard this time around). The flea-bitten spots on the Hungarian half (RDR) didn't vanish, unlike my perception the other day when I felt I wasn't discerning the fine detail well, though the contrast did seem to fall off noticeably.
But yesterday it looked good to me. I could make out the aliasing in some places, and I was happy with that, because that is how the image is meant to be displayed. The game doesn't render at 1080p with 4x AA but at 720p with 2x AA, so that's the game's original image quality.
Also, my TV's native resolution is 1680x1050 and it is set to Just Scan, so maybe the upscaling could add a little noise. Like joker454, I don't like an artificially enhanced picture at all if it adds lag, so being able to discern little flaws here and there was good news for me.
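Out of curiosity I also did the rough arithmetic on why my panel's scaler always has to interpolate - 1680x1050 isn't an integer multiple of either common console output mode, so even with Just Scan the pixels can't map 1:1 (these numbers are just my setup):

```python
# Scale factors from common console output modes to a 1680x1050 panel.
outputs = {"720p": (1280, 720), "1080p": (1920, 1080)}
panel = (1680, 1050)

for name, (w, h) in outputs.items():
    print(f"{name}: x{panel[0] / w:.3f} horizontal, x{panel[1] / h:.3f} vertical")
# 720p: x1.312 horizontal, x1.458 vertical   (upscaled)
# 1080p: x0.875 horizontal, x0.972 vertical  (downscaled)
```

Non-integer ratios like these mean every output pixel gets interpolated from several source pixels, which is probably where that slight extra softness or noise comes from.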
I also switched values: I reset it to 0, then stepped slowly up to 75, the default in Dynamic Mode (a mode I like, though the colours are oversaturated and a bit off!).
There was very little increase in detail, but noise started to become clearly apparent in some parts of the background.
As I dialed back to 0, disabling sharpening, the noise decreased, but I am trying to find out whether specific details in textures and stuff stay roughly constant, because I really dislike a blurred image.
I am thinking about keeping sharpness at 0 (I had it at 20 for almost a year), and I hope it will only take me a day to get used to it. I did notice it looks more uniform, though still a bit watercolour-like.
So here I am stuck with my Sharpness dilemma. In short, I switched to Game Mode, Standard and Sharpness 0.
Let's see how it goes.