I'm not talking static images, unless you don't actually play games?
Why does everyone always have to fall back on passive-aggressive language?
No it's not, it's all part of the same discussion.
I remember big strobing steps when playing on a CRT...
Yes they do; maybe pick one up and try it.
Everyone here has CRT experience from before LCDs and OLEDs were invented. Gaming was full of jaggies and shimmer (or worse, blurred textures, because apparently AF wasn't invented until 2010...) on HDTVs in the PS3 era. Aliasing was of least concern on low-quality CRTs, but then, they were blurry. CRT monitors with proper RGB input were crisp, and the jaggies clearly visible with plenty of crawl. That's why we had AA options. The IHVs added MSAA, advertising up to 16x on their big cards, and Quincunx, and even supersampling, to try and eliminate the aliasing. Articles were written showing different dotty sampling patterns and their results. The average was 2xMSAA, maybe 4xMSAA if you were lucky. MLAA was a revelation as it provided the edge clarity of 16x supersampling without the ridiculous cost. I remember commenting at the time that I wondered why no-one drew polygon edges with something like Wu's Algorithm.
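For what it's worth, the idea is simple enough to sketch. Below is a minimal, self-contained C sketch of Wu-style anti-aliased line drawing; the `fb` buffer and `plot()` hook are made up for illustration, and the fractional-endpoint handling of the full algorithm is omitted:

```c
#include <math.h>
#include <stdio.h>

#define W 64
#define H 32

/* Toy coverage buffer standing in for a framebuffer; plot() is a
 * hypothetical blend hook, not any real API. */
static float fb[H][W];

static void plot(int x, int y, float c)
{
    if (x >= 0 && x < W && y >= 0 && y < H && c > fb[y][x])
        fb[y][x] = c; /* keep the strongest coverage written so far */
}

static float fpart(float v)  { return v - floorf(v); }
static float rfpart(float v) { return 1.0f - fpart(v); }

/* Walk the major axis one pixel per step and split each step's
 * coverage between the two pixels straddling the ideal line, so the
 * edge fades instead of stepping. */
static void wu_line(float x0, float y0, float x1, float y1)
{
    float t;
    int steep = fabsf(y1 - y0) > fabsf(x1 - x0);
    if (steep)   { t = x0; x0 = y0; y0 = t; t = x1; x1 = y1; y1 = t; }
    if (x0 > x1) { t = x0; x0 = x1; x1 = t; t = y0; y0 = y1; y1 = t; }

    float gradient = (x1 == x0) ? 1.0f : (y1 - y0) / (x1 - x0);
    float y = y0;
    for (int x = (int)x0; x <= (int)x1; ++x, y += gradient) {
        int yi = (int)floorf(y);
        if (steep) { plot(yi, x, rfpart(y)); plot(yi + 1, x, fpart(y)); }
        else       { plot(x, yi, rfpart(y)); plot(x, yi + 1, fpart(y)); }
    }
}

int main(void)
{
    wu_line(2.0f, 3.0f, 60.0f, 28.0f);
    /* crude ASCII dump: '#' = mostly covered, '.' = partial */
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x)
            putchar(fb[y][x] > 0.66f ? '#' : fb[y][x] > 0.0f ? '.' : ' ');
        putchar('\n');
    }
    return 0;
}
```

The obvious catch, and presumably why nobody did it for polygon edges, is that blending coverage like this needs the background behind the edge already resolved, which sits awkwardly with a depth-buffered renderer drawing triangles in arbitrary order.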
So, we've all had the history, the experience, and the discussion in the age of CRTs. We all know aliasing is much reduced on CRTs in motion, but it's still there as step crawling. We experienced it! We had phrases like "those jaggies! They shred my eyes!!"
Please stick to technical facts and refrain from the 'anyone who doesn't share my experiences is ill-informed and/or stupid' position, as if we're teenagers on a reddit sub.
The original 2007 release with 4xMSAA+4xTrSSAA absolutely trashes the remake in terms of image quality.
Why have techniques like this fallen out of favour? If they look better and work well, you'd think they'd still be used. Notably, in competitive shooters clarity counts for a lot. But if you lose framerate on fancier AA, then 120 fps TAA might well be the better choice over 60 fps 4xMSAA+4xTrSSAA: at 120 fps you have ~8.3 ms per frame versus ~16.7 ms at 60, so the heavier AA has to be worth eating half the budget. So is it a conscious choice by the devs? Or is high-tier AA just too costly?
Rather than just complaining, B3D should be looking at the different results and costs of different rendering methods. Back in the day we had people posting loads of samples of different AA modes. Obviously 'what it looks like on a display' can't come with visual evidence, but relying on just that as the argument goes nowhere; plus it doesn't matter, as no-one's ever reverting to CRTs. We want the best AA in games now, on the TVs we have now. That should be the discussion.