Current Generation Games Analysis Technical Discussion [2024] [XBSX|S, PS5, PC]

TAA is not better than MSAA.
There are different qualities on which AA methods can be judged, not least of which is 'can the algorithm even be applied to your engine?' AFAIK MSAA does not play well with deferred rendering, which is the preferred technique in many engines, so it might not even be an option for some games. Furthermore, you need perhaps at least 4xMSAA to approach the jaggy reduction of TAA, which can be excellent, and even then you only get geometry AA from it and not surface AA, whereas TAA samples the whole scene.
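
To illustrate how TAA samples the whole scene, here's a minimal sketch of the core idea in Python (purely illustrative, not any engine's implementation, and it omits the motion-vector reprojection and history rejection a real implementation needs once the camera moves): jitter the sample position by a sub-pixel amount each frame and blend into a history buffer, so edges and shading alike accumulate many effective samples over time.

```python
# Minimal sketch of TAA-style temporal accumulation (illustrative only, not
# any engine's implementation; real TAA also needs motion vectors, history
# reprojection and rejection). The "scene" is a hard-edged circle, which
# aliases when point-sampled once per pixel. Jittering the sample position
# each frame and blending into a history buffer integrates many sub-pixel
# samples over time, for shading as well as geometry edges.
import numpy as np

W, H = 64, 64
rng = np.random.default_rng(0)

def scene(x, y):
    # Hypothetical scene: 1.0 inside a circle, 0.0 outside (a hard edge).
    return ((x - 32.0) ** 2 + (y - 32.0) ** 2 < 20.0 ** 2).astype(float)

def render(jitter_x, jitter_y):
    # Point-sample the scene once per pixel at a jittered sub-pixel offset.
    ys, xs = np.mgrid[0:H, 0:W]
    return scene(xs + 0.5 + jitter_x, ys + 0.5 + jitter_y)

history = render(0.0, 0.0)   # frame 0: ordinary aliased render, hard 0/1 steps
alpha = 0.1                  # blend weight of the newest frame
for _ in range(64):          # subsequent frames with random sub-pixel jitter
    jx, jy = rng.uniform(-0.5, 0.5, size=2)
    history = (1.0 - alpha) * history + alpha * render(jx, jy)

# Edge pixels now hold fractional coverage instead of hard 0/1 steps.
fractional = ((history > 0.05) & (history < 0.95)).sum()
print("edge pixels with fractional coverage:", fractional)
```

The characteristic TAA softness falls out of the same blend: the history buffer is effectively a low-pass filter over the last several frames.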

In short, all techniques have pros and cons. The best quality is supersampling, but the cost is prohibitive. I expect TAA is used because it's realistically the only option in a fair number of cases. What was the last AAA game to use MSAA? I feel it's been all but unheard of since the PS3 era!
 
Ignoring that 1080p on a CRT isn't actually 'sharper' than on an LCD (it's usually the opposite), and that it does nothing for the sort of image quality problems caused by 3d rendering at such a resolution.

CRT as a technology does plenty to aid resolution.

The solution is better-value GPUs so people don't have to play at 1080p. No magic required!

Why does the original 2007 release of Crysis with 4xMSAA running at native 1080p look crisper than the remake running at 4k with TAA?
 
In terms of what? OLEDs don't blur anything.

They have basically zero ghosting (near-instant colour transitions), but they do have motion blur because of sample-and-hold. MPRT (moving picture response time) lets you understand how much blur you will perceive. https://blurbusters.com/gtg-versus-mprt-frequently-asked-questions-about-display-pixel-response/

30 fps has double the blur of 60 fps, which has double the blur of 120 fps, and so on. The amount of blur below 120 fps on 120Hz displays is significant and effectively lowers resolution in motion. 30 fps is great for people who don't actually play games and want to pixel peep, but it's basically a travesty for gaming. It's why the resolution a game renders at is not a measure of visual clarity. People want native 4k at 30 fps for "sharpness", but the second you move the camera it becomes blurry. This is a physical reality that applies to all of us. There is not a single person who isn't affected by motion blur on sample-and-hold displays, because we all have eyes that move when tracking objects. People don't understand this, and it's a good reason why pixel peeping to count native resolution is not the whole story, or even useful on its own.
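
To put rough numbers on that doubling, here's a back-of-the-envelope model (my own illustration, assuming an ideal sample-and-hold display where persistence equals the full frame time, and a hypothetical 960 px/s panning speed in the ballpark of TestUFO's default): perceived smear is roughly eye-tracking speed multiplied by persistence.

```python
# Back-of-the-envelope eye-tracking blur on an ideal sample-and-hold display.
# Assumptions (mine): persistence equals the full frame time (no strobing),
# and the viewer's eyes smoothly track an object moving at PAN_SPEED.
PAN_SPEED_PX_PER_S = 960.0   # hypothetical panning speed, roughly TestUFO's default

def sample_and_hold_blur_px(fps, speed=PAN_SPEED_PX_PER_S):
    persistence_s = 1.0 / fps        # the frame is held for the whole refresh
    return speed * persistence_s     # smear width across the retina, in pixels

for fps in (30, 60, 120, 240):
    print(f"{fps:>3} fps -> ~{sample_and_hold_blur_px(fps):.0f} px of smear")
# 30 fps -> ~32 px, 60 fps -> ~16 px, 120 fps -> ~8 px, 240 fps -> ~4 px
```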

As you track moving objects on a screen with your eyes, your eyes are in different positions at the beginning versus the end of a refresh cycle.

On most 60Hz displays, a frame is continuously displayed for no less than 1/60sec. Your eye movements can “smear” the frame across your vision, creating motion blur.

You can greatly reduce the amount of blur with strobing or black frame insertion, but many people are sensitive to flickering. Ideally, until we get some brand new display technology, 120 fps on a 120Hz display with strobing is going to be the best reachable goal for minimizing flicker headaches while eliminating most motion blur. It's why research into frame generation is important. Even if games run at a base 60 fps, generating 120 fps will bring huge improvements in visual clarity and allow for strobing with less impact on eye fatigue, headaches, etc.


[Attached images: two TestUFO-style motion blur comparison diagrams]

The little alien images in those pictures above are representative of https://www.testufo.com
It is the easiest way to understand the difference in motion blur. It's especially illuminating if you have a 120Hz or higher display, and you can compare all the way down to 30Hz. There are visible differences between 120, 60 and 30 even when the image pans as slowly as 1 pixel per frame. Bump it up to around 4 pixels per frame and anything under 120Hz looks like a joke.
 
The above is why CRTs are still king; their resolution during motion is still leagues ahead of flat panel displays.

It's also why I keep saying 1080p is fine as a resolution; it's the blur of modern AA methods and the motion blur of modern flat panels that gives people the impression it's not good enough.


In motion, 1080p on a CRT looks like it's running at a much higher resolution than a game running on a 4k monitor, because of the lack of blur.

Resolution isn't really the problem, the displays are, and people are using higher resolutions to compensate for a panel's piss-poor ability to show a pixel transition without blurring the shit out of it.
 
Yes.

The little graphics you posted are identical. :???:

They have the same images, but the text is what's explanatory. A 60Hz display strobed for 1ms MPRT will have the same blur as a 1000Hz sample-and-hold display with 1ms MPRT.

The problem with 60Hz strobing is flicker, which carries its own issues. 120 fps on a 120Hz display with strobing eliminates a lot of the flicker effects for most people.
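
The arithmetic behind that equivalence, using the same rough model and assumed 960 px/s tracking speed as above: perceived smear depends on how long each frame is actually visible (the MPRT), not on the refresh rate itself.

```python
# Same rough model as before: perceived smear is tracking speed times the time
# each frame is actually lit (the MPRT), regardless of refresh rate.
SPEED_PX_PER_S = 960.0       # assumed eye-tracking speed, as in the earlier example

def blur_px(mprt_ms, speed=SPEED_PX_PER_S):
    return speed * mprt_ms / 1000.0

print(blur_px(1.0))          # 60Hz strobed to 1ms MPRT       -> ~1 px of smear
print(blur_px(1.0))          # 1000Hz sample-and-hold (1ms)   -> the same ~1 px
print(blur_px(1000.0 / 60))  # 60Hz sample-and-hold (~16.7ms) -> ~16 px
```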
 
CRT as a technology does plenty to aid resolution.



Why does the original 2007 release of Crysis with 4xMSAA running at native 1080p look crisper than the remake running at 4k with TAA?
CRT does nothing to make the image sharper than an LCD at the same resolution. If anything, LCDs tend to look sharper than a CRT.

CRTs have better motion clarity, but this is a different discussion.

And no, CRTs do absolutely nothing whatsoever about the sort of rendering issues that cause aliasing and a lack of defined fine and distant detail, which modern, complex, shader-heavy 3d rendering inherently comes with at a resolution of, say, 1080p.

As for how Crysis with 4xMSAA at 1080p looks crisper than modern games at 4k with TAA - it doesn't. And we all know full well that TAA is softer than MSAA all else being equal. This is a silly argument to bring up as it again has nothing to do with CRT vs LCD. And it ignores that modern games basically need TAA or some reconstruction alternative. CRTs do not solve the instability of high detail, complex shader graphics in motion. CRTs aren't magic. No display technology exists that can make a 3d game rendered at 1080p look like some stable, sharp and detailed high resolution image.
 
But it’s not a standard rendering resolution. The consoles are nowhere close and they’re the benchmark for mainstream gaming.
Again, I'm not suggesting everything should be native 4k to be next gen, just something perceptibly above 1080p.

1080p is simply old hat. We should be beyond it by now. It lacks the sheer pixel density to properly resolve all the small and distant detail in modern games, and to minimize aliasing.

And it should not take some $600+ modern GPU in 2024 to play games with the latest graphics (not even maxed) at >1080p/60fps. It's ridiculous.
 
CRT does nothing to make the image sharper than an LCD at the same resolution. If anything, LCDs tend to look sharper than a CRT.

I'm not talking static images, unless you don't actually play games?

CRTs have better motion clarity, but this is a different discussion.

No it's not, it's all part of the same discussion.

And no, CRTs do absolutely nothing whatsoever about the sort of rendering issues that cause aliasing and a lack of defined fine and distant detail, which modern, complex, shader-heavy 3d rendering inherently comes with at a resolution of, say, 1080p.

Yes they do, maybe try picking one up and trying it.

As for how Crysis with 4xMSAA at 1080p looks crisper than modern games at 4k with TAA - it doesn't.

It does, the remaster's TAA is very blurry, even at 4k.

The original 2007 release with 4xMSAA+4xTrSSAA absolutely trashes the remake in terms of image quality.

And we all know full well that TAA is softer than MSAA all else being equal.

*Blurrier.

This is a silly argument to bring up as it again has nothing to do with CRT vs LCD. And it ignores that modern games basically need TAA or some reconstruction alternative.

They rely on that because, on a flat panel, the output ideally needs to be 1:1 pixel mapped with the display.

That's not a problem on a CRT.

CRTs do not solve the instability of high detail, complex shader graphics in motion. CRTs aren't magic. No display technology exists that can make a 3d game rendered at 1080p look like some stable, sharp and detailed high resolution image.

You really should go and try a CRT.
 
I'm not talking static images, unless you don't actually play games?
Why does everyone always have to fall back on passive-aggressive language?
No it's not, it's all part of the same discussion.
I remember big, strobing stair-steps when playing on CRTs...
Yes they do, maybe try picking one up and trying it.
Everyone here has CRT experience from before LCDs and OLEDs were invented. Gaming was full of jaggies and shimmer (or worse, blurred textures, because apparently AF wasn't invented until 2010... ;)) on HDTVs in the PS3 era. Aliasing was of least concern on low-quality CRTs, but then they were blurry. CRT monitors with proper RGB input were crisp, and the jaggies were clearly visible, with plenty of crawl. That's why we had AA options. The IHVs added MSAA, advertising up to 16x on their big cards, and Quincunx, and even supersampling, to try to eliminate the aliasing. Articles were written showing the different dotty sampling patterns and their results. The average was 2xMSAA, maybe 4xMSAA if you were lucky. MLAA was a revelation as it provided the edge clarity of 16x supersampling without the ridiculous cost. I remember commenting at the time that I wondered why no-one drew polygon edges with something like Wu's algorithm.
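
For anyone who never ran into it, this is roughly what I meant; a simplified sketch of Wu's line algorithm in Python (endpoint weighting omitted, and the wu_line/plot names are just mine), showing the core idea of splitting an edge pixel's intensity between the two pixels the ideal line passes between.

```python
import math

def wu_line(plot, x0, y0, x1, y1):
    # Simplified Xiaolin Wu anti-aliased line (endpoint weighting omitted).
    # plot(x, y, coverage) is expected to write a pixel, coverage in [0, 1].
    steep = abs(y1 - y0) > abs(x1 - x0)
    if steep:                            # iterate along the major axis
        x0, y0, x1, y1 = y0, x0, y1, x1
    if x0 > x1:
        x0, x1 = x1, x0
        y0, y1 = y1, y0

    dx, dy = x1 - x0, y1 - y0
    gradient = dy / dx if dx else 0.0

    y = float(y0)
    for x in range(x0, x1 + 1):
        base = math.floor(y)
        frac = y - base                  # sub-pixel position of the ideal edge
        if steep:                        # split the intensity between the two
            plot(base, x, 1.0 - frac)    # pixels the ideal line passes between
            plot(base + 1, x, frac)
        else:
            plot(x, base, 1.0 - frac)
            plot(x, base + 1, frac)
        y += gradient

pixels = {}
def plot(x, y, coverage):
    pixels[(x, y)] = round(coverage, 2)

wu_line(plot, 0, 0, 10, 3)
print(pixels)   # fractional coverages along the edge, not a hard 0/1 staircase
```

Applied to polygon silhouettes, that's essentially coverage-based edge AA, which is more or less what MLAA and its successors later approximated from the finished image.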

So, we've all had the history and the experience and the discussion in the age of CRTs. We all know aliasing is much reduced on CRTs with motion but it's also still there with the step crawling. We experienced it! We had phrases like "those jaggies! They shred my eyes!!"

Please stick to talking technical facts and refrain from taking the 'anyone who doesn't have my experiences is ill-informed and/or stupid' position, as if we're teenagers on a reddit sub.

The original 2007 release with 4xMSAA+4xTrSSAA absolutely trashes the remake in terms of image quality.
Why have techniques like this fallen out of favour? If they look better and work well, you'd think they'd still be used. Notably, in competitive shooters clarity counts for a lot. If you lose framerate on fancier AA, then 120 fps TAA might well be the better choice over 60 fps 4xMSAA+4xTrSSAA? So is it a conscious choice by the devs? Or is high-tier AA just too costly?

Rather than just complaining, B3D should be looking at the different results and costs of different rendering methods. Back in the day we had people posting loads of samples of different AA images. Obviously the 'what it looks like on a display' part can't come with visual evidence, but relying on just that as the argument goes nowhere, plus it doesn't matter as no-one's going to be reverting to CRTs ever. We want the best AA in games now, on the TVs we have now. That should be the discussion.
 
And it should not take some $600+ modern GPU in 2024 to play games with the latest graphics (not even maxed) at >1080p/60fps. It's ridiculous.

I don’t think we can stipulate what it should cost to run the latest graphics because that is an undefined workload that varies significantly from game to game. Maybe reviewers should back off their obsession with ultra max settings.

What we can expect is that games should look good and run well on affordable cards and they do for the most part.
 