Shimmering lines, common in menus and HUDs: what's the term for this artifact? And is it possible to "fix" it with post-processing?

No. It just means the devs had higher priorities than fixing some pixel crawling in their GUI...
That only exists when animated! UIs that are blurry 100% of the time, even when static, are likely less desirable to most users than softer UIs that don't shimmer when scrolled smoothly.
 
I'm not going to argue over the definition of "bug."
You can't really, because it's well defined. ;) A bug is code (or hardware) that doesn't work as it should. It's incorrect, and counter to technical discussion, to use the term 'bug' for an unwanted effect that isn't caused by defective code. The rendering here is working correctly, but the maths causes unwanted visuals. Geometry aliasing is unwanted but not a bug; moiré patterns are unwanted but not a bug; pixelated textures are unwanted but not a bug; blurry textures are unwanted but not a bug.

Well, it could be a bug. They may have intended a pixel-aligned scroll and not applied the intended change, or their math library could be screwing up the rounding, or some other unlikely fringe possibility. But shimmering is not in-and-of-itself a bug.

So when anyone mentions sacrificing "sharpness," you have to realize there's a good chance this "sharpness" you're trying to preserve is incorrect in terms of DSP. It should have been filtered away but never was, so although you may subjectively "prefer" the sharpness and resultant shimmer, the signal you're preferring is technically invalid.
Human perception doesn't necessarily follow the mathematical purities of signal processing. It's notoriously subjective: you will have people more tolerant of soft UIs with no shimmering ever, and others who'd rather have crisp UIs with the occasional shimmer. Most users probably don't care one way or the other, and neither particularly offends them! To say one or the other is wrong is like trying to argue red or blue is mathematically better. The qualitative domain here isn't signal processing but human psychology.

But no-one is actually arguing in favour of this artefact so I don't really know why you are raising this point that it's mathematically wrong. The only points here are what it is and the limitations in rendering that can cause it.

The Gloomhaven example, if I remember correctly, still exhibits the shimmer even when rendering at 480p, which is further evidence the developers screwed up, AKA made a mistake, AKA it's a bug.
Not a bug but a poor choice or compromise (in your opinion. One we probably all agree with but we don't know what the alternative was and what it would have cost to correct).

Let's not start arguing semantics. Are you understanding the issues better now that some have weighed in with their perspective, and are you able to find a solution? Do you appreciate why I'm saying a post-processing solution doesn't exist, and do you have a response, agreement or correction, that furthers the discussion in a positive way?
 
I'm not going to argue over the definition of "bug." It's obviously undesirable, as I defined in the first post. If you think that example is desirable, your system of values is incompatible with a logical discussion.

In the example, they did it correctly for some textures, but not others. Even more evidence.

If you know anything about digital signal processing, like from the audio domain, you know there are mirror images, AKA "aliases," of the signal beyond the Nyquist frequency. The same problems occur in video signals; it's just that video people are generally forced not to care due to performance, or they hang their hats on anti-aliasing compromises, because most AA techniques aren't really sound in terms of digital signal processing.
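To make the audio analogy concrete, here's a small sketch (my own illustration, not from any game or library) showing the fold-around-Nyquist behaviour: a 600 Hz sine sampled at 1000 Hz (Nyquist = 500 Hz) shows up in the spectrum as its 400 Hz mirror image.

```python
# Demo: a tone above Nyquist aliases to its mirror image below Nyquist.
import numpy as np

fs = 1000                       # sample rate in Hz
f = 600                         # tone frequency, above Nyquist (fs / 2 = 500)
n = 1000                        # one second of samples -> 1 Hz bin spacing
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f * t)   # sampled without any band-limiting filter

spectrum = np.abs(np.fft.rfft(x))
peak_hz = np.argmax(spectrum) * fs / n
print(peak_hz)                  # 400.0 -- the alias of 600 Hz, folded about 500
```

The spatial version is the same maths: detail finer than the pixel grid can represent folds back as visible patterns, and in motion those patterns crawl.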

It's like saying you prefer car wheels spinning the wrong direction in video (the wagon-wheel effect, which is itself temporal aliasing).

So when anyone mentions sacrificing "sharpness," you have to realize there's a good chance this "sharpness" you're trying to preserve is incorrect in terms of DSP. It should have been filtered away but never was, so although you may subjectively "prefer" the sharpness and resultant shimmer, the signal you're preferring is technically invalid.

As an extreme example, I might consider 480p upscaled to 1080p with standard bilinear as preferable to "sharpness." Obviously I'd much rather preserve as much detail as possible by rendering at a higher resolution (again "sample rate" if thinking about audio), but not at the expense of shimmering 2D elements.

A little "blur" is probably more correct.
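A quick toy example of why "a little blur" is the correct result (a sketch I'm adding, assuming a 1-D row of pixels at the display's highest representable frequency): downscale an alternating on/off pattern by 2. Point sampling aliases it into a solid bar; averaging first (a box prefilter) gives a uniform grey, which looks "blurry" but is the band-limited truth.

```python
# Downscale a maximally high-frequency pattern by 2, two ways.
import numpy as np

row = np.tile([1.0, 0.0], 8)           # 16 px of alternating on/off pixels

decimated = row[::2]                   # point sampling: keep every 2nd pixel
averaged = row.reshape(-1, 2).mean(1)  # box prefilter, then decimate

print(decimated)   # all 1.0 -- the pattern aliases into a solid white bar
print(averaged)    # all 0.5 -- uniform grey: "blurry" but correct
```

The decimated version is also unstable: shift the pattern by one pixel and it flips to solid black, which is exactly the crawl/shimmer under motion.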

The Gloomhaven example, if I remember correctly, still exhibits the shimmer even when rendering at 480p, which is further evidence the developers screwed up, AKA made a mistake, AKA it's a bug.

If some of you are developers and that offends you somehow, good. Maybe you'll improve your skills and not do it next time. At least try to use 2 pixels for your skin thickness.
So to be clear, this is unfairly offensive to people trying to assist you here. Solutions and explanations have been offered as to why the graphics appear the way they do.

They were not responsible for games made decades ago, and the kinds of choices made then are completely unknown, but regardless I think an important part of this discussion is recognizing that people are trying to help you, and you're coming across with a fairly aggressive tone here. If what is coded matched the intended goal, even with acceptable graphical compromises, that's not a bug. That's deliberate.

I don’t see why any of your behaviour here is justified. No one here is obligated to assist you, so I would ask that you tone your language way down.

This is not the type of community we are trying to build here. We would like most of our users to be contributors here whether they are just beginning their hobby in graphics or full fledged developers, and I can see several trying their best at meeting your goals, but they deserve none of your attitude.
 
Way smarter graphics people in here than I; would one of the modern AA models which consider motion vectors of the underlying geometry (DLSS / DLAA / FSR) help here? My understanding is that the underlying render engine must expose motion vectors for the AA model to consume and project, and of course I'm making an assumption that the menu elements in question are their own geometry and would generate motion vectors. Would one of the AA models then understand the box is moving, and thus infer the necessary pixel-level gradients as the geometry moves?

Did any of what I just typed make sense? :D

I do understand that a temporally-based AA would also probably solve this too, at the expense of some degraded IQ in the form of a slight motion blur. Would FXAA solve this too though, if it's just "popping" from one pixel to another? I know FXAA is only for geometry edges, but I have it in my dumb head it was more for stairstepping artifacts.
 
@Albuquerque , yes, what you typed makes sense. In theory temporal AA could make this look better, but that particular game's TAA doesn't, and/or the graphics settings in that game are questionably implemented. Instead of temporal AA making only motion "blurry," I'm saying the ideal is that the entire approach to graphics should change so it never outputs an invalid high-frequency signal in the first place. Therefore the entire image would look "blurry" (it's not blurry, it's band-limited, and if it's done right, it's imperceptible at standard viewing distances, which are already scientifically determined).
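To show what "band-limited, not blurry" means for the scrolling case specifically, here's a toy sketch (my own illustration; `point_sample` and `area_sample` are hypothetical names, not any engine's API). A 1-pixel line is scrolled by sub-pixel offsets: with nearest-neighbour rasterization it snaps from one pixel to the next (the shimmer), while coverage-weighted rasterization splits its intensity smoothly across two pixels.

```python
# Rasterize a 1-px-wide line at sub-pixel scroll offsets, two ways.
import numpy as np

def point_sample(offset, width=8):
    # Nearest-neighbour: the line snaps to whichever pixel is closest,
    # so it "pops" between pixels as the offset crosses a half-pixel.
    px = np.zeros(width)
    px[int(round(offset)) % width] = 1.0
    return px

def area_sample(offset, width=8):
    # Coverage (box prefilter): each pixel gets the fraction of the
    # line's 1-px extent overlapping it, so intensity moves smoothly.
    px = np.zeros(width)
    lo = int(np.floor(offset)) % width
    frac = offset - np.floor(offset)
    px[lo] = 1.0 - frac
    px[(lo + 1) % width] = frac
    return px

for off in (2.0, 2.25, 2.75, 3.0):
    print(off, point_sample(off), area_sample(off))
```

The area-sampled line looks marginally softer in any single frame, but animates without crawling; the point-sampled one is "sharp" in stills and shimmers in motion.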

But yes, motion vectors could be part of a solution, probably.

FXAA won't solve it, because FXAA is purely a post-process on the finished image. All it has to work with is a static screenshot, so it has no knowledge of the high-frequency content that was already aliased away at the edge.
 
You still haven't said what exactly you are trying to do. ;)

You're wanting to mod games? How far back - is this mainly for retro, or a mod for current games? Or is this not even what you're intending?
 
I took it to be that he's a gamer who wants to get rid of shimmering and is asking if there is some sort of post-processing tool (like ReShade, GeDoSaTo or dgVoodoo2) that will achieve that.
 