Digital Foundry Article Technical Discussion Archive [2015]

Imagine something like that today, but deciding if the pixels to the screen are lighted or not ;).
How do you calculate if they should be lit or not? And to what degree? That requires megamaths, and that's what GPUs are. There's very little room for coprocessors any more because the workloads aren't specialised and trivial. That's why we have unified shaders, for example. Also, any lighting hardware would be locked to a specific type of lighting, unless you make it programmable at which point it becomes a generic compute unit. I suppose a Global Illumination unit would make sense as pretty much all games can benefit, even toon rendered. Although any dedicated silicon can't be repurposed, which is why the Holy Grail is unified processing power and a single maths engine doing whatever you want, from 3D graphics to audio simulation.

One thing I have considered is actual 2D hardware. Currently 2D is handled in 3D with quads and it seems terribly wasteful. If I want custom characters, I have to write character pieces to the character texture, which is dog slow (at least in Unity!), or use a draw call for each piece of a character. In the 16-bit era I'd just copy arbitrary bits of memory for each part to the destination location. Blitting 2D has a lot of value still, I think, especially on mobile. I don't know if a 2D blitting engine could be written on the GPU nowadays?
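For anyone who never worked that way: a 2D blit is essentially just a rectangular memory copy (the Amiga blitter added masking and logic ops on top). A minimal CPU-side sketch in C, assuming 32-bit pixels with no clipping and no masking, just to show how simple the core operation is:

#include <stdint.h>
#include <string.h>

/* Copy a w x h block of 32-bit pixels from src to dst.
   Pitches are in pixels; overlapping regions are not handled. */
static void blit32(uint32_t *dst, int dst_pitch, int dx, int dy,
                   const uint32_t *src, int src_pitch, int sx, int sy,
                   int w, int h)
{
    for (int y = 0; y < h; ++y) {
        memcpy(dst + (size_t)(dy + y) * dst_pitch + dx,
               src + (size_t)(sy + y) * src_pitch + sx,
               (size_t)w * sizeof(uint32_t));
    }
}

The blitter did exactly this sort of thing (plus masks and minterms) in hardware while the CPU got on with something else; the nearest modern equivalent is a texture-to-texture copy rather than a textured quad and a draw call per character piece.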
 

I'm very surprised to hear GPUs don't have 2D blitting.
 
Copper was very clever. A co-processor synchronised to the display beam, so you could write code (a copper list, only four instructions: wait, move, skip and end) that poked values into the custom hardware depending on where the beam was.

A good sprite multiplexer routine could squeeze dozens of hardware sprites out of the theoretical eight, and using the copper you could get even more.

You could even paint the background colour ($dff180) at roughly four-horizontal-pixel precision, producing a display with effectively no colour limits and no display memory. I loved that thing!
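For reference, here's roughly what such a copper list looks like, built as data in C rather than the usual dc.w tables in assembler; the instruction encoding follows the hardware reference manual as best recalled, so treat it as a sketch rather than gospel:

#include <stdint.h>

#define COLOR00 0x0180   /* background colour register, offset from $DFF000 */

/* Build a copper list that WAITs for each scanline in turn, then MOVEs a new
   value into COLOR00.  Each copper instruction is two 16-bit words.
   Only handles beam lines below 256 for simplicity. */
static int build_gradient_copperlist(uint16_t *cl, int first_line, int lines)
{
    int n = 0;
    for (int i = 0; i < lines; ++i) {
        uint16_t vp = (uint16_t)(first_line + i);
        cl[n++] = (uint16_t)((vp << 8) | 0x01);   /* WAIT: vertical pos vp, hpos 0 */
        cl[n++] = 0xFFFE;                         /* compare-enable mask           */
        cl[n++] = COLOR00;                        /* MOVE into $DFF180             */
        cl[n++] = (uint16_t)(i & 0x0F);           /* 12-bit $RGB value: fade blue  */
    }
    cl[n++] = 0xFFFF;                             /* end of list: WAIT for an      */
    cl[n++] = 0xFFFE;                             /* impossible beam position      */
    return n;                                     /* number of words written       */
}

Point COP1LC at that buffer in chip RAM and the beam paints a gradient with zero bitplane DMA and zero CPU time per frame.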
Cut my teeth on that bad boy. I remember writing one particular bootstrap demo where the top half of the screen was mirrored in a rippling lake in the bottom half of the display; the ripple effect was created by adjusting the horizontal offset based on a sine wave table. Amazing what could be achieved "for free" with the blitter.
Another thing I found useful was the ability to tune and debug the assembler code by turning the background of the screen a different colour at various points during my frame processing, to see where the processing time was going... or how much I had left. Good days.
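That trick is basically one register poke per section of the frame; a minimal sketch in C (only meaningful on real hardware, and the colour choices are arbitrary):

#include <stdint.h>

/* Background colour register; writes take effect immediately, wherever the
   beam currently is, so each chunk of work shows up as a coloured band. */
#define BG_COLOUR (*(volatile uint16_t *)0xDFF180)

void frame(void)
{
    BG_COLOUR = 0x0F00;   /* red band   = game logic time               */
    /* ... update game logic ... */

    BG_COLOUR = 0x00F0;   /* green band = building the display          */
    /* ... set up blits, sprites, copper list ... */

    BG_COLOUR = 0x0000;   /* black      = whatever is left of the frame */
}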
 

This was so common that lots of people measured the cost of complicated routines in vertical scanlines :LOL:
 
New DF article on the GeoW beta. A good article overall. I just wonder, though... why do they have that penchant for motion blur? The game doesn't need motion blur at all.

http://www.eurogamer.net/articles/digitalfoundry-2015-hands-on-with-gears-of-war-ultimate-beta

None do. It's the new cool tech in this generation of consoles, I guess. If your game is not heavily motion blurred, it's not a proper AAA console game, because it'll get bashed by Digital Foundry-type outlets and videogame experts.
 
Exactly my thoughts; that's why motion blur should be removed, it's unnatural. I was rotating the camera in Kaer Morhen (TW3) yesterday and the trees in the distance that look beautiful when the camera isn't moving became messy and I couldn't see them well. I found the effect weird.

PC gamers who play on 144Hz displays with 1ms response times always want to get rid of ghosting and the annoying blur.

From a thread on LightBoost technology and how it works.

http://www.neogaf.com/forum/showthread.php?t=720113

Why use a high refresh rate monitor?

The importance of a high refresh rate monitor is something that is hard to describe, because it really is an interesting experience going from one to the other. Outside of subjective enjoyment level, there are some hard facts on why it is objectively better than using standard 60Hz displays.

Much is said about graphical fidelity to present a lifelike creation to fully immerse the player. I’m sure everyone has been told by Dennis how important this is. No matter how pretty those pixels are, you also need proper illusion of motion. Without this, you just have a pretty slideshow.

“But 60 fps/Hz is great, it’s so smooth”, you say. There’s also a lot of people who insist their 30 fps times are great too. But there’s still room for improvement. Though for some the transition from 60 to 120/144Hz might not be as significant as going from 30 to 60, for others it’s the difference between night and day.

Generally this split is between competitive gamers and those who enjoy more standard AAA, RPG, strategy, or indie games. The high speed and movement in competitive games does lend itself towards being more obviously smooth with a higher refresh rate. Don’t count yourself out, as even people who are bad at games like Sethos and Smokey will attest to an improved experience in most games. It’s even been known to reduce or eliminate frame tearing.

"The other major factor is motion blur. For those who started being extremely interested in games in the last console lifecycle, I can see the confusion on your face right now. Motion Blur is not a graphical improvement. It was designed to make a sub-par 30 fps experience feel smooth. What it does is muddy everything on screen, reducing graphical fidelity. Getting rid of that mess is the key to having a consistent level of clarity and richness."

Using everyone’s favorite blur tool, TestUFO, take a look at these differences.

[TestUFO comparison images: 60Hz (dVvSd0n.jpg), 120Hz (oC8tKwY.jpg), 120Hz w/ LightBoost (q2UnDhq.jpg)]


"Lightboost? What is this?

You should read this. But, the tl;dr version is that it strobes the backlight only when the LCD panel has fully refreshed, presenting a series of clear images rather than holding on to a refresh before the next occurs.
Click this to read more about why even OLEDs with a zero response time have motion blur, due to the sample-and-hold effect.

What this means is a return to CRT days when everything was crisp, without the downsides of the technology."
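The sample-and-hold argument comes down to simple arithmetic: if your eye tracks an object moving at v pixels per second and each frame stays lit for t seconds, the image smears across roughly v * t pixels on your retina. A back-of-the-envelope sketch in C (960 px/s is just an example tracking speed, and the ~1ms figure is an assumed LightBoost-style strobe length):

#include <stdio.h>

/* Perceived blur width for an eye-tracked object on a display:
   roughly (speed in px/s) * (time each frame stays lit in seconds). */
static double blur_px(double speed_px_per_s, double persistence_s)
{
    return speed_px_per_s * persistence_s;
}

int main(void)
{
    double v = 960.0;                                              /* px per second */
    printf("60Hz sample-and-hold : %4.1f px\n", blur_px(v, 1.0 / 60.0));
    printf("120Hz sample-and-hold: %4.1f px\n", blur_px(v, 1.0 / 120.0));
    printf("120Hz, ~1ms strobe   : %4.1f px\n", blur_px(v, 0.001));
    return 0;
}

Doubling the refresh rate halves the smear, while strobing the backlight cuts it to almost nothing, which is why the LightBoost image above looks the way it does.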
 
Good motion blur behaves a lot differently than ghosting. Ghosting doesn't make exceptions for what's on screen. Motion blur, and especially per-object motion blur, tries to mimic a camera or the way we see fast-moving objects. What is blurred depends on speed and distance. It is an effect; depending on how it is used, it can have good or bad results. It can also be used for gameplay reasons.
 
By your definition, ghosting is maybe what Halo Reach looked like on the X360 with its temporal AA. Motion blur, on the other hand, is an effect that was important back in the day, but I see it as a distraction nowadays. Most games are high speed and high resolution, and motion blur hides a lot of things; I think motion blur is up there with film grain as one of the worst effects. IMO, it takes away from the game. I wish games would let you turn it off. It could work for very, very fast-moving objects, like ammunition, but other than that, to my eyes it covers things in vaseline.
 
There is one important thing to note however, which is that film grain is not a real world phenomenon but motion blur is.
Meaning film grain only exists for things captured through a camera but your eyes....they experience motion blur in real life.
 
Motion blur as a post-process solution is destruction of information, let's be clear about that. But it's also a real phenomenon: objects do not jump between positions in real life. The problem with motion blur as a post-process, instead of the blur we experience in real life (yes, it happens on our retina as well as on film or a sensor), is that in real life you can "reduce" or nullify the motion blur on a specific object by tracking it with your eyes (if it moves at trackable speeds), whereas with a post-process solution there's no way to retain that information anymore.

Having said that, there are a few ground facts:
  • 60fps is not enough to depict natural motion above a certain (not very high) speed; it does not provide enough temporal samples for this.
  • Most people have 60Hz displays.
  • People with typical LCDs, especially without LightBoost, constantly suffer from ghosting. It's not the CRT days anymore.
  • Motion blur helps depict natural motion along a continuous path instead of "jumping" between positions, and reduces strobing artefacts.
I made a poor man's test by modifying an existing shader on Shadertoy, and shared it on GAF for a simple poll. The majority preferred 60fps + motion blur as the more pleasing depiction of motion. The test is also adaptable, I hope(!), for 120Hz and 144Hz monitors by modifying the #define MAXFPS 60 line to your liking; just make sure the actual fps matches the defined fps. (Btw, the test runs at a simulated 60Hz even on a 120Hz monitor.)
https://www.shadertoy.com/view/4lSGDy
Of course the test is only limited to the context of moving red balls :)
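Not that shader itself, but a tiny CPU-side sketch in C of the idea it demonstrates (motion blur as extra temporal samples per displayed frame); the ball size, speed and sample counts below are made up for illustration:

#include <math.h>
#include <stdio.h>

/* Is pixel (px,py) inside a disc of radius r centred at (cx,cy)? */
static double disc(double px, double py, double cx, double cy, double r)
{
    double dx = px - cx, dy = py - cy;
    return (dx * dx + dy * dy <= r * r) ? 1.0 : 0.0;
}

/* Coverage of a horizontally moving disc at one pixel, averaged over
   'samples' points in time within a frame of length dt starting at t0.
   One sample = a sharp-edged ball; many samples = a smeared trail. */
static double coverage(double px, double py, double t0, double dt,
                       double speed, double r, int samples)
{
    double sum = 0.0;
    for (int i = 0; i < samples; ++i) {
        double t  = t0 + dt * (i + 0.5) / samples;  /* sub-frame time        */
        double cx = speed * t;                      /* ball centre at time t */
        sum += disc(px, py, cx, 0.5, r);
    }
    return sum / samples;
}

int main(void)
{
    /* One 60fps frame; ball of radius 0.05 moving at 2 screen widths/second. */
    for (int x = 0; x <= 15; ++x) {
        double px = x / 100.0;
        printf("x=%.2f  1 sample: %.2f   16 samples: %.2f\n", px,
               coverage(px, 0.5, 0.0, 1.0 / 60.0, 2.0, 0.05, 1),
               coverage(px, 0.5, 0.0, 1.0 / 60.0, 2.0, 0.05, 16));
    }
    return 0;
}

The single-sample column jumps from 1 to 0 at a hard edge; the 16-sample column fades out gradually, which is the smeared leading/trailing edge you see in the Shadertoy version.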
 

Like you said, your test isn't comparable at all to what happens in a highly detailed, animated 1080p image in a 2015 videogame. It's only really pertinent for games like Pong or Tetris, or maybe the ball in Mario Tennis.

For a game like UC4, all those considerations lose most of their weight because there is so much high-frequency detail everywhere; it's no longer a bland circle in front of a white background. Other considerations like clarity, image integrity, image quality and immersion in the game become much more important in the opinion of many gamers.

The more detail you add to the image, the more motion blur (full-screen and per-object) has a disastrous perceived impact: the VASELINE effect.
 
It becomes a compromise between static visual fidelity and fluidity of motion. That's both subjective and varies by game. Please, please, please stop asserting that your preferences are the one true way. Accept that others of us, quite possibly the majority considering the devs themselves prefer this route, prefer smooth motion to pixel-perfect textures. Very much so at 30 fps, where crystal-clear, juddering scenery is far more damaging to immersion than moving objects looking like they're moving.
 
Like you said, your test isn't comparable at all to what happens in a highly detailed, animated 1080p image in a 2015 videogame. It's only really pertinent for games like Pong or Tetris, or maybe the ball in Mario Tennis.

It's definitely not as simple as that. How much fidelity do you need on a moving part of the image? Like Shifty says, there's a compromise there. When you click on the Shadertoy window in that example, the refresh rate is halved, so you get to see that even 15Hz with motion blur is more pleasing to look at than 30Hz without motion blur as far as motion fluidity is concerned.
 
It becomes a compromise between static visual fidelity and fluidity of motion. That's both subjective and varies by game. Please, please, please stop asserting that your preferences are the one true way. Accept that others of us, quite possibly the majority considering the devs themselves prefer this route, prefer smooth motion to pixel-perfect textures. Very much so at 30 fps, where crystal-clear, juddering scenery is far more damaging to immersion than moving objects looking like they're moving.
This is true.
A 30fps game without motion blur will have a lot of judder while panning the camera and between player animations: lots of skipping and nothing to mask those skips. It's only horrible when done like FF Type Zero. When done well, with enough samples per object, motion blur can be a thing of beauty, especially for 30fps games, and it goes a long way in smoothing out the transitions.

Also, Witcher 3 actually seems a bit strange in that it only uses motion blur for extremely far-off stuff. Take the backdrop of Novigrad from the top, or looking at the mountains from Kaer Morhen at the start of the game: these blur quite heavily while panning the camera, and you won't notice any judder or skipping at all. But you cannot see any hint of motion blur the rest of the time. For example, if you are in the city and rotate the camera you won't see any motion blur, but you will see the judder and skips. I also do not notice any per-object motion blur on Geralt, other characters or objects in the world.

So actually using Witcher 3 to show why motion blur is horrible is a bad example because the game rarely even uses it.

*This is all based on my PS4 playthrough, of course.
 
There is one important thing to note however, which is that film grain is not a real world phenomenon but motion blur is.
Meaning film grain only exists for things captured through a camera but your eyes....they experience motion blur in real life.
I think that it's all about what @Nesh said then, if used well.

Motion blur is maybe a good idea for specific situations in racing games, but other than that, really, does anyone run fast enough to make the images around them look blurry? :)

F-Zero GX has no motion blur, but the tails of the vehicles can have some kind of motion blur that is actually nice and coherent. I've seen this in real life: at high speed the air over your face makes things slightly blurry, but only because of the atmosphere.
 