The Great Simulated Optics Debate *spawn

I mean: Silent_Budda, hold your hand in front of your face and wave it around.
What you see is high-quality, retina-grade, velocity-based object motion blur. (Unless you are waving your hand in front of your old 50 Hz CRT monitor.)

<snip>

If you have access to a children's playground, Silent_Budda, go sit on a carousel and have some children spin it while you stare outwards.
Is the background perfectly sharp? Or might it have.... blur applied to it?
I am guessing the latter.

Uh, did you bother to read anything I actually typed? Or more importantly even the very first paragraph?

However, if you track your hand's motions your hand will remain perfectly in focus. The background, of course, will then be a blurry mess due to a combination of relative motion and being on a different focal plane. Or if you focus on the background, then your hand is a blurry mess for the same reasons.

You can't track your hand's motions. At least I can't. Even with a fairly gentle wave against a window I cannot track the fingers, and they blur. I have to slow the wave right down to a slow rotation to be able to fix upon a finger.

Perhaps I'm a bit of an anomaly then. Similar, but not the same, to how I'm also so reliant on peripheral vision and motion cues compared to most everyone I've met. But I don't have any trouble tracking my hand, although there is a bit of blurring on the quick transition from the hand going in one direction to going in another. Maybe my eyes/mind process information differently. /shrug.

Regards,
SB
 
Maybe my eyes/mind process information differently.

No they don't. You're perfectly normal, and your response to nightshade was entirely to the point. A fundamental problem with motion blur in games is that the game engine doesn't know what your eyes are following, and when the assumption is wrong, the result is very unnatural. And this problem remains regardless of how sophisticated the blur algorithm is in and of itself.

L. Scofield said:
One correction: DOF is an optical effect, not a perceptual effect, meaning your brain isn't responsible for it, but the lenses in your eyes are. The screens in a VR headset are at a fixed distance, so everything on the screen is at the same focus; therefore you need post-processing to create the DOF effect.
DOF in VR is a red herring. For a normal person, everything they look at is in focus, because wherever they look, that's where the focus will be, and the brain disregards the actual small focussing process. We have to actively try to disengage this system in order to be aware of the DOF of our eyes at all, such as focusing on a finger just in front of our eyes and then mentally disengaging from the focus point and extending our awareness to the areas behind, while trying not to refocus. But that is a highly unnatural process, even if possible.

I should know, because I no longer have normally functioning vision. I have had cataract surgery on both my eyes, having had my lenses replaced by fixed plastic ones, which means that I cannot refocus my eyes at all. Even so, everything from one meter on out is perceived to be "in focus", since that is what the combined system of focal length, aperture (pupil size) and resolution of the retina adds up to.

All of those blurred backgrounds are simply wrong if you are trying to model human vision, so for VR, which casts you into a virtually simulated reality, having everything in focus is actually correct. That is how humans perceive their surroundings.
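For what it's worth, the "everything from one meter on out" observation is consistent with a simple hyperfocal-distance estimate. The numbers below (eye focal length, pupil diameter, acceptable blur circle on the retina) are rough assumptions for illustration only, not measurements, and the result is very sensitive to the blur-circle choice:

```python
import math

def hyperfocal_mm(focal_mm, aperture_mm, coc_mm):
    """Hyperfocal distance: a lens focused here renders everything
    from half this distance to infinity acceptably sharp."""
    n = focal_mm / aperture_mm          # effective f-number of the eye
    return focal_mm ** 2 / (n * coc_mm) + focal_mm

# Assumed round numbers: focal length ~17 mm, daylight pupil ~3 mm,
# circle of confusion ~0.02 mm on the retina (roughly arcminute acuity).
h = hyperfocal_mm(17.0, 3.0, 0.02)
print(f"hyperfocal ~ {h / 1000:.1f} m, sharp from ~ {h / 2000:.1f} m to infinity")
```

With these assumptions the sharp zone starts a bit over a meter out, which matches the post-surgery experience described above.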
 
Perhaps I'm a bit of an anomaly then. Similar, but not the same, to how I'm also so reliant on peripheral vision and motion cues compared to most everyone I've met. But I don't have any trouble tracking my hand, although there is a bit of blurring on the quick transition from the hand going in one direction to going in another. Maybe my eyes/mind process information differently. /shrug.
A bit of Googling throws up this Sports Science article on vision.

Article said:
When there is slow relative movement between an observer and an object, the eyes can smoothly move together following the object until visual angular velocities reach 40 to 70 degrees per second (Bahill & LaRitz, 1984; Ripoll & Fleurance, 1988). In observing human movement, this translates to surprisingly slow movements, like a person walking (3 mph) slowly past an observer six feet away...

Most movements in sport require saccadic eye movements in order to observe parts of the action. Volleyball requires visual angular velocities in excess of 500 degrees per second to track the trajectory of a spiked ball (Ridgway & Kluka, 1987). Saccades can reposition eyes at angular velocities exceeding 700 degrees per second (Carpenter, 1988), but the eyes are essentially turning off as they saccade to the next fixation (Campbell & Wurtz, 1978).
Depending on the distance to your hand, how far you wave it, and whether you're in the 40 or the 70 degrees per second category, a waving hand can certainly exceed the physical limits. I wave probably 120 degrees side to side in this test. My middle finger traverses some 13 cm at a distance of ~40 cm. That's an angular movement of about 20 degrees per half-wave. 3 leisurely waves per second is 120 degrees per second, so you're already stretching things there. A more typical (for me) vigorous hand wave of 6 side-to-side oscillations (12 × 13 cm travelled) is way beyond my natural ability to track. Instead of trying, the eyes focus to the distance of the hand and let it blur. If I just gently rotate the hand side to side, and not very far, I can certainly track the fingers.
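The arithmetic above is easy to check. This little sketch (function name and parameters are mine, just for illustration) converts a hand wave into an angular speed at the eye:

```python
import math

def wave_angular_speed_deg(sweep_cm, distance_cm, half_waves_per_s):
    """Angular speed of a waving hand as seen by the eye, in deg/s."""
    # Angle subtended by one half-wave sweep at the given distance
    half_wave_deg = math.degrees(2 * math.atan((sweep_cm / 2) / distance_cm))
    return half_wave_deg * half_waves_per_s

# 13 cm sweep at 40 cm is ~18.5 degrees per half-wave; 3 waves/s means
# 6 half-waves/s, i.e. roughly 110-120 deg/s, already past the quoted
# 40-70 deg/s smooth-pursuit limit.
print(wave_angular_speed_deg(13, 40, 6))
```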

I guess you're just not manly enough in your hand waving to exceed your eyes' limitations! ;)
 
DOF in VR is a red herring. For a normal person, everything they look at is in focus
Everything else is out of focus if not at the same distance (cover one eye, focus on something close, and the optical effect of DOF blur is very apparent even if we ignore it in day-to-day vision), but in VR it'll be double-imaged rather than blurred. For correct, eye-like vision, VR would need to add eye tracking, determine what distance you're focussing at, and apply subtle DOF blur to everything else. Although peripheral vision limits may make that mostly redundant. But VR is still technically wrong, as L. Scofield says. If you close one eye wearing a VR headset and focus on a close virtual finger, the background will still be sharply rendered instead of blurred out as it would be in real life.
 
Everything else is out of focus if not at the same distance (cover one eye, focus on something close, and the optical effect of DOF blur is very apparent even if we ignore it in day-to-day vision), but in VR it'll be double-imaged rather than blurred. For correct, eye-like vision, VR would need to add eye tracking, determine what distance you're focussing at, and apply subtle DOF blur to everything else. Although peripheral vision limits may make that mostly redundant. But VR is still technically wrong, as L. Scofield says. If you close one eye wearing a VR headset and focus on a close virtual finger, the background will still be sharply rendered instead of blurred out as it would be in real life.

True, but my point is that even when we focus on something very close to our bodies (which is unusual in games), we do so because that is where our attention is - ergo, whether the background is, or is not blurred when peering at something close range is not something you would normally perceive without actively trying. The moment you would take an interest in the background it would be in focus again.

(When I had my surgery I thought not being able to focus at all would be horrible. It turned out it hardly mattered.
The natural state for the eye is that the muscle that contracts when you focus is relaxed, and for a normally sighted person this puts focus at "infinity". However, sharpness extends quite close to the body even then, as I said, to a meter or so, and in good light even closer, so you don't actually have to refocus at all as you navigate your daily environment. Apart from reading, not being able to focus doesn't affect my daily life at all, DOF be damned.)
 
For correct, eye-like vision, VR would need to add eye tracking, determine what distance you're focussing at, and apply subtle DOF blur to everything else. Although peripheral vision limits may make that mostly redundant. But VR is still technically wrong, as L. Scofield says. If you close one eye wearing a VR headset and focus on a close virtual finger, the background will still be sharply rendered instead of blurred out as it would be in real life.

I wonder if there are any harmful side effects to fooling a person's eye aperture into opening up as if it's focused on a distant object while there is actually a bright LCD screen right in front of it. I don't have enough physics or biology knowledge to even guess whether this is a problem at all. Anybody?
 
I wonder if there are any harmful side effects to fooling a person's eye aperture into opening up as if it's focused on a distant object while there is actually a bright LCD screen right in front of it. I don't have enough physics or biology knowledge to even guess whether this is a problem at all. Anybody?

The eye doesn't focus by changing "eye aperture"; pupil dilation or constriction (leaving emotional responses out of it) happens only as a means to regulate light input. Though when viewing a brightly lit landscape, for example, in theory you should have a bigger depth of field, in reality, because the focussed area is so small (only the fovea has a 'sharp' image anyway), it doesn't matter that much for people with normal vision.
 
I think it's my eyes that are broken; it looks like this, as if I made it in Photoshop:
[image: l86kjvn.jpg]
If the screen is blurry while the image is static the problem is definitely with your TV/monitor.
 
Worst use of motion blur I witnessed in recent years was in The Last of Us :yes:
Every movement, however minimal, causes motion blur.
Joel runs = motion blur.
Joel walks = motion blur.
Joel crawls at 0.1 m per second = motion blur.
Joel breathes = motion blur.
Not really, but you get the idea ;)
 
Worst use of motion blur I witnessed in recent years was in The Last of Us

Have you played Okami and ICO on PS3?

I think in Okami's defence that so much MB is more an artistic choice than anything. The developers probably thought it would look good with the particular art style of this game; the game is far from being realistic. Maybe they also thought it would be a way to add some cheap and easy-to-implement "next-gen visual effects" to the game compared to the DS or Wii version.

For Ico, I think so much motion blur, notably when the camera is manually controlled, is meant to discourage us from using the camera too much... that and the purposefully awful control we have over the camera in this game.

:rolleyes:
 
Considering how small the pupil is, the lack of DoF is most likely not much of a loss.
The same cannot be said about motion blur, which is highly visible even at high framerates (>60 fps).

I've been wondering if one could save frames in a way that keeps movement and image information for each frame, and then, with eye tracking and post-processing, create the final frame for each refresh.
Basically, find the line along which the eye is looking/moving during a frame and use that information to create correctly focused motion blur for the displayed frame. (With proper prediction it might be possible to create information for a couple of refreshes on a fast display even if the framerate is lower.)

The same might work for DoF, but then the problem would basically be a 5D light field...
It would certainly be nice for movies and such, as you could have small head movements as well (which might be easier/faster to re-create with a z-buffer).
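To make the idea concrete, here's a toy 1-D sketch of gaze-relative motion blur (all names are made up for illustration; this is nowhere near a real implementation): each pixel is smeared along its velocity relative to the tracked gaze, so whatever the eye is following stays sharp while everything else blurs.

```python
def gaze_relative_blur_1d(scanline, velocity, gaze_velocity, samples=8):
    """Blur each pixel of a 1-D scanline along its velocity *relative*
    to the tracked gaze, so whatever the eye follows stays sharp.
    scanline: list of intensities; velocity: per-pixel motion in
    pixels/frame; gaze_velocity: pixels/frame from the eye tracker."""
    n = len(scanline)
    out = []
    for x in range(n):
        rel = velocity[x] - gaze_velocity   # motion the retina actually sees
        acc = 0.0
        for i in range(samples):
            t = i / (samples - 1) - 0.5     # sample along -rel/2 .. +rel/2
            sx = min(n - 1, max(0, int(round(x + rel * t))))
            acc += scanline[sx]
        out.append(acc / samples)
    return out
```

A pixel whose velocity matches the gaze velocity is left untouched; everything else smears along its relative motion, which is exactly the correction a game engine cannot make today because it doesn't know what your eyes are following.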
 
Have you played Okami and ICO on PS3?

I think in Okami's defence that so much MB is more an artistic choice than anything. The developers probably thought it would look good with the particular art style of this game; the game is far from being realistic. Maybe they also thought it would be a way to add some cheap and easy-to-implement "next-gen visual effects" to the game compared to the DS or Wii version.

For Ico, I think so much motion blur, notably when the camera is manually controlled, is meant to discourage us from using the camera too much... that and the purposefully awful control we have over the camera in this game.

:rolleyes:

Well, I have played them, but they are two of my favorite games, so even if excessive MB was there it didn't spoil the experience.
Really, when I am enjoying a game I don't see defects anymore, or rather I see them but they don't matter.
Brain magic ;)
 
That reads as made-up nonsense to me. The lenses I wear are shaped to provide corrective vision no matter where I look through them, and I can look through the extreme edges and have everything sharp. I certainly don't need to point the eyeglasses towards the object. If one has varifocals though, one clearly is going to have half the spectacles blurring while the other half is sharp.
 
Well speaking of human eye simulation in games I have seen that eye adaptation is becoming more popular.
That, IMO, is one effect that, despite being highly accelerated, is quite well simulated.
 
In the latest Eurogamer Digital Foundry analysis of the COD Advanced Warfare XB1 E3 gameplay:

another massive upgrade from Ghosts is the use of a pristine motion blur effect ...the game now at least looks far closer to our idea of a next-gen title.

Even well-known videogame technical journalists are encouraging this very wrong idea that if you just add some easy and cheap motion blur (and other cinematic optical effects), your game will look "next-gen". He really said it himself: "idea of a next-gen title"...

Well this "idea" is not shared by everybody!

I think we are in for an obligatory cinematic generation with those cheap and easy effects heavily used in many games, now in FPSes like Destiny or COD. And it is still my opinion that they use those effects because they can (those >1 TFLOPS GPUs can), because they are a cheap and easy way to add "next-gen" effects, and because they hide technical deficiencies like a stuttering, fluctuating framerate (though the latter does not apply to all games; Destiny sustains a solid 30fps apparently).

But in this game, apparently, the motion blur is also used to hide the highly fluctuating 30-60fps framerates (in fact mostly hovering around the 30s and 40s):

The variable 30-60fps performance is disguised to an extent by the heightened effect, blending each frame into the next

Sadly, it brings us back to the motion blur used in MGS5 PP. If your game runs at a solid 60fps you don't need a "disguise" to hide a stuttering framerate.
 
I don't think DF is encouraging any idea.
DF, or rather Mr Thomas Morgan, is expressing his opinion about what he considers to be next-gen, which is something highly subjective.
I think instead that we should try to be as objective as we can and stay away from opinions based on taste.

Maybe it didn't transpire from my previous post about TLOU's motion blur, but I consider that implementation of motion blur excessive simply because almost ANY action performed by the main character, even the slow movements, causes motion blur, which is physically and objectively incorrect/wrong.
 
Reminds me of the lens flare and bloom filter abuse of not so long ago...
An industry without veterans is doomed to repeat the same mistakes over and over...
 
I don't think DF is encouraging any idea.
DF, or rather Mr Thomas Morgan, is expressing his opinion about what he considers to be next-gen, which is something highly subjective.
I think instead we should try to be as objective as we can.
Given Globalisateur isn't being objective, he can't complain if other people also aren't being objective. ;)

In fact, reading the full quote, Globalisateur has completely misread it because of his anti-moblur stance.

Besides the resolution boost, another massive upgrade from Ghosts is the use of a pristine motion blur effect - of both the full-screen and per-object varieties. Where before its implementation showed heavy, crude banding across its trails, and was limited to spinning helicopter blades or stun grenades, Advanced Warfare's heavier approach adds to the game's cinematic flourish. Be it soldiers jet-boosting between ledges, reloading a rifle, or strafing to the side of an oncoming drone swarm, the game now at least looks far closer to our idea of a next-gen title.
That is, they used old-gen motion blur in Ghosts and it was rubbish. Now they use next-gen, pristine, convincingly realistic motion blur and it looks good (unless you hate moblur). If your game is going to use motion blur (a subjective choice), then use next-gen motion blur to create what looks like a next-gen game.

The article was not comparing moblur to no moblur and concluding that its addition is necessary to hit a next-gen checkpoint.

I'll even go so far as to say that motion blur in COD:AW is aesthetically a plus. It's very slight and only really apparent when it would be present in reality. If someone standing next to you suddenly rockets up into the air, they'd appear as a blur. One can argue against its inclusion when you yourself jetpack, as if you were tracking a scenery object you'd be keeping it stable, but 1) that's easily addressed by saying that the TV feed is from a head-mounted camera and isn't your eyes, and 2) it's not like the visuals in games have ever been that realistic anyway, such as everything turning grey and blurry when you get shot a bit. I'm against such near-death cues myself, as they make recovery that much harder, but I wouldn't try to argue that the industry is wrong for having different values to mine.

The only real argument is whether the motion blur is compromising the framerate and leaving it out would allow a stable 60 fps. With no knowledge of what the bottlenecks are, there's no way to debate that for this game.
 