Image quality - a blast from the past

I repeat my previous statement as a question: if people here claim they can tell the difference between 100fps and 150fps, why do you think your 1/8s averaging is going to make a difference?

Why can I see TV flicker at 50Hz? That's a much higher frequency than the 8Hz that a 1/8s average would correspond to.

The eye may average over 1/8s but is hypersensitive to flickering.
 
Chalnoth,
How about the following quote from Andrew Glassner:
So far we have only discussed the response of the eye to a single photon. In fact, the chemical processes that occur last several milliseconds, and additional photons that strike the receptor during that time add to the overall response. Thus the output of a receptor is really a time-averaged response, an effect called temporal smoothing. In effect, the sensors impose a low-pass filter over their time response, though the cutoff frequency of that filter changes with respect to the background light level: when there is little light arriving, there is little smoothing.
You are partially correct - there is some time averaging - but several milliseconds corresponds to a lot less than 1/8th of a second.

Glassner goes on to describe the "Critical Flicker Frequency", which is the rate at which flashes appear to 'fuse' into one continuous image, and the "flicker rate", which is when the flicker becomes noticeable. The latter seems to be well over 8Hz.
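
To put some rough numbers on that, here's a minimal sketch (Python) that treats the receptor's temporal smoothing as a first-order low-pass filter and asks how much of a 50Hz flicker gets through. The model and the two time constants (5ms standing in for "several milliseconds", 125ms for 1/8th of a second) are my own illustrative assumptions, not figures from Glassner:

import math

# Gain of a first-order low-pass filter at a given flicker frequency.
# This is only a toy model of the temporal smoothing described above.
def flicker_attenuation(freq_hz, tau_s):
    return 1.0 / math.sqrt(1.0 + (2.0 * math.pi * freq_hz * tau_s) ** 2)

for tau in (0.005, 0.125):  # "several ms" vs. 1/8th of a second (illustrative)
    gain = flicker_attenuation(50.0, tau)  # 50Hz TV flicker
    print("tau = %3.0f ms -> 50Hz flicker passed at %2.0f%% of its amplitude" % (tau * 1000, gain * 100))

With a few-millisecond time constant, roughly half of the 50Hz flicker still gets through (hence you can see it); with a 125ms constant it would be attenuated to a few percent. Which is exactly the point: several milliseconds is a lot less smoothing than 1/8th of a second.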
 
noko said:
Yep, and my luck hasn't been that keen lately. Hey, just found a Tyan Tachyon on E-Bay for $329. Just maybe... Hopefully Newegg has them shortly.

I'm with ya on the luck thing Noko, it can only get better. Good luck in finding a card.
 
Simon F said:
You are partially correct - there is some time averaging - but several milliseconds corresponds to a lot less than 1/8th of a second.

Glassner goes on to describe the "Critical Flicker Frequency", which is the rate at which flashes appear to 'fuse' into one continuous image, and the "flicker rate", which is when the flicker becomes noticeable. The latter seems to be well over 8Hz.

There's a crucial difference here. I'm not talking about exceeding the "Critical Flicker Frequency" because it's not necessary here. The monitor's refresh rate already takes care of this. The thing is, the different receptors in the eye respond differently to light. The ones that just detect brightness (rods, I believe) respond much faster than those that detect color. They are also predominantly used in low-light scenarios. Anyway, what I'm trying to say is that there's a world of difference between averaging many bright frames when in-between the entire screen goes black (as happens with CRT's) and averaging between frames of nearly the same color value.
 
Can anyone here post some 16bit Radeon 9700 Pro pictures at best quality, i.e. using 32bit textures without compression and every game setting maxed out (except for the 16bit display)? Using both AA and AF at max as well. Thanks.

Doomtrooper,
I may be in luck; it looks like Micro-Pro put their $329 price for the Tachyon G9700 Pro back up on Price-Watch :D.
 
Chalnoth said:
The ones that just detect brightness (rods, I believe) respond much faster than those that detect color. They are also predominantly used in low-light scenarios.
Rods are denser around the edge of the retina whereas cones are denser near the center. This is why you can often make out faint objects out of the corner of your eye that are invisible when you look directly at them.
Anyway, what I'm trying to say is that there's a world of difference between averaging many bright frames when in-between the entire screen goes black (as happens with CRT's) and averaging between frames of nearly the same color value.
But your CRT doesn't go completely black due to persistence, unless you are using a refresh rate that is slower than the persistence of your monitor.
 
OpenGL guy said:
But your CRT doesn't go completely black due to persistence, unless you are using a refresh rate that is slower than the persistence of your monitor.

When I was reading up on this, I seem to remember the persistence essentially always being much smaller than the time between displays. This makes sense to me, as CRT's generally never show ghosting, and there are flicker problems with static images (if there was any significant level of persistence, as there is with an LCD, then static images would never show flicker, at any refresh rate).

Anyway, I did some looking, and came up with this Microsoft page which lists a few phosphor types:

http://msdn.microsoft.com/library/d...ry/en-us/graphics/hh/graphics/dpyddi_6f53.asp

But I have yet to find out the persistence of the various phosphors.

Update: Ah, here we go. Medium-short phosphors (which seem to be the standard for computer CRT's) are between 10us and 1ms. Now, I don't know exactly what this time constant is (half-life or exponential constant), but it is relatively safe to say that the display could be called "completely black" after about 5-10 time constants. If the phosphor is closer to 10us, then it will easily turn "completely black" beyond 200Hz refresh. If it's 1ms, then it will turn "completely black" between frames running up to about 100Hz.
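
To make the arithmetic explicit, here's a quick sketch (Python) assuming the phosphor decays exponentially with the quoted figure as its time constant (which, as noted above, may or may not be exactly what the datasheet number means):

import math

# Fraction of the initial brightness remaining just before the next refresh,
# assuming simple exponential decay of the phosphor.
def residual_brightness(tau_s, refresh_hz):
    return math.exp(-(1.0 / refresh_hz) / tau_s)

for tau in (10e-6, 1e-3):            # 10us and 1ms phosphors
    for hz in (60, 85, 100, 200):
        print("tau = %7.0f us, %3d Hz: residual = %.1e" % (tau * 1e6, hz, residual_brightness(tau, hz)))

For the 10us phosphor the residual is zero for all practical purposes at any sane refresh rate; for the 1ms phosphor it's still only about 5e-5 of the original brightness at 100Hz, and under 1% even at 200Hz.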
 
Chalnoth said:
When I was reading up on this, I seem to remember the persistence essentially always being much smaller than the time between displays. This makes sense to me, as CRT's generally never show ghosting, and there are flicker problems with static images (if there was any significant level of persistence, as there is with an LCD, then static images would never show flicker, at any refresh rate).
I can see some trails left by my mouse pointer over a black background, so I'm quite sure it's not going completely black. Try turning off your lights and doing the same experiments. Granted, some of this may be due to your eye, but not all of it.
 
Try this little test, OpenGL Guy. Stare at your monitor, and wave a single finger in front of your field of view. If you focus on your monitor, you should notice that you do not see any blur in your finger's movement (at least, there doesn't appear to be any on my monitor). This is indicative of only seeing your finger in a series of flashes. Put another way, if there is no noticeable blur, then the persistence of your monitor's phosphors is very short (probably closer to 10us in that range I posted earlier).
 
Chalnoth said:
Try this little test, OpenGL Guy. Stare at your monitor, and wave a single finger in front of your field of view. If you focus on your monitor, you should notice that you do not see any blur in your finger's movement (at least, there doesn't appear to be any on my monitor). This is indicative of only seeing your finger in a series of flashes. Put another way, if there is no noticeable blur, then the persistence of your monitor's phosphors is very short (probably closer to 10us in that range I posted earlier).
I'm not sure exactly what I am looking for here. An easier way would be to have a camera take a picture with a very short exposure time (say one half the time it takes for a single refresh of the monitor). I recently lost my digital camera so I can't do the experiment myself. :(
 
OpenGL guy said:
I'm not sure exactly what I am looking for here. An easier way would be to have a camera take a picture with a very short exposure time (say one half the time it takes for a single refresh of the monitor). I recently lost my digital camera so I can't do the experiment myself. :(

You shouldn't need a camera. Just keep your eyes focused on one portion of the screen and wave a finger in front of your field of view.

If the overall brightness of the monitor does not vary significantly from refresh to refresh, you will see a continuous blur, the exact same thing you'd see when looking at any non-flashing object.

If the overall brightness of the monitor takes an appreciable time to decay, compared to the screen refresh rate, you will see multiple blurred images of your finger.

If the overall brightness of the monitor takes a minuscule time to decay compared to the screen refresh rate, you will see multiple sharp images of your finger. This is what I see on my monitor (at 85Hz).

Update:
I forgot one condition. Make sure the image on your monitor when doing this test is bright. If it's not, you will just see a uniform blur.
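
For what it's worth, here's a toy model of this test (entirely my own sketch, with a made-up number for the finger speed): the finger is only "recorded" while the screen is lit, so the width of each image is roughly speed x persistence, and the spacing between images is speed / refresh rate:

refresh_hz = 85.0      # the refresh rate I use
finger_speed = 5.0     # screen-widths per second - just a guess at a fast wave

# Compare a short-persistence phosphor, a long-persistence one, and an
# "always lit" display (LCD-like) where the image never decays.
for persistence in (10e-6, 1e-3, 1.0 / refresh_hz):
    image_width = finger_speed * persistence      # blur within one flash
    image_spacing = finger_speed / refresh_hz     # distance between successive flashes
    print("persistence %8.3f ms: image width %.5f, spacing %.5f (screen-widths)" % (persistence * 1e3, image_width, image_spacing))

When the image width is tiny compared to the spacing you get the multiple sharp images I described; when the two are comparable they merge into a blur.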
 
Not that I disagree with you Chalnoth, but I'll point out that a noticeable drop in brightness != black. It could be, but that isn't a necessary conclusion.
 
Bigus Dickus said:
Not that I disagree with you Chalnoth, but I'll point out that a noticeable drop in brightness != black. It could be, but that isn't a necessary conclusion.

It is. I essentially confirmed it myself with my previous experiment. As I said, if waving an object quickly in front of the screen (could really be any object, though narrow ones work best) results in multiple discrete images, then that means that the screen goes completely black between frames, and very quickly (completely black meaning it is no longer emitting any light).

If the screen merely got "mostly dark" between frames, then the above would still yield a blur.

In the end, I think I've shown pretty well that there's a big difference between the "flicker frequency" and what is required to not detect a large color difference in smaller time increments. This is a similar argument to the reasoning behind film and TV having a higher screen refresh rate than frame rate. Remember that film is on a moving reel, and so there must be a fast-moving shutter that only displays the frame for a split-second in order for the picture on the movie screen to not be in motion. In both situations, for different reasons, a base "flicker frequency" must be avoided, but we do not necessarily need frame updates this quickly to get a good idea of movement.

This all harks back to the idea that there are two sorts of receptors in our eyes, rods and cones. I believe it's the rods that detect only brightness, and they tend to respond much more quickly and are more sensitive than cones. In other words, our eyes are more sensitive to flicker if there is very high contrast. If the contrast is low, the cones take over, and they are much slower to respond (in the range of 1/8th of a second), resulting in a much smoother image.

But for FSAA, a changing sample pattern should improve the effective number of samples, provided that the contrast of roughly 1/8th of a second's worth of averaged frames does not change appreciably.

As an example, with the idea I gave above, imagine a pixel that is half black and half white. Let's say we only want approximately a 1/256 color difference as seen by the eye (averaging over 1/8s at 60 fps, that's about 8 frames). If white is 1 and black is 0, then the standard deviation of each sample is 1/2. The standard deviation as seen by our eyes will be (1/2)/sqrt(n), where n is the total number of samples averaged in our eyes. If we want the standard deviation to be no larger than 1/256, then n = (256/2)^2 = 16384. At 8 frames averaged in our eye, this would require 2048 samples per pixel per frame. So, perhaps the major flaw here is that it will take too much to get to the point where the changing of pixels over the 8-frame period is small enough not to be detected. Eventually it may happen, but most likely that will be with motion blur also taken into account.
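
Here's that back-of-the-envelope estimate as a few lines of Python, just to make the assumptions explicit (each sample of the half-black/half-white pixel is treated as an independent coin flip, and the eye as a plain average over 8 frames):

sample_sigma = 0.5          # std. dev. of one random black/white sample
target_sigma = 1.0 / 256    # largest wobble we want the eye to notice
frames_averaged = 8         # roughly 1/8th of a second at 60 fps

# Averaging n samples divides the std. dev. by sqrt(n), so
# sample_sigma / sqrt(n) <= target_sigma  =>  n >= (sample_sigma / target_sigma)^2
n_total = (sample_sigma / target_sigma) ** 2
print("total samples averaged by the eye:", int(n_total))              # 16384
print("samples per pixel per frame:", int(n_total / frames_averaged))  # 2048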

At the same time, the idea may still work much sooner if it isn't completely random, but is instead cyclic. I don't think this can really result in images that will be great compared to eventual fully-random sampling, but a sparse-sampled cyclic pattern may be good.
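
As a concrete (purely illustrative) example of what I mean by cyclic, the sample offsets could simply repeat over a short cycle of frames:

# A 4-frame cycle of sub-pixel sample offsets - the particular offsets and
# the cycle length are just an illustration of the idea, nothing more.
CYCLE = [
    (0.125, 0.375),
    (0.625, 0.125),
    (0.375, 0.875),
    (0.875, 0.625),
]

def sample_offset(frame_index):
    # The pattern repeats every len(CYCLE) frames.
    return CYCLE[frame_index % len(CYCLE)]

for f in range(8):
    print(f, sample_offset(f))

Over any run of 4 averaged frames the eye then effectively sees 4x the per-frame sample count, without the frame-to-frame randomness of a fully random pattern.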
 
OK. We're talking about seeing no flicker in an 85Hz or 100Hz refresh rate, then saying it's OK to have noise that changes every 1/8s just because the level is smaller. I think there's a flaw in this logic.
 
Dio said:
OK. We're talking about seeing no flicker in an 85Hz or 100Hz refresh rate, then saying it's OK to have noise that changes every 1/8s just because the level is smaller. I think there's a flaw in this logic.

I'm saying it's different because different parts of the eye are at work. I'll just add one more little thing in to this topic. 60Hz is just fine if you're not displaying bright images. This is the main reason I haven't bothered to get another refresh rate forcing program for Tenebrae...the darkness of the game masks the 60Hz refresh rate without any problem.

As for the actual frequency that our eyes will detect noise, I'm just going off of what I've read. It may vary somewhat from person to person.
 
Chalnoth said:
Bigus Dickus said:
Not that I disagree with you Chalnoth, but I'll point out that a noticeable drop in brightness != black. It could be, but that isn't a necessary conclusion.

It is. I essentially confirmed it myself with my previous experiment. As I said, if waving an object quickly in front of the screen (could really be any object, though narrow ones work best) results in multiple discrete images, then that means that the screen goes completely black between frames, and very quickly (completely black meaning it is no longer emitting any light).
It goes black: verified by taking a picture of a monitor set to 85 Hz with a 1/120th second exposure time. Pixels that haven't been touched the longest are black. Neat.

Update: Now that my CRT is warmed up, I can see that it's not totally black, but it's quite dim.

If the screen merely got "mostly dark" between frames, then the above would still yield a blur.
I can't verify this as I don't have a slow enough monitor.
Remember that film is on a moving reel, and so there must be a fast-moving shutter that only displays the frame for a split-second in order for the picture on the movie screen to not be in motion.
I don't believe the film is in motion at all while it is being displayed. Else you would get serious problems when the image is shown 2 or 3 times, as most projectors do to reduce flickering.
 
OpenGL guy said:
I don't believe the film is in motion at all while it is being displayed. Else you would get serious problems when the image is shown 2 or 3 times, as most projectors do to reduce flickering.
Hmm... that's interesting. I know that the motion strobe effect at theaters bugs the shit out of me, but you're right... I never notice bad flickering in bright scenes, and you would if the shutter were operating at 24 fps.

Now, I don't know anything about how film projectors actually work, but I can take a pretty reasonable guess. As a mechanical engineer, I can say that having the film stop for a few refreshes of each frame before moving to the next one would be beyond a nightmare. I'm not sure it would be physically possible to accelerate the film quickly enough to move it from one frame to the next in 1/100th of a second (roughly 24 fps x 4 refreshes) without the audience being quite disturbed by the Cummins diesel running in the projection booth. Or perhaps the droning sound of the projector mechanism ratcheting from stop to go with a nice 100Hz tone would drown out the diesel that's driving it.

:D

My guess would be that each frame is simply duplicated three or four times in the actual film reel production process. As the film passes through the projector at a fixed rate, the shutter syncs to the frames, and each image gets three or four identical frames before the next 1/24th of a second comes up.

The film frames would look like 1111222233334444 and so on.

I'm tired, and too lazy to look it up and see if this guess is what they are actually doing or not. Maybe tomorrow. :)
 
Each frame is displayed twice.
And yes, it is on the reel twice; you are quite correct, it would be impossible to stop and start the reel that fast.

The projector uses a spinning disc with a bow-tie-shaped hole cut in it (this is what the camera uses as well) to allow light through.

But in essence, movies are displayed at 24fps with a refresh rate of 48Hz, so you don't see the flicker (as much; I can still sometimes see it).


Anyway, the bow-tie hole is the solution to your problem, Bigus and OpenGL guy.
 
Biggus Dickus (and hello to your wife, Incontinentia)
Since you're a mech engineer, have a search for the "rolling loop" film projector technology. It was developed by an Aussie bloke and is now being used, IIRC, for the Imax/Omnimax projectors. They have to handle a much higher framerate (60Hz) and larger images on the film.

With the standard film projector technology, the film just gets shredded because it can't take the forces needed to accelerate/decelerate when changing frames. The rolling loop method, OTOH, is gentle.
 