rockaman
Regular
"You can't voluntarily look at the blurred edge of a screen in a 2D movie? That is a new one to me. I had no idea. I must have been watching movies wrong for many years."
"Nope. No matter what happens on screen there's no stimuli that would cause the brain to try to focus any differently."
If your eyes are open, there are always stimuli helping direct your attention. This is a function of your brainstem, an area called the superior colliculi, which sit near the inferior colliculi involved with audio processing.
"Is that an expert description?"

"This is where headaches happen, because it's a reflex and it's a feedback loop between your eyes and your brain, which is severely imbalanced with close 3D objects (or the Avatar-like crossed background, which was extremely stupid)."
"A reflex loop?" Similarly, a computer is a "collection of transistors."
That can be true, but again, is this an issue with the technology itself or with how it was used? There seems to be a lot of conjecture in this thread about "why 3D is poor and failed so many times" attributed to technical aspects, while ignoring everything else.
There is something you are right about, and that is that the *distance* to the screen is a very important factor, because convergence changes with it. 2D movies are easier on the eyes because, at distances much beyond several centimeters, binocular vision does not really contribute to any 3D effect; past that very close range, it is primarily parallax that contributes to our interpretation of depth. So when you watch your regular TV at home, it's not much of a bother.
With 3D, that distance is brought much closer to the eyes.
But we were not talking about that; we were talking about the quality vs. artistic aspect of a 3D movie. A "blurred tap in the foreground" can be blurred or not, and to whatever degree, at the director's discretion. Whether that effect is "realistic" isn't really a discussion about whether 3D technology is *good*, but rather about whether the director's choice of focus and focus blur is to the viewer's taste, for any old reason. In another movie, the same technology could be used to make a more "realistic" foreground object. In that way, it's just like a 2D movie. The degree of blurriness on a foreground object is not a measure of the quality of the technology, but more likely a comment on the director's choices. That is what I was saying.