Multi Focal HUD

Discussion in 'General 3D Technology' started by Emirikol, Sep 3, 2002.

  1. Dio

    Dio
    Veteran

    Joined:
    Jul 1, 2002
    Messages:
    1,758
    Likes Received:
    8
    Location:
    UK
    A second note: I find stereoscopic-glasses approaches (with a projector or monitor) don't create the same problem for me, probably because of the aforementioned point that 'far is far, but there is a lot of near'.
     
  2. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    You are right SA.

    But I must disagree with your second post.
    The observer's pupil size is not correlated with focus depth.
    Changing the depth of field of the image doesn't improve the vision quality, because the image is always focused as if it were a screen XX cm away from the observer's eyes. When the eye sees an object that appears near (a sensation given by the stereoscopic effect), the brain instantly tries a "near" focus. When the eye sees an apparently far object, the brain tries to focus the eye to infinity... but in a usual HMD the image is always at the same focus, so changing the observed object forces the brain to attempt a wrong focus, with consequent fatigue.
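    The mismatch can be put in numbers. A minimal sketch (my own illustration, not from the post): accommodation demand in dioptres is the reciprocal of the viewing distance in metres, so a fixed-focus screen at an assumed 2 m forces a conflict that grows quickly as virtual objects come closer.

```python
# Accommodation demand in dioptres is 1 / distance (metres).
def demand(distance_m):
    return 1.0 / distance_m

SCREEN_M = 2.0  # assumed fixed optical depth of the HMD image

for virtual_m in (0.3, 1.0, 10.0):
    # how far off the eye's focus is from where the stereo cue says to focus
    conflict = abs(demand(virtual_m) - demand(SCREEN_M))
    print(f"object at {virtual_m} m: focus conflict = {conflict:.2f} D")
```

    A far object (10 m) only conflicts by 0.4 dioptres with the fixed screen, while a near one (0.3 m) conflicts by nearly 3 dioptres, which matches Dio's observation that 'far is far, but there is a lot of near'.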
     
  3. mboeller

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    923
    Likes Received:
    3
    Location:
    Germany
    IMHO it should be possible to track the size of the pupil with a laser and use this info to focus the picture at the right distance using a holographic display (or retinal display).
     
  4. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    I think by the time we have a holographic display we will also have the computational power to use it for full 3D ... just using it to put a 2D image with artificial DOF effects at the correct focus depth seems a bit of a waste :)
     
  5. mboeller

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    923
    Likes Received:
    3
    Location:
    Germany
    Sorry;

    I meant holographic HUDs like the ones used in fighters such as the newer F-16 models or more modern fighters. I didn't mean real holographic displays.
    Holographic HUDs project the image so that it appears to be at an infinite distance (or a few meters away, depending on the requirement).
     
  6. Quaz51

    Regular

    Joined:
    May 18, 2002
    Messages:
    916
    Likes Received:
    1
    Location:
    France
  7. SA

    SA
    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    100
    Likes Received:
    2
    That's right about the optical screen depth. Dynamically changing the depth of field in the image would only make it appear more photorealistic based on what the eye was looking at and would not help with the focal disparity.

    As was pointed out, the disparity in focal depth is an optical problem, and only a dynamic optical solution can solve it.

    Motorized optics (similar to the optics in an auto focus camera) could be used to solve the problem, but it seems a bit mechanical. You would again use the eye tracker to see what was being viewed in the image, calculate the distance to the 3d surface in the image at that point, and adjust the optical screen depth accordingly using the motorized optics.
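    The loop SA describes could be sketched as follows. Everything here is a hypothetical stand-in for real hardware (there is no such device API in the thread): an eye tracker gives a gaze point, the scene's depth map gives the distance at that point, and the motorized optics are driven to match.

```python
# Hedged sketch of an eye-tracked focus loop; the Optics class is an
# illustrative stand-in for motorized lens hardware, not a real API.
class Optics:
    def __init__(self):
        self.focal_m = float("inf")  # start focused at infinity

    def set_focal_distance(self, metres):
        self.focal_m = metres  # would drive the motorized lens

def update_focus(gaze_xy, depth_map, optics):
    x, y = gaze_xy                               # pixel being looked at
    optics.set_focal_distance(depth_map[y][x])   # match screen depth to scene depth

# toy frame: background at 5 m, one near object at 0.5 m
depth_map = [[5.0, 5.0],
             [5.0, 0.5]]
optics = Optics()
update_focus((1, 1), depth_map, optics)   # viewer looks at the near object
print(optics.focal_m)  # 0.5
```

    Run once per eye-tracker sample, this is the whole control loop; the hard part is the latency and the speed of the mechanical optics, which is presumably why SA calls it "a bit mechanical".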

    Some other possible solutions, a bit less mechanical, would be to use electromechanical optics such as deformable mirrors and lenses.
     
  8. SA

    SA
    Newcomer

    Joined:
    Feb 9, 2002
    Messages:
    100
    Likes Received:
    2
    The major problem with immersive stereo HMDs, however, is not technical in my opinion. The market never developed to the point to fund the R&D to solve the technical problems. Nvision and many others diverted R&D efforts to other areas where there was a larger market. Stereo glasses and CAVE systems occupied much of the professional market, and the consumer market has been satisfied without stereo and full immersion.
     
  9. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    quite true
     
  10. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    I have read the entire documentation, but it is not explained what they mean by "Depth Modulation". In any case the document refers to the first Microvision retinal scanner prototype... now, after two years, they still do not have an HMD capable of multiple focal depths.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I think I might have actually seen this one in action.

    I think it was about 1996-1997 or so when I went to check out the flight sim facilities at the Pax River Naval base. They had three flight sims:

    The first was a large dome, 20ft in diameter, if I remember correctly, with an F-18 cockpit suspended in the center. The display was projected all over the dome (with the highest-resolution display in the front). That was very cool, though the ground detail was pretty low.

    The second consisted of three screens arranged in a format similar to what is seen today from Matrox' Triple head tech, though the displays were much larger. It was a simulation of a VTOL aircraft (I forget the name...it had tilt wings and propellers...).

    The third was a VR display with a headset that projected the display directly into the eyes. While I only used it for a few minutes, there was absolutely no eye fatigue, and looking at the computer-generated surroundings was quite natural, though the helmet was rather heavy. It was very, very cool, with the only drawbacks being the heavy helmet, the swing arm used for motion detection (it wasn't connected to the helmet directly...but it had to be close, and so was right above my head...apparently it's been broken a few times, too...), and the fact that the calibration for projecting the image directly into the eyes had to be done exactly, and was different for every person.
     
  12. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    I think I have found a solution
     
  13. Basic

    Regular

    Joined:
    Feb 8, 2002
    Messages:
    846
    Likes Received:
    13
    Location:
    Linköping, Sweden
    And it is?
     
  14. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    Sorry, I cannot unveil the details,
    but I've found a way to cast the rays of light at a different angle depending on each individual pixel's depth. I'm writing a paper.
     
  15. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    Do laser beams stay collimated if you modulate them with an LCD?

    If so, Basic's pinhole projection idea would be pretty straightforward: backlight an LCD with lasers and focus the resulting beam somewhere between the minimum focal point of the eye and the eye's lens (wouldn't want you to fry your retina the moment you try to watch something too closely :). You would probably want to do it with several lenses, one of them very close to the eye, to get the necessary arc to cover enough of the retina. You need to apply a warp to the image to counteract the distortion from the eye's lens, of course, but that could all be done digitally.
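    As an illustration of doing that counter-warp digitally (my sketch, not from the thread): sample the source image through the inverse of a simple radial distortion model, so the eye's optics "undo" the warp. The radial model and the coefficient `k` are assumptions for illustration, not a measured eye model.

```python
import numpy as np

# Pre-warp an image with an assumed inverse radial distortion model.
# k < 0 pulls samples toward the centre more strongly near the edges.
def prewarp(img, k=-0.2):
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    cy, cx = (h - 1) / 2, (w - 1) / 2
    for y in range(h):
        for x in range(w):
            dy, dx = (y - cy) / cy, (x - cx) / cx   # normalise to [-1, 1]
            r2 = dx * dx + dy * dy                  # squared radius
            sx, sy = dx * (1 + k * r2), dy * (1 + k * r2)  # inverse model
            src_y = int(round(sy * cy + cy))
            src_x = int(round(sx * cx + cx))
            if 0 <= src_y < h and 0 <= src_x < w:
                out[y, x] = img[src_y, src_x]
    return out
```

    A real implementation would do this per-eye with a calibrated distortion model and a GPU texture lookup rather than a Python loop; the point is just that the correction is a pure image-space resampling, as MfA says.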
     
  16. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Of course, the only problem would be in attempting to deal with people whose vision is not 20/20. However, since that's still a requirement for pilots, I don't know if the Navy considers it much of a problem...
     
  17. Emirikol

    Newcomer

    Joined:
    Aug 28, 2002
    Messages:
    15
    Likes Received:
    0
    I don't understand how this system works...
    can you explain it further?
     
  18. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,610
    Likes Received:
    825
    Pinhole cameras have a natural focal point because they act somewhat like a Fresnel lens ... but mostly the projection stays in focus, albeit distorted, regardless of the distance to the image plane (and thus regardless of the lenses you put behind them). I assumed that effect was what Basic was after.

    It doesn't really matter what kind of optics you use, though: with a normally lit LCD, only a minute amount of the energy you put into lighting would actually find its way to the eye with pinhole projection. So collimated light put through a virtual pinhole seems to make more sense.

    I don't quite see how he intends to handle it when you look sideways, though. You could use multiple pinholes like he suggests, covering the full arc the eye can cover ... but because of the discrete number of pinholes, the moment one is obscured by the edge of the pupil it can make a big difference in the intensity at the retina. You would need a very large number of pinholes to remedy that (in a way this would be trying to reproduce the light field coming towards the eye, instead of trying to reproduce the image ... a poor man's holography, if you will).

    Maybe you could track the pupil, put a large lens in front of the eye, and aim the beam at it according to the direction you are looking (by "beam" I mean the collection of rays representing the individual pixels). This would still give you the virtual pinhole projection, but allow you to cover a wide viewing arc.

    I don't have the necessary experience to intuit whether this could actually work, and I'm not gonna do the math ... so take it with a pinch of salt ;)
     
  19. Basic

    Regular

    Joined:
    Feb 8, 2002
    Messages:
    846
    Likes Received:
    13
    Location:
    Linköping, Sweden
    I think you got the idea right, MfA. The intention was to track the pupil and only send light through pinholes that hit the pupil. This could be done by moving optics as you say, or by switching light sources.

    Look at this image. (Sorry, my web space doesn't allow image leeching.)
    The URL hints at how old the idea is. :D

    There are two light sources in that image, just to show the principle, and both hit the pupil. A real implementation would have lots of light sources: dense enough to get several pinholes at the pupil, and spread over a large enough area to get possible pinholes at every position the pupil may be.
    The lens system shown is the simplest possible, but it would be a good idea to design it so that the final lens is more "curved" around the eye, so you can get a high FOV.

    What is the point of multiple pinholes on the pupil?
    Let's go back to the good ol' T-buffer and think about how DOF is done with it. You render four images with the camera slightly shifted, but directed so that points on the focal plane coincide, and then blend those images together.
    This is equivalent to approximating the pupil with four pinholes (spread over the pupil). The camera shifts are the different pinhole positions on the pupil, and the different directions of the cameras come from how the eye is focused. With T-buffers the program has to decide what eye focus to simulate.

    Now back to the "DOF HUD". Since we have the possibility to project different images through different pinholes, you could take the four images of the T-buffer and project them through four different pinholes, and let the eye decide where the focus should be.

    So the result would be that you calculate the image for one pinhole, send it to the LCD, and light the corresponding light source. Then calculate the image for the next pinhole...
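    The T-buffer blending step Basic starts from can be sketched like this (my illustration; `render` is a dummy stand-in for a real renderer, and the offsets are hypothetical pinhole positions on the pupil):

```python
import numpy as np

# Average four renders taken from slightly shifted camera positions that
# all converge on the focal plane -- T-buffer style depth of field.
PINHOLE_OFFSETS = [(-1, -1), (-1, 1), (1, -1), (1, 1)]  # positions on the pupil

def tbuffer_dof(render, offsets=PINHOLE_OFFSETS):
    frames = [render(dx, dy) for dx, dy in offsets]
    return np.mean(frames, axis=0)  # blend the per-pinhole images

# dummy renderer: a 2x2 "image" whose top row shifts with the camera offset
def render(dx, dy):
    return np.array([[float(dx), float(dy)],
                     [0.0, 1.0]])

blended = tbuffer_dof(render)
print(blended)  # symmetric offsets cancel: [[0. 0.] [0. 1.]]
```

    In the DOF HUD, the difference is that the `np.mean` never happens in software: each of the four frames goes to its own pinhole, and the blending is done optically on the retina, with the eye's own accommodation choosing the focus.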

    The problems here are:
    You have to track the pupil to make sure that the active pinholes are inside it.
    You must have nice point-shaped light sources. I don't know how small you can make LEDs, but if they aren't small enough you could "enhance" one by putting a mask on top of it. Brightness shouldn't be a problem, since you could actually get pretty much all of the power into the eye, and even an LED can be quite bright then.
    You need a really fast LCD. It must have a refresh rate of at least (the eye's flicker limit) times (the number of active pinholes), say 75 * 4 = 300 Hz.
    And then someone (other than me :) ) needs to test whether the focused light can damage anything if pointed at the wrong place.

    PS
    I liked the description "poor mans holography".
    DS
     