Microsoft HoloLens [Virtual Reality, Augmented Reality, Holograms]

[Teardown photos: hololensteardown-13.0.jpg, hololensteardown-1.0.jpg, hololensteardown-2.0.jpg, hololenssidebyside.0.jpg]
http://www.theverge.com/2016/4/6/11...lens-holograms-parts-teardown-photos-hands-on

Tommy McClain
 
The fact that the image is additive is really bothersome to me. E.g. the holes made in that vid have glowing cracks because you can't draw dark cracks, which means all the design will have to work around everything glowing, I guess.
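
For anyone wondering why that is: an optical see-through display can only add light on top of whatever the real world is already sending into your eye, so a black pixel is just transparent. A minimal sketch of that (the numbers are made up):

Code:
import numpy as np

# Toy model of an additive (optical see-through) display.
# 'world' is the light already reaching the eye; 'rendered' is what the
# display emits. The display can only ADD light, never block it.
world = np.array([0.6, 0.6, 0.6])       # mid-grey wall seen through the visor
dark_crack = np.array([0.0, 0.0, 0.0])  # we'd like a black crack...
glow_crack = np.array([0.3, 0.1, 0.0])  # ...but can only render a glowing one

perceived_dark = np.clip(world + dark_crack, 0, 1)  # == world, crack invisible
perceived_glow = np.clip(world + glow_crack, 0, 1)  # brighter than the wall

print(perceived_dark)  # [0.6 0.6 0.6] -> black pixels simply disappear
print(perceived_glow)  # [0.9 0.7 0.6] -> anything visible has to glow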
 
I like how you're able to get a correct perspective through the holes in the wall. Turning your head looks like it affects the view properly. Nice.

FOV is awful, but I imagine it'd be quite an easy fix for consumer versions.

Dunno whether it's because it's being filmed through an iPhone or whether it's a problem with the display, but those colours look very strange. It's like the hole is phasing between R, G, and B. Maybe when moving, some of the colours are displaying for longer than others? Looks weird anyway.

Definitely a way off being ready for the market though.

Edit: colours displaying at different rates? Sorry for the screen shot from my phone.

37a5f0980ace166b2ed6dcf5bf1d328e.jpg
 
I think one of the research team members was doing that a few months ago. I figure by consumer release, we should be able to play some X1/PC crossbuy titles natively.
 
Why? As discussed in this thread, the projection tech (as we understand it) is fundamentally limited in FOV. Fixing that is likely anything but easy.
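
For the curious, here's the usual back-of-the-envelope argument for why a diffractive waveguide combiner has a hard FOV ceiling. This is only a sketch of the standard TIR reasoning; the refractive index and maximum bounce angle are illustrative guesses, not HoloLens specs:

Code:
import math

n = 1.7            # refractive index of the waveguide glass (assumed)
theta_max = 75.0   # steepest practical in-glass propagation angle, degrees (assumed)

# The in-coupling grating shifts sin(theta_air) by a constant, so the usable
# span of sin(theta_air) equals the span of n*sin(theta_glass) between the TIR
# critical angle (where n*sin(theta_c) = 1) and theta_max.
span = n * math.sin(math.radians(theta_max)) - 1.0

# Centre that span on the straight-ahead direction to get a symmetric FOV.
fov_deg = 2.0 * math.degrees(math.asin(span / 2.0))
print(f"~{fov_deg:.0f} deg horizontal FOV")   # ~37 deg with these numbers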
You're saying it's an image projected onto the inside of the glasses? I would have thought that the light would pass straight through the lens... maybe that's why the lenses are dark.

Anyway, I really hope they're able to increase the FOV coz that looks horrible.

 
The big difference between the real thing and the fake MS videos is what most of us expected, but it is also hilarious. It becomes obvious when you look at the inability to subtract light.
 
It's like the hole is phasing between R, G, and B. Maybe when moving, some of the colours are displaying for longer than others? Looks weird anyway.

They're probably using sequential RGB via DLP or LCoS to feed their waveguides, in which case there's potential for component separation under certain types of motion. It's a pretty big limitation when you want to do things like low persistence because you're actually making use of the full scan cycle in order to blend the RGB on the retina.

edit:
Oh, and I think that video looks way cooler than any of the faked stuff they've demoed thus far. A shame that it's taken this long to see it for real.
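
Rough numbers for the colour break-up you'd get from a field-sequential display under head motion. A sketch only; the field rate, head speed, and angular resolution below are illustrative assumptions, not measured HoloLens figures:

Code:
# Crude estimate of colour break-up on a field-sequential display.
field_rate_hz = 60 * 3          # assume 3 colour fields per 60 Hz frame
head_speed_deg_s = 100.0        # a moderate head turn while eye-tracking a hologram
pixels_per_degree = 40.0        # assumed angular resolution of the display

# Time between successive colour fields, and how far the image of a
# world-locked hologram sweeps across the retina in that time.
dt = 1.0 / field_rate_hz
offset_deg = head_speed_deg_s * dt
offset_px = offset_deg * pixels_per_degree

print(f"{offset_deg:.2f} deg between R/G/B fields (~{offset_px:.1f} px)")
# -> ~0.56 deg, i.e. the R, G and B copies land many pixels apart,
#    which shows up as the hole "phasing between R, G and B".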
 
It's also possible that the iPhone video is actually filmed from further away from the display than the eyes would normally be. That would make the FOV appear worse than it is.

I don't imagine an iPhone has a great aperture which would allow for a wider focal range in low light. So to compensate, the phone would need to be held further away.
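
To put rough numbers on that: the hologram is only visible through the small output area of the optics, so the further back the camera sits, the smaller the angle that window subtends and the smaller the apparent FOV in the video. A sketch with made-up dimensions (the window width and distances are illustrative, not HoloLens specs):

Code:
import math

def apparent_fov(window_width_mm, distance_mm):
    """Angular size of the display's output 'window' as seen from a distance."""
    return 2.0 * math.degrees(math.atan(window_width_mm / (2.0 * distance_mm)))

window = 20.0   # assumed width of the viewable output area of the waveguide, mm
print(apparent_fov(window, 18.0))   # ~58 deg at a typical eye-relief distance
print(apparent_fov(window, 60.0))   # ~19 deg with the phone held further back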

 
Phone cameras have wide apertures and wide angles. You need to hold them close to avoid perspective making things tiny.
 
Phone cameras have wide apertures and wide angles. You need to hold them close to avoid perspective making things tiny.
Wider focal range is different to a wide aperture. To get a wide focal range you need a narrow (or tight) aperture.

Edit: and more light (otherwise the image would be too dark).
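
To illustrate the aperture / depth-of-field point with a quick calculation (the focal length and circle of confusion are illustrative guesses for a small phone sensor, not real iPhone numbers):

Code:
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance: focusing here keeps everything from half this
    distance to infinity acceptably sharp."""
    return focal_mm ** 2 / (f_number * coc_mm) + focal_mm

focal = 4.0      # lens focal length, mm (assumed)
coc = 0.002      # acceptable circle of confusion, mm (assumed)

print(hyperfocal_mm(focal, 2.0, coc))   # ~4000 mm: wide aperture, shallow depth of field
print(hyperfocal_mm(focal, 8.0, coc))   # ~1000 mm: narrow aperture, much deeper depth of field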

 
Looking at this image, I wonder whether it's actually using a transparent display technology.

This would work if the panel were sufficiently far away from the eyes to be in a similar focus plane as everything else you're viewing through it. With a VR HMD you are able to throw in a simple lens that adjusts the focus of everything; with an AR HMD you have to ensure that the light being emitted is already focused before it's eventually redirected to the eyes.
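
To put numbers on the focus-plane issue: accommodation is easiest to reason about in diopters (1 / distance in metres). A quick sketch with illustrative distances:

Code:
def diopters(distance_m):
    """Accommodation demand in diopters for an object at the given distance."""
    return 1.0 / distance_m

# A bare transparent panel an inch or so from the eye vs the wall behind it.
panel = diopters(0.03)   # ~33 D: far beyond what the eye can accommodate
wall = diopters(2.0)     # 0.5 D: comfortably in focus

print(panel - wall)      # ~33 D of focus mismatch -> the panel would be a blur,
                         # hence the need for projection optics that place the
                         # virtual image at a comfortable focal distance.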
 
This would work if the panel were sufficiently far away from the eyes to be in a similar focus plane as everything else you're viewing through it. With a VR HMD you are able to throw in a simple lens that adjusts the focus of everything; with an AR HMD you have to ensure that the light being emitted is already focused before it's eventually redirected to the eyes.
Yes, that makes sense. And you're saying a projected image would solve that problem?

You can definitely see a slight colour difference in the lower part of that photo, which I guess could be from either the projection or a transparent display.

 
The colored rectangle would be the output portion of the waveguide that's viewable. Google "holographic waveguide" images.

edit: something like this:

[holographic waveguide diagram]
 