Rift, Vive, and Virtual Reality

It's official now

One of the most anticipated features of this week’s Display Week event in LA is a new screen built by Google and LG, designed specifically for VR. At its booth LG is showing off a 4.3-inch OLED panel with a pixel density of 1443 pixels per inch (PPI). We’ve just gone hands-on with it and you can see the results below.

Senior Editor Ian Hamilton is on the ground at the show and took the below images of two displays LG is exhibiting (click to enlarge or open in a new tab for the full size). Inside each lens, which is fitted to a wall and not embedded inside a headset, there's the same image of a map with both big and small text. The new OLED display has a resolution of 3840 x 4800 and a refresh rate of 120Hz.

https://uploadvr.com/heres-google-lgs-new-1443-ppi-vr-display/
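
As a quick sanity check on the quoted numbers (just a sketch, assuming the 4.3-inch figure is a rounded diagonal), the resolution and panel size are roughly consistent with the claimed PPI:

```python
import math

# Sanity check: PPI from resolution and diagonal size (values from the article).
w, h = 3840, 4800          # pixels
diagonal_in = 4.3          # inches (likely rounded in the article)

diagonal_px = math.hypot(w, h)                 # ~6147 pixels across the diagonal
print(round(diagonal_px / diagonal_in))        # ~1430 PPI; 1443 PPI would imply a ~4.26" diagonal
```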
 

Impressive, although I wish it were much wider. But still, one step at a time. Right now this brings resolution, or rather PPI, to a minimally acceptable level, IMO. Next, hopefully they'll work to address the FOV issue of VR. This would seem to be an ideal use for flexible OLED displays, being able to make them "wrap around" each eye. Hopefully that is something that is being worked on.

Regards,
SB
 
The higher the PPI of the screen, the more you can magnify it with the lenses, increasing the FOV. You don't need wider screens, and that keeps the headset smaller.
 
If you have to optically stretch a narrow field display to a wide FOV, you're introducing more optical artefacts and lensing problems. Ideally you want a curved display that fits the FOV with the minimal amount of lens interaction. Or some altogether better imaging tech.
 
I wonder if they can drive every pixel on those panels at 120Hz? It's a lot of pixels.

Display resolution in the Rift/Vive is 1080x1200 per eye, and there are two displays.

A Vive replacement with two of these displays would be nearly insane.
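
It really is a lot of pixels. A rough back-of-the-envelope sketch (assuming 24 bits per pixel and no link compression, which a real headset would likely not ship without):

```python
# Rough pixel and bandwidth figures; 24 bpp uncompressed is an assumption,
# a real display link would likely use chroma subsampling or DSC.
panel_px = 3840 * 4800                  # 18.4 million pixels per panel
refresh_hz = 120
bits_per_px = 24

pixels_per_s = panel_px * refresh_hz    # ~2.2 billion pixels/s per panel
gbit_per_s = pixels_per_s * bits_per_px / 1e9
print(pixels_per_s, round(gbit_per_s, 1))     # ~53 Gbit/s per panel, roughly double for two

vive_px = 2 * 1080 * 1200               # ~2.6 million pixels across both Vive eyes
print(round(2 * panel_px / vive_px, 1))       # ~14x the pixel count of a Vive
```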
 
VR really REALLY needs eye-tracked foveated rendering, IMO. Updating everything at pixel-perfect resolution at 90/120 Hz across a high-PPI screen that potentially wraps around the eye would require mind-boggling amounts of graphical processing power.

What is needed is pixel-perfect rendering where the user is looking, while the peripheral vision areas are rendered at much lower fidelity. But the system must be responsive enough to transition the high-quality rendering to any arbitrary place the eye looks within one frame of it looking there (possibly a bit more, as it takes the eye a moment to refocus after changing where it is looking).
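
A minimal sketch of that idea, assuming a tile-based renderer and made-up eccentricity thresholds (the 2 and 10 degrees and the scale factors below are illustrative, not values from any shipping eye-tracked system):

```python
import math

def shading_scale(tile_angle_deg: float) -> float:
    """Fraction of full resolution to shade a screen tile at."""
    if tile_angle_deg <= 2.0:      # foveal region: pixel-perfect
        return 1.0
    if tile_angle_deg <= 10.0:     # near periphery: half resolution
        return 0.5
    return 0.25                    # far periphery: quarter resolution

def angle_between(gaze_dir, tile_dir) -> float:
    """Angle in degrees between two unit view-space direction vectors."""
    dot = sum(g * t for g, t in zip(gaze_dir, tile_dir))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot))))

# The gaze direction has to be re-sampled from the eye tracker every frame so the
# high-acuity region follows a saccade within a frame or so, as described above.
gaze = (0.0, 0.0, 1.0)
tile = (math.sin(math.radians(15.0)), 0.0, math.cos(math.radians(15.0)))
print(shading_scale(angle_between(gaze, tile)))   # a tile 15 deg off-gaze -> 0.25
```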

However, we know that is being worked on. My hope is that there is a proper surround FOV solution being worked on for the display. Nothing bothers me more in VR than the overlapping black borders that go through each eye's view as the brain composites each eye's input (including the black surroundings) into one image.

Regards,
SB
 
In the paper RoadtoVR linked, they're targeting 75Hz on a mobile device. They go into detail on using foveated rendering and bandwidth etc.

https://onlinelibrary.wiley.com/doi/full/10.1002/jsid.658

Good paper, but I disagree with the following.

With the use of eye tracking, the foveated (high acuity) region can be made very small (typically less than ±15°) relative to the overall FoV.[17] However, even without eye tracking, the image may be separated into regions with different acuity so the image matches the natural roll-off of the system optics and the HVS's low peripheral acuity.

Without eye tracking, foveated rendering isn't good. While it's true that in a head mounted display you can move your head to keep things in the higher acuity region, humans still tend to look around with their eyes without necessarily moving their head. This is usually in order to quickly look at something to ascertain what it is and whether it is something that requires more attention.

In my experience with non-eye-tracked foveated rendering, it is very much subpar and not worth it. I'd rather have a more simplified scene at lower IQ than have uneven visual acuity depending on where my eyes decide to look. This is especially disconcerting when my peripheral vision notices something, I look at it with my eyes to judge whether it needs more attention, decide to bring it to the center of my view, and then see it look different from when I initially glanced at it.

It's basically the same problem I have with in-game DOF without eye tracking. The game designers make assumptions about where I should look, but my eyes tend to wander around a scene and get disoriented when they can't bring things into focus, which causes severe eye strain over time.

I'm sure it'll be an attractive avenue to pursue until eye-tracked foveated rendering is more mature and more affordable (both in terms of cost as well as complexity), but I feel it's the wrong path to pursue.

Regards,
SB
 
Matching the "natural roll-off" means the effect is imperceptible whether you "focus" on it or not. It's mostly interesting above ~60 pixels per degree, something VR won't hit for a while, because its isolating nature makes FOV a priority, as suggested (in #1786).
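
For a sense of where current headsets sit relative to that 60 pixels-per-degree figure, a rough sketch (the ~100 degree horizontal FOV per eye is an assumption for illustration, and lens distortion is ignored entirely):

```python
# Rough pixels-per-degree figures under an assumed per-eye FOV.
def ppd(horizontal_pixels: int, fov_deg: float) -> float:
    return horizontal_pixels / fov_deg

print(round(ppd(1080, 100), 1))   # Vive-class panel: ~10.8 ppd
print(round(ppd(3840, 100), 1))   # the 3840x4800 panel at the same FOV: ~38.4 ppd
print(round(ppd(3840, 64), 1))    # FOV would have to shrink to ~64 deg to hit 60 ppd
```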
 
It's not clear from his tweet whether he was able to try them in HMDs or just as holes in a wall with demo images. It'll be good to see how well JDI's display holds up with blacks in an HMD. It should be pretty decent, based on the Oculus Go?
 
It's also unclear whether the displays were the same size, e.g. a 1001 PPI LCD that's 12" wide and shrunk by optics to fit the visual field, versus 1443 PPI on a half-inch screen perched on the end of the nose. Display size, resolution, and FOV are the numbers that matter.
 
The LCD is smaller 3,25" diagonal vs. 4,3" , both are on relatively expensive polysilicon backplane, again it's the usual suspect stuff , screendoor,

pentile 1443 ppi on OLED (!!!) vs. actual RGB on LCD, near black non-uniformity.

Plus they are having problems directing light into a narrow cone with OLED, the smaller the pixels the bigger.
At least the standard mobile process is under revision:
The response time of an OLED display is related to TFT design and pixel circuit characteristics. For mobile OLED displays, p‐type LTPS technology is considered mainstream, but is susceptible to a “ghost image” artifact that appears when the display is unable to reach the target brightness level in the first frame after changing the image. In order to achieve high resolution and fast driving speed, n‐type LTPS TFTs that have higher mobility and lower hysteresis characteristics than p‐type were chosen for the TFT backplane

If VR fizzles it won't be because of inept LCDs.
 
Yup. These displays aren't suited to VR, but I think it is illustrative that markets that aren't tied to hand-cranked film or power-line frequencies use A LOT higher frame rates if they can.
That is emphasized by the fact that going from 120Hz to 240Hz is regarded as so desirable that halving vertical resolution is an acceptable price. People really, really like low latency and smooth frame rates, far beyond what we typically see as acceptable in gaming.
(Sensor read-out speed is a limitation in the photo EVF market, so there is little reason to develop an EVF display with properties that can't be utilized anyway. That said, 120Hz EVFs have been typical for a few years now, and hopefully 240Hz EVFs take over as typical beyond the low end. Also note that higher FPS means less light captured by the sensor per frame sent to the EVF, so higher frame rates typically carry an EVF noise cost for photographers. Nevertheless, people prefer the higher frame rates. Says something, really.)
 
The noise at 240 Hz comes with natural temporal noise reduction in the brain, so that makes sense. I'm guessing the small size makes the fast transitions easier? Does a tech that can be driven at 240 Hz in a one-inch display scale up to drive a 60" display in 1/240th of a second as well, or does the larger screen need more effort?

(I've literally zero knowledge of what it takes to make a display show stuff! Beyond RGB coloured dots.)
 