Sony VR Headset/Project Morpheus/PlayStation VR

Ok, so during the 60→120 fps conversion they take the last rendered frame, look at the current motion data, and then reproject it into the new position.

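A minimal sketch of what "reprojecting the last frame into a new position" could look like on a single scanline. This assumes a linear angle-to-pixel mapping (real lens distortion makes it non-linear), and the FOV/resolution numbers are illustrative, not Sony's actual values:

```python
# Yaw-only reprojection sketch on one scanline.
# Assumed numbers, not documented hardware values:
FOV_DEG = 100.0               # assumed horizontal field of view
WIDTH_PX = 960                # assumed horizontal resolution per eye
PX_PER_DEG = WIDTH_PX / FOV_DEG

def reproject_scanline(scanline, yaw_delta_deg, fill=0):
    """Shift the last rendered scanline by the head's yaw change.

    Pixels that scroll in from beyond the rendered frame have no
    data, so they are filled with a placeholder value.
    """
    shift = round(yaw_delta_deg * PX_PER_DEG)
    if shift == 0:
        return list(scanline)
    if shift > 0:   # head turned right: image slides left
        return list(scanline[shift:]) + [fill] * shift
    # head turned left: image slides right
    return [fill] * (-shift) + list(scanline[:shift])

row = list(range(10))
print(reproject_scanline(row, yaw_delta_deg=0.3))
```

The `fill` pixels are exactly the "missing data" problem discussed below: the shifted-in edge has nothing rendered behind it.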


But what happens if the current head movement is too fast?

edit - well, at 120 Hz, that's just 8.3 ms of time left for movement before the "fake" frame needs to be inserted. Maybe that's a small enough window to make this type of reprojection viable in most situations.
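The 8.3 ms figure is easy to sanity-check: even at fairly brisk (assumed) head-turn speeds, the head only rotates a few degrees per 120 Hz frame.

```python
# How far can the head rotate in one 120 Hz frame?
# The angular velocities are illustrative guesses, not measured data.
frame_time_ms = 1000.0 / 120.0          # ~8.33 ms per frame at 120 Hz

for deg_per_sec in (90, 300, 600):      # leisurely, brisk, very fast turn
    deg_per_frame = deg_per_sec * frame_time_ms / 1000.0
    print(f"{deg_per_sec:>4} deg/s -> {deg_per_frame:.2f} deg per 120 Hz frame")
```

At around 300 deg/s the head moves roughly 2.5 degrees per frame, which lines up with the "2 or 3 degrees" estimate later in the thread.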
 
Well... not quite. The multiplayer of KZSF rendered at 960x1080, then that render was stretched into 1080i, and the "holes where interlacing was supposed to be" were filled with pixel data collected from previous frames.
http://www.eurogamer.net/articles/2...-for-killzone-failing-to-deliver-native-1080p

KZSF was filling holes using temporal data; Morpheus reprojection is moving the entire last frame into a new position on screen.

I suppose this Morpheus tactic works well when the user is rotating his head, but what happens if a big object is approaching him [or moving away] and the user is not moving his head? Can the interpolation engine only manipulate the entire last frame, or can it also locate and manipulate individual objects as needed [zoom the approaching box/car/object and leave the rest of the frame untouched]?
 
Ok, so during the 60→120 fps conversion they take the last rendered frame, look at the current motion data, and then reproject it into the new position.

But what happens if the current head movement is too fast?
It'll fail. You can't reproject data that isn't there. On a fast head turn, you'll be looking at space that hasn't been rendered yet. Maybe the solution is to motion blur that frame and draw the next one? Or only update at 60 Hz when there's no suitable data available.
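The fallback described above (reproject when there's enough data, otherwise blur or hold at 60 Hz) could be sketched like this. The 5-degree threshold is a made-up number for illustration, not a documented Sony value:

```python
# Hypothetical frame-fill decision for the intermediate 120 Hz frame.
MAX_REPROJECT_DEG = 5.0   # assumed limit before missing data shows

def choose_strategy(yaw_delta_deg):
    """Pick how to fill the intermediate 120 Hz frame."""
    if abs(yaw_delta_deg) <= MAX_REPROJECT_DEG:
        return "reproject"          # shift the last frame into place
    return "blur-or-repeat"         # too fast: no data to shift in

print(choose_strategy(2.0))   # small head turn, within the assumed limit
print(choose_strategy(9.0))   # very fast turn, fall back
```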
 
It looks like they reproject ALL frames, not just the intermediary ones. That's what gives the less-than-18 ms lag.

8.3 ms is incredibly short; the head can't move much more than 2 or 3 degrees in that time. So that's like a missing strip of 20 or 30 pixels, and it's far into our peripheral vision. They can also do motion prediction to arrive dead on.

Could they render a bit wider to have 20 or 30 pixels of margin for the reprojection, or even literally render that strip on the spot after the reprojection?
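Checking the arithmetic in the post above: with an assumed linear mapping of 100 degrees onto 960 pixels, a 2-3 degree turn exposes roughly a 20-30 pixel strip.

```python
# Strip width exposed by a small head turn, under an assumed
# linear 100-degree / 960-pixel mapping (real optics are non-linear).
px_per_deg = 960 / 100.0                     # ~9.6 px per degree

for turn_deg in (2.0, 3.0):
    strip_px = turn_deg * px_per_deg
    print(f"{turn_deg} deg turn -> ~{strip_px:.0f} px strip")
```

Rendering that many extra pixels of margin on each side would be a small overscan cost relative to the full frame.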
 
Here's an interview with Ken (Burwell?) from Valve discussing the Vive. I have not had a chance to view the full thing myself, yet.

8.3 ms is incredibly short; the head can't move much more than 2 or 3 degrees in that time.

I could be (and probably am) completely misunderstanding, but I believe Valve's recommendation is "about 5 degrees" (for presumably an 11.11 ms frame update, so a bit longer than the 8.3 ms you mentioned).

 
Cool, 5 degrees at 11.1ms is about 3.7 degrees at 8.3ms.

So assuming 100 degrees for 960 pixels:
35 pixels on each side if it were a stereographic fisheye (equal density at the edges as at the center).
But it's probably even less, since the pixels are coarser at the edges. It's really not a big overhead.
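Re-deriving those numbers: scale Valve's "about 5 degrees" from an 11.1 ms (90 Hz) frame down to an 8.3 ms (120 Hz) frame, then convert to pixels with the same assumed 100-degree / 960-pixel mapping.

```python
# Scale the per-frame head rotation from a 90 Hz frame to a 120 Hz
# frame, then convert degrees to pixels (assumed linear mapping).
deg_at_90hz = 5.0                                      # Valve's figure
deg_at_120hz = deg_at_90hz * (1000 / 120) / (1000 / 90)   # scale by frame time
margin_px = deg_at_120hz * 960 / 100.0                 # assumed 9.6 px/deg

print(f"{deg_at_120hz:.2f} deg -> ~{margin_px:.0f} px margin per side")
```

That gives about 3.75 degrees and roughly 36 pixels per side, matching the "3.7 degrees" and "35 pixels" estimates above.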
 
Confirmed: no cable-free version of the Morpheus. I'm concerned about the connector that has to go into the PS4 for this. Plugging my controller into the USB port is already worrying; I don't want to ruin the socket. And you're supposed to be flailing around with the Morpheus, too.
 
Confirmed: no cable-free version of the Morpheus. I'm concerned about the connector that has to go into the PS4 for this. Plugging my controller into the USB port is already worrying; I don't want to ruin the socket. And you're supposed to be flailing around with the Morpheus, too.
The headset connects into a small external module, and that module connects into the PS4.
 
Does the module use USB or the PlayStation Eye connection? I suppose it must use USB, or else it has a port on the module you can plug the PlayStation Eye into.
 
Does the module use USB or the PlayStation Eye connection? I suppose it must use USB, or else it has a port on the module you can plug the PlayStation Eye into.
Front of the module: a single cable to the headset.
Back of the module: HDMI IN, HDMI OUT, USB Type-B (I think?), and a power supply connector.
 
Does the module use USB or the PlayStation Eye connection? I suppose it must use USB, or else it has a port on the module you can plug the PlayStation Eye into.

If I recall correctly, the module (processor unit, PU) has HDMI in, HDMI out, and a USB connection: the PS4's HDMI output goes to the PU's HDMI in; HDMI out from the PU goes to the TV (for "social viewing", as it outputs a corrected, un-distorted view of one of the viewpoints); USB runs from the PU to the PS4; and the PS Camera connects to the PS4 directly.

EDIT

Beaten by Mr Fox
 
Wasn't it Sony that championed wireless HDMI? Surely this is the best place to start? Or am I getting confused? How many wires are coming out of this bloody thing?
 
From the close-up images, it looks like either the Fresnel rumor was false, or the Fresnel pitch is extremely fine on both Morpheus and Oculus.

The only VR headset that clearly shows Fresnel lines is the Vive.

The Oculus has a lot of radial flaring and CA from the outside; I could believe this is a very fine Fresnel, or coarse diffractive optics?

Morpheus looks like it might have an AR coating? Or is it just the lighting playing tricks on me?


Morpheus:
[close-up photo]
Latest Oculus:
[close-up photo]
Valve Vive:
[close-up photo]
 
Wasn't it Sony that championed wireless HDMI? Surely this is the best place to start? Or am I getting confused? How many wires are coming out of this bloody thing?
They said 120 Hz 1080p is not possible right now with the first version of wireless HDMI.

There's also the problem of reaching a low target price with such expensive technology. Wireless HDMI adapters are currently $200, and they would also need a battery pack, and still have a cable going from the headset to a belt-clipped battery, like they did with the HMZ-T3W.

There's only one wire going from the module to the headset; the supplied cable is supposed to be 15 feet.
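Rough raw-bandwidth arithmetic suggests why 1080p at 120 Hz is hard for a wireless link: uncompressed 24-bit video at that rate is several gigabits per second before blanking intervals and link overhead are even counted.

```python
# Raw (uncompressed) pixel bandwidth for 1080p at 120 Hz, 24-bit colour.
# Blanking intervals and protocol overhead are ignored for simplicity.
width, height, fps, bits_per_px = 1920, 1080, 120, 24

gbps = width * height * fps * bits_per_px / 1e9
print(f"~{gbps:.1f} Gbit/s of raw pixel data")
```

That works out to roughly 6 Gbit/s of payload alone, which is a lot to carry wirelessly at a consumer price point.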
 
Sony are so lame. Morpheus should be powered by the imagination of the wearer.

Cables!?! What is this, the 1930s?
 
Cool, 5 degrees at 11.1 ms is about 3.7 degrees at 8.3 ms.

So assuming 100 degrees for 960 pixels:
35 pixels on each side if it were a stereographic fisheye (equal density at the edges as at the center).
But it's probably even less, since the pixels are coarser at the edges. It's really not a big overhead.
Yeah, the outside pixels can probably just be stretched, so it shouldn't be an issue. The answer to the original question, "what if your head moves too fast?", is "it can't" ;)
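The edge-stretch idea could look like this on a scanline: clamp (repeat) the outermost rendered pixel into the exposed gap instead of leaving it blank. Purely illustrative; the actual hardware's fill strategy isn't public.

```python
# Shift a scanline and stretch the edge pixel into the exposed gap.
def reproject_with_clamp(scanline, shift_px):
    """Shift a scanline by shift_px, clamping the edge pixel into the gap."""
    if shift_px == 0:
        return list(scanline)
    if shift_px > 0:    # head turned right: stretch the right edge
        return list(scanline[shift_px:]) + [scanline[-1]] * shift_px
    # head turned left: stretch the left edge
    return [scanline[0]] * (-shift_px) + list(scanline[:shift_px])

print(reproject_with_clamp([10, 20, 30, 40, 50], 2))
```

Since the exposed strip sits far into peripheral vision, a stretched edge like this is much less noticeable than a black band would be.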
 