Sony VR Headset/Project Morpheus/PlayStation VR

60Hz is the low bar, which is why most are targeting 90Hz (Valve, Oculus) or 120Hz (Morpheus).

Yah, I'm just saying they've got to be really careful about framerate. It's kind of crazy how fast the motion sickness hits you. It's not a gradual thing. It's instant. Anything below 60Hz is definitely a bad idea. Sounds like some of the Morpheus games will be 60Hz, and they have to be spot on. The good thing is, on a console, you can make sure the game is perfect.
 
Morpheus is reprojected to 120 Hz. As long as the head tracking is fast enough, it should be okay even if the in-game framerate is a little off.
 
Morpheus is reprojected to 120 Hz. As long as the head tracking is fast enough, it should be okay even if the in-game framerate is a little off.
Yup, it's a laughably simple solution too. Even if the interpolated frames are largely just slightly rotated to compensate for slight head movements, that will greatly reduce the nausea that typically results from a clash of senses.
 
I wonder how the brain reacts with the translations being at 60 and rotations at 120.

Time warping the image to take into account the in-game movements (like a car moving at high speed) can't be trivial.
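My rough mental model of why the translation half is the hard part, in toy Python/numpy (purely illustrative, nothing to do with how Sony's library actually works): a rotation-only warp is one image-space transform for every pixel, while warping for translation needs per-pixel depth and leaves holes where previously hidden surfaces come into view.

```python
import numpy as np

def rotational_reproject(pixel_xy, K, R_delta):
    """Rotation-only timewarp: a single homography moves every pixel.

    K is the 3x3 camera intrinsics, R_delta the rotation taking
    old-camera coordinates to new-camera coordinates.  No depth
    needed, no holes to fill.
    """
    H = K @ R_delta @ np.linalg.inv(K)
    p = H @ np.array([pixel_xy[0], pixel_xy[1], 1.0])
    return p[:2] / p[2]

def positional_reproject(pixel_xy, depth, K, R_delta, t_delta):
    """Translation-aware reprojection: needs the depth of every pixel.

    Near objects shift more than far ones, and surfaces that were
    hidden in the old frame have no colour data at all, which is why
    warping for in-game motion (the fast car) is the hard case.
    """
    ray = np.linalg.inv(K) @ np.array([pixel_xy[0], pixel_xy[1], 1.0])
    point = ray * depth                  # back-project into camera space
    point = R_delta @ point + t_delta    # apply the pose change
    p = K @ point
    return p[:2] / p[2]
```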
 
This is really turning creepy.

Is this an extract from eastman's guide to strip clubs? Where can I buy the full guide?

This thread has turned very weird very quickly :yep2:
Nah, Barney's guide to strip clubs. Watch How I Met Your Mother and suit up!
 
I'm definitely getting the final consumer version of Oculus, but I'm worried about having to fiddle with graphics settings with every PC game. My GTX 780 should be OK, but setting the graphics options for a stable high frame rate always needs fiddling to find the best compromise. I hope the VR game engines on PC will be designed to avoid this and adjust everything automatically.

I expect GeForce Experience will give this option. It already does for non-VR gaming, so I expect the option will be added once VR takes off.
 
How does the 120Hz work on Morpheus? I don't expect the PS4 is outputting 120 unique frames per second, at least not in every VR game.
Apparently it does output 120fps for real, but they said they really don't expect many games to push native 120fps unless they are really simple. So most games would render at 60fps and use Sony's reprojection library to reach 120fps.
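If it works the way it sounds, the frame pacing would be roughly this (every name below is a made-up stand-in so the sketch runs, not Sony's actual reprojection API):

```python
import time

# Made-up stand-ins so the sketch runs; the real tracker/renderer/
# compositor obviously look nothing like this.
def latest_head_pose():       return {"yaw": 0.0, "pitch": 0.0}
def render_game_frame(pose):  return {"pose": pose, "pixels": None}
def reproject(frame, to_pose):
    # Re-warp the old frame's pixels to the newest head pose
    # (rotation-only in the simple case).
    return {"pose": to_pose, "pixels": frame["pixels"]}
def present(frame):           pass
def wait_for_vsync(hz=120):   time.sleep(1.0 / hz)

# One second of 120 Hz output from a game that only renders at 60 fps.
for _ in range(60):
    pose = latest_head_pose()
    frame = render_game_frame(pose)      # new content, ~16.6 ms budget
    present(frame)                       # scanout 1 of 2
    wait_for_vsync()

    # The in-between frame: no new simulation or animation, just the
    # previous image warped to the freshest tracking sample.
    present(reproject(frame, latest_head_pose()))
    wait_for_vsync()
```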
 
I wonder how the brain reacts with the translations being at 60 and rotations at 120.
Time warping the image to take into account the in-game movements (like a car moving at high speed) can't be trivial.
I presume the full motion is reprojected. A forwards motion only needs the in-between frames to be upscaled ever so slightly.
How does the 120Hz work on Morpheus?
You take a 60 fps game and interpolate frames for head movements. If the player turns their head, you offset the previous frame slightly to the side before rendering the next frame. So you get 60 fps of new content and in-game animation, but 120 fps for motion tracking and player 'grounding'. As long as the motion is high frequency, it should work. You could have a VR Minecraft experience with two-frame animations of cardboard cutouts, and it'd still be immersive and believable as long as the camera is updated quickly and accurately enough.
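The "offset the previous frame slightly" part really is about that simple for small head turns. Something like this, in toy numpy (assumed pinhole optics, sign conventions hand-waved):

```python
import numpy as np

def yaw_to_pixel_shift(delta_yaw_rad, focal_length_px):
    """How far (in pixels) the image content moves for a small head yaw.

    For a pinhole projection this is f * tan(delta_yaw); for the tiny
    angle between two 120 Hz samples it is essentially linear.
    """
    return focal_length_px * np.tan(delta_yaw_rad)

def reproject_previous_frame(frame, delta_yaw_rad, focal_length_px):
    """Slide the previous frame sideways to compensate the head turn.

    `frame` is an H x W x 3 array.  np.roll wraps pixels around at the
    border, which a real compositor would instead fill or vignette away;
    that edge smear is one of the classic timewarp artefacts.
    """
    shift = int(round(yaw_to_pixel_shift(delta_yaw_rad, focal_length_px)))
    return np.roll(frame, -shift, axis=1)
```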
 
I presume the full motion is reprojected. A forwards motion only needs the in-between frames to be upscaled ever so slightly.
You'd need a smart solution to prevent judder/jitter on any HUD, dashboard or cockpit elements, but there's no reason Sony can't make their APIs smart.
 
Apparently it does output 120fps for real, but they said they really don't expect many games to push native 120fps unless they are really simple. So most games would render at 60fps and use Sony's reprojection library to reach 120fps.

When did they say this?
 
Does it even need to be said? Stable 60fps is such a rarity as it is.

It doesn't have to be said, but I remember reading Richard Marks or someone saying that most games will be 60 FPS, though they hope devs will go for 120 FPS.



Edit:

The thing with Morpheus being the only VR headset on console is that devs can set their own goals around the PS4 hardware. So if they feel that 120fps is the most important thing then that's what they can go for and not worry about what it looks like compared to what's on the other system.
 
You're right, I remembered things that were being paraphrased a bit too much. The actual quote from Yoshida is "Developers building games for VR are encouraged to target 120 FPS. But for those looking for high graphics, we offer the option to render games at 60 FPS but output at 120 FPS. For these titles, developers can use a technique called reprojection to take the latest sensor data. We have a few demos using this technique today and I'll bet you'll be impressed with the results."

http://www.reddit.com/r/PS4/comments/2xx83v/project_morpheus_question_about_120_fps/
 
You'd need a smart solution to prevent judder/jitter on any HUD, dashboard or cockpit elements, but there's no reason Sony can't make their APIs smart.

Would they not just enforce display planes for VR content, game on one, UI on the other? Then it's crystal clear what gets reprojected and what does not. But will VR go the way of static floating HUDs?

It's probably a good thing that now game design ideas seem to be the constraint and not so much the technology. It's a shame Nintendo are not on board, as they always seem to find the fun.
 
Would they not just enforce display planes for VR content, game on one, UI on the other? Then it's crystal clear what gets reprojected and what does not. But will VR go the way of static floating HUDs?
Are both display planes available to developers on PS4? Assuming they are, what does this complexity - some parts of the 3D environment rendered to one plane, other parts to another - do to the rendering pipeline?
 
Are both display planes available to developers on PS4? Assuming they are, what does this complexity - some parts of the 3D environment rendered to one plane, other parts to another - do to the rendering pipeline?

I thought you were talking about 2D floating health bars and such, not parts of the 3D world.

I have no technical knowledge about what is or is not available, but I assume they must have a plan to segregate them, as I thought they said they were experimenting with games that fed one thing to the headset but showed something else on the screen to allow a more social experience. If that is the case, would that require (and prove there are) two accessible display planes?
 
I thought you were talking about 2D floating health bars and such, not parts of the 3D world.
I did mention HUDs, which I think we'll see less of in VR unless they are better thought out to be less immersion-breaking, but I was really thinking of dashboards (think GTA V in first-person mode) and cockpits (think Elite Dangerous).

I have no technical knowledge about what is or is not available, but I assume they must have a plan to segregate them, as I thought they said they were experimenting with games that fed one thing to the headset but showed something else on the screen to allow a more social experience. If that is the case, would that require (and prove there are) two accessible display planes?
I don't think it's known, but Sony may have reserved one of the two display planes for the system. Stupid NDAs :yep2:

Sebbbi - blink twice if I'm right.
 
Would they not just enforce display planes for VR content, game on one, UI on the other? Then it's crystal clear what gets reprojected and what does not. But will VR go the way of static floating HUDs?
The display planes combine for the video out. There's only one signal down the HDMI, which is the final composite. But I guess if they use HDMI+Ethernet, they can send the UI layer separately.
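Conceptually the HUD judder worry from earlier falls out for free if only the game plane gets warped and the UI plane is left alone before the merge. Toy version (made-up numpy, just to show the idea, not the PS4 compositor):

```python
import numpy as np

def warp_game_plane(game_plane, head_shift_px):
    """Crude rotation-only reprojection of the game layer: slide it sideways."""
    return np.roll(game_plane, -head_shift_px, axis=1)

def compose_for_scanout(game_plane, ui_plane, ui_alpha, head_shift_px):
    """Reproject only the game plane, keep the UI plane put, then do a
    normal alpha-over merge.  The single merged image is the only thing
    that goes down the HDMI cable.

    game_plane / ui_plane: H x W x 3 floats in [0, 1]
    ui_alpha:              H x W x 1 coverage of the UI layer
    """
    warped = warp_game_plane(game_plane, head_shift_px)
    return ui_alpha * ui_plane + (1.0 - ui_alpha) * warped
```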
 