Sony VR Headset/Project Morpheus/PlayStation VR

Why are we assuming they are sending only 5.1/7.1 and downmixing?

Because it gets them the effect they're after with minimal complexity and cost, and doesn't require any changes at the application/engine level for developers. Getting HRTF audio localization to sound 'right' is a matter of 'best fit' because it's entirely dependent on the user's physiology, so developing an all-new audio stack with no guarantee of sounding more accurate would be a waste of time and money. The question isn't 'Why are they only doing X?', but rather 'Why would they do more than X?'
 
Hypothetically, what if the box had a sophisticated calibration system to fine-tune to the user? Then each game could sound 'perfect' without the devs having to implement per-user 3D audio processing themselves. I suppose that could be moved to the OS portion of the system operation. Maybe VR is part of the reason for Sony's OS reservations?

But yes, sending hundreds of audio streams to an external box to process would be something special.
 
Well that seems to say quite clearly that 90fps is preferable to 60fps re-projected to 120fps.
What it says quite clearly is:
- There are very few reasons not to use reprojection; it should normally be always on
- 60->120 is the best compromise for frame time and latency
- The situations that cause artifacts are well known
- 60->120 feels very close to native 120 depending on game design

Therefore if your game design doesn't trigger artifacts, 60->120 feels very close to native 120. It is game dependent.

Also, PSVR scanout from a 90Hz render remains 8ms, compared to 11ms from a native 90Hz headset. That makes it a very good target, no need to fuss with avoiding reprojection artifacts, and easy for ports.
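To make those numbers concrete, here's a back-of-the-envelope sketch in Python (the 8ms and 11ms figures are just 1/120 and 1/90 rounded; the assumption that PSVR always scans out at 120Hz pixel-clock speed comes from this thread, not an official spec):

```python
# Rough motion-to-photon budget: render time + scanout time, in ms.
# Illustrative numbers only, based on the figures quoted in this thread.

def scanout_ms(clock_hz):
    """Time to push one full frame to the panel at a given pixel-clock rate."""
    return 1000.0 / clock_hz

render_90 = 1000.0 / 90                    # ~11.1 ms frame time at 90 fps
psvr      = render_90 + scanout_ms(120)    # panel scans at 120 Hz speed: ~8.3 ms
native_90 = render_90 + scanout_ms(90)     # panel limited to 90 Hz: ~11.1 ms

print(f"PSVR, 90 Hz render: {psvr:.1f} ms")              # ~19.4 ms
print(f"native 90 Hz HMD:   {native_90:.1f} ms")         # ~22.2 ms
print(f"difference:         {native_90 - psvr:.1f} ms")  # ~2.8 ms
```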
 
The box would be doing "3d audio" in the sense of mixing down the multi-channel output into a virtual surround stereo mix. There's really nothing terribly interesting going on there (filters, pans, etc) - that's the same thing that's done with most USB surround headphones, dolby headphone, etc. It doesn't require much horsepower. The really intensive part of sound is the environment propagation aspect and that can't be offloaded to an external box.
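For the curious, the 'filters and pans' part really is just convolving each speaker feed with a pair of head-related impulse responses (HRIRs) and summing into left and right, which is why it needs so little horsepower. A minimal sketch; the channel layout and HRIR data here are hypothetical placeholders, not anything Sony has published:

```python
import numpy as np
from scipy.signal import fftconvolve

def binaural_downmix(channels, hrirs):
    """Fold a multi-channel mix down to virtual-surround stereo.

    channels: dict of speaker name -> mono float array (e.g. 'FL', 'FR', ...)
    hrirs:    dict of speaker name -> (left_ir, right_ir), the head-related
              impulse responses measured for that speaker's direction
    """
    ir_len = max(max(len(l), len(r)) for l, r in hrirs.values())
    n = max(len(sig) for sig in channels.values()) + ir_len - 1
    left, right = np.zeros(n), np.zeros(n)
    for name, sig in channels.items():
        l_ir, r_ir = hrirs[name]
        left[:len(sig) + len(l_ir) - 1]  += fftconvolve(sig, l_ir)
        right[:len(sig) + len(r_ir) - 1] += fftconvolve(sig, r_ir)
    return np.stack([left, right])  # stereo signal for headphones
```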


There's really no point in doing that in the external box since the PS4 has dedicated DSPs for binaural audio and the HDMI can simply send a 2-channel PCM signal.

What I think the box does have is a high-end upscaler+interpolator (e.g. their relatively new X1 chip) and perhaps something that will do post-processing anti-aliasing (MLAA?) so it doesn't have to tax the console at all. Maybe the anti-aliasing part could be done using an FPGA, for example, without having to order custom chips.
 
A box that ups the frame rate and does post-AA. Upping the frame rate without bad artefacts is heavy though, and combined with post-AA... maybe that's why the box is rather big.
 
What it says quite clearly is:
- There are very few reasons not to use reprojection; it should normally be always on

Where did he say this? They certainly say reprojection is a requirement at 60fps, but I don't recall them saying it should be used at 90 and 120 fps as well, merely that it is an option at those frame rates.

- 60->120 is the best compromise for frame time and latency

That's obviously dependent on both the game and hardware, so not really a point for or against any specific frame rate at this point.

- 60->120 feels very close to native 120 depending on game design

And yet despite stating this, they still go on in a later slide to emphasise that you should aim for 90fps (or more) to start with, and only if you are unable to reach that, settle for 60fps with reprojection. That to me is a pretty clear indication that 90fps is the preferable option of the two if you can afford the frame time. I admit though there does seem to be some contradiction between the two statements.

Also, PSVR scanout from a 90Hz render remains 8ms, compared to 11ms from a native 90Hz headset. That makes it a very good target, no need to fuss with avoiding reprojection artifacts, and easy for ports.

Weren't you saying a few weeks back that rendering at a non-native refresh rate and then re-projecting to native would be sub-optimal if the refresh wasn't an integer multiple of the frame rate?

This was the reason you gave for PCs not being able to render at 60fps and reproject to 90fps, wasn't it?
 
Where did he say this? They certainly say reprojection is a requirement at 60fps, but I don't recall them saying it should be used at 90 and 120 fps as well, merely that it is an option at those frame rates.
I've definitely seen Shuhei Yoshida state categorically that reprojection always runs even with a native 120Hz framerate, but I can't remember which interview and the man does a lot of them. Try last year's hour-long GDC presentation.

 
Weren't you saying a few weeks back that rendering at a non-native refresh rate and then re-projecting to native would be sub-optimal if the refresh wasn't an integer multiple of the frame rate?

This was the reason you gave for PCs not being able to render at 60fps and reproject to 90fps, wasn't it?
Yes, and the rule still applies.

The display would be 90Hz reprojected to 90Hz, but the latency of the scan out would be the 120Hz latency (which is 8ms), as it gets to the display faster and the panel gets the full image faster, then a 3ms blanking interval waiting for the next frame. So instead of having 11ms+11ms minimum latency (Oculus and HTC), they get 11ms+8ms for render time and scan out time.

This only works if they have designed the OLED panel driver to accept an arbitrary blanking interval without changing the pixel clock, and I don't see why they wouldn't have.

And yet despite stating this, they still go on in a later slide to emphasise that you should aim for 90fps (or more) to start with, and only if you are unable to reach that, settle for 60fps with reprojection. That to me is a pretty clear indication that 90fps is the preferable option of the two if you can afford the frame time. I admit though there does seem to be some contradiction between the two statements.
There's no contradiction when you understand it doesn't always work, and fixing the flaws requires a change in game design. So "if you can reach 90" has fewer surprises.

We know it's related to the head movement speed and virtual world velocities. He mentions the most "twitchy" games requiring 120 reprojected to 120. There's nothing with such a low input-to-photon latency. You can see in the graphs it's incredibly low, it's about 8ms+8ms.
Where did he say this? They certainly say reprojection is a requirement at 60fps, but I don't recall them saying it should be used at 90 and 120 fps as well, merely that it is an option at those frame rates.
He said it when explaining the low overhead: there are few reasons not to turn it on because the advantages far outweigh the disadvantages. Shuhei Yoshida said the same thing.
 
I've definitely seen Shuhei Yoshida state categorically that reprojection always runs even with a native 120Hz framerate, but I can't remember which interview and the man does a lot of them. Try last year's hour-long GDC presentation.


This would seem to suggest that's not the case though:

[attached slide image]


There's no contradiction when you understand it doesn't always work, and fixing the flaws requires a change in game design. So "if you can reach 90" has fewer surprises.

We know it's related to the head movement speed and virtual world velocities. He mentions the most "twitchy" games requiring 120 reprojected to 120.

The (slight) contradiction comes from the fact that they imply you should always aim for 90fps over 60fps + reprojection, while also saying that 60fps + reprojection feels very much like native 120fps depending on game design. Yet on the later slide they don't say, "aim for 60fps + reprojection if your game won't have artefacts from reprojection, and otherwise aim for 90fps", which implies that despite the earlier comment, 90fps is always the better option. You can argue there is no contradiction there, but that implies accepting that 90fps (with or without reprojection) feels closer to a native 120fps than 60fps reprojected to 120fps does.

Yes, and the rule still applies.

The display would be 90Hz reprojected to 90Hz, but the latency of the scan out would be the 120Hz latency (which is 8ms), as it gets to the display faster and the panel gets the full image faster, then a 3ms blanking interval waiting for the next frame. So instead of having 11ms+11ms minimum latency (Oculus and HTC), they get 11ms+8ms for render time and scan out time.

This only works if they have designed the OLED panel driver to accept an arbitrary blanking interval without changing the pixel clock, and I don't see why they wouldn't have.

I must be misunderstanding something here, because I don't see why, when both systems are rendering 90fps and both are outputting those 90fps over 90 screen refreshes every second, one would have a lower display latency than the other. Both systems display a new frame every 11ms, with said frame being updated via reprojection just before the refresh.
 
I must be misunderstanding something here, because I don't see why, when both systems are rendering 90fps and both are outputting those 90fps over 90 screen refreshes every second, one would have a lower display latency than the other. Both systems display a new frame every 11ms, with said frame being updated via reprojection just before the refresh.
Display frame rate is still 90Hz, render time is still 11ms. But the delay between the moment the frame begins transmission and the moment the user gets the photons to his eyes is 3ms shorter than on a display that cannot refresh faster than 90Hz. It's the pixel clock that matters.
 
Display frame rate is still 90Hz, render time is still 11ms. But the delay between the moment the frame begins transmission and the moment the user gets the photons to his eyes is 3ms shorter than on a display that cannot refresh faster than 90Hz. It's the pixel clock that matters.

Yeah, I realised the scan out is still 90Hz (from the table above), so I edited my post before you responded.
 
One takes 11ms to get the frame buffer to the entire display (because if it could get it there faster, the max frame rate would be higher; it's a hardware limitation of the maximum pixel clock).
The other takes 8ms to get it to the display and waits 3ms of blanking (because we know the pixel clock can be high enough).

I don't know how else to explain it. Variable-sync monitors probably work the same way.
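Making the arithmetic explicit might help; a tiny sketch, under the same assumption as above that the panel's pixel clock supports 120Hz scanout:

```python
# One 90 Hz frame period split into active scanout + blanking, in ms.
frame_period = 1000.0 / 90    # ~11.1 ms between frame starts at 90 Hz
active_scan  = 1000.0 / 120   # ~8.3 ms to light up the whole panel
blanking     = frame_period - active_scan
print(f"blanking interval: {blanking:.1f} ms")  # ~2.8 ms, the '3ms' above
```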
 
I've definitely seen Shuhei Yoshida state categorically that reprojection always runs even with a native 120Hz framerate, but I can't remember which interview and the man does a lot of them.
That makes sense. Reprojection is just a small shift relative to head movement. Rather than use the pose data from when the frame started 8 ms ago, use the current (couple of ms old) position data to adjust the display. And it saves having to switch it on and off as well, so it's a simplification of the whole system.

The slides suggest reprojection can be disabled though. If they are newer than the interview you mention (which may predate the 90Hz option), things may have changed.
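For anyone unfamiliar with what that 'small shift' looks like in practice, here's a toy sketch; the yaw-only simplification and all the names are mine, not anything from Sony's SDK (a real implementation would render with overscan margins or do a full per-pixel reprojection):

```python
import numpy as np

def reproject(frame, yaw_at_render, yaw_now, fov_deg):
    """Toy rotational reprojection: shift the rendered image sideways by
    the head-yaw delta accumulated between render time and scanout.

    frame:   (H, W, 3) array, the image as rendered
    yaw_*:   head yaw in degrees at render time and just before scanout
    fov_deg: horizontal field of view covered by the frame
    """
    w = frame.shape[1]
    delta = yaw_now - yaw_at_render           # how far the head has turned
    shift = int(round(delta / fov_deg * w))   # degrees -> pixels
    # np.roll wraps content around at the edges; a real system would have
    # rendered extra margin so the revealed strip contains valid pixels.
    return np.roll(frame, -shift, axis=1)
```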
 
Maybe there are special cases where it's causing some artifacts not visible in native 90 or 120?
Or for some types of games it doesn't change anything, and they'd want it off as it does take a little bit of GPU time.
 
I've been wondering, have there actually been any demos for VR where the entire experience is on rails?

I'm thinking if you had a VR game in first person, where you could control only the character's head movement with 1:1 tracking, but the body and traversal was on rails (e.g. Time Crisis), would that induce sickness in the player?
 
I've been wondering, have there actually been any demos for VR where the entire experience is on rails?

I'm thinking if you had a VR game in first person, where you could control only the character's head movement with 1:1 tracking, but the body and traversal was on rails (e.g. Time Crisis), would that induce sickness in the player?

Anything with acceleration has the potential to cause sickness, so on-rails experiences that are composed mostly of consistent single-vector motion (Epic's Showdown, Senza Peso, etc.) are 100% fine, while things like rollercoasters have the potential to not be fine over time. For myself, I would say that on-rails tends to be a lot easier on the stomach than analog stick movement, because the starting, stopping, and turning tends to be a lot less frequent and more robotic/deliberate than what you naturally end up doing when you have full control.
 
One takes 11ms to get the frame buffer to the entire display (because if it could get it there faster, the max frame rate would be higher; it's a hardware limitation of the maximum pixel clock).
The other takes 8ms to get it to the display and waits 3ms of blanking (because we know the pixel clock can be high enough).

I don't know how else to explain it. Variable-sync monitors probably work the same way.

In reading up on this I came across a point on the Oculus site (from John Carmack no less) that seems relevant to this thread:

John Carmack said:
– A subtle latency point is that most displays present an image incrementally as it is scanned out from the computer, which has the effect that the bottom of the screen changes 16 milliseconds later than the top of the screen on a 60 fps display.

– This is rarely a problem on a static display, but on a head mounted display it can cause the world to appear to shear left and right, or “waggle” as the head is rotated, because the source image was generated for an instant in time, but different parts are presented at different times. This effect is usually masked by switching times on LCD HMDs, but it is obvious with fast OLED HMDs.

I wonder how susceptible PSVR will be to this, given it uses a fast-switching OLED display (compared with the relatively slower 5ms response time on the Rift).
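To put a rough number on that 'waggle' for PSVR's timings (the head speed is an arbitrary example, and the 120Hz-speed scanout assumption is the same one used earlier in the thread):

```python
# Worst-case image shear between top and bottom scanlines during a head turn.
head_speed_deg_s = 200           # a brisk head turn, illustrative value only
scanout_s = 1.0 / 120            # ~8.3 ms to scan the whole PSVR panel
shear_deg = head_speed_deg_s * scanout_s
print(f"top-to-bottom shear: {shear_deg:.1f} degrees")  # ~1.7 degrees
```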
 
John Carmack works for Oculus, whose Rift uses an LCD screen, and talks down the faster-response tech of a competitor's more modern display type.

Can't exactly say I'm shocked. I guess no future version of the Rift will use an OLED display then.
 