Sony VR Headset/Project Morpheus/PlayStation VR

Which is why I thought it would have been good if O.R. had been able to target 144Hz with 72Hz reprojection. More tolerant of performance drops and all that, innit.
HDMI bandwidth is the limitation I think.
 
The judder they're talking about is from animated elements within the scene. For those things, yes of course you'll get motion artifacts when you have an irregular rendering rate. The head tracking however is not disturbed by that with positional reprojection because you have the tracking information and necessary scene information to construct a frame. It's POV interpolation, not scene interpolation.

If you look at a static cubemap in VR you're essentially looking at a 0 FPS scene that's reprojected to whatever your display rate is. If it were an animated cubemap (say, a room with an object in it moving back and forth), an irregular animation rate will result in the object not being tracked consistently by your eye, resulting in a judder artifact similar to the one produced by full-persistence displays.
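To make the POV-interpolation point concrete, here's a toy sketch (yaw only, with made-up function names) of what rotational reprojection does: the pixel's world direction is known from the pose at render time, so its new screen position can be computed for the current pose without re-rendering any scene content. That's why head tracking stays smooth while in-scene animation keeps its stale positions.

```python
def reproject_direction(pixel_yaw_at_render, head_yaw_at_render, head_yaw_now):
    """Rotational reprojection, yaw only: a pixel rendered when the head was
    at head_yaw_at_render stays fixed in the world, so its on-screen angle
    is recomputed for the current pose. No scene data is re-rendered."""
    world_yaw = head_yaw_at_render + pixel_yaw_at_render  # fixed in the world
    return world_yaw - head_yaw_now                       # new screen angle

# Head turned 5 degrees since the frame was rendered: a pixel that was
# dead centre must now be drawn 5 degrees the other way.
assert reproject_direction(0.0, 30.0, 35.0) == -5.0
```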

I'm not arguing that constantly hovering between 70-90fps on a 90Hz HMD results in a negligible difference and can be ignored by developers, but rather that brief dips below 90fps are often imperceptible, because you don't get that nasty kicked-in-the-head feeling we used to get when a buffer swap was missed.
 
I know hughj will disagree with me, but it's all the same for everybody, for every frame rate, regardless of reprojection.

I can't believe this is still going on. I've been trying to explain this for a fucking long time on B3D, and Sony corroborated my theories, and so did Oculus. I think next time I'll have to quote Jesus. :LOL:

Reprojection/ATW cannot solve the judder problem of mismatched frame rate. Unless they reproject and composite all objects and animations independently (which would be a lot of processing, if possible at all).

This is from last year:
https://developer.oculus.com/blog/asynchronous-timewarp-examined/


Yes, Sony have a major advantage being able to run at 60 reprojecting to 120, which competitors cannot.

The impact of occasional dropped frames is reduced, but it's still a problem if the game can't hold its frame rate.

I don't understand why you think we're in disagreement. The link you posted starts off with "Asynchronous timewarp (ATW) is a technique that generates intermediate frames in situations when the game can't maintain frame rate, helping to reduce judder.", so clearly what I said above is exactly right. I never said it was perfect; obviously you want to have full frame rate, but if and when you do drop frames, ATW (aka reprojection) will fill in the blanks, with the result being that you never drop a frame to the display.

My point was simply that if you're already using ATW to double your frame rate from 60 to 120fps, then drops below 60fps (say into the high 50s) are probably going to be a lot more noticeable than drops into the mid-80s when you're outputting at 90Hz. That's why, while Oculus are also pushing for a native 90fps, it's unsurprising that Sony are being stricter with the 60fps requirement.
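A rough way to see the arithmetic behind this: here's a toy model (hypothetical, not how any actual compositor schedules frames) that counts how many display refreshes get a fresh frame versus a reprojected one, assuming the game delivers frames at a perfectly steady rate.

```python
def simulate_vsyncs(display_hz, render_fps, seconds=1.0):
    """Toy model: at each display refresh, show a fresh frame if the game
    has finished a new render by then, otherwise reproject the previous
    frame. Returns (fresh, reprojected) counts over `seconds`."""
    fresh = reprojected = 0
    for i in range(1, int(display_hz * seconds) + 1):
        vsync_time = i / display_hz
        if (fresh + 1) / render_fps <= vsync_time:  # next render done in time
            fresh += 1
        else:
            reprojected += 1
    return fresh, reprojected
```

For example, `simulate_vsyncs(120, 60)` gives (60, 60), i.e. every other refresh reprojected by design, while `simulate_vsyncs(90, 85)` gives (85, 5): only 5 of 90 refreshes per second need reprojecting.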
 
Sony should have included it in PS4 as standard. They could have provided positional audio in all games and saved a bundle on the cost of VR.

Good point.

I thought both MS and Sony would have it. In fact, I'm pretty sure we've had a conversation in the past about how MS could have used Kinect + binaural audio to create immersive audio. It could have been game-independent, powered from the pad, and been "free" on a per-game basis, at least using MS's original Kinect reserves (+ the custom "shape" processor elements).

Immersive audio does seem to have been overlooked by everyone coming into this generation.
 
Think of it in visual terms. 3D (HMZ) puts two screens in front of your eyes but fixes the viewpoint. You see 3D, but not VR. You need the images to adjust based on where you are looking. Surround sound does the same with audio: it provides '3D' audio but only for a fixed position. Say for example someone is on your right, babbling annoyingly about how when they get home after the war, they're going to finally start that farm with their boy. You turn to the left and see the open meadows, but you'd still hear the man on your right, as if he had changed position. You ought to hear him behind you. The audio has to be updated relative to your head position exactly as the visual side is. Someone was posting earlier about how the audio in VR can be really off-putting. Kudos to Sony if they have it working well, and the inclusion of the breakout box means they must have had this as a requirement.


Wait. I was pretty sure I knew how this all worked but now you've confused me again.

Surround sound is 3D sound, no? I mean, in an FPS when I turn around, the sounds' positions update according to where I'm looking, like you described. That's what I've been hearing for years, please tell me it wasn't all in my head?

How is that different from whatever you now say Sony should have included?
 

When you watch movies with fixed, surround sound speakers, the tilt and turn of your head combined with the physics of sound waves and your ears (ears aren't just ugly flaps of flesh, they serve a biological purpose too!) allow you to know where sound is coming from.

If you use only two sound sources, such as in-ear earphones relaying sound to two floating, non-physical, fixed-position sound receivers, then you lose almost all directional information [other than linear left/right]. Your fleshy ears and the bony-flesh-dome that is your head are now factored out of all calculations.

To allow two in-ear earphones to provide all the information necessary for VR, then on top of everything else needed for VR you also need to adjust for head tilt and for the effect of your ears [and bony flesh dome] filtering, muffling and reflecting certain frequencies of sound.
 
Or to explain it really simply: so far, surround systems have relied on multiple speakers around you (say, 5) to give direction to the sound you hear. But people only have two eardrums, and so some smart folks figured out that two speakers should be able to convey almost perfect surround by simulating how sound arrives at your two ears when it comes from different directions and distances.
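For the curious, the two main cues those smart folks simulate are the interaural time difference (the near ear hears the sound slightly earlier) and the level difference from head shadowing. A minimal sketch, using Woodworth's spherical-head approximation for the ITD and a made-up gain curve standing in for a real HRTF:

```python
import math

HEAD_RADIUS = 0.0875    # metres, assumed average head radius
SPEED_OF_SOUND = 343.0  # m/s

def itd_seconds(azimuth_rad):
    """Woodworth's spherical-head model for the interaural time difference:
    how much earlier sound reaches the near ear. Valid for |azimuth| <= pi/2,
    with 0 straight ahead and positive azimuth to the right."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + math.sin(azimuth_rad))

def pan_binaural(mono, azimuth_rad, sample_rate=48000):
    """Crude two-cue binaural pan: delay the far ear by the ITD and
    attenuate it (a hypothetical stand-in for head shadowing / ILD).
    A real renderer would convolve with measured HRTFs instead."""
    delay = int(round(itd_seconds(abs(azimuth_rad)) * sample_rate))
    shadow = 1.0 - 0.4 * abs(math.sin(azimuth_rad))  # made-up ILD curve
    near = mono
    far = [0.0] * delay + [s * shadow for s in mono[:len(mono) - delay]]
    # positive azimuth = source on the right, so the right ear is 'near'
    left, right = (far, near) if azimuth_rad > 0 else (near, far)
    return list(zip(left, right))
```

At 90 degrees to the right, the model delays the left ear by roughly 0.66ms (about 31 samples at 48kHz), which is in the right ballpark for a human head.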
 
Btw, does PSVR allow independent head tracking, async from the game fps?

If you try Dolphin in VR and the game, let's say... crawls to 0 fps...

You will see a 'world frozen in time'. The tracking itself should still be smooth.
 
We've had HRTF/binaural headphone audio in PC gaming for nearly twenty years now, so to that extent it's really nothing new. The benefit of VR is that your eyes and ears are now tracked on all six degrees of freedom, so if you see something 45deg off-center, you'll hear it 45deg off-center. On a monitor/TV your window into the game world is constantly at odds with what you're hearing, because your 60-90deg game FOV is occupying something like 35-45deg of your vision.
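The tracking part is the simple bit: once the HMD reports head orientation, a world-fixed source just gets rotated into head space every frame before the binaural rendering. A toy sketch (yaw only, in degrees; the function name is made up):

```python
def head_relative_azimuth(source_yaw_world, head_yaw):
    """Rotate a world-fixed sound source into head space: turning your head
    45 degrees toward a source brings it 45 degrees closer to centre."""
    rel = source_yaw_world - head_yaw
    return (rel + 180.0) % 360.0 - 180.0  # wrap into [-180, 180)

print(head_relative_azimuth(45.0, 0.0))   # 45.0: source off to the right
print(head_relative_azimuth(45.0, 45.0))  # 0.0: now dead ahead
```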

Also having your eyes+ears constantly moving, pitching, rolling, rotating results in your brain continually picking up visual and audio cues that recalibrate your brain's binaural perception. If I put on a cowboy hat or shave my head, the real world HRTF of my head is substantially changed and as a result my ability to correctly discern positional sound is going to sound weird for a while, but given time the brain is able to accommodate the change and you no longer 'hear' the hat or lack of hair. Getting that to happen with traditional gaming is almost impossible because your POV only gets abrupt and sporadic changes on a couple axes with mouse/stick and the information you do see+hear is always at odds with each other.
 
The PS4Mini does do positional audio!

http://www.eurogamer.net/articles/d...tation-vr-external-processor-unit-actually-do

Further proof also that PS4 doesn't have full TrueAudio and can't do positional audio itself.
I don't think you can conclude that from this (as in, that PS4 doesn't have a full TrueAudio chip or something). I've always thought that PS4 does have it, but that it's used for other audio stuff (decoding, effects, etc.), so there isn't enough left over to do positional audio on top of that. On PC, the CPU is powerful enough to do the audio work, so the extra TrueAudio hardware can be used for positional audio. The PS4 CPU needs any help it can get.
 
CUH-ZVR1 series

Is the model name of PSVR. Look for it at your local import or certification board to check whether PSVR has arrived in your country or not.
 

Virtual Barber 2016 viable for VR?
 
So 3D sound is a bit like surround sound but processed for two-channel headphones. More or less. Ok, back on track now.

It's more than surround sound. Imagine many sources of sound anywhere in 3D space that remain stationary (or move, but through 3D space) as you move your head around. In theory 3D sound is not "only" 5 or 7 or even 11+ channels, but anywhere within the 3D world. Something similar is also used in Gear VR, and even Google Cardboard, although I don't know how precise it is.
 
Ok ok I understand.

An interesting thing to know is whether watching a Blu-ray movie with headphones will have its 7.1 or 5.1 surround sound somehow mapped to 3D audio through the headphones - as much as it can be, given the limitations you guys very kindly explained.

That'd be neat and infinitely better than scaling the audio down to stereo.
 
That's why there's Dolby Headphone (DTS blah blah, SRS whatever) software mixing that downmixes 7.1 or 5.1 Blu-ray surround for headphones.
 
I have zero interest in buying very expensive Dolby headphones which don't even work properly with VR (they don't)

EDIT: sorry I think I misread you
 
Surround mixes often are actually somewhat BS: the center speaker always carrying the dialogue, even when the person is out of frame, is a big offender to me.
 