Sony VR Headset/Project Morpheus/PlayStation VR

One console for each eye; it wouldn't be all that difficult to split the tasks. It would require all assets and game data on each console, but with Microsoft talking up cloud gaming despite the latency and bandwidth deficiencies there, a gigabit LAN between two local consoles seems like overkill?

It would cost a fortune and be commercial suicide. The PS4: so powerful you need two.
 
Wasn't there a rumor earlier that the PS4 has two identical SoCs stacked?
They should enable the second one. That would be the perfect solution if it existed.
 
Wasn't there a rumor earlier that the PS4 has two identical SoCs stacked?
They should enable the second one. That would be the perfect solution if it existed.
The SoC has been torn down and revealed no such thing, even if all the other concerns about such a setup were set aside.
 
Given how long Morpheus has been in development (2009, if not earlier), Sony would have been well aware of the challenge of driving two independent displays at 60Hz using the Liverpool APU - assuming they intend to deliver rich, detailed worlds - and that the PS4 would certainly be a limiting factor.

They will already have a solution in mind for this. I would not be surprised to find out that their SDK has support for offloading some work to a piece of hardware external to the PS4.
 
It's not so bad. They need to render two very similar viewports at 960x1080, at 60 fps. I wonder how much processing can be reused across those two viewports; there might be a lot of shortcuts compared to one full screen at 1080p. Maybe at least optimizing cache usage by drawing the same polygon on both sides at the same time?
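As a rough back-of-the-envelope sketch of why the fill cost isn't the scary part (the per-eye resolution and 60fps come from the post above; the rest is just illustration): the two eye viewports add up to roughly one 1080p frame, so the extra expense is really the doubled geometry and draw-call work, which is what tricks like submitting each polygon to both viewports in one pass try to amortize.

```python
# Back-of-the-envelope: pixel throughput of two 960x1080 eye views vs one 1080p frame.
# Resolutions and 60fps are from the thread; everything else is illustrative.

def pixels_per_second(width, height, fps, views=1):
    return width * height * fps * views

stereo = pixels_per_second(960, 1080, 60, views=2)   # two per-eye viewports
mono   = pixels_per_second(1920, 1080, 60)           # one full-screen 1080p frame

print(f"stereo fill: {stereo/1e6:.0f} Mpix/s, mono fill: {mono/1e6:.0f} Mpix/s")
# -> identical fill cost; the added expense is running vertex/draw-call work twice
#    unless the engine shares it across both eyes.
```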
 
I hope it's more than 60fps, Mr Fox.

In an ideal scenario for Morpheus it would be a 1920x1080 screen or higher with a high refresh rate, and Sony would take a 720p 75fps game and upscale it.

The difference between the 60fps DK1 and the 75fps DK2 is night and day. You can even set the DK2 to 60fps, and it just looks and feels better at the 75fps mark.


Anyway, since AMD made the APU in the PS4:

LiquidVR by AMD

Significant features of version 1.0 of the LiquidVR SDK include:

Async Shaders for smooth head-tracking enabling Hardware-Accelerated Time Warp, a technology that uses updated information on a user's head position after a frame has been rendered and then warps the image to reflect the new viewpoint just before sending it to a VR headset, effectively minimizing latency between when a user turns their head and what appears on screen.

Affinity Multi-GPU for scalable rendering, a technology that allows multiple GPUs to work together to improve frame rates in VR applications by allowing them to assign work to run on specific GPUs. Each GPU renders the viewpoint from one eye, and then composites the outputs into a single stereo 3D image. With this technology, multi-GPU configurations become ideal for high performance VR rendering, delivering high frame rates for a smoother experience.

Latest data latch for smooth head-tracking, a programming mechanism that helps get head tracking data from the head-mounted display to the GPU as quickly as possible by binding data as close to real-time as possible, practically eliminating any API overhead and removing latency.

Direct-to-display for intuitively attaching VR headsets, to deliver a seamless plug-and-play virtual reality experience from an AMD Radeon™ graphics card to a connected VR headset, while enabling features such as booting directly to the display or using extended display features within Windows.

AMD released the alpha version of LiquidVR SDK 1.0 to registered developers today.

I wonder if this will be used for the PS4.
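For what it's worth, the time-warp idea described in that feature list is simple enough to sketch: render with the head pose you sampled, then just before scan-out shift the finished image by however far the head has moved since. Here's a toy yaw-only version; the field of view, the pure horizontal shift, and the function itself are my assumptions, not LiquidVR's or Sony's actual implementation.

```python
import numpy as np

# Toy yaw-only "time warp": shift an already-rendered frame to account for head
# rotation that happened after rendering started. Purely illustrative.

FOV_DEG = 90.0             # assumed horizontal field of view
WIDTH, HEIGHT = 960, 1080  # per-eye resolution mentioned earlier in the thread

def timewarp_yaw(frame, yaw_at_render_deg, yaw_at_scanout_deg):
    """Shift the image horizontally by the yaw delta, converted to pixels."""
    delta_deg = yaw_at_scanout_deg - yaw_at_render_deg
    shift_px = int(round(delta_deg / FOV_DEG * frame.shape[1]))
    # A real warp re-projects per pixel; rolling columns is the crudest stand-in.
    return np.roll(frame, -shift_px, axis=1)

frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)   # stand-in for a rendered eye image
warped = timewarp_yaw(frame, yaw_at_render_deg=0.0, yaw_at_scanout_deg=2.0)
```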
 
Morpheus will be 120fps if they keep the outboard ASIC; the PS4 only has to render 60fps.
 
Morpheus will be 120fps if they keep the outboard ASIC; the PS4 only has to render 60fps.
I'd forgotten that. Wasn't this part of the reason for the (currently) 40ms lag? It'll need at least two buffers to tween two 16ms frames.
 
Morpheus will be 120fps if they keep the outboard ASIC; the PS4 only has to render 60fps.

So they are using a type of that god-awful TV tech where they advertise 120 or 240Hz panels and create fake frames from the TV signal? That looks horrible on a TV; I hope Morpheus fares better.

Still, 60fps at 1080p is going to be a lot to ask of the PS4. The Uncharted guys said that 1080p at 30fps was already a monumental task for Uncharted 4. So I could still see 720p at 60fps, then upscaled to 1080p at 120fps using the outboard ASIC. Hopefully it looks good on more demanding games. The demos they showed looked pretty good.
 
So they are using a type of that god-awful TV tech where they advertise 120 or 240Hz panels and create fake frames from the TV signal? That looks horrible on a TV; I hope Morpheus fares better.
From all accounts of those who tried Morpheus, you're completely wrong.
 
So they are using a type of that god-awful TV tech where they advertise 120 or 240Hz panels and create fake frames from the TV signal? That looks horrible on a TV; I hope Morpheus fares better.
Morpheus should be better because, unlike TVs that only have two or more images from which to create interpolated frames, the game engine can provide data to the interpolation hardware to remove much of the guesswork.
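In other words, a TV has to guess motion by comparing finished frames, while the engine can hand the interpolator the per-pixel motion vectors it already computed. A hypothetical sketch of what engine-assisted frame generation could look like (the data layout and the half-frame forward extrapolation are my assumptions; nothing here is Sony's actual ASIC interface):

```python
import numpy as np

# Hypothetical engine-assisted frame generation: instead of estimating motion by
# comparing two finished frames (TV-style), use motion vectors the game engine
# already has and push each pixel half a frame forward to synthesize the 120Hz
# in-between image.

def extrapolate_half_frame(frame, motion_px):
    """frame: HxWx3 image; motion_px: HxWx2 per-pixel motion (x, y) in pixels per frame."""
    h, w, _ = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Move each pixel by half of its per-frame motion, clamped to the screen.
    nx = np.clip((xs + 0.5 * motion_px[..., 0]).astype(int), 0, w - 1)
    ny = np.clip((ys + 0.5 * motion_px[..., 1]).astype(int), 0, h - 1)
    out = np.zeros_like(frame)
    out[ny, nx] = frame[ys, xs]   # forward splat; real hardware would also fill holes
    return out

frame  = np.random.randint(0, 255, (1080, 960, 3), dtype=np.uint8)
motion = np.zeros((1080, 960, 2), dtype=np.float32)   # engine-supplied motion vectors
mid    = extrapolate_half_frame(frame, motion)         # synthetic in-between frame
```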
 
Now that foveated rendering has been shown in action in a headset (HoloLens and other developments), it'd be stupid not to go that route. And that solves the performance-deficit issue. The reduced high-fidelity area would make the hardware something like an order of magnitude more powerful. That is, 1.8 TF rendering to 1/10th of the screen area would be equivalent to an 18 TF GPU rendering the whole screen at full quality. The total improvement may be more like 5x than 10x, but it's still ample to get PS4 full-screen quality on the headset. 2x framerate + 2x resolution (dual 1080p) == 4x efficiency needed.
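To make that arithmetic concrete, here is the same estimate written out; the 10% foveal area and the reduced peripheral shading rate are illustrative assumptions, not known Morpheus numbers.

```python
# Foveated-rendering savings estimate, following the reasoning above.
# The 10% foveal area and 1/8-rate periphery are assumptions for illustration.

full_pixels     = 1920 * 1080    # dual 960x1080 eyes ~= one 1080p frame of shading work
foveal_fraction = 0.10           # share of the screen shaded at full quality
periphery_rate  = 0.125          # periphery shaded at roughly 1/8 the full rate

shaded  = full_pixels * (foveal_fraction + (1 - foveal_fraction) * periphery_rate)
speedup = full_pixels / shaded
print(f"effective speedup ~{speedup:.1f}x")
# -> ~4.7x with these assumptions, in line with the post's "more like 5x than 10x",
#    and comfortably above the 4x needed for 2x framerate + 2x resolution.
```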
 
So they are using a type of that god-awful TV tech where they advertise 120 or 240Hz panels and create fake frames from the TV signal? That looks horrible on a TV; I hope Morpheus fares better.

Still, 60fps at 1080p is going to be a lot to ask of the PS4. The Uncharted guys said that 1080p at 30fps was already a monumental task for Uncharted 4. So I could still see 720p at 60fps, then upscaled to 1080p at 120fps using the outboard ASIC. Hopefully it looks good on more demanding games. The demos they showed looked pretty good.
I would think they would just draw two separate images, one per eye. It's just simpler that way. The setup, however, takes a bit longer. Each person has to configure their device to their FOV, including the space between their eyes.

Overall, as a technology it works, but regardless of how great it is, there are going to be a lot of people who won't be able to 'take it'.
Myself being one of them. I haven't used the Oculus DK2, but Oculus gave me a splitting, nausea-induced headache for several hours after only 10 minutes of use.

It's going to be a hard sell getting everyone into VR gaming; you really don't know if it affects you until you've tried it, but watching movies in 3D should be a cakewalk for most people, I think. I'm interested to see how all the companies pull off the marketing strategies, adoption rates, and satisfaction rates for their respective VR devices. It's certainly an out-of-body experience, and I'm ready to try it again, or at least the final product.
 
From all accounts of those who tried Morpheus, you're completely wrong.
About TVs or Morpheus?

Morpheus should be better because, unlike TVs that only have two or more images from which to create interpolated frames, the game engine can provide data to the interpolation hardware to remove much of the guesswork.
That makes sense. My question, however, is why aren't other companies doing this? If it was a good solution, why isn't it on the Valve or Oculus stuff? 2400x1080 at 90fps is a big overhead when they could target 60fps and just have a breakout box fill in the interpolated frames.

Now that foveated rendering has been shown in action in a headset (HoloLens and other developments), it'd be stupid not to go that route. And that solves the performance-deficit issue. The reduced high-fidelity area would make the hardware something like an order of magnitude more powerful. That is, 1.8 TF rendering to 1/10th of the screen area would be equivalent to an 18 TF GPU rendering the whole screen at full quality. The total improvement may be more like 5x than 10x, but it's still ample to get PS4 full-screen quality on the headset. 2x framerate + 2x resolution (dual 1080p) == 4x efficiency needed.
Is the tech there for proper eye tracking? I could imagine that if they use your head position to figure out the focus point, but you simply move your eyes left or right without moving your head, it could cause you some problems.
 
I would think they would just draw two separate images, one per eye. It's just simpler that way. The setup, however, takes a bit longer. Each person has to configure their device to their FOV, including the space between their eyes.

Overall, as a technology it works, but regardless of how great it is, there are going to be a lot of people who won't be able to 'take it'.
Myself being one of them. I haven't used the Oculus DK2, but Oculus gave me a splitting, nausea-induced headache for several hours after only 10 minutes of use.

It's going to be a hard sell getting everyone into VR gaming; you really don't know if it affects you until you've tried it, but watching movies in 3D should be a cakewalk for most people, I think. I'm interested to see how all the companies pull off the marketing strategies, adoption rates, and satisfaction rates for their respective VR devices. It's certainly an out-of-body experience.

Well, the Dev Kit 2 has helped a lot. With Dev Kit 1 I could last about an hour, with some of my cousins getting sick in minutes. Now with the Dev Kit 2 I can play for 3-4 hours easily, and I think my longest play session was close to 5 hours. Some of my cousins who were game to give the DK2 a try lasted much longer also, sometimes hours instead of minutes.

My time with Crescent Bay gave me the impression that it was better yet. I have to admit I only had about 20 minutes with it. However, I didn't feel as closed in while wearing it as I did with the others.
 
Well, the Dev Kit 2 has helped a lot. With Dev Kit 1 I could last about an hour, with some of my cousins getting sick in minutes. Now with the Dev Kit 2 I can play for 3-4 hours easily, and I think my longest play session was close to 5 hours. Some of my cousins who were game to give the DK2 a try lasted much longer also, sometimes hours instead of minutes.

My time with Crescent Bay gave me the impression that it was better yet. I have to admit I only had about 20 minutes with it. However, I didn't feel as closed in while wearing it as I did with the others.
Oh nice, that's certainly encouraging. Do you wear glasses btw? Did they manage to get around the spectacles issue?
 
I'd forgotten that. Wasn't this part of the reason for the (currently) 40ms lag? It'll need at least two buffers to tween two 16ms frames.
I guess the minimum is a little more than one frame, and the interpolation could possibly be almost "inline" while the next frame is inbound. There's the lag of the Move system, the game engine processing, the frame rendering, and the outboard interpolator. It seems to add up to 40ms right now without any predictive algorithm. But they said they are working on reducing it, which is why I'm guessing they could possibly remove the interpolator. Otherwise they get into the territory of defeating the laws of physics... it would be bad, like crossing the streams.

I think I remember Richard Marks (or was it Anton?) saying that the forward prediction is tuned to be as long as the pipeline, so the effective lag is zero. The longer the pipeline, the higher the danger of artifacts.
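A minimal sketch of what "prediction as long as the pipeline" means, assuming a simple constant-angular-velocity predictor (the actual Morpheus predictor isn't public; the 40ms figure is the pipeline lag discussed above):

```python
# Constant-velocity head-pose prediction: predict as far ahead as the pipeline is
# long, so the displayed image matches where the head *will* be, not where it was.
# The predictor itself and the sample rate are assumptions for illustration.

PIPELINE_LATENCY_S = 0.040   # the ~40ms end-to-end lag mentioned in the thread

def predict_yaw(yaw_now, yaw_prev, dt, lookahead=PIPELINE_LATENCY_S):
    angular_velocity = (yaw_now - yaw_prev) / dt
    return yaw_now + angular_velocity * lookahead

# Head turning at ~100 deg/s, sampled at 1000 Hz: the predicted pose is 4 degrees
# ahead, which is how far the head will have moved by the time the frame is shown.
yaw_pred = predict_yaw(yaw_now=10.0, yaw_prev=9.9, dt=0.001)
print(yaw_pred)   # 10.0 + 100 * 0.04 = 14.0 degrees
```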

At events where both Oculus and Morpheus were shown, they were considered very competitive with no clear winner. It's quite impressive since that was a situation of comparing a high-end PC versus a PS4.

Only 2 hours left...
 
Oh nice, that's certainly encouraging. Do you wear glasses btw? Did they manage to get around the spectacles issue?
No, I had LASIK done about 5 years ago. I have slight haloing around lights at night. I noticed it a little with DK1, but now with Crescent Bay I don't notice it at all. I have to sit with my cousin's Dev Kit 2 and try to look for it to see if it's still happening.
 
I have prescription glasses and DK2 was relatively comfortable with them.

I really like that Shu Yoshida wears glasses; you can be sure Morpheus will be designed to accommodate glasses perfectly. :LOL:
 