Sony VR Headset/Project Morpheus/PlayStation VR

Good point. They could also make two versions of the external module: a simple one for the PS4 to bring the price down, and a high-end one with buttons for setup that also works standalone or with a PC. Still, I think the PS4 needs all the help it can get for processing.
 
I'm not doubting the merits of the technology itself. I love 3D TV, so I'm really a GO VR!!! kind of guy. But realistically, to think that this will be anything more than a niche of a niche is extremely optimistic.

I dunno. I feel like we're moving out of the Palm Pilot era and into the Windows Mobile time frame. We aren't quite at mass market yet, but there are going to be millions who buy and love these VR products. I bet that by the time the next-gen consoles roll around we'll see the Rift at 8K resolution with full-body tracking and those Omni platforms being popular, and that shift is when it becomes iPhone big.
 
So, they are doing ultra-fast 60Hz-to-120Hz interpolation so that the 120Hz LCD can show more images? Heh, did not expect that. I wonder, can developers make a choice and straight up render the game at 120fps? Is the PS4 HDMI 1.4b compatible?

Yoshida has said that the LCD display does not use a global refresh mode (all pixels turning on and off at the same time; Oculus uses that on their AMOLED) but some fast horizontal-scan mode.

Everyone who has tried both devices claims that Morpheus is a little blurry. Is that a result of:
- motion interpolation?
- Sony achieving "low persistence" not by switching pixels off early, but by simply pushing a 120Hz image to the screen and calling THAT a low persistence mode as a PR move? 120Hz is 8.3ms per frame, while the DK2 uses 2ms and 3ms low-persistence modes at 72-75Hz.
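
To put numbers on that persistence comparison, here's a quick sketch (assuming a full-persistence display holds each frame lit for the entire refresh interval, while a low-persistence display flashes it briefly and blanks):

```python
# Persistence = how long each frame stays lit on the display.
# Full persistence: pixels lit for the whole refresh interval.
# Low persistence: pixels lit briefly, then dark until the next frame.

def full_persistence_ms(refresh_hz):
    """Frame hold time if pixels stay lit for the whole refresh interval."""
    return 1000.0 / refresh_hz

# 120 Hz LCD with pixels lit the whole time:
print(round(full_persistence_ms(120), 1))   # 8.3 ms per frame

# DK2-style low persistence: panel refreshes at 75 Hz, but each frame
# is only lit for ~2 ms, then the screen is dark until the next one.
refresh_ms = full_persistence_ms(75)        # ~13.3 ms between frames
lit_ms = 2.0                                # pixel-on time (low persistence)
dark_ms = refresh_ms - lit_ms               # ~11.3 ms of black per frame
print(round(dark_ms, 1))                    # 11.3
```

So "120Hz full persistence" still leaves the image on screen roughly four times longer per frame than DK2's low-persistence flash.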
 

I guess (and it's just a guess, I've never tried either of them) it's blurry because of the interpolation between frames.
On Oculus, they switch the pixels off between frames, so less inaccurate data is sent to the user's eyes.
I think I've read that LCDs can't really do 120Hz; they simply adjust the time it takes for their pixels to switch from one state to another, so it's a virtual 120Hz refresh rate.
I'd like to know if that's the case as well.
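
A toy illustration of why naive in-between frames look blurry: blending two frames of a moving object smears it across two positions at half brightness. (A real TV interpolator uses motion-vector compensation rather than a plain blend, but when the vectors are wrong, the failure mode looks similar.)

```python
# Toy example: a bright pixel moving across a 1-D "screen".
# A naive interpolated frame is the average of two real frames,
# so the moving object appears in two places at half brightness.

frame_a = [0, 0, 255, 0, 0]   # object at position 2
frame_b = [0, 0, 0, 255, 0]   # object at position 3 one frame later

interpolated = [(a + b) // 2 for a, b in zip(frame_a, frame_b)]
print(interpolated)           # [0, 0, 127, 127, 0] -- smeared over two pixels
```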
 
It would be interesting to see how this 120hz feed would look in true low-persistence mode on some good OLED/AMOLED.
 
If the motion interpolation is that effective, why not include the hardware in the console? A console that can run 60fps games with 30fps effort would be pretty impressive.
 

Because it's not???? :devilish:
I think it's the same technology they use in their TVs. That's why we need a "game mode" on TVs to play games: there is significant delay between actual frames and "interpolated" frames, which causes lag.
If their image processor is very fast, it can deliver frames with less latency, but my guess is there will still be latency between frames.
By how much, I have no idea!
 
I don't know if this is correct, but using interpolation on 30fps would add at least 33.3ms of lag (basically a one-frame delay) to insert the interpolated image. If the base is 60fps, then it's only 16.67ms of lag. Is the added 33ms of lag still good enough, especially for VR? Or would it need at least 60fps, so that when interpolated to 120fps the added lag wouldn't matter much against the added motion smoothness?
Outside of VR, I hope devs pursue interpolation more, especially for non-twitch games that can tolerate lag (adventure, RPG, etc.). Of course I can turn on motion interpolation on my TV, but then I'd have to toggle it on and off depending on the game.
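
The lag arithmetic above can be sketched like this (assuming the interpolator has to wait for the *next* real frame before it can compute the in-between image):

```python
def interpolation_lag_ms(source_fps):
    """Added latency if the interpolator must buffer one full source
    frame before it can generate the in-between image."""
    return 1000.0 / source_fps

print(round(interpolation_lag_ms(30), 1))   # 33.3 ms added at a 30 fps base
print(round(interpolation_lag_ms(60), 2))   # 16.67 ms added at a 60 fps base
```

Which is why a 60fps base is the more plausible VR target: the one-frame buffering penalty is half as large.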
 
You could use extrapolation, which means you only use previous frames.
Of course that means less data for the algorithm and more ways to mess up... but it should be good enough for predicting camera movements.
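
A minimal sketch of that idea: a linear predictor for camera yaw that uses only past frames (assuming constant angular velocity between samples), so it adds no buffering latency, at the cost of overshooting when the motion changes.

```python
def extrapolate(prev, curr):
    """Predict the next camera yaw assuming constant angular velocity:
    next = curr + (curr - prev). Uses only past frames, so no buffering
    lag -- but it overshoots whenever the motion changes."""
    return curr + (curr - prev)

# Steady 2-degree-per-frame pan: the prediction is exact.
print(extrapolate(10.0, 12.0))   # 14.0

# If the camera stops abruptly at 12 degrees, the predictor still
# outputs 14.0 -- overshooting by one frame's worth of motion,
# which is the "more ways to mess up" part.
```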
 
Sony TVs and projectors have less display lag with interpolation ON than other brands do in their gaming mode. Their current TV implementation adds 45ms of lag, so I assume it needs two additional frames to calculate motion vectors (their TVs are only 16ms lag otherwise). With Morpheus, if the data is already there (scene data, camera data), they can probably do it in a fraction of a frame, alongside all the other processing. Heck, this data is known before the frame even starts rendering; there's an obvious shortcut to be used.
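
The "two additional frames" guess above roughly checks out with quick arithmetic (assuming 60Hz input, so about 16.7ms per frame):

```python
frame_ms = 1000.0 / 60           # one frame at 60 Hz input
base_lag_ms = 16.0               # the TV's lag with interpolation off
interp_lag_ms = 45.0             # lag with interpolation on

extra_frames = (interp_lag_ms - base_lag_ms) / frame_ms
print(round(extra_frames, 2))    # 1.74 -- roughly two buffered frames
```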

I've been playing with frame interpolation for years. Sony's interpolation works extremely well for games as long as the frame rate is high and stable, and the game uses a standard 3D camera with sufficiently detailed textures (details help calculate motion vectors). Camera pans are perceived at twice the resolution; it makes the image surprisingly sharp. However, whenever the game starts dropping frames, or the frame rate gets erratic, it causes visual hiccups that are worse than if interpolation were off. So far the only games that work well are platformers and racers, because sadly an unstable frame rate seems to be the norm nowadays. Now, considering VR will need a stable frame rate anyway, the problem solves itself naturally. At 120Hz there's no need for any motion blur either, so it frees up some GPU.

I have played at 24Hz vsync on my PC, interpolated to 48Hz by my projector, and it works as expected, but the artifacts become more noticeable. There's strobing in sections of the image that didn't have enough detail to calculate correctly; it seems to "give up" and leave those sections at 24Hz instead of adding worse visual artifacts. This isn't noticeable going from 60Hz to 120Hz, because there's no strobing in sections left at 60Hz. Camera movements are very clean and calculated perfectly, and it's exactly what the doctor ordered for VR. :cool:
 

The Bravia I have is only 7ms lag in game mode.
 
With this technology, couldn't they already implement some sort of FreeSync/G-Sync feature?

That could be neat, because I really hope this tech (FreeSync/G-Sync) will be fully implemented in all PS5 games. Like the way Remote Play works for all PS4 games, I fully expect ALL PS5 games to use a variable-refresh feature natively.

That's the inevitable future of gaming.

IMO
 
Only if display manufacturers incorporate it, and it seems to me that they chase movie tech, not gaming tech. So I expect some niche displays to support *sync, but I wouldn't bet on all displays incorporating that tech, as it has no benefit for TV viewing. :(
 
Forza and Gran Turismo both had a feature where you could add additional screens to increase the FOV by using multiple PS3s/Xbox 360s.

If they give the option of using two PS4s (one per eye), then people could game without the decrease in performance/resolution.
 
<a href="http://www.youtube.com/watch?v=ZRso_IuAYZs">YouTube Link</a>


GT6 supports multiple screens with multiple PS3s.
 