Makes sense
I'm not doubting the merits of the technology itself. I mean, I love 3D TV, so I'm really a GO VR!!! kind of guy. But realistically, to think this will be anything more than a niche of a niche is extremely optimistic.
So, they are doing ultra-fast 60Hz→120Hz interpolation so that the 120Hz LCD can show more images? Heh, did not expect that. I wonder, can developers choose to just render the game at a native 120fps instead? Is the PS4 HDMI 1.4b compatible?
Yoshida has said that the LCD does not use a global refresh mode [all pixels turning on and off at the same time; Oculus uses that on their AMOLED] but some fast horizontal scan mode instead.
Everyone who has tried both devices claims that Morpheus is a little blurry. Is that a result of:
- motion interpolation?
- Sony achieving "low persistence" not by switching pixels off early, but by simply pushing a 120Hz image to the screen and, as a PR move, calling THAT a low persistence mode? A 120Hz frame is 8.3ms, while the DK2 uses 2ms and 3ms low-persistence modes at 72-75Hz.
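For context, those persistence figures fall out of a simple formula. A quick sketch (the DK2 strobe times are its published values, not derived from this):

```python
# Persistence = how long each frame's pixels stay lit on the panel.
# At full persistence the pixels are lit for the whole refresh interval.
def full_persistence_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

print(f"{full_persistence_ms(120):.1f}")  # 8.3 ms -- a 120Hz frame lit start to finish
print(f"{full_persistence_ms(75):.1f}")   # 13.3 ms -- DK2 refresh interval at 75Hz
# The DK2's low-persistence mode strobes pixels for only ~2-3 ms of that
# interval; the short lit time, not the refresh rate, is what cuts the blur.
```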
If the motion interpolation is that effective, why not include the hardware on the console? A console that can run 60fps games with 30 fps effort would be pretty impressive.
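Back-of-the-envelope, the appeal is obvious. A toy calculation, assuming the interpolator synthesizes every other displayed frame:

```python
# GPU frame-time budget when an interpolator synthesizes every other frame:
# the display runs at display_hz, but the GPU renders only half of them.
def render_budget_ms(display_hz: float) -> float:
    rendered_hz = display_hz / 2.0
    return 1000.0 / rendered_hz

print(f"{render_budget_ms(60):.1f}")   # 33.3 ms -- "60fps games with 30fps effort"
print(f"{render_budget_ms(120):.1f}")  # 16.7 ms -- Morpheus' 120Hz from a 60fps render
```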
Sony TVs and projectors have less display lag with interpolation ON than other brands do in their gaming modes. Their current TV implementation adds 45ms of lag, so I assume it needs 2 additional frames to calculate motion vectors (their TVs are only 16ms of lag otherwise). With Morpheus, if the data is already there (scene data, camera data), they can probably do it in a fraction of a frame, alongside all the other processing. Heck, this data is known before the frame even starts rendering; there's an obvious shortcut to be used.
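Plugging those numbers into a simple buffering model (the model is my assumption, not anything Sony has documented):

```python
# If the interpolator must buffer upcoming frames to search for motion
# vectors, each buffered frame adds one input-frame interval of lag.
def interpolation_lag_ms(base_lag_ms: float, buffered_frames: int, input_hz: float) -> float:
    return base_lag_ms + buffered_frames * (1000.0 / input_hz)

print(f"{interpolation_lag_ms(16, 2, 60):.1f}")  # 49.3 ms -- ~2 buffered 60Hz frames,
                                                 # in the ballpark of the reported lag
print(f"{interpolation_lag_ms(16, 0, 60):.1f}")  # 16.0 ms -- the Morpheus case: motion
                                                 # is known up front, nothing to buffer
```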
I've been playing with frame interpolation for years. Sony's interpolation works extremely well for games as long as the frame rate is high and stable, and the game uses a standard 3D camera with sufficiently detailed textures (detail helps calculate the motion vectors). Camera pans are perceived at what feels like twice the resolution; it makes the image surprisingly sharp. However, whenever the game starts dropping frames, or the frame rate gets erratic, it causes visual hiccups that are worse than having interpolation off. So far the only games that work well are platformers and racers, because sadly an unstable frame rate seems to be the norm these days. Now, considering VR will need a stable frame rate anyway, that problem solves itself naturally. And at 120Hz there's no need for any motion blur either, which frees up some GPU.
I have played at 24Hz vsync on my PC, interpolated to 48Hz by my projector, and it works as expected, though artifacts become more noticeable. There's strobing in sections of the image that didn't have enough detail to be calculated correctly; the interpolator seems to "give up" and leave those sections at 24Hz rather than add worse visual artifacts. This isn't noticeable at 60Hz→120Hz because sections left at 60Hz don't strobe. Camera movements are very clean and calculated perfectly, and that's exactly what the doctor ordered for VR.
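To make that "give up" behavior concrete, here's a toy block-matching interpolator in the spirit of what's described above. The block size, search radius, and thresholds are made-up illustrative values, not anything from Sony's implementation:

```python
import numpy as np

BLOCK = 16        # block size in pixels
SEARCH = 8        # motion search radius in pixels
MAX_ERR = 12.0    # worst mean abs error still trusted as a match
MIN_DETAIL = 4.0  # blocks flatter than this can't be matched reliably

def interpolate_midframe(prev: np.ndarray, nxt: np.ndarray) -> np.ndarray:
    """Synthesize the frame halfway between two grayscale float frames."""
    h, w = prev.shape
    mid = prev.copy()  # fallback: untrusted blocks stay at the source rate
    for by in range(0, h - BLOCK + 1, BLOCK):
        for bx in range(0, w - BLOCK + 1, BLOCK):
            block = prev[by:by+BLOCK, bx:bx+BLOCK]
            if block.std() < MIN_DETAIL:
                continue  # not enough texture to compute a motion vector
            best, best_err = (0, 0), float("inf")
            for dy in range(-SEARCH, SEARCH + 1):      # find where the block
                for dx in range(-SEARCH, SEARCH + 1):  # moved to in `nxt`
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - BLOCK and 0 <= x <= w - BLOCK:
                        err = np.abs(nxt[y:y+BLOCK, x:x+BLOCK] - block).mean()
                        if err < best_err:
                            best, best_err = (dy, dx), err
            if best_err <= MAX_ERR:  # confident match: place the block halfway
                y = min(max(by + best[0] // 2, 0), h - BLOCK)
                x = min(max(bx + best[1] // 2, 0), w - BLOCK)
                mid[y:y+BLOCK, x:x+BLOCK] = block
    return mid
```

Blocks that fail the detail or error checks are simply held from the previous frame, which is exactly that "leave it at 24Hz" strobing: those regions update at the source rate while everything around them moves at the doubled rate.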
The Bravia I have has only 7ms of lag in game mode.
42W653
Only if display manufacturers incorporate it, and it seems to me that they chase movie tech, not gaming tech. So I expect some niche displays to support _sync, but I wouldn't bet on all displays incorporating that tech, as it has no benefit for TV viewing. That's the inevitable future of gaming.
Forza and Gran Turismo both had a feature where you could add additional screens to increase the FOV by using multiple PS3s/Xbox 360s:
http://www.youtube.com/watch?v=ZRso_IuAYZs
If they gave the option of using two PS4s (one for each eye), people could game without the drop in performance/resolution.