Sony VR Headset/Project Morpheus/PlayStation VR

It's not a simple matter of slapping a GPU inside. Often more upgrades are required to accommodate a more powerful GPU; it depends what the previous configuration was. Some of you make it sound as if it's as simple as plugging a USB stick into a port.
At some point many people need to upgrade motherboards, power supplies, memory and CPUs along with the GPU in order to get the desired performance.
And if something goes wrong, that's an extra cost.

Well, the Rift hits in 2016, so we can't really tell what 970-level performance will look like by then, as we should have process shrinks hitting that year. The one thing we know is that it will get progressively easier to put 970 performance into existing desktops.

The other thing to note is that i5/i7 gaming performance hasn't changed all that much across the generations of those chips. So we are talking about a lot of capable desktops out there.

And yes, MrFox, I did put more than one word together, and yes, a lot are sold to companies, but not all of them. Just like the percentage of PS4 owners willing to spend another $200-$400 is not the full amount I listed.
 
Ubisoft earnings: PC vs PS4 unit sales.

Kinect sales: PC vs 360 hardware sales.

 
Unscientifically, I would think they are similar.

No. (see what I did there?)

Okay, the minimum GTX 970 is 3.4 TFLOPs,

That's only at its base clock, and even then it's 3.5 TFLOPs. At the more realistic boost clock it's 3.9 TFLOPs, and it's not comparable to AMD in that way anyway, given that the 970 is competitive in real-world performance with AMD's 5.6 TFLOP 290X (and generally, trying to compare GPUs on TFLOPs alone is stupid).
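For reference, the theoretical FP32 figure is just shader count x 2 FLOPs per clock (FMA) x clock speed. A quick back-of-the-envelope check using the 970's reference specs (1664 CUDA cores, 1050 MHz base / 1178 MHz boost):

```python
# Theoretical FP32 throughput: shaders * 2 FLOPs per clock (FMA) * clock.
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz * 1e6 / 1e12

print(tflops(1664, 1050))  # ~3.49 TFLOPs at the 970's base clock
print(tflops(1664, 1178))  # ~3.92 TFLOPs at its rated boost clock
```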

but does the old rule of thumb (Carmack?) still apply, that console optimizations, lower-level APIs, and fixed hardware provide some 2x advantage for the same hardware?

No, it does not, and as far as I can tell, with specific reference to GPUs, it never has. CPUs, especially under DX9, which was in the ascendancy when Carmack made that comment, may require 2x the performance for similar results, and 2x the system memory seems to be roughly in the right ballpark, but you certainly do not need 2x the GPU power to match console results. And thanks to Digital Foundry's face-offs, that is now provable as opposed to just anecdotal.

Digital Foundry have shown time and again that the 750 Ti is about equivalent in real-world performance to the PS4 in the latest games (see the recent Witcher 3 face-off for one of many examples), and the 970 is around 2.5x faster than the 750 Ti in real-world testing:

http://www.bit-tech.net/hardware/graphics/2014/09/19/nvidia-geforce-gtx-970-review/6

Even today, games on an "equivalent" 1.84 TF GPU on PC can't exactly push graphics like Driveclub, 1886, or Uncharted 4 with the same frame-rate stability,

Digital Foundry say otherwise, and post comparison pictures and frame-rate/frame-time graphs to prove it. That is, of course, assuming you accept that the best-looking cross-platform games are graphically comparable to those you list above, which I'm going to go out on a limb and predict that you do not.

and this is only the early games in the generation.

The PS4 is over a year and a half old now and Uncharted 4 isn't due out for another nine months or so. It can hardly be considered one of this generation's "early games".

PC is more prone to micro-stuttering (I never understood why

That's probably because it's not "more prone to micro-stuttering". Not fundamentally, anyway. There's certainly more scope for performance issues to arise on the PC, because the game's graphics settings and framerate are left to the user to configure, but dual-GPU issues aside, it's usually possible to get a smoother frame rate on the PC than on consoles, provided you have a fast enough system and manage settings/frame-rate limiting properly. Here's an excellent example of that:

http://www.pcgamer.com/durantes-witcher-3-analysis-the-alchemy-of-smoothness/
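The frame-limiting half of that is conceptually simple. Here's a minimal sketch of a fixed frame-time cap (the `render_frame` stub is a hypothetical stand-in for a game's per-frame work; real limiters like RTSS busy-wait for finer precision than `time.sleep` offers):

```python
import time

TARGET = 1.0 / 60.0  # 60 fps cap: deliver a frame every ~16.7 ms

def render_frame():
    pass  # hypothetical stand-in for the game's actual per-frame work

for _ in range(600):  # e.g. ten seconds' worth of frames
    start = time.perf_counter()
    render_frame()
    # Sleep off the rest of the frame budget so frames arrive on a steady
    # cadence rather than as fast as the hardware can push them.
    remaining = TARGET - (time.perf_counter() - start)
    if remaining > 0:
        time.sleep(remaining)
```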

A 2x bigger GPU required sounds reasonable.

Nope, it's complete rubbish. That's why the Sony headset is planned to run at a lower resolution and framerate than the PC headset does when powered by a GTX 970: because a GTX 970 is a lot more powerful. Who'd have guessed?
 
http://www.roadtovr.com/epics-showd...-on-morpheus-after-ue4-optimizations-for-ps4/
Epic painstakingly optimized Showdown to run at Oculus’ 90 FPS VR target framerate on the powerful Nvidia GTX 980 GPU.

But now, after optimizations for Unreal Engine 4 on PS4, the demo runs on Morpheus at Sony’s 60 FPS VR target framerate. The demo of course takes advantage of Sony’s ‘asynchronous reprojection’ technique to ultimately output at 120 FPS.
The enabling factor is general optimizations to UE4 on PS4, rather than specific optimizations to the Showdown demo, Hoesing tells me. That means an increase across the board for UE4’s performance on the PS4. This will be a boon for VR developers that hope to deploy UE4 projects cross-platform between Morpheus and other VR headsets.
 
Well, the pixels-per-second (PPS) throughput is 1.875x larger for Oculus. If the optimisations include async compute and something like 90% hardware utilisation versus 60% utilisation on the 980 (made-up figures), the quality could perhaps be identical. I don't believe there hasn't been a quality reduction, but it's not completely unrealistic to think of the PS4 matching a 980 in cases where the 980 isn't highly optimised for and the PS4 is. The devil is in the details.
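For reference, here's where the 1.875x comes from, using each headset's render target (pixels x rendered frames per second):

```python
# Pixels-per-second throughput of each headset's render target.
oculus   = 2160 * 1200 * 90  # Rift: full panel at 90 rendered fps
morpheus = 1920 * 1080 * 60  # Morpheus: 1080p at 60 rendered fps

print(oculus / morpheus)  # 1.875 (equivalently: 1.25x pixels * 1.5x framerate)
```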
 
They said they didn't optimize the Showdown demo on Morpheus; it was only PS4 optimizations to the UE4 engine.
1.25x resolution on Oculus
1.5x framerate render target on Oculus
2x framerate reprojection on Morpheus

The target for Oculus games is 90 FPS at 2x1080x1200.
The target for Morpheus games is 120 FPS at 1920x1080 with reprojection.

Anyway, this is looking really good for UE4 games that are cross platform if the demo is in fact unmodified.

Edit: fixed orientation and split the Oculus screen in two to avoid a pointless argument below.
 
They said they didn't optimize the Showdown demo on Morpheus; it was only PS4 optimizations to the UE4 engine.
1.25x resolution on Oculus (2400/1920)
1.5x framerate render target on Oculus
2x framerate reprojection on Morpheus

The target for Oculus games is 90 FPS at 2400x1080.
The target for Morpheus games is 120 FPS at 1920x1080 with reprojection.

It's still only 60 unique frames being rendered, though. Both AMD and Nvidia have similar technology in the PC space to let games meet the 90 Hz output requirement even if the GPU can't render 90 unique frames. It's obviously not going to be as good as the real thing.
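For illustration, the crudest possible form of this kind of reprojection is just shifting the last rendered frame by however far the head has yawed since it was drawn. A minimal sketch under that assumption (real implementations warp with the full head pose, run asynchronously on the GPU, and fill in the revealed edges):

```python
import numpy as np

def reproject(last_frame, yaw_delta_deg, fov_deg=100.0):
    # Shift the previous frame sideways in proportion to how far the head
    # has turned since that frame was rendered. np.roll wraps at the edges,
    # where a real implementation would in-fill or render a guard band.
    w = last_frame.shape[1]
    shift = int(round(w * yaw_delta_deg / fov_deg))
    return np.roll(last_frame, -shift, axis=1)

# On a 120 Hz headset fed by a 60 fps renderer, every other vsync would
# show a reprojected copy of the last frame instead of a fresh one.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
warped = reproject(frame, yaw_delta_deg=2.0)
```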
 
Well, the pixels-per-second (PPS) throughput is 1.875x larger for Oculus. If the optimisations include async compute and something like 90% hardware utilisation versus 60% utilisation on the 980 (made-up figures), the quality could perhaps be identical. I don't believe there hasn't been a quality reduction, but it's not completely unrealistic to think of the PS4 matching a 980 in cases where the 980 isn't highly optimised for and the PS4 is. The devil is in the details.

It sounds like they actually went to quite a bit of effort to optimise the demo specifically for the 980, so I'd imagine it's being put to pretty good use:

http://www.roadtovr.com/epic-games-...wn-90-fps-oculus-rift-crescent-bay-prototype/
 
I dunno. Two guys, five weeks isn't necessarily a great optimisation. ;) They probably didn't get to rewrite parts of the Unreal engine. If, as part of supporting the consoles, Epic have gone to great lengths to re-engineer the inner workings to be an ideal fit (a nice change from the lacklustre results of UE3 on PS3!), there could be considerable performance advantages. E.g. look at slides 17/18: they doubled up draw calls, with a possibility of rewriting the shaders to be stereoscopic. Actually, looking at that PowerPoint some more, I really don't think it was a highly efficient implementation on PC. 2000 draw calls early on were killing them (not an issue on PS4 ;)) and they had to hack it down to 1000.
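To see why stereo doubles the draw-call count (and why halving it mattered), here's a toy sketch with hypothetical `draw`/`scene` stand-ins: naive stereo submits the whole scene once per eye, while a stereo-aware shader path submits each object once.

```python
scene = ["terrain", "robot", "particles"]  # hypothetical object list

def draw(obj, view):
    print(f"draw {obj} for {view}")  # stand-in for a real draw call

# Naive stereo: every object submitted twice, once per eye -> 2N calls.
for eye in ("left", "right"):
    for obj in scene:
        draw(obj, eye)

# Stereo-aware path: one submission per object, with a (hypothetical)
# shader emitting both eye views itself -> N calls, halving submit cost.
for obj in scene:
    draw(obj, "both eyes via stereo shader")
```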

Some interesting points there about how VR headsets break a lot of the graphics cheats, though, which is worrying. Particles and tessellation look fake. Particles might be solvable with a more sophisticated shader that warps the depth and visual parallax to stop them looking flat.

It's a brave new world...
 
If they do 60 to 90, they get half a frame of additional maximum frame age. Since 60 to 120 already has some artifacts on Morpheus, I would guess that 45 would be pushing it too far.

60 to 120: the additional age will alternate between 0 ms and 16.7 ms. Scene judder is 1:1 on a 60 Hz cycle.
8.33 ms average plus the 8.33 ms scan-out delay to the OLED = 16.7 ms

60 to 90: the additional age will alternate between 0 ms, 22.2 ms and 11.1 ms. Scene judder is 2:1 on a 30 Hz cycle.
11.1 ms average plus the 11.1 ms scan-out delay to the OLED = 22.2 ms

Maybe it's fine, or I'm calculating it wrong, but it's not the same timings as Morpheus.
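As a sanity check, here's the cadence under a simpler model than the one above: a new source frame lands exactly every source period, age is measured at each scan-out against the newest available frame, and no scan-out delay is added (so the absolute values come out lower than mine; it's the 1:1 vs 2:1 judder pattern that matters):

```python
from fractions import Fraction

def age_pattern(src_hz, disp_hz, refreshes=6):
    # Extra age (ms) of the newest available source frame at each display
    # scan-out, assuming a new source frame lands exactly every 1/src s.
    src_t  = Fraction(1000, src_hz)   # source frame period, ms
    disp_t = Fraction(1000, disp_hz)  # display refresh period, ms
    ages = []
    for i in range(refreshes):
        t = i * disp_t
        newest = (t // src_t) * src_t  # timestamp of newest source frame
        ages.append(round(float(t - newest), 1))
    return ages

print(age_pattern(60, 120))  # [0.0, 8.3, 0.0, 8.3, 0.0, 8.3] -> 1:1 judder
print(age_pattern(60, 90))   # [0.0, 11.1, 5.6, 0.0, 11.1, 5.6] -> 2:1 judder
```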

Also, I don't know how sensitive we will be to scene judder; it should only affect close objects in VR. I know I can't stand 3:2 pulldown on TVs that can't do 24p correctly. This is going to be interesting.
 
They said they didn't optimize the Showdown demo on Morpheus; it was only PS4 optimizations to the UE4 engine.
1.25x resolution on Oculus (2400/1920)
1.5x framerate render target on Oculus
2x framerate reprojection on Morpheus

The target for Oculus games is 90 FPS at 2400x1080.
The target for Morpheus games is 120 FPS at 1920x1080 with reprojection.

Anyway, this is looking really good for UE4 games that are cross platform if the demo is in fact unmodified.

Oculus is 2160x1200, not 2400x1080. When you go with a screen for each eye, each screen needs to be taller than it is wide. The HTC Vive is exactly the same.
 
Oculus is 2160x1200, not 2400x1080. When you go with a screen for each eye, each screen needs to be taller than it is wide. The HTC Vive is exactly the same.
Yeah, it was just to show that the pixels per second were the same. It's two screens either way, no matter how the scanlines are arranged.
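The arithmetic being leaned on there:

```python
print(2160 * 1200)  # 2,592,000 pixels: the Rift panel as specced
print(2400 * 1080)  # 2,592,000 pixels: same total, just rearranged
```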
 
Yeah, it was just to show that the pixels per second were the same. It's two screens either way, no matter how the scanlines are arranged.

To me, putting it that way makes it come across as if the Oculus uses more pixels to create a wider FOV, instead of the actual resolution (fidelity) advantage that it has. I understand now that you were doing it only to make the math easier to follow, but it's still misleading for a general comparison.
 
To me, putting it that way makes it come across as if the Oculus uses more pixels to create a wider FOV, instead of the actual resolution (fidelity) advantage that it has. I understand now that you were doing it only to make the math easier to follow, but it's still misleading for a general comparison.
I don't understand what your problem is.
 
I don't understand what your problem is.

My problem is that you are presenting the differences between the two devices in a way that undermines Oculus' advantages and promotes Morpheus'.

You stated that Oculus' target was 2400x1080 at 90 Hz. It is not. Yes, the pixel throughput is the same, but since Morpheus is a 1080p device, the end result is that the Oculus has higher fidelity and thus produces a superior image. Refresh rate is another matter, as the effectiveness of Sony's reprojection will have to be tested thoroughly. Also, input response will still be 33% lower than Oculus despite the possible advantage of more fluid motion. My guess would be that even if the reprojection works extremely well, the end result will at best be better in one way but worse in another, possibly even making the input-response deficit stand out as more apparent.
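The 33% follows from the rendered (and therefore input-sampled) frame rates, assuming reprojection doesn't inject new input samples:

```python
print(1000 / 90)  # ~11.1 ms between freshly rendered frames on the Rift
print(1000 / 60)  # ~16.7 ms on Morpheus; 1 - 11.1/16.7 is a ~33% deficit
```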
 
Yes, it's two separate panels that are 1080 horizontal by 1200 vertical. I added them vertically to show the area difference in a linear way. It was about GPU workload and pixel throughput.

The latency doesn't come just from the rendering, as I tried to explain above. There are many stages in the chain.
 