hesido
Regular
A developer on NeoGAF paints a bleak picture of what to expect from VR, including Morpheus:
http://www.neogaf.com/forum/showthread.php?t=1060677
However, I don't see how experience developing for a battery-powered device in Unity can be extrapolated to VR on a console. I also don't follow the part where he says he has only 1.5 ms left for AI pathfinding, logic updates, and audio mixing. It sounds as if he isn't running those in parallel, assuming that 1.5 ms isn't the total execution time left across a fully loaded set of parallel threads during a VR frame, with only small gaps of unused CPU cycles.
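To illustrate what I mean by running those in parallel, here is a minimal sketch (the subsystem functions and the ~11.1 ms / 90 Hz budget are just placeholders, not what the developer described) of overlapping AI, logic, and audio work with rendering inside one VR frame instead of leaving them a tiny serial slice at the end:

#include <chrono>
#include <future>
#include <thread>

// Hypothetical subsystem updates -- names are illustrative only.
void updateAIPathfinding() { /* ... */ }
void updateGameLogic()     { /* ... */ }
void mixAudio()            { /* ... */ }
void renderStereoFrame()   { /* ... */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameBudget = std::chrono::microseconds(11111); // ~90 Hz example

    for (int frame = 0; frame < 3; ++frame) {
        const auto frameStart = clock::now();

        // Kick CPU-side work onto worker threads so it overlaps with
        // rendering rather than running serially after it.
        auto ai    = std::async(std::launch::async, updateAIPathfinding);
        auto logic = std::async(std::launch::async, updateGameLogic);
        auto audio = std::async(std::launch::async, mixAudio);

        renderStereoFrame();   // main thread keeps the GPU fed meanwhile

        ai.get();
        logic.get();
        audio.get();

        // Toy frame pacing: sleep off whatever budget remains.
        std::this_thread::sleep_until(frameStart + frameBudget);
    }
}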