Deleted member 11852
Guest
The display planes are combined for the video out; there's only one signal down the HDMI cable, which is the final composite. But I guess if they used the HDMI Ethernet channel (HEC), they could send the UI layer separately.
Ah, but Sony don't need to work within the limits of HDMI. Aside from the connector pin layout, HDMI is also a signalling and protocol stack, but it's a standard that exists only for interoperability between different devices, which is a luxury they don't need for PS4 to Morpheus (or its breakout box).
HDMI is also something of a compromise, supporting several features with different bandwidth needs: the video in 2D or 3D, an Ethernet channel, the audio, the audio return channel, control lines and miscellaneous data. HDMI 1.4 has a raw bandwidth of 10.2 Gbps; I wonder if this is enough to send two distinct 1080p frames at 60Hz. Probably, if you're not sending complete full frames (e.g. if the cockpit layer only occupies the bottom 25% of the screen) and you're not simultaneously trying to send 5.1 or 7.1 lossless audio. Although if you throw out HDMI you basically rule out PC compatibility.
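A back-of-envelope check (my numbers, not from the post above): using standard CEA-861 1080p60 timing and HDMI's 10b/8b TMDS encoding, two full 1080p60 streams would actually fit within HDMI 1.4's 10.2 Gbps budget even before cropping the second layer.

```python
# Rough bandwidth check: can HDMI 1.4's ~10.2 Gbps TMDS budget carry
# two distinct 1080p60 streams? Assumes CEA-861 timing and 24-bit RGB.

TOTAL_W, TOTAL_H = 2200, 1125   # 1080p60 timing including blanking intervals
REFRESH = 60                    # Hz
BPP = 24                        # bits per pixel, 8-bit RGB
TMDS_OVERHEAD = 10 / 8          # TMDS 8b/10b encoding overhead per lane

pixel_clock = TOTAL_W * TOTAL_H * REFRESH            # Hz (148.5 MHz)
per_stream_bps = pixel_clock * BPP * TMDS_OVERHEAD   # on-wire bits per second
two_streams_gbps = 2 * per_stream_bps / 1e9

print(f"pixel clock per stream: {pixel_clock / 1e6:.1f} MHz")
print(f"two full 1080p60 streams: {two_streams_gbps:.2f} Gbps (limit ~10.2)")
```

So roughly 8.9 Gbps for two uncropped frames, leaving little headroom for lossless multichannel audio, which matches the hunch above that you'd want to avoid sending two complete full frames.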
But I'm more curious how an approach like this would complicate the rendering pipeline. For example, if you're producing two distinct planes but there's light, shadows, reflections etc. from one (the world) affecting the other (the cockpit overlay), how is that handled? Will this complicate AA?
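To illustrate the concern, here's a minimal sketch (hypothetical, not Sony's actual pipeline) of late compositing with the standard "over" operator. The point is that the composite step itself is a pure blend: any lighting, shadow or reflection the world casts onto the cockpit has to be baked into one of the layers before this point, or it simply won't appear.

```python
# Per-pixel "over" compositing of a straight-alpha UI/cockpit pixel onto an
# opaque world pixel. This blend cannot create cross-layer lighting; it only
# mixes whatever each layer was already rendered with.

def over(ui_rgba, world_rgb):
    """Blend a straight-alpha UI pixel over an opaque world pixel."""
    r, g, b, a = ui_rgba
    wr, wg, wb = world_rgb
    return (r * a + wr * (1 - a),
            g * a + wg * (1 - a),
            b * a + wb * (1 - a))

# A fully transparent UI pixel leaves the world pixel untouched:
print(over((1.0, 0.0, 0.0, 0.0), (0.2, 0.4, 0.6)))  # (0.2, 0.4, 0.6)

# A fully opaque UI pixel replaces it entirely:
print(over((1.0, 0.0, 0.0, 1.0), (0.2, 0.4, 0.6)))  # (1.0, 0.0, 0.0)
```

It also shows why AA gets awkward: edge coverage ends up encoded in the UI layer's alpha, so each layer has to be anti-aliased independently before the blend.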