Standard speaker layouts exist for a reason. Arbitrarily moving speakers around to enable one specific non-standard implementation to work while simultaneously breaking everything else is a terrible idea.
Who's going to change what layout?
The PS5 will probably output Dolby Atmos to a receiver if you're using it for Netflix or watching a UHD Blu-ray. If you have a 5.1.2 configuration, your AV receiver's Atmos renderer already knows where the speakers are and places the sounds accordingly.
But if you're playing a game with a 3D audio engine, there's no need to encode its output into Atmos and spend precious compute power on that, just to decode it again on the receiver. Just send the two top channels as if they were surround back channels of a 7.1 configuration over LPCM, and the receiver will forward the audio to the top speakers. A minimal sketch of that routing follows.
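Here's a toy illustration of the idea, in Python. The channel labels and the exact slot order are assumptions for the example (real-world orderings differ between WAVE, SMPTE, and vendor conventions); the point is only that the two height channels can ride in the surround-back slots of an ordinary 8-channel LPCM frame:

```python
# Hypothetical slot order for one 8-channel (7.1) LPCM frame;
# actual channel order depends on the spec in use.
LPCM_7_1_SLOTS = ["FL", "FR", "FC", "LFE", "SL", "SR", "BL", "BR"]

# 5.1.2 source: a standard 5.1 bed plus two top-front speakers.
SOURCE_5_1_2 = ["FL", "FR", "FC", "LFE", "SL", "SR", "TFL", "TFR"]

# Route each source channel onto a 7.1 slot. The heights (TFL/TFR)
# are parked in the surround-back slots (BL/BR); the receiver just
# forwards those two outputs to the top speakers.
ROUTING = {
    "FL": "FL", "FR": "FR", "FC": "FC", "LFE": "LFE",
    "SL": "SL", "SR": "SR",
    "TFL": "BL",  # top front left  -> surround back left slot
    "TFR": "BR",  # top front right -> surround back right slot
}

def pack_frame(samples: dict) -> list:
    """Interleave one sample per source channel into a 7.1 LPCM frame."""
    frame = [0.0] * len(LPCM_7_1_SLOTS)
    for src, slot in ROUTING.items():
        frame[LPCM_7_1_SLOTS.index(slot)] = samples[src]
    return frame

print(pack_frame({ch: 1.0 for ch in SOURCE_5_1_2}))
```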
Have you seen how an A/V receiver with Atmos labels the height speaker outputs? The surround back terminals double as the top speaker ones, and even triple as bi-amp outputs for the front speakers.
The reason the 2013 consoles had Dolby Digital encoding is probably that a lot of people were still using old surround systems without HDMI, and the only way those would get surround sound was Dolby Digital sent over optical S/PDIF. That's going away this gen: as you've seen, the Series X has no optical output anymore, and I'd bet the PS5 won't have it either.
For this gen they assume most people with surround sound setups will have one of:
- At least an HDMI 1.4 receiver with ARC, through which they can send audio encoded as Dolby Digital Plus for typical 5.1/7.1 setups.
- An HDMI 2.0-equipped receiver that supports Atmos sitting between the console and the TV, to which they can send the height channels over the LPCM surround back channels.
- An HDMI 2.1 TV and a receiver that supports eARC, through which they can send the same audio streams as if the receiver were sitting between the console and the TV.
The beauty of eARC is that now we can use the same AV receiver forever. By supporting LPCM through eARC, we're never going to be limited by a new format or a new HDMI version that provides more video bandwidth. We'll always be able to get 24-bit/192 kHz LPCM per channel over a standard HDMI cable from the TV.
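The arithmetic checks out. A back-of-the-envelope calculation (taking the commonly cited figure of roughly 37 Mbit/s of audio capacity for eARC, versus about 1 Mbit/s usable for compressed audio over plain ARC):

```python
# 8 channels of 24-bit/192 kHz LPCM, the maximum eARC is specced for.
channels, sample_rate_hz, bit_depth = 8, 192_000, 24

payload_mbps = channels * sample_rate_hz * bit_depth / 1e6
print(f"{payload_mbps:.1f} Mbit/s of raw audio payload")  # ~36.9 Mbit/s
# ...which just fits inside eARC's ~37 Mbit/s audio channel.
```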
> And I still don't accept the premise that an object-oriented bitstream sent to a decoder conveys no benefit over just sending some number of LPCM channels.
Absolutely no one wrote that. At least I didn't.
First of all, there's the benefit of compression. If it weren't for Dolby / DTS we wouldn't have multichannel surround on optical discs and streaming video.
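Some rough numbers make the point. These are the commonly cited maximum bitrates for each codec, not measurements:

```python
# Uncompressed 5.1 LPCM at DVD-era quality vs what the codecs ship at.
lpcm_5_1 = 6 * 48_000 * 16 / 1e6  # 5.1 @ 48 kHz / 16-bit -> ~4.6 Mbit/s
dolby_digital_max = 0.640         # Dolby Digital caps out at 640 kbit/s
dts_full_rate = 1.536             # full-rate DTS on DVD, ~1.5 Mbit/s

print(f"5.1 LPCM:      {lpcm_5_1:.2f} Mbit/s")
print(f"Dolby Digital: {dolby_digital_max} Mbit/s "
      f"(~{lpcm_5_1 / dolby_digital_max:.0f}x smaller)")
print(f"DTS full rate: {dts_full_rate} Mbit/s")
# An optical S/PDIF link only carries about 3 Mbit/s -- two channels of
# PCM -- which is exactly why 5.1 over optical needed compression at all.
```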
Secondly, a video using object-oriented audio is supposedly agnostic to the number of speakers, which is a very welcome addition to home theater setups. An 11-channel receiver wouldn't know what to do with an old Dolby Digital 5.1 stream, other than place the front + center + rear + sub channels in their respective places and output something like an average of two adjacent channels on the speakers in between. With Dolby Atmos you can theoretically have dozens of speakers and each one will get a distinct output based on each virtual source's position.
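A toy sketch of why that scales: pan one virtual source between the two speakers that bracket it, whatever the layout. Real renderers (Atmos, VBAP, etc.) are far more sophisticated; this only shows that per-speaker gains fall out of source position rather than a fixed channel count. All names and the azimuth layouts are made up for the example:

```python
import math

def pan_gains(source_az: float, speaker_az: list) -> list:
    """Constant-power pan of a source (azimuth in degrees) across the
    adjacent pair of speakers that brackets it, for any horizontal layout."""
    order = sorted(range(len(speaker_az)), key=lambda i: speaker_az[i])
    n = len(order)
    for k in range(n):
        a, b = order[k], order[(k + 1) % n]  # adjacent pair on the circle
        span = (speaker_az[b] - speaker_az[a]) % 360 or 360
        off = (source_az - speaker_az[a]) % 360
        if off <= span:                      # source sits between a and b
            t = off / span                   # 0 -> all on a, 1 -> all on b
            gains = [0.0] * n
            gains[a] = math.cos(t * math.pi / 2)  # constant-power crossfade
            gains[b] = math.sin(t * math.pi / 2)
            return gains

# The same source at 45 degrees, rendered to two different layouts:
five   = [-110, -30, 0, 30, 110]
eleven = [-150, -110, -90, -60, -30, 0, 30, 60, 90, 110, 150]
print(pan_gains(45.0, five))    # spread between the 30 and 110 speakers
print(pan_gains(45.0, eleven))  # lands neatly between the 30 and 60 speakers
```

The denser layout just localizes the same source more precisely; nothing about the source itself had to change.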
What Atmos doesn't bring a benefit for, in a home theater surround system, is games, because game audio engines have been doing object-based 3D sound internally for decades.
If you can use LPCM through HDMI (and with eARC you can use LPCM in pretty much any situation), there's very little reason to bother encoding your game audio engine's output into Dolby Atmos just to send it through HDMI and decode it again on the receiver. Unless you're using a very tightly integrated solution like a soundbar, but those don't do decent surround sound anyway.