PlayStation 5 [PS5] [Release: November 12, 2020]

Here's my take: using it for gaming will put it to work. It's not like this is going to be used for opening Word docs or regular PC tasks. Therefore it will very likely get hot, and if you don't want to risk any throttling... then get the heatsink. Seems like a no-brainer to me.

I thought someone mentioned you can't fit one, and that it isn't recommended? I'm not into the NVMe market, but maybe you can buy a third-party heatsink or jury-rig one that correctly fits into the NVMe bay.
I'd like to see tests of gaming-related NVMe throttling on it as well. Compare NVMe controller heat from PS5 games vs PC games, since someone here mentioned that the NAND actually likes heat; only the controller prefers to run cooler.
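Not a PS5 test, but for anyone wanting to collect that kind of data on a PC first: a minimal temperature-logging sketch, assuming a Linux box with smartmontools installed (the device path, poll interval, and the throttle-point figure in the comment are placeholders/ballpark, adjust for your drive):

```python
import re
import subprocess
import time

DEVICE = "/dev/nvme0"   # placeholder: point at your actual drive
INTERVAL_S = 5          # poll every 5 seconds

def read_temp_celsius(device: str) -> int | None:
    """Parse the composite temperature from `smartctl -A` output."""
    out = subprocess.run(
        ["smartctl", "-A", device],
        capture_output=True, text=True, check=True,
    ).stdout
    # smartctl prints a line like: "Temperature:  44 Celsius"
    m = re.search(r"Temperature:\s+(\d+)\s+Celsius", out)
    return int(m.group(1)) if m else None

if __name__ == "__main__":
    # Run this while gaming, then eyeball the log for sustained spikes
    # near the controller's throttle point (often somewhere ~70-85 C,
    # varies by drive; check your model's spec sheet).
    while True:
        temp = read_temp_celsius(DEVICE)
        print(f"{time.strftime('%H:%M:%S')}  {temp} C", flush=True)
        time.sleep(INTERVAL_S)
```

Run it in one terminal, game in another, and you get a timestamped trace you can compare across titles.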
 
Here in the UK, Amazon UK have notified me that my second DualSense controller and PS5 Media remote will be delivered on 12 November, with my Xbox Series X. The PS5 itself launches a week later. That's just going to be weird! :yep2:

What I do need to find before 12 November is a decent HDMI switch (not a splitter; a splitter sends one source to many screens, a switch shares one port between sources). My LG only has four HDMI ports and they are currently filled by the MacMini (Kodi), gaming PC over a long-arse HDMI cable, Switch and PS4. I'm thinking the MacMini and the Nintendo Switch can share an HDMI port through a switch that preferably does not require external power. Does anybody have any suggestions for decent ones in the UK?
Same here in the States for my Target order. Getting my DualSense and Pulse 3D headset on Nov. 6. Those both release October 30th.
 
@Shortbread I have read a couple of serious leaks that some devkits were a bit more powerful (around 11 TF) because they were for PSVR2 dev; I assume they could have used the good 40-CU chips plus sample GDDR6 chips (>14 Gbps) from Samsung for those.
 
Trying to figure out how that would make sense... perhaps the PSVR hardware has programmable processing power built in that the devkit needs to be able to simulate? Even then, it's a surprising way to provide that power, I think.
 
> @Shortbread I have read a couple of serious leaks that some devkits were a bit more powerful (around 11 TF) because they were for PSVR2 dev [...]

If it's an early kit, the drivers are possibly less than ideal and will sap performance, so going higher on hardware may bring actual execution closer to target performance.

Also, the final "model APU", which sets the final clocks, came quite late I think. All the dev chatter about devkits pointed to fixed clocks, but separating true info from the bad is hard; the actual clock fluctuations were for retail, and probably only the most recent devkits. That would need final silicon to test, to work out yields and decide how far to push. Perhaps 11 TF was on the cards based on AMD's projections for RDNA2.

Has there been much reliable post-announcement talk that helps decipher the pre-launch leaks?
 
Has there been any new info regarding Dolby Vision support?
When the Series X was announced as having Dolby Vision, Dolby's press release stated it was "the first" and not the only, so perhaps something is in the works, or they hope it is. Licenses cost money, and cutting them is console makers' favorite way to shed cost: e.g. you had to buy Atmos support at one point, they don't support CD playback, etc.
 

Sony made a big fuss about supporting DV on their TVs as of 2016; not putting it on their biggest living-room entertainment vehicle is puzzling.
 
> When the Series X was announced as having Dolby Vision, Dolby's press release stated it was "the first" and not the only, so perhaps something is in the works.
I'm pretty sure the PS4 Pro supports Dolby Vision output, and most probably so do the One S and One X.

Does the Series X support Dolby Vision output in videogames?

If so, is there any advantage to using Dolby Vision in real-time rendering? IIRC Dolby Vision is good for pre-encoded video sources because it automatically adjusts the video's encoded brightness to the brightness range of the TV (dynamic metadata, I think?), but on a console we're supposed to adjust brightness manually. Besides, I don't know why the consoles couldn't just support dynamic metadata without paying the Dolby Vision royalties. Dynamic metadata is an open standard IIRC.
Dolby Vision also supports 12-bit color, but I see no reason why either console couldn't simply set the color output to 12-bit, considering they all support HDMI 2.1, and then let the game engines produce a 12-bit picture in the end.

In the end, I always thought of Dolby Vision as a premium encoder for videos, and not particularly useful for real-time rendered videogames.
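To illustrate the dynamic-metadata point: a toy tone-mapping sketch (plain linear scaling, nothing like Dolby's actual tone curve; all nit values invented) showing why per-scene peak-brightness metadata preserves more on a dim TV than one static value for the whole stream:

```python
# Toy comparison of static vs per-scene (dynamic) HDR metadata.
# Linear scaling only, for clarity; real tone mapping uses curves.

DISPLAY_PEAK = 600.0  # nits the TV can actually produce

# (scene name, scene peak in nits, a sample pixel's luminance in nits)
scenes = [
    ("dark interior", 500.0, 400.0),
    ("sunset",        1500.0, 400.0),
    ("specular sun",  4000.0, 400.0),
]

stream_peak = max(peak for _, peak, _ in scenes)  # static metadata: one value

def tone_map(nits: float, source_peak: float) -> float:
    """Compress [0, source_peak] into [0, DISPLAY_PEAK]."""
    if source_peak <= DISPLAY_PEAK:
        return nits  # scene already fits the display: pass through unchanged
    return nits * DISPLAY_PEAK / source_peak

for name, scene_peak, pixel in scenes:
    static = tone_map(pixel, stream_peak)   # scaled by the worst-case scene
    dynamic = tone_map(pixel, scene_peak)   # scaled per scene
    print(f"{name:14s} static -> {static:5.0f} nits, dynamic -> {dynamic:5.0f} nits")
```

The dark interior gets crushed to 60 nits under static metadata but passes through untouched with per-scene metadata. Which is really the argument above: a game engine already knows its scene's brightness at render time, so it can tone map itself rather than relying on stream metadata.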
 
> I'm pretty sure the PS4 Pro supports Dolby Vision output, and most probably so do the One S and One X.
It doesn't.
> Does the Series X support Dolby Vision output in videogames?
Yep, although I'm not too fussed about that, because this...

> If so, is there any advantage to using Dolby Vision in real-time rendering? [...] In the end, I always thought of Dolby Vision as a premium encoder for videos, and not particularly useful for real-time rendered videogames.
 
The 1S and 1X support DV playback, and Atmos is on both. https://www.dolby.com/gaming/
The Series S and X support DV for games, if I understand it correctly, and support Atmos and DTS:X on both.
The 1S does not support DV games; only the 1X supports DV gaming.
For DV titles, I only know of BFV and IIRC Mass Effect Andromeda.
 
Yea, this continues to alter my understanding of their power setup.
The assumption was that they set the highest possible power at all times and built the cooling around that, so the fan and heatsink would be designed for the maximum power usage and still stay quiet. So in theory it could only get quieter. This is what we all took from Cerny's presentation.

But it would appear that is not fully the case.

Choice quotes here:
"Various games will be released in the future, and data on the APU's behaviour in each game will be collected," Otori said. "We have a plan to optimise the fan control based on this data."

For example, if a game is under heavy load for a long period of time, Sony can increase the fan speed to make sure everything's cool - even at the expense of quietness.

Perhaps I need to separate game code being the determinant for clock speed from fans keeping the system cool. I can only assume that's a hard requirement, since all PS5s must run the same code at the same speed while different environments require different cooling.

But I would have assumed they had all this resolved from the get-go. I wasn't expecting them to still be monitoring the APU in games released in the future and shipping updates.
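For what it's worth, this is how I'd sketch the Cerny model in code: the clock is a deterministic function of workload, identical on every console, while the fan responds to temperature and, per the Otori quote, to per-game curves Sony can retune later. A toy model, every number invented:

```python
# Toy model of the PS5 power scheme as described by Cerny: clock speed is a
# deterministic function of workload, the same on every console, while fan
# speed handles the thermal side and can be retuned per game via firmware.
# All constants are made up for illustration.

POWER_BUDGET_W = 200.0   # fixed power ceiling (invented value)
MAX_CLOCK_GHZ = 2.23

def gpu_clock(activity: float) -> float:
    """Deterministic clock from normalized activity counters (0..1).

    Depends only on what the code is doing, never on temperature,
    so every console runs identical workloads at identical speed.
    """
    est_power = POWER_BUDGET_W * (0.4 + 0.8 * activity)  # model, not a sensor
    if est_power <= POWER_BUDGET_W:
        return MAX_CLOCK_GHZ
    return MAX_CLOCK_GHZ * POWER_BUDGET_W / est_power  # downclock to fit budget

def fan_rpm(apu_temp_c: float, game_profile_bias: float = 0.0) -> float:
    """Fan reacts to temperature (i.e. ambient conditions), not game code.

    `game_profile_bias` stands in for the per-game tuning Otori describes:
    field telemetry can shift the curve up for heavy titles.
    """
    base = 1000.0 + 40.0 * max(0.0, apu_temp_c - 40.0)
    return base * (1.0 + game_profile_bias)

# Same scene -> same clock on every PS5; only the fan differs by environment.
print(gpu_clock(activity=0.9))                            # heavy scene, slight downclock
print(fan_rpm(apu_temp_c=70.0))                           # default curve
print(fan_rpm(apu_temp_c=70.0, game_profile_bias=0.15))   # post-update curve
```

On this reading the clock side stays deterministic; the fan curve is the only part they're still tuning with field data, which would square the two statements.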
 
They'll probably add a Jet Engine Firmware update later on :runaway:
 
> Perhaps I need to separate game code being the determinant for clock speed from fans keeping the system cool. I can only assume that's a hard requirement, since all PS5s must run the same code at the same speed while different environments require different cooling.
But why would you need to monitor that in the wild, across the user base?

I'm also very confused as I thought their approach would make it a more deterministic thing.
 