It supports UHD BD and is most likely to have an H.265 decoder in its GPU.
Xbox One S does it, so the answer is most likely yes.
EDIT: beaten
I'm too lazy to check all benchmarks here
https://www.reddit.com/r/htpc/comments/31gw0u/
but is the PS4's CPU powerful enough to decode an H.265 4K stream (15-18 Mbps for Netflix, apparently) at 30fps?
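For anyone who'd rather test than guess: assuming you have ffmpeg built with a software HEVC decoder, something like this reports whether a given CPU decodes a 4K HEVC file at real-time speed. The file name below is just a placeholder, not a real sample.

```python
# Rough sanity check: software-decode an HEVC file as fast as possible and
# report ffmpeg's timing. "sample_4k_hevc.mkv" is a placeholder file name.
# Requires ffmpeg on PATH; output goes to a null sink so only decode speed matters.
import subprocess

result = subprocess.run(
    ["ffmpeg", "-benchmark", "-i", "sample_4k_hevc.mkv", "-f", "null", "-"],
    capture_output=True, text=True,
)

# ffmpeg prints progress and the -benchmark summary on stderr;
# a "speed=" figure of 1.0x or better means real-time decode.
for line in result.stderr.splitlines():
    if "speed=" in line or "bench:" in line:
        print(line.strip())
```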
Then the problem is not in the CPU.
EDIT: beaten
So is the PS4 unable to stream 4K content after all?
That's not the point. If the PS4 slim and fat can do it, and they have deliberately not been updated for 4K streaming just so that "the Pro is for that purpose", then every PS4 should be for that purpose, except for content they don't have the means for, like 4K gaming.
I thought the PS4 and PS4 Slim only supported 1080p and HDR via a firmware update. I never heard or saw any mention that they would ever do 4K for anything. That is the job of the PS4 Pro.
It isn't. Per Stacey Spears, who was part of the team that worked on the software codecs on the Xbox One, they tried to implement a hybrid decoder that used both the CPU and GPU to decode 4K Netflix streams but were not able to get satisfactory results. And even if they had got it working, it would have used a *lot* of power, and generated a lot of heat, during playback.
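We don't know exactly where their hybrid decoder fell short, but one classic culprit is that HEVC's CABAC entropy decoding is largely serial within a slice, so it stays on the CPU while only the rest parallelizes. Amdahl's law then caps the benefit of any GPU offload. A minimal sketch; the 30% serial share below is an assumption for illustration, not a measured figure.

```python
# Illustrative only: if a fraction of the decode work (e.g. CABAC entropy
# decoding) must stay serial on the CPU, Amdahl's law caps the overall
# speedup no matter how much GPU you throw at the parallel part.

def amdahl_speedup(serial_fraction: float, parallel_speedup: float) -> float:
    """Overall speedup when only the parallel part gets faster."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / parallel_speedup)

serial = 0.30  # assumed share of decode time stuck on the CPU
for gpu_x in (2, 4, 8, 100):
    print(f"GPU part {gpu_x:>3}x faster -> overall {amdahl_speedup(serial, gpu_x):.2f}x")
# Even with an infinitely fast GPU the ceiling is 1/0.30 ~= 3.3x.
```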
Well, we were having a similar discussion back in the day about the PS3's ability to output stereoscopic 3D, which if I recall required a higher HDMI standard than the one it had, but it eventually had the means to do it.
My little knowledge (and shoddy memory) of video codecs is that H.265 has better, more flexible slicing, which means it is better suited to multi-CPU decoding. Which probably means GPGPU is also quite well suited to the job; no idea whether the power usage would increase.
But for output to the television in regards to 4K, do you not need HDMI 2.0 to do that?
I am not sure if that can be the case for the PS4 as well. But if the power usage would increase, is that necessarily a problem? The console won't be doing anything else but streaming the video when running Netflix or YouTube in 4K. The console already suspends any other application when running a video app.
Well, I wonder if that can also be solved.
Decoding and being able to output is only part of it.
The other bit is the HDCP level required by the streaming services like Netflix.
HDCP is the security aspect.
Even if it's technically possible, that doesn't mean it's desirable, especially if it means your PS4 sounds like it's running Rocket League while streaming from Netflix.
Not with the CPU, but a GPGPU solution could definitely do it. At the cost of consuming as much power as a AAA game, though.
It would have used a lot more power than H.264 decoding, but I don't think it would have been impossible to get it running using GPGPU.
IMO, there's really no reason why 4K H.265 video decoding couldn't be made parallel enough to take almost full advantage of the Xbone's 1.2 TFLOPs of FP32.
IIRC, Intel's Broadwell can do 4K H.265 using GPU+CPU, with much smaller GPUs. Sure, there might be no practical interest in making it possible (neither the original Xbone nor the PS4 has HDCP 2.2 anyway), but I'm pretty sure it would be feasible on the current-gens.
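Back-of-envelope on that parallelism claim, using the 1.2 TFLOPs figure from the post above. This is purely a FLOP budget per pixel, not an estimate of what decoding actually costs (real decoders are integer-heavy and memory-bound):

```python
# Back-of-envelope: how many FP32 ops per pixel does 1.2 TFLOPS leave at 4K30?
# Treat this as an upper bound on available compute, not a decode-cost estimate.
width, height, fps = 3840, 2160, 30
gpu_flops = 1.2e12  # Xbone-class FP32 throughput, from the post above

pixels_per_second = width * height * fps          # ~2.49e8
flops_per_pixel = gpu_flops / pixels_per_second   # ~4800

print(f"{pixels_per_second:.3e} pixels/s -> ~{flops_per_pixel:.0f} FLOPs per pixel")
```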
What if they only decoded every other quad of pixels and interpolated the rest? I'd call it checker-board decoding. It's been said enough times that it looks as good as the real thing...
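Tongue-in-cheek or not, the mechanism is easy to sketch: keep a checkerboard of "decoded" pixels and fill in the rest from their neighbours. A toy greyscale version with numpy; wrap-around at the edges is fine for an illustration like this, and everything here is hypothetical:

```python
# A toy version of the "checker-board decoding" joke above: keep pixels in a
# checkerboard pattern and reconstruct the missing ones by averaging their
# left/right neighbours. Purely illustrative, single greyscale channel.
import numpy as np

def checkerboard_fill(frame: np.ndarray) -> np.ndarray:
    h, w = frame.shape
    yy, xx = np.mgrid[0:h, 0:w]
    keep = (yy + xx) % 2 == 0               # the pixels we "decoded"
    out = frame.astype(np.float64) * keep   # zero out the skipped pixels
    left = np.roll(out, 1, axis=1)          # horizontal neighbours were kept,
    right = np.roll(out, -1, axis=1)        # since the parity flips each column
    out[~keep] = (left[~keep] + right[~keep]) / 2.0
    return out

frame = np.random.randint(0, 256, (4, 8)).astype(np.float64)
print(checkerboard_fill(frame))
```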
Even if that was the case for HDCP 2.2, which I don't believe it is.
Almost certainly; did the 360 not output DVD and HD DVD via component and VGA, analogue output with no HDCP, at full resolution no less?
I assume then it's on the end device to comply with extending the chain of security to the display; they could disable it, just as Sony did with games on the PS4, which initially had HDCP applied.
No, you don't need that if you're fine with a 30Hz refresh rate, which would suit video just fine. HDR, on the other hand, isn't supported.
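The arithmetic behind that: HDMI 1.4 carries roughly 8.16 Gbps of video data after 8b/10b coding, and the standard 4K raster including blanking is 4400x2250. A quick check using those published figures:

```python
# Rough arithmetic behind the 30Hz point: HDMI 1.4's TMDS link tops out at
# 340 MHz per channel (10.2 Gbps raw, 8.16 Gbps of video data after 8b/10b).
# Standard 4K timings use a 4400x2250 total raster (blanking included).
total_w, total_h, bits_per_pixel = 4400, 2250, 24
hdmi14_data_gbps = 8.16   # 340 MHz x 3 channels x 8 data bits
hdmi20_data_gbps = 14.4   # 600 MHz x 3 channels x 8 data bits

for fps in (30, 60):
    gbps = total_w * total_h * fps * bits_per_pixel / 1e9
    fits = "fits HDMI 1.4" if gbps <= hdmi14_data_gbps else "needs HDMI 2.0"
    print(f"4K{fps}: {gbps:.2f} Gbps -> {fits}")
# 4K30 at ~7.13 Gbps squeezes into HDMI 1.4; 4K60 at ~14.26 Gbps does not.
```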