Digital Foundry Article Technical Discussion Archive [2016 - 2017]

@grandmaster:

Why are you using the old H.264 format for your 4K download videos on digitalfoundry.net?

H.264 is not efficient for 4K.

Why don't you use the much more efficient open VP9 format which is already being used by YouTube and considered by Netflix?

Here's a recent video on VP9 vs. H.265 vs. H.264 (vs. AV1) done by the Netflix encoding department (and the Alliance for Open Media):


Decoding a heavily compressed H.265 video is extremely CPU-intensive if you do not have a GPU with H.265 hardware decode support. For example, my i5 2500K @ 4.4 GHz cannot smoothly play back a heavily compressed 4K H.265-encoded video when the GPU doesn't have H.265 hardware decode support.

H.264 will offer smooth playback for far more people than H.265.

Assuming they had the time (which they may not, as encoding can take up a lot of it), they could offer both an H.264 and an H.265 download. But if they only have the option to offer one, then H.264 is the better option. VP9 doesn't change things much in that regard. And I've had some strange issues at times with VP9-encoded videos on YouTube as well.
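If they did go the dual-download route, the encoding side is mostly a batch job. A minimal sketch of what that could look like, assuming ffmpeg with libx264/libx265 on the PATH (the file names and CRF values are illustrative placeholders, not DF's actual settings):

```python
# Hypothetical batch encode: one master file in, an H.264 and an H.265
# download out. Assumes ffmpeg built with libx264 and libx265.
import subprocess

MASTER = "master_2160p60.mov"  # placeholder source name

# CRF-based encodes; as a rough rule of thumb, x265 needs a higher CRF
# number than x264 for comparable perceived quality.
jobs = [
    (["-c:v", "libx264", "-preset", "slow", "-crf", "18"], "download_h264.mp4"),
    (["-c:v", "libx265", "-preset", "slow", "-crf", "22"], "download_h265.mp4"),
]

for codec_args, out_name in jobs:
    subprocess.run(
        ["ffmpeg", "-y", "-i", MASTER, *codec_args, "-c:a", "copy", out_name],
        check=True,  # stop if an encode fails
    )
```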

Regards,
SB
 
Assuming they had the time (which they may not, as encoding can take up a lot of it), they could offer both an H.264 and an H.265 download. But if they only have the option to offer one, then H.264 is the better option.
You do realize that digitalfoundry.net wants money for those video downloads, don't you?

...

And I've had some strange issues at times with VP9-encoded videos on YouTube as well.
"Strange issues" that you can not even put into words, gotcha!

VP9 YouTube working fine here on an i7-6700K.

I'm pretty sure the x264 encoder still beats all the newer ones given a high enough bitrate.
Maybe you should have watched the video linked in the post; then you would have known that Netflix found that:
http://techblog.netflix.com/2016/08/a-large-scale-comparison-of-x264-x265.html said:
[...] x265 and libvpx demonstrate superior compression performance compared to x264, with bitrate savings reaching up to 50% especially at the higher resolutions. [...]

Furthermore, you would have seen in the same video that the Alliance for Open Media is aiming to make AV1 (the successor to VP9, planned to be finished in 2017) yet another 50% more efficient than VP9/H.265, and that they are committed to having hardware decoding available, which probably isn't too difficult for them, considering that the Alliance for Open Media consists of members like Intel, AMD, NVIDIA, ARM, Broadcom, Google, Microsoft, Amazon, Netflix and so on.

But please feel free to continue praising x264... ;)

And regarding high bitrates:

digitalfoundry.net's 2160p60 H.264 videos apparently have a bitrate of around 50 Mbps. That's not high at all. That's a bitrate that would usually be considered high for 1080p60 video, but not for 2160p60 video (in H.264). Heck, even 1080p24 H.264 Blu-ray Disc video goes up to 40 Mbps.

2160p60 @ 50 Mbps could be considered high for VP9/H.265, but not for H.264.
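To make that concrete, normalise those bitrates to bits per pixel per frame. A quick worked calculation using the figures from this post:

```python
# Bits per pixel per frame for the bitrates discussed above.
def bits_per_pixel(mbps, width, height, fps):
    return (mbps * 1_000_000) / (width * height * fps)

print(f"2160p60 @ 50 Mbps: {bits_per_pixel(50, 3840, 2160, 60):.3f} bpp")  # ~0.100
print(f"1080p60 @ 50 Mbps: {bits_per_pixel(50, 1920, 1080, 60):.3f} bpp")  # ~0.402
print(f"1080p24 @ 40 Mbps: {bits_per_pixel(40, 1920, 1080, 24):.3f} bpp")  # ~0.804 (Blu-ray ceiling)
```

Per pixel, the 4K60 files get roughly an eighth of what a maxed-out Blu-ray spends, which is why 50 Mbps is modest here.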
 
"Strange issues" that you can not even put into words, gotcha!.

Sure, I can go into details. Rendering anomalies (occasional blockiness in parts of a video stream on a GTX 1070), video streams randomly stopping with an error (I don't remember which one, as I haven't written it down) that results in a static screen that looks like "snow" from old analog CRTs (likely a stylistic choice for the error screen), etc. Things that didn't happen until YouTube started switching videos to VP9, and errors that I do not experience on other sites that don't use VP9 for their streamed videos. I don't use Netflix so I can't comment on their streaming service.

Regards,
SB
 
[...] x265 and libvpx demonstrate superior compression performance compared to x264, with bitrate savings reaching up to 50% especially at the higher resolutions. [...]
Superior compression does not mean better quality.
At the same sufficiently high bitrate, x264 is likely still better visually.
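That claim is testable rather than a matter of taste. A rough sketch of scoring two same-bitrate encodes against a pristine source with ffmpeg's built-in ssim filter (Netflix's own study used VMAF; the file names here are placeholders):

```python
# Compare same-bitrate x264 and x265 encodes against the source using SSIM.
# The ssim filter takes the distorted clip first and the reference second,
# and prints per-frame plus average scores to stderr.
import subprocess

REFERENCE = "source_lossless.mkv"  # placeholder pristine master

for encode in ("clip_x264_50mbps.mp4", "clip_x265_50mbps.mp4"):
    result = subprocess.run(
        ["ffmpeg", "-i", encode, "-i", REFERENCE,
         "-lavfi", "ssim", "-f", "null", "-"],
        capture_output=True, text=True,
    )
    # The summary line looks like: "[Parsed_ssim_0 ...] SSIM Y:0.98 ... All:0.97 ..."
    scores = [line for line in result.stderr.splitlines() if "SSIM" in line]
    print(encode, scores[-1] if scores else "no score reported")
```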

XB1 closing the gap, yeah. 900p and worse performance.
 
So...

Image quality: PS4 Pro >>> PS4 > X1
Performance: PS4 > X1 > PS4 Pro (???)

Wondering why the AA is so fucked up on the regular consoles. That's some pretty terrible shimmer visible even through YouTube compression; it must be worse up close :nope:
 
Furthermore, you would have seen in the same video, that the Alliance for Open Media is aiming to make AV1 (the successor to VP9, planned to be finished in 2017) yet another 50% more efficient than VP9/H.265. And that they are commited to have hardware decoding available, which probably isn't too difficult for them, considering that the Alliance for Open Media consists of members like Intel, AMD, NVIDIA, ARM, Broadcom, Google, Microsoft, Amazon, Netflix and so on.
Until Qualcomm, Samsung and Apple provide hardware support for VP8 and VP9, hardware decoding will benefit just a sliver of the actual market of devices on which most streaming media is consumed - mobile devices. This is largely what has held back widespread use of Google codecs for years. There is fair support in TVs and STBs.
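On the desktop side, at least, you can check which decode paths a given machine actually exposes. A rough sketch using ffmpeg's capability listings (note this probes the local ffmpeg build, not the mobile SoCs the post is about):

```python
# List the hardware acceleration methods and the hardware-backed VP9/HEVC
# decoders the local ffmpeg build knows about.
import subprocess

def ffmpeg_list(flag):
    return subprocess.run(
        ["ffmpeg", "-hide_banner", flag],
        capture_output=True, text=True,
    ).stdout

print("Available hwaccels:")
print(ffmpeg_list("-hwaccels"))

# Vendor-specific decoder names: *_cuvid (NVIDIA), *_qsv (Intel Quick Sync).
for line in ffmpeg_list("-decoders").splitlines():
    if any(v in line for v in ("cuvid", "qsv")) and any(c in line for c in ("vp9", "hevc")):
        print(line.strip())
```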
 
Superior compression does not mean better quality.
At the same sufficiently high bitrate, x264 is likely still better visually.

XB1 closing the gap, yeah. 900p and worse performance.

That fog on XBO when driving at 4:20... to help the framerate, or coincidence?

Also, I'm not happy about lower frames on the Pro - I thought Sony mandated at least the same? Hopefully it will be patched (not that I'm going to get it)
 
Also, I'm not happy about lower frames on the Pro - I thought Sony mandated at least the same? Hopefully it will be patched (not that I'm going to get it)
This is a problem people should have seen coming. What if the frame rate is better 99% of the time and worse in 1% of the time? What about 95% and 5%? Should Sony take a zero tolerance approach to TRC approval on Pro code if it's ever slower and how do they test and monitor this?

Sony's guidance to developers is just that: guidance. There's no reasonable way for Sony or devs to deliver on it without a lot of extra testing.
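For what that extra testing could even look like: a toy sketch of the kind of pass/fail check a TRC rule would need, fed with a per-frame timing log from a capture run (the log format and thresholds are assumptions):

```python
# Given per-frame times in milliseconds, report how often a run misses a
# 30 fps target and the worst single frame. Thresholds are illustrative.
def slowdown_report(frame_times_ms, target_ms=33.4):
    over = [t for t in frame_times_ms if t > target_ms]
    return 100.0 * len(over) / len(frame_times_ms), max(frame_times_ms)

# Toy data: a capture that holds 33.3 ms except for 1% of frames.
frames = [33.3] * 990 + [50.0] * 10
pct, worst = slowdown_report(frames)
print(f"{pct:.1f}% of frames over target, worst frame {worst:.1f} ms")

# A zero-tolerance rule fails this run; a 99th-percentile rule might pass
# it -- which is exactly the judgement call described above.
```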
 
This is a problem people should have seen coming. What if the frame rate is better 99% of the time and worse in 1% of the time? What about 95% and 5%? Should Sony take a zero tolerance approach to TRC approval on Pro code if it's ever slower and how do they test and monitor this?

Sony's guidance to developers is just that: guidance. There's no reasonable way for Sony or devs to deliver on it without a lot of extra testing.

This is where options would help: let the player decide - performance or details, two options. Either that, or turn down some effects to hit a locked 30.
 
Watch Dogs 2 was just patched to fix performance issues on PS4 Pro. DF hasn't tested it yet, but it's in the patch notes and people at GAF are saying it's completely smooth now.
 
Watch Dogs 2 was just patched to fix performance issues on PS4 Pro. DF hasn't tested it yet, but it's in the patch notes and people at GAF are saying it's completely smooth now.
But what will people complain about now? :runaway:

I really liked WATCH_DOGS but I'm giving the sequel a miss for a somewhat arbitrary visual design reason. Something I absolutely hated about the original, and which looks to be the same in WD2, is the way any object your device is focused on flashes regardless of how large it is. It looks bloody awful and I wish you could turn that off. That and the arrows on the road.
 
Superior compression does not mean better quality.
At the same sufficiently high bitrate, x264 is likely still better visually.

For some visuals, yes. At the top end there is little in it, but these bitrates sit somewhat below that. I enjoyed the low bitrate comparison.
 
They did a vid on Arkham City PS4. Apparently, it had stealth Pro support all this time, and runs better when played on that. That said, it's not really anywhere near as good as just playing it on a three-year-old PC if you're using max settings, or a six-year-old (or older) PC if you're using X360 settings. The frame rate is still not great, though at least you seem to get a 30 FPS minimum (I didn't watch the whole vid), unlike the non-Pro experience.
 
A game that runs at 1080p on XB1 is by nature a non-demanding game... as a game that runs at native 4K on PS4 Pro is by nature a non-demanding game.
That's some pretty backwards logic. Why can't a game be demanding and 1080p, by sacrificing other aspects of the graphics to hit the resolution target? Hypothetically, for illustration: a 1080p photorealistic FPS knitting simulator with full fabric fibre simulation on the GPU and raytraced needle tech running at 5 fps - by your definition that's undemanding because it's picked a 1080p target. :-?
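The arithmetic behind that point: resolution fixes how many pixels a frame budget is spread across, not how much work each pixel gets. A quick illustration (the numbers are purely for scale):

```python
# Frame-budget time available per pixel at two resolutions, 30 fps target.
BUDGET_NS = 33.3 * 1_000_000  # one 30 fps frame, in nanoseconds

for name, w, h in (("1080p", 1920, 1080), ("2160p", 3840, 2160)):
    print(f"{name}: {w * h:,} px -> {BUDGET_NS / (w * h):.1f} ns per pixel")

# 1080p leaves ~4x the per-pixel budget of native 4K, so a 1080p game can
# absolutely be demanding if it spends that budget on per-pixel work.
```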
 
DF released their technical analysis of Deus Ex: Mankind Divided yesterday... but today Eidos released a patch that improves performance on the Pro.

Bad timing for DF. :)
 