Digital Foundry Article Technical Discussion [2021]

Status
Not open for further replies.
@Globalisateur The image there is 1080p because our frametime overlay for four-way comparisons is 1080p (to save rendering time). The actual image in the background is the 4K feed of the PS5 (1800p internal, of course) downscaled to 1080p to save rendering time for video export.
Hitman 3 always runs at the described resolutions no matter what: 4K on Xbox, 1800p on PS5.
 
Odd. Normally we should be able to see the native aliasing in a downsampled image; I have done it plenty of times, and I can even see the specific work of the AA on the first aliasing. Are you sure it's actually 1800p using the full 4K image? Have you actually pixel counted the cutscene?
 
You can count a downsampled image? How? Inside each pixel is the information of four; each pixel is an oversampled gradient. How do you manage that?
 
Well, yes, we can pixel count a downsampled image. Somehow it works in most cases if we have obvious aliasing.
 
But you wouldn't get exact numbers, because you have falsified aliasing. You could guess (from a downsampled image) that it might not be native resolution, but that is more or less all you can say about it. You just don't know what the original resolution was unless you check the original source.
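The technique being debated here can be sketched in a few lines. This is a toy illustration (Python/NumPy, with made-up resolutions, not anyone's actual tooling): render a hard diagonal edge at a lower internal resolution, nearest-neighbour upscale it to the output resolution, then measure the staircase step lengths along the edge; output height divided by the average step length recovers the internal resolution. Downsampling afterwards averages those hard steps into gradients, which is exactly why counts taken from downsampled footage get shaky.

```python
import numpy as np

# Toy pixel count: a 45-degree edge "rendered" at 900px and
# nearest-neighbour upscaled to an 1800px output (hypothetical values).
internal, output = 900, 1800
idx = np.arange(internal)
img = (idx[None, :] >= idx[:, None]).astype(np.uint8)  # edge along the diagonal
scale = output // internal
up = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

# Locate the edge column in every output row, then measure how many
# consecutive rows share the same column: the staircase step length.
edge_cols = up.argmax(axis=1)
steps = np.diff(np.flatnonzero(np.diff(edge_cols)))
estimated_internal = round(output / steps.mean())
print(estimated_internal)  # 900
```

On real footage the edge columns come from thresholding a gradient rather than a clean binary image, so the step lengths scatter and the estimate becomes a guess rather than a count.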
 
I'm pretty sure the results would be consistent, so it would be possible to work it out with enough patience.
This would be something you could train an AI neural network to do.
Maybe an idea for a project for someone if they are bored?
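For what it's worth, the training data for that neural-network idea is easy to synthesize. A rough sketch (Python/NumPy; the resolutions and the single-edge image model are invented for illustration): generate edges at a random internal resolution, nearest-neighbour upscale, box-downsample to fake the "falsified aliasing", and keep the internal resolution as the label.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_sample(output=256):
    """One (image, label) training pair: a random-slope edge at a random
    internal resolution, nearest-neighbour upscaled to `output`, then 2x
    box-downsampled - i.e. the falsified-aliasing case discussed above."""
    internal = int(rng.choice([64, 128, 256]))   # hypothetical resolutions
    slope = rng.uniform(0.3, 3.0)
    ys, xs = np.mgrid[0:internal, 0:internal]
    img = (xs >= slope * ys).astype(np.float32)
    scale = output // internal
    up = np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)
    half = output // 2
    down = up.reshape(half, 2, half, 2).mean(axis=(1, 3))  # 2x box filter
    return down, internal

x, y = make_sample()
print(x.shape, y)  # (128, 128) with y one of 64/128/256
```

A classifier trained on pairs like these would at least answer whether the staircase statistics survive downsampling well enough to be learnable, before anyone tries it on real captures.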
 
@Dictator why didn't you pick another scene to make your overall comparison? The PS5 is performing better than the 2060 Super there (and about the same as the 5700 XT).

[attached graphs: x9xy1nN.png, vRBE6h6.png]


Pixel counting is never guaranteed 100%. But oddly, in my experience, if you have clean aliasing then you can sometimes pixel count a downsampled image well enough. But sure, often it doesn't work; it really depends on how the downsampling has been done. Sometimes the original aliasing is somehow still there (I don't really know how), sometimes it is replaced by the downsampled resolution. That's probably what happened here anyway.
 
They test against scenes where the PS5 is not hitting 60 because, with a 60fps frame cap, there is no way to know whether performance is being left on the table.
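The cap argument is easy to see with numbers. A trivial sketch (the frame rates are made up): two GPUs that are 35fps apart uncapped become indistinguishable once both clear 60.

```python
# Hypothetical uncapped frame rates for two GPUs in the same scene.
true_fps = {"GPU A": 75.0, "GPU B": 110.0}

# Under a 60fps cap both report the same number, so capped data
# cannot rank them; only scenes that dip below the cap can.
capped = {name: min(fps, 60.0) for name, fps in true_fps.items()}
print(capped)  # {'GPU A': 60.0, 'GPU B': 60.0}
```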
 

But those scenes selected by Globby do show a distinct advantage over the RTX 2060 Super when capped or v-synced at 60fps across these cards.

Anyhow, this is the inherent problem with selecting certain scenes over others when trying to draw conclusions about which PC GPU variants are equivalent to these systems, especially so early on.
 

That's why they show a variety of scenes, talk about the average, talk about how subjectively noticeable the frame drops are, and speculate on probable causes (alpha!). Short of having access to PS5 dev tools and looking at it in a profiler, what more do you guys want?
 

Nothing. I was simply making an observation in a tech-related thread.
 

I work in clinical research, within data management. In my industry, we have to check multiple scenarios: if something goes wrong one way, we check for it, but we also have to check for the reverse situation; otherwise we risk drawing conclusions from limited data. If a laboratory result is 3x the normal range, we wouldn't look at that result and conclude that the study drug had a dose-limiting toxicity; we'd check for other concomitant medications, related adverse events, etc., to define what's causing the data to appear that way.

The last video appeared to determine that bandwidth contention was causing the framerate drops (I agree; the image of a single car with an alpha effect doesn't look like it's maxing out the rendering hardware), but we can't then perform the same test on a 2060S and conclude that the PS5 and the 2060S are broadly similar in rendering performance, as the test wasn't measuring the rendering performance of the cards. It highlighted bandwidth contention, which Alex pointed out, but he then went on to conclude that they're broadly similar in rendering performance (which may be true, but that test certainly did not confirm it). There were a few other points that looked like they were from the same situation/scene where the 2060S also dropped frames, but again these look like similar kinds of issues that are perhaps unrelated to rendering performance.

It appears that a lowered-framerate situation was identified on the consoles and then testing proceeded from that situation alone. We didn't have the reverse: framerate drops identified on the PC hardware and compared back to the PS5. All we have is a demonstration of the proportional difference in bandwidth contention.

I definitely agree that finding framerate drops usually determines relative rendering performance; it's just that this test did not do that. This appeared to be acknowledged by Alex, only no significant further testing was performed and he instead drew the premature conclusion that the PS5 was a little above the 2060S. Regardless of whether it is or is not broadly similar, the demonstrated results are inconclusive.

(Even the Watch Dogs Legion video wasn't a great example, as it was essentially an RT test and it was limited to 30Hz with no deviation.)
 

The point Alex selected was the point of lowest performance shown by the PS5. Surely that's the most logical choice given a selection of possibilities? It's not like he's hiding the fact that in other scenes the PS5 performs better, because he specifically calls it out.
 

No one said he was hiding anything. I was simply making an observation about what Globby posted, and how different scenes can paint a slightly different picture in PC GPU comparisons.
 
I don't want to speak for Dictator, but as far as I understood from the video, he wasn't trying to show that the PS5 sits below an RTX 2060. But in the particular scenario where all cards dip, including the PS5, he is able to attribute it (with high probability) to bandwidth availability.

So in the second graph posted, only the 2060 dips; the assumption is that if the scene were bandwidth heavy, all of them would dip (all cards having 448 GB/s). So something else was likely the culprit. Unfortunately, there's not much information to glean from there.
 
Speaking of possible alpha issues, could this be the reason why Call of the Sea is performing at a modest resolution (1440p) on XBSX? I would love to see DF or NX Gamer do a comparative analysis between the Series edition and various PC hardware.
 
I didn't think there was that much alpha going on; it's not visible in the videos. I'd have to download the game myself to see what's happening.
It could be lighting calculations; I'm not sure if the lighting is baked or otherwise.
 

Something is going on; I saw a video a week ago where an RTX 2060 or 2070 was performing above 60fps at 4K.
 