Digital Foundry Article Technical Discussion [2021]

Status
Not open for further replies.
You're contradicting yourself there; it's generally best to compare the same settings and situations. Besides that, WDL and Hitman are much better-looking titles than, say, Valhalla, which could easily pass as something from last generation. Nothing wrong with 2070/5700 XT performance anyway.
Maybe I put it in the wrong words: it's the best comparison because we have exactly the same settings, but that doesn't mean it's the best representation of the PS5 GPU's level of performance, as it's probably the only next-gen game so far where PS5 falls below 5700 XT performance. Also, I strongly disagree that H3 looks better than Valhalla or the new CoD; its characters look very plastic and the animation is stiff.
 
Well, it sounds like the DF folks struggled to tell whether Resident Evil was native or not, even when using zoomed-in screenshots and presumably pixel counting.
I love this stuff. Look, zoom in on the fire and heat effects and look at these pixels. Or wait for the cutscene to end and dissect the first frame of gameplay.

:cool:
 
Maybe I put it in the wrong words: it's the best comparison because we have exactly the same settings, but that doesn't mean it's the best representation of the PS5 GPU's level of performance, as it's probably the only next-gen game so far where PS5 falls below 5700 XT performance. Also, I strongly disagree that H3 looks better than Valhalla or the new CoD; its characters look very plastic and the animation is stiff.
Yes, for judging PS5 performance this game seems to be the exception, not the rule. In most other big games (Destiny 2, CoD, NBA, Valhalla, Dirt 5, FIFA, or even Immortals Fenyx Rising) PS5 has either the edge over or similar overall performance to the XSX.
 
You're contradicting yourself there; it's generally best to compare the same settings and situations. Besides that, WDL and Hitman are much better-looking titles than, say, Valhalla, which could easily pass as something from last generation. Nothing wrong with 2070/5700 XT performance anyway.

Valhalla is begging to be modded, IMO. I've never understood why Ubisoft insists on launching the AC games with such oversaturated palettes. Origins and Odyssey are two of the best-looking games I've ever played when modded, but they look borderline ugly by default to me. Valhalla appears to be the same, though I've only seen my wife playing it unmodded so far, and to me it looks pretty ugly for the most part compared with Odyssey, which I'm playing at the same time.

But a simple ReShade mod like this would be transformative, IMO.
 
Would framerate-interpolated 60 fps show worse input latency than a regular, natively rendered 60 fps?

Assuming 2.5 frames of game latency for the 30 fps game and 3 frames for the 60 fps game, you would get 83 ms for the 30 fps game and 50 ms for the 60 fps game.
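The arithmetic above can be sketched in a few lines. Note that the extra buffered frame for the interpolated case is my assumption, not from the post: an interpolator has to hold the next rendered frame before it can emit the in-between one.

```python
# Input latency = latency measured in frames x frame time.
# Assumptions from the post: ~2.5 frames of game latency at 30 fps,
# ~3 frames at 60 fps. The +1 frame for interpolation is my assumption.

def latency_ms(frames_of_latency: float, fps: float) -> float:
    """Input latency in milliseconds."""
    return frames_of_latency * 1000.0 / fps

native_30 = latency_ms(2.5, 30)        # ~83.3 ms, as in the post
native_60 = latency_ms(3.0, 60)        # 50.0 ms, as in the post
interp_60 = latency_ms(2.5 + 1.0, 30)  # ~116.7 ms: 30 fps pipeline + 1 buffered frame

print(native_30, native_60, interp_60)
```

So under these assumptions, interpolated "60 fps" would actually feel worse than the 30 fps original, not better.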
 
Valhalla is begging to be modded, IMO. I've never understood why Ubisoft insists on launching the AC games with such oversaturated palettes. Origins and Odyssey are two of the best-looking games I've ever played when modded, but they look borderline ugly by default to me. Valhalla appears to be the same, though I've only seen my wife playing it unmodded so far, and to me it looks pretty ugly for the most part compared with Odyssey, which I'm playing at the same time.

But a simple ReShade mod like this would be transformative, IMO.

Thought I'd have a quick play to prove the theory:

[screenshots: ACValhalla_2021-01-23_17-32-17.png, ACValhalla_2021-01-23_17-32-21.png]
 
but that doesn't mean it's the best representation of the PS5 GPU's level of performance
I don't think the 'ideal' title exists. Each title targets something different on the GPU.
Games all have different talents, different goals, different timelines, and different budgets. Game development isn't like Formula 1, where everyone has the same rules and regulations and you can expect every team to maximize and retune the whole car to driver specification for each track. It's just the way it is, and if you want to look at the performance of PS5, you're going to have to look over a range of games, not just the ones performing at the upper end.

The comparisons in which PS5 is able to edge out PC are also the ones for which it was incredibly difficult for Alex to find equivalent console settings. That's just the nature of the business: PC settings are global, while console settings can be tuned per level. So unless a developer comes straight out and gives specifics, or allows console-like settings on PC, you're only ever going to get a ballpark idea of where it's performing. It's quite taxing to do Alex's job on these 'optimized settings' videos.

The Hitman 3 test is ideal, as is WDL, in that Alex was given the exact settings to follow, so he can benchmark against what he sees on PlayStation and Xbox. The only reason he targets dips is that the framerate is locked at 60, so he is unsure how high PS5 would go.
 
I don't think the 'ideal' title exists(...)
Yeah, sure, but for now PS5 performance in H3 is a deviation from other titles, not the norm (not too big, though; it's just around 8% slower than a 5700 XT in this particular scene, rather than a little faster). Still curious how a 5700 XT performs in the Mendoza level with the sniper-rifle zoom from Tom's video. @Dictator, any idea?
 
Yeah, sure, but for now PS5 performance in H3 is a deviation from other titles, not the norm (not too big, though; it's just around 8% slower than a 5700 XT in this particular scene, rather than a little faster). Still curious how a 5700 XT performs in the Mendoza level with the sniper-rifle zoom from Tom's video. @Dictator, any idea?
Well, it's much too early to call it any sort of deviation, for a variety of reasons:
a) all the titles you listed were basically forced through COVID;
b) we've never had a chance to find equivalent settings (and console settings can be optimized further per stage/level than PC settings can).

So you've really got an issue figuring out what a correct measurement even is.

This is the first real, honest measurement: a game that wasn't rushed to meet launch, a game that hit all of its delivery targets (including VR), a game that is clearly well optimized for all of its platforms. And the developers went ahead and provided Alex the exact setup needed to reproduce PS5 on PC.

None of the earlier measurements checked those boxes: all of them were rushed to make the launch date, many with features missing, games that weren't optimized, and with no settings provided to find PC equivalency.
We've never had apples-to-apples comparisons like this before. And now that we do, the result may not be in alignment with what we saw in the earlier titles.

I think treating the launch titles as an anchor point for its performance is probably the wrong idea. That's 1.5 months' worth of cross-generational games released at the height of COVID, versus the remaining 70.5 months of matured games down the line that won't carry last gen.

That doesn't mean you should expect all PS5 games to land where this one does, but it doesn't make this particular title a deviation either. We don't even have 30 titles out yet, the usual rule-of-thumb minimum for a basic sample size.
tl;dr: I wouldn't label Hitman 3 a deviation. The population is much too small right now; anything in the first two years will likely be spread out and all over the place, then we'll see some very solid concentration of performance in years 3 and 4, and finally some minor spreading towards the very high end in years 5-6.
 
Well, it's much too early to call it any sort of deviation, for a variety of reasons:
a) all the titles you listed were basically forced through COVID;
b) we've never had a chance to find equivalent settings (and console settings can be optimized further per stage/level than PC settings can).

So you've really got an issue figuring out what a correct measurement even is.

This is the first real, honest measurement: a game that wasn't rushed to meet launch.
With all due respect, you don't know whether Valhalla, CoD, or any other game that runs better on PS5 than on XSX (and probably a little above a 5700 XT) was rushed; that's only your narrative ;)
 
With all due respect, you don't know whether Valhalla, CoD, or any other game that runs better on PS5 than on XSX (and probably a little above a 5700 XT) was rushed; that's only your narrative ;)
XSX has its own issues, separate from PS5's, and it's very difficult to compare the two consoles without some common medium of the kind Alex used here.
Anyway, this wasn't about PS5 vs. Xbox; it was about whether this data point should be ignored, and there is clearly no reason to ignore it. It's a clean data point; it should be kept.

As for knowing which games are rushed: typically you know a title was rushed when it ships with day-0 patches at launch, followed by more patches over the following weeks.

The reason Alex does these videos is not to stoke PS5 vs. XSX; it's to answer the technical questions we are dying to know the answers to, such as whether something has actually changed. And PS5 performing around a 5700 XT tells us that clock speed or boost potential can still fall prey to other bottlenecks, bandwidth in particular.
 
XSX has its own issues, separate from PS5's, and it's very difficult to compare the two consoles without some common medium of the kind Alex used here.
Anyway, this wasn't about PS5 vs. Xbox; it was about whether this data point should be ignored, and there is clearly no reason to ignore it. It's a clean data point; it should be kept.

As for knowing which games are rushed: typically you know a title was rushed when it ships with day-0 patches at launch, followed by more patches over the following weeks.

The reason Alex does these videos is not to stoke PS5 vs. XSX; it's to answer the technical questions we are dying to know the answers to, such as whether something has actually changed. And PS5 performing around a 5700 XT tells us that clock speed or boost potential can still fall prey to other bottlenecks, bandwidth in particular.
I'm not ignoring the data, just pointing out that for now it's an outlier; the future will show whether it becomes the norm. And about rushed development: if a game was rushed, then the next-gen console version would probably suffer most, and that's just a narrative/theory I don't even want to analyse.
 
I'm not ignoring the data, just pointing out that for now it's an outlier; the future will show whether it becomes the norm. And about rushed development: if a game was rushed, then the next-gen console version would probably suffer most, and that's just a narrative/theory I don't even want to analyse.
Rushed is rushed. There's not really much point in trying to guess where it would perform better or worse. We're just talking about qualifying data points.

A great many people want to know whether PS5 is really punching way above its weight for its given specs. If it is, there is something to learn and leverage; if it isn't, then it falls on the regression line alongside years of GPU data points. This isn't about smashing PS5; it's just people having a discussion.
 
On your first question: I can barely perceive any resolution above 1440p on my 55" LG C9 at around 8 ft! But I watch DF and NXG videos for their insights into graphical techniques, like how CBR can look damn close to native 4K. Ultimately I don't care whether they like it, or whether there is a better implementation on another platform; it's the differences that interest me. :yes: It's not "is it good", it's "why is it different".
On my playroom's LG C9 77" at 8ft (245cm) viewing distance, I can't see all of the quality of 4K.

On my PC's LG CX 48" at about 3ft (1m ish) viewing distance, oh boy I can see all that lovely 4Kness.

My old eyes are definitely not as keen as the kids' round here, so my personal experience has room for interpretation.

Rule of thumb: your viewing distance should be about the same as the 16:9 diagonal to get the full benefit of the screen's resolution.

It's a shame that most people's screens are far too small for 4K at their gaming/viewing distance. So it turns out that resolutions decently above 1920x1080 but below 4K are probably a very good match for 80%+ of console gamers' setups.
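As a rough cross-check of the rule of thumb above, the setups described in this post work out like this. This is a hedged sketch: the ~60 pixels-per-degree acuity figure is a common 20/20-vision rule of thumb, not something from the post.

```python
import math

# Pixels per degree (PPD) for a 16:9 screen viewed head-on.
# Roughly: above ~60 PPD, extra resolution is hard for 20/20 vision to see.

def ppd(diagonal_in: float, distance_in: float, horiz_px: int) -> float:
    width_in = diagonal_in * 16 / math.hypot(16, 9)            # 16:9 screen width
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
    return horiz_px / fov_deg

# 55" 4K at 8 ft (96"), the C9 living-room example:
print(f"55in at 96in, 4K: {ppd(55, 96, 3840):.0f} PPD")   # well above ~60
# 48" 4K at ~3 ft (36"), the CX desk example:
print(f"48in at 36in, 4K: {ppd(48, 36, 3840):.0f} PPD")   # right around the limit
```

Which lines up with the anecdotes: at 8 ft the 55" panel is far past the acuity limit, while the 48" desk setup sits just about at it.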
 
It's a shame that most people's screens are far too small for 4K at their gaming/viewing distance. So it turns out that resolutions decently above 1920x1080 but below 4K are probably a very good match for 80%+ of console gamers' setups.
On the other hand, I play 1.2 m from a 55" LG CX, and 4K is suddenly not so amazingly sharp; and why should it be, at only ~80 PPI... ;)
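That ~80 PPI figure checks out; pixel density is just the diagonal pixel count divided by the diagonal size (nothing assumed here beyond a 3840x2160 panel):

```python
import math

# Pixel density of a panel: diagonal pixels / diagonal inches.
def ppi(diagonal_in: float, w_px: int = 3840, h_px: int = 2160) -> float:
    return math.hypot(w_px, h_px) / diagonal_in

print(f"{ppi(55):.1f} PPI")  # ~80 PPI for a 55-inch 4K panel, as the post says
```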
 
We've never had apples-to-apples comparisons like this before.

Indeed, as Alex mentioned, it's the best comparison benchmark alongside a few others like WDL. A 5700 XT/RTX 2070 was pegged for the PS5 a long time ago, and the 5700 XT (40 CUs, 448 GB/s) actually seems a very good match. It's clocked a bit lower but also has 4 extra CUs. An OC'd 5700 XT (it will do 2100 MHz) is faster everywhere.
 
What I love about threads like this is that we can revisit them in a few years
 
What I love about threads like this is that we can revisit them in a few years

I do agree that in a few years the 5700 XT won't be a favourable comparison to the PS5 (primarily due to its lack of RT support, but also waning driver and dev support), but at the same time, in a few years most people will barely remember the 5700 XT. Depending on how many years "a few" actually is, the current gen will be the 7800 XT or even 8800 XT by then. The 5700 XT will be seriously old news by the point we're comparing a 7600 XT or 8500 XT to the PS5.
 
I do agree that in a few years the 5700 XT won't be a favourable comparison to the PS5 (primarily due to its lack of RT support, but also waning driver and dev support), but at the same time, in a few years most people will barely remember the 5700 XT. Depending on how many years "a few" actually is, the current gen will be the 7800 XT or even 8800 XT by then. The 5700 XT will be seriously old news by the point we're comparing a 7600 XT or 8500 XT to the PS5.
Outside of feature-support levels, the 8 GB of VRAM will become the limit while PS5 continues to push forward, not to mention whatever that SSD system brings; I'm not sure the 5700 XT can support that sort of thing (if it's hardware-related) on the newer motherboards.
 
I will respond more fully later, but the 2070 Super offered flat-out better performance than PS5, not just in the bandwidth-constrained area I highlight (PS5 drops frames on each camera cut while the 2070S does not). The 2060S goes below PS5, as I show in the video. XSX bests both the 2060S and 2070S, though not by a great deal over the 2070S, and the 2070S actually has less drastic camera-cut frame drops.
The reason I find the comparison from this video so compelling is that, like Watch Dogs Legion, we have most of the important framerate-affecting settings in the scene we're looking at 100% matched on PC: a mirror match for the big settings. It also just happens to be a scene with the bandwidth-sapping particles full on screen. The Ass Creed comparison, I unfortunately think, was another instance where particle quality was greatly affecting performance (fire effects on screen there), and I think the same of Call of Duty. So yes, I find this Hitman comp to be the truest measure of performance I have produced yet, next to WDL; it just also happens to be measuring overdraw performance. Maybe if more devs are nice to us we will get a game with an unlocked framerate and exact settings at some point, to do this again but also show more of the differences.
I think Blops and Ass Creed were disadvantaging PC in ways that cannot be overlooked, especially given the scenes I was realistically able to choose from to compare performance.
About the cutscene in Hitman 3 where you measured 32 fps on XSX and 37 fps on PS5: have you pixel-counted the resolution of the scene on PS5 and XSX? I have just pixel-counted 1080p in the scene on PS5 (on both axes, just a moment before the lowest drop, because there is very noticeable aliasing there). Depending on the XSX resolution (there isn't enough XSX footage in your video), it could change the outcome of your comparison either way.
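As a toy illustration of what pixel counting exploits (hypothetical code, not DF's or anyone's actual tooling): under a nearest-neighbour upscale, each internal scanline becomes several identical output scanlines, so the modal run length of duplicated rows reveals the scale factor. Real captures go through bilinear filtering and TAA, so counting is actually done on stair-step edges, but the principle is the same:

```python
# Estimate an integer upscale factor from duplicated scanlines.
def estimate_scale(rows):
    """rows: list of hashable scanlines; returns the modal duplicate-run length."""
    runs, run = [], 1
    for prev, cur in zip(rows, rows[1:]):
        if cur == prev:
            run += 1
        else:
            runs.append(run)
            run = 1
    runs.append(run)
    return max(set(runs), key=runs.count)

# Synthetic low-res source: a diagonal edge that steps once per source line.
src = [tuple(1 if x < y else 0 for x in range(16)) for y in range(16)]
up2 = [line for line in src for _ in range(2)]  # nearest-neighbour 2x upscale

print(estimate_scale(src), estimate_scale(up2))  # 1 for native, 2 for the upscale
```

On the upscaled image the factor comes out as 2, i.e. a "2160p" output with that signature would pixel-count as 1080p internal.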

[pixel-counting crops: hjoGKmU.png, UFkSTnb.png; whole image: ykIwavM.jpg]
 