Digital Foundry Article Technical Discussion [2021]

The 2070S offered flat-out better performance than PS5, not just in the bandwidth-constrained area that I highlight (PS5 drops frames on each camera cut while the 2070S does not). The 2060S falls below PS5, as I show in the video. XSX bests both the 2060S and the 2070S, though not by a great deal over the 2070S, and the 2070S actually has less drastic camera-cut frame drops.

So yes, I find this Hitman comparison to be the truest measure of performance I have produced yet, next to WDL.

PS5 below the 2070S (around a vanilla 2070) and XSX a bit above the 2070S is actually where many had put them a long time ago. I also agree that Hitman and WDL are the best benchmarks so far; they push things.

I must also add: thanks a lot for your videos and the time you put into all this, and especially for your contributions to us here on the forums.

Consoles have always been about tradeoffs.

Yes, otherwise we would not have seen $500 boxes.

PS5 is somewhere between the 2060S and the 2070S, or in other words, a 2070/5700 XT.

DF/Alex called that a very long time ago (a vanilla RTX 2070 overall), aside from RT, where the 2070 has an advantage. When it released in 2018, it was a very good middle-ground card: considerably faster than a 2060, slower than a 2080 (and much slower than the 2080 Ti, which I went for).
 
I was thinking the same thing with regard to alpha effects in this title and ACV. Given the bandwidth advantage of XSX, it doesn't feel like it should drop so significantly in these areas compared to the PS5. Is it possible these early games are dipping into the "slow" pool of the XSX memory, either because developers haven't had time to optimize or because the "tools" are still not mature enough in this regard?
Alpha effects are a very basic hardware job that should not need any special optimization. And I doubt those cross-gen games need more than 10 GB of memory, as they only need 5 GB on the base consoles.
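
As a rough sense-check of why touching the standard pool would matter: the arithmetic below simply time-shares the bus between the two publicly quoted Series X pools (10 GB at 560 GB/s, 6 GB at 336 GB/s). The split fractions are made up, and real behaviour depends on the memory controller, access patterns and CPU contention, so treat it as a sketch rather than a measurement.

```python
# Illustrative only: peak figures from the public Series X memory layout
# (10 GB "GPU-optimal" at 560 GB/s, 6 GB "standard" at 336 GB/s) and an
# assumed traffic split. Real contention behaviour is far messier.
FAST_BW = 560.0   # GB/s, 10 GB GPU-optimal pool
SLOW_BW = 336.0   # GB/s, 6 GB standard pool

def blended_bandwidth(slow_fraction: float) -> float:
    """Effective bandwidth if the bus is time-shared between the pools."""
    return 1.0 / (slow_fraction / SLOW_BW + (1.0 - slow_fraction) / FAST_BW)

for frac in (0.0, 0.1, 0.25, 0.5):
    print(f"{frac:.0%} of traffic in the slow pool -> ~{blended_bandwidth(frac):.0f} GB/s")
```

Even a modest spill into the slow pool pulls the effective figure down toward, and eventually below, the PS5's uniform 448 GB/s, which would be one plausible way for the on-paper bandwidth advantage to shrink in alpha-heavy scenes.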
 
If you feel it's fair to compare DLSS to native rendering, then I assume you're also okay with comparing CBR to native rendering. Both are comparable to native rendering. Leadbetter has said how he was challenged by Cerny and was eventually only able to tell with his nose to the screen.

If we accept some upscaling techniques that are platform specific and not others, then we're shifting into confirmation bias.

If the CBR implementation were providing image quality equal or superior to native outside of zoomed-in stills, i.e. at any reasonable viewing distance and screen size, then yes, I'd argue it's fair to compare it with native rendering. In the comparisons I've seen, though, that generally isn't the case.
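
For anyone following this sub-thread without a mental model of CBR, here is a deliberately stripped-down sketch of the checkerboard idea: each frame only shades half the pixels and the rest are inherited from the previous reconstruction. Real implementations reproject the history with motion vectors and reject stale samples with ID/depth buffers, all of which is omitted here, so this is only meant to show why the result is comparable to native rather than identical to it.

```python
import numpy as np

def checkerboard_mask(h: int, w: int, frame_index: int) -> np.ndarray:
    """Which pixels get shaded this frame; the pattern flips every frame."""
    yy, xx = np.indices((h, w))
    return (yy + xx + frame_index) % 2 == 0

def cbr_frame(full_render: np.ndarray, history: np.ndarray, frame_index: int) -> np.ndarray:
    """Shade half the pixels, keep the other half from last frame's result."""
    mask = checkerboard_mask(*full_render.shape, frame_index)
    out = history.copy()
    out[mask] = full_render[mask]
    return out

# Toy usage: a slowly changing grayscale "scene" converges over a few frames.
h, w = 8, 8
history = np.zeros((h, w))
for i in range(4):
    scene = np.full((h, w), float(i))   # stand-in for a fully shaded frame
    history = cbr_frame(scene, history, i)
print(history)
```

Because half of every frame is history, the failure cases are motion and disocclusions, which is exactly where zoomed stills and normal-viewing-distance impressions tend to diverge in these comparisons.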
 
If the CBR implementation were providing image quality equal or superior to native outside of zoomed-in stills, i.e. at any reasonable viewing distance and screen size, then yes, I'd argue it's fair to compare it with native rendering. In the comparisons I've seen, though, that generally isn't the case.

Resident Evil Village?
 
Village is a dark demo with very low contrast. It also has only one point of comparison: the PS5 version against the PS5 version itself. It is not easy to know how it looks next to the real native thing.

Can you tell at a normal viewing distance that it isn't native 4k? I can't.

You really shouldn't need to compare to native 4k to tell the difference.
 
How else would you assess the PS5 code relative to a native 4K presentation? Everything in isolation is the best and the worst it can be. An external frame of reference is required for perspective. :yes:

Can you tell if a game is 720p or 900p on a 1080p screen?

Haven't you also made comments in the past about native 4K being wasted, and that at a normal viewing distance 1440p is sufficient? ;)

Realistically, a 6800 XT at 4K is either of the consoles at more or less 1440p.
 
Can you tell at a normal viewing distance that it isn't native 4k? I can't.

But again, what's a normal viewing distance? On PC it's probably between 2 and 4 feet, which on some larger monitors presents a very large viewing window, much larger than what would be considered a normal console viewing distance. What may not be easily visible with a console and a 50" TV may be very visible with a PC and a 34" monitor.

You really shouldn't need to compare to native 4k to tell the difference.

I don't really follow this. If I were to eyeball one implementation or the other in isolation, could I say which one it is? Almost certainly not. But that's largely because different games have better image quality than others even at the exact same resolution. If I compare some of the Lego games on PC at 4K to Tomb Raider at 1440p, for example, TR appears to have better image quality to my eyes. So looking at a 4K shot of a Lego game in isolation I might not be able to say whether it's 4K or 1440p. That doesn't mean, though, that I wouldn't prefer the 4K version of the Lego game over the 1440p version if I were able to switch between them and judge.
 
If the CBR implementation were providing image quality equal or superior to native outside of zoomed-in stills, i.e. at any reasonable viewing distance and screen size, then yes, I'd argue it's fair to compare it with native rendering. In the comparisons I've seen, though, that generally isn't the case.

Does this rule not apply anymore?
 
Expected this outcome; with memory contention included, the PS5 is more like a 2070-level GPU and Series X is roughly a 2070 Super. Ray tracing could slash them down even further.
Based on one scene in one game? In Valhalla and some other games PS5 has a performance advantage over XSX (though that's strange looking at the on-paper specs), and PS5 looks to perform above a 5700 XT there, but below it in H3. I would also like to see the Mendoza foliage scene and the sniper zoom on a 2060 Super, 2070 and RX 5700 XT, because even within H3 different scenes could behave differently (but I agree that in the end it should be in this range of performance, so around a 5700 XT).
 
Can you tell if a game is 720p or 900p on a 1080p screen? Haven't you also made comments in the past about native 4K being wasted, and that at a normal viewing distance 1440p is sufficient? ;)

On your first question: I can barely perceive any resolution above 1440p on my 55" LG C9 at around 8 ft! But I watch DF and NXG videos for their insights into graphical techniques, like how CBR can look damn close to native 4K. Ultimately I don't care whether they like it or whether there is a better implementation on another platform; it's the differences that interest me. :yes: It's not "is it good?", it's "why is it different?".
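
That 8 ft / 55" claim is easy to sanity-check with the usual rule of thumb that 20/20 vision resolves roughly one arcminute, i.e. about 60 pixels per degree. The quick calculation below assumes a 16:9 panel and is only approximate, but it puts both native 4K and 1440p comfortably above that threshold at this distance, which is consistent with the two being hard to tell apart.

```python
import math

def pixels_per_degree(diag_in: float, horiz_px: int, distance_in: float) -> float:
    """Approximate angular pixel density for a 16:9 panel viewed head-on."""
    width_in = diag_in * 16 / math.hypot(16, 9)        # horizontal panel size
    ppi = horiz_px / width_in                           # pixels per inch
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppi * inches_per_degree

for label, px in (("native 4K", 3840), ("1440p", 2560)):
    ppd = pixels_per_degree(55, px, 8 * 12)             # 55" panel at 8 ft
    print(f"{label}: ~{ppd:.0f} pixels/degree (20/20 threshold ~60)")
```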
 
Based on one scene in one game? In Valhalla and some other games PS5 has a performance advantage over XSX (though that's strange looking at the on-paper specs), and PS5 looks to perform above a 5700 XT there, but below it in H3. I would also like to see the Mendoza foliage scene and the sniper zoom on a 2060 Super, 2070 and RX 5700 XT, because even within H3 different scenes could behave differently (but I agree that in the end it should be in this range of performance, so around a 5700 XT).

Well, DF actually thinks WDL and Hitman are the best comparison benchmarks so far, and they put PS5 at a vanilla 2070 and XSX a tad above the 2070S. All of this is talking about normal rendering with no reconstruction, of course.
 
Well, DF actually thinks WDL and Hitman are the best comparison benchmarks so far, and they put PS5 at a vanilla 2070 and XSX a tad above the 2070S. All of this is talking about normal rendering with no reconstruction, of course.
And I think it's just one benchmark, not better than the others, but I understand it's the best case since they can use almost the same PC settings.
 
I don't follow. I'm not talking about comparing the CBR version to native in zoomed-in stills; I'm talking about high-quality side-by-side video (or non-zoomed stills that I could flick between), i.e. something that reflects what I'd actually see on my monitor while playing the game.

Well, it sounds like the DF folks struggled to tell if Resident Evil was native or not even when using zoomed in screenshots and presumably pixel counting. If you struggle to tell under those conditions I'd argue that you'd struggle to tell at normal viewing distances, even when comparing to native 4k.
 
And I think it's just one benchmark, not better than the others, but I understand it's the best case since they can use almost the same PC settings.

You're contradicting yourself there; it's generally best to compare the same settings and situations. Besides that, WDL and Hitman are much better-looking titles than, say, Valhalla, which could easily pass as something from last generation. Nothing wrong with 2070/5700 XT performance anyway.
 