Digital Foundry Article Technical Discussion [2023]

Status: Not open for further replies.
I'm not seeing any difference in medium texture quality at 1440p with the patch on my end. Wouldn't they have mentioned it in the patch notes if there were a substantial upgrade to texture quality at medium?

Edit: Okay, sorry for the misinformation. I don't think this is how it works. I need to do further testing; my findings could have been wrong. Disregard what I've said until I get to the bottom of this.
 
Did you not read what BRIT said not to do?!

Anyway, as to the link you provided: it's you again who is using this quote out of context. In his Road to PS5 talk, Cerny said that the Kraken decoder block is equivalent to about nine Zen 2 cores.
He said it to give the audience a picture of what to expect from that chip in terms of power. Of course he meant that "if" a Zen 2 processor had to do such a task (remember, a Zen 2 CPU, not a dedicated hardware decoder), it would need nine of those cores to handle the decompression in real time without slowdown.

Fabian from Oodle then rightfully pointed out that it makes no sense to just call it a nine-core Zen 2 CPU equivalent, because (of course) a Zen 2 CPU would be a waste if it were tasked with nothing but decompression. That's exactly why Sony did not put a bigger CPU inside: pure decompression is done far more effectively (and cheaply) with fixed-function hardware. But that still doesn't make Cerny's statement wrong - quite the opposite, because it is technically still true. If you ordered a Zen 2 CPU to do the decompression on the fly, it would turn out that you need nine of those cores to make it happen in time, with all the inefficiencies involved, because it is in the end not fixed-function hardware but multipurpose. If anything, that makes the Kraken decoder look even more potent than before, because as he rightfully states, fixed-function hardware will always beat multipurpose hardware when thrown the same problem. So much for that..
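To put rough numbers on it (my own back-of-envelope; the ~1 GB/s per-core figure is an assumption based on published software Kraken benchmarks, not anything Cerny or RAD stated):

```cpp
// Back-of-envelope only: per-core software Kraken decode rate is an assumed
// ~1 GB/s (based on published Oodle benchmarks for modern cores), and 9 GB/s
// is Cerny's "typical" decompressed output figure for the PS5 I/O unit.
#include <cstdio>

int main() {
    const double target_output_gb_s   = 9.0; // decompressed stream the I/O unit sustains
    const double per_core_decode_gb_s = 1.0; // assumed software Kraken rate per Zen 2 core
    std::printf("Zen 2 cores needed for software decode: ~%.0f\n",
                target_output_gb_s / per_core_decode_gb_s);
}
```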

But interesting - did you scroll a bit further down that quasi-Q&A in the comment section? The actual author of that blog post, Cbloom, said a couple of questions below:

Those other bottlenecks he speaks of don't exist on PS5, by the way - everything is covered and set up to support a 9 GB/s ongoing data stream. Not burst: ongoing, if needed.
We still don't see anything like Ratchet & Clank: Rift Apart on PC - why is that?

We will see which PCs end up faster than PS5. Nobody said it needs to be better than every PC; it's loads faster than most of them, and that's enough for now. In terms of raw power, the current-gen consoles have overtaken something like 70% of all PCs out there according to the Steam Hardware Survey. But I'll happily state this here again for later quotes: I believe the Ryzen 5 3600 / RTX 2070 Super combo will not be enough to match PS5's performance on FIRST PARTY TITLES from this year on, because publishers like UBISOFT will happily optimize for that 70% to get every platform on par and call it a day. So third-party titles will not be the test.
I expect Spider-Man 2 and Wolverine to blow everything out of the water in terms of graphics. Insomniac especially seems to have the best insight into how to max out PS5. They will deliver. Can't wait!
No.
 
I think NX Gamer loses a lot of the audience with his word salad and video editing.
You know, I don't always agree with him, but I did like his take on the Dead Space VRS image quality, where he basically said that it's much less noticeable if you keep the game dark, as it should be for that game. I didn't particularly like him throwing shade at other channels about it, but he did cover the actual image quality more the way I feel it should have been covered.
 
I know what he said; unfortunately, replying to you doesn't require a lot of detail, given the poor logic and arguments you use.

Which, after seeing some PS5 games that use the decompression hardware (like Spider-Man) release on PC, we can now say is a false claim.

Everyone, myself included, knew exactly what you meant when you said what you did.

Sure it was. I mean, it had nothing to do with heat, power, die space or cost at all, did it.

No, seeing Spider-Man on PC decompress data just as fast as PS5 does, without anywhere close to '9 Zen CPU cores', pretty much kills that claim.

Spider-Man begs to differ.

And with DirectStorage they're no longer an issue on PC either.

Erm, Titanfall 2 was doing swaps like that in 2016.



True, but remember there are hundreds of millions of PC gamers, and 30% of those have a PC more powerful than PS5, which means there are more gaming PCs faster than PS5 than there are PS5s.

If these first-party games use ray tracing and support DLSS, it will.

I'm going to do the sensible thing now and put you on ignore.
All this nonsense about Spider-Man..
Lol, dude, Cerny with his nine Zen 2 cores was referring to NINE GB/s!! Nine, dude, not the occasional couple hundred MB/s that Spider-Man, the cross-gen port, requests..
Yeah, go on and put me on ignore; fits you right.
When the next sweep of PS5 exclusives lands, you will be in for a rude awakening.

That PC Master Race stance has led you onto real thin ice; when it breaks, I will be waiting in the water for you.

Just watched that Titanfall video. Yeah, at first glance they do the same thing, but that level is hardly as big as the Rift Apart levels in terms of GB.
Plus they could have had everything in RAM.
That's not feasible for modern AAA titles. Rift Apart was on that Nvidia leak list. I hope they port it soon so we can get an idea of what is needed on the PC side to mimic the same..
 
You know, I don't always agree with him, but I did like his take on the Dead Space VRS image quality, where he basically said that it's much less noticeable if you keep the game dark, as it should be for that game. I didn't particularly like him throwing shade at other channels about it, but he did cover the actual image quality more the way I feel it should have been covered.

Gee, wonder what those 'other channels' were. :)

Dunno, I've seen plenty of coverage with it at proper gamma settings, and the shimmering and aliasing were extremely apparent. Plus, disabling it had almost no performance cost, so it's pretty clear this was an oversight by the dev team. Regardless, when DF members were sending out screenshots to highlight it, like John was doing, he almost always mentioned he was jacking up the brightness in order to show the weird VRS construction more clearly, especially for the types of portable devices people usually browse sites like Twitter on.

It's one thing to critique a graphical artifact that exists in a game but is essential to the engine delivering the performance it does - DF was critical of Returnal's image quality on PS5, for example, but otherwise praised the implementation because of everything else it was doing. In the case of Dead Space's VRS, the critique is that it was so egregious - it presented wobbling and shimmering that just isn't seen in modern games, even among most PS4/Xbone titles. And it turned out it wasn't even necessary for the game's performance at all, so I can't see how anyone's critique of it wasn't completely justified when it ended up being completely fixed just days later.

I mean, no shit - keep the screen dark enough and you won't see every little flaw, but that's not really what I want from a technical review. A lot of VRR displays will iron out frame drops too, but I still want those recorded and mentioned.

edit: I did come across his update video, and in the comparison where he states that 'at the proper brightness settings' the difference isn't that stark, he's standing still, staring at a door lit by his flashlight. The VRS problem was not simply the poor-quality textures but their interaction with reconstruction, in this case FSR. That's what produces all the wobbling and noise, which is painfully evident, but much more so in motion. So sitting still is hardly proper methodology for judging the improvement.

He goes on to imply 'large channels' were 'creating drama' because they 'didn't have much to talk about' (in the section titled "An honest appraisal of the issue", pffft). Come on. John from DF in particular, but really all of them, tend to bend over backwards to be more than fair to developers, and they praised Dead Space all the same regardless (their review is "What a Best-In-Class Remake Looks Like"!). But the image-quality issues were pretty blatant, especially on PS5. I was really looking forward to this, so I was watching threads on ResetEra/Steam/Reddit when it dropped to see the results, and man, there were so, so many threads asking wtf was going on with image quality and sharing solutions for disabling VRS - second only, maybe, to the stuttering. And ultimately, this got fixed. It's a better game now because of this critique.

All that said, btw - damn, it really sucks that the PC version was never fixed with its stuttering, and likely never will be. I said as much when people were saying to wait a bit for patches; unfortunately, traversal stutters like this are rarely fixed to any significant degree. I'm far more concerned about this getting fixed than, say, TLOU, as Dead Space is one of those games I can replay countless times. It's especially a shame as only PC hardware can run this at 60+ at the resolution I'd like. But it may never be. :(
 
The real problem not talked about here is the RAM problem, not VRAM alone. The game demands more than 16 GB on a clean Windows install without any additional background apps. This is the biggest source of crashes and hang-ups in the game so far: it keeps 16 GB of system RAM full all the time.

The question is, why? Even much more complex games don't demand more than 16 GB: the Spider-Man games don't, Cyberpunk doesn't, Returnal doesn't, Battlefield 2042 doesn't, Hogwarts doesn't, Dead Space Remake doesn't, Plague Tale doesn't, Callisto Protocol doesn't, Flight Simulator 2020 doesn't .. no PC game so far does, so why this one in particular? Something in the core of this game is so memory intensive that it is wreaking havoc all over.
 
The real problem not talked about here is the RAM problem, not VRAM alone. The game demands more than 16 GB on a clean Windows install without any additional background apps. This is the biggest source of crashes and hang-ups in the game so far: it keeps 16 GB of system RAM full all the time.

The question is, why? Even much more complex games don't demand more than 16 GB: the Spider-Man games don't, Cyberpunk doesn't, Returnal doesn't, Battlefield 2042 doesn't, Hogwarts doesn't, Dead Space Remake doesn't, Plague Tale doesn't, Callisto Protocol doesn't, Flight Simulator 2020 doesn't .. no PC game so far does, so why this one in particular? Something in the core of this game is so memory intensive that it is wreaking havoc all over.

I mean, add it to the list of architectural oddities between what this game delivers visually and what it demands, I guess.

I'd argue at least this one is a lot easier for your average desktop owner to remedy. Dropping in 2 more sticks of RAM is relatively trivial, both in installation and cost: 2 sticks of 8 GB DDR4 are ~$50 CAD now, and on DDR5 systems, chances are you already have 32 GB, as 8 GB DIMMs are relatively rare compared to 16. If games start coming out that demand 32 GB, I'll basically be "Well, ok then".

If, say, months from now they significantly improve the VRAM management, raise the base rendering performance, and fix any remaining stutters/crashes - but the 32 GB recommendation remains - that will be a far easier pill to swallow. It's very different on the GPU side: to upgrade from my 3060 without downgrading my VRAM, I'm probably looking at a $900-$1k CAD 4070, versus an extra $50 outlay.

That's the thing with PC pricing now - everything but the GPU is actually quite affordable. I'm not saying TLOU's usage is warranted, just that at the very least this bottleneck is relatively economical to fix.

(btw, in all my testing with this game on my 12400f / 16 GB / 3060 system, I have yet to see a single crash)
 
The real problem not talked about here is the RAM problem, not VRAM alone. The game demands more than 16 GB on a clean Windows install without any additional background apps. This is the biggest source of crashes and hang-ups in the game so far: it keeps 16 GB of system RAM full all the time.
I think this is worth talking about because it's one of the quirks I've noticed in this game. It has this really strange behavior where the process will happily eat 10 GB+ when I first boot it up, but when I look at the number after playing for a while (say, an hour), it has always fallen into the 2-3 GB range. You'll see a resolution difference in the screenshot below (disregard that), but note the huge discrepancy in how much memory the process itself is using. I don't really have an explanation for the total memory usage being so similar, but I've never seen this behavior before.

Regardless of the reasons, this has me wondering if we could be seeing a perf difference between 16 GB and 32 GB systems. It's almost like the game is preloading a lot of data at launch and then releasing it after a while. And this is consistent behavior from what I can tell.

[screenshot: side-by-side memory usage, showing the game's per-process usage dropping while total system usage stays similar]
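If anyone wants to reproduce the measurement instead of eyeballing Task Manager, here's roughly how I'd log it over a session (a minimal sketch using the documented Win32 call; you pass the game's PID yourself, and it needs psapi.lib):

```cpp
// Minimal sketch: poll a process's private bytes and working set once a minute.
// PID comes from the command line; error handling kept minimal on purpose.
#include <windows.h>
#include <psapi.h>   // link with psapi.lib
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: memlog <pid>\n"); return 1; }
    HANDLE proc = OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, FALSE,
                              std::strtoul(argv[1], nullptr, 10));
    if (!proc) return 1;

    for (;;) {
        PROCESS_MEMORY_COUNTERS_EX pmc{};
        if (GetProcessMemoryInfo(proc, reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc),
                                 sizeof(pmc))) {
            std::printf("private: %.2f GB, working set: %.2f GB\n",
                        pmc.PrivateUsage   / (1024.0 * 1024.0 * 1024.0),
                        pmc.WorkingSetSize / (1024.0 * 1024.0 * 1024.0));
        }
        Sleep(60 * 1000); // one sample per minute is enough to catch the drop
    }
}
```

Logging both numbers side by side would show whether the drop is the game actually freeing memory (private bytes fall) or Windows just trimming its working set.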
 
No, seeing Spider-Man on PC decompress data just as fast as PS5 does, without anywhere close to '9 Zen CPU cores', pretty much kills that claim.

1. Spider-Man is a PS4-based game. 2. More importantly, PC decompression was slower, as it took CPU cycles away, causing lower performance than the game would otherwise be capable of if it had PS5-like decompression hardware.

And with DirectStorage they're no longer an issue on PC either.

Which is why we've been having these conversations about many big-title games, right?

Erm, Titanfall 2 was doing swaps like that in 2016.

Erm, nope. Wrong again. For Titanfall, both maps were resident in memory, so there was no streaming going on. Not the case with Ratchet on PS5.

True, but remember there are hundreds of millions of PC gamers, and 30% of those have a PC more powerful than PS5, which means there are more gaming PCs faster than PS5 than there are PS5s.

You're still not getting it. Processing power is meaningless if the data is struggling to get where it needs to be. That is the core issue right now. That is the main factor that has changed this new generation: what has narrowed the relative performance gap between home consoles and PC. Do your best to stop thinking by outdated standards and understand why and how things will be different going forward.

This is ND's first PC release, and DirectX 12 isn't easy to master. I hope more and more engines will use virtual texturing; it can help solve the problem of 8 GB GPUs. It's probably difficult to work with PCs that have less RAM available for graphics than PS5 and no mandatory SSD. The next big ND single-player PC port will probably use DirectStorage.

It will help to allow for better texture management, but who are we kidding - in the PC space it's all about benchmarks and comparisons between other platforms, GPU vendors, etc. In that sense, all that will change is a higher standard for all, since it will lift every platform/rig up. The performance order won't change.
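To make the virtual texturing point concrete, here's a rough conceptual sketch (illustrative only, not any engine's actual code) of the residency idea: the GPU feedback pass reports which texture pages were actually sampled, and only those stay in a fixed-size physical pool, which is how an 8 GB card can sample from a much larger virtual texture set:

```cpp
// Conceptual virtual-texture residency: keep only the sampled pages resident,
// evicting least-recently-used pages when the physical pool is full.
// All names here are illustrative, not a real engine API.
#include <cstdint>
#include <list>
#include <unordered_map>
#include <vector>

using PageKey = std::uint64_t; // packed (texture id, mip, page x, page y)

class PagePool {
    std::size_t capacity_;                 // pool size in pages, e.g. poolBytes / pageBytes
    std::list<PageKey> lru_;               // front = most recently sampled
    std::unordered_map<PageKey, std::list<PageKey>::iterator> resident_;
public:
    explicit PagePool(std::size_t capacityPages) : capacity_(capacityPages) {}

    // Called each frame with the de-duplicated page list from the GPU feedback buffer.
    void update(const std::vector<PageKey>& sampledPages) {
        for (PageKey page : sampledPages) {
            auto it = resident_.find(page);
            if (it != resident_.end()) {           // already resident: refresh its LRU slot
                lru_.splice(lru_.begin(), lru_, it->second);
                continue;
            }
            if (resident_.size() == capacity_) {   // pool full: evict the coldest page
                resident_.erase(lru_.back());      // real code would also free its atlas tile
                lru_.pop_back();
            }
            lru_.push_front(page);
            resident_[page] = lru_.begin();
            // real code: kick off an async read for this page and patch the
            // indirection (page table) texture once the upload completes
        }
    }
};
```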
 
The real problem not talked about here is the RAM problem, not VRAM alone. The game demands more than 16 GB on a clean Windows install without any additional background apps. This is the biggest source of crashes and hang-ups in the game so far: it keeps 16 GB of system RAM full all the time.
Hogwarts' RAM usage was high too (at least with 32 GB of RAM), but maybe you're talking about how well it scales down.
That seems to be the issue at most of the problematic points.
 
Stole my idea? 😂 I already said previously that they should have just ported TLOU Remastered. It would have been a disappointment to some, but the game would have run well with few issues.

Not really. If Sony is going to go all out with PC support for their first-party titles going forward, it would behoove first-party devs to start getting their hands dirty and jumping into a multiplatform development environment. Projects like these are the perfect way to jumpstart that transition.
 
Not really. If Sony is going to go all out with PC support for their first-party titles going forward, it would behoove first-party devs to start getting their hands dirty and jumping into a multiplatform development environment. Projects like these are the perfect way to jumpstart that transition.
My thing is: if they aren't going to do them right early, they probably shouldn't bother. First impressions of these games are absolutely essential, and according to the statistics, none of Sony's games has so far produced even a decent return on its investment into the platform. And that, in my opinion, mostly comes down to how trash their ports have been early on, which discourages people from buying and hurts the PlayStation brand on PC. They need to get this crap right. No more excuses.
 
The only thing we can be sure of is that first-party PS5 titles perform badly on PC because of a lack of experience with the PC, nothing else. If you think it is due to some "secret sauce" or something, you will be disappointed pretty quickly. There are already many examples of titles doing more with less hardware and achieving better results. If I break my leg and have to run a marathon, it doesn't matter what shoes I put on. My leg is still broken.
 
The real problem not talked about here is the RAM problem, not VRAM alone. The game demands more than 16 GB on a clean Windows install without any additional background apps. This is the biggest source of crashes and hang-ups in the game so far: it keeps 16 GB of system RAM full all the time.

The question is, why? Even much more complex games don't demand more than 16 GB: the Spider-Man games don't, Cyberpunk doesn't, Returnal doesn't, Battlefield 2042 doesn't, Hogwarts doesn't, Dead Space Remake doesn't, Plague Tale doesn't, Callisto Protocol doesn't, Flight Simulator 2020 doesn't .. no PC game so far does, so why this one in particular? Something in the core of this game is so memory intensive that it is wreaking havoc all over.
I don't actually get a lot of crashes with 16 GB of RAM, but I do clear working sets with RAMMap every time before entering the game: https://learn.microsoft.com/en-us/sysinternals/downloads/rammap Maybe it helps.
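(For the curious: RAMMap's "Empty Working Sets" option essentially asks Windows to trim every process's working set. The documented single-process equivalent looks roughly like this - a sketch, not a replacement for the tool, and it doesn't touch the standby list, which RAMMap purges through undocumented calls:)

```cpp
// Sketch: trim one process's working set with the documented psapi call.
// Trimmed pages go to the standby list and come back via soft faults on demand.
#include <windows.h>
#include <psapi.h>   // link with psapi.lib
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) { std::fprintf(stderr, "usage: trimws <pid>\n"); return 1; }
    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_SET_QUOTA,
                              FALSE, std::strtoul(argv[1], nullptr, 10));
    if (!proc) return 1;
    BOOL ok = EmptyWorkingSet(proc);
    CloseHandle(proc);
    return ok ? 0 : 1;
}
```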

Hogwarts Legacy was similar. I feel like they're trying to push people to upgrade to 32 GB of RAM. Looking at the Steam surveys, it seems to have worked: there's been a large increase in both 16 GB and 32 GB RAM users there, but more for 32 GB - most likely Hogwarts' effect. On the Steam forums, for example, many people report getting "please wait" warnings with 16 gigs; this legit never once happened to me in 14 hours.

What is critical with RAM usage is keeping your texture settings in check, because from what I'm seeing, they're keeping a lot of "duplicate" data in RAM.

This scene crawls to a halt in this test because they're pushing ultra-duper textures on the 4090:


Most 3090+ owners wouldn't have 16 GB. Most 16 GB owners have GPUs ranging from a GTX 1050 Ti to a 3070, which have VRAM budgets ranging from 4 GB to 8 GB. The VRAM literally won't let you saturate the RAM anyway.

Versus,

[screenshots: RAM/VRAM usage comparisons]


That video is also spreading like wildfire among 16 GB users, so I'm sure we'll see another surge in 32 GB RAM users come the next Steam survey. And after that, developers can freely ignore 16 GB if they feel enough users have jumped ship. And to be honest, most 16 GB users won't have 12-14 GB of RAM free for the game like I do. (It's just that it's really cheap to upgrade. I, however, don't want to invest in DDR4 anymore, and I don't want to lose my extreme overclock by going 4x8 either, so I will try to hold out as long as I can.)

IMO, throwing more hardware at stuff simply clogs innovation on the PC side of things. Wide adoption of 32 GB will most likely negate the need for proper DirectStorage implementations in PC games. Or at least I feel like that's what will happen.
 
So I had mentioned before that there was an issue with the way that the PC version of TLOU P1 paces frames.. Yamaci17 had said his game was smooth, but I was having a much different experience..

I said that even if you cap the FPS to 60 and never go under, the game will still stutter and judder as you move around and turn the camera... RTSS and other overlays would show a perfectly straight line for me, but the frame pacing was terrible. Now I know (or at least I think I know) what the cause was. I was playing around with the resolution scale and noticed that if the scale was at 100 or lower, the game was silky smooth.. but if I put it even one notch higher (120), it would introduce random judder/stutter into the frame delivery. So it seems there's a problem with how the resolution scaler works... and it could explain why some people complain about stuttering while others don't. This also has nothing to do with my GPU not being powerful enough to push the higher resolution... my GPU is basically sleeping at 120 scale (4608x1920) locked to 60 fps..
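For anyone who wants to check their own setup, the trick is to look at the raw frame-to-frame deltas rather than the averaged FPS line an overlay draws. Here's a rough sketch of what I mean (my own illustration, assuming you can call it once per presented frame from a test app or hook):

```cpp
// Frame-pacing probe: a flat 60 fps average can hide big frame-to-frame swings,
// which is exactly what reads as judder. Log the worst deviation, not the mean.
#include <algorithm>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <vector>

struct PacingProbe {
    std::vector<double> deltasMs;
    std::chrono::steady_clock::time_point last = std::chrono::steady_clock::now();

    void onFrame() {                       // call once per presented frame
        auto now = std::chrono::steady_clock::now();
        deltasMs.push_back(std::chrono::duration<double, std::milli>(now - last).count());
        last = now;
    }

    void report() const {                  // compare the average against worst-case pacing
        double sum = 0.0, worst = 0.0;
        for (double d : deltasMs) {
            sum += d;
            worst = std::max(worst, std::fabs(d - 1000.0 / 60.0));
        }
        std::printf("avg frame time: %.2f ms, worst deviation from 16.67 ms: %.2f ms\n",
                    sum / deltasMs.size(), worst);
    }
};
```

If the average sits at ~16.7 ms but the worst deviation is several milliseconds, you've got a pacing problem that no FPS counter will show.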

Here's a quick vid I made to hopefully demonstrate the issue; it should be clear enough. There's some inherent judder from the recording - just ignore that.. on my display it was silky smooth (as silky as 60 fps can be, I guess, lol). At first the slider is at 100 (native res) and it's smooth.. then I switch to 120, and you can see the stuttering it introduces; then I change it back.


Hopefully this can help someone and might explain why they're seeing stutters that others aren't!
 
1. Spider-Man is a PS4 based game. 2. More importantly, PC decompression was slower as it took CPU cycles away causing lower performance that it otherwise would be capable of if it had PS5 like decompression hardware.
And yet we have PC benchmarks of the game that say otherwise, especially now, after numerous patches.

So factual benchmarks of the game disprove your comment.

My $150 6-core i5 provides much higher CPU performance than what's in PS5 and delivers frame rates in Spider-Man well above what PS5 can manage.
Which is why we've been having these conversations about many big-title games, right?
It seems you failed to grasp the conversation; try reading it again.
Erm, nope. Wrong again. For Titanfall, both maps were resident in memory, so there was no streaming going on. Not the case with Ratchet on PS5.
He provided a video that showed R&C level-warping; I showed Titanfall 2 doing the same thing.

He failed to discuss the technical details of how R&C was achieving it, so in the context of his comment my response was correct.
You're still not getting it. Processing power is meaningless if the data is struggling to get where it needs to be.
This has nothing to do with our 'discussion' and is a separate point entirely.
That is the core issue right now.
There is no core issue; 99% of games run better on PC, and this will always be the case.
What has narrowed the relative performance gap between home consoles and PC.
And every month that gap widens again; the number of PCs that outperform the consoles increases every day.

Meanwhile, consoles are stuck at their power level for at least another 4-5 years.
Do your best to stop thinking by outdated standards and understand why and how things will be different going forward.
You seem to think there's some secret sauce that means consoles will stay competitive.

Outside of outliers like TLOU, they're not competitive, but that's OK; they're not supposed to be in the long term.

People don't buy consoles because they want the absolute fastest hardware around.

But you're going on ignore now too. I don't think I've seen a new member ignored by so many on this forum in such a short space of time as you have been.

That should tell you everything about how you portray yourself and how you need to improve.
 
Judging by the amount of backlash and Sony's initiative to keep porting PS exclusives to PC, will we ever see a built-from-the-ground-up PS5 exclusive harnessing the full power of the SSD if the game is ultimately getting ported to PC down the line?

I LOVE that PC gamers get to experience these great games, but I just feel like, due to this negativity, studios will start developing PC versions in tandem with the PS5 version of a game, and that the whole 9 GB/s on PS5 (if developers devise a way to make that happen without bottlenecking something else) will never become standard.
 
Now I know (or at least I think I know) what the cause was. I was playing around with the resolution scale and noticed that if the scale was at 100 or lower, the game was silky smooth.. but if I put it even one notch higher (120), it would introduce random judder/stutter into the frame delivery. So it seems there's a problem with how the resolution scaler works... and it could explain why some people complain about stuttering while others don't.

But that setting defaults to 100, so it shouldn't be a problem? And I find it hard to believe that everyone who has stutter has changed it.

That setting does increase the native rendering resolution, though, and thus increases VRAM requirements, which would cause stuttering.
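To put a number on that: render-target memory scales with pixel count, and 120% per axis is 1.44x the pixels. A quick illustration (the bytes-per-pixel aggregate is an assumed, made-up figure across all render targets, not measured from the game):

```cpp
// Rough illustration of why a resolution-scale bump can tip VRAM over the edge.
// 32 bytes/pixel across G-buffer/history/post targets is an assumed aggregate.
#include <cstdio>

int main() {
    const double baseW = 3840.0, baseH = 1600.0; // the poster's 120% res was 4608x1920
    const double bytesPerPixel = 32.0;           // assumed total across render targets
    for (double scale : {1.0, 1.2}) {
        double bytes = (baseW * scale) * (baseH * scale) * bytesPerPixel;
        std::printf("scale %3.0f%%: ~%.0f MB of render targets\n",
                    scale * 100.0, bytes / (1024.0 * 1024.0));
    }
}
```

The absolute numbers are illustrative; the point is that the 1.44x multiplier applies to every resolution-dependent allocation at once.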
 
Judging by the amount of backlash and Sony's initiative to keep porting PS exclusives to PC, will we ever see a built-from-the-ground-up PS5 exclusive harnessing the full power of the SSD if the game is ultimately getting ported to PC down the line?

There are technologies currently available on PC, like DirectStorage, that provide the PC with a higher level of SSD performance than what is in PS5.

So as long as developers are willing to invest the time in porting their games to PC and using DirectStorage, we won't have a problem.
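For reference, the request path on the PC side looks roughly like this (a minimal sketch against the public DirectStorage SDK; the path, sizes, and D3D12 objects are placeholders, and real code would keep the factory/queue alive until the fence signals):

```cpp
// Minimal DirectStorage sketch: read a compressed blob from disk straight into
// a GPU buffer with GPU GDeflate decompression (DirectStorage 1.1+).
#include <dstorage.h>    // link with dstorage.lib
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

void LoadAsset(ID3D12Device* device, ID3D12Resource* destBuffer,
               ID3D12Fence* fence, UINT64 fenceValue,
               UINT32 compressedSize, UINT32 uncompressedSize) {
    ComPtr<IDStorageFactory> factory;
    DStorageGetFactory(IID_PPV_ARGS(&factory));

    DSTORAGE_QUEUE_DESC queueDesc{};
    queueDesc.Capacity   = DSTORAGE_MAX_QUEUE_CAPACITY;
    queueDesc.Priority   = DSTORAGE_PRIORITY_NORMAL;
    queueDesc.SourceType = DSTORAGE_REQUEST_SOURCE_FILE;
    queueDesc.Device     = device;
    ComPtr<IDStorageQueue> queue;
    factory->CreateQueue(&queueDesc, IID_PPV_ARGS(&queue));

    ComPtr<IDStorageFile> file;
    factory->OpenFile(L"assets/level.bin", IID_PPV_ARGS(&file)); // placeholder path

    DSTORAGE_REQUEST request{};
    request.Options.SourceType        = DSTORAGE_REQUEST_SOURCE_FILE;
    request.Options.DestinationType   = DSTORAGE_REQUEST_DESTINATION_BUFFER;
    request.Options.CompressionFormat = DSTORAGE_COMPRESSION_FORMAT_GDEFLATE;
    request.Source.File.Source = file.Get();
    request.Source.File.Offset = 0;
    request.Source.File.Size   = compressedSize;    // bytes on disk
    request.UncompressedSize   = uncompressedSize;  // bytes after GPU decompression
    request.Destination.Buffer.Resource = destBuffer;
    request.Destination.Buffer.Offset   = 0;
    request.Destination.Buffer.Size     = uncompressedSize;

    queue->EnqueueRequest(&request);
    queue->EnqueueSignal(fence, fenceValue); // wait on this fence before using the data
    queue->Submit();
}
```

The decompression itself runs on the GPU, so the CPU-core cost the old software path paid largely disappears.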
 