Digital Foundry Article Technical Discussion [2023]

I am happy. 😁 Alex agreed with me that targeting the PS5 as the core platform is still inherently the best option, compared to, say, targeting lower-spec PC hardware as some people have suggested. They just need to use more PC-appropriate equivalents when porting.

I'm not sure about targeting lower-spec PCs from the outset (which would be rubbish), but PC ports of PS5 games should absolutely scale below PS5-equivalent hardware. At least on the GPU front, where scaling is trivial, e.g. just lowering screen, effects or texture resolution. Accommodating an older feature set is more troublesome, and I don't necessarily think that should be a requirement.

Sure, the port is unoptimized. But that was not my point. It is quite clear that Naughty Dog, for some reason, tried to mimic the PS5's I/O with pure brute force.
And it provided the perfect example of why the I/O block of the PS5 is so smart.
According to Cerny, the Kraken decompressor chip equates to 9 "additional" Zen 2 CPU cores.
The difference that makes is visible in TLOU on PC.

I think you'll find the same would also hold in reverse if they tried to use a PC-centric decompression scheme on the PS5. All TLOU shows us is that Kraken plus the decompression hardware works well for the PS5 (no kidding), not whether the PC can match or even exceed that performance using a PC-optimised solution, e.g. GDeflate, DirectStorage, and a streaming system optimised for split memory pools.
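
To put a rough number on what "brute-forcing" decompression on the CPU actually costs, here's a minimal, self-contained sketch. It uses Python's zlib as a stand-in for Oodle Kraken (which isn't publicly available from Python), and every size and count here is an illustrative assumption, not a figure from the port:

```python
# Illustrative sketch only: zlib stands in for Kraken/Oodle, and chunk sizes
# are made up. The point is that software decompression eats real CPU time
# that scales with the stream rate you're trying to sustain.
import time
import zlib
from concurrent.futures import ThreadPoolExecutor

CHUNK_MB = 8
CHUNKS = 64

# Fake "asset" data, repetitive enough to compress like typical game content.
raw = (b"texture-block-" * (1024 * 1024))[: CHUNK_MB * 1024 * 1024]
blob = zlib.compress(raw, 6)
compressed = [blob] * CHUNKS  # pretend we streamed 64 chunks off the SSD

def measure(workers: int) -> float:
    """Return decompressed output rate in GB/s using `workers` threads."""
    start = time.perf_counter()
    # CPython's zlib releases the GIL on large buffers, so threads scale here.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(zlib.decompress, compressed))
    return (CHUNK_MB * CHUNKS / 1024) / (time.perf_counter() - start)

for n in (1, 2, 4, 8):
    print(f"{n} worker(s): ~{measure(n):.2f} GB/s decompressed")
```

On a typical desktop CPU it takes several threads to approach even a few GB/s of output, which is exactly the gap that a hardware block (PS5) or GPU decompression (GDeflate via DirectStorage) is meant to close.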
 
IMO, I would take some streaming/loading hitches over performance getting worse and worse over time any day. When you optimize settings, you do it from a certain baseline, and you don't expect things to get worse after a little play time when you return to the same place as before.

The memory streaming in TLOU is not that bad for this reason. Performance stays consistent even after long play sessions. The same cannot be said about games like Spiderman, Dying Light or Cyberpunk. (Note: I didn't test Spiderman and Dying Light myself, but I've heard there are memory leak issues with those games.)

So to me, that makes this PC port already better than some, even though there are indeed a lot of improvements still to be made to TLOU.

This port isn't so bad, and I say that as a 6 GB user running high textures. Also, @yamaci17 has shown us that you can get a great experience on an 8 GB card if you adjust settings accordingly. We low-VRAM peasants don't have to resort to those awful-looking medium textures (which certainly should look a lot better than they do now).
 
I'm not sure about targeting lower-spec PCs from the outset (which would be rubbish), but PC ports of PS5 games should absolutely scale below PS5-equivalent hardware. At least on the GPU front, where scaling is trivial, e.g. just lowering screen, effects or texture resolution. Accommodating an older feature set is more troublesome, and I don't necessarily think that should be a requirement.
Yeah, sorry, I didn't mean that games shouldn't be able to run on hardware below PS5 level or anything. They should definitely scale down with lower settings and resolutions and such.

I just meant that it would be stupid, and would take away a lot of the strength of the dev pipeline, if they started decoupling games from the consoles and building around much weaker PCs just to make sure those games could run better on lower-end PC hardware. It would be no different from our current cross-gen phase, except worse, IMO.

Games tailored for PS5 just need to be properly optimized for PC from the start when porting over. PCs have the power to accommodate this.
 
IMO, I would take some streaming/loading hitches over performance getting worse and worse over time any day. When you optimize settings, you do it from a certain baseline, and you don't expect things to get worse after a little play time when you return to the same place as before.

The memory streaming in TLOU is not that bad for this reason. Performance stays consistent even after long play sessions. The same cannot be said about games like Spiderman, Dying Light or Cyberpunk. (Note: I didn't test Spiderman and Dying Light myself, but I've heard there are memory leak issues with those games.)

So to me, that makes this PC port already better than some, even though there are indeed a lot of improvements still to be made to TLOU.

This port isn't so bad, and I say that as a 6 GB user running high textures. Also, @yamaci17 has shown us that you can get a great experience on an 8 GB card if you adjust settings accordingly. We low-VRAM peasants don't have to resort to those awful-looking medium textures (which certainly should look a lot better than they do now).
Thanks for acknowledging me. (I'm gonna bring out my inner Roman Reigns and cut an "Acknowledge me" promo on this forum at this rate!!!)

Speaking of which, someone suggested I try District Plaza and said it was very tough on their end. I gave it a try, with super quick mouse turns to stress things, and it seemed perfectly locked to 45.


1440p native, high character and environment textures, low volumetric and visual effects texture quality (7.3 GB game application usage, 8.8 GB(!) total VRAM usage. I should be dead, right?)

(I know the ones who watch my videos wonder why I use locks like 45/50. My 2700 is not enough for this game, I have to admit. In the video, you can see all 16 threads of my old CPU maxed out, quite literally; it chugs along at 80-85% with a 45 FPS cap. This is the first game where I've ever seen such behaviour. If I run uncapped framerates, the CPU hits 95-100% usage and frametimes go all over the place. 45 FPS is a nice spot that allows consistent performance on this low-end CPU. The GPU still has some performance headroom for a higher FPS lock, and 1440p/60 may be attainable with a better CPU. And of course, recording always takes a hit on my end on all three fronts: CPU, GPU and VRAM. Despite that... I've managed to procure these videos. I don't know what else to tell you. Maybe 8 GB owners should suck it up: either upgrade, or just keep their backgrounds super clean. I do still agree that medium textures should be better. But I would still not use them (I really like how high-fidelity the game is with high textures, and I wouldn't want to sacrifice that for myself). Without recording, I can push in-game VRAM usage to 7.4-7.5 GB.

As you said, this game is super consistent in terms of how much VRAM it will use. If it says 7325 MB, it will, at peak, use around 7450-7500 MB. This makes the game super stable once you've found your comfortable VRAM settings. I've never seen the game try to overshoot that value. Most of the time it's +200 MB, sometimes reduced to +100. I wasn't surprised that this area proved problematic for some, as it pushes +200 MB of VRAM on top of the 7300 MB the in-game bar suggests. Most other locations will stick to the lower end of 7400 MB instead, or at times simply around 7350 MB.

And this is at 1440p native. 1440p DLSS Quality has a much lower memory footprint. Most <2080 users should be able to run the game with high textures at 1440p/DLSS Quality if they have 7200 to 7400 MB of VRAM free at idle.
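
If it helps anyone dial in their own settings, the rule of thumb above boils down to a few lines. This is just my observation encoded as a hypothetical helper; the ~200 MB margin is what I've seen in-game, nothing official:

```python
# Hypothetical helper: the in-game meter's estimate plus the ~100-200 MB
# overshoot observed above must fit inside what's actually free at idle.
def settings_fit(total_vram_mb: int, idle_used_mb: int,
                 ingame_estimate_mb: int, peak_margin_mb: int = 200) -> bool:
    """True if estimated usage plus observed peak overshoot fits free VRAM."""
    free_mb = total_vram_mb - idle_used_mb
    return ingame_estimate_mb + peak_margin_mb <= free_mb

# 8 GB card, meter reading 7325 MB (the District Plaza case above):
print(settings_fit(8192, 900, 7325))  # False -> too much idle usage, expect hitches
print(settings_fit(8192, 500, 7325))  # True  -> the 7450-7500 MB peak fits
```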
 
Seems we have a console warrior trying to defend his console's superiority.

And my point is that your point made no sense considering the state of the port.

Pure speculation on your part.

More speculation on your part.

And RAD Game Tools, who made the I/O block, disagreed with his wording in the blog they did on PS5.

A quad-core i7 4770K from 2013 kept up with the PS5 I/O block in Spiderman. Explain?

In what way exactly? CPUs that are years old do 60fps in the game perfectly fine.
RAD Game Tools did not make the chip - they developed Kraken and Oodle...
And link the exact statement where they say otherwise about the performance of the Kraken decoder chip.
Only my speculation? I thought it was agreed in here that Naughty Dog seemingly just ported the streaming system from their PS5 build to PC, hence the CPU hammering.
Listen, the PS5's I/O system is not in need of your approval - it is generally accepted as world-class.
Spiderman was a last-gen port. They did many things to circumvent the slow 30 MB/s I/O speed of the PS4. There was a whole GDC talk about that specific problem. But most important - they made it happen even with 30 MB/s on PS4.
What does that tell you about the streaming hunger of that game? Right - it can't be a lot. But CPUs on PC were still hammered. From that little bit of decompression need.
But we should postpone this entire discussion until the first second-wave first-party titles land on PS5.
----------
A thread-rule question: since the title of this thread is about DF articles, is it not allowed to post NXGamer's findings in here?!
 
Thanks for acknowledging me. (I'm gonna bring out my inner Roman Reigns and cut an "Acknowledge me" promo on this forum at this rate!!!)

Speaking of which, someone suggested I try District Plaza and said it was very tough on their end. I gave it a try, with super quick mouse turns to stress things, and it seemed perfectly locked to 45.


1440p native, high character and environment textures, low volumetric and visual effects texture quality (7.3 GB game application usage, 8.8 GB(!) total VRAM usage. I should be dead, right?)

(I know the ones who watch my videos wonder why I use locks like 45/50. My 2700 is not enough for this game, I have to admit. In the video, you can see all 16 threads of my old CPU maxed out, quite literally; it chugs along at 80-85% with a 45 FPS cap. This is the first game where I've ever seen such behaviour. If I run uncapped framerates, the CPU hits 95-100% usage and frametimes go all over the place. 45 FPS is a nice spot that allows consistent performance on this low-end CPU. The GPU still has some performance headroom for a higher FPS lock, and 1440p/60 may be attainable with a better CPU. And of course, recording always takes a hit on my end on all three fronts: CPU, GPU and VRAM. Despite that... I've managed to procure these videos. I don't know what else to tell you. Maybe 8 GB owners should suck it up: either upgrade, or just keep their backgrounds super clean. I do still agree that medium textures should be better. But I would still not use them (I really like how high-fidelity the game is with high textures, and I wouldn't want to sacrifice that for myself). Without recording, I can push in-game VRAM usage to 7.4-7.5 GB.

As you said, this game is super consistent in terms of how much VRAM it will use. If it says 7325 MB, it will, at peak, use around 7450-7500 MB. This makes the game super stable once you've found your comfortable VRAM settings. I've never seen the game try to overshoot that value. Most of the time it's +200 MB, sometimes reduced to +100. I wasn't surprised that this area proved problematic for some, as it pushes +200 MB of VRAM on top of the 7300 MB the in-game bar suggests. Most other locations will stick to the lower end of 7400 MB instead, or at times simply around 7350 MB.

And this is at 1440p native. 1440p DLSS Quality has a much lower memory footprint. Most <2080 users should be able to run the game with high textures at 1440p/DLSS Quality if they have 7200 to 7400 MB of VRAM free at idle.

The ability to lock a game to any frame rate you wish on a PC with a VRR display is one of the platform's greatest strengths, IMO. I was playing Valhalla at a locked 34 fps on my old 1070, which was clearly, noticeably better than 30. I'm now playing locked at 80 fps (along with lots of other graphical enhancements) on my 4070 Ti, which again is much better than 60.
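
The arithmetic behind why odd caps like 34 or 45 work so well on VRR is simple enough to print out. A quick sketch; the 20 ms worst-case figure is only an example, not a measurement of any particular CPU:

```python
# Per-frame time budget at each cap. With VRR, any cap whose budget your
# slowest component can always meet gives flat frametimes, no matter how
# unusual the number looks.
WORST_CASE_CPU_MS = 20.0  # example figure for an aging CPU, not a measurement

for fps in (30, 34, 45, 60, 80):
    budget_ms = 1000 / fps
    ok = "holds" if budget_ms >= WORST_CASE_CPU_MS else "drops frames"
    print(f"{fps:>3} FPS -> {budget_ms:5.1f} ms budget: {ok}")

# A CPU needing ~20 ms in worst-case scenes holds 45 FPS (22.2 ms) but not
# 60 (16.7 ms) - which is why a 45 cap can look perfectly flat while
# uncapped frametimes go all over the place.
```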
 
I don't like to talk smack, but NX just confuses me a lot of the time with how he presents information, let alone his conclusions, which are often wrong or arrived at in strange ways.

I much prefer DF's way of distilling things down so normal people can get the basic gist.
 
Listen, the PS5's I/O system is not in need of your approval - it is generally accepted as world-class.
That's true, of course. The problem with this port is that the PS5's I/O works in a way that makes little to no sense to try to replicate on PC, because of the lack of specialized hardware. Which is why we believe the port has the issues it does.
A thread-rule question: since the title of this thread is about DF articles, is it not allowed to post NXGamer's findings in here?!
There is a thread for non-DF technical discussion here. NXGamer/IGN technical reviews are usually discussed there.
 
RAD Game Tools did not make the chip - they developed Kraken and Oodle...
And link the exact statement where they say otherwise about the performance of the Kraken decoder chip.

Sure...

"Fabian 'ryg' Giesen said...
(I also work at RAD on Oodle.)
The Kraken decoders are not "equivalent to 9 Zen 2 cores", that's quoting wildly out of context; by the same rationale a Deflate decoder that hits 5-6 GB/s output would be "equivalent to 12 Zen 2 cores" which is just as misleading.

Link

Only my speculation? I thought it was agreed in here that Naughty Dog seemingly just ported the streaming system from their PS5 build to PC, hence the CPU hammering.
Yes, it's all speculation and guesswork, as none of us work at ND or have any confirmation from anyone on the port that that is what they did.
Listen, the PS5's I/O system is not in need of your approval - it is generally accepted as world-class.
I never said it needed my approval; you're the one seeking everyone's validation for its existence and powa!

You just need to chill your beans on hyping it up. The PS5 has been out for over 2 years now and we haven't seen anything that can't be done on platforms that don't have an equivalent I/O block.
Spiderman was a last-gen port.
That's true, but it still takes better advantage of the custom I/O hardware than TLOU does, as noted by its sub-2-second load times (load times that can be matched by PCs, which do not have a custom I/O block, no less).
They did many things to circumvent the slow 30 MB/s I/O speed of the PS4. There was a whole GDC talk about that specific problem. But most important - they made it happen even with 30 MB/s on PS4.
I know, I watched the presentation, it was very informative.

But the game was altered to take advantage of the increased storage speeds (the swinging speed is slightly faster in the PS5 version).
What does that tell you about the streaming hunger of that game? Right - it can't be a lot.
Do you have proof that it isn't a lot, or are you speculating again?

The game's burst decompression rate is much higher than TLOU's and thus more demanding.
But CPUs on PC were still hammered. From that little bit of decompression need.
And yet it offered better performance than what the PS5 was offering.
But we should postpone this entire discussion until the first second-wave first-party titles land on PS5.
I agree; we can then compare them to the first wave of DirectStorage games and see what excuses are made when PCs load faster than PS5.

The first DirectStorage-enabled game (Forspoken) can load faster on PC, so the future is bright for loading on PC and Xbox.
 
Pushed a 1440p DLSS Quality video with the important textures at High and 7.3 GB VRAM usage.


This location was a bit lighter on the CPU. I will do one final video in a more open section and then I'll be done with it. I've tried my best to spread the word about this, but it seems it's been in vain. I suspect most folks have really ludicrous amounts of idle VRAM usage, which indeed lines up with where ND projected it would land. There's nothing I can do about that aside from giving suggestions to dial that idle VRAM usage back down.

If you have 7.4 GB of VRAM free at idle, you can push 1440p/DLSS Quality with the important textures at high quality and consistent frametimes on 8 GB GPUs. 8 GB users can try my curated settings if they want to. (Once again, I apologize for the 1080p/30 FPS recording. This is the best I can do.)

@Dampf For 1080p, 6 GB is kind of workable, yes. But sacrifices have to be made. I've done some experiments on my friend's 1660 Super. We practically killed off every VRAM-using app (disabled Steam's hardware-accelerated web views and Discord's hardware acceleration; it's a temporary measure, he will enable them again once he is done with the game). We managed to get 5.75 GB of VRAM free at idle on the 1080p desktop. Before those two crucial steps, his game was reduced to a slideshow. Now the game at least functions.

So what we did was set the overall preset to Medium, and we had to resort to 1080p/FSR Quality (to claw back some VRAM by sacrificing resolution).
Reduce all geometry settings to Low (this will save some more VRAM).
Dynamic objects to Low, character textures to Medium, and most critical of all... environment textures to High.
Visual effects and volumetrics go back to Low again.
We reduced directional shadow resolution and point light shadow resolution to Low just to be sure.

These settings will produce around 5.8 GB of application VRAM usage. It is really pushing the limits, but it works. Don't use High character + High environment textures, as that will nerf the environmental textures at 1080p for some reason. Using both at High magically reduces VRAM usage to 5.2 GB and will result in blurry environmental assets. Using environment at High and character at Medium will use 5.8 GB and load high-quality assets.

Use Medium character + High environmental textures. Medium character textures still look somewhat "acceptable" and more than presentable compared to the PS4 remaster. My friend played for 2 hours like this and his performance was stable and consistent. However, due to his Ryzen 3100 being strangled, we had to cap his frames to 40 as well. CPU usage never fell below 100%, even with the 40 FPS lock. I suggested he upgrade to a 5600X, or a 3600 at worst.

Image quality is acceptable, at least compared to the PS4 version. 1080p FSR Quality is somewhat "bearable" if you're already accustomed to the "native TAA blur" that is very problematic at 1080p.

Most settings other than environment textures are just there for show. Of utmost importance is having environmental textures at High. Once you manage to fit those textures into your VRAM buffer, the game will look decent regardless of geometry and other rasterization settings.
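
For anyone repeating this exercise on another 6 GB card, the whole thing is really just a budget tally. The per-setting costs below are invented placeholders to show the method, not numbers measured from the game:

```python
# Invented placeholder costs (MB) purely to illustrate the budgeting method
# described above; measure your own numbers with the in-game VRAM meter.
FREE_AT_IDLE_MB = 5750  # what was left on the 1660 Super after cleanup

setting_costs = {
    "1080p FSR Quality render targets": 3150,
    "environment textures: High":       1800,  # the one setting to protect
    "character textures: Medium":        450,
    "geometry: Low":                      150,
    "shadows: Low":                       200,
}

total = sum(setting_costs.values())
margin = FREE_AT_IDLE_MB - total
verdict = "fits" if margin >= 0 else "over budget"
print(f"estimated {total} MB of {FREE_AT_IDLE_MB} MB free -> {verdict} "
      f"({margin:+d} MB margin)")
```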
 
It was never supported though.

Not 'officially', but they certainly gave that impression earlier:


This is just another example of why this isn't simply a case of PC gamers 'whining' because their hardware isn't matching PS5 performance 1:1 like in other titles, or of this being a unique porting dilemma - it may very well be one! But from the inaccurate recommended specs sheet, to the fact that the game was withheld from reviewers until launch day, it's ultimately an issue of a company knowingly releasing a broken product. The dilemma was one of Sony/ND's own making.

Whether anyone thinks your average PC isn't up to the task of running this uniquely demanding title is largely irrelevant in this context - a seller promoted and shipped a product they knew was deeply flawed.

(For the record, I agree with ND that Steam Deck 'fixes' should be at the absolute bottom of the to-do list. If you get PS5-native titles to run at all on the Steam Deck, it's a win - but it should never be expected.)
 
This is just another example of why this isn't simply a case of PC gamers 'whining' because their hardware isn't matching PS5 performance 1:1 like in other titles, or of this being a unique porting dilemma - it may very well be one! But from the inaccurate recommended specs sheet, to the fact that the game was withheld from reviewers until launch day, it's ultimately an issue of a company knowingly releasing a broken product. The dilemma was one of Sony/ND's own making.

Whether anyone thinks your average PC isn't up to the task of running this uniquely demanding title is largely irrelevant in this context - a seller promoted and shipped a product they knew was deeply flawed.
Yep. And this is why I throw tantrums on this forum about it. They release these games in a broken or very unoptimized state, and then scramble to fix them... only after a big fuss has been made. And very often those games ARE fixed up in relatively short order, which makes a person question what the hell is going on and why this stuff isn't being addressed before launch. It's a slap in the face to PC gamers, and we're rightfully getting tired of it.

Again though, we're seeing a lot of movement on this subject from the industry now. It seems like it's come to a point where they have to act.

Big props to @Dictator and Digital Foundry for really bringing this issue to the public's attention!
 
Ngl, way too many NeoGAF users are bringing their console-warrior baggage over here.

Anyway, I can't wait to see second-wave titles, because if it's not Unreal Engine, I have a feeling things will get significantly worse for some folks. To me, the PS5 and Series X are native 1080p-to-1440p consoles. As a result, I expect most of the 3000 series to age horribly, as it should. Even the unreleased 4060 Ti, 4060 and 4050 are dead on arrival. I really hope reviewers keep this in mind going forward as they do their new reviews. Again, regardless of performance, any GPU with less than 12 GB of VRAM should not be recommended as a long-term GPU.
I don't like to talk smack, but NX just confuses me a lot of the time with how he presents information, let alone his conclusions, which are often wrong or arrived at in strange ways.

I much prefer DF's way of distilling things down so normal people can get the basic gist.
Digital Foundry has better breakdowns for the most part, but it's really dependent on who is doing the video. Some are better than others. I think NX Gamer loses a lot of the audience with his word salad and video editing. Finally, I do wish Digital Foundry would let people arrive at their own conclusions. They're becoming far too opinionated of late, and certain members are getting quite hyperbolic with their takes. People then take those quotes and parrot them around as if DF were experts in making games or developing anything.
 
They're becoming far too opinionated of late, and certain members are getting quite hyperbolic with their takes. People then take those quotes and parrot them around as if DF were experts in making games or developing anything.
You can't stop people from being themselves. DF isn't trying to actively cause warring between platforms. But frankly, whether you say something or not, people will weaponise words, data, anything really, to get their way.
 
Sure...



Link


Yes, it's all speculation and guesswork, as none of us work at ND or have any confirmation from anyone on the port that that is what they did.

I never said it needed my approval; you're the one seeking everyone's validation for its existence and powa!

You just need to chill your beans on hyping it up. The PS5 has been out for over 2 years now and we haven't seen anything that can't be done on platforms that don't have an equivalent I/O block.

That's true, but it still takes better advantage of the custom I/O hardware than TLOU does, as noted by its sub-2-second load times (load times that can be matched by PCs, which do not have a custom I/O block, no less).

I know, I watched the presentation, it was very informative.

But the game was altered to take advantage of the increased storage speeds (the swinging speed is slightly faster in the PS5 version).

Do you have proof that it isn't a lot, or are you speculating again?

The game's burst decompression rate is much higher than TLOU's and thus more demanding.

And yet it offered better performance than what the PS5 was offering.

I agree; we can then compare them to the first wave of DirectStorage games and see what excuses are made when PCs load faster than PS5.

The first DirectStorage-enabled game (Forspoken) can load faster on PC, so the future is bright for loading on PC and Xbox.
did you not read what BRIT said not to do?!
For the sake of having civil discussions, can we please not do the this-for-that style of discourse that goes line by line? It usually ends up isolating others from the discussion.
Anyway, as to the link you provided: it is you who is using this quote out of context. In his Road to PS5 talk, Cerny said that the Kraken decoder chip equates to about 9 Zen 2 cores.
He did so to give the audience a picture of what to expect from that chip in terms of power. Of course, he meant it in the sense that if a Zen 2 processor had to do such a task (remember, a Zen 2 CPU, not a specific hardware decoder), it would need nine of those cores to make the decompression happen in real time without slowdown.

Fabian from Oodle then rightfully mentioned that it makes no sense to just call it a 9-core Zen 2 CPU equivalent, because (of course) a Zen 2 CPU would be a waste if ordered to do nothing but decompression nonstop. That's exactly the reason why Sony did not put a bigger CPU inside. Plain decompression is done far more effectively (and cheaply) with fixed-function hardware. But that still does not make Cerny's statement wrong - quite the opposite. It is technically still true: if you ordered a Zen 2 CPU to do the decompression on the fly, you would need nine of those cores to make it happen in time, and that with all the inefficiencies involved, because in the end it is not fixed-function hardware but multipurpose hardware. If anything, that makes the Kraken decoder look even more potent than before, because, as he rightfully states, fixed-function hardware will always beat multipurpose hardware when thrown the same problem. So much for that.
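
For what it's worth, the arithmetic behind that framing is easy to reproduce. The 5.5 GB/s raw rate and the roughly 8-9 GB/s typical output are from Cerny's talk; the per-core software decode rate below is my assumption, not a benchmark:

```python
# Back-of-envelope version of the "nine Zen 2 cores" framing. The raw SSD
# rate and typical compression ratio are from the Road to PS5 talk; the
# per-core software Kraken rate is an assumed illustrative figure.
raw_in_gbps = 5.5            # PS5 SSD raw read rate (Cerny)
typical_ratio = 1.64         # yields the ~8-9 GB/s typical output he quoted
out_gbps = raw_in_gbps * typical_ratio

per_core_gbps = 1.0          # ASSUMPTION: software Kraken decode per Zen 2 core
cores = out_gbps / per_core_gbps
print(f"~{out_gbps:.1f} GB/s out -> ~{cores:.0f} cores at {per_core_gbps} GB/s each")

# Giesen's objection, in these terms: those would be cores pinned at 100%
# doing nothing but decompression, which no real game would dedicate, so
# "equals 9 Zen 2 cores" overstates what the block replaces in practice.
```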

But interesting - did you scroll a bit further down in that quasi-Q&A comment section? The actual author of this blog post, Cbloom, said a couple of questions below:
In theory PC SSD's will keep getting faster, but you would need several CPU cores running software Kraken to match the decompressed bandwidth of the PS5 hardware Kraken. Even then, a typical game on the PC won't be able to achieve that IO speed because of other bottlenecks; once you're going that fast lots of other things in the system software can become problems, you have to address it all through the software.
Those other bottlenecks he speaks of don't exist on PS5, btw - everything is covered and set up in a way to support a 9 GB/s ongoing data stream. Not burst - ongoing, if needed.
We still don't see anything like Ratchet and Clank: Rift Apart on PC - why is that?

We will see which PCs will be faster than PS5. Nobody said it needs to be better than every PC. It loads faster than most of them - that's enough for now. In terms of raw power, the current-gen consoles have overtaken something like 70% of all PCs out there, according to the Steam Hardware Survey. But I happily state this here again for later quotes: I believe that the Ryzen 5 3600 / RTX 2070 Super combo will not be enough to match PS5's performance in FIRST PARTY TITLES from this year on, because publishers like Ubisoft will happily optimize for the 70% to get every platform on par and call it a day. So third-party titles will not be the test.
I expect Spiderman 2 and Wolverine to blow everything out of the water in terms of graphics. Insomniac especially seems to have the best insight into how to max out the PS5. They will deliver. Can't wait!
 
did you not read what BRIT said not to do?!
I know what he said; unfortunately, replying to you doesn't require a lot of detail, due to the poor logic and arguments you make and use.
In his Road to PS5 talk, Cerny said that the Kraken decoder chip equates to about 9 Zen 2 cores.
Which, after having some PS5 games that use the decompression hardware (like Spiderman) release on PC, we can now say is false.
He did so to give the audience a picture of what to expect from that chip in terms of power. Of course, he meant it in the sense that if a Zen 2 processor had to do such a task (remember, a Zen 2 CPU, not a specific hardware decoder), it would need nine of those cores to make the decompression happen in real time without slowdown.
Myself and everyone else knew exactly what you meant when you said what you did.
That's exactly the reason why Sony did not put a bigger CPU inside.
Sure it was. I mean, it had nothing to do with heat, power, die space or cost at all, did it?
But that still does not make Cerny's statement wrong - quite the opposite.
No; seeing Spiderman on PC decompress data just as fast as the PS5 does, without anywhere close to '9 Zen 2 cores', pretty much kills that claim.
It is technically still true: if you ordered a Zen 2 CPU to do the decompression on the fly, you would need nine of those cores to make it happen in time
Spiderman begs to differ.
But interesting - did you scroll a bit further down in that quasi-Q&A comment section? The actual author of this blog post, Cbloom, said a couple of questions below:

Those other bottlenecks he speaks of don't exist on PS5, btw - everything is covered and set up in a way to support a 9 GB/s ongoing data stream. Not burst - ongoing, if needed.
And with DirectStorage they're no longer an issue on PC either.
We still don't see anything like Ratchet and Clank: Rift Apart on PC - why is that?

Erm, Titanfall 2 was doing swaps like that in 2016.


We will see which PCs will be faster than PS5. Nobody said it needs to be better than every PC. It loads faster than most of them - that's enough for now. In terms of raw power, the current-gen consoles have overtaken something like 70% of all PCs out there, according to the Steam Hardware Survey.
True, but remember there are hundreds of millions of PC gamers, and 30% of those have a PC more powerful than a PS5, which means there are more gaming PCs faster than the PS5 than there are PS5s.
But I happily state this here again for later quotes: I believe that the Ryzen 5 3600 / RTX 2070 Super combo will not be enough to match PS5's performance in FIRST PARTY TITLES from this year on.
If these first-party games use ray tracing and support DLSS, it will.

I'm going to do the sensible thing now and put you on ignore.
 
Medium environment textures no longer load PS2-like textures at 1440p/4K. The 1080p texture behaviour is still somewhat bugged.

I'm not seeing any difference in medium texture quality at 1440p with the patch on my end. I think they would have mentioned it in the patch notes if there were a substantial upgrade to texture quality at medium?

 