The VRAM change between high and low textures is fairly small, if I'm remembering correctly, while the quality change is something like a quarter of the resolution. It doesn't make a lot of sense.
@BitByte Cards with 8GB or less are probably 80% of the market. Getting better results from medium and low would be a big improvement. High/ultra being for 10-16 GB cards is fine.
Textures aren't typically the biggest consumer of VRAM nowadays. Render targets and buffers often double or even triple that, which is possibly what's happening in TLOU.
Going from low to ultra textures only increases the VRAM allocation by ~3GB. Something else is taking up the remainder. RTs, geometry, and buffers all need VRAM as well, and not all of them scale with resolution.
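To make the "not everything scales with resolution" point concrete, here's some back-of-the-envelope arithmetic. All the formats and target counts below are illustrative assumptions, not anything measured from TLOU:

```python
# Rough VRAM arithmetic: render targets scale with resolution,
# textures with their top mip dimensions and compression format.
# All numbers here are illustrative, not TLOU's actual budget.

def render_target_mb(width, height, bytes_per_pixel):
    """Size of one uncompressed render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

def texture_mb(size, bytes_per_texel=0.5):
    """Square block-compressed texture with a full mip chain
    (~1.33x the top mip). 0.5 B/texel assumes a BC1-style format."""
    return size * size * bytes_per_texel * 4 / 3 / (1024 ** 2)

# A deferred renderer might keep several full-resolution targets alive:
# e.g. 4 G-buffer layers (RGBA8), depth (D32), HDR color (RGBA16F).
gbuffer_4k = 4 * render_target_mb(3840, 2160, 4)   # ~127 MiB
depth_4k   = render_target_mb(3840, 2160, 4)       # ~32 MiB
hdr_4k     = render_target_mb(3840, 2160, 8)       # ~63 MiB
print(f"4K targets subtotal: {gbuffer_4k + depth_4k + hdr_4k:.0f} MiB")

# Dropping a texture's top mip (2048 -> 1024) quarters its footprint.
print(f"2048^2 texture: {texture_mb(2048):.2f} MiB, "
      f"1024^2: {texture_mb(1024):.2f} MiB")
```

Note how halving a texture's resolution quarters its footprint while the render targets don't move at all, which is roughly the shape of the small-VRAM-delta, big-quality-delta discrepancy being discussed.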
CPU-limited to below 60 fps on a 13600K with low settings? That's pretty wild. I'd be very curious to see what the dominant CPU bottlenecks are.
Not even a stable 60 FPS in these outside parts. And the image quality isn't even on par with the PS4 Pro version...
Is Plague Tale Requiem a better comparison? It barely uses 6GB at 4K on PC.

While I don't find APT to be impressive, it does have great textures. However, its asset variety pales in comparison to TLOU's. Again, A Plague Tale doesn't appear to have been architected solely for the PS5 and then ported back to PC, which is the problem here. It's clear Naughty Dog is decompressing textures like crazy on the PS5. Why? I don't know, but if that behaviour is intentional, I don't expect the issue to be fixed easily.
It's strange; my overclocked 12400F is literally locked to 60 fps throughout the whole game, so something is crazy with that system.
As the game streams so much, I wonder how much influence the PCIe slot speed has on the frame rate.
Does a 2060 have PCIe 3.0? Can't remember.
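Turing cards like the 2060 are PCIe 3.0. Whether the slot matters depends on how much data is actually in flight; here's a quick sketch of the theoretical ceilings (the 1 GB burst figure is a made-up placeholder, not a measurement from the game):

```python
# Rough streaming-budget arithmetic. These are theoretical peak rates;
# real-world throughput is lower, and the bus is rarely the only bottleneck.
PCIE_GBPS = {
    "PCIe 3.0 x16": 16.0,   # ~16 GB/s each direction
    "PCIe 4.0 x16": 32.0,
}

burst_gb = 1.0  # hypothetical texture burst streamed in at once

for gen, gbps in PCIE_GBPS.items():
    ms = burst_gb / gbps * 1000
    print(f"{gen}: {burst_gb} GB ~= {ms:.1f} ms "
          f"({ms / (1000 / 60):.1f} frames at 60 fps)")
```

So even halving the bus only moves a worst-case burst by a few frames' worth of time; sustained hitching from the slot alone would need a lot of constant traffic.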
Edit: the CPU is showing 97% or so, though. On a 13600K at less than 60 fps, that's an achievement; it's a very strong gaming CPU. Unless the guy's memory timings are really bad.
Yea, he referenced ZLib. Again though, when you do a cost-benefit analysis, I don't know that it's worth fixing at all.

He references this, and talks about how PC textures are usually packed in a different format that's far more CPU-friendly, instead of natively using the PS5's compression. To fix this, of course, assuming there is a far more CPU-friendly solution that can handle this volume (which really does not seem exorbitant), would require re-authoring all the textures, so I'm very skeptical that will happen regardless, at least within the next 2 months.
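For anyone who hasn't profiled this kind of thing, the cost of the extra inflate step is easy to sketch. A toy benchmark with Python's zlib module and a synthetic payload (real asset data compresses, and therefore inflates, at different rates than this):

```python
import os
import time
import zlib

# ~64 MiB "texture package" with repeating content so Zlib has
# something to find; real data sits between this and random bytes.
payload = os.urandom(1024) * (64 * 1024)
blob = zlib.compress(payload, level=6)

start = time.perf_counter()
out = zlib.decompress(blob)
elapsed = time.perf_counter() - start
assert out == payload

print(f"Inflated {len(payload) >> 20} MiB in {elapsed * 1000:.0f} ms "
      f"(~{len(payload) / 2**20 / elapsed:.0f} MiB/s on one core)")
```

Multiply that by gigabytes of assets per streaming burst and it's easy to see cores getting saturated, whereas a GPU-readable block-compressed package can be uploaded as-is with no inflate step.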
This is good to know.

Not the mouse issue, which apparently ND will fix on Tuesday for TLOU; maybe they will finally port that fix back to Uncharted. Uncharted is still an extremely GPU-hungry game, though not quite as bad as this one, but other than that it runs very decently on a far wider range of systems; you basically just have to run at a lower res than you may be accustomed to with these ports.
It never required anything like the work needed here, and its release was far less in the limelight - Sony/ND wanted TLOU out to coincide with the interest in the TV series, and welp, they got the attention alright.
As @Remij mentioned, there are still two big releases to come - Factions and (I assume) TLOU2 - so that may give them more push to make the necessary time investment to get this engine into a more performant state, who knows.
In a way it is catering. Those who have >10GB of VRAM can run high textures with no problem. The texture issue solely affects those who have 8GB of VRAM. Even then, there are people on this forum providing evidence that they've found a way to run with high textures. The simple fact is that the base requirements have increased. Imo, the minimum acceptable VRAM on any card now is 12GB.

That's just silly. Even among the midrange to high end, 8GB is the majority of the market; it's not the bargain basement. If you don't care about 8GB cards, then you don't care about the PC, as you're completely shutting out the huge majority.
Alex is not against running textures at medium; what should not happen is that choosing that setting completely breaks texture quality so it resembles something from 20 years ago. That's broken. Preventing that from happening isn't 'catering', it's just designing your port so it actually works on what your users have.
It's a product. It's being sold, not bestowed. It's not 'whining' if it doesn't live up to what was advertised; that's called a product review. 8GB of VRAM is not 'shackling' games. No one is asking to run this at 4K with ultra textures and a new ray tracing feature on a 3070; they just expect textures that work correctly. There is just nothing on display here which warrants this game crippling 8GB cards. No other game that doesn't use ray tracing behaves this way. There are some games like Far Cry 6 where you can't use the Ultra HD texture pack, yes, but they scale down as you'd expect - the 'regular' textures are just a little less sharp, not a sea of toothpaste.

Sea of toothpaste is hyperbolic. In fact, some of DF's comments in the video about the textures were hyperbolic. It's not great, but they were complaining as if it was Rage-megatextures-on-PS3 level of bad, and it's not.
You realize that Crysis came out and most GPUs couldn't run it well? Now, I'm not saying TLOU is equivalent to Crysis, because it's not, but I'm just providing an example to invalidate your statement. People upgraded their GPUs to play Crysis.

Lol, no they wouldn't; my man, I was gaming on the PC during that time as well. The PC gaming market was absolutely tiny by comparison to today. The Voodoo2 was the new hotness in the late '90s, and it was $299, so ~$500 today. If a game came out that required SLI Voodoo2s, people would absolutely shit a brick.
The fact that an 8GB GPU is still being sold doesn't make it a good product. Once the console specs were announced, it was very clear that if you planned to keep your GPU for an extended period of time, you shouldn't buy anything with 8GB of VRAM. The minimum acceptable amount of VRAM was 12GB, and even then, that was barely enough. This happens every single console generation, so I'm not sure why people are surprised. Once a new console comes out, it's standard practice to buy a GPU with VRAM greater than or equal to the console's memory capacity. Many people knew of the perils of 8GB; go look back at Ampere's launch and you'll see lots of comments warning about it. If someone made the bad decision of purchasing an 8GB card in 2020, when you could easily exceed the 8GB buffer with mods, that's a personal choice.
Using ultra textures should not incur an additional performance penalty if you have enough VRAM. We know the port has significant issues, so the performance metrics here cannot be trusted.

Historically, consoles are best-bang-for-buck type machines. If you can't see the difference, developers will choose the better-performing option. It's the reason we continue to move away from native resolution to reconstruction and so forth.
On the question of texture quality: console games also must be designed with enough headroom to ensure a locked 60fps, and from what I understand, even on PS5 this game does not hold that, so why take an additional hit on textures if there is no visual difference?
Secondly, ultra settings are usually terrible in terms of value for performance; they are unoptimized and usually only viable for cards with enough power to brute-force them, which is not typically a console feature.
Sometimes consoles use custom settings mixes that selectively improve things: some items are at ultra and some items below, some weird mixture of low and high, and these exact settings have never been available on PC.
I don't think you need an hour to provide nuance. Half the video was just retreading old points. If they were going to spend a whole hour, they should have talked about the other issues plaguing the port.

DF does more than just report on things; they aim to educate, and education takes longer than a few sound bites. At the end of the day, someone has to do the heavy lifting instead of rushing to put out over-sensationalized YouTube videos for the sake of monetization and getting there first.
Many people complaining about it aren't looking for an answer; they are often looking to weaponize data points for an argument. Taking the time to provide nuance stops a lot of that, and their core audience appreciates detail. That's what the patron surveys are for.
Series S offers 8GB of total usable RAM. So while your comment will be true for Sony exclusives that have a base minimum of 12GB of usable RAM, for games built to run on Xbox, 8GB of VRAM will and should be OK.

This is assuming devs will not take the "PS2 textures, take it or leave it" attitude on the console.

I guess you missed the numerous complaints about the Series S? Anyway, I don't think devs will take the Series S into consideration when designing their games. They'll design their games around the constraints of the PS5 and Series X, then just deliver a half-assed port on the Series S. The Xbox demographic doesn't really buy games anyway due to Game Pass, so it's not a big loss. I mean, Microsoft's own data shows this reality. If I examine my spending habits: I've owned a Series X off and on (buy -> sell -> rebuy) since launch and I've purchased exactly 0 games for it due to Game Pass. Did the $1 conversion trick and locked it in for 3 years.
So if there is no visual difference, why incur a larger impact to your VRAM allocation unnecessarily? I would want to know the benefit here, because that space could be used for render targets and other items.
Do you know what is funnier, Scott? The argument of "we had 8 GB on the RX 480 in 2016". And do you know the reality? Nearly half of that amount went unused at 1080p until 2021, unless you were into niche stuff like ultra settings with a 30 FPS lock (which awkward people like me could actually do). This is the brutal truth none of these people can admit. The RX 480/580 never had the grunt to run the 1440p/ultra settings that required 8 GB of VRAM back then. Only in recent years has even 6 GB been saturated at 1080p. I'd say a 3060 Ti/6600 XT and the like are okay with 8 GB of VRAM, but yeah, a 4060 Ti/3070 Ti is pushing it a bit at 1440p.

@BitByte We should never go back to the days where people had to upgrade their PCs to play new games. That was pure stupidity, and it's financial suicide for game companies. The reality is, most games can scale in reasonable ways so low and medium settings actually look pretty good. Most PC games scale well, and that's a good thing. There are limits: you don't want to be supporting HDDs forever, or 8GB of RAM, or CPUs that don't support AVX, or maybe a particular shader model for GPUs. But 8GB of VRAM is still incredibly common; it's not time to drop it yet. I think in general people just want medium textures (and even low) to look a bit better. It really does look like a very old game, like PS360-era stuff.