Is Plague Tale Requiem a better comparison? It barely uses 6GB at 4K on PC.
And it looks significantly better in the process.
"Nobody asked them to port this game to the PC."

I sort of hard disagree here - this year we will finally be freed from the cross-gen shackles of the PS4. Asking Sony's first-party devs to plan ahead for PCs with (much less) CPU or GPU power would essentially put the shackles back on. Wider ones perhaps, but shackles nonetheless.
I could have put in A Plague Tale - I even downloaded it, perhaps to make that comparison. But in the heat of the moment during the recording, I kept thinking about how Nixxes did a lot better, so that's what I said. But yeah, A Plague Tale is a great comparison point for texture quality and VRAM usage while being visually and design-wise extremely similar to TLOU Pt 1.
Another one of the trio says that Naughty Dog should fix the CPU issues. My question is exactly how they propose to do that. If on-the-fly decompression is a requirement to keep RAM utilization in check, how do you speed up decompression without a significant CPU penalty? Again, the PS5 has dedicated hardware for this, so there it's almost free. I suppose they could use GPU decompression in an attempt to speed it up, but presumably that comes with a performance penalty of its own?
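To make the trade-off concrete, here is a minimal sketch (not Naughty Dog's actual pipeline - the Chunk format and the decompress() call are stand-ins) of how CPU-side streaming decompression is typically structured: a pool of worker threads drains a queue of compressed chunks, and every byte decoded costs cycles the game simulation no longer gets. The only real knobs are more workers (stealing cores from the game) or a cheaper codec, which is exactly the bind described above.

```cpp
// Minimal sketch of CPU-side streaming decompression (illustrative only).
// Worker threads drain a queue of compressed chunks; every chunk decoded
// here consumes CPU time that the PS5 instead spends in fixed-function
// hardware.
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

struct Chunk { std::vector<unsigned char> compressed; };

// Stand-in for a real codec call (Oodle, zlib, ...): identity copy here.
std::vector<unsigned char> decompress(const Chunk& c) { return c.compressed; }

class DecompressPool {
public:
    explicit DecompressPool(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            threads_.emplace_back([this] { run(); });
    }
    ~DecompressPool() {
        { std::lock_guard<std::mutex> g(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : threads_) t.join();
    }
    void submit(Chunk c) {
        { std::lock_guard<std::mutex> g(m_); work_.push(std::move(c)); }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            Chunk c;
            {
                std::unique_lock<std::mutex> lock(m_);
                cv_.wait(lock, [this] { return done_ || !work_.empty(); });
                if (done_ && work_.empty()) return;
                c = std::move(work_.front());
                work_.pop();
            }
            auto texels = decompress(c); // CPU cost paid per streamed chunk
            // ... hand `texels` to the texture streamer / renderer ...
            (void)texels;
        }
    }
    std::mutex m_;
    std::condition_variable cv_;
    std::queue<Chunk> work_;
    std::vector<std::thread> threads_;
    bool done_ = false;
};
```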
Honestly, I think what will happen is that they'll try to fix the easy-win bugs. Maybe optimize where they can, but I don't know that this will be a complete turnaround. Does anyone know if all of Uncharted's issues were sorted?
Finally, I must say, I do not believe devs should care about or cater to the concerns of 8gb GPU holders.
It's been a good 7 years, but it's time to move on. Devs should not shackle themselves to old hardware. For a long time we read lots of comments about how old consoles were shackling PCs; now consoles have stepped up the requirements and we're met with incessant whining.
In the 90s and early 2000s, people would have upgraded without blinking an eye.
Any volunteers for running TLOU shader comp on a PC with a mechanical HDD....??
He references this, and talks about how PC textures are usually packed in a different format that's far more CPU-friendly, instead of natively using the PS5's compression. Fixing this - assuming there is a far more CPU-friendly solution that can handle this volume (which really does not seem exorbitant) - would require re-authoring all the textures, so I'm very skeptical that will happen regardless, at least within the next 2 months.
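For illustration, the "re-authoring" being talked about would look something like an offline repack pass over every texture in the game. The sketch below uses made-up stand-ins (decode_console_tex, encode_pc_tex, the .tex extension), not real Sony or Oodle APIs - the point is simply that such a pass touches every asset, which is why it's a re-authoring job rather than a patch-sized fix:

```cpp
// Hypothetical offline repack pass (stand-in names, not a real toolchain):
// decode each texture from the console-oriented format and re-encode it
// into a PC-friendly one so runtime loads need far less CPU work.
#include <cstdint>
#include <filesystem>
#include <vector>

namespace fs = std::filesystem;

std::vector<std::uint8_t> decode_console_tex(const fs::path& p) {
    // Stand-in: a real tool would decode the console-native encoding here.
    (void)p;
    return {};
}

void encode_pc_tex(const fs::path& p, const std::vector<std::uint8_t>& texels) {
    // Stand-in: a real tool would write a CPU-friendly format (e.g. DDS).
    (void)p; (void)texels;
}

void repack_all(const fs::path& in_dir, const fs::path& out_dir) {
    for (const auto& entry : fs::recursive_directory_iterator(in_dir)) {
        if (entry.path().extension() != ".tex") continue; // made-up extension
        auto texels = decode_console_tex(entry.path());
        encode_pc_tex(out_dir / entry.path().filename(), texels);
    }
}
```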
Not counting the mouse issue, which ND will apparently fix on Tuesday for TLOU - maybe they will finally port that fix back to Uncharted. Uncharted is still an extremely GPU-hungry game, though not quite as bad as this one, but other than that it runs very decently on a far wider range of systems; you basically just have to run at a lower resolution than you may be accustomed to for these ports.
It never required anything like the work here, and its release was far less in the limelight - Sony/ND wanted TLOU out to coincide with the interest in the TV series, and welp - they got their attention alright.
As @Remij mentioned, there are still two big releases - Factions and (I assume) TLOU2 - to come, so that may give them more push to make the necessary time investment to get this engine into a more performant state. Who knows.
That's just silly. Even among the midrange to high end, it's the majority of the market; it's not the bargain basement. If you don't care about 8GB cards, then you don't care about the PC, as you're completely shutting out the huge majority.
Alex is not against running textures at Medium; what should not happen is that choosing that setting completely breaks texture quality so it resembles something from 20 years ago. That's broken. Preventing that from happening isn't 'catering', it's just designing your port so it actually works on the hardware your users have.
It's a product. It's being sold, not bestowed. It's not 'whining' if it doesn't live up to what was advertised. That's called a product review. 8GB of vram is not 'shackling' games, no one is asking to run this at 4k with Ultra textures and a new ray tracing feature on a 3070, they just expect textures that work correctly. There is just nothing on display here which warrants this game crippling 8gb cards. No other game that doesn't use ray tracing behaves this way. There are some games like Far Cry 6 where you can't use the Ultra HD texture pack yes, but they scale down as you expect - the 'regular' textures are just a little less sharp, not a sea of toothpaste.
Lol, no they wouldn't - my man, I was gaming on the PC during that time as well. The PC gaming market was absolutely tiny compared to today. The Voodoo2 was the new hotness in the late 90s, and it was $299, so ~$500 today. If a game came out that required SLI Voodoo2s, people would absolutely shit a brick.
Yeah I could do it later if you want, albeit I'm not sure of the purpose - even with ~10GB of compiled shaders, the majority of the time is still going to be the CPU compiling, not the IO. Writing 10GB to a HDD takes a couple of minutes, if that.
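Rough numbers behind that claim, assuming a typical ~120 MB/s sustained sequential write for a mechanical HDD (an assumption, not a measurement): 10 GB ÷ 120 MB/s ≈ 85 seconds. So even a slow drive adds a minute or two at most on top of the long CPU-bound compile step.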
Finally, at the top of the hill of general-purpose compression algorithms sits a commercial package called Oodle [20]. Its high-compression algorithms, Kraken and Leviathan, beat any other currently available compression algorithm in one or more categories.
Kraken is following up on a bunch of leads we ran into last year while working on LZNA and BitKnit, several things we learned tuning LZNib for the PS4 (1.6GHz AMD Jaguar cores, 2 instrs/cycle max, ~200 cycle memory access latency, ~24 cycle L2 cache latency; relatively challenging target for compression, though heaven compared to older game consoles), and earlier stuff we've been experimenting with since about 2014. It's basically combining the more successful ideas from BitKnit, LZNib (that one Charles has written about extensively) and LZB (LZ-bytewise, essentially an LZ4 derivative, which Charles has been very explicit about from the beginning [...]).
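For anyone unfamiliar with why those Jaguar numbers matter: LZ-family decoders spend their time on dependent loads and copies. The loop below is a generic LZ77-style decoder with a made-up token format - illustrative only, not Kraken/Oodle source - and the match copy shows the pattern where a cache miss (~200 cycles on Jaguar) stalls everything behind it:

```cpp
// Generic LZ77-style decode (illustrative; NOT Kraken/Oodle source).
// kind == 0: literal run of `len` bytes at `lit`.
// kind != 0: match - copy `len` bytes starting `offset` bytes back.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Token {
    std::uint8_t kind;
    std::uint32_t len;
    std::uint32_t offset;     // only used for matches
    const std::uint8_t* lit;  // only used for literal runs
};

void lz_decode(const std::vector<Token>& tokens, std::vector<std::uint8_t>& out) {
    for (const Token& t : tokens) {
        if (t.kind == 0) {
            out.insert(out.end(), t.lit, t.lit + t.len); // literal bytes
        } else {
            // The copy source is output we only just produced, so every
            // iteration depends on earlier ones - this is where memory
            // latency hurts. Byte-wise copying also handles overlapping
            // matches (offset < len) correctly.
            std::size_t src = out.size() - t.offset;
            for (std::uint32_t i = 0; i < t.len; ++i)
                out.push_back(out[src + i]);
        }
    }
}

int main() {
    const std::uint8_t abc[] = {'a', 'b', 'c'};
    std::vector<Token> toks = {
        {0, 3, 0, abc},      // literals "abc"
        {1, 6, 3, nullptr},  // match: 6 bytes from 3 back -> "abcabc"
    };
    std::vector<std::uint8_t> out;
    lz_decode(toks, out); // out now holds "abcabcabc"
}
```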
"Character model geometry is comparable, although I would give the edge to Part I."

They're really not; APT is a level above TLOU.

"I will agree texture is similar."

Again, APT is several levels above TLOU here.
Sorry, I can't take you seriously anymore with these responses.
"That's not how it works, having to create an extra lower tier of assets will not put any shackles on PS5 at all."

What are you talking about?? Making a game run (even if barely) on old quad cores automatically puts restrictions on the game's design in terms of general complexity. AI (that includes weather systems and physics, btw) for enemies will suffer. World size will suffer. Are you going to tell me that Insomniac, when they planned Rift Apart, could have aimed for exactly the same game if they had to consider it being able to run on a quad core / GTX 1060 combo in a later PC port??
And if developers want to make money from their PC ports, then they absolutely need to ensure the ports are good, or they lose out.
Steam has already issued loads of refunds to those who bought TLOU on PC, meaning the developers have not only lost out on revenue, but their reputation as a developer has also taken a blow.
"What are you talking about?? Making a game run (even if barely) on old quad cores automatically puts restrictions on the game's design in terms of general complexity."

This is false: some of those 'old' quad cores still offer higher IPC than the early Ryzens, and as games are still mainly single-thread limited, your claim is ridiculous.

"AI (that includes weather systems and physics, btw) for enemies will suffer."

No they won't; you just turn those things down or off for low-end systems.

"World size will suffer."

In what way?

"Are you going to tell me that Insomniac, when they planned Rift Apart, could have aimed for exactly the same game if they had to consider it being able to run on a quad core / GTX 1060 combo in a later PC port??"

Spider-Man (which uses the same engine as R&C) runs just fine on a quad core and a GTX 1060, thank you, and it actually has a more complex world than R&C.

"THAT'S not how it works."

Again, yes it is.

"Several console gen cycles have already proven that it is the sudden development for one (new) hardware target that brings the big step-ups in graphics and scope, not the dragged-along support of very old systems."

You're right, PC should stop dragging these old systems (consoles) along with it.

"So NO - Sony had better not gimp their own first-party games in favor of PC."

We've had several first-party games on PC now, all of them offering better graphics and options than their respective PS versions.

"And anyway, there are people out there with very good gaming PCs."

Like me.

"They are usually fine with the ports."

No we're not.

"It was so often said that one of the biggest advantages a PC has over a console is that it can be upgraded."

True.

"So that's what Kevin, Angie, Paul and Ismael are going to do - go outside for once, go to a hardware shop, and upgrade their effing PC."

No we won't.

"Not Sony gimping their PS5 games."

The only thing gimping PS5 games is the 2018-level PC GPU inside the PS5.

"Oh, and before the hard feelings hit - that includes myself."

What hard feelings?

"I was gaming with a Ryzen 1500X / GTX 1650 Super for quite some time."

You poor thing.

"Last year in autumn I upgraded to a Ryzen 5 5600 and an RTX 3060 12GB. And since my PC is connected to a 1080p screen, I will be fine for the duration of the gen in terms of PS5 equivalency of ports."

The TLOU port says you won't.

"But for now I am playing Warhammer 3 with the system and it runs quite well."

That's nice.
Do not even attempt to debate TLOU having better textures than APT.
I've played both on max settings and APT slaughters TLOU in the texture department.
And it also has these character models:
[screenshot attachment]
In typical ND style, the model of Joel you get in cutscenes is vastly superior to what you get in gameplay.
This is a typical wall texture in APT - leagues above anything in TLOU:
[screenshot attachment]
If you feel TLOU is even close to those textures, then I (and I'm sure many others on this forum) will not be taking you seriously.
Yeah, no, I'm not playing the screenshot game with you, but thanks anyway. And considering past consensus among members of this forum has turned out to be hilariously off the mark, you can imagine how much credibility I give to this forum's groupthink.
You don't need to play the screenshot game; anyone who's played both games, or anyone who is unbiased, can see it's no contest: APT easily wins in the texture department.
It seems this port has really brought the discourse down to NeoGAF/Twitter levels. "Forum groupthink"? This is one of the most rational and constructive forums out there, IMO.
Oftentimes, yes. But I can point to many occasions where herd mentality seemed to take over, only to be proven incorrect later. Not really a big deal; I just brought it up to knock down the fallacy davis used in suggesting that consensus within a forum certifies whether or not his statements are correct.