The Last of Us, Part 1 Remaster [PS5, PC]

So basically what you're saying is: GPU theoretical power alone, as expected, doesn't tell the whole story about performance. Should be obvious to anyone, really.

It's not that the PS5's GPU is stronger than the 6600 XT; the rest of the system is allowing the PS5's GPU to use much more of its peak theoretical power, because it has more RAM (12 GB vs 8 GB) and more bandwidth. The 6600 XT only gets close to the PS5's bandwidth when it's hitting its embedded cache; the PS5 has that level of bandwidth by default.

GPU memory bandwidth should absolutely be considered part of the GPU's "theoretical power", and that bandwidth should be considered in the context of the GPU's resolution target. That's doubly important for RDNA2 with its Infinity Cache, where it's not only the speed (bandwidth) that is scaled for a specific resolution but also the capacity. The 6600 XT was specifically scaled by AMD to hit its full potential at 1080p, so it could be argued that 1080p is the resolution where it should be compared to the PS5, with higher resolutions adding a handicap the GPU simply wasn't designed to cope with.
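To put rough numbers on why that capacity scaling matters, here's a back-of-envelope sketch. The raw GB/s figures are spec-sheet values, but the cache bandwidth and the per-resolution hit rates are my own assumptions for illustration, not measured data:

```python
# Back-of-envelope effective-bandwidth model for RDNA2's Infinity Cache.
# The raw VRAM figures are spec-sheet values; the cache bandwidth and
# per-resolution hit rates are assumptions for illustration only.

VRAM_BW_6600XT = 256.0    # GB/s: 128-bit bus @ 16 Gbps GDDR6
CACHE_BW_6600XT = 1100.0  # GB/s: assumed peak for the 32 MB Infinity Cache
VRAM_BW_PS5 = 448.0       # GB/s: 256-bit bus @ 14 Gbps GDDR6, shared with the CPU

# Assumed hit rates: the 32 MB cache was sized around 1080p working sets,
# so the hit rate falls as render resolution (and the working set) grows.
ASSUMED_HIT_RATE = {"1080p": 0.55, "1440p": 0.40, "2160p": 0.25}

def effective_bandwidth(hit_rate: float, cache_bw: float, vram_bw: float) -> float:
    """Blend cache and VRAM bandwidth, weighted by the cache hit rate."""
    return hit_rate * cache_bw + (1.0 - hit_rate) * vram_bw

for res, hr in ASSUMED_HIT_RATE.items():
    bw = effective_bandwidth(hr, CACHE_BW_6600XT, VRAM_BW_6600XT)
    print(f"{res}: 6600 XT ~{bw:.0f} GB/s effective vs PS5 {VRAM_BW_PS5:.0f} GB/s raw")
```

The absolute numbers are guesses; the point is the shape of the curve. Under those assumptions the 6600 XT's effective bandwidth slides from roughly 720 GB/s at 1080p to under 470 GB/s at 4K as the working set outgrows the 32 MB cache, while the PS5's 448 GB/s (shared with the CPU) is there at every resolution.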
 
Requirements have now been amended: the 5700XT is in place of the 5800XT, and the 7900XTX is in place of the 7900XT.

The 8GB version of the 3060, one of the worst GPU SKUs on the market atm, is chosen as a recommended requirement. Performance-wise, it's worse than the 2070 Super by quite a large margin.
What are they thinking?
 

There's also Iron Galaxy's logo in the image now.

[Image: amended system requirements chart]
 
I never said they were (that was Techuse), whatever that means. No recommended specs can ever be 'dead on'. I'm saying that, with respect to GPU power, these recommended spec charts are more accurate than not when we compare the performance of the games once released against their earlier recommended charts. Can you point to one of these recommended spec releases where the GPU power recommended for a given fps/resolution was wildly overestimated?



So then what's the problem? You're the one expressing bewilderment that people here are actually taking these released specs as representative of how GPU-hungry the game will be. From past history, especially comparing against the only other ported game built on a Naughty Dog engine, there's more reason to believe these will be closer to the actual GPU demand of the game than not.



"Like everybody" - but you're posting this in a thread where nobody here is expressing shock/surprise at CPU requirements. Everyone here is talking about the GPU.



There isn't a 'strategy' here; there's just recalling actual history. You're coming into a thread where some have expressed a little surprise/concern about the GPU requirements being relatively 'high', and you've effectively told everyone: hey, calm tf down, these specs are usually nonsense and there's no reason for concern.

Your initial post doesn't make any sense in that context if you're actually arguing that the release performance could be more demanding than these recommended specs indicate, so it's pretty clear your angle here is to assuage the (tepid) concern by dismissing every aspect of the recommended specs as Sony just pulling them out of their ass. So Horizon (and, I would argue, Spider-Man to an extent on the lower-end recommended specs) coming in more demanding at actual release is perfectly applicable to my argument: these recommended spec charts are not exorbitantly overshooting the required hardware. With respect to GPU power, on the whole, they have a history of being pretty accurate.
So what's the problem? When something is only 'vaguely' accurate through guesswork, it should NOT be treated seriously, like it's absolute gospel. Cuz those examples you named are the exception, not the rule. Most of the time these charts are so wide of the mark that you couldn't even claim they're in the vague ballpark of accurate.

People genuinely base their purchasing decisions on this stuff. They judge the entire port's quality before the game even comes out. And they generally display a total lack of critical thinking that is endlessly frustrating to watch, as they take seriously something that is never really accurate, over and over and over again.

And again, you're completely ignoring my whole point about the CPU requirements being inaccurate. I'll repeat myself, though I shouldn't have to: the fact that the CPU requirements are CLEARLY made up is all the evidence we need that these requirements in general aren't being tested thoroughly and shouldn't be taken so seriously. Just because people are only 'talking about' GPUs doesn't change this. It's not complicated.

Your interpretation of my 'agenda' here is also laughably wrong. I have no agenda in regards to this specific game. My only REAL point is that people should just wait and see until the game (or any game) releases before fretting over hardware demands. Again, this shouldn't be complicated to understand. So yes, whether something overstates or understates demands is completely irrelevant; the point is that they aren't accurate. You have to invent some other agenda for my post because you just don't want to admit that these should NOT be taken seriously, that people are being extremely silly for doing so, and that they seem to never learn.

EDIT: Apparently they're trying to ensure this runs OK on Steam Deck, which would already confirm the SSD minimum requirement is not correct, to say nothing of the other specs that are obviously well above the Steam Deck's.
 
I'm actually pleased if Iron Galaxy are helping with TLOU P1. The Uncharted Collection has always run really well for me, although there definitely IS a problem with mouse/keyboard controls where the camera is stuttery... which doesn't happen with a controller, so I completely understand where some people are coming from with their concerns.

Hopefully TLOU P1 doesn't have this same issue, and if it does, then we'll have to make a big fuss to hopefully get mouse controls fixed for both.
 
I notice people don't talk about Nixxes anymore. Sony bought them specifically for PC ports like this, so why aren't they working on it? They're amazing at PC ports and always have been, on top of handling the PS4 ports of the Tomb Raider games.
 
Supposedly working on Ratchet and Clank: Rift Apart.

Seems pretty clear to me that Nixxes is working closely with Insomniac at the moment and Iron Galaxy with Naughty Dog... probably getting those teams up to speed so they can eventually do internal porting on their own as they grow.
 
But if they're teaching internal studios about PC, what would their job become? That seems off.
 
Get the teams to a point where they can be self-sufficient, then move on to other teams, maybe? They've ported the Insomniac engine over to PC, so they're the best studio to efficiently bring Insomniac's library of titles to PC. After that, I could see them integrating with another studio and doing the same.
 
Nixxes can only do so much on their own. They will always be valuable and useful, but if Sony wants to make a bigger push for PC releases, they need the main studios to be competent at doing these ports as well when Nixxes is busy with other stuff.
 