Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

lol what a bizarre response.

No, I'm just fed up with stupid questions and assumptions in this place at the minute from people reading 'x' comment on the interwebs!!11!!!!

You failed to ask whether I was using v-sync, you didn't initially ask what my GPU load was, and you brought up the price of my GPU because....?????

So yea, question better.
 
No, I'm just fed up with stupid questions and assumptions in this place at the minute from people reading 'x' comment on the interwebs!!11!!!!

You failed to ask whether I was using v-sync, you didn't initially ask what my GPU load was

This isn't answering a troll's riddles to cross a bridge. You could just answer what GPU usage you were getting instead of throwing this tantrum, as techuse asked as well. People in a technical forum want actual details, not someone being difficult.

and you brought up the price of my GPU because....?????

Because you said it was 'amazing!!!', and it's a console port that's currently being review bombed, so your experience seems to be outside the norm and more details are naturally going to be requested. Obviously, a common topic here is how 'good' a port is relative to what it achieves on its native platform, so the power/price disparity is perfectly relevant.

It's not as if this were a game review thread where you popped in to report "I'm having a blast" and I told you your experience was invalid. It's a thread for reporting on the technical makeup of released games.

So yea, question better.

Make actually informative posts. Don't you have a game to be playing btw?
 
This isn't answering a troll's riddles to cross a bridge. You could just answer what GPU usage you were getting instead of throwing this tantrum, as techuse asked as well. People in a technical forum want actual details, not someone being difficult.
And yet you go to Steam reviews for technical details.
Because you said it was 'amazing!!!',
And? For the power I'm using I think it is.
and it's a console port that's currently being review bombed, so your experience seems to be outside the norm and more details are naturally going to be requested.
Plenty of videos are going up on YouTube showing great performance on GPUs at my level or higher.
Make actually informative posts. Don't you have a game to be playing btw?
Maybe when you stop posting info off Steam reviews.
 
I'm sure this port could be significantly improved, but at a certain point the reason these tentpole PS5 exclusives are so impressive is that the tech, feature set, and content of the game are molded so tightly to a fixed platform. Of course there's excessive VRAM overhead when supporting a platform without unified memory that needs to pre-compile shaders. Of course it's hard to make it scalable -- the entire development process until this port was targeting a single spec! Of course it's not "as good as" the previous Naughty Dog ports; they were PS4 games.
 
I'm sure this port could be significantly improved, but at a certain point the reason these tentpole PS5 exclusives are so impressive is that the tech, feature set, and content of the game are molded so tightly to a fixed platform. Of course there's excessive VRAM overhead when supporting a platform without unified memory that needs to pre-compile shaders. Of course it's hard to make it scalable -- the entire development process until this port was targeting a single spec! Of course it's not "as good as" the previous Naughty Dog ports; they were PS4 games.
It seems to be similar to the previous ND port. What is this game doing so differently from TLOU 2 that it's disqualified as a PS4 game? A 1060 being unable to hit 60 fps at 720p/low, which looks vastly worse than TLOU 2 on PS4, is indefensible.
 
So now the gaming press is mad when games pre-compile all shaders?

I think they're saying it takes an hour to compile shaders on the Steam Deck. Especially when the game was advertised as Steam Deck ready, it's completely reasonable to note that you will be waiting an hour after it's installed before you can start to (maybe) enjoy the game.

This is potentially something Valve could provide in the near future by compiling the shaders on their own servers, as they have for other games; they likely didn't get the code until now, like everyone else.

It seems to be exactly the same as the previous ND port. What is this game doing so differently from TLOU 2 that it's disqualified as a PS4 game? A 1060 being unable to hit 60 fps at 720p/low, which looks vastly worse than TLOU 2 on PS4, is indefensible.

Yeah, going from this to TLOU2 on my PS5, the differences are relatively marginal. TLOU1:RM definitely looks better, sure, but it is really not that significant a leap visually over TLOU2.
 

The 4090 is clearly punching below its weight. The 1060 is a complete disaster.

How so?

At native 4K it's running at well over double the frame rate at Ultra settings.

If we find out the PS5 is equivalent to Ultra settings you have a point, but if the PS5 is level with, say, Medium, then the 4090's performance is fine.

And how much of the 1060's performance is due to it being nearly 7 years old and well down Nvidia's driver priority list?
 
I think they're saying it takes an hour to compile shaders on the Steam Deck. Especially when the game was advertised as Steam Deck ready, it's completely reasonable to note that you will be waiting an hour after it's installed before you can start to (maybe) enjoy the game.

This is potentially something Valve could provide in the near future by compiling the shaders on their own servers, as they have for other games; they likely didn't get the code until now, like everyone else.



Yeah, going from this to TLOU2 on my PS5, the differences are relatively marginal. TLOU1:RM definitely looks better, sure, but it is really not that significant a leap visually over TLOU2.
The performance profile here honestly seems identical to the Uncharted 4 port.

How so?

At native 4K it's running at well over double the frame rate at Ultra settings.

If we find out the PS5 is equivalent to Ultra settings you have a point, but if the PS5 is level with, say, Medium, then the 4090's performance is fine.

And how much of the 1060's performance is due to it being nearly 7 years old and well down Nvidia's driver priority list?
The PS5 isn't running exactly at 30; there is going to be headroom to maintain v-sync. A 4090 is ~4x the raster performance. This is also probably one of the less demanding areas of the game. When the coding is competent, a 1060 can still perform admirably. This game isn't using state-of-the-art rendering tech that obsoletes a 1060.
 
I think they're saying it takes an hour to compile shaders on the Steam Deck. Especially when the game was advertised as Steam Deck ready, it's completely reasonable to note that you will be waiting an hour after it's installed before you can start to (maybe) enjoy the game.

This is potentially something Valve could provide in the near future by compiling the shaders on their own servers, as they have for other games; they likely didn't get the code until now, like everyone else.



Yeah, going from this to TLOU2 on my PS5, the differences are relatively marginal. TLOU1:RM definitely looks better, sure, but it is really not that significant a leap visually over TLOU2.
I understand your point, but the problem also extends to the PS5 itself more than the PC hardware. Logically, if the PS4 is able to run TLOU2 at 1080p/30 fps with that kind of gorgeous fidelity, you would expect something like the TLOU Part 1 remake to run at 1800p-2160p/60 fps on PS5, whereas it drops into the 20s in a 30 fps mode at native 4K, which is pretty hilarious.

I don't know what this remake is trying to do that's so demanding, whether on PC or PS5.

We need those sweet PS5-equivalent settings, IMO.
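
To put rough numbers on that expectation, here's a quick back-of-the-envelope sketch in Python. It only compares pixel throughput (pixels per frame times framerate) and assumes the commonly cited ~1.84 TFLOPS PS4 vs ~10.3 TFLOPS PS5 figures, so treat it as illustrative rather than a real performance model:

```python
# Rough pixel-throughput comparison: TLOU2 on PS4 (1080p/30) vs a
# hypothetical TLOU Part 1 target on PS5 (1800p/60 or 2160p/60).
# Assumes cost scales linearly with pixels * fps, which it doesn't exactly.

def throughput(width, height, fps):
    return width * height * fps

ps4_baseline = throughput(1920, 1080, 30)

targets = {
    "1800p/60": (3200, 1800, 60),
    "2160p/60": (3840, 2160, 60),
}
for name, (w, h, fps) in targets.items():
    ratio = throughput(w, h, fps) / ps4_baseline
    print(f"{name}: {ratio:.1f}x the pixel throughput of 1080p/30")

# ~5.6x for 1800p/60 and ~8x for 2160p/60, against a PS5 GPU that is
# roughly 5-6x the PS4's ~1.84 TFLOPS on paper -- which is why 1800p/60
# looks like a reasonable expectation and the native-4K drops look odd.
```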
 
The performance profile here honestly seems identical to the Uncharted 4 port.

It definitely seems similar in rendering load, but at least Uncharted's shader compilation during gameplay wouldn't affect frametimes much, if at all, so that seems to be a big distinction here. Also the VRAM: we'll have to see which texture setting is equivalent to the PS5's. If it's Ultra, then...yowza.

Even so, the evidence just keeps mounting that 12 GB really is going to be the bare minimum for any reasonable performance-tier GPU going forward. If/when the 4060 Ti releases, this is going to be a huge mark against it.

(It's bizarre how it's inherited the mouse stuttering issue from Uncharted as well. Also reports are that Iron Galaxy's involvement in this was quite minor - were they involved in m&k implementation specifically? :))
 
I understand your point, but the problem also extends to the PS5 itself more than the PC hardware. Logically, if the PS4 is able to run TLOU2 at 1080p/30 fps with that kind of gorgeous fidelity, you would expect something like the TLOU Part 1 remake to run at 1800p-2160p/60 fps on PS5, whereas it drops into the 20s in a 30 fps mode at native 4K, which is pretty hilarious.

I don't know what this remake is trying to do that's so demanding, whether on PC or PS5.
TLOU 2 doesn't even run at 4K/60 on PS5. Both run at the same resolution and framerate. It's likely more that the work required to tailor the engine to the PS5 doesn't justify the return.
 
It definitely seems similar in rendering load, but at least Uncharted's shader compilation during gameplay wouldn't affect frametimes much, if at all, so that seems to be a big distinction here. Also the VRAM: we'll have to see which texture setting is equivalent to the PS5's. If it's Ultra, then...yowza.

Even so, the evidence just keeps mounting that 12 GB really is going to be the bare minimum for any reasonable performance-tier GPU going forward. If/when the 4060 Ti releases, this is going to be a huge mark against it.

(It's bizarre how it's inherited the mouse stuttering issue from Uncharted as well. Also reports are that Iron Galaxy's involvement in this was quite minor - were they involved in m&k implementation specifically? :))
The problem goes beyond the 4060 Ti, really. They plan to make the 4060 8 GB too, and these 60-class cards are always the most popular ones. But the problem is that this level of performance is more than enough for 1080p, so most people will be able to run maxed-out Ultra settings. And the argument that 'you don't need 4K textures at 1080p' has always been misleading. In my own experience, I've always seen a huge uplift in texture quality in games with 4K texture packs regardless of my screen. So why or how they are still able to push this 8 GB narrative is beyond me. With next-gen textures, I can see 10-12 GB being the baseline for 1080p and 16 GB a must for 1440p.

I really don't understand the obsession with 8 gigs. The 3060 had the right amount, IMO. Some people mocked the card, saying it would 'never be able to utilize' that much, but here we are: it will be able to run maxed-out textures in RE 4 and The Last of Us even if it runs them at 1080p, and it will easily use upwards of 10 GB while doing so (same for Hogwarts, too).
 
Remember, a 3090 offered over double the raster performance of a PS5, and a 4090 is 60-70% faster than a 3090.

So that's not 4x the PS5, is it? It's barely over 3x (in which case the 4090's performance at native 4K with Ultra settings is about right).

And remember, the same GPU will always perform better inside a console than it does in a PC.
 
So that's not 4x the PS5, is it? It's barely over 3x (in which case the 4090's performance at native 4K with Ultra settings is about right).

And remember, the same GPU will always perform better inside a console than it does in a PC.
About 3.5-3.6x, depending on which resolutions you draw the performance-scaling numbers from.
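
For what it's worth, the arithmetic behind those multipliers is just this (a quick sketch using the ballpark ranges quoted above as assumptions, not measurements from this game):

```python
# Implied 4090-vs-PS5 raster multiplier from the figures in this exchange:
# a 3090 at roughly 2.0-2.2x a PS5, and a 4090 at roughly 1.6-1.7x a 3090.
# Both ranges are coarse assumptions, not benchmarks of this port.

ps5_to_3090 = (2.0, 2.2)       # assumed 3090-vs-PS5 range
gpu_3090_to_4090 = (1.6, 1.7)  # assumed 4090-vs-3090 range

low = ps5_to_3090[0] * gpu_3090_to_4090[0]
high = ps5_to_3090[1] * gpu_3090_to_4090[1]
print(f"Implied 4090 vs PS5 raster: {low:.1f}x to {high:.1f}x")  # ~3.2x to ~3.7x
```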
 
I really don't understand the obsession with 8 gigs. The 3060 had the right amount, IMO. Some people mocked the card, saying it would 'never be able to utilize' that much, but here we are: it will be able to run maxed-out textures in RE 4 and The Last of Us even if it runs them at 1080p, and it will easily use upwards of 10 GB while doing so (same for Hogwarts, too).

The argument that "You don't have the GPU power to run those cards at 4k anyways" was always bunk, I play a lot of games at '4K' with DLSS on my 3060, and with high textures it's not like DLSS massively cuts down on the amount of VRAM needed - and I'm not just looking at 'allocated' vram in rivatuner either, I'm looking at the actual amount used. Sure, most games would be 'ok with 8 or 10gb, but damn it's close.

I've always held the belief that a PC GPU needs as much VRAM as the console has in total working RAM to be 'safe' for that generation, and I think the past two gens have generally played out that way.

On a related note, this also goes back to my critique of most PC DIY gaming sites' benchmarks being generally shit, at least for detecting bottlenecks like VRAM spillage or shader compilation stutter. As Gamers Nexus pointed out years ago, 1% and even 0.1% lows are inadequate for detecting performance issues that are noticeable to a player, at least on a benchmark run long enough to be meaningful. Those 0.1% lows get averaged out enough that they give a false impression of a higher floor than what's actually perceptible to the gamer.

So with sites like Hardware Unboxed that only report 1% lows (not even 0.1%!), I wonder how often games are actually exhibiting major frametime spikes on 8 GB cards that get missed in the noise. 1% lows will not give you anywhere near the level of granularity you need to determine actual consistency.
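
To illustrate the averaging effect, here's a small sketch with synthetic frametime data (the numbers are made up purely to show how the metric behaves, not taken from any real benchmark run):

```python
# Synthetic ~10-minute run at ~60 fps with a handful of 150 ms hitches,
# showing how "1% low" (and even "0.1% low") figures smooth over spikes
# that a player clearly feels.
import numpy as np

rng = np.random.default_rng(0)
frametimes_ms = rng.normal(16.7, 1.0, 36_000)  # ~10 minutes of smooth frames
frametimes_ms[::6000] = 150.0                  # inject 6 big hitches

def low_fps(frametimes, fraction):
    """Average of the slowest `fraction` of frames, expressed as fps."""
    n = max(1, int(len(frametimes) * fraction))
    worst = np.sort(frametimes)[-n:]
    return 1000.0 / worst.mean()

print(f"Average fps : {1000.0 / frametimes_ms.mean():.1f}")
print(f"1% low fps  : {low_fps(frametimes_ms, 0.01):.1f}")
print(f"0.1% low fps: {low_fps(frametimes_ms, 0.001):.1f}")
print(f"Worst frame : {frametimes_ms.max():.0f} ms "
      f"(~{1000.0 / frametimes_ms.max():.1f} fps moment)")

# The 150 ms hitches are ~6-7 fps moments to the player, but after being
# averaged with hundreds of merely-slow frames the 1% low still reads as a
# fairly healthy floor, and even the 0.1% low sits well above the spikes.
```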
 