Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
I used to think these scars were some form of decal bonded to the skin mesh, but from this shot it looks more like they're written into a UV-space mask, which would explain why they look pixelated on large objects
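If the scars really are written into a shared UV-space mask, the pixelation follows directly from texel density: a fixed-resolution mask stretched across a larger surface leaves fewer texels per centimetre. A rough back-of-the-envelope sketch (the resolutions and sizes here are made-up illustrative numbers, not anything measured from the game):

```python
def texels_per_cm(mask_res, surface_extent_cm, uv_coverage=1.0):
    """Linear texel density of a damage mask across a surface.

    mask_res: mask resolution in texels along one axis
    surface_extent_cm: world-space size the UV island spans
    uv_coverage: fraction of the mask the island occupies (0..1)
    """
    return mask_res * uv_coverage / surface_extent_cm

# The same hypothetical 1024px mask on a human-sized character
# vs. a much larger creature:
print(texels_per_cm(1024, 180))   # ~5.7 texels/cm on a 1.8 m character
print(texels_per_cm(1024, 1200))  # ~0.85 texels/cm on a 12 m creature
```

At under one texel per centimetre, individual mask texels become visible as blocky scar edges, which matches the "pixelated on large objects" observation.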
 
Guy on Twitter has this more or less locked to 120fps on a 13900K and....... get this............ 8800MHz DDR5!!

For that CPU to still show performance gains at that memory speed shows how sensitive to memory bandwidth these CPUs are, and it's an area where console CPUs could benefit greatly thanks to their unified memory setup.
 
This is my personal guess.
The hair shader gets its main lighting contribution from indirect specular, i.e. environment reflections.
Without ray tracing, this simply fetches from sparsely placed reflection probes, meaning low spatial resolution (a plainer appearance) and light leaking due to reduced or absent specular occlusion, which makes the hair feel "glowing" or, in other words, "unshaded".
Tbh this issue exists in the prior entry as well, and many UE4 games kinda suffer from it (Hogwarts Legacy and Shin Megami Tensei V are the closest two I can recall)
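On the specular-occlusion point: a common non-RT mitigation is to attenuate the reflection-probe fetch by an occlusion term derived from baked or screen-space AO, e.g. Lagarde's Frostbite-style approximation. A minimal scalar sketch in Python (a real shader would do this per-pixel in HLSL/GLSL; the sample inputs are arbitrary):

```python
def specular_occlusion(n_dot_v, ao, roughness):
    """Lagarde/Frostbite-style specular occlusion estimated from
    diffuse AO. Multiplying the reflection-probe (indirect specular)
    sample by this term darkens occluded strands and curbs the
    'glowing, unshaded' look on hair."""
    exponent = 2.0 ** (-16.0 * roughness - 1.0)
    return max(0.0, min(1.0, (n_dot_v + ao) ** exponent - 1.0 + ao))

# A strand buried against the scalp (low AO) barely reflects...
print(specular_occlusion(0.5, ao=0.2, roughness=0.4))  # ~0.2
# ...while a fully unoccluded surface keeps the probe contribution.
print(specular_occlusion(0.5, ao=1.0, roughness=0.4))  # 1.0
```

Without a term like this, every strand samples the probe at full strength regardless of how buried it is, which is exactly the leak being described.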
Yes, you're actually correct. I had this with Hogwarts Legacy as I was playing a... redhead.


image.png

image.png


It's actually dark red;

image.png
 
This mirrors what I've seen. When the PS5 drops, it really drops low, like 30fps low (parts in quality mode feel like 15fps). Even if it is probably better on average, the PS5 is just as awful as the XSX in terms of overall experience.

My advice is to wait for a patch and see where the dust settles.

I know a lot of the reviews mentioned performance, but most I read/watched were talking about the PC version and implied the console versions were OK, which is really far from the truth.

I've not played past the 2nd level, but quality mode is fine the majority of the time if you're OK with 30fps.
 
Guy on Twitter has this more or less locked to 120fps on a 13900K and....... get this............ 8800MHz DDR5!!

For that CPU to still show performance gains at that memory speed shows how sensitive to memory bandwidth these CPUs are, and it's an area where console CPUs could benefit greatly thanks to their unified memory setup.

I wish he had shown the Koboh performance because that's where you get the worst stutters. But still impressive.

 
I think devs are being crunched to death, overworked, and overruled by publishers that want games out faster, while the complexity and scale of games isn't slowing down but instead vastly increasing with every title.

It's definitely a problem. But I won't fault the human labor for being "unskilled"; that's a bad take.
Wait, what? Devs have been suffering with crunch since the PS1 days; this is absolutely nothing new. I think it was Kaz who said they were sleeping on beds in the office while working on GT1. While the scope of games has increased, the quality of tools has also drastically improved. Studios have ballooned in size to unbelievable heights to chase a scope that they chose for themselves.

The devs of Jedi Survivor chose the scope of their game. The consumers didn't. In fact, most consumers aren't asking for the increased scope, as only a minority of consumers actually finish the game. On top of that, they literally fixed none of the structural problems, both in the previous game post-launch and in this game. I'd actually buy crunch as an excuse if they had fixed the structural CPU issues in Fallen Order after launch, but they didn't. They knew they had huge issues with CPUs, and if they couldn't fix them, they could have managed the scope of the game to work around the issue.

How long should crunch be used as a catch-all excuse to remove personal accountability? There are other developers suffering under the same constraints who still manage to deliver technically proficient products. I don't mean to sound unsympathetic to their plight, but I have never worked on a software project where there wasn't any crunch. It doesn't mean we were allowed to deliver a substandard product despite our poor working conditions.
 
Last edited:
Tools have not kept pace with the scope and complexity of games at all; they're behind by an order of magnitude. The AAA industry largely wiped itself out in the 7th gen, pushing thousand-person teams in the worst cases, when on PS1 the biggest team was barely 100 people.

And as more money flowed into the industry, publishers and shareholders became far more interested in drawing blood from a stone than in what would be best for workers.

In the PS1 era, crunch was 20 or 30 people working nights and overtime, and that was the worst teams in Japan.

Today it's normalized everywhere to not be home for weeks, sleeping in the office just because you're not allowed to see your family until the game ships, and we're talking about hundreds of people at a time. It's not sustainable and hasn't been sustainable. It's without a doubt partly to blame for what is happening right now.
 
And how much of that increased complexity is due to developers shoving RPG elements into every game so they can nickel-and-dime gamers with a metric fuckton of DLC? Not to mention the whole open-world issue. No large majority of gamers is demanding or even wants this, nor are there sales trends suggesting it's the only way to make a profitable game.
 
Last edited:
That doesn't seem very obvious to me? CPU utilization is probably pretty bad on consoles as well.

Maybe, but does that matter as long as it's delivering a solid 30fps? If your target is 30, it doesn't matter if, or why, you're limited above that. This game behaves like a 30fps game with an unlocked-fps option.
 
Guy on Twitter has this more or less locked to 120fps on a 13900K and....... get this............ 8800MHz DDR5!!

For that CPU to still show performance gains at that memory speed shows how sensitive to memory bandwidth these CPUs are, and it's an area where console CPUs could benefit greatly thanks to their unified memory setup.

I think the 6.3GHz OC might have something to do with it too! Raptor Lake does like a lot of memory bandwidth in some titles, but on Zen at least, memory latency seems to matter more, at least past a certain point:

 
Guy on Twitter has this more or less locked to 120fps on a 13900K and....... get this............ 8800MHz DDR5!!

For that CPU to still show performance gains at that memory speed shows how sensitive to memory bandwidth these CPUs are, and it's an area where console CPUs could benefit greatly thanks to their unified memory setup.
I would imagine it still has stutters. Bringing a 250ms stutter, or even an 80ms stutter, down to 8ms requires a full order of magnitude or more of processing power.

That's why stutters are so awful: they can take decades of compute performance gains to make "invisible".
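The "decades of compute" claim checks out with simple compound-growth arithmetic. Assuming roughly 15% annual single-thread gains (a made-up but ballpark-reasonable rate, not a measured figure), brute-forcing a stutter frame down to a 120fps frametime takes:

```python
import math

def years_to_close_stutter(stutter_ms, target_ms, annual_gain=0.15):
    """Years of single-thread CPU improvement needed to shrink a
    stutter frame down to the target frametime, assuming a fixed
    compound annual_gain rate (the 15% default is an assumption)."""
    speedup = stutter_ms / target_ms
    return math.log(speedup) / math.log(1 + annual_gain)

for stutter in (250, 80):
    print(f"{stutter} ms -> 8 ms needs {stutter / 8:.1f}x CPU, "
          f"~{years_to_close_stutter(stutter, 8):.0f} years at 15%/yr")
```

A 250ms hitch needs a ~31x faster CPU to disappear into a 120fps cadence, which at that growth rate is on the order of two and a half decades of hardware progress; even the 80ms case is a 10x gap.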
 
I wish he had shown the Koboh performance because that's where you get the worst stutters. But still impressive.


Still plenty of big-ass stutters though. And as mentioned that's not usually the area where you see the biggest CPU bottlenecks regardless.

1682950742514.png

Steve has chosen this hill to make his last stand for some reason.

He's taken these kind of potshots at DF before. Pretty juvenile.

Lol.

He’s not making an incorrect claim, I guess.

He is though.


Like Alex said, every CPU will get these stutters, and yes, that includes the latest X3D chips. On some systems they'll be less prominent than on others, but they do occur, especially when you actually benchmark an area outside of the first 30 minutes.

Additionally, here's Bang4buckPCGamer on a 7900XTX and a 7900X3D CPU, the 'golden combo' ticket according to HUB. Yes, GPU utilization is often at 99%; however, the stutters are clearly visible, with several large ones in just 30 seconds of traversal in that area. HUB is simply not testing (or not reporting) the problem areas.
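This is also why the choice of benchmark area matters so much: averages and even 1% lows over a clean section can hide traversal hitches entirely. A minimal sketch of pulling discrete stutter events out of a frametime log (the window, multiplier, and floor below are arbitrary tuning choices, not any outlet's methodology):

```python
from statistics import median

def find_stutters(frametimes_ms, window=30, factor=2.5, floor_ms=25.0):
    """Flag frames whose frametime spikes well above the local norm.

    A frame counts as a stutter if it exceeds both `factor` times the
    median of the preceding `window` frames and an absolute floor."""
    spikes = []
    for i, ft in enumerate(frametimes_ms):
        lo = max(0, i - window)
        local = frametimes_ms[lo:i] or [ft]  # fall back for frame 0
        baseline = median(local)
        if ft > max(factor * baseline, floor_ms):
            spikes.append((i, ft))
    return spikes

# Example: a steady ~8 ms (120fps) run with two traversal hitches.
trace = [8.3] * 40 + [250.0] + [8.3] * 40 + [80.0] + [8.3] * 20
print(find_stutters(trace))  # the 250 ms and 80 ms frames are flagged
```

Counting discrete events like this, rather than averaging, is much closer to how traversal stutter is actually experienced, and it makes the difference between test routes impossible to paper over.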
 
Last edited:
Still plenty of big-ass stutters though. And as mentioned that's not usually the area where you see the biggest CPU bottlenecks regardless.

View attachment 8867



He's taken these kind of potshots at DF before. Pretty juvenile.



He is though.


Like Alex said, every CPU will get these stutters, and yes, that includes the latest X3D chips. On some systems they'll be less prominent than on others, but they do occur, especially when you actually benchmark an area outside of the first 30 minutes.
ooof yea.

those lurches are pretty bad.
 