Yes, you're actually correct. I had this with Hogwarts Legacy as I was playing a... redhead. This is my personal guess.
The hair shader gets its main lighting contribution from indirect specular, i.e. environment reflections.
Without raytracing, this is simply a fetch from sparsely placed reflection probes, meaning low spatial resolution (a plainer appearance) and light leaking due to little or no specular occlusion, which is what makes the hair feel “glowing”, or in other words “unshaded”.
Tbh this issue exists in the prior entry as well, and many UE4 games kinda suffer from it (Hogwarts Legacy and Shin Megami Tensei V are the closest two I can recall).
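For what it's worth, here's a minimal sketch of the specular-occlusion point above (not the game's actual shader; the AO-to-SO remap is the common Lagarde/Frostbite-style approximation and all the input values are made up). Without that term, whatever the nearest probe returns lands on the hair at full strength, even when the strand is buried in shadow:

```cpp
// Illustrative sketch only: why probe-based indirect specular can look
// "glowing" on hair when there is no specular occlusion term.
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hypothetical AO-to-specular-occlusion remap (Lagarde-style approximation).
float specular_occlusion(float ao, float n_dot_v, float roughness) {
    float exponent = std::exp2(-16.0f * roughness - 1.0f);
    return std::clamp(std::pow(n_dot_v + ao, exponent) - 1.0f + ao, 0.0f, 1.0f);
}

int main() {
    // Pretend the nearest (sparsely placed) reflection probe returns a fairly
    // bright environment value, even though this hair strand sits in shadow.
    float probe_radiance = 0.8f;  // result of the probe fetch
    float ao             = 0.25f; // strand is mostly occluded by the head/other hair
    float n_dot_v        = 0.5f;
    float roughness      = 0.4f;

    float without_so = probe_radiance;  // light leaks in: the "glowing" look
    float with_so    = probe_radiance * specular_occlusion(ao, n_dot_v, roughness);

    std::printf("indirect specular without SO: %.3f\n", without_so);
    std::printf("indirect specular with SO:    %.3f\n", with_so);
}
```

Raytraced reflections largely sidestep this because the occlusion comes for free from the rays actually hitting nearby geometry.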
This mirrors what I've seen. When the PS5 drops, it really drops low, like 30fps low (parts in quality mode feel like 15fps). Even if it's probably better on average, the PS5 is just as awful as the XSX in terms of the overall experience.
There are also horrible artifacts.
Guy on Twitter has this more or less locked to 120fps on a 13900K and....... get this............ 8800MHz DDR5!!
For that CPU to still show performance gains at that memory speed shows how memory-bandwidth starved CPUs are in this game, and it's an area where console CPUs could benefit greatly thanks to their memory setup.
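Quick back-of-envelope on the bandwidth side (theoretical peak numbers only; it ignores latency and the fact that a console CPU shares that pool with the GPU):

```cpp
// Peak memory bandwidth comparison: high-end desktop DDR5 vs PS5's unified GDDR6.
#include <cstdio>

int main() {
    // Dual-channel DDR5: (MT/s) * 8 bytes per channel * 2 channels.
    double ddr5_8800 = 8800e6 * 8 * 2 / 1e9;   // ~140.8 GB/s
    double ddr5_6000 = 6000e6 * 8 * 2 / 1e9;   // ~96.0 GB/s (a more typical kit)

    // PS5: 256-bit GDDR6 at 14 Gbps per pin, shared between CPU and GPU.
    double ps5_unified = 14e9 * 256 / 8 / 1e9;  // 448 GB/s

    std::printf("DDR5-8800 dual channel: %.1f GB/s\n", ddr5_8800);
    std::printf("DDR5-6000 dual channel: %.1f GB/s\n", ddr5_6000);
    std::printf("PS5 unified GDDR6:      %.1f GB/s\n", ps5_unified);
}
```

Whether the console CPUs can actually exploit that raw bandwidth is another question, since GDDR6 latency is worse and the GPU eats most of it.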
Wait, what? Devs have been suffering with crunch since the PS1 days; this is absolutely nothing new at all. I think it was Kaz who said they were sleeping on beds in the office while working on GT1. While the scope of games has increased, the quality of tools has also drastically improved. Studios have ballooned in size to unbelievable heights to chase a scope that they chose for themselves.

I think devs are being crunched to death, overworked, and overruled by publishers who want games out faster, while the complexity and scale of games isn't slowing down and instead vastly increases with every title.
It's definitely a problem. But I won't fault the human labor for being "unskilled"; that's a bad take.
The devs of Jedi Survivor chose the scope of their game. The consumers didn’t.
That doesn't seem very obvious to me? CPU utilization is probably pretty bad on consoles as well.
I would imagine it still has stutters. Bringing down a 250 ms stutter, or even an 80 ms stutter, to 8 ms requires an order of magnitude or more of processing power.
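Rough numbers behind the "order of magnitude" point, assuming the hitch is CPU-bound work that scales linearly with single-thread speed (a big assumption):

```cpp
// Speedup needed to squeeze a hitch into a 120 fps frame budget (~8.3 ms).
#include <cstdio>

int main() {
    double budget_ms = 1000.0 / 120.0;  // ~8.3 ms per frame at 120 fps
    double hitches_ms[] = {80.0, 250.0};
    for (double h : hitches_ms)
        std::printf("%.0f ms hitch -> roughly %.0fx more CPU throughput needed\n",
                    h, h / budget_ms);
}
```

That works out to roughly 10x for an 80 ms hitch and 30x for a 250 ms one, which is why no amount of faster hardware alone fully hides these.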
Lol.
Steve has chosen this hill to make his last stand for some reason.
I wish he had shown the Koboh performance because that's where you get the worst stutters. But still impressive.
He’s not making an incorrect claim, I guess.
Ooof, yeah. Still plenty of big-ass stutters though. And as mentioned, that's not usually the area where you see the biggest CPU bottlenecks regardless.
He's taken these kinds of potshots at DF before. Pretty juvenile.
He is though.
Like Alex said, every CPU will get these stutters, and yes, that includes the latest X3D chips. They'll be less prominent on some systems than others, but they do occur, especially when you actually benchmark an area outside of the first 30 minutes.