Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
Jedi was designed to run at 30 FPS. This performance mode is a joke. "Next Gen" console and then upscaling from ~1080p with FSR performance and reduced quality.
I think console gaming has to go back to simpler times.
Arbitrary resolution counting doesn't really say anything about what generation the hw is, as long as the resulting image looks passable and/or it allows for a stable fps. It's no longer even next gen, it's current gen. You're acting like DLSS, TSR, FSR and XeSS are not vital to the gaming experience now. They are, and that applies to all hw.

It's silly to rag only on consoles when every platform has issues. Just as it would be silly for me to rag on a PC optimization issue when the console versions turn out ok.

Let's all come together and improve gaming rather than platform war between console and pc
 
Sigh... okay, to try and bring this back to something resembling a technical discussion, let me explain why this kind of misreporting of technical details does indeed have the potential to impact people's purchasing decisions.

Scenario 1
  1. Reviewer claims that a 4090/7900XTX is unable to maintain 60fps in a game, while in reality it is bottlenecked by, let's say, a 5950X to a 50fps average, and without that bottleneck the GPU could comfortably average 120fps.
  2. Consumer who owns (for example) a 3060 along with a 5600X concludes that since their GPU is less than 1/3 as fast as the GPU that cannot hit 60fps, the game will be unplayable on their system.
  3. Consumer decides not to purchase the game on those grounds.
  4. Consumer has been misinformed, because in fact their GPU was quite capable of hitting playable frame rates in the game, while their CPU, being only roughly 10%-15% slower for gaming, would still have been in the 40fps range, enough for a solid 30fps lock (or higher with VRR).
Scenario 2
  1. Reviewer claims that a 4090/7900XTX is barely able to maintain 60fps in a game, while in reality it is bottlenecked by, let's say, a 7800X3D, and without that bottleneck the GPU could easily exceed 100fps.
  2. Consumer who owns (for example) a 3080 along with a 3700X concludes that since their GPU is only around half as fast as the GPU that is just about hitting 60fps, they need to upgrade their GPU to something faster.
  3. Consumer throws $1200 down on a 4080, buys the game and loads it up.
  4. Consumer is massively bottlenecked by their 3700X to around 30fps because they were misinformed by the reviewer. If they had spent 1/4 of that $1200 on a new CPU (5800X3D) instead of the GPU, they could have achieved the same ~80% of the reviewer's system's performance that they expected to get from their GPU upgrade. Instead, they spent 4x as much and got no performance increase at all over their older GPU.
Obviously given the variety of hardware combinations out there it would be possible to come up with hundreds of different variations on the above scenarios that all lead to the consumer making a poor choice because they were misinformed.
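The arithmetic behind both scenarios reduces to the frame rate being capped by the slower of the CPU-bound and GPU-bound rates. A minimal sketch, using hypothetical numbers in the spirit of Scenario 2 (none of these are real benchmark figures):

```python
# Illustrative sketch: delivered fps is capped by the slower of the
# CPU-bound and GPU-bound rates. All figures below are made-up examples.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Each frame must be both simulated (CPU) and rendered (GPU),
    so throughput is limited by whichever side is slower."""
    return min(cpu_fps, gpu_fps)

# Reviewer's rig: fast GPU, but CPU-bound at 60fps.
reviewer = delivered_fps(cpu_fps=60, gpu_fps=100)

# Consumer's $1200 GPU upgrade: the old CPU still caps the result.
gpu_upgrade = delivered_fps(cpu_fps=30, gpu_fps=110)

# Cheaper CPU swap instead: gets most of the way to the reviewer's result.
cpu_upgrade = delivered_fps(cpu_fps=55, gpu_fps=55)

print(reviewer, gpu_upgrade, cpu_upgrade)  # 60 30 55
```

The `min()` is of course a simplification (it ignores frame pacing, VRR and mixed bottlenecks within a scene), but it is enough to show why quoting a GPU-limited-sounding number from a CPU-limited test misleads buyers.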

Let's be honest, if we boil this right down to basics, you're arguing that reviewers putting out false/misleading information doesn't matter. Are you really willing to die on that hill?
Another fantastic straw man. That has never been my argument. My argument has been that in the context of the discussion occurring here on b3d, your point is pedantic as the audience here already know it. I’ve been in support of you correcting misinformation at the source.

Finally, with regard to your hypothetical scenarios, they're just that… hypothetical. Even if we were to take them at face value, the article in question would not be the sole source of contention for a title surrounded by overwhelming negativity. Anyone going to a niche site like pcgamesn is likely already an enthusiast and would come across other information on sites they frequent. Anyway, let's just bury this, man, it's pointless.
 
The write-up says the performance mode is reconstructed from between 972p and 1080p up to 1440p. Is the internal res actually even lower?

It does look shockingly bad at times in performance mode, with loads of aliasing and shimmering, and loads of what I assume are FSR artifacts when you move the camera.
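For reference, the internal resolution implied by each FSR 2 preset can be derived from the per-axis scale factors AMD documents (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x). A quick sketch for a 1440p output target:

```python
# FSR 2 preset scale factors (per axis), as documented by AMD.
FSR2_SCALE = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def internal_res(out_w: int, out_h: int, preset: str) -> tuple:
    """Internal render resolution for a given output size and preset."""
    s = FSR2_SCALE[preset]
    return round(out_w / s), round(out_h / s)

for preset in FSR2_SCALE:
    print(preset, internal_res(2560, 1440, preset))
# Quality maps to 960p and Performance to 720p at a 1440p target, so a
# reported 972p-1080p internal res sits between the Quality preset and
# native, i.e. a dynamic scale rather than the "Performance" preset.
```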
This is a joke right? The devs can’t be that bad… I don’t know if it’s just me, but it appears that there’s a generational shift going on in the games industry in terms of talent. This might not be correct, but I get the impression that the talent from the ps360 era is starting to retire or go off to find new challenges. And to me, the new talent is not remotely up to scratch. Saints Row, Fallen Order, Gotham Knights, etc. make up a list of many technically unimpressive efforts that are unnecessarily demanding.
 

Seems reasonably balanced from what I watched, although a little light on the settings comparison. He does again slightly over-emphasise the PC's "split memory pools" as the cause of the performance issues, but he also clearly states that the PC version is lacking in optimisation vs the consoles and that this will hopefully be resolved through patches.

One interesting thing I took away from the video was the recommendation for 30fps on both consoles and PC (or his PC at least), due to the inconsistent performance mode. If we consider 30fps viable in this title, then most half-decent PC CPUs should be up to the task, leaving more powerful GPUs free to ramp up the core graphics and image quality.

Another fantastic straw man. That has never been my argument. My argument has been that in the context of the discussion occurring here on b3d, your point is pedantic as the audience here already know it. I’ve been in support of you correcting misinformation at the source.

Lol, so you acknowledge that there was nothing actually incorrect or wrong with what I said; you simply dislike the fact that I said it because you deem it to be unnecessary. As such, you're suggesting that people shouldn't be allowed to comment on this forum about the technical merits of content published elsewhere on the web, even if those comments are accurate? Maybe you should post that as a suggestion in the site feedback thread… best of luck!
 
If you are happy with 30fps then both consoles seem ok.

I've tried both consoles and the performance mode is, for me personally, unplayable on both.

The XSX version chugs constantly and the PS5 version feels better on average but drops out of VRR range fairly often giving really noticeable stutters.

If they can get the PS5 version to stick within the VRR range then it's definitely closer to being ok.

If you don't have a VRR TV don't touch performance mode on either console as it will be a horrible experience.

I don't think there is enough mention in this video about just how bad the IQ is in performance mode and how many FSR artifacts are obvious.
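The VRR dropouts described above can be made concrete: once the instantaneous frame rate falls below the display's minimum refresh (commonly around 48 Hz on TVs without low-framerate compensation), frames are presented outside the variable window and read as stutter. A rough sketch, assuming a hypothetical 48-120 Hz window:

```python
# Sketch: flag frames whose instantaneous rate leaves a hypothetical
# 48-120 Hz VRR window; those are the frames that read as stutter.
VRR_MIN_HZ, VRR_MAX_HZ = 48.0, 120.0

def outside_vrr(frametimes_ms: list) -> list:
    """Return indices of frames presented outside the VRR window."""
    out = []
    for i, ft in enumerate(frametimes_ms):
        hz = 1000.0 / ft  # instantaneous refresh implied by this frame
        if not (VRR_MIN_HZ <= hz <= VRR_MAX_HZ):
            out.append(i)
    return out

# Made-up trace: 25 ms (40 fps) and 33.3 ms (30 fps) fall below the floor.
trace = [16.7, 17.1, 25.0, 16.9, 33.3]
print(outside_vrr(trace))  # [2, 4]
```

This is why a mode that averages near 60fps can still feel worse than a locked 30: it only takes occasional excursions below the window floor to produce visible judder on a VRR display.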
 
This is a joke right? The devs can’t be that bad… I don’t know if it’s just me, but it appears that there’s a generational shift going on in the games industry in terms of talent. This might not be correct, but I get the impression that the talent from the ps360 era is starting to retire or go off to find new challenges. And to me, the new talent is not remotely up to scratch. Saints Row, Fallen Order, Gotham Knights, etc. make up a list of many technically unimpressive efforts that are unnecessarily demanding.
I think devs are being crunched to death, overworked, and overruled by publishers who want games out faster, while the complexity and scale of games isn't slowing down and is instead vastly increasing with every title.

It's definitely a problem. But I won't fault the human labor for being "unskilled"; that's a bad take.
 
I think devs are being crunched to death, overworked, and overruled by publishers who want games out faster, while the complexity and scale of games isn't slowing down and is instead vastly increasing with every title.

It's definitely a problem. But I won't fault the human labor for being "unskilled"; that's a bad take.

I think that's true, but also in this case EA asked Respawn if they needed more time. They declined.
 
I think that's true, but also in this case EA asked Respawn if they needed more time. They declined.
The problem with this is that who knows what the "consequences" are internally for Respawn not making this date or financial quarter. The first time I heard compensation was being withheld over arbitrary Metacritic review scores after the fact, I knew there was tons of shit underneath the counter that we weren't seeing in how devs are pressured by publishers.

Of course, if it was just a matter of Respawn overestimating themselves, that's fine. But it's very hard to believe they couldn't have known about the issues inherent in the game's SKUs, rather than it being a mindset of pushing it out and fixing post launch.
 
I think devs are being crunched to death, overworked, and overruled by publishers who want games out faster, while the complexity and scale of games isn't slowing down and is instead vastly increasing with every title.

It's definitely a problem. But I won't fault the human labor for being "unskilled"; that's a bad take.
I don’t think all the blame can be put on the publishers when there are developers who manage to ship technically competent games in reasonable time frames. There is a huge range in skill/talent in any complex endeavor, so why would people think coding is any different? It’s no coincidence that the best games always come from the same small pool of developers.
 
The problem with this is that who knows what the "consequences" are internally for Respawn not making this date or financial quarter. The first time I heard compensation was being withheld over arbitrary Metacritic review scores after the fact, I knew there was tons of shit underneath the counter that we weren't seeing in how devs are pressured by publishers.

Of course, if it was just a matter of Respawn overestimating themselves, that's fine. But it's very hard to believe they couldn't have known about the issues inherent in the game's SKUs, rather than it being a mindset of pushing it out and fixing post launch.

Yeah, they could have said no to extra time out of arrogance or fear, or maybe they actually didn’t understand how bad things were due to a lack of transparency internally. In my experience, the people who actually know what’s going on aren’t the people making these calls, and people often don’t tell their bosses the whole truth. Either way, I don’t understand why the Metacritic score is so high.
 
Lol the user rating is 1.4!
User scores are very sus a lot of the time (see Burning Shores).

I think it's a matter of reviewers not being as technically inclined, playing the game on the "most optimal" SKU at launch, and so on. Unless the issues actually make the game unplayable to that degree, fps drops like on PS5 aren't going to affect the evaluation of the game itself.
 
User scores are very sus a lot of the time (see Burning Shores).

I think it's a matter of reviewers not being as technically inclined, playing the game on the "most optimal" SKU at launch, and so on. Unless the issues actually make the game unplayable to that degree, fps drops like on PS5 aren't going to affect the evaluation of the game itself.

Yeah, user scores on Metacritic are mostly useless. I suspect the positive review scores for PC are often copied from console reviews, as it wouldn’t make sense for a reviewer to publish platform-specific scores.
 

Nice video. Those are some of the worst SSR artifacts I’ve ever seen. It looks like it’s missing on-screen objects too, which is super weird. I appreciate the coverage of all the various platforms and settings, though it was a bit disorganized. The random comments about PCs having two memory pools being a problem were also unnecessary and probably wrong. The real problem is that UE4 isn’t scaling up to take advantage of fast, wide CPUs, but the poor CPU and GPU utilization is never mentioned.

In short the game was obviously designed to run on consoles @ 30fps. That combination seems to do very well.
 
That doesn't seem very obvious to me? CPU utilization is probably pretty bad on consoles as well.
Yes, at that low resolution the performance mode is very likely CPU limited on consoles too. It's probably because of the RT reflections, which are surprisingly also present by default in that mode.
 
The game has weird glowy orange hair in lit scenes on Series S and, usually, on non-ray-traced cards. What's up with that?

I was happily playing but now I'm at a location where my hair gets orange glow whenever I move. Ray tracing somehow fixes it. Any ideas?
This is my personal guess.
The hair shader gets its main lighting contribution from the indirect specular, i.e. environment reflections.
Without ray tracing, this simply fetches from sparsely placed reflection probes, meaning a low spatial resolution (plainer appearance) and light leaking due to reduced or absent specular occlusion, making it feel like the hair is “glowing” or, in other words, “unshaded”.
Tbh this issue exists in the prior entry as well, and many UE4 games kinda suffer from it (Hogwarts Legacy and Shin Megami Tensei V being the two closest I can recall).
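A toy numeric illustration of the leaking described above (this is not engine code; the scalar occlusion factor is a made-up stand-in for a real specular-occlusion term): without attenuation, a warm-tinted probe contributes full-strength indirect specular even on shadowed hair, which reads as an orange glow.

```python
# Toy sketch: indirect specular from a reflection probe, with and
# without a specular-occlusion term. The occlusion value is hypothetical.

def indirect_specular(probe_rgb: tuple, occlusion: float = 1.0) -> tuple:
    """Attenuate the probe fetch by a [0,1] occlusion factor."""
    return tuple(c * occlusion for c in probe_rgb)

warm_probe = (1.0, 0.6, 0.2)  # orange-tinted environment probe

# No occlusion term: the shadowed hair still gets the probe's full
# orange energy, appearing "unshaded"/glowing.
unoccluded = indirect_specular(warm_probe)

# With an occlusion term, the leaked reflection is strongly damped.
occluded = indirect_specular(warm_probe, occlusion=0.2)

print(unoccluded, occluded)
```

Ray tracing sidesteps this because the reflection ray itself is blocked by nearby geometry, which is consistent with RT "fixing" the glow in-game.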
 