Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Thanks for the new thread, this year will be a good one hopefully. I look forward to Hogwarts Legacy and then hopefully Starfield towards the end...
 
I feel that this is the most disappointing generation
Life takes over. We have fewer major companies making games, games take more time to make, production shortages affect hardware supply (which affects devs), and COVID destroyed plans. It's all there.
 
Nah it had graphical artifacts that shouldn't have been there. Think it got fixed already though.
Fixed as in turned off with virtually no performance loss (about 5% according to NXGamer). Which reminds me of my algorithms prof's favorite mantra: the only good GOTO is a dead GOTO. :yep2:
 
Bwahaha, PC max settings appear to be missing shadows in the distance.

Probably because the BVH doesn't extend very far so the RT shadows don't either.
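To illustrate what I mean, here's a rough sketch of the usual pattern (my own illustration, not this game's actual code): instances past a fixed radius get culled before the TLAS is rebuilt each frame, so rays never see distant geometry and its RT shadows simply aren't there. The helper name and radius value are made up.

```cpp
// Minimal sketch (my illustration, not the game's code): many engines cull
// instances beyond a fixed radius before (re)building the ray-tracing TLAS
// each frame. Anything outside that radius is simply not in the BVH, so rays
// can never hit it and RT shadows for distant geometry disappear.
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

struct Instance {
    Vec3  worldPos;        // instance origin in world space
    float boundingRadius;  // rough bounds used for the culling test
};

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Hypothetical helper: pick which instances go into this frame's TLAS.
static std::vector<const Instance*> GatherRayTracingInstances(
    const std::vector<Instance>& scene, const Vec3& cameraPos, float tlasRadius) {
    std::vector<const Instance*> included;
    for (const Instance& inst : scene) {
        // Keep only instances whose bounds touch the culling sphere.
        if (Distance(inst.worldPos, cameraPos) - inst.boundingRadius <= tlasRadius)
            included.push_back(&inst);
    }
    return included;  // everything else is invisible to rays this frame
}

int main() {
    const std::vector<Instance> scene = {
        {{  10.f, 0.f,   20.f}, 2.f},   // near the camera: gets RT shadows
        {{ 900.f, 0.f, 1200.f}, 5.f},   // distant hillside: culled, no RT shadow
    };
    const auto rtSet = GatherRayTracingInstances(scene, {0.f, 0.f, 0.f}, 300.f);
    std::printf("%zu of %zu instances end up in the TLAS\n", rtSet.size(), scene.size());
    return 0;
}
```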

The materials also look like they were designed with SS reflections in mind. The RT full reflections just end up looking like a mirror, missing all the detail the artists put into the materials, or very obviously flicker and "boil" even at max settings.
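For what I mean by "mirror": here's a generic sketch (not this game's shader) of the shortcut that produces the look, i.e. tracing a single perfect-mirror ray and ignoring the material's roughness entirely.

```cpp
// Generic sketch, not the game's shader: a single perfect-mirror bounce that
// ignores roughness. Brushed metal, rough stone and varnished wood all return
// the same razor-sharp reflection, which is exactly the "mirror world" look.
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// r = v - 2 (v . n) n : the mirror direction for incoming direction v and normal n.
static Vec3 MirrorReflection(const Vec3& v, const Vec3& n) {
    const float k = 2.0f * Dot(v, n);
    return { v.x - k * n.x, v.y - k * n.y, v.z - k * n.z };
}

int main() {
    // A roughness-aware pass would instead jitter/importance-sample rays around
    // this direction with a spread that grows with roughness (or at least blur
    // the mirror result); tracing only this one ray flattens the material work
    // the artists did with SSR in mind.
    const Vec3 r = MirrorReflection({0.f, -1.f, 0.f}, {0.f, 1.f, 0.f});
    std::printf("mirror ray: (%.1f, %.1f, %.1f)\n", r.x, r.y, r.z);
    return 0;
}
```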

Looks like the best visual quality is actually to turn RT off. I thought there was RT AO but I didn't notice any.

TBH, incorrect material responses and buggy-looking "features" are becoming more and more common. I'd really rather games just run well than have all these "extras" that don't actually look good on PC. E.g. the supposedly "bad" FFVII Remake was basically perfect for me, a 10/10 port. "It just runs well" saves dev time and aggravation too.
 

  • PS5 and Xbox Series X have 5 display modes. The quality mode uses settings similar to High/Ultra on PC with an average resolution of 1800p at 30fps without ray-tracing.
  • Ray-tracing is applied to shadows, reflections (except water) and ambient occlusion, in a hybrid way with conventional lighting. However, RT generates too much noise on some surfaces on both consoles and PC.
  • Ray-Tracing mode decreases drawing distance and texture quality on PS5 and Xbox Series X.
  • Xbox Series S does not have ray-tracing. In this version we can choose between 3 display modes that prioritize graphic quality, framerate, or a balance of the two (the latter only with 120Hz screens).
  • Shorter loading times on PS5.
  • Shadows in Quality (Fidelity) mode have a higher resolution on PS5 compared to Xbox Series X.
  • We can unlock the framerate on consoles (except Xbox Series S).
  • PS5 shows better performance in Ray-Tracing mode, but Xbox Series X framerate is higher in all other modes.
  • Quality mode increases texture quality, draw distance, shadows, vegetation and lighting on consoles.
  • All versions have stuttering problems inside the castle while advancing through the rooms. There will even be times when we get stuck in a doorway until the other room has finished loading.
  • Xbox Series S has a lower NPC density.
  • Balanced and Performance HFR modes can only be enabled on 120Hz compatible displays.
  • On PC, Ray-Tracing is too demanding for what it delivers. This is another case of a game whose traditional lighting system is the clear basis of development and Ray-Tracing is a superfluous improvement.
  • Pending some patches to correct the annoying stuttering, Hogwarts Legacy does a good job on all platforms, and any version is enjoyable.
 

Test system specs:
CPU: Ryzen 7700X
Cooler: Corsair H150i Elite
Mobo: ROG Strix X670E-a
RAM: 32GB Corsair Vengeance DDR5 6000 CL36
SSD: Samsung 980 Pro
Case: Corsair iCUE 5000T RGB
PSU: Thermaltake 1650W Toughpower GF3
Monitor: LG C1 48 inch OLED
Keyboard: Logitech G915 TKL (tactile)
Mouse: Logitech G305
  • The RTX 4090 cannot maintain 60fps at 4K Ultra/DLSS Performance/RT Ultra (base 1080p resolution). It runs at 45-55fps. GPU usage is around 45-50%.
  • The RTX 4090 at 4K Ultra/RT off managed 80-90fps range but GPU usage was still low at around 50-60%
  • A 3060 Ti on the same system at 1440p Ultra/No RT ran at 45-55fps but with stutters that did not seem to be caused by shader compilation (VRAM, perhaps?). However, GPU usage was consistently at 95%+
  • The 3060 Ti at 1440p Ultra/DLSS Quality (base 961p resolution) ran at 70-80fps but notably, GPU usage dropped to being almost always below 90%
  • The 3060 Ti at 1440p/Medium ran at around 70-80fps but once again, the GPU usage was quite low, often dropping to the low 80s
  • The 3060 Ti at 1080p Medium ran at 85-95fps but the GPU usage dropped even further, often hovering around 70-80%
  • The 6700 XT at 1440p Ultra/No RT ran at 60-70fps, notably, GPU usage was almost maxed out at 100%
  • The 6700 XT at 1440p Ultra/No RT FSR Quality (base 960p resolution) ran at 90-100fps and again, the GPU was almost always maxed out
  • The 6700 XT at 1080p Ultra/ No RT ran at around 80-90fps. GPU usage hovered around 100%
  • The 7900 XTX at 4K/Ultra/RT off managed 70-80fps. 100% GPU usage
  • The 7900 XTX at 4K/Ultra/RT off/FSR Quality (base 1440p resolution) managed 100-110fps. 100% GPU usage
  • The 7900 XTX at 4K/Ultra/Ultra RT could not maintain 30fps, tanking to the mid 20s. 100% GPU usage
  • The 7900 XTX at 4K/Ultra/Low RT managed 35-45fps. 100% GPU usage
  • The 7900 XTX at 4K/Ultra/Low RT/FSR Quality managed 60-70fps. 100% GPU usage
  • The 7900 XTX at 4K/Ultra/Ultra RT/FSR Quality managed 45-55fps. 100% GPU usage

Oof, sorry for the long list of bullet points but the tl;dr is that the 3060 Ti and 4090 struggle to be properly utilized. The 4090 especially has extremely low usage, to the point of being outperformed by a 7900 XTX without RT. The 6700 XT and 7900 XTX exhibit no such problem and consistently outperform their NVIDIA counterparts at similar settings and resolutions.
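To put the utilization numbers in perspective, here's a back-of-the-envelope sketch using midpoints of the ranges above (my own heuristic, not output from any profiling tool): dividing observed fps by GPU utilization gives a rough ceiling on what the GPU could deliver if something else weren't holding it back.

```cpp
// Back-of-the-envelope sketch (my own heuristic, not from any profiling tool):
// if the GPU is the limiter its reported usage sits near 100%; if it's well
// below that, observed_fps / utilization is a rough ceiling for what the GPU
// could deliver, and the gap points at a CPU/driver/engine bottleneck instead.
#include <cstdio>

struct Run {
    const char* label;
    float observedFps;     // midpoint of the reported fps range
    float gpuUtilization;  // 0.0 - 1.0, as reported above
};

int main() {
    const Run runs[] = {
        {"RTX 4090, 4K Ultra, RT off",  85.0f, 0.55f},
        {"3060 Ti, 1080p Medium",       90.0f, 0.75f},
        {"7900 XTX, 4K Ultra, RT off",  75.0f, 1.00f},
    };
    for (const Run& r : runs) {
        const float gpuCeilingFps = r.observedFps / r.gpuUtilization;
        std::printf("%-28s %3.0f fps observed, ~%3.0f fps GPU ceiling -> %s\n",
                    r.label, r.observedFps, gpuCeilingFps,
                    r.gpuUtilization >= 0.95f ? "GPU-bound" : "likely CPU/engine-bound");
    }
    return 0;
}
```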
 

I expected better numbers from the 7900XTX tbh.

This game performs very well on the Intel A770 16GB, even with RT on; it outperforms the 3070. Dunno the reason. The CPU the author of the video is using is an AMD 7600X. VRAM could have something to do with it.

 
I am not going to be covering the game to my knowledge, but I think this is the issue.

Nearly every UE4 game is dependent on single-thread performance; it is an awfully threaded engine by default, and the 7700X is not a single-thread king compared to others.
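To illustrate why that caps fps no matter how many cores the 7700X has, here's a toy per-frame Amdahl's-law calculation; the millisecond figures are invented purely for illustration.

```cpp
// Toy per-frame Amdahl's-law illustration (the millisecond figures are made
// up): if one thread has to do a fixed chunk of serial work every frame, the
// frame rate barely improves no matter how many cores sit next to it.
#include <cstdio>

int main() {
    const float serialMs   = 9.0f;  // hypothetical game/render-thread cost per frame
    const float parallelMs = 6.0f;  // hypothetical work that scales with core count
    for (const int cores : {4, 8, 16}) {
        const float frameMs = serialMs + parallelMs / static_cast<float>(cores);
        std::printf("%2d cores -> %.1f ms/frame (~%.0f fps cap)\n",
                    cores, frameMs, 1000.0f / frameMs);
    }
    return 0;
}
```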
A shame you won't be covering this one. The bottlenecks seem complex, so I was looking forward to your analysis. I also noticed there is nothing on Eurogamer about this release. Will someone else from DF be covering it?

I've been thinking that, but then what puzzles me is that the 6700 XT/7900 XTX are fully loaded whereas the 4090/3060 Ti aren't, at least according to the performance metrics. The 7900 XTX also seems to easily outperform the 4090 as long as you don't turn on RT. A friend of mine has a 5800X3D+4090+32GB of RAM and this game just flies on his computer from what he's told me. 120fps in Hogwarts at Ultra settings/RT off WITHOUT frame generation, which I find hard to believe, but he assures me that's what he's getting. He must have toggled DLSS on. There's no way he's getting that at native 4K.

I'm trying to find a video running a 5800X3D+4090 but cannot find one.

Anyway, here's one comparing 16GB to 32GB.


The frame times on the 16GB are absolutely brutal.

I don't have the game so I cannot test it but I think it's best to simply wait for the Day 1 patch at this point. The performance seems all over the place.
 