Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Some generally weird performance all around: why is the 3060 Ti blowing away the 2070 Super?

That is extra strange; might it be one of the few edge cases where Ampere's doubled FP32 rate vs. Turing actually matters?
Actually, now that I look at it, some of the other results are really strange as well when you check other resolutions:

Like 1440p:
How on earth is an RX 590 beating the 2070 Super?

Also, why is the 5700 XT twice as fast as the 2070 Super at 1440p while the 2070 Super is faster at 1080p, and why does the order flip back again at 4K with the 2070 Super ahead?

Normally, the 5700 XT, 2070 Super, and 1080 Ti are all pretty close in rasterized titles.

It almost looks like some of the 8GB VRAM cards are having things spill over into system memory... but not all of them, and not consistently.
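
If anyone wants to poke at that spillover theory, here's a minimal sketch, entirely my own and nothing to do with Cinebench itself, of the DXGI 1.4 budget query an app can run in-process. Once the local-segment usage is pinned at its budget, the OS starts demoting heaps to the non-local group, i.e. system memory, which is exactly the behaviour suspected above.

// vram_budget.cpp - poll the DXGI video memory budget from inside a process.
// Build (MSVC): cl /EHsc vram_budget.cpp dxgi.lib
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#include <thread>
#include <chrono>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;   // adapter 0 = primary GPU

    ComPtr<IDXGIAdapter3> adapter3;                              // QueryVideoMemoryInfo lives on IDXGIAdapter3
    if (FAILED(adapter.As(&adapter3))) return 1;

    for (;;)
    {
        DXGI_QUERY_VIDEO_MEMORY_INFO local = {}, shared = {};
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &shared);

        // These figures are per calling process, so a check like this belongs inside
        // the app/engine. Local usage stuck at its budget while non-local (system
        // memory) usage climbs means resources are being paged out of VRAM.
        std::printf("VRAM %5.2f / %5.2f GB | shared %5.2f GB\n",
                    local.CurrentUsage / 1e9, local.Budget / 1e9,
                    shared.CurrentUsage / 1e9);

        std::this_thread::sleep_for(std::chrono::seconds(1));
    }
}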

 
They go into the memory requirements here: https://www.maxon.net/en/tech-info-cinebench

  • The higher complexity of the scene also means the memory footprint has increased threefold. Depending on your CPU configuration Cinebench 2024 will need at least 6.5 - 8.5 GB RAM. The minimum memory requirement is therefore set to 16 GB (while macOS can execute the CPU test also on 8 GB machines – with significant influence of paging – Windows usually has several GB assigned to non-purgeable memory which prevents the execution of Cinebench 2024 on 8 GB machines).

  • GPUs need more memory for efficient operation, and as such, they need 8 GB or more of GPU memory. On Apple silicon, this memory is part of the unified memory system, and therefore Redshift GPU can only run on machines with at least 16 GB.

  • NVIDIA GPU with CUDA compute capability 5.0 or higher and 8 GB VRAM, or
  • AMD "Navi" or "Vega" GPU or later with HIP capability and 8 GB VRAM or more (see GPU list below)
 
Quite unexpected, and it does somewhat weaken the argument that Starfield's superior AMD performance is born from it being an Xbox exclusive alone.
That was never the messaging that should have been received, and if people read that Starfield and UE5 were the turning points of AMD toppling Nvidia, that was never the case. What both Starfield and UE5 have in common is that they attempt full lighting and reflection systems entirely in software compute-based shaders, with no hardware RT used at all.

The newest FM does not attempt this, and where it does use such techniques, it does so through RT hardware. Its non-RT lighting and GI system is nowhere near the fidelity of Starfield or Lumen and, without RTAO, would look painfully last-generation.

By all expectations, this title is the run-of-the-mill kind of title that Nvidia cards are designed to smash through.

It was never about AMD performing better than Nvidia; it was about people upset that Nvidia wasn't performing well enough in certain titles (and claiming those titles were poorly optimized), even though those select titles had some very specific things in common. The new Forza Motorsport, while looking better than FM7, honestly just isn't graphically representative of a leap in lighting and reflections.
 
Yep. Clearly we didn't make a big enough issue of this before.

It also looks like Epic is done with it, as their roadmap doesn't mention anything about continuing to improve the situation.

Good to see that Croteam (Talos Principle 2), at least, are actively engaging with the community and requesting logs, even if the demo isn't 'officially' posted (here's the Steam link, by the way, if you want to check it out). I sent them a log, and while they do seem to be performing some shader precompiling, the log shows it is still encountering new PSOs during gameplay (which, admittedly, is probably always going to happen to some extent):

[2023.10.04-22.29.03:829][ 0]LogRHI: Display: Encountered a new graphics PSO: 1262271872
[2023.10.04-22.29.03:843][ 0]LogRHI: Display: Encountered a new graphics PSO: 1867542771
[2023.10.04-22.29.03:880][ 0]LogRHI: Display: Encountered a new graphics PSO: 964093621
[2023.10.04-22.29.03:904][ 0]LogRHI: Display: Encountered a new compute PSO: 2684016535
[2023.10.04-22.29.03:905][ 0]LogRHI: Display: Encountered a new graphics PSO: 3692416332
[2023.10.04-22.29.03:905][ 0]LogRHI: Display: Encountered a new graphics PSO: 2229874854
[2023.10.04-22.29.03:906][ 0]LogRHI: Display: Encountered a new compute PSO: 2585203028
[2023.10.04-22.29.03:919][ 0]LogRHI: Display: Encountered a new graphics PSO: 1524516730
[2023.10.04-22.29.03:984][ 0]LogRHI: Display: Encountered a new graphics PSO: 615818470
[2023.10.04-22.29.04:104][ 0]LogRHI: Display: Encountered a new graphics PSO: 3568324265
[2023.10.04-22.29.04:162][ 0]LogRHI: Display: Encountered a new graphics PSO: 308840599
[2023.10.04-22.29.04:163][ 0]LogRHI: Display: Encountered a new graphics PSO: 764357097
[2023.10.04-22.29.04:170][ 0]LogRHI: Display: Encountered a new graphics PSO: 2031053507
[2023.10.04-22.29.04:201][ 0]LogRHI: Display: Encountered a new graphics PSO: 4006432873
[2023.10.04-22.29.04:201][ 0]LogRHI: Display: Encountered a new graphics PSO: 2735894437
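
For anyone who wants to eyeball their own log the same way, here's a quick throwaway counter, a purely hypothetical helper and nothing from Epic or Croteam, that tallies the distinct PSO hashes from lines like the ones above:

// pso_count.cpp - count distinct "Encountered a new ... PSO" entries in an Unreal log.
// Usage: pso_count <path-to-UE-log.txt>
#include <fstream>
#include <iostream>
#include <string>
#include <unordered_set>

int main(int argc, char** argv)
{
    if (argc < 2) { std::cerr << "usage: pso_count <log file>\n"; return 1; }

    std::ifstream log(argv[1]);
    if (!log) { std::cerr << "could not open " << argv[1] << "\n"; return 1; }

    std::unordered_set<std::string> graphics, compute;
    std::string line;
    while (std::getline(log, line))
    {
        // The hash LogRHI prints is the last token on the line; storing it in a set
        // avoids double counting if the same PSO is reported more than once.
        const std::string hash = line.substr(line.find_last_of(' ') + 1);
        if (line.find("Encountered a new graphics PSO") != std::string::npos)
            graphics.insert(hash);
        else if (line.find("Encountered a new compute PSO") != std::string::npos)
            compute.insert(hash);
    }

    std::cout << "distinct new graphics PSOs: " << graphics.size() << "\n"
              << "distinct new compute PSOs:  " << compute.size() << "\n";
    return 0;
}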

Hopefully they can get it sorted for release.
 
That was never the messaging that should have been received, and if people read that Starfield and UE5 were the turning points of AMD toppling Nvidia, that was never the case. What both Starfield and UE5 have in common is that they attempt full lighting and reflection systems entirely in software compute-based shaders, with no hardware RT used at all.

The newest FM does not attempt this, and where it does use such techniques, it does so through RT hardware. Its non-RT lighting and GI system is nowhere near the fidelity of Starfield or Lumen and, without RTAO, would look painfully last-generation.

By all expectations, this title is the run-of-the-mill kind of title that Nvidia cards are designed to smash through.

It was never about AMD performing better than Nvidia; it was about people upset that Nvidia wasn't performing well enough in certain titles (and claiming those titles were poorly optimized), even though those select titles had some very specific things in common. The new Forza Motorsport, while looking better than FM7, honestly just isn't graphically representative of a leap in lighting and reflections.
Nvidia is much faster with software-based raytracing. Look at every compute solution like OpenCL.

What we're seeing in Starfield and UE5 is an unoptimized mess of different approaches to solving problems that hardware ray tracing already solves in a much more efficient way.
 
So the Robocop developers released a patch and it's fixed the shader compilation stuttering issue.
The Talos Principle 2 developers have communicated with players and are fixing the compilation stuttering in that game too.

The only developers who have yet to even acknowledge the shader compilation issues in their game are the Ghostrunner 2 developers (that I know of).

Hopefully they use this time between the demo and full release to fix the issues so that the launch version is as great as it deserves to be!

Also, kudos to the developers for updating their demos and not just promising to fix the issues in the final release!
 
So the Robocop developers released a patch and it's fixed the shader compilation stuttering issue.

Yep, about 90% are fixed. Still got a handful at the start and there's the occasional traversal stutter, but just a massive improvement overall. The price to pay is about 10 seconds worth of compiling at the title screen.

The Talos Principle 2 developers have communicated with players and are fixing the compilation stuttering in that game too.

The only developers who have yet to even acknowledge the shader compilation issues in their game are the Ghostrunner 2 developers (that I know of).

Yeah, it's not a good look, although GR2 is still on UE4, so perhaps that's part of the reason; UE5 seems to make it far quicker and more obvious to implement a precompiling option, or at least one that actually captures more of the PSOs.
 
Yep, about 90% are fixed. Still got a handful at the start and there's the occasional traversal stutter, but just a massive improvement overall. The price to pay is about 10 seconds worth of compiling at the title screen.



Yeah, it's not a good look, although GR2 is still on UE4, so perhaps that's part of the reason; UE5 seems to make it far quicker and more obvious to implement a precompiling option, or at least one that actually captures more of the PSOs.
I actually didn't experience any shader stuttering, and today, with the updated demo, was my very first time playing the game. I did experience some very slight traversal stuttering, but in my experience it was in corridor rooms with nothing else happening between the big fighting sequences, and it was indeed very slight, not nearly as intrusive as it has been in other games.

Mind you, the game is fairly simplistic. But I will say that I love the damage: destroying walls and the chunks of cement flying off the pillars.

The devs should definitely implement a shader compilation screen so that it doesn't stutter during the intro logo videos, lol. UE should really have something built in to make that easy for developers.
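
For what it's worth, here's a rough sketch of what such a gate could look like using UE's FShaderPipelineCache API. The function and enum names are as I remember them from the UE4/UE5 source, and the surrounding class, flag, and delegate are hypothetical, so treat it as a sketch and check against your engine version: push the cache into fast batch mode behind a "compiling shaders" screen, and only proceed once the precompile queue drains.

// Sketch of a "compiling shaders" gate in a loading GameMode (hypothetical class).
// Assumes the bundled PSO cache is enabled (r.ShaderPipelineCache.Enabled=1) so that
// there is actually a recorded cache queued up to precompile.
#include "ShaderPipelineCache.h"

void AMyLoadingGameMode::BeginShaderWarmup()
{
    // Burn through the recorded PSO cache as fast as possible instead of the
    // default slow background trickle.
    FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Fast);
    bWarmingShaders = true;
}

void AMyLoadingGameMode::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (bWarmingShaders && FShaderPipelineCache::NumPrecompilesRemaining() == 0)
    {
        // Queue drained: drop back to background batching and let the UI move on
        // from the "compiling shaders" screen into the main menu.
        FShaderPipelineCache::SetBatchMode(FShaderPipelineCache::BatchMode::Background);
        bWarmingShaders = false;
        OnShaderWarmupFinished();   // hypothetical notification to the loading screen UI
    }
}

Robocop's "about 10 seconds at the title screen" is presumably exactly this trade: burn through the recorded PSO cache up front instead of letting it trickle in while you play.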

And yeah, I noticed the documentation for UE 5.3 PSO caching is far more detailed than before, reflecting the new features of the engine.

 
That's the point: they aren't going to rewrite all the shaders of a game to benefit Nvidia, and no driver will address it either.
Pardon me for quoting a slightly old post, but Intel just announced a driver that boosted their Starfield performance by huge margins. So driver improvements are indeed happening in this title.

Game performance improvements versus Intel® 31.0.101.4885 software driver for:
  • Starfield (DX12)
    • Up to 117% uplift at 1080p with Ultra settings
    • Up to 149% uplift at 1440p with High settings

 
Pardon me for quoting a slightly old post, but Intel just announced a driver that boosted their Starfield performance by huge margins. So driver improvements are indeed happening in this title.



It was never working with Intel to begin with.

I think it's fair that Intel would see significant improvements. I still don't know what you are expecting with Nvidia performance.
 
Pardon me for quoting a slightly old post, but Intel just announced a driver that boosted their Starfield performance by huge margins. So driver improvements are indeed happening in this title.

Not particularly relevant to their point about Nvidia 'optimizing' their drivers to provide a big boost to Starfield, though. Of course the Intel drivers see a 'boost', because the game was basically unplayable on them before.

Intel didn't discover any magic involving supposed shader 'rewrites'; they're just continuing their long process of maturing their DX11 drivers.
 
It was never working with Intel to begin with.
At 1080p Ultra, Arc GPUs were working, just not optimally. The A770 was underperforming so badly that it was basically equal to an RTX 3050. With the new driver, I am sure it's now up at 3060/4060 levels.


I think it's fair that Intel would see significant improvements. I still don't know what you are expecting with Nvidia performance.
NVIDIA GPUs are severely underperforming in this title as well (just like Intel GPUs were); their power consumption is significantly lower than normal. If Intel can do it, NVIDIA could do it as well.

Intel didn't discover any magic involving supposed shader 'rewrites'; they're just continuing their long process of maturing their DX11 drivers.
Starfield is pure DX12 though, not DX11.
 