Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

Just looked at memory use on my 3060ti using GPU-Z.

At native 1080p max settings with no ray tracing GPU-Z reports the following:

Max VRAM used: 6786MB

Now the same max settings but with ray tracing maxed out (so very high reflections, very high geometric detail, and the object range slider at 10):

Max VRAM used: 6814MB

So ray tracing barely adds anything to VRAM use, and I still had a comfortable amount of headroom on my 8GB 3060ti.

I might test memory use at a higher resolution... be right back.

EDIT: Tested at a fixed 2715x1527, max settings and max RT.

Max VRAM used: 6860MB
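
For anyone who wants to reproduce this without GPU-Z, here's a minimal sketch that polls peak VRAM use through NVML. This is my own standalone example using NVIDIA's public NVML API, not what GPU-Z actually does internally:

```cpp
// Minimal sketch: poll dedicated VRAM usage via NVML and keep the peak,
// similar in spirit to GPU-Z's "Max VRAM used" counter. Assumes an NVIDIA
// GPU with the driver's NVML library available; link against nvml.
#include <nvml.h>
#include <cstdio>
#include <chrono>
#include <thread>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;
    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;

    unsigned long long peakBytes = 0;
    for (;;) {  // Ctrl+C to stop; run it alongside the game
        nvmlMemory_t mem;
        if (nvmlDeviceGetMemoryInfo(dev, &mem) == NVML_SUCCESS && mem.used > peakBytes) {
            peakBytes = mem.used;
            std::printf("Max VRAM used: %lluMB\n", peakBytes / (1024ull * 1024ull));
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(500));
    }
}
```

Note this reports the whole GPU's dedicated usage, not just the game's, so close overlays and browsers for a cleaner number.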
Something seems wrong with your set-up there:
 
Doesn't look like it to me....

ddsss.png


Again, I told you CERTAIN mips load on time, and others late.... and others... never... within the context of scenes playing out.
You are picking another shot. I linked you the one with the main guy that does not load with RT; here's a screenshot for you, but you knew that already.

RT ON
1662901756704.png

RT OFF
1662901683946.png
 
Something seems wrong with your set-up there:
I can say the same thing about your Ryzen 5 3600: why has yours been capped to 3875MHz in some of the footage?

It easily boosts to over 4GHz on the stock cooler in Spiderman, I know that, because mine does.

And that poor RTX2070 in your picture is so CPU limited you could crank the reflection resolution to very high, have higher resolution ray tracing than the PS5 can handle, and still not notice a performance difference.
 
I can say the same thing about your Ryzen 5 3600: why has yours been capped to 3875MHz in some of the footage?

It easily boosts to over 4GHz on the stock cooler in Spiderman, I know that, because mine does.

And that poor RTX2070 in your picture is so CPU limited you could crank the reflection resolution to very high, have higher resolution ray tracing than the PS5 can handle, and still not notice a performance difference.
Ah, so what you are saying is that PCs can differ and your performance is not reflective of the market, or even of identical machines. Great, so all PC tests are invalid then, right?

Plus I am showing you it goes above the 6.7GB you state, so either that soft limit figure is incorrect, or maybe that is another bug too?
 
Ah, so what you are saying is that PCs can differ and your performance is not reflective of the market, or even of identical machines. Great, so all PC tests are invalid then, right?

Plus I am showing you it goes above the 6.7GB you state, so either that soft limit figure is incorrect, or maybe that is another bug too?

I'm saying your Ryzen 3600 doesn't behave and clock like it should. Why?

Are you not concerned that it's not boosting properly, making your frame rate results unreliable?

Here is a picture of my Ryzen 5 3600 with default settings and PBO disabled. Yours should be boosting to the same level as mine, so your 3600 is either broken or you've artificially restricted it.

Spider-Man-2022-09-11-14-28-21-024.png
 
You are picking another shot. I linked you the one with the main guy that does not load with RT; here's a screenshot for you, but you knew that already.
RT ON
View attachment 6907

RT OFF
View attachment 6906

Well that might be because I'm not interested in SOME cherry-picked scenes working properly... either all scenes work properly... or there's a problem. I just posted a picture from your video of the scene directly before the cut to your shot... and the mip isn't loaded. I mean seriously... your screen grab is taken at 27:01, and mine is taken at 26:56. My screen grab is of you running the game without RT... and the texture isn't loaded.

Again, look at what's happening in this thread right now. We have me with a 2080ti 11GB, no RT... and that texture is taking 15 seconds to load in. Another user with 8GB, and it's taking 11 seconds at similar settings. Your 2070 is NOT loading in that texture.

There's something messed up with the game currently. This isn't explained by bigger RAM pools resulting in less pressure, because 3090s are also not loading those textures.
 
I'm saying your Ryzen 3600 doesn't behave and clock like it should. Why?

Are you not concerned that it's not boosting properly, making your frame rate results unreliable?

Here is a picture of my Ryzen 5 3600 with default settings and PBO disabled. Yours should be boosting to the same level as mine, so your 3600 is either broken or you've artificially restricted it.

Spider-Man-2022-09-11-14-28-21-024.png
Are you not concerned your GPU is leaving RAM underused? Or have you artificially limited it?
 
Well that might be because I'm not interested in SOME cherry-picked scenes working properly... either all scenes work properly... or there's a problem. I just posted a picture from your video of the scene directly before the cut to your shot... and the mip isn't loaded. I mean seriously... your screen grab is taken at 27:01, and mine is taken at 26:56. My screen grab is of you running the game without RT... and the texture isn't loaded.

Again, look at what's happening in this thread right now. We have me with a 2080ti 11GB, no RT... and that texture is taking 15 seconds to load in. Another user with 8GB, and it's taking 11 seconds at similar settings. Your 2070 is NOT loading in that texture.

There's something messed up with the game currently. This isn't explained by bigger RAM pools resulting in less pressure, because 3090s are also not loading those textures.
The only one cherry-picking here is you. I am NOT saying issues and bugs do not remain; no software ever released can claim that.

And I even reference the mip loading issue in the video as something that needs to be solved. I also show that it does load when given long enough (not possible in a cutscene).

But I am also showing you direct examples of VRAM usage affecting IQ, which I cover in the video. So, my RTX2070 8GB is insufficient to run this game at the same settings as the PS5; that is a fact, and why the couple of you here are taking it all so personally is beyond me.

These excuses I hear so often: the test is not fair, consoles are like PCs, the game has bugs, it needs to be changed for PC, the architecture is not the same. Hell, even on another forum someone said to me that just because the PS5 is performing faster in that test section does not mean the PS5 performs better??? I genuinely cannot understand this thought process.

I spoke with Nixxes during the review period, gave feedback on areas, and had chats about certain things such as resolution scaling, DLSS use in the mode, and more. Just because I do not quote everything that is ever said or done does not mean more has not been said and done.
 
Are you not concerned your GPU is leaving RAM underused? Or have you artificially limited it?
So you can't address criticism, so you decide deflection is the best tactic?

For someone of your 'expertise' to not realise his CPU was not reaching the clock speeds expected of it, and to still be OK with using that inaccurate data in a comparison video, shows how poor the quality of your data actually is, making your videos useless to anyone outside of Twitter.

And then to not even try to tackle the criticism constructively?

Laughable.
 
So you can't address criticism, so you decide deflection is the best tactic?

For someone of your 'expertise' to not realise his CPU was not reaching the clock speeds expected of it, and to still be OK with using that inaccurate data in a comparison video, shows how poor the quality of your data actually is, making your videos useless to anyone outside of Twitter.

And then to not even try to tackle the criticism constructively?

Laughable.
You never answered any of my questions and have just been moving the goalposts and changing the subject when your argument fails.

Drop the disdain and we can have a genuine discussion. If you look at the video, I hit 4.2GHz when testing the ray tracing and other areas, but when traversing and maxing the CPU out the temps go up, so it throttles at times to 3.9-4GHz. So the 4-8% I may gain when it isn't throttling makes no difference to the GPU tests OR the overall result.
 
You never answered any of my questions and have just been moving the goalposts and changing the subject when your argument fails.

Drop the disdain and we can have a genuine discussion. If you look at the video, I hit 4.2GHz when testing the ray tracing and other areas, but when traversing and maxing the CPU out the temps go up, so it throttles at times to 3.9-4GHz. So the 4-8% I may gain when it isn't throttling makes no difference to the GPU tests OR the overall result.

My data answers your questions, as it even shows VRAM use can punch above 6.7GB.

And Spiderman is so cache- and latency-dependent on these early Ryzens that the performance increase from higher clock speeds scales beyond what the raw clock speed increase suggests.

And in that shot you showed me with the memory use, your 2070 is so CPU limited in that scene that it's downclocked itself by a good chunk.

So why did you not recommend increasing the ray tracing reflection resolution on your 2070 and getting higher quality RT than the PS5 in the video? You're so CPU limited it wouldn't have affected the frame rate; it's essentially a 'free' upgrade.
 
My data answers your questions, as it even shows VRAM use can punch above 6.7GB.

And Spiderman is so cache- and latency-dependent on these early Ryzens that the performance increase from higher clock speeds scales beyond what the raw clock speed increase suggests.

And in that shot you showed me with the memory use, your 2070 is so CPU limited in that scene that it's downclocked itself by a good chunk.

So why did you not recommend increasing the ray tracing reflection resolution on your 2070 and getting higher quality RT than the PS5 in the video? You're so CPU limited it wouldn't have affected the frame rate; it's essentially a 'free' upgrade.
And here we go, missing the point. At the same settings as the PS5, with DRS on, it performs worse than the PS5 does at its DRS 4K; bumping GPU and CPU demands further would only widen that gap.

All I can hear from you is platform wars, and I do not care about any of that. In Fidelity mode the CPU is not the limiting factor in this rig, the GPU is. In other modes, yes, but all of that is clearly detailed in the video.
 
All I can hear from you is platform wars
Please, don't go there; your amazingly ignorant conclusion about 3 million RTX GPUs is the stuff of unprecedented, legendary ignorance. Mixing daily Steam concurrent users with monthly users is a true exercise in mental gymnastics to justify your pathetically gimped choice of hardware: a bad CPU, and a mid-range GPU that is downclocking itself to hell and beyond. Your whole setup screams major bottlenecks, which defeats the purpose of doing any analysis with it.

Technical analysis needs resources; if you lack them, then just stop.
 
@Remij it took 11 seconds for that texture to load when using native 4K on my 3060ti via Nvidia DSR.

This game is broken when it comes to VRAM use. Now, I would expect the game to use noticeably more VRAM at native 4K than at native 1080p.

But look at my numbers:

Max quality settings and max ray tracing settings were used:

  • At native 1080p, max VRAM used was 6814MB
  • At a fixed 2715x1527, max VRAM used was 6860MB
  • At native 2160p, max VRAM used was 7059MB

I would expect to see much larger increases in VRAM use moving through the resolutions than that; the game doesn't seem to want to use more than 6.8-7GB of VRAM on my system regardless of the resolution.

So what I suspect is happening is this: as the resolution increases, the game doesn't want to grow its overall VRAM footprint, so to accommodate the larger frame buffer it has to sacrifice something else. That leaves less VRAM for textures, which results in the lower quality mips we're seeing. The rough numbers in the sketch below show why I'd expect a bigger jump.
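
To put numbers on that suspicion, here's my own back-of-envelope. The ~150 bytes per pixel across all render targets is an assumed figure for a modern deferred renderer, not anything measured from this game:

```cpp
// Back-of-envelope: render-target memory scales with pixel count, so if the
// game let its footprint grow freely, the 1080p -> 2160p jump should be far
// bigger than the ~245MB I measured. 150 B/pixel is an assumption covering
// G-buffer, depth, TAA history, post chain and RT buffers combined.
#include <cstdio>

int main() {
    const double bytesPerPixel = 150.0;  // assumed aggregate, not measured
    const struct { const char* name; int w, h; } res[] = {
        {"1080p", 1920, 1080}, {"2715x1527", 2715, 1527}, {"2160p", 3840, 2160},
    };
    for (const auto& r : res) {
        const double mb = (double)r.w * r.h * bytesPerPixel / (1024.0 * 1024.0);
        std::printf("%-10s ~%4.0fMB of render targets\n", r.name, mb);
    }
    // Prints roughly 297MB, 593MB and 1186MB: a ~0.9GB delta from 1080p to
    // 2160p, yet total VRAM use only grew ~245MB. Something else (textures)
    // must be shrinking to compensate.
}
```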
Tools like MSI Afterburner are mostly useless for telling if you're running out of dedicated VRAM. GPU memory is virtualized nowadays and games will start pushing data into system RAM as dedicated VRAM gets filled. Afterburner only reports the dedicated part unless you configure it otherwise.

Your GPU may only show 6-7GB in use, but you could easily have another 2-3GB residing in system RAM and thrashing over the PCIe bus. If you don't have Afterburner configured to show it, you can check how much shared GPU memory is in use in the GPU tab of Task Manager.
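
If you'd rather query it programmatically, DXGI exposes the same split Task Manager surfaces: per-process usage for the LOCAL (dedicated VRAM) and NON_LOCAL (shared system RAM) segment groups. A minimal sketch of my own (Windows 10+; a game or injected overlay would call this to see its own split, it won't see another process's usage):

```cpp
// Query this process's dedicated vs shared GPU memory usage via DXGI.
// Link against dxgi.lib; Windows 10 or later.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO local{}, nonLocal{};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &nonLocal);

    // If non-local usage is in the gigabytes, the process has spilled past
    // dedicated VRAM into system RAM and is paying the PCIe penalty.
    std::printf("Dedicated VRAM in use: %lluMB\n",
                (unsigned long long)local.CurrentUsage / (1024ull * 1024ull));
    std::printf("Shared system RAM in use: %lluMB\n",
                (unsigned long long)nonLocal.CurrentUsage / (1024ull * 1024ull));
}
```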
 
And here we go, missing the point. At the same settings as the PS5, with DRS on, it performs worse than the PS5 does at its DRS 4K; bumping GPU and CPU demands further would only widen that gap.

All I can hear from you is platform wars, and I do not care about any of that. In Fidelity mode the CPU is not the limiting factor in this rig, the GPU is. In other modes, yes, but all of that is clearly detailed in the video.

All your video does is review your CPU, as you don't have the CPU performance to fairly compare raw PC GPU performance to the PS5.

And the clock speed of your RTX2070 confirms it is massively CPU limited when RT is on.

Just because it's at 99% load doesn't mean it's maxed out. Looking at your picture, your 2070 still has another 400-500MHz worth of clock speed to boost up to, but because your CPU is lacklustre it's downclocked itself.

So you could easily increase the ray tracing reflection resolution above what the PS5 offers and not even notice a performance hit.

Someone who's impartial would have done and recommended just that.
 
The only one cherry-picking here is you. I am NOT saying issues and bugs do not remain; no software ever released can claim that.

And I even reference the mip loading issue in the video as something that needs to be solved. I also show that it does load when given long enough (not possible in a cutscene).

But I am also showing you direct examples of VRAM usage affecting IQ, which I cover in the video. So, my RTX2070 8GB is insufficient to run this game at the same settings as the PS5; that is a fact, and why the couple of you here are taking it all so personally is beyond me.

These excuses I hear so often: the test is not fair, consoles are like PCs, the game has bugs, it needs to be changed for PC, the architecture is not the same. Hell, even on another forum someone said to me that just because the PS5 is performing faster in that test section does not mean the PS5 performs better??? I genuinely cannot understand this thought process.

I spoke with Nixxes during the review period, gave feedback on areas, and had chats about certain things such as resolution scaling, DLSS use in the mode, and more. Just because I do not quote everything that is ever said or done does not mean more has not been said and done.
Sigh... You said disabling RT fixed the issue... and I showed you that it doesn't, using your own work. If you think fixing SOME scenes is OK and not all of them... then that's on you. The point is... the issue persists... and increasing VRAM doesn't fix it... like you've stated multiple times that it will.

And I don't understand why you're trying to tell me that your RTX 2070 8GB is insufficient and that I'm taking it personally... when I'm telling you my 2080ti 11GB is insufficient... The ONLY difference is that I'm blaming it on a bug/code issue (software), and you're blaming it on VRAM (hardware), saying that higher amounts with less pressure will fix it... I disagree.

RTX 3090TI + 12900K... same cutscene


Explain that. Explain how more VRAM is going to help when it's using ~10GB of a 24GB pool of VRAM... and mips still aren't loaded.
 
Sigh... You said disabling RT fixed the issue... and I showed you that it doesn't, using your own work. If you think fixing SOME scenes is OK and not all of them... then that's on you. The point is... the issue persists... and increasing VRAM doesn't fix it... like you've stated multiple times that it will.

And I don't understand why you're trying to tell me that your RTX 2070 8GB is insufficient and that I'm taking it personally... when I'm telling you my 2080ti 11GB is insufficient... The ONLY difference is that I'm blaming it on a bug/code issue (software), and you're blaming it on VRAM (hardware), saying that higher amounts with less pressure will fix it... I disagree.

RTX 3090TI + 12900K... same cutscene


Explain that. Explain how more VRAM is going to help when it's using ~10GB of a 24GB pool of VRAM... and mips still aren't loaded.
I do not need to; I have shown you that turning off RT resolves most of the lower mip issues. It will not FIX them all, but it does help.

You are effectively saying, "you showed me clear examples of VRAM and settings affecting mip quality, but I choose to ignore them".

With your logic, just download more VRAM then. This is a waste of time, as you are not having a discussion at all; you are just putting your fingers in your ears and going la la la.
 
All your video does is review your CPU, as you don't have the CPU performance to fairly compare raw PC GPU performance to the PS5.

And the clock speed of your RTX2070 confirms it is massively CPU limited when RT is on.

Just because it's at 99% load doesn't mean it's maxed out. Looking at your picture, your 2070 still has another 400-500MHz worth of clock speed to boost up to, but because your CPU is lacklustre it's downclocked itself.

So you could easily increase the ray tracing reflection resolution above what the PS5 offers and not even notice a performance hit.

Someone who's impartial would have done and recommended just that.
That is NOT at all how GPU clocks work. Do you understand that the workload being pushed through the GPU can and will affect its power draw, throttling, heat, etc.? The GPU is often between 1950 and 2010MHz, which is perfect for the OC, but depending on the scene demands it will have differing workloads.

Also, if I am CPU bound, how can my frame rate increase when I lower the GPU settings in the Performance RT tests? Surely it would remain static, as the RT demands have not really changed; only the GPU demands have been reduced, so the CPU should still be the limit? Rather than arguing from screenshots, anyone can ask the driver directly why the clock sits where it does, as in the sketch below.
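
Here's a rough sketch using NVML's standard throttle-reason bitmask (NVIDIA GPUs only; my own example, nothing from the video):

```cpp
// Read the current graphics clock and the driver's active throttle reasons.
// Power/thermal caps moving boost clocks per-scene show up here directly;
// a GPU clocking down because the CPU can't feed it shows up too.
#include <nvml.h>
#include <cstdio>

int main() {
    if (nvmlInit() != NVML_SUCCESS) return 1;
    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex(0, &dev) != NVML_SUCCESS) return 1;

    unsigned int clockMHz = 0;
    nvmlDeviceGetClockInfo(dev, NVML_CLOCK_GRAPHICS, &clockMHz);

    unsigned long long reasons = 0;
    nvmlDeviceGetCurrentClocksThrottleReasons(dev, &reasons);

    std::printf("Core clock: %uMHz\n", clockMHz);
    if (reasons & nvmlClocksThrottleReasonSwPowerCap)
        std::printf("Throttling: software power cap (workload-dependent)\n");
    if (reasons & nvmlClocksThrottleReasonHwSlowdown)
        std::printf("Throttling: hardware slowdown (thermal/power brake)\n");
    if (reasons & nvmlClocksThrottleReasonGpuIdle)
        std::printf("Clocks down: GPU idle, i.e. not being fed enough work\n");

    nvmlShutdown();
    return 0;
}
```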
 