Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I do not need to; I have shown you that turning off RT resolves most of the lower mip issues. It will not FIX them all, but it does help.

You are only saying, "you showed me clear examples of VRAM and settings affecting mip quality, but I choose to ignore them".

With your logic, just download more VRAM then. This is a waste of time as you are not having a discussion at all; you are just putting your fingers in your ears with some La La La.
Your logic as to WHY it "helps" them is what's wrong here.... my lord. You said disabling RT reduces pressure on the VRAM and that's why it helps resolve the issue... and what did I say? I asked you: what pressure is there on a 24GB framebuffer?

So if disabling RT, which gains you an extra 1 or 2GB, is MOSTLY resolving the issue... why the hell isn't 2x the VRAM solving it completely?

I'll tell you why.... because it's a bug/code issue which needs to be addressed... not a VRAM issue. When those textures load in... does VRAM spike up suddenly? NO... it doesn't.
 
You are talking about another card, not my test or anything I have said, done or seen.

I can test what I have, with the software and results showing that reducing the VRAM use on this card helps IQ. The evidence is presented and is factual, along with tests to support it. PC can and will have issues; it may be an issue with your config, driver, OS, etc., and nothing to do with the game. I do not know, as you just keep talking about something I have not seen or created.
 
That is NOT at all how GPU clocks work. Do you understand that the workload being pushed through the GPU can and will affect its power throttling, heat, etc.? The GPU is often between 1950 and 2010MHz, which is perfect for the OC, but depending on the scene demands it will have differing levels of workload.
If it's often between those clocks, why is it less than 1600MHz in your screenshot?

Your system is incapable of fairly comparing an RTX 2070 to PS5 in a pure GPU comparison.

You claimed PS5 is performing like a 3070?

My 3060 Ti is less powerful than an RTX 3070, and I can do native 4K/30fps with ray tracing literally on max.

Something PS5's GPU would never be able to do, as it simply isn't powerful enough, so how is it performing like a 3070 exactly?
 
That shot was not in the video; I took it just now at stock clocks. At least reference the video.
 
The evidence is presented and is factual along with tests to support it,
Most of your claims are unfounded and unsupported. You claimed the PS5 is on the 3070/2080Ti level but never showed any facts to support your claim. My 2080Ti/3770K PC runs rings around the PS5: native 4K, max settings, max RT, locked to 30fps, and the GPU is massively underutilized too... explain that.
 
This video sums up the fact that the problem is entirely a VRAM bottleneck:


RTX 3070, native 4K, matched PS5 ray tracing settings: 35-39 FPS
Only bringing textures down to "High" shoots the framerates back up to 56-60 FPS. As I've said countless times, on Steam discussion forums and various other platforms, the issue is that the game has a hard cap at 6.4 GB. NOT total system VRAM usage... the game's specific VRAM usage. It will never go above 6.4 GB. And once it breaches past 6.4 GB, it will go back to 5.2-5.4 GB and use NORMAL RAM instead. You can all see it happen in front of you in the video below.

That heavy RAM-to-VRAM traffic stalls the card and causes a near 50% performance drop (from almost a locked 60 at native 4K to a measly 36-39 FPS performance profile). Watch his "2070" being on par with PS5 at the beginning, then becoming two times slower than PS5. It is true, and it is not his config being wrong; it is simply a huge VRAM bottleneck. He, however, misinterprets the VRAM bottleneck and presents it as a success and achievement on the PS5 front. It is not. The 2070 is quite literally losing upwards of 80% performance in that scene due to the heavy VRAM bottleneck. 8 GB at native 4K in this game alongside ray tracing is not going to cut it. Even Insomniac suggests the "High" preset SPECIFICALLY with ray tracing at 1440p if you have an RTX 3070, meaning they actually suggest 8 GB users use High textures. I can understand the sentiment of needing to match the PS5 settings, but this is simply wrong.
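To put the arithmetic in one place, here's a minimal sketch assuming the ~80% per-game allocation cap described above and the ~9.6 GB working set mentioned later in this post; both numbers are claims from this thread, not anything confirmed by Insomniac:

```python
# Rough sketch of the VRAM arithmetic claimed in this thread.
# Assumption (from the posts here, not from Insomniac): the game limits its
# own allocation to ~80% of the card's VRAM and spills the rest into system RAM.
CAP_FRACTION = 0.80
NEEDED_GB = 9.6  # rough native-4K + RT + Very High textures figure quoted below

def per_game_cap(card_vram_gb: float) -> float:
    """VRAM the game will actually use on a given card, under the 80% assumption."""
    return card_vram_gb * CAP_FRACTION

def spill_to_ram(card_vram_gb: float, needed_gb: float = NEEDED_GB) -> float:
    """Texture data that would have to live in system RAM instead."""
    return max(0.0, needed_gb - per_game_cap(card_vram_gb))

for card in (8, 10, 12, 24):
    print(f"{card} GB card -> cap {per_game_cap(card):.1f} GB, "
          f"spill {spill_to_ram(card):.1f} GB")
```

Under those assumptions, an 8 GB card tops out at the 6.4 GB figure above, a 10 GB card at 8 GB, and 12 GB is the first size that clears a ~9.6 GB working set without spilling into system RAM.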

Here's your problem, "nx gamer". You try to match PS5 with a 2700X, saying it is an equivalent CPU. Then you go on ranting with a GPU that does not match PS5's VRAM budget. That is your problem, if you're so keen on having equivalent parameters. To match PS5 in this game specifically with Very High textures, we would need a 12 GB RTX 2070 Super variant, which does not exist. The 3060 is lagging behind due to having worse rasterization, and the 3080 is overshooting what PS5 is capable of.

While your "2070" drops to the 20-25 framerate region, the 3060 gets native 4K and a solid 35+ framerate with healthy frametimes. PS5 is not "overachieving"; it performs like it should. The 3060 also performs like it should. 8 GB cards simply start underperforming at 4K.


PCGH even found, with extensive testing, that even the 3080 drops frames (since it is capped to 8 GB of VRAM utilization) compared to a 2080Ti:

1440p RT benchmarks;
T5gwoas.jpg

4K RT benchmarks.

y2hpl71.jpg



Hardware Unboxed did all their tests with the High preset, meaning High textures, meaning all of their tests were done without the specific VRAM pressure that VH textures cause.

All of this could've been avoided if NVIDIA had not been stingy and had given the 3070 10 GB and the 3080 12 GB from the get-go. But here we are. Thanks to NVIDIA, people like NXG will now have an endless amount of fuel going into the next gen, trying to match console texture settings at 4K and discovering that 8 GB is not enough budget to cover that.

Even then, an 8 GB budget is barely enough for 1440p, let alone 2160p. As the other user also discovered, the game needs upwards of 9.6 GB at native 4K. You can actually SEE SO in the 3060 video I've linked; it is clearly using 9.6 GB of VRAM. The 3070/2070 simply do not have that much VRAM, and falling back on RAM causes performance problems.

Of course the 80% VRAM allocation cap they're using is not helping.
 
my RTX2070 8GB is insufficient to run this game at the same settings as PS5

My 3070m 8GB at 115 watts (130W max) laptop with a 5800H outperforms the PS5 version, and it should; its GPU and CPU are more capable. Yeah, it's a newer Zen 3 CPU, but an RTX 2070 dGPU should perform fine. It's more capable than the PS5 in the grand scheme when considering ray tracing, and about on par in raw raster performance.
A 3070m is close to a 3060 Ti (non-OC).

Edit: look at you, all over the map on here. Immature and claiming users are platform warring. You take shots at DF from time to time. I've never seen anyone from team DF behaving like this on any forum or comments section.

You're way off base, dude, as unconstructive as can be, and probably close to being moderated.

Oh, and this davis user is far from platform warring; he's actually been against PC many times before. The facts are so blatantly true that even PS users have to point them out.
 
Most of your claims are unfounded and unsupported. You claimed the PS5 is on the 3070/2080Ti level but never showed any facts to support your claim. My 2080Ti/3770K PC runs rings around the PS5: native 4K, max settings, max RT, locked to 30fps, and the GPU is massively underutilized too... explain that.

What his video needs to say is

"PS5 performs like a 3070 if said 3070 is paired with a 4-year-old CPU"
 
Here is the CapFrameX overlay section. The "memory" and "memory + cache" entries are the game's system RAM usage. The GPU memory sections are specifically labeled with "GPU memory" in front of them. And for this game, per-game VRAM usage will never, ever breach past 6.4 GB with an 8 GB GPU. It will never breach past 8 GB with a 10 GB GPU. Even if you have the cleanest background possible, the game will refuse to touch the final 20% of your GPU memory. As a matter of fact, once it reaches that 80% mark, it will go back to 5.3-5.4 GB and start to use more RAM as a result.
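If anyone wants to sanity-check the per-game vs. total distinction outside of CapFrameX, a rough sketch with the nvidia-ml-py (pynvml) bindings is below; whether per-process figures are reported at all depends on your driver/OS, so treat it as a starting point rather than a drop-in tool:

```python
# Minimal sketch: log per-process VRAM vs. total board VRAM on an NVIDIA GPU.
# Requires the nvidia-ml-py package (import name: pynvml). Per-process numbers
# may be unavailable on some driver/OS combinations.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Total board VRAM used: {mem.used / 2**30:.2f} GiB of {mem.total / 2**30:.2f} GiB")

# Per-process usage: this is the number the game keeps pinned below its cap,
# regardless of what the rest of the system has allocated.
for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
    used_gib = (proc.usedGpuMemory or 0) / 2**30
    print(f"PID {proc.pid}: {used_gib:.2f} GiB")

pynvml.nvmlShutdown()
```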
 

Attachments

  • system1.jpg (273.9 KB)
What his video needs to say is

"PS5 performs like a 3070 if said 3070 is paired with a 4-year-old CPU"
PS5 also underperforms compared to a 2080Ti at 4K with Very High textures. PCGH also made this discovery. I, too, proved it with a video. He just uses this problematic issue to his own whims. But still, you're just further fueling his ego by playing into his game. Not having enough VRAM at 4K with VH textures really degrades the performance a lot in this game, to the point the 3070 loses 50% of its potential performance:


Naturally, losing nearly 50% performance for both the 2070 and 3070 will make them look weird compared to PS5. It is not PS5 overachieving; it is the 2070 and 3070 underperforming due to enormous VRAM stress.
 
Ok, but I need to know the DRS scaling target framerate they used.

NXG's video is showing PS5 unlocked, so I'd suggest no DRS at all and no framerate cap. If you measure at native 4K, and at 1512p (which John at DF says it can drop to in Fidelity mode on PS5), then it would potentially give an upper and lower performance bound. I.e. if at 4K the 2080Ti is ever faster, then it's definitely faster than the PS5 at those points. Similarly, if at 1512p the PS5 is ever faster, then it's definitely faster at those points. We learn nothing, however, from those points where the 2080Ti is slower at 4K (because PS5 might be running at a lower DRS level) or from those points where the 2080Ti is faster at 1512p (because the PS5 may be running at a higher resolution).
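Just to spell out that bounding logic, here's a small sketch under the assumptions above (PC captured twice at fixed 2160p and 1512p, PS5 unlocked somewhere in between); the function and the sample numbers are made up for illustration:

```python
# Bounding argument: the PS5's DRS resolution at any instant is unknown,
# but it lies between 1512p and 2160p, so only two kinds of claims are safe.
def classify_scene(pc_fps_2160p: float, pc_fps_1512p: float, ps5_fps: float) -> str:
    if pc_fps_2160p > ps5_fps:
        # PC wins even at the PS5's maximum possible resolution.
        return "PC definitely faster"
    if ps5_fps > pc_fps_1512p:
        # PS5 wins even against the PC at the PS5's minimum possible resolution.
        return "PS5 definitely faster"
    # PS5's actual DRS level is unknown, so nothing can be concluded here.
    return "inconclusive"

# Hypothetical per-scene samples: (PC @ 2160p, PC @ 1512p, PS5 unlocked).
scenes = [(42.0, 61.0, 40.0), (35.0, 50.0, 55.0), (38.0, 58.0, 45.0)]
for fps_4k, fps_1512, fps_ps5 in scenes:
    print(classify_scene(fps_4k, fps_1512, fps_ps5))
```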
 
You are talking about another card, not my test or anything I have said, done or seen.

I can test what I have, with the software and results showing that reducing the VRAM use on this card helps IQ. The evidence is presented and is factual, along with tests to support it. PC can and will have issues; it may be an issue with your config, driver, OS, etc., and nothing to do with the game. I do not know, as you just keep talking about something I have not seen or created.
I'm talking about a card far more powerful in every respect than your card. You're failing to give me any reasonable reason why this issue would persist in such a setup, and not yours.

The evidence points to VRAM not being the issue... because we can throw far more VRAM at the issue... It's verifiable across multiple configurations.

So if you want to shift the blame to potential issues with my config, driver, OS, etc., etc... I counter that I can post MANY setups with varying configs, drivers, and OSes... illustrating this issue... all with more VRAM than your 2070.

The fact is that my sample size of GPUs with far more VRAM having this issue regardless of the config... is larger than your ONE sample... and thus more conclusive of the issue being a bug/coding issue rather than VRAM.
 
Finally, that texture streaming bug happens regardless of the VRAM you have.

It also happens on the 3090 of mkiceandfire (I did some research; it shows he has a 3090. I couldn't quite confirm it, but considering the video is super smooth, I assume he has a beastly rig).



This is, at this point, clearly a bug, or just misconfigured texture streaming when RT is enabled. Simple. as. that.

He will probably argue that PS5 is using special-sauce IO stuff to upload textures faster to memory, but if those systems do not exist on PC, they could simply use the PS4 fallback where they have loading screens. These textures usually get loaded in after 5-6 seconds, so a 5-6 second loading screen would be more than enough to fully get textures in for these cutscenes.

It is not an isolated case either; it will happen in every cutscene, even while you're swinging.

Z8hlADx.jpg


VqKuNt4.jpg
 
Exactly. I can post countless configurations of GPUs with far more VRAM than a 2070, which show this issue.
 
Another "false" deduction in this shot:
KIDifBH.jpg


"mipmaps are very low in place"

No, it is not. It is a LOD bias bug that happens on all systems, regardless of it being a 750 Ti or a 3090: when the game is set to 1080p, it fails to load proper LODs for Otto's sweater, EVEN if you use Very High textures, so it also has nothing to do with texture quality, game quality, the 750 Ti, or streaming.

It is a bug. I reported it 6 times, they wanted dxdiag reports, did nothing, and the bug still exists. Of course, since you rushed the video, you will never be able to make these deductions; instead, you will just yap about "haha low mipmaps go brr" just to present yourself as this "technical expert". Even this small example alone proves that most of the time you're actually talking about things you don't understand, or things that you haven't actually analysed or dissected. You shouldn't make bold claims such as "it loads low-level mipmaps, see, 750ti haha" when you don't know the cause, or when you don't even ponder the possibility of it being a bug. Because it is, actually, a bug.

Full texture quality, 1080p, waited 30 seconds to make sure it is not "streaming" lag. It is not; it is a MIP BIAS bug. It is not a defect or problem that the card is having. Quite literally, universally, that sweater at 1080p will not load properly on any kind of system, be it a 750 Ti or an RTX 3090.

WaKykhB.png

That sweater is so messed up that it cannot even load the PS4-equivalent sweater, even at 4K with DLSS Balanced.

BCdAKhg.png


Comically, pushing 1871 DLSS Balanced also brings it back to "completely gone" territory:


FkwQBLu.png


When does it get fixed properly, you might ask? At native 1440p, or resolutions higher than 1440p internally. That's it.

nv1bu40.png


As you can see, it is indeed a bug. Do your research next time instead of rushing to "deductions" and making irrelevant "deductions" about MIPMAPS not being able to load with a 750 Ti at medium settings. Yeah, it has nothing to do with that. And there's nothing, literally nothing, you can counter in what I've presented above.
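For anyone wondering why the internal resolution decides which mip gets sampled at all, here is a rough sketch of the textbook LOD calculation (screen-space footprint plus bias); this is generic sampler math with invented numbers, not the game's streaming code:

```python
# Textbook mip selection, roughly what a GPU sampler does:
#   lod = log2(texels covered per pixel) + lod_bias, clamped to the mip chain.
# Purely illustrative: shows why internal resolution changes which mip is used.
import math

def mip_level(texels_across: float, pixels_across: float,
              lod_bias: float = 0.0, max_mip: int = 12) -> int:
    texels_per_pixel = texels_across / pixels_across        # screen-space footprint
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return int(min(max(lod, 0.0), max_mip))

# Hypothetical sweater patch: 2048 texels across, covering ~400 px at 1080p
# and ~533 px at 1440p (same framing, higher internal resolution).
print("1080p, no bias:", mip_level(2048, 400))              # -> 2 (coarser mip)
print("1440p, no bias:", mip_level(2048, 533))              # -> 1 (one step finer)
print("1080p, +1 bias:", mip_level(2048, 400, lod_bias=1))  # -> 3 (a stray positive bias makes it worse)
```

So if something in the pipeline applies an extra positive bias, or computes the footprint against the wrong resolution, every card will sample the same too-coarse mip at 1080p, which is consistent with the screenshots above.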
 
So, what I said then, glad we agree.

The fact is, if you have more VRAM with RT (as on my RX 6800), it does not happen. Changing code, memory allocation pools, etc. will help/resolve it. But many of you here also state consoles are just PCs; if so, why would changes need to be made to the source code?
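For what it's worth, here is a toy sketch of what a "memory allocation pool" change usually looks like in a streamer: a fixed budget with mip demotion when the pool overflows. It is a generic illustration with made-up names and sizes, not Insomniac's code:

```python
# Toy texture-streaming pool: a fixed VRAM budget, with the lowest-priority
# textures dropped to coarser mips when the pool is over budget.
from dataclasses import dataclass

@dataclass
class StreamedTexture:
    name: str
    mip0_bytes: int      # size of the full-resolution mip
    resident_mip: int    # 0 = full res; each +1 quarters the footprint
    priority: float      # e.g. screen-space coverage; higher = keep sharper

    @property
    def resident_bytes(self) -> int:
        return self.mip0_bytes >> (2 * self.resident_mip)

def fit_to_budget(textures: list[StreamedTexture], budget_bytes: int) -> None:
    """Drop mips on the lowest-priority textures until the pool fits the budget."""
    while sum(t.resident_bytes for t in textures) > budget_bytes:
        candidates = [t for t in textures if t.resident_mip < 4]
        if not candidates:
            break  # nothing left to demote; the budget is simply too small
        victim = min(candidates, key=lambda t: t.priority)
        victim.resident_mip += 1

# Example: a ~10 GiB working set squeezed into a 6.4 GB pool.
working_set = [StreamedTexture(f"tex{i}", 64 << 20, 0, priority=i) for i in range(160)]
fit_to_budget(working_set, int(6.4 * 2**30))
print("textures demoted under a 6.4 GB pool:", sum(t.resident_mip > 0 for t in working_set))
```

The point of the sketch is only that the pool size and the eviction policy are code-side choices, which is why a patch can change the behaviour without any hardware changing.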

If you're accepting that this is a bug, as opposed to some fundamental weakness/inefficiency in the PC architecture like your video implies, then yes, we're in full agreement.

The problem is the way you frame things. This is clearly a bug in the most classical sense and something the developers can and likely will fix now that you've drawn attention to it. However, in the video your framing implies that it's a more fundamental hardware weakness by suggesting that a top-of-the-line CPU would be needed to improve the situation. You then use that to make a pretty disingenuous argument that you need a $500 CPU to equal the PS5. No one buys top-of-the-range hardware to try and brute-force their way past software bugs. It's just a bug in the PC version that's not present in the PS5 version, similar I'd suggest to how there is a missing reflection in Spiderman's eye in that opening cutscene on the PS5. Obviously it should be considered in a comparison of which platform gives the better experience, that's totally fair, but bugs shouldn't be used to try and compare the relative capabilities of two platforms.
 
Looks much sharper/better on the (at the time) low-end 750 Ti GPU (medium settings) vs the PS4.
Actually, the PS4 one looks more natural. The lighting is changed between scenes; the PS5 version of the game uses different lighting for cutscenes. The 750 Ti pic looks sharper because most likely default sharpening is in effect. The PS4/Pro version never used any sharpening, but the PC version has sharpening enabled by default (it was forced, but now you can turn it off).

I don't like the direction they went with the PS5 version. That's a personal preference and off topic, however. I actually thought the PC version was being wonky, until I first decided to check how it looks on PS5, and my suspicions were confirmed. Especially his hair looks much better on base PS4, whereas the PS5 remastered version looks a bit wonkier, honestly.

35tninu.png

His face and his ears appear more natural in the PS4 version, but that's just me (right one is PS5, left one is PS4). I also disliked the color profile shift they did.
 