Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

Talking about asset density, I remember being impressed by Kessen 2 putting up to 500 characters on screen at once on PS2, and now we get things like this.

Remember Demon Chaos on PS2?
65k characters in a 60 fps game.
Pretty sure the amount was limited by the memory of VU1 rather than by processing power.
 
NX Gamer was the only one comparing the unlocked framerate modes on PS5 to PC, so I think we can give him credit for that. I didn't know Spider-Man Remastered had unlocked FPS modes too, so now that I've found out, I'm pretty disappointed Digital Foundry did not test this and compare them to PC.
The reason why we at DF do not do that is because the way it is currently done is inaccurate. It relies on using a product from Elgato which only logs frame-rates (not even frame-times) every 16.6 milliseconds... even though VRR would be every 8.3 ms on a 120 Hz screen. It is not accurate in terms of the frame-times and frame-rates measured AND it is also a visual mess, as it triple buffers the output into the container... making it completely unsmooth.
 
The reason why we at DF do not do that is because the way it is currently done is inaccurate. It relies on using a product from Elgato which only logs frame-rates (not even frame-times) every 16.6 milliseconds... even though VRR would be every 8.3 ms on a 120 Hz screen. It is not accurate in terms of the frame-times and frame-rates measured AND it is also a visual mess, as it triple buffers the output into the container... making it completely unsmooth.

So Digital Foundry have highlighted an issue which would give them inaccurate or poor data to present and have done the right thing and just avoided it.

So why can't NXG do the same? He must need dem clicks bad.
 
NX Gamer was the only one comparing the unlocked framerate modes on PS5 to PC, so I think we can give him credit for that. I didn't know Spider-Man Remastered had unlocked FPS modes too, so now that I've found out, I'm pretty disappointed Digital Foundry did not test this and compare them to PC.

However, there were multiple VRAM bottlenecks on the 2070, especially in the VRR fidelity cutscenes, so the performance comparison from NX Gamer was pretty flawed, especially because the game limits itself to 6.7 GB VRAM on 8 GB cards for some reason.

The performance difference in RT Performance Mode is likely due to how the port works and how CPU limited it is. I already knew a more powerful CPU compared to the PS5 is required for the same performance, but I didn't anticipate the performance difference being that drastic. It's really important to run your tests at unlocked framerates on PC and console alike to get a true grasp of how well hardware and software performs.
The game would still get heavily VRAM-bottlenecked even without that limitation. It clearly requires 9.5-10 GB of VRAM to match PS5's fidelity mode at 4K.
Even 4K + DLSS Quality still needs upwards of 9 GB of VRAM. PS5 always uses 4K LODs, even when temporal upscaling from lower internal resolutions. The key here is that it always upsamples to 4K, which has its own VRAM cost due to the aforementioned LOD behaviour. This is another topic, but I honestly believe internal 1200p upscaled to 4K with 4K LODs looks better than plain native 1440p.
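For anyone wondering why upsampling to 4K carries near-4K texture costs, here is a minimal sketch of the standard negative mip/LOD bias formula that temporal upscalers recommend; the assumption that this particular title follows the common DLSS/FSR guidance is mine, not something confirmed by the game.

```python
import math

def upscaler_mip_bias(render_height: int, output_height: int) -> float:
    """Recommended texture LOD bias when upscaling: log2(render / output).
    A negative bias pushes sampling toward the sharper, output-resolution mips,
    so those higher mips still have to be resident in VRAM."""
    return math.log2(render_height / output_height)

print(upscaler_mip_bias(1080, 2160))  # -1.0 for 4K + DLSS Performance
print(upscaler_mip_bias(1270, 2160))  # ~ -0.77 for a "1200p-class" internal res
```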

By the way, it's not 6.7 GB, it is 6.4 GB. Worse, once it hits 6.4 GB, it retracts back to 5.3 GB and starts using 3.5 GB of system RAM instead, so yeah, a fix for that would be welcome.

A 3070 would probably do fine at native 1440p without a problem if this got a small fix (at least cap the VRAM at something like 7.4 GB). However, 4K and 4K with upscaling would still cause problems. Even Nixxes themselves suggest High textures for 8 GB cards, so I'm not surprised.

High textures... is that a solution? To a point, yes. They look okay. At times you really do get ugly textures, but overall it looks fine. However, yeah, it is not quite a PS5 match then.

I really don't think 8 GB cards will have the VRAM budget to match PS5's texture quality in next-gen games at 4K with upscaling. Native 1440p and 4K with upscaling are different beasts with different VRAM requirements.

Example: I can play Cyberpunk at native 1440p with RT enabled for hours without a performance drop on my 3070. VRAM usually hovers around 7.1-7.2 GB.
Playing at 4K with DLSS Performance means an internal 1080p (see the quick calculation below), and personally I find that 4K + DLSS Performance:
1) looks much, much better than native 1440p
2) costs almost as much as 1440p
3) uses 7.7-7.9 GB of VRAM
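The internal resolutions behind those modes are worth spelling out. Here is a quick worked example using the standard DLSS 2.x per-axis scale factors; the VRAM remark in the last comment is the observation above, not a general rule.

```python
def dlss_internal(output_w: int, output_h: int, scale: float) -> tuple[int, int]:
    """Standard DLSS 2.x per-axis scale factors: Quality ~0.667, Balanced ~0.58,
    Performance 0.5, Ultra Performance ~0.333."""
    return round(output_w * scale), round(output_h * scale)

print(dlss_internal(3840, 2160, 0.5))    # (1920, 1080) -> ~2.07 MP internal
print(2560 * 1440)                       # 3,686,400 -> native 1440p is ~3.69 MP
# So 4K + DLSS Performance rasterizes ~44% fewer pixels than native 1440p,
# yet (per the numbers above) its VRAM use ends up higher because of the 4K-level mips.
```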

After playing for half an hour at 4K DLSS Performance, the card once again buckled, dropping into the 20s, and power consumption went down too, signalling that the compute cores were idling and stalling while waiting for VRAM to catch up.

I even confirmed it to be the case: once I set textures to Medium, my framerate in the same region went from the 20s to the 40s. It had tanked by exactly half, just like it does in Spider-Man.

8 GB was already stingy, but it is genuinely problematic in 4K scenarios. Not even upscaling reduces the VRAM requirements enough to save you, in the case of Spider-Man and Cyberpunk.

In the case of Cyberpunk, one can argue that performance already does not warrant the use of 4K with upscaling.
In the case of Spider-Man, it is actually brutal: at 4K/DLSS Quality with High textures I can literally get upwards of 70+ fps with ray tracing. It is amazing; the game is actually light on RTX GPUs with ray tracing. Even at native 4K, nearly 60 fps is possible. The problem arises when you simply want to use Very High textures, which drops me from that sweet 70+ fps average to the 40-45 range, which is simply dreadful. Funnily enough, the card is so powerful for this game that even losing half the performance still nets you a playable experience.

So PS5 has that advantage going for it. 4K with upscaling became a meme, and people often criticise the PS5 for dropping to the 1200s/1300s, but I don't think that conveys the whole story. I've looked at temporal upscaling, DLSS and FSR 2.0 benchmarks, and clearly 1080p-1200p upscaled to 4K costs almost as much as native 1440p rendering while objectively looking better. Why 4K with DLSS 2.0/FSR 2.0 Performance beats the native 1440p presentation in most games, however, is beyond me. There's something odd going on with modern engines where they use higher-quality assets the higher your output resolution is.


Here's a prime example: Halo Infinite only has basic temporal upscaling, yet 4K at 50% resolution scale murders native 1080p in terms of image clarity.

I would prefer 4K/DLSS Performance over native 1440p any day, but sadly 8 GB of VRAM is starting to stop me from doing that. Up until now it worked fine, but I guess it is time for me to kiss 4K goodbye.
 
The reason why we at DF do not do that is because the way it is currently done is inaccurate. It relies on using a product from Elgato which only logs frame-rates (not even frame-times) every 16.6 milliseconds... even though VRR would be every 8.3 ms on a 120 Hz screen. It is not accurate in terms of the frame-times and frame-rates measured AND it is also a visual mess, as it triple buffers the output into the container... making it completely unsmooth.
Just to clarify, this is your workflow/tool issue, not mine.

To be clear (and I covered this in my Elgato review), my process for capturing VRR modes:
- DOES NOT use the Elgato log capture method at all.
- DOES use the direct capture footage, but I have integrated it within my capture tool directly.
- The direct footage captured is then run through my analysis to create my frame-rate graphs and stats.
- Which means the process is identical to all my other captures and frame-rate analysis; as I wrote my own software for this, I can do that quickly and integrate it.

As such, my results are 100% accurate, as with any FPS measurement and device I test.

Again, all capture cards buffer to some degree, and the output is not "unsmooth": capture a 120 Hz game and you get 120 Hz. Any drops do show up, as they would, within the 8.3 ms maximum refresh cycle.
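For readers unfamiliar with this kind of pipeline, here is a minimal sketch of the general duplicate-frame technique for deriving frame-times from fixed-rate captured footage. It is purely illustrative of the approach, not NX Gamer's or DF's actual tooling; the use of OpenCV and the difference threshold are my own assumptions.

```python
import cv2  # assumed dependency; any frame-accurate video reader works
import numpy as np

def frame_times_from_capture(path: str, diff_threshold: float = 1.0):
    """Derive per-frame render times from a fixed-rate capture by counting
    how many captured frames each unique game frame persists for."""
    cap = cv2.VideoCapture(path)
    capture_interval_ms = 1000.0 / cap.get(cv2.CAP_PROP_FPS)  # e.g. ~8.33 ms at 120 Hz

    frame_times, held_for, prev = [], 1, None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if prev is not None:
            if np.mean(cv2.absdiff(gray, prev)) < diff_threshold:
                held_for += 1  # duplicate: the same game frame is still on screen
            else:
                frame_times.append(held_for * capture_interval_ms)
                held_for = 1
        prev = gray
    cap.release()
    return frame_times  # convert to instantaneous FPS with 1000 / t per frame
```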
 
Indeed, this is a Beyond3D technical discussion; random internet content has no value here. Please don't post things you know to be crap and useless - you're just generating OT noise. Only post low-grade links if they are accompanied by a high-grade question about a meaningful point raised.
Seriously, you are calling me low-grade; at least try to hide your bias.

You are rude and condescending. To moderate, you do not need to wear a cap the way you are, or to gatekeep discussion in a technical forum that is driven by actual data points, facts and results.

I am shocked at your display here; sadly, I was already warned this would happen.
 
NX Gamer was the only one comparing the unlocked framerate modes on PS5 to PC, so I think we can give him credit for that. I didn't know Spider-Man Remastered had unlocked FPS modes too, so now that I've found out, I'm pretty disappointed Digital Foundry did not test this and compare them to PC.

However, there were multiple VRAM bottlenecks on the 2070, especially in the VRR fidelity cutscenes, so the performance comparison from NX Gamer was pretty flawed, especially because the game limits itself to 6.7 GB VRAM on 8 GB cards for some reason.

The performance difference in RT Performance Mode is likely due to how the port works and how CPU limited it is. I already knew a more powerful CPU compared to the PS5 is required for the same performance, but I didn't anticipate the performance difference being that drastic. It's really important to run your tests at unlocked framerates on PC and console alike to get a true grasp of how well hardware and software performs.
Again, it is NOT. The test is for the RTX 2070 at PS5 settings. I cannot, for the life of me, fathom how the collective here are just saying the test is flawed due to VRAM. The card ONLY has 8 GB.

By this logic, any tests on PC with a 12900K CPU are equally "flawed", as the consoles do not have such a CPU and thus could be CPU-bound where the PC would not be.

This test is for people with that machine spec and this game running on it, end of. The results are valid, repeatable and reflective of the real world. If the 2070 had 12 GB of VRAM it would do better, but if the PS5 had more VRAM it might too; they do not, so where is the logic?

Some basic facts which are (intentionally) being missed here:

- The results now are how this card and others (as evidenced here by many of you) will perform and display mips; thus if you play on PC you get this experience and if you play on PS5 you get this experience - Fact 1
- Changes have come and will continue to come that may help reduce this and remove it on certain cards (I even state that in the video); that does not change the results as they are now and how everyone will play - Fact 2
- Reducing the VRAM usage can and does help reduce the mip issues, due to the fact that the allocated pool of RAM can then store the higher mips; shown in the video and repeatable - Fact 3
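To put a rough number on Fact 3, here is a minimal sketch of standard mip-chain memory maths, showing why the top mip levels dominate a texture's footprint; the 4K BC7 texture is an illustrative assumption, not a figure from the game.

```python
def mip_chain_bytes(width: int, height: int, bytes_per_texel: float) -> list[float]:
    """Memory of each mip level from the full-resolution mip down to 1x1."""
    sizes = [width * height * bytes_per_texel]
    while width > 1 or height > 1:
        width, height = max(width // 2, 1), max(height // 2, 1)
        sizes.append(width * height * bytes_per_texel)
    return sizes

# Example: a 4096x4096 BC7-compressed texture (1 byte per texel).
chain = mip_chain_bytes(4096, 4096, 1.0)
print(f"top mip share: {chain[0] / sum(chain):.0%}")  # ~75% of the whole chain
# Dropping just the top mip (roughly what a lower texture setting does) frees
# about three quarters of that texture's VRAM for the streaming pool.
```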

I am dizzy from all the non-technical discussion, hoop jumping, merry-go-rounds, gatekeeping and attempts to undermine me here, with no success or credence. We have gone from VRAM, to the CPU affecting GPU-bound scenarios, to the game being a poor port, to "wait till they fix it", to "the test is not fair". I am very confused and did not expect this sort of behaviour and lack of logic on such a forum.
 
Again, it is NOT. The test is for the RTX 2070 at PS5 settings. I cannot, for the life of me, fathom how the collective here are just saying the test is flawed due to VRAM. The card ONLY has 8 GB.

By this logic, any tests on PC with a 12900K CPU are equally "flawed", as the consoles do not have such a CPU and thus could be CPU-bound where the PC would not be.

This test is for people with that machine spec and this game running on it, end of. The results are valid, repeatable and reflective of the real world. If the 2070 had 12 GB of VRAM it would do better, but if the PS5 had more VRAM it might too; they do not, so where is the logic?

Some basic facts which are (intentionally) being missed here:

- The results now are how this card and others (as evidenced here by many of you) will perform and display mips; thus if you play on PC you get this experience and if you play on PS5 you get this experience - Fact 1
- Changes have come and will continue to come that may help reduce this and remove it on certain cards (I even state that in the video); that does not change the results as they are now and how everyone will play - Fact 2
- Reducing the VRAM usage can and does help reduce the mip issues, due to the fact that the allocated pool of RAM can then store the higher mips; shown in the video and repeatable - Fact 3

I am dizzy from all the non-technical discussion, hoop jumping, merry-go-rounds, gatekeeping and attempts to undermine me here, with no success or credence. We have gone from VRAM, to the CPU affecting GPU-bound scenarios, to the game being a poor port, to "wait till they fix it", to "the test is not fair". I am very confused and did not expect this sort of behaviour and lack of logic on such a forum.
All of that is true and I understand your position.

The test is not flawed because you did it wrong (assuming you were indeed using the close-to-PS5-equivalent settings from DF in the respective modes), but because the game is programmed in such a way that it does not use the full VRAM of a PC GPU. Thus, the 2070 is acting like a 6 GB GPU, which is obviously not enough for 4K mip level biases. Based on what I am hearing, I can't believe people call this an extraordinary port with all these glaring issues.

Thankfully, Alex has already reported that to Nixxes, so maybe they will patch it soon. And if the 2070 is still running into VRAM bottlenecks with that fix, then the game needs more than 8 GB at these settings and the PS5 has an edge.

Someone here measured how much VRAM the game needs, but AFAIK at max settings instead of optimized PS5 settings, and I think it was around 9 GB (which is totally reasonable and what I would expect). It would be nice if that person could update the measurement with PS5 settings at 4K.
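If anyone wants to repeat that measurement, here is a minimal sketch of one way to log GPU memory in use over time with NVIDIA's NVML Python bindings (pynvml). Note it reports total VRAM in use on the device, not the game's allocation alone, and the polling interval is an arbitrary choice.

```python
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

def log_vram(seconds: int = 60, interval: float = 1.0, gpu_index: int = 0):
    """Poll total VRAM in use on one GPU; run alongside the game to catch peaks."""
    nvmlInit()
    handle = nvmlDeviceGetHandleByIndex(gpu_index)
    try:
        for _ in range(int(seconds / interval)):
            info = nvmlDeviceGetMemoryInfo(handle)
            print(f"{info.used / 2**30:.2f} GiB used of {info.total / 2**30:.2f} GiB")
            time.sleep(interval)
    finally:
        nvmlShutdown()

if __name__ == "__main__":
    log_vram()
```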
 
All of that is true and I understand your position.

The test is not flawed because you did it wrong (assuming you were indeed using the close-to-PS5-equivalent settings from DF in the respective modes), but because the game is programmed in such a way that it does not use the full VRAM of a PC GPU. Thus, the 2070 is acting like a 6 GB GPU, which is obviously not enough for 4K mip level biases.

Thankfully, Alex has already reported that to Nixxes, so maybe they will patch it soon. And if the 2070 is still running into VRAM bottlenecks with that fix, then the game needs more than 8 GB at these settings and the PS5 has an edge. Based on what I am hearing, I can't believe people call this a good port with all these glaring issues.

Someone here measured how much VRAM the game needs, but AFAIK at max settings instead of optimized PS5 settings, and it was around 9 GB? It would be nice if that person could update the measurement with PS5 settings at 4K.
The problem happening on a 3090 is enough proof that the mip issue is streaming lag that would happen universally on every platform with RT enabled.

Also, I don't think Nixxes will fix it or do anything about it. They have already made up their minds about that: they practically suggest using High textures for 8 GB cards when RT is enabled in their recommendation chart. So I'm fairly sure it's not even a bug (the VRAM cap thing is) and that it is intentional. And by that virtue, I think 8 GB users, me included, can practically kiss Very High textures goodbye at 4K.

The problem is also happening at 1440p, but only after 20+ minutes of playtime. And they suggest using High textures for 1440p as well.

[Image: Nixxes' settings recommendation chart]


So yeah, it won't be getting a fix. Practically, they think that neither the 3080 nor the 3070 should be able to handle their Very High textures alongside ray tracing. The emphasis on the "High" preset is important here: High puts the textures to High, which practically solves the problem for all VRAM-constrained cards (bar the 6950 XT, 6900 XT and 6800 XT; you guys rock, you can go all out!).

I will also touch on the aspect of High textures, especially High textures + High ray tracing, which Nixxes recommends, and how it looks in certain cases, in an upcoming post. :D
 
All of that is true and I understand your position.

The test is not flawed because you did it wrong (assuming you were indeed using the close-to-PS5-equivalent settings from DF in the respective modes), but because the game is programmed in such a way that it does not use the full VRAM of a PC GPU. Thus, the 2070 is acting like a 6 GB GPU, which is obviously not enough for 4K mip level biases.

Thankfully, Alex has already reported that to Nixxes, so maybe they will patch it soon. And if the 2070 is still running into VRAM bottlenecks with that fix, then the game needs more than 8 GB at these settings and the PS5 has an edge. Based on what I am hearing, I can't believe people call this a good port with all these glaring issues.

Someone here measured how much VRAM the game needs, but AFAIK at max settings instead of optimized PS5 settings, and it was around 9 GB? It would be nice if that person could update the measurement with PS5 settings at 4K.
Thank you for a genuine reply, with manners and good points.

Yes, I agree, the game does have memory allocation issues on PC and I call that out. This is a result of the architecture differences between the console and PC. I also raised a good number of bugs in a mini triage I sent to Nixxes during my review period. Some are already fixed, such as the RT quality, the missing SSR, and the texture issue on AMD cards when DRS is active. To be clear, I am not saying I had any involvement in that, as they almost certainly had them on their own triage/bug list, but I tried to help (as I always do) when reviewing titles.

The thing is, all PC-to-console tests will never be 100% exact. DRS makes that practically impossible, but settings do too, especially with the buffers used, particle systems and even driver tweaks on PC that alter things. But these are a guide, like all technical reviews, and indeed reviews in general. I see it from all sides (here I am getting the PC platform warriors), but I get it equally when I say it about the PS5, Series X, Series S, etc. I am not involved in any of that; it has no bearing on my view or methodology. I simply want to be as accurate and exhaustive as I can, and the numbers fall as they will. With a title tailored for consoles, these kinds of areas are not a surprise; we saw even worse scenarios with Arkham Knight at launch on PC, which were related to the same limitation in texture pools and PCIe sharing of data.

Once Nixxes release a patch to improve the memory allocation and data streaming to better suit the PC, I will certainly be covering it.
 
So, this is the upcoming post.

So, we know that the "High" ray tracing geometry setting uses some kind of texture-based reflection, as DF showed. Instead of reflecting the geometry directly, the High RT geometry setting uses some kind of texture map for the reflection.

This is where it gets messy. If you go by the recommendation and use the High preset, High textures and the High RT geometry setting, you get very unusual reflections, even compared to PS5.

If High textures did not have such weird oddities all around, I wouldn't be making a big fuss out of it. That's why I said High textures are a solution but not a solution at the same time. Let's dig into my findings:

This is how a run-of-the-mill building reflection looks on PS5 (taken from a YouTube video for general comparison):

[Screenshot: building reflection on PS5, from a YouTube capture]


This is how these reflections look at PS5-equivalent settings.

Nothing abnormal here; it practically matches the PS5. Everything is good, if you have enough VRAM for it, that is.

Very High textures + High geometry:

[Screenshot: PC, Very High textures + High RT geometry]


Here is the brutal part: Nixxes' recommendation of the High preset, and by extension High textures and High geometry.
There are now probably tens of thousands of RTX gamers who play with the High preset and High ray tracing, and their experience is hugely inferior to what they would get on PS5.

Why does this happen? My theory aligns with DF's: they say the High geometry setting uses texture maps for reflections. So using High texture quality, which at times is a huge downgrade compared to Very High texture quality (I will demonstrate this in an extra note below), downgrades the reflection textures even further, producing these abnormally ugly texture-mapped reflections.

High textures + High ray tracing (as stated, the official Nixxes recommendation for 1440p RTX 3070 users):

[Screenshot: PC, High textures + High RT geometry]



The situation becomes even funnier when you use Low textures but Very High geometry.

Now you have worse textures than PS5, but better reflections than PS5! It's just a funny mechanism at work here.

[Screenshot: PC, Low textures + Very High RT geometry]



To get proper reflections with "High" geometry, you HAVE TO use Very High textures.
With High textures, you HAVE TO use Very High geometry.

You may wonder how I made this discovery. I didn't actively look for it. After experiencing tons of VRAM bottlenecks, I finally decided to play with High textures and otherwise PS5-equivalent settings. Then I came to this location and saw those horribly ugly reflections.

I thought to myself: "Jeez, PS5 cannot look this bad. This cannot be real." I quickly looked up the same mission on a PS5 online; to my surprise, it did not look this bad.
Then I upped the geometry and, voila, it was fixed. But I knew that PS5 did not use Very High geometry; if it did, it would undermine the entire video DF made about the PC port.
Then I played around with the texture setting... and finally pinpointed the issue, which culminated in the results you've seen above.
I finally gave up on my 4K/DLSS Performance/High texture dream and went down to 1440p with Very High textures + High geometry.

As a final addendum, I will add High and Very High texture screenshots of the reflected windows across the street.

[Screenshot: High textures, reflected windows across the street]


[Screenshot: Very High textures, reflected windows across the street]


As you can see, High textures do not look that bad; as a matter of fact, the two look nearly identical. Yet they produce enormously different reflections with the High geometry setting.
 
By this logic, any tests on PC with a 12900K CPU are equally "flawed", as the consoles do not have such a CPU and thus could be CPU-bound where the PC would not be.
You were comparing a PC GPU to PS5.

In order to compare that PC GPU to PS5 fairly and impartially you needed to ensure the GPU was the one and only bottleneck in your PC.

You failed to do that, and as a result your results are FACTUALLY incorrect.

An RTX 2070 with a 12900K would have the CPU bottleneck with RT enabled completely removed and thus would perform measurably better than your 'testing' has shown it to.

If you don't have the required hardware to do the test properly and still insist on doing it then expect to get called out.

You want to sort your mess out?

Then why don't you put a Tweet out right now and advise people the RTX2070 you used in the testing was CPU limited on your CPU with ray tracing enabled and will perform better on a newer CPU.
 
You were comparing a PC GPU to PS5.

In order to compare that PC GPU to PS5 fairly and impartially you needed to ensure the GPU was the one and only bottleneck in your PC.

You failed to do that, and as a result your results are FACTUALLY incorrect.

An RTX 2070 with a 12900K would have the CPU bottleneck with RT enabled completely removed and thus would perform measurably better than your 'testing' has shown it to.

If you don't have the required hardware to do the test properly and still insist on doing it then expect to get called out.

You want to sort your mess out?

Then why don't you put a Tweet out right now and advise people the RTX2070 you used in the testing was CPU limited on your CPU with ray tracing enabled and will perform better on a newer CPU.
When the RTX 2070 drops into the 25s in that intro cutscene, that has nothing to do with the CPU. As I said, you're just fuelling his agenda by attacking his CPU. He purposefully uses that CPU in heavily GPU-bound situations to lure people like you into his bait. Please do not bite. Neither a 12900K nor a 5800X3D would push the 2070 above 25 fps in that scene, since it is the 2070 itself that is being crushed.

In that intro cutscene, a 2700X is actually able to lock to 60 with ray tracing (when VRAM is not a constraint), so dropping to 25 cannot possibly have anything to do with the inferior CPU, sadly.

However, I still believe that if he wants a fair comparison to PS5, he must get a GPU that gives the game a 10 GB VRAM budget, so an RTX 3060 or a 6700 XT. Naturally he avoids such comparisons and instead focuses on huge VRAM bottlenecks to make the PS5 appear overachieving, whereas it is not, since the PS5 still performs just a tad above a 3060 in most cases.

Showing a 3060 pushing 36-37 frames versus the PS5's 45 frames, instead of showing a 2070 pushing 25-26 frames versus the PS5's 45 frames, would be sacrilegious for him.

"This is as a result of the architecture differences between the console and PC."

this is a blatant false deduction; you either have enough VRAM or not. the game requests 10 gb vram as it does on PS5. you cannot expect 8 GB budget to match 10 GB. this has nothing to do with architecture differences. game perfectly performs okay when given 10 GB memory budget. this has always been the case.

He acts as if 8 GB is the maximum VRAM you can have on PC, as if everyone were a victim of this problem.

If it WERE a result of architecture differences, then a 3060 would not perform similarly to a PS5 while the 2070 gets destroyed.

It is called BEING OUT OF VRAM. It is a concept that has existed since the mid-2000s. If you don't have enough VRAM, you will always lose frames. Nothing, NOTHING to do with architectures.
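One way to see why running out of VRAM hurts so much: a back-of-the-envelope sketch of effective memory bandwidth when a fraction of the working set spills over PCIe. The ~448 GB/s GDDR6 and ~16 GB/s PCIe 3.0 x16 figures are ballpark 2070-class numbers, assumed purely for illustration.

```python
def effective_bandwidth(vram_gbps: float, pcie_gbps: float, spill: float) -> float:
    """Weighted-harmonic-mean estimate: time per byte depends on where it lives."""
    return 1.0 / ((1.0 - spill) / vram_gbps + spill / pcie_gbps)

for spill in (0.0, 0.05, 0.15):
    bw = effective_bandwidth(448, 16, spill)
    print(f"{spill:.0%} of accesses spilled -> ~{bw:.0f} GB/s effective")
# Even a few percent of traffic going over the bus tanks effective bandwidth,
# which is why frame-rates can collapse well before VRAM usage counters look full.
```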
 
When the RTX 2070 drops into the 25s in that intro cutscene, that has nothing to do with the CPU. As I said, you're just fuelling his agenda by attacking his CPU. He purposefully uses that CPU in heavily GPU-bound situations to lure people like you into his bait. Please do not bite. Neither a 12900K nor a 5800X3D would push the 2070 above 25 fps in that scene, since it is the 2070 itself that is being crushed.

In that intro cutscene, a 2700X is actually able to lock to 60 with ray tracing (when VRAM is not a constraint), so dropping to 25 cannot possibly have anything to do with the inferior CPU, sadly.
I'm not talking about that one scene, I'm talking about the whole video.
 
A final comparison regarding the VRAM thing.

[Screenshot]


With Low textures, my 3070 gets a 56 fps average with matched PS5 settings in this scene. This is what you expect from a 3070, being much better than the PS5 in both rasterization and ray tracing: a clear-cut 30% framerate difference.
[Screenshot]

With Very High textures, it drops to 41.

[Screenshot]


A clear and profound VRAM bottleneck, simple as that. The PS5 is not performing like a 3070; instead, the 3070 is unable to perform like it should.

Swap the 3070 for a 2080 Ti in this scene and it would get a 56 fps average with Very High textures. I have no idea why some people attribute this to architectural differences. It is clear that the VRAM is not up to the task, but that is because it does not match what the game requests on PS5.

So when the dude says a 3070 performs like a PS5 in that scene, yeah, it sadly happens, but it happens because of the huge VRAM constraint. Having this kind of enormous VRAM-related performance drop while the card sits at 5.7 GB of reported VRAM usage is not cool at all. But that is up to Nixxes to sort out.
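In frame-time terms, those same two numbers work out to roughly 6.5 ms of extra cost per frame, which is consistent with a stall rather than a straightforward compute limit. A quick back-of-the-envelope check, nothing more:

```python
low_tex_fps, very_high_fps = 56, 41
extra_ms = 1000 / very_high_fps - 1000 / low_tex_fps
print(f"{extra_ms:.1f} ms added per frame")            # ~6.5 ms
print(f"{1 - very_high_fps / low_tex_fps:.0%} drop")    # ~27% lower framerate
```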
 