Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

B3D isn't a PC forum ;) In fact doesn't it have much higher footfall in the console sections than the PC ones?

I don't think that has much to do with there being 'more console users' or the other way around. It's the forum's structure, which some have pointed out and asked to be altered a while ago. Perhaps there are more PC-oriented users because of the platform (it's always evolving, more capable, you can tweak it, new tech arrives there first, etc.), but many of them also game on console, and vice versa. The forum's layout puts the DF topics in the console sections alongside other main topics, so discussions happen there. Also, the graphics forums are mostly PC-centric.

And yeah, most here, including myself, have both consoles and PCs. I've generally always had PC+PS, as it was the killer combo for decades, until now, when things are becoming more and more spread across all platforms, which is a good thing really.

As for the NXG YouTube channel, I think its content has been covered by now; we don't have to keep bashing him. It did spawn two new quality posters who inject some good old-fashioned technical discussion without all the platform warring/bias. That makes for healthier discussions and less irritation for other posters (including myself). It's the 'PS5 is better because of cache scrubbers and NVMe' claims, without really substantiating that that's the case, starting about two years ago, that made these discussions unhealthy in my eyes.
These videos have stirred up the discussions every damn time. This time I think it was worth it, thanks to said new posters and the quite nice tech discussion that followed, which others can learn from. And again, I'm glad we have DF and one of its members active here.
 
That approach is right for situations in which the objective is to measure the full performance of a specific component. As I said before, almost no one tries (and I don't think DF does) to test entire systems, because there are too many things that might differ. Once viewers/readers know how their component should perform in an ideal scenario, they can decide what's best for them. Testing a system with a 2600+2070 is useful for those with that config but doesn't tell much to someone with a 10900K+2070, hence why most reviewers review pieces of hardware, not rigs.
This is what I'm saying, though. NXG has tested with his system for a while now. He has maybe 3 or 4 configs plus some data from IGN staff to pull from. The sample size isn't that large. NXG isn't reviewing video cards, he's reviewing games based on their performance across different hardware configurations. It's not the same as testing a slew of video cards against a benchmark suite. He's essentially testing his hardware against current software and trying to normalize visual settings. There's nothing inherently wrong with that, but not all settings can be dialed to equivalent settings in some games (Spider-man being one of them) and his conclusions aren't always ones I agree with.

Alex at DF does test a few different CPU/GPU configurations for many of his settings review videos. That's one of the reasons they are so good. If you have this class CPU, these settings affect it. If you have this class GPU, you want to change these. There is a bit of user imagination required to combine all of the data points if you have a mismatched system with an overly powerful GPU or CPU for the rest of your configuration, of course.

If you're just testing how a specific PC compares to the console at console-matched settings, and nothing else, then I agree there's no problem with using a weaker CPU. It's when you try to make direct GPU-to-GPU performance comparisons in CPU-limited scenarios, or when you frame a specific specification bottleneck in that system as a more general architectural deficiency or inefficiency, that it becomes a problem.
And this is the real issue I have with NXG's work product. He is essentially testing his system against the PS5 while trying to achieve settings parity, and he has the hardware he has. That's all fine. It's when he goes on about having a top-4% machine and makes statements about the PC platform in general that aren't necessarily true outside his own case that it becomes a problem. But in the case of Spider-Man, it isn't as if the game is without performance issues and curiosities on PC.

On the Spider-Man topic, and reviews aimed at "what settings make it playable", here's an example of a YouTuber who does just that with an entry-level system.
His desktop is a Ryzen 3600X with a GTX1650S but he also tests on some old laptops. Doesn't do any ini tweaks or hex edits, just uses the settings menu to see if a game is playable on the laptop, and what settings are best for the desktop.
 
Here is my Take

I like NXG and have been watching his content since the Witcher 3 days. Calling him names is immature and not good for a healthy discussion.

An overclocked 2070 can get VERY close to an actual RTX 2070 Super in some games, while in others not so much; it's game dependent, so calling the 2070 a 2070 Super every time is false from a technical standpoint.

On to the topic at hand: when it comes to benchmarking GPU performance between different systems, you see parallel lines tracking each other, like below.


[attached image: parralel.jpg – frame-rate graphs of several GPUs running in parallel]

See how the frame-rate graphs of all the GPUs tested show the same pattern, tracking each other in a straight line; that's what makes it accurate.

Digital Foundry showed a similar pattern in their Death Stranding analysis when extracting the PS5 performance difference.

NXG's benchmarking is sadly all over the place; however, there is some truth to it.



[attached images: 1663136894317.png, 1663137020781.png – benchmark captures]

At the beginning of the benchmark here you see that exact same behaviour I showed earlier, with performance tracking in sync.

PS5 42fps - 38fps RTX2070 -- 10%
PS5 43fps - 40fps RTX2070 -- 7%
PS5 49fps - 45fps RTX2070 -- 8%

All the way until Fisk's poster shows up, and then everything goes downhill for the RTX GPU, which loses 50+% of its performance, while the PS5 keeps showing the same mid-40s behaviour it had from the beginning. That means we hit some sort of bottleneck here, which has already been established as being down to how the game allocates VRAM.

PS5 44 - 30 RTX2070
PS5 43 - 19 RTX2070
PS5 45 - 26 RTX2070

However, when Spider-Man jumps out the window, performance goes back to where it was at the beginning, running in parallel with a 4-5 fps difference.
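For anyone who wants to check the arithmetic, here's a quick sketch that reproduces the per-scene gaps from the fps figures quoted above (the values are simply the ones read off the graphs, nothing new is being measured):

```python
# fps pairs read off the graphs above: (PS5, overclocked RTX 2070)
intro_scene  = [(42, 38), (43, 40), (49, 45)]   # start of the benchmark
poster_scene = [(44, 30), (43, 19), (45, 26)]   # Fisk's poster (VRAM-bound on the 2070)

def ps5_lead_pct(pairs):
    """Average percentage by which the PS5 figure exceeds the RTX 2070 figure."""
    return sum(ps5 / rtx - 1 for ps5, rtx in pairs) / len(pairs) * 100

print(f"intro scene:  PS5 ahead by ~{ps5_lead_pct(intro_scene):.0f}%")   # ~9%
print(f"poster scene: PS5 ahead by ~{ps5_lead_pct(poster_scene):.0f}%")  # ~82%
```

That keeps the roughly 8-10% gap at the start and makes the scale of the poster-scene collapse obvious.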



[attached images: 1663137597094.png, 1663137808903.png]

Then you get into the gameplay, which becomes tricky to measure owing to the dynamic nature of the game and the bottleneck that has already been established.

So what's the takeaway, even from NXG's analysis? Another member showed his RTX 2080 Ti performing 38% higher than the PS5, while in that same scene I counted the 2070 to be 79% below the 2080 Ti.

And in circumstances where it isn't bottlenecked by VRAM, the PS5 outperforms NXG's overclocked RTX 2070 by around 8%, counting all the averages.

An actual RTX 2070 Super would be a few per cent higher, which puts it all in the same ballpark as the PS5.

Rich from Digital Foundry stated in his analysis that the RTX 3060 is in the right ballpark for a similar experience to the PS5, and I can see that with its 12GB VRAM buffer, which seems more in line with what's been posted here.

hope this helps
 
The RTX 3060 is a match for a stock 2070 in raw performance. If the 2070 were sporting 12GB of VRAM, it would be as performant as the PS5 at the same settings.
However, that's not what anyone would do; with the 2070 being as fast as the PS5 in raster but faster in RT, one would go with higher settings in other areas, which would actually give the VRAM-bottlenecked stock 2070 the upper hand.

So no, the PS5 isn't more performant than an RTX 2070. The RTX is VRAM-starved in Spider-Man, but the PS5 is RT-performance-starved, which actually is a performance issue.

Also, NXG is using an underpowered CPU teamed with his 2070, and god knows what RAM and other factors. Anything to botch PC performance.

Also to add, it's a port, and its being weird with VRAM probably has something to do with that. In other titles, an 8GB RTX card actually has the upper hand versus the consoles VRAM-wise.

Ports almost always play in favor of the original platform, be it PC to PS5 or PS5 to PC. You can't say 'humph, PS5 faster' based only on ports from that system.
 
Here is my Take
[…]
And in circumstances where it isn't bottlenecked by VRAM, the PS5 outperforms NXG's overclocked RTX 2070 by around 8%, counting all the averages.
[…]
hope this helps

But you've failed to factor in his 2700X not being anywhere near good enough to push his overclocked 2070 to its full potential.

All his data shows is how a CPU-limited 2070 performs when compared to the PS5.

Big difference.
 
I'm aware of that.

What I did was extract the actual performance of NXG's rig vs the PS5 (when not hitting a bottleneck), using the sections where the performance behaviour runs in parallel.

I actually had a friend test his RTX 3070 Ti paired with a Ryzen 7 5800X. The result with PS5 settings?

As you can see, we're about 54% higher performance than the PS5 and 134% higher than NXG's RTX 2070.

[attached images: RTX3070TIpeter.jpg, 1663142993343.png – RTX 3070 Ti results]

In fact, here is more showing around 50%:

[attached images: RTX3070TIpeter2.jpg, ps5peter.jpg]


So we're talking about a 50-55% difference for this specific rig vs the PS5.

So if we take NXG's conclusion of an RTX 3070 being only 10% higher than the PS5, then the 3070 Ti should be around 15% higher performance than the PS5, which is not what we're seeing here, nor what the RTX 2080 Ti result another member showed suggests. That makes it unanimous that NXG's conclusion is inaccurate, though the 3070 Ti result shown here and the RTX 2080 Ti result do line up with each other, counting the difference in hardware.

I'm not sure if my friend hit a bottleneck; however, some sites ("gamegpu") reported the RTX 3060 delivering better performance than the RTX 3070 at 4K at max settings,

while HUB didn't use max settings but High with High RT, showing the 3070 outperforming the 3060 by 48%.
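As a sanity check on the extrapolation above, this is the arithmetic being argued. The ~5% 3070 Ti uplift over a 3070 is my assumption for illustration; the other figures are the ones quoted in the thread:

```python
ps5 = 100.0                                   # index PS5 performance to 100
nxg_3070_claim = ps5 * 1.10                   # NXG: RTX 3070 only ~10% above PS5
assumed_ti_uplift = 1.05                      # assumption: 3070 Ti ~5% faster than a 3070
expected_3070ti = nxg_3070_claim * assumed_ti_uplift   # what NXG's figure would imply
observed_3070ti = ps5 * 1.54                  # friend's 5800X + 3070 Ti result above

print(f"expected vs observed: {expected_3070ti:.0f} vs {observed_3070ti:.0f}")  # ~116 vs ~154
```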
 
It's just how he presents himself and the angle he takes. I mean, when you're overclocking your 2070 so you can call it a "2070 Super"... just to make the PS5 sound like it's performing better relative to PC hardware... it irks me. There's an agenda there.

I mean, if he's so intent on providing data for a PS5-level PC configuration for the average everyday PC gamer... why is he overclocking the GPU at all? Why isn't everything 100% stock, as you'd buy it?


IMO it shows massive bias.

There are lots of other nitpicks I have with his videos as well. I generally find they're quite messy, with him speaking about one thing while the video shows and focuses on another thing altogether. I feel he could work on ensuring that what's shown on screen represents what he's speaking about, for the most part. It's also been mentioned here how often, when he's comparing two things, say 8x PCIe 3.0 vs 16x PCIe 3.0, he'll have completely different statistics showing in RTSS or whatever else he uses. Sometimes it's missing info, other times it's something different.

[attached image: nxg.png]


There isn't any consistency through it; it's not visually consistent at all. Add that to the point I mentioned above about things not always lining up with what he's saying, and you often don't realize exactly what he wants you to focus on.


Again, there are other things... he's too close to the work, and so it doesn't come across to him. I'm sure he's a great guy, and I'm not saying anything about him personally... but after as long as he's been doing this, you'd expect these things to tighten up and improve.
 
@Metal Snake Eater

The 5800x isn't bad with Spider Man but it's still potentially leaving performance on the table.

The problem with the intro is Peter's model; when he's on screen, the frame rate on my 3060 Ti at 1080p tanks, and tanks hard.

Aside from when he's on screen, my frame rate is a locked 60fps with RT on max and very high textures.

Maybe it's his hair rendering? I dunno, but there's something about his model that PC GPUs don't seem to like.
 

[attachment: 1440p-High.png]
The component that's hammered the most in Spider-Man by far is the CPU. During traversal especially, the frame rate can become very erratic due to the lightning-fast texture streaming I presume. That's an enormous load on the CPU if that's accurate.

Additionally, Nixxes also said they were looking into DirectStorage, which I presume they won't add to this game but may integrate into future ports. That would be hugely useful for PS5 ports. I can't imagine how rough a game like Rift Apart would be on CPUs during those rapid transitions.
 
Additionally, Nixxes also said they were looking into DirectStorage, which I presume they won't add to this game but may integrate into future ports. That would be hugely useful for PS5 ports. I can't imagine how rough a game like Rift Apart would be on CPUs during those rapid transitions.

DirectStorage is actually referenced in the game's files, so it might be that they've already messed around with it.

Ratchet and Clank won't be doable on PC until GPU decompression gets added to DirectStorage.
 
I can't imagine how rough a game like Rift Apart would be on CPUs during those rapid transitions.

I don't see there being much of a problem there, tbh. Those rapid transitions in R&C are just very quick loading screens, so even without any GPU decompression and with a slow CPU, the worst that should happen is the transition taking longer, which might be a bit annoying but not game-breaking. I guess there is a possibility, though, that the game starts to stream the next world's data in the background once you get within a certain distance of the portal. If that were a large amount of data, it could impact CPU performance. It would be pretty amusing, for example, to see frame rates tank whenever you step within 5m of a portal. There should be ways around that on PC, though: for example, stream the next environment from a longer distance away at a much slower rate, utilising the PC's bigger combined memory pool.
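Something like that last idea could be sketched as below; this is purely hypothetical (the trigger distance, the per-tick budget and all the names are made up for illustration), not how Insomniac or Nixxes actually handle it:

```python
PREFETCH_RADIUS_M = 50.0       # hypothetical: start streaming this far from the portal
BYTES_PER_TICK    = 8 * 2**20  # hypothetical throttle: 8 MiB per update tick

def update_portal_streaming(player_pos, portal, streamer):
    """Kick off a slow background load of the next world once the player is close enough."""
    distance = (portal.position - player_pos).length()
    if distance <= PREFETCH_RADIUS_M and not portal.stream_started:
        # Trickle the data into the PC's larger RAM pool instead of burst-loading it
        # at the portal, so the CPU/decompression cost is spread over many frames.
        streamer.begin_background_load(portal.next_world_assets,
                                       budget_bytes_per_tick=BYTES_PER_TICK)
        portal.stream_started = True
```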
 
The component that's hammered the most in Spider-Man by far is the CPU. During traversal especially, the frame rate can become very erratic due to the lightning-fast texture streaming I presume. That's an enormous load on the CPU if that's accurate.

The game can definitely be CPU-limited in every mode, and RT is definitely the most demanding effect by far, but yeah. Whether the texture streaming is the primary cause, though, who knows.

Traversing through the city on my 12400/RTX 3060 sits between 90-130fps at 1080p fixed (most settings very high, except DOF low and hair medium), with GPU usage dropping to just 85% at its lowest. I don't think I've come across another game on my 12400 that is CPU-limited at 1080p; I think the only other one is Rise of the Tomb Raider in DX11 mode (which DX12 rectifies). At 900p, GPU usage can dip into the ~70% range at points.

At these frame rates, my CPU generally tops out at ~70% usage; with RT it can go into the 80s. The thing is, even without RT, I've seen videos of something like a 5900X on a 3080 at 1080p low (no RT) with extremely similar performance to mine, but with CPU usage in the 30-50 percent range. So clearly something on the CPU side is holding this game back, but it's also not something that particularly scales with the number of cores.
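For what it's worth, here's roughly how you can flag CPU-limited stretches from a frame log; the CSV column name and the 90% threshold are assumptions, so adjust them to whatever your capture tool (RTSS, PresentMon, etc.) actually writes out:

```python
import csv

def count_cpu_limited(path, gpu_col="GPUUtil(%)", threshold=90.0):
    """Count samples where GPU utilisation sits well below saturation,
    i.e. frames where something other than the GPU is the limit."""
    cpu_limited = total = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            total += 1
            if float(row[gpu_col]) < threshold:
                cpu_limited += 1
    return cpu_limited, total

hit, total = count_cpu_limited("traversal_log.csv")   # hypothetical log file
print(f"{hit}/{total} samples look CPU-limited")
```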

Direct comparisons with the PS5 VRR mode would be nice, but it's tough for a couple of reasons: for one, I haven't come across a reliable source showing extensive city traversal of SM:RM in VRR with just performance mode (no RT); and secondly, the PS5 is always using dynamic res, so fps variations could also be GPU-bound. It would certainly be interesting to see the potential effect of a hardware decompression block if we got like-for-like scenarios where GPU usage isn't so variable, but that isn't really possible atm.

Additionally, Nixxes also said they were looking into DirectStorage, which I presume they won't add to this game but may integrate into future ports. That would be hugely useful for PS5 ports. I can't imagine how rough a game like Rift Apart would be on CPUs during those rapid transitions.
There was a long discussion about this earlier in the thread, around the time Spider-Man came out.
 
The reason why we at DF do not do that is because the way it is currently done is inaccurate. It relies on using a product from Elgato which only logs frame-rates (not even frame-times) every 16.6 milliseconds... even though VRR would be every 8.3 ms on a 120Hz screen. It is not accurate in terms of the frame-times and frame-rates measured AND it is also a visual mess, as it triple buffers the output into the container... making it completely unsmooth.
So why not buy a VRR capture card? They exist now.
 
So Digital Foundry have highlighted an issue which would give them inaccurate or poor data to present and have done the right thing and just avoided it.

So why can't NXG do the same? He must need dem clicks bad.
Well, no. NX Gamer bought the first VRR capture card ever made to do his tests. There are other tests out there with higher frame-rate averages, like elanalist's, but NX's are probably more accurate.
 
Just to clarify, this is your work-flow/tool issue not mine.

To be clear (and I covered this in my Elgato review), my process for capturing VRR modes:
- DOES NOT use the Elgato log capture method at all.
- DOES use the direct capture footage, but I have integrated it within my capture tool directly.
- The direct footage captured is then run through my analysis to create my frame-rate graphs and stats.
- Which means the process is identical to all my other captures and frame-rate analyses; as I wrote my own software for this, I can do that quickly and integrate it.

As such, my results are 100% accurate, as with any FPS and device I test.

Again, all capture cards buffer to some degree, and the output is not "unsmooth": capture a 120Hz game and you get 120Hz. Any drops do show up, as they would within the 8.3ms maximum refresh cycle.
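For readers unfamiliar with footage-based analysis, the general idea looks something like this. It's a minimal sketch (not NXG's or DF's actual tool) that counts duplicate frames in a 120 fps capture to recover per-frame times; the difference threshold is an assumption and would need tuning for compression noise:

```python
import cv2  # pip install opencv-python

def frame_times_ms(video_path, diff_threshold=2.0):
    """Walk a capture, treat near-identical consecutive frames as repeats,
    and return the time each unique frame was held on screen (in ms)."""
    cap = cv2.VideoCapture(video_path)
    capture_fps = cap.get(cv2.CAP_PROP_FPS)   # e.g. 120.0 for a 120Hz capture
    ok, prev = cap.read()
    times, run = [], 1
    while ok:
        ok, frame = cap.read()
        if not ok:
            break
        # Mean absolute pixel difference; near zero means the frame was repeated.
        if cv2.absdiff(frame, prev).mean() < diff_threshold:
            run += 1                                   # duplicate: frame held another capture interval
        else:
            times.append(run * 1000.0 / capture_fps)   # new frame: close out the run
            run = 1
        prev = frame
    cap.release()
    return times
```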
I don't think most of the people on this forum watch your content, so they are unaware you paid out of your own pocket for a VRR capture card, unlike DF, to get the most accurate results. I'm shocked you get any pushback here; I would have thought a forum like this would love someone like you.
 
All of that is true and I understand your position.

The test is not flawed because you did it wrong (if you were indeed using the close-to-PS5-equivalent settings from DF in the respective modes), but because the game is programmed in such a way that it does not use the full VRAM of a PC GPU. Thus, the 2070 is acting like a 6 GB GPU, which is obviously not enough for 4K mip-level biases. Based on what I am hearing, I can't believe people call this an extraordinary port with all these glaring issues.

Thankfully Alex already reported that to Nixxes, so maybe they will patch it soon. And if the 2070 is still running into VRAM bottlenecks with that fix, then the game needs more than 8 GB in these settings and PS5 has an edge.

Someone here measured how much VRAM the game needs, but AFAIK at max settings instead of optimized PS5 settings, and I think it was around 9 GB (which is totally reasonable and what I would expect). It would be nice if that person could update the measurement with PS5 settings at 4K.
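If anyone wants to redo that measurement at the PS5-matched settings, one way (not necessarily how the original poster did it) is to poll NVML while playing and keep the high-water mark; note this reports whole-device usage, not just the game's allocation:

```python
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

def log_peak_vram(duration_s=300, gpu_index=0):
    """Poll once a second for duration_s and return peak device VRAM use in GiB."""
    nvmlInit()
    handle = nvmlDeviceGetHandleByIndex(gpu_index)
    peak = 0
    try:
        for _ in range(duration_s):
            peak = max(peak, nvmlDeviceGetMemoryInfo(handle).used)
            time.sleep(1)
    finally:
        nvmlShutdown()
    return peak / 2**30

print(f"peak VRAM: {log_peak_vram(60):.1f} GiB")   # run while swinging around the city
```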
It's really appreciated that there are still level-headed people on this forum; I thought this was ResetEra 2.0. And by the way, I'm new here.
 
You were comparing a PC GPU to PS5.

In order to compare that PC GPU to PS5 fairly and impartially you needed to ensure the GPU was the one and only bottleneck in your PC.

You failed to do that, and as a result your results are FACTUALLY incorrect.

An RTX2070 with a 12900K would have the CPU bottleneck with RT enabled completely removed and thus would perform measurably better than your 'testing' has shown it to.

If you don't have the required hardware to do the test properly and still insist on doing it then expect to get called out.

You want to sort your mess out?

Then why don't you put a Tweet out right now and advise people the RTX2070 you used in the testing was CPU limited on your CPU with ray tracing enabled and will perform better on a newer CPU.
If you read what he said, you would realize why this response wasn't needed. Why is the PC allowed to be free from CPU bottlenecks but not the console, especially in what is supposed to be primarily a GPU benchmark?
 
Can you address the second part regarding PS5 matching a 3070 and the example I gave?
He never said the PS5 was as powerful as a 3070; he said that in this specific instance it was matching it, because the 3070 has to do more work to run the game than the PS5, and that can offset a lot of things.
 
Hello! I was responding to the discussion, which always criticises NXG's contributions rather than discussing technical points. I've not watched your content and have no personal opinion, so I can hardly be biased! I'm only biased in favour of genuine technical discussion held at a competent level which is fact-based and can discuss methods and datapoints.

IF NXG's tech breakdowns are as low-grade as everyone here states (my impression coming from scans of the discussion rather than being involved in it in depth), they shouldn't be posted here. There is no gate-keeping of info on the internet, which is as it should be, but that doesn't make everything out there worthy of discussion. No-one should be posting anything that they consider to be poor data. People should only paste stuff in this thread if they believe it to be valuable content, at which point the discussion should be about the content. Ideally there'd be consensus on what content is worthwhile and what isn't, so discussions don't constantly get bogged down in arguments about reputation and biases.

I respect your appearance here and hopefully you can talk about methods and results in a way that contributes to the discussion. I will pay more attention to this thread for the time being to ensure the discussion is healthy and productive.

Edit: One suggestion. Based on how people have spoken of NXG's videos in the past, coupled with your confusion here, I wonder if there's a communication issue, in that the information you are presenting is being interpreted in a different way than intended? Perhaps people should question what precise information you meant to share versus what they heard. NX Gamer is here to talk to directly and to ask questions about their positions, without assumptions being needed. The discussion floor is open...
You are appreciated
 