Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

I'm watching this video like.... simply astounded by the stupidity, and the incredibly obvious in-your-face attempt to purposefully make the PC version look bad. This is some truly next-level NXGamer bullshit.

It's like he's purposefully presenting the PC to have terrible performance in literally every comparison.. meanwhile making snide comments about how OTHER tech channels are confused and misrepresenting things...

How completely infuriating... and laughable.
 
He's claiming the entire PC high-end market, everyone who owns an RTX 2070 or higher, is just 3 million Steam users, compared to 28 million PS5/Series X users!!! What the fuck is this guy drinking?

Yeah you've gotta love how at 18:00 he claims the entire PC market (I assume he means gaming market) is only 28m units when Steam alone was recording that number of concurrent players over 2 years ago.

I'm watching this video like.... simply astounded by the stupidity, and the incredibly obvious in-your-face attempt to purposefully make the PC version look bad. This is some truly next-level NXGamer bullshit.

He's purposefully presenting the PC to have terrible performance in literally every comparison.. meanwhile making snide comments about how OTHER tech channels are confused and misrepresenting things...

How completely infuriating... and laughable.


Yeah there were quite a few snide digs at Digital Foundry in that video including directly contradicting Alex's object distance setting comparison (NX claims it's closer to 10 in performance mode on PS5 while Alex says 7-8). But I note he had absolutely no problem ripping off a load of the info from the DF Nixxes interview early in the video to "explain why the PS5 is so much more efficient than the PC".

I also enjoyed his section around 5:00 where he advises us the "PC will need to improve" in PCIe bandwidth because 8x PCIe 3 can be a bottleneck in this game. That's right, folks, the PC's bleeding edge PCIe 2.0 16x graphics interface, which debuted in 2007 with the 8800 GT, can no longer cut it in PS5 ports. Even PCIe 3.0, that ultra modern bleeding edge tech launched in 2010, might represent a slight bottleneck here. PC must improve!!

Also is it just me or does the hair quality comparison look way better on the PC version even at medium than any of the PS5 modes? I assume that's just a poor comparison point or I'm interpreting it incorrectly as even Alex is saying PS5 Fidelity is equivalent to high. Sure as hell doesn't look like it from that NXG comparison though.
 
So I've done my own testing using his settings.

Like him I use a Ryzen 5 3600, cooled by an Arctic Freezer 34 eSports DUO. The CPU is completely stock, so it boosts to 4175MHz during Spiderman, just like his 3600 does.

Unlike him I use an overclocked RTX 3060 Ti that runs at ~2GHz; going by TechPowerUp and other reviews, that puts my 3060 Ti within ~10% of a stock RTX 2080 Ti.

Now, I wanted to check the CPU performance as his results don't look right, and as I have the same CPU it's easy to do. I also game at native 1080p anyway, so I'm always CPU limited in Spiderman.

Using the PS5 ray tracing settings which he states are:

  • Reflection resolution = High
  • Geometric detail = High
  • Object range = 8
Observations?

His numbers are bullshit. He's either purposefully gimping his PC or his PC needs to be rebuilt because it's broken. At minimum I'm getting a frame rate that's at least 10fps higher than what he shows in CPU limited scenes. Playing the first 3 minutes I'm locked to 60fps 80% of the time, and I only drop to the high 40s at street level; even then most of it is at or close to 60fps. (Note that during all this I was nowhere near maxing out my 3060 Ti.)

I'm so CPU limited I can actually increase the reflection resolution to very high and not have a single change to frame rate.

I increased the resolution to 1527p using Nvidia DSR and again it's pretty much a locked 60fps, with some dips as I'm now GPU limited in the odd place, but I can still run reflection resolution at very high.
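
If anyone wants to put hard numbers on claims like "locked to 60fps 80% of the time", a PresentMon capture makes it easy. A rough sketch of how I'd summarize one (the CSV filename is made up; MsBetweenPresents is PresentMon's per-frame interval column):

```python
# Rough sketch: summarize a PresentMon capture of a Spiderman run.
# "spiderman_run.csv" is a made-up filename for illustration.
import csv

with open("spiderman_run.csv", newline="") as f:
    frame_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

frame_ms.sort()
n = len(frame_ms)
avg_fps = 1000.0 * n / sum(frame_ms)

# 1% low: average fps over the slowest 1% of frames.
slowest = frame_ms[-max(1, n // 100):]
low_1pct = 1000.0 * len(slowest) / sum(slowest)

# Share of frames rendered at ~60fps or better (<= ~16.9ms).
pct_at_60 = 100.0 * sum(ms <= 1000.0 / 59.0 for ms in frame_ms) / n

print(f"avg: {avg_fps:.1f}fps, 1% low: {low_1pct:.1f}fps, "
      f"frames at ~60fps: {pct_at_60:.0f}%")
```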

His video showed an RTX 2070 OC dropping to 19fps during the intro cutscene in Peter's apartment. My 3060 Ti is a locked 60fps in this scene, and my 3060 Ti is not 3x faster than an RTX 2070, so his RTX 2070 must be one of those AliExpress knock-offs.

He's definitely doing something shady to skew the PC results.
 
NXG: "No matter what anyone states my machine represents a well built and configured PC that is better than the majority of 2070 light (?) machines out there in the market."


Here is the 2700X, which with all cores loaded is some 5+% faster than his 2700* with his under/overclock settings of 3.8GHz:

(Edit: this H.U.B. system is using DDR4-3200, not the relatively slow DDR4-2800 in NXG's machine. Zen+ is latency sensitive, and Spiderman RT is CPU bandwidth heavy.)

NXG.png
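
Back-of-envelope on that memory difference (theoretical peak only; the real gap is worse once latency comes into it, and Zen+ cares about latency a lot):

```python
# Theoretical peak for dual-channel DDR4: MT/s x 8 bytes x 2 channels.
def ddr4_dual_channel_gbs(mt_per_s: int) -> float:
    return mt_per_s * 8 * 2 / 1000

hub = ddr4_dual_channel_gbs(3200)  # H.U.B. test system
nxg = ddr4_dual_channel_gbs(2800)  # NXG's machine
print(f"DDR4-3200: {hub:.1f} GB/s vs DDR4-2800: {nxg:.1f} GB/s "
      f"(+{100 * (hub - nxg) / nxg:.0f}% for the H.U.B. rig)")
# -> DDR4-3200: 51.2 GB/s vs DDR4-2800: 44.8 GB/s (+14% for the H.U.B. rig)
```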

Remember, his CPU is slower than the 2700X. And the 2700X is getting hammered particularly in minimum frame rates by even quad core Zen 2 and 3, with the Zen 3 4c mobile chip here only having PCIe 3, just like his Zen+ (so it isn't getting any help there). It basically loses horribly to everything that isn't Zen 1.0.

But he keeps on using his Zen+ machine to say how much faster the PS5 is than the 2070 Super, and extrapolating to say PS5 is more on a 2080 to 3070 level even in this RT heavy game.

*I'd wrongly said in the past that NXG's CPU was a 2700X, and compared that to benchmarks of the PS5 APU based AMD Ryzen 7 4700S. He actually has a 2700, which I think he has locked to 3.8. That's going to be slower in all real world situations than a 2700X (and slower in some circumstances than a stock 2700). So NXG's Zen+ test rig is even worse than I thought.

(Frequencies with heavy workloads across all threads (i.e. not Spiderman) for ref: https://www.techpowerup.com/review/amd-ryzen-7-2700/16.html)

(Speculation innit: Spiderman appears to have a single primary thread that limits framerate. This might be the thread that handles all the DX submissions - if so, a PCIe bottleneck might cause this primary thread to suffer, hence poor results for the 10+ year old PCIe 3)
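
To show the shape of that speculation, here's a toy model with entirely made-up numbers; the point is just that a stall on one serial thread caps the whole frame no matter how idle the other cores are:

```python
# Toy model: a frame can't finish faster than its slowest serial thread.
# All per-frame costs in ms, entirely made up for illustration.
def fps_ceiling(thread_costs_ms: dict) -> float:
    return 1000.0 / max(thread_costs_ms.values())

threads = {"simulation": 8.0, "dx_submission": 12.0, "audio": 2.0}
print(f"baseline ceiling: {fps_ceiling(threads):.0f}fps")  # ~83fps

# If PCIe pressure adds, say, 8ms of stalls to the submission thread,
# the whole frame inherits it.
threads["dx_submission"] += 8.0
print(f"with PCIe stalls: {fps_ceiling(threads):.0f}fps")  # 50fps
```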

He showed a Ryzen 5 3600 only running at 3.8GHz? What's he done to it? My Ryzen 5 3600 hits over 4.1GHz on the stock cooler during Spiderman.

Maybe he messed up an overclock and hurt or disabled Precision Boost? I think he did a manual fixed overclock with his 2700.
 
CCX latency is killing the 1st, 2nd and 3rd generation Ryzens.

It's also the reason the 3300X does so well: it only uses a single CCX, so there are none of the cross-CCX latency issues you get on the CPUs that use multiple CCXes.

I imagine this CCX latency is something developers can work around and avoid on a fixed platform like PS5.
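
On a fixed platform you'd just pin the latency-sensitive threads so they never hop CCXes. On PC you can approximate it by hand; a rough Linux-only sketch (the core numbering is an assumption, check your actual topology with lscpu):

```python
# Rough sketch: confine a process to one CCX to avoid cross-CCX hops.
# Linux-only (os.sched_setaffinity). Assumes logical CPUs 0-3 share a
# CCX, which you'd verify for your own chip (e.g. lscpu or hwloc).
import os

single_ccx_cores = {0, 1, 2, 3}
os.sched_setaffinity(0, single_ccx_cores)  # 0 = the calling process
print("now restricted to cores:", sorted(os.sched_getaffinity(0)))
```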
 
Looks like we've rattled him :ROFLMAO:


Calling on his Twitter crew rather than coming here and dealing with the criticism directly.

Wow, can anyone else remember that time when NXG at least wore the façade of being objective? Clearly that was too much work to keep up. But hey, if he wants to lay his arguments open to the scrutiny of this forum I say let him. He's obviously reading this right now so by all means NXG, tell us your "Reasons".
 
Wow, can anyone else remember that time when NXG at least wore the façade of being objective? Clearly that was too much work to keep up. But hey, if he wants to lay his arguments open to the scrutiny of this forum I say let him. He's obviously reading this right now so by all means NXG, tell us your "Reasons".
I would love for him to come here and explain to me why his Ryzen 5 3600 is a good 10fps slower than mine in CPU limited scenes.
 
CCX latency is killing the 1st, 2nd and 3rd generation Ryzens.

It's also the reason the 3300X does so well: it only uses a single CCX, so there are none of the cross-CCX latency issues you get on the CPUs that use multiple CCXes.

I imagine this CCX latency is something developers can work around and avoid on a fixed platform like PS5.

That's a fair point, and on consoles manually scheduling threads would seem to be an option where you know what kind of arrangement/cache structure you're always going to have.

Zen 3 on PC moved to 8-core CCXes for a good reason, after all!

Looks like we've rattled him :ROFLMAO:


Calling on his Twitter crew rather than coming here and dealing with the criticism directly.

I don't think he'll turn up here. It would be Patreon suicide. He'll yap from a distance.
 
I don't think he'll turn up here. It would be Patreon suicide. He'll yap from a distance.

He doesn't get a tremendous amount of engagement on his Twitter, but I think part of the reason his tweet currently has 5 likes is that the majority of his followers are wondering: uh, who/what the fuck is Beyond3D? I doubt it would affect his Patreon either way; this site is just too esoteric.

Regarding Spiderman PC:

Nixxes on Steam said:
This week's update is relatively small, as many team members have been enjoying some well-deserved time off. Rest assured that we continue our work on updates, with a focus on new features, performance improvements and more bugfixes.

From the latest update. Nothing concrete promised of course but they are seemingly going to attempt to devote some effort to performance tuning now that they've addressed some critical bugs from the release.

Great communication from the team btw. Every week like clockwork.
 

It looks like the PC is using more detailed RT settings in this scene where NXG shows the PS5 having higher performance (look at the eye reflections).
 
LOL only 3 million PC gamers with hardware 2070 level or higher he says... 😄

lol.png


He's using DAILY CONCURRENT ACTIVE USERS... and applying the percentages to that... not the MONTHLY ACTIVE users, which is ~135 million LOL. The Steam hardware survey is done monthly.

There's about 15% of Steam using hardware better than an RTX 2070... 15% of 135M = ~20M users with GPUs beating the PS5... and that's RTX alone, not counting AMD at all.
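
The back-of-envelope, using his own survey share but the right user base (both numbers are my rough estimates from above, not exact figures):

```python
# Back-of-envelope: apply the hardware survey share to monthly actives,
# not daily concurrents. Both inputs are rough estimates from above.
monthly_active_users = 135_000_000   # ~Steam MAU
share_at_or_above_2070 = 0.15        # eyeballed from the survey

print(f"{monthly_active_users * share_at_or_above_2070 / 1e6:.0f}M users")
# -> 20M RTX users at 2070 level or better, before counting AMD
```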
 

It looks like the PC is using more detailed RT settings in this scene where NXG shows the PS5 having higher performance (look at the eye reflections).

ClJhW47.png


I think an issue as well is that he's running out of VRAM. It's definitely a mark against the PC version that you need more than 8GB to run RT at high resolutions, but it bears mentioning that this isn't necessarily reflective of comparable "GPU power" when you're throttling your available memory.
 