Value of NXGamer technical investigation videos *spawn

NXGamer has always made up for what he lacks in accuracy with enthusiasm, but that's why people love him, no? Just me?

I do remember when he first popped up prominently with his Uncharted 4 video. His energy is infectious, but even in that video he made errors here and there and some BIG assumptions.

Give the guy a break! If anything, go after him for being an anti-vaxxer! :LOL:
 
The guy praises the PS5's "wonderful upscaling" (his words, not mine) from 1080p to 4K, but never mentions the more competent DLSS options available on PC! He didn't even bother to turn on DRS or FSR!! He states that the PC version offers higher LOD, SSR, shadow and tessellation options, but brushes them all off as inconsequential, while also exaggerating the difference between the PS4 and PS5 quality settings without showing any footage of why this is the case (apart from a direct resolution comparison).

Worse yet, his RTX 3090 is running hot at 87°C and throttling hard to a 1575 MHz core clock, which is amateurish of him.
 
A comparison between 1080p at lower settings and native 4K at maxed settings. Someone had to be that guy before 2021 ends, I suppose.
 
BTW, not sure if it was ever brought up in other threads, but NXGamer recently did a comparison of the PS5 release of Death Stranding: Director's Cut and the original on PC (https://www.youtube.com/watch?v=GpfXfB4WdM0).

It's illustrative on some points and no doubt reveals details that early analyses of the PC version with DLSS vs. the PS4 Pro version may have missed - for example, showing that some ambient occlusion/shadowed areas are darker on the Pro. Hell, I went out and purchased DS on PS4 Pro for the cheap upgrade to the PS5 version despite already owning it on PC, because the PC version has unresolved frametime issues on my system - which he also goes into somewhat (specifically the broken vsync, but more on that below). It's far superior (at least for me) on the PS5 in many respects. It's probably the #1 game right now that shows the PS5's rasterization power in the best light - you basically need a 2080 Ti to match its performance at 4K, especially with the latest patches, which have increased resolution-mode performance on the PS5 further since this video was probably made.

But... there are some issues I have with the comparison at times, and some of the points he brings up are pretty eye-popping in their reasoning, veering close to comical.

First off - these are two different versions. Director's Cut is far more than the PS4 Pro version with unlocked frame rates. Yes, he mentions it no longer uses checkerboarding, but he then claims greater texture detail is being exposed compared even to DLSS Quality mode. The problem, as I noted on ResetEra (https://www.resetera.com/threads/de...ital-foundry-tech-review.490447/post-74036659), is that DC on PS5 has different contrast levels and makes some changes to the post-processing - it exposes significantly more fine detail than native 4K on PC. This is not due to DLSS; rather, it's a change in art direction, TAA implementation and/or added sharpening.

Secondly, his PC screens clearly exhibit a lack of anisotropic filtering. I believe DS on PC is broken in this regard; you have to force 16x aniso through the control panel (Manage 3D Settings → Anisotropic filtering → 16x in the NVIDIA Control Panel). Worth mentioning as the usual extra hassle of PC, no doubt, but it should still be immediately obvious:

[Image: ZAPiRC9.png]


He also goes into the problems of DLSS, which, sure, exist - except the main complaint he has (ghosting on objects with no motion vectors) can be fixed by dropping in a more recent DLL. Again, worth bringing up the hassle of having to do this, but it's not exactly esoteric information at this point to anyone who covers PC gaming. He chalks this ghosting up to DLSS being a "heavy TAA solution", but that's not entirely the cause, especially in the most egregious examples. Overall, he feels DLSS and checkerboarding are give-and-take with regard to quality, which I don't necessarily consider an outlandish statement - I've often felt CB is disregarded too readily by the PC community (especially on the PS5, where 60fps can improve the final resolve) - but he also doesn't note the significant increase in pixel-popping you get on distant detail. Alex covered this in his video on DS's DLSS vs. checkerboarding, and it's pretty blatant - DLSS can blur things more in motion, but that comes with the advantage of significantly more stability. Still, it's largely a PS5 vs. PC video, so it's not a huge oversight imo.
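For reference, the swap really is just a file copy - a rough sketch below, where both paths are hypothetical examples, not anything from his setup (back up the original, since a game update may restore or expect the shipped DLL):

import shutil
from pathlib import Path

game_dir = Path(r"C:\Games\Death Stranding")    # hypothetical install path
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you sourced yourself

target = game_dir / "nvngx_dlss.dll"
shutil.copy2(target, target.with_name("nvngx_dlss.dll.bak"))  # keep a backup
shutil.copy2(new_dll, target)                                 # drop in the newer DLL
print(f"Replaced {target} (backup saved alongside)")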

Vsync: he makes a good chunk of the performance comparisons with vsync on for the PC, which is perfectly valid. And to his credit, he does notice the GPU utilization % dropping on the PC with vsync enabled, which has always been the case with DS's messed-up vsync implementation - it's absolute garbage.

However, once again it's his reasoning for why he's witnessing something that sends his analysis off the rails.

Instead of DS just having a borked frame buffer implementation - which it does - he states that it's due to DX12 driver overhead, as the cost of vsync on the command queue. Like, what? First off, he specifically mentions that he chose the cutscene because it's GPU-limited and 'no part is the CPU the bottleneck here' - so what does 'driver overhead' have to do with it, then? Again, it's a broken vsync implementation and it deserves to be called out (there are many threads on Reddit/Steam forums complaining about the microstuttering - which occurs even with a solid 60fps lock - that 505 Games have done nothing about). This is, I feel, another example of NXGamer beating his favourite drum about "PC overhead", which surely does exist - but he often tries to shoehorn it in where it doesn't necessarily apply.

It's not the only game with a poor vsync implementation (A Plague Tale, for example - where, in his own comparison video on the game, he failed to pick up on it despite RT showing him the low GPU usage). The problem here in particular is that since it's DX12 you can't fix it like you can in other games by forcing fast sync - but when you do that elsewhere, you're not magically freeing up 'driver overhead', you're just overriding the developer's poor frame buffer management.

Since he believes vsync, by its very nature, introduces this 'overhead', he goes on to speculate that the PS5 itself is of course suffering from this vsync overhead as well - 10-20% - and could be performing even better than his comparison shows because of it.

When you see your GPU utilization % jump around from 80% to 95% without a corresponding CPU bottleneck, that has nothing to do with your 'command queue' being stalled by 'driver overhead' - it's the game having a broken triple-buffered vsync implementation. If this were truly an issue of the driver being strained, you wouldn't be able to fix it on PC by forcing vsync through an external utility or the control panel.
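To make the distinction concrete, here's a toy model (every number below is an assumption for illustration, nothing measured from his video): a GPU needing 14ms per frame behind a blocking 60Hz vsync has to idle until the next vblank whenever no free back buffer is available - that alone lowers reported utilization, with zero driver overhead involved:

# Toy model: why blocking vsync alone lowers reported GPU utilization.
REFRESH = 1 / 60      # 16.67 ms vblank interval (60Hz display)
GPU_FRAME = 0.014     # 14 ms of actual GPU work per frame (assumed)

def utilization(blocking: bool, frames: int = 1000) -> float:
    busy = t = 0.0
    for _ in range(frames):
        busy += GPU_FRAME    # time the GPU spends rendering
        t += GPU_FRAME
        if blocking:
            # Broken/double-buffered vsync: with no free back buffer,
            # the GPU idles until the next vblank before starting again.
            t = (t // REFRESH + 1) * REFRESH
    return busy / t

print(f"blocking vsync:   {utilization(True):.0%}")   # ~84%
print(f"free-running GPU: {utilization(False):.0%}")  # 100%

A properly triple-buffered swap chain lets the GPU start on the next frame immediately, which is exactly why forcing vsync through an external utility 'fixes' it - nothing about the driver changed.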

While still on the topic of vsync, he further compounds his misunderstanding by going on to say that the performance uplift you see with vsync off is what VRR is designed to fix - not fluctuating framerates that fall below a vsync cap. Sorry, but this is absolute nonsense. You should enable vsync with VRR displays. I realize this is a long-debated topic, but it's been covered for a while now - VRR will not always eliminate tearing, only vsync can, and vsync does not add latency while VRR is active within its range.

The rest of the video is him congratulating himself on how he takes the time to do deep analyses and get things right instead of rushing out of the gate to be first. Um... ok.
 
It's probably the #1 game right now that shows the PS5's rasterization power in the best light - you basically need a 2080 Ti to match its performance at 4K, especially with the latest patches, which have increased resolution-mode performance on the PS5 further since this video was probably made.

Goes to show, there are games where you can match the PS5 with much lesser GPUs as well. It all depends on what the games are doing. That's why one shouldn't use just one or a handful of games for comparisons, but as many as possible.
And then we're much closer to a 2070/S (ballpark) with no RT, or a 5700 XT/RX 6600.
 
BTW, not sure if it was ever brought up in other threads, but NXGamer recently did a comparison of the PS5 release of Death Stranding: Director's Cut and the original on PC (https://www.youtube.com/watch?v=GpfXfB4WdM0).
... so it's not a huge oversight imo.
Whoops, messed up the link:


Just another example that popped up, btw, regarding the differences in clarity between the Director's Cut and the original - all three shots are running at native 4K here - but look at the significant difference in the beard and BT goo detail on Sam's face on the PS5 vs. the 2070/5600. This has nothing to do with DLSS; they just improved the TAA + contrast in the DC. Without YT compression the difference is even more stark, going by my experience jumping back and forth between PC and PS5 on my systems.

[Image: byx9NvQ.jpg]
 
The rest of the video is him congratulating himself on how he takes the time to do deep analyses and get things right instead of rushing out of the gate to be first. Um... ok.

That's more than likely a shot at Alex and DF, at whom he seems to have taken thinly veiled shots in the past as well. I haven't watched it personally to confirm, but folks on NeoGAF were pointing out how, after Alex's video about Halo Infinite's issues following last year's showing of the campaign, NXGamer put out a video soon after basically saying the exact opposite.
 
Goes to show, there are games where you can match the PS5 with much lesser GPUs as well. It all depends on what the games are doing. That's why one shouldn't use just one or a handful of games for comparisons, but as many as possible.
And then we're much closer to a 2070/S (ballpark) with no RT, or a 5700 XT/RX 6600.

The PS5 does indeed perform very well in the Director's Cut; however, I'd speculate that in this instance the engine itself has received additional performance optimisation for that release. We know the graphics have been improved in some respects, so performance optimisations certainly don't seem unrealistic. Hopefully we'll find out at some point, if it releases on PC as Alex has suggested it will.
 
The guy praises the PS5's "wonderful upscaling" (his words, not mine) from 1080p to 4K, but never mentions the more competent DLSS options available on PC!

Correction: available on SOME PCs... Things like DLSS should never be spoken about and compared as if they're a standard feature enjoyed by every single PC gamer - they're not. But the upscaling the PS5 enjoys is on ALL PS5s.

It's the same when DF constantly praise VRR on the Series consoles as a major game-changer over the PS5, when the user base with VRR-enabled TVs is so small it's barely worth a mention, and definitely not worth all the time they spend talking about it.

"Series-X is performing worse then PS5 but it doesn't matter as Series-X has VVR"....... that's great DF..... so what about the other 95% of Series owners that don't have a VRR TV?

But what should NXG do? Compare a feature that (however good it is) is implemented in a very small number of games and available on a very small share of gaming PCs? Or compare a common, middle-ground feature set on PC and thereby relate to a lot more users than just those with RTX GPUs?
 
But it's not available in a very small number of games. We get games with DLSS almost every week of every month. At what point does the number of games stop being an issue? Just last month you had FIST, Alan Wake, the Crysis Trilogy and Guardians of the Galaxy, all with DLSS. This month you have Call of Duty, the GTA Trilogy, Battlefield, Jurassic World and Bright Memory. DLSS and "not enough games" is 2019 talk. 20% of Steam has RT cards from Nvidia - tens of millions of people. Are we pretending they're a small number that don't matter too?
 
But it's not available in a very small number of games. We get games with DLSS almost every week of every month. At what point does the number of games stop being an issue? Just last month you had FIST, Alan Wake, the Crysis Trilogy and Guardians of the Galaxy, all with DLSS. This month you have Call of Duty, the GTA Trilogy, Battlefield, Jurassic World and Bright Memory. DLSS and "not enough games" is 2019 talk. 20% of Steam has RT cards from Nvidia - tens of millions of people. Are we pretending they're a small number that don't matter too?
He didn't say 'no DLSS games'. davis.anthony points out that it's a hardware-specific feature in a minority of titles (although he did say a small number, where '120' is relative, and should have said 'small fraction'). If people want to make meaningful comparisons, they should stick to percentages. What fraction of PC users can use DLSS, and what fraction of games support it? Are we going to limit the comparison to AAA titles or to full libraries? Is that fraction then representative enough to be a focal point of comparisons? If it's 95% of PC users and 95% of games, sure. If it's 0.1% of PC users and 0.001% of games, probably not worth mentioning. davis.anthony's point is more that focusing on niche features makes little sense until they become mainstream, so there shouldn't be criticism of a comparison that focuses on a median level, say.

That is, there are many, many different comparisons that can be made. That a person doesn't make the comparison you think is worth making doesn't necessarily make them wrong. For them to be wrong, you'd have to know what comparison they set out to make, and then consider whether they were effective at it or not.

24m+ PCs… more than PS5 users.
This is statistics wrangling. Yes, there are more RTX PCs than PS5s, but is the video made exclusively for RTX owners? If it's made for 'all PC users', then why focus on a feature that only 10% of the PC video viewership can use?
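To put toy numbers on it (every figure below is an assumption, purely to show how the same data reads as an absolute count vs. a fraction of the audience):

steam_users = 120_000_000   # assumed active Steam users
rtx_share = 0.20            # assumed share on RTX-class GPUs (per the post above)
ps5_base = 13_000_000       # assumed PS5 install base at the time

rtx_users = int(steam_users * rtx_share)
print(f"RTX-capable PCs: {rtx_users:,}")    # 24,000,000 - more than the PS5 base
print(f"PS5 install base: {ps5_base:,}")
print(f"...yet only {rtx_share:.0%} of the PC audience such a video addresses")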

Can we please move on from the age-old selective-number arguments and leave them to the politicians? We all, I think, know how to use different numbers to make different comparisons. We know the differences between 'difference', 'ratio' and 'fraction', between 'mean' and 'median' averages. Everyone needs to refine their arguments and align the discussion into comparable points and counterpoints.
 
Correction: available on SOME PCs... Things like DLSS should never be spoken about and compared as if they're a standard feature enjoyed by every single PC gamer - they're not. But the upscaling the PS5 enjoys is on ALL PS5s.

It's the same when DF constantly praise VRR on the Series consoles as a major game-changer over the PS5, when the user base with VRR-enabled TVs is so small it's barely worth a mention, and definitely not worth all the time they spend talking about it.

"Series X is performing worse than PS5, but it doesn't matter as Series X has VRR"... that's great, DF... so what about the other 95% of Series owners who don't have a VRR TV?

But what should NXG do? Compare a feature that (however good it is) is implemented in a very small number of games and available on a very small share of gaming PCs? Or compare a common, middle-ground feature set on PC and thereby relate to a lot more users than just those with RTX GPUs?

Considering that almost all good TVs from LG, Samsung, TCL, Hisense, Vizio, Panasonic and many other makers have VRR, is it really "barely worth" a mention when those sets make up the vast majority of TVs sold worldwide? And LG and Samsung started putting VRR in their sets around the time MS announced VRR coming to the Xbox One consoles.

Sony is one of the few (and the only major TV brand) that hadn't had VRR in their sets prior to this year. And they make up only about 7% of the market (Largest TV Manufacturers by Market Share Worldwide 2020 | Best TV Brands | Global TV Market - Technavio).

So yeah, if you have an older TV or a Sony TV, you might not have VRR. If you have a decent Samsung or LG TV, or any other Korean-made TV from the past 5 years (together more than 1/3 of the worldwide TV market), then you probably have a VRR-capable TV. Pretty much every Chinese manufacturer started including VRR soon after the Korean ones.

If anything, console gaming households without a VRR capable TV are likely in the minority.

Regards,
SB
 