Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

To me, comparisons with Nvidia GPUs are fundamentally wrong and nonsensical: they are two completely different architectures, and game engines behave differently depending on the GPU architecture. It's not a surprise or impressive; it's just that this game likes RDNA more than Turing, and the reverse may also happen often.
If he wants to compare the PS5 with a desktop GPU, then he should buy a 5700 XT or a 6700 XT.

Also, he has two PCs? One with a Zen 1 2700 and the other with a Zen 2 3600? Why doesn't he use the same PC and simply swap cards?
 
I only skipped to the end because I was curious about his conclusions, and he says that in this particular title the PS5 behaves at around a 3070 level, even though he doesn't have a 3070. He compares the DE edition on PS5 to the regular edition on PC. He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some OC to the card, even though it's impossible to gain the 18% lead the actual 2070 Super has over the vanilla 2070. He then says "just imagine what it will be in a year's time". Is the hardware in the consoles not fixed? Will it be at 3080 levels in a year, or what exactly is he trying to say? Absolutely nothing will change in a year; the hardware will be exactly what it is now. He also claims that because the hardware in the consoles is AMD, games are going to benefit that particular architecture more, even though this is false right now and was false for the entire PS4 generation, which was also AMD.

He managed to put all these wrong claims in about a minute of video

I watched it and he didn't claim that an overclocked 2070 is a 2070 Super. The PS5 is measurably performing substantially better in a like-for-like scenario.
 

The comparison with a 3070 shouldn't be surprising (the PS5 often performs ~50% better than his 2070 at 2 GHz when both are dropping frames, so we can actually see the real gap), and he shows the game is far from being CPU limited at 60 fps (on PC with his 2700 clocked at 3.8 GHz).

As I showed previously, the PS5 performs almost abnormally well in this game compared to both the PS4 (8x better) and the Pro (4x better, with much better water), when many other multiplatform games perform only ~2.5x better than the Pro. And the game on those old-gen consoles was already performing very well (well, it's Decima); it's still one of the most technically impressive games on those machines.
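
For rough context on why those multipliers stand out, here's a quick paper-spec sketch (peak FP32 TFLOPS only, from the publicly quoted CU counts and clocks; real games scale on bandwidth, CPU, API and more, so treat the ratios as ballpark context, not evidence):

```python
# Peak FP32 throughput from public CU counts and clocks -- ballpark context
# for the 8x / 4x figures above, nothing more.

def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000.0  # 64 FP32 lanes per CU, 2 FLOPs/clock (FMA)

ps4 = tflops(18, 0.800)   # base PS4
pro = tflops(36, 0.911)   # PS4 Pro
ps5 = tflops(36, 2.230)   # PS5 (up to, variable clock)

print(f"PS4 {ps4:.2f} TF, Pro {pro:.2f} TF, PS5 {ps5:.2f} TF")
print(f"PS5 vs PS4 on paper: ~{ps5 / ps4:.1f}x (observed here: ~8x)")
print(f"PS5 vs Pro on paper: ~{ps5 / pro:.1f}x (observed here: ~4x)")
```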

The only game that performs better than this (compared to PS4 and Pro) is The Touryst, and we actually know from the developers that their game uses PS5-native APIs (but that game is heavily pixel-bound, so that's another story).

Interestingly, he shows that allowing screen tearing on PC with VSync off unsurprisingly improves the framerate (he obviously did his fair comparisons with VSync on), and that Nvidia DLSS has plenty of problems in this game compared to a native image, mostly heavy ghosting on moving objects and blurry textures.
 
I watched it and he didn't claim that an overclocked 2070 is a 2070 Super. The PS5 is measurably performing substantially better in a like-for-like scenario.

Surprising then. I said it in reference to the way he outlined in the video that the 2070 is OC'd. As for claiming his vanilla 2070 behaves like a 2070 Super because it's overclocked, he says so in his Crysis Remastered video and in the Cyberpunk video, just off the top of my head; many others, I'm sure, since it seems this was discussed here before. NXGamer likes to post on NeoGAF, and people are always questioning his claims in every video he does, including this one: that he has a 2070 and keeps calling it a 2070S. The fashion in which he makes videos is not lost on anyone. I remember relatively recently he made, I think, four different videos about Dirt 5, which no one cares about, because he found the PS5 better in some regards. And when a big release like Hitman 3 came out, which has higher resolution on Xbox and better quality settings, he didn't make a single video. People were asking him why; it seems he didn't find the time. But he did find time to make four videos on Dirt 5.
 
He compares the DE edition on PS5 to the regular edition on PC. He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some OC to the card, even though it's impossible to gain the 18% lead the actual 2070 Super has over the vanilla 2070. He then says "just imagine what it will be in a year's time". Is the hardware in the consoles not fixed? Will it be at 3080 levels in a year, or what exactly is he trying to say? Absolutely nothing will change in a year; the hardware will be exactly what it is now. He also claims that because the hardware in the consoles is AMD, games are going to benefit that particular architecture more, even though this is false right now and was false for the entire PS4 generation, which was also AMD.

He managed to put all these wrong claims in about a minute of video
No offence mate, but I think you're the one making the wrong claims.
e.g. I listened a bit; he said his overclocked 2070 was near a 2070 Super, not that it made it a Super.
Then afterwards he was talking about how, later on in the generation, PS5 titles will perform better relative to the same PC hardware. This happened with the PS1, the PS2/Xbox, the PS3/X360 and yes, with the PS4/XBone. Now perhaps with the PS5/Xbox Series the same won't hold true, but I wouldn't bet on it :)

Though to me it looks like it's just not well optimized for the PC; there's no way the PS5 should be that much quicker.
 
To me, comparisons with Nvidia GPUs are fundamentally wrong and nonsensical: they are two completely different architectures, and game engines behave differently depending on the GPU architecture. It's not a surprise or impressive; it's just that this game likes RDNA more than Turing, and the reverse may also happen often.
If he wants to compare the PS5 with a desktop GPU, then he should buy a 5700 XT or a 6700 XT.

Also, he has two PCs? One with a Zen 1 2700 and the other with a Zen 2 3600? Why doesn't he use the same PC and simply swap cards?

He should indeed use a 5700 XT, or perhaps a 6700 non-XT. Some games favor AMD hardware, others NV. In general though, in line with DF's findings, the PS5 is closest to an RTX 2070 (ballpark, sometimes a bit above). It's no more than that.
 
Naturally any video from NX Gamer with PS5 vs PC in the title is going to have a foregone conclusion, and I'm only halfway through at present, but the first point that immediately jumps out at me is how he's using the ghosting issue to downplay DLSS, even though that can be resolved by using a newer version of the DLSS DLL. I haven't got to his claims about his slightly OC'd 2070 being a 2070 Super yet, but that claim has already been thoroughly pulled apart on this thread.

EDIT: So I'm getting a bit further into the video, and I'm sorry, but his DLSS analysis is complete trash. Take 8:30 for example, where DLSS obviously looks better when not in motion, but then in motion it loses some ultra-fine detail that's only visible at 5x zoom. NXG uses this as justification for stating that 4K looks better, while the 4K image is clearly shimmering madly in comparison - something that would absolutely not require 5x zoom to see. He then compares that "4K advantage" to a far more obvious and significant blurring of the foreground grass in 4K, which DLSS completely resolves, and makes out as if this is only a "partial" redemption in DLSS's favour.
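
As an aside, the DLL fix I mentioned really is just a file swap: drop a newer nvngx_dlss.dll into the game's install folder in place of the shipped one. A rough sketch of the idea, with placeholder paths (not the actual install locations) and a backup so you can roll back:

```python
# Rough sketch of the DLSS DLL swap. The paths are placeholders for your own
# install/download locations; keep a backup of the shipped DLL for rollback.
from pathlib import Path
import shutil

game_dir = Path(r"C:\Games\Death Stranding")      # placeholder install path
newer_dll = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you sourced

target = game_dir / "nvngx_dlss.dll"
backup = game_dir / "nvngx_dlss.dll.bak"

if not backup.exists():
    shutil.copy2(target, backup)   # preserve the shipped DLL
shutil.copy2(newer_dll, target)    # replace it under the name the game expects
print(f"Replaced {target}; original backed up to {backup}")
```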

Quite apart from his conclusions about DLSS (which I don't fully agree with for the reasons you pointed out), I have issues with his conclusions about what this one game means for PS5 vs PC, or DLSS vs native.

DLSS is constantly improving, and RT potency is a huge part of the RTX series' appeal. Even old Turing has far greater resources to tap into in terms of ML upscaling and sheer RT power. So even as PS5 optimisation improves over the generation, it's possible that gains from DLSS improvements and heavier RT workloads will outpace the relative performance improvements that PS5 optimisation can deliver.

NXG has taken one game, with a now-outdated implementation of DLSS and no ray tracing, used it to compare against an entirely different architecture on PC, and from there gone on to talk about how this is evidence that the PS5 will increasingly perform better than PC. There are many caveats to that, and many ways in which it might very well not play out across the board.

But pulling unsupported conclusions (which he then presents as fact) from his analyses is something he now has form for: Control loading stutters being due to XSX not having PS5 IO; Control reflections being better on PS5 when it was actually a bug on PS5; HZD PC stutters on his PC being due to PC not having Playstation unified memory. I think there may be a pattern beginning to present itself.
 
But pulling unsupported conclusions (which he then presents as fact) from his analyses is something he now has form for: Control loading stutters being due to XSX not having PS5 IO; Control reflections being better on PS5 when it was actually a bug on PS5; HZD PC stutters on his PC being due to PC not having Playstation unified memory. I think there may be a pattern beginning to present itself.

He's doing the same things Durante chastised him for, four years ago: misidentifying effects, inventing narratives based on his erroneous results, making claims that are completely unverifiable by merely looking at the output of a game. He's doing the same things he did at least half a decade ago. As long as he can claim at the end of the video that whatever Sony console exists at the time is the best in the world, it's OK.
 
As I showed previously, the PS5 performs almost abnormally well in this game compared to both the PS4 (8x better) and the Pro (4x better, with much better water), when many other multiplatform games perform only ~2.5x better than the Pro. And the game on those old-gen consoles was already performing very well (well, it's Decima); it's still one of the most technically impressive games on those machines.

This suggests there may be changes to the underlying engine which improve performance on top of the difference in hardware capability.

and that Nvidia DLSS has plenty of problems in this game compared to a native image, mostly heavy ghosting on moving objects and blurry textures.

As noted above, the ghosting is eliminated by simply using a newer version of the DLSS DLL. As for the other "problems", it's obvious even from NXG's own footage, which flies in the face of his analysis, that despite some pros and cons DLSS is a clear net win over native in this game. There's also no degradation in texture detail at DLSS quality mode; at one point he specifically says so himself, then later contradicts himself by saying that there is, while simultaneously showing footage that shows there isn't.

Compare that to the following video where the commentary actually matches what's on screen and the benefits of DLSS in this title at least are clearly obvious:

 
All those criticisms of NXGamer may be right (DLSS will also improve, and some of his previous explanations have been wrong; then again, do others always get things right when they speculate about a game?), but still, in this game the PS5 performs >4x the Pro and ~1.5x the 2070 at 2 GHz, which should be the main event to discuss here.

I am more impressed by how it outperforms the PS4 and Pro, considering notably the bandwidth of the PS5 compared to the PS4 and Pro. They must have really well-optimized bandwidth accesses (or is it that plus specific PS5 architecture: cache scrubbers, caches clocked at 2.23 GHz and whatnot?).

But we also saw even more impressive stuff with The Touryst (respectively 16x / 8x better performance than PS4 and Pro, which is insane; I even know the game can actually drop under 60 fps on PS4 at 1080p, and resolution can also drop down to 1440p on Pro, while it's AFAIK locked at 8K 60 fps on PS5).
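
For what it's worth, that 16x figure falls straight out of output pixels times frame rate; a quick sketch (ballpark only, since it ignores settings differences and the PS4 version's frame-rate dips):

```python
# How a "16x better performance" multiplier like the Touryst figure above can
# be derived from output resolution and frame rate alone.

def pixel_rate(width: int, height: int, fps: float) -> float:
    """Output pixels per second at a given resolution and frame rate."""
    return width * height * fps

ps5 = pixel_rate(7680, 4320, 60)  # reported 8K 60 on PS5
ps4 = pixel_rate(1920, 1080, 60)  # 1080p 60 on base PS4

print(f"PS5 vs PS4 pixel-rate multiplier: ~{ps5 / ps4:.0f}x")  # -> ~16x
```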
 
He's doing the same things Durante chastised him for, four years ago: misidentifying effects, inventing narratives based on his erroneous results, making claims that are completely unverifiable by merely looking at the output of a game. He's doing the same things he did at least half a decade ago. As long as he can claim at the end of the video that whatever Sony console exists at the time is the best in the world, it's OK.
Yeah. I mean, I don't like to dogpile on him, but I agree with many things said about NXGamer in this thread. I've said it before that he (perhaps unintentionally) misrepresents things, such as specific hardware or effect implementations, which I don't agree with at all, and he has a very apparent PlayStation bias, by my own estimation. He is only human, and perhaps he's a bit too close to the material to see it from our viewpoint. But anyway, my biggest issue with him is how he presents his information, often as fact, which in many cases is wrong and off the mark. There's no possible way for him to be sure of many of the things he says, and yet he presents them as if he's sure. I don't believe that's right, and I think he should re-examine how he presents his info.
 
Naturally any video from NX Gamer with PS5 vs PC in the title is going to have a foregone conclusion, and I'm only halfway through at present, but the first point that immediately jumps out at me is how he's using the ghosting issue to downplay DLSS, even though that can be resolved by using a newer version of the DLSS DLL. I haven't got to his claims about his slightly OC'd 2070 being a 2070 Super yet, but that claim has already been thoroughly pulled apart on this thread.

EDIT: So I'm getting a bit further into the video, and I'm sorry, but his DLSS analysis is complete trash. Take 8:30 for example, where DLSS obviously looks better when not in motion, but then in motion it loses some ultra-fine detail that's only visible at 5x zoom. NXG uses this as justification for stating that 4K looks better, while the 4K image is clearly shimmering madly in comparison - something that would absolutely not require 5x zoom to see. He then compares that "4K advantage" to a far more obvious and significant blurring of the foreground grass in 4K, which DLSS completely resolves, and makes out as if this is only a "partial" redemption in DLSS's favour.

I don't usually comment on here as I find it more interesting to read, but the comment in bold is certainly a take. Regardless of one's thoughts on NX Gamer, there are certain indisputable truths regarding DLSS, and trying to use your personal bias to hand-wave these truths away is poor form. DLSS often introduces visual inconsistencies into games when used. From Cyberpunk to Modern Warfare, Death Stranding and beyond, DLSS's image-degradation properties know no bounds. One of the most prominent of these inconsistencies is ghosting and ghosting artifacts. Now, whether one can stomach the trade-offs is up to the individual, but factually speaking these inconsistencies look worse than the original image, because they introduce artifacts that are not present in the original image. Games are played in motion; they are not screenshots. I personally turn off DLSS where possible. I've sunk quite a few hours into Death Stranding and I turn off DLSS on my 3080 because I think it's awful. Ghosting to me looks significantly worse than an imperceptible blur, and even on a 27-inch monitor it's quite noticeable. DLSS ghosting is a poor trade-off for whatever perceived performance gains one might get, and the fact that it's paraded as a savior of rendering is, frankly, disgusting to me. It sounds and reads like marketing speak.

Does DLSS offer advantages? Yes, it does. It reduces the rendering load and enables GPUs to gain additional performance at the expense of image quality. It's even great for ray tracing, making it feasible on GPUs where it otherwise should not be. However, if we could stop pretending that it's anything other than a rendering crutch for under-powered GPUs, that would be very much appreciated.
 
All those criticisms of NXGamer may be right (DLSS will also improve, and some of his previous explanations have been wrong; then again, do others always get things right when they speculate about a game?), but still, in this game the PS5 performs >4x the Pro and ~1.5x the 2070 at 2 GHz, which should be the main event to discuss here.

I am more impressed by how it outperforms the PS4 and Pro, considering notably the bandwidth of the PS5 compared to the PS4 and Pro. They must have really well-optimized bandwidth accesses (or is it that plus specific PS5 architecture: cache scrubbers, caches clocked at 2.23 GHz and whatnot?).

But we also saw even more impressive stuff with The Touryst (respectively 16x / 8x better performance than PS4 and Pro, which is insane; I even know the game can actually drop under 60 fps on PS4 at 1080p, and resolution can also drop down to 1440p on Pro, while it's AFAIK locked at 8K 60 fps on PS5).
I believe they've done some good engine optimizations between the original release and this new one. Probably optimizations made for Horizon 2 have worked their way through, and perhaps they've been folded into the Director's Cut.

Regardless.. very nice performance out of the PS5 for sure.
 
Does DLSS offer advantages? Yes, it does. It reduces the rendering load and enables GPUs to gain additional performance at the expense of image quality. It's even great for ray tracing, making it feasible on GPUs where it otherwise should not be. However, if we could stop pretending that it's anything other than a rendering crutch for under-powered GPUs, that would be very much appreciated.

I disagree. It's not a crutch at all. It's about rendering images as efficiently as possible. It doesn't matter how powerful you could make a GPU: if you made the same GPU and added DLSS on top of it, you'd still take the GPU with DLSS for even more performance, because it simply doesn't make sense to render all the pixels every frame anymore.
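
To put rough numbers on that, here's a quick sketch using the commonly cited DLSS 2.x per-axis scale factors (approximations, and the reconstruction pass itself has a cost, so the real-world saving is smaller than the raw pixel counts suggest):

```python
# Back-of-the-envelope look at the rendering-load reduction being discussed.
# Per-axis scale factors are the commonly cited DLSS 2.x presets; treat them
# as approximations rather than spec.

OUTPUT_W, OUTPUT_H = 3840, 2160  # 4K output
PRESETS = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

native_pixels = OUTPUT_W * OUTPUT_H
for name, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    saved = 1 - (w * h) / native_pixels
    print(f"{name:>17}: renders {w}x{h} internally "
          f"(~{saved:.0%} fewer pixels shaded than native 4K)")
```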
 
I disagree. It's not a crutch at all. It's about rendering images as efficiently as possible. It doesn't matter how powerful you could make a GPU: if you made the same GPU and added DLSS on top of it, you'd still take the GPU with DLSS for even more performance, because it simply doesn't make sense to render all the pixels every frame anymore.
If DLSS were open-source software, maybe I'd be more forgiving. DLSS arose from Nvidia's need to sell a feature that their GPUs could not deliver: real-time ray tracing. Two generations on from its inception, their GPUs still cannot deliver on the promises they made a few years ago. The only way they're able to come close is by using DLSS as a crutch. When you combine DLSS and ray tracing, its image degradation is more pronounced. Efficient rendering is delivering the same image at a fraction of the cost. DLSS almost never delivers the same image; it delivers an approximation of the image, often filled with artifacts.

Crutch: “a source or means of support or assistance that is relied on heavily or excessively”

DLSS is a crutch.
 
I disagree. It's not a crutch at all. It's about rendering images as efficiently as possible. It doesn't matter how powerful you could make a GPU: if you made the same GPU and added DLSS on top of it, you'd still take the GPU with DLSS for even more performance, because it simply doesn't make sense to render all the pixels every frame anymore.

I understood that NXG was trying to make the same point. He also said that CBR is another method with a similar output (but worse when at 400% zoom), but that they're broadly close. I think this is a totally fair comment to make.

I don't think he's stated anything outrageous (except maybe the 3070 comment). I enjoyed watching a video that made a like-for-like comparison.

He did prove that there is a breakdown in the quality of DLSS in motion, which some of us figured was the case. We also know that DLSS increases image quality in some aspects of the image and degrades it in others; this has also been proven by a thread on this forum. That's not to take anything away from it, the tech is amazing. It's just good to see matched settings in one of these videos; some of us have been crying out for that for a long time now.
 
I don't usually comment on here as I find it more interesting to read, but the comment in bold is certainly a take. Regardless of one's thoughts on NX Gamer, there are certain indisputable truths regarding DLSS, and trying to use your personal bias to hand-wave these truths away is poor form. DLSS often introduces visual inconsistencies into games when used. From Cyberpunk to Modern Warfare, Death Stranding and beyond, DLSS's image-degradation properties know no bounds. One of the most prominent of these inconsistencies is ghosting and ghosting artifacts. Now, whether one can stomach the trade-offs is up to the individual, but factually speaking these inconsistencies look worse than the original image, because they introduce artifacts that are not present in the original image. Games are played in motion; they are not screenshots. I personally turn off DLSS where possible. I've sunk quite a few hours into Death Stranding and I turn off DLSS on my 3080 because I think it's awful. Ghosting to me looks significantly worse than an imperceptible blur, and even on a 27-inch monitor it's quite noticeable. DLSS ghosting is a poor trade-off for whatever perceived performance gains one might get, and the fact that it's paraded as a savior of rendering is, frankly, disgusting to me. It sounds and reads like marketing speak.

Does DLSS offer advantages? Yes, it does. It reduces the rendering load and enables GPUs to gain additional performance at the expense of image quality. It's even great for ray tracing, making it feasible on GPUs where it otherwise should not be. However, if we could stop pretending that it's anything other than a rendering crutch for under-powered GPUs, that would be very much appreciated.

Similarly checkerboard rendering and all other temporal rendering techniques are crutches for under-powered consoles and in some cases GPUs. :p

Or, if you accept the realities of modern rendering combined with the silicon wall we're rapidly approaching, they're one of the few ways forward to see dramatic increases in 3D rendering quality: temporal rendering techniques, or advanced upscaling techniques such as DLSS (which is basically similar to temporal rendering techniques in what it attempts to achieve).
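
If it helps, here's a toy sketch of the temporal amortization all of these techniques share at their core: blend cheap, noisy per-frame estimates into a history buffer instead of paying for a fully converged image every frame. It's a conceptual illustration only, not any engine's (or DLSS's) actual implementation, which adds jittered sample offsets, motion reprojection and history rejection on top:

```python
# Toy illustration of temporal accumulation: each frame contributes a cheap,
# noisy estimate, and exponentially blending it into a history buffer
# converges toward the "fully rendered" reference over time.
import numpy as np

rng = np.random.default_rng(0)
reference = rng.random((64, 64))   # stand-in for a fully converged image

history = np.zeros_like(reference)
alpha = 0.1                        # weight given to each new frame
for _ in range(32):
    noisy_frame = reference + 0.2 * rng.standard_normal(reference.shape)
    history = (1 - alpha) * history + alpha * noisy_frame  # exponential blend

one_frame_err = np.abs(0.2 * rng.standard_normal(reference.shape)).mean()
accumulated_err = np.abs(history - reference).mean()
print(f"single noisy frame error: {one_frame_err:.3f}")       # ~0.16
print(f"after 32 accumulated frames: {accumulated_err:.3f}")  # noticeably lower
```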

Regards,
SB
 
I don't usually comment on here as I find it more interesting to read, but the comment in bold is certainly a take. Regardless of one's thoughts on NX Gamer, there are certain indisputable truths regarding DLSS, and trying to use your personal bias to hand-wave these truths away is poor form. DLSS often introduces visual inconsistencies into games when used. From Cyberpunk to Modern Warfare, Death Stranding and beyond, DLSS's image-degradation properties know no bounds. One of the most prominent of these inconsistencies is ghosting and ghosting artifacts. Now, whether one can stomach the trade-offs is up to the individual, but factually speaking these inconsistencies look worse than the original image, because they introduce artifacts that are not present in the original image. Games are played in motion; they are not screenshots. I personally turn off DLSS where possible. I've sunk quite a few hours into Death Stranding and I turn off DLSS on my 3080 because I think it's awful. Ghosting to me looks significantly worse than an imperceptible blur, and even on a 27-inch monitor it's quite noticeable. DLSS ghosting is a poor trade-off for whatever perceived performance gains one might get, and the fact that it's paraded as a savior of rendering is, frankly, disgusting to me. It sounds and reads like marketing speak.

Does DLSS offer advantages? Yes, it does. It reduces the rendering load and enables GPUs to gain additional performance at the expense of image quality. It's even great for ray tracing, making it feasible on GPUs where it otherwise should not be. However, if we could stop pretending that it's anything other than a rendering crutch for under-powered GPUs, that would be very much appreciated.

As noted a couple of times already, the ghosting can be effectively eliminated in this game by simply using a newer DLSS DLL. If it couldn't, then I might agree with you, but it can. So, putting that aside, I don't disagree that DLSS has some unambiguous quality compromises vs native. But it also has some unambiguous quality improvements over native, the primary one for me, amply demonstrated in NXG's own video, being a more stable image; instability is by far the worst side effect of low resolution/poor AA in my opinion.

So the question isn't whether DLSS can match native point for point, but rather which set of compromises results in a better overall image. And once the ghosting issue is resolved, the answer to that question is DLSS, imo. Others may feel differently, but the problem here is that NXG is presenting his opinion that DLSS is worse than console native as a fairly unambiguous fact, helped along by the fact that he's ignored the now widely known fix for DLSS ghosting.
 
Similarly checkerboard rendering and all other temporal rendering techniques are crutches for under-powered consoles and in some cases GPUs. :p

Or, if you accept the realities of modern rendering combined with the silicon wall we're rapidly approaching, they're one of the few ways forward to see dramatic increases in 3D rendering quality: temporal rendering techniques, or advanced upscaling techniques such as DLSS (which is basically similar to temporal rendering techniques).

Regards,
SB
Your post assumes that I'm here to console-war. I'm not. Yes, those techniques are crutches as well, and you'll never find me saying otherwise. They exist to aid underpowered GPUs.
 