Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

I only skipped to the end because I was curious about his conclusions, and he says that in this particular title the PS5 performs at around a 3070 level, even though he doesn't have a 3070. He compares the DE edition on PS5 to the regular edition on PC. He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some overclock to the card, even though an overclock alone can't realistically produce the roughly 18% lead the actual 2070 Super has over the vanilla 2070 (the paper numbers are sketched below). He then says "just imagine what it will be in a year's time". Is the hardware in the consoles not fixed? Will it be at 3080 levels in a year, or what exactly is he trying to say? Absolutely nothing will change in a year; the hardware will be exactly what it is now. He also claims that because the hardware in the consoles is AMD, games are going to benefit that particular architecture more, even though this is false right now and was false for the entire PS4 generation, which was also AMD.

He managed to pack all these wrong claims into about a minute of video.
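For what it's worth, a minimal back-of-envelope sketch of the 2070 vs 2070 Super gap, using the commonly quoted reference paper specs (treat the clocks as approximations; real cards boost higher and memory bandwidth is identical between the two, so this only covers the shader-throughput side):

```python
# Reference paper specs (approximate): the 2070 Super has ~11% more CUDA
# cores *and* a higher reference boost clock than the vanilla 2070, so a
# core overclock has to cover both gaps at once.
cores_2070,  boost_2070  = 2304, 1.620   # RTX 2070, GHz
cores_2070s, boost_2070s = 2560, 1.770   # RTX 2070 Super, GHz

tf_2070  = cores_2070  * 2 * boost_2070  / 1000   # ~7.5 TFLOPS
tf_2070s = cores_2070s * 2 * boost_2070s / 1000   # ~9.1 TFLOPS
print(f"paper FP32 gap: {tf_2070s / tf_2070 - 1:.0%}")              # ~21%

# Core clock a vanilla 2070 would need to sustain to match the Super's
# paper throughput (ignoring that performance rarely scales 1:1 with clock):
print(f"clock needed: ~{boost_2070 * tf_2070s / tf_2070:.2f} GHz")  # ~1.97 GHz
```

Whether a particular overclocked 2070 gets near that in practice is a separate question; the sketch just shows where the Super's headline lead comes from.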
It absolutely benefitted AMD during the PS4 gen. AMD consistently gained ground, requiring Nvidia users to upgrade, while GCN kept chugging along nicely.
 
Has anyone pixel counted Far Cry 6 on Stadia and Luna yet? Considering the game runs at 30fps on those platforms, it'll be interesting to see how it stacks up against the 60fps console versions.
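For anyone unfamiliar with what pixel counting involves, here is a toy sketch of the arithmetic, assuming an uncompressed screenshot and a clean, near-vertical aliased edge (the function and numbers are purely illustrative; real counts are done by hand on carefully chosen edges, and reconstruction techniques like TAA upscaling or checkerboarding muddy the picture):

```python
def estimate_internal_width(steps: int, horizontal_span_px: int,
                            output_width: int = 3840) -> float:
    """Pixel-counting arithmetic for a near-vertical edge: each stair step
    corresponds to one internal pixel column, so if the edge shows `steps`
    steps while covering `horizontal_span_px` output pixels of horizontal
    travel, the internal horizontal resolution is roughly
    output_width * steps / horizontal_span_px."""
    return output_width * steps / horizontal_span_px

# e.g. 32 steps spread over 48 output pixels of horizontal travel on a 4K frame:
print(estimate_internal_width(32, 48))   # -> 2560.0, i.e. roughly a 1440p-wide image
```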
 
His other failure is concluding that the PS5 is faster than a 2070S based on one game that favors AMD GPUs. He should use the 5700XT/6600XT (the latter is faster than the PS5), which are in the same ballpark as the PS5's GPU. And the 5700XT isn't outperforming the 2070/S by all that much. So yeah, around 2070/S is the closest match for the PS5 GPU among Nvidia GPUs, if you want to make that comparison (rough paper numbers below).
In some games the 2070/S outperforms the PS5 by quite a bit, in some it doesn't.
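A minimal paper-spec comparison, using the commonly quoted shader counts and boost clocks (treat these as ballpark figures only; real game performance does not track FLOPS across different architectures):

```python
# FP32 TFLOPS = shaders * 2 ops per clock * clock (GHz) / 1000
gpus = {
    "PS5":        (2304, 2.23),    # 36 CUs, variable clock up to 2.23 GHz
    "RX 5700 XT": (2560, 1.905),   # boost clock
    "RX 6600 XT": (2048, 2.589),   # boost clock
    "RTX 2070 S": (2560, 1.770),   # boost clock
}
for name, (shaders, ghz) in gpus.items():
    print(f"{name:11s} ~{shaders * 2 * ghz / 1000:.1f} TFLOPS")
```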

It absolutely benefitted AMD during the PS4 gen. AMD consistently gained ground, requiring Nvidia users to upgrade, while GCN kept chugging along nicely.

That had more to do with Nvidia's (bad) choice of architecture.
 
That shows how important the rest of the PC paired with the GPU is.
A lot of people just say "this card is better than a PS5/XSX and costs almost the same", but the card alone will do nothing; you have to put it in a decent PC, and that'll cost at least double what these consoles do, which shows even more what bang for the buck these new-gen consoles are compared to last gen.
 
That shows how important the rest of the PC paired with the GPU is.
A lot of people just say "this card is better than a PS5/XSX and costs almost the same", but the card alone will do nothing; you have to put it in a decent PC, and that'll cost at least double what these consoles do, which shows even more what bang for the buck these new-gen consoles are compared to last gen.

I don't think that was ever the discussion here, and I didn't see anyone stating anything about cost to performance, but OK. It seems these performance discussions always end in "but price" or "but the rest of the system". It's a way to derail the technical debate between a 2070-series GPU and the PS5 GPU, as per the NXGamer video, into a discussion about component prices, mining, TSMC fabs and scalpers, because that's a large reason why prices are so high, even for the PS5, which is very hard to get.
 
It has always been part of the discussion here, and that's why older cards keep being compared to these consoles. Just as DF has done with its "budget hardware vs consoles" videos, it has always been a case of "are the consoles offering good performance for their price", because otherwise everybody knows higher-end PCs will always win by a wide margin, and it's more interesting to compare cards closer to the consoles in performance and price.
 
I honestly don’t know why people get caught up in the semantics of which card or console is faster. You cannot ever get a like for like comparison so your conclusions will always be incomplete. Different CPUs are used, different bandwidth and API constraints are in place, etc. The testing that is done by all YouTubers is fundamentally flawed because their test methodology is wrong. All you can deduce from your testing is which system runs it better. You cannot be fully certain as to why that is the case. You can make intelligent guesses but they’re still guesses until you validate that hypothesis. You cannot evaluate the individual components of each system because you cannot create like for like scenarios. People often take offence to statements made by others because it triggers their inner fanboy. So what if someone makes a controversial statement? Who actually cares? It doesn’t make their statement true nor does it make it worthy of debate.
 
I honestly don’t know why people get caught up in the semantics of which card or console is faster. You cannot ever get a like for like comparison so your conclusions will always be incomplete. Different CPUs are used, different bandwidth and API constraints are in place, etc. The testing that is done by all YouTubers is fundamentally flawed because their test methodology is wrong. All you can deduce from your testing is which system runs it better. You cannot be fully certain as to why that is the case. You can make intelligent guesses but they’re still guesses until you validate that hypothesis. You cannot evaluate the individual components of each system because you cannot create like for like scenarios. People often take offence to statements made by others because it triggers their inner fanboy. So what if someone makes a controversial statement? Who actually cares? It doesn’t make their statement true nor does it make it worthy of debate.

How dare you come here and be all rational about this stuff?!

You're supposed to deny objective data in a video because the content creator once said that X card performed a bit like Y card when overclocked. And he likes PlayStation, therefore his opinion is invalid.
 
There won't be many PC gamers dropping .dll files into games to increase the IQ.

But that's not the point. NXG implies that this is simply what DLSS users have to put up with. You either don't use DLSS or you put up with the ghosting. And that's not the case. It may be that most users don't care enough about the ghosting to bother swapping the dll. But if they do care enough (which is probably most people that watch a video like this and actually care about the conclusion), the problem is very simply resolved. I agree that he should be testing against the version that ships with the game, but since he made such a big issue of it and used it as a primary driver for claiming PS5's image quality superiority, he could have at least mentioned that the problem is easily resolved.
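For context, a minimal sketch of what "swapping the dll" actually involves: backing up the DLSS library that ships with the game and copying a newer one over it. The install path below is an assumption for illustration; check your own game folder, keep the backup so you can revert, and note that writing into Program Files may need admin rights.

```python
import shutil
from pathlib import Path

# Assumed install location -- adjust to wherever the game actually lives.
game_dir = Path(r"C:\Program Files (x86)\Steam\steamapps\common\DEATH STRANDING")
new_dll  = Path(r"C:\Downloads\nvngx_dlss.dll")           # newer DLSS runtime you downloaded

old_dll = game_dir / "nvngx_dlss.dll"                     # the copy the game shipped with
shutil.copy2(old_dll, game_dir / "nvngx_dlss.dll.bak")    # keep the original
shutil.copy2(new_dll, old_dll)                            # drop in the newer version
print("Swapped; restore the .bak copy to revert.")
```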
 
NXG implies that this is simply what DLSS users have to put up with.

It is for 99% of RTX owners playing Death Stranding.

This was not a DLSS comparison video, this was a Death Stranding comparison video.

You either don't use DLSS or you put up with the ghosting. And that's not the case.

Again, it is for 99% of RTX owners playing the game.

I agree that he should be testing against the version that ships with the game

Which is exactly what he does, you just don't like the outcome.

but since he made such a big issue of it and used it as a primary driver for claiming PS5's image quality superiority, he could have at least mentioned that the problem is easily resolved.

He might as well start talking about community-made patches to fix games, texture packs and other things too while he's at it.
 
It absolutely benefitted AMD during the PS4 gen. AMD consistently gained ground, requiring Nvidia users to upgrade, while GCN kept chugging along nicely.

This generation of consoles will age far worse compared to their contemporary Nvidia counterparts than the last generation did. Kepler was a backwards-looking architecture when it launched, designed to run the then-current generation of games as fast as possible with little thought for future games, and its resulting lack of compute capability hurt it significantly in many titles. Turing, on the other hand, was extremely forward-looking, with a full DX12U feature set two years before the consoles launched as well as RT and ML capabilities that eclipse the much newer RDNA2 architecture. RT alone will ensure Turing (and even more so Ampere) remains relevant far longer into this generation than Kepler did, while DLSS will only amplify that in supported titles (which is why some commentators spend so much effort downplaying it).
 
This generation of consoles will age far worse compared to their contemporary Nvidia counterparts than the last generation did. Kepler was a backwards-looking architecture when it launched, designed to run the then-current generation of games as fast as possible with little thought for future games, and its resulting lack of compute capability hurt it significantly in many titles. Turing, on the other hand, was extremely forward-looking, with a full DX12U feature set two years before the consoles launched as well as RT and ML capabilities that eclipse the much newer RDNA2 architecture. RT alone will ensure Turing (and even more so Ampere) remains relevant far longer into this generation than Kepler did, while DLSS will only amplify that in supported titles (which is why some commentators spend so much effort downplaying it).
The 5700XT has steadily been gaining on its Turing competitor. It's pretty unarguable that being in both consoles benefits AMD. Even in our past discussion, AMD gained some 15% in a large game-sample performance comparison over a couple of years versus what everyone claimed was a far more advanced and forward-looking architecture in Pascal.
 
The 5700XT has steadily been gaining on its Turing competitor. It's pretty unarguable that being in both consoles benefits AMD.

Not to mention there's never been an Nvidia proprietary technology that became a standard; they all die out (R.I.P. PhysX :cry:), so claiming DLSS as an advantage for the RTX series is a stretch.

Once a solution like Intel's XeSS is available and delivers good results on all hardware, DLSS will simply die out.
 
Not to mention there's never been an Nvidia proprietary technology that became a standard; they all die out (R.I.P. PhysX :cry:), so claiming DLSS as an advantage for the RTX series is a stretch.

Once a solution like Intel's XeSS is available and delivers good results on all hardware, DLSS will simply die out.
Oh, I agree with him on DLSS. Ever since 2.0 it's been fantastic. The more games it gets into, the better. I don't think anything will offer it any competition any time soon.
 
The 5700XT has steadily been gaining on its Turing competitor. It's pretty unarguable that being in both consoles benefits AMD. Even in our past discussion, AMD gained some 15% in a large game-sample performance comparison over a couple of years versus what everyone claimed was a far more advanced and forward-looking architecture in Pascal.
What happens now is entirely irrelevant, because we are still looking at cross-generation games that do not use the crucial DX12 Ultimate features such as mesh shading and Sampler Feedback. Also, some next-gen games like Avatar will use raytracing as their only lighting solution, so HW-RT is of course automatically much faster than a software path, meaning any Turing card has an instant advantage in those new games.

Once cross-gen is over, RDNA1 will age like milk. Consoles won't help it here, as those also have HW-RT and DX12 Ultimate.
 
It is for 99% of RTX owners playing Death Stranding.

No, exactly 0% of RTX owners have to put up with the ghosting in Death Stranding. 99% may choose to, but of those I imagine 98% don't give a damn. Anyone who does care can simply remove the issue.

This was not a DLSS comparison video, this was a Death Stranding comparison video.

He spent almost half the video trying to downplay the game's DLSS implementation...

Which is exactly what he does, you just don't like the outcome.

No, it's the narrative he overlays that I don't like. He spent half the video trying to justify why he didn't include DLSS in the comparison, because if he had, the conclusion would have been entirely different. Using the Digital Foundry native vs DLSS comparison video as the basis, for example, he would have concluded that a 2070 can run Death Stranding just as well as or better than the PS5 while also enjoying better image quality.

He might as well start talking about community-made patches to fix games, texture packs and other things too while he's at it.

If he's going to spend so much time complaining about a specific bug in a game and then use that to justify his conclusion that his platform of choice is the best place to play the game, then yes, the least he can do is mention that the bug is entirely resolvable via a vendor-provided fix.
 