Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

Status
Not open for further replies.
IMO I don't think it's wrong to state that the PS5 is performing at 3300/1060 Ti level for certain games, if that's what the actual data bears out.
The 1060 is RX 580 level, so last-gen XOX performance. Hard to imagine a situation where RX 5700 XT-level graphics paired with a quite capable CPU would be comparable to that level of performance, so nope, something's wrong here. Edit: you wrote 1060 Ti, so closer to 1070 level - still slower than a Vega 56, and we have the new DF Cyberpunk video where Stadia, with a GPU at this level, loses to the PS5 and XSX running in backwards compatibility mode...
 
reminds me of that old tweet, but replace the Wii U with the 1060/1070 and the PS4 with the PS5

[tweet image: Kfwtw3T.png]
 
So, since this topic is heavily discussed in both ResetEra's and NeoGAF's respective threads, although with a lot of trolling and half-knowledge, is anyone interested in discussing it here, perhaps a bit more civilly?

If you don't know what the heck I'm talking about (which I doubt :oops:) - you could say this is already dividing console and PC players even more than before:


So they made this video comparing the PS5 against a ~5-year-old PC (~GTX 1070 / Ryzen 3 3300) in certain games, and they only look at the 120 FPS modes of those games.
To cut things short, they come to the conclusion that the aforementioned PC is about comparable to a PS5.

Note that the title of the video has been changed to the current one -
"PlayStation 5 120FPS Mode vs. PC 120FPS: Benchmarks & Graphics Quality Comparison"

before, it was something like:

$500 PC vs PS5
.. they changed it because of the backlash. So only the updated title makes it clear that it's only the 120 FPS modes being compared, so there's that..

Furthermore, they seemed oblivious to the fact (or straight up ignored it) that in, I think, all of the games tested (Dirt 5, Borderlands 3 and Devil May Cry 5) the PS5 is actually running above 1080p on average (dynamic res from 900p to 1440p, with 900p seen very rarely), yet they still tested the PC at a locked 1080p, so all their fancy FPS graphs are worthless.
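For a sense of scale, here's a quick back-of-the-envelope sketch (plain Python; the 16:9 resolutions are just the endpoints quoted above - the actual dynamic-res averages aren't published) of how much the pixel load differs between a locked 1080p PC and the PS5's dynamic range:

```python
def pixels(height: int) -> int:
    """Total pixels for a 16:9 frame of the given height (integer math)."""
    width = height * 16 // 9
    return width * height

pc_locked = pixels(1080)  # 1920x1080 = 2,073,600 px
ps5_low   = pixels(900)   # 1600x900  = 1,440,000 px
ps5_high  = pixels(1440)  # 2560x1440 = 3,686,400 px

# If the PS5 spends most of its time well above 1080p, it is pushing
# substantially more pixels per frame than the locked-1080p PC.
print(f"1440p vs 1080p pixel ratio: {ps5_high / pc_locked:.2f}x")  # 1.78x
print(f"900p  vs 1080p pixel ratio: {ps5_low / pc_locked:.2f}x")   # 0.69x
```

So at the top of its dynamic window the console is rendering ~78% more pixels per frame than the PC in those graphs, which is why comparing raw FPS at mismatched resolutions tells you very little.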

There are more points to mention, but maybe we'll start arguing and approach those then ..
So what's your take on this video?
 
Oh I see - hm, strange, I actually looked for a discussion about it. Must have overlooked it. Anyway, thx for the heads up :D
 
Oh I see - hm, strange, I actually looked for a discussion about it. Must have overlooked it. Anyway, thx for the heads up :D

Yeah, like Brit says we've already discussed this video over the last couple of pages. The reason it's not generated so much conversation here (as opposed to maybe some other places) is that if you take out the platform warring, and the channel/influencer fan element and just look at the content ... it's a pretty awful video, and for a number of reasons (not just the DMC5 resolution debacle). You seem to have worked this out though. :)

As an aside, this video is - at least to me - another indicator that many of the "PC tech" YouTube channels are mostly about showing FPS graphs, with little more insight than that. It's possibly one of the reasons that so many of the "tech-lite" PC tech channels are resistant to taking raytracing seriously. They don't really know what it's doing, it doesn't look good on FPS graphs, it doesn't fit with the resolutions they assume certain cards should hit at high frame rates, and so they kick it into the long grass and give tomorrow's GPU recommendations based on yesterday's software. Anyway, that's just a bugbear of mine.

Thank goodness for Digital Foundry, who look into the frames and don't just count them. It's also great to have other "console" focused tech channels - NXGamer and VGTech both bring interesting material to the conversation.
 
This gives us a relatively like-for-like comparison to better understand what the PS5 is comparable to -- it's about a GTX 1060, 1070, and 1080, depending on the game.

As someone with a background in research, if you're getting outrageous results then 99.9% of the time it's because:

a) Your measuring equipment is flawed
b) Your measuring protocol is flawed
c) You're measuring the wrong variables


Steve may not be aware of exactly how outrageous his results are, but at the very least the results of his video, and the fact that he is apparently defending them on social media, make me lose a lot of confidence in Gamers Nexus in general.
The fact that they decided to write a sentence comparing the PS5 to >4-year-old Nvidia Pascal cards, when they definitely had RDNA1 and GCN comparison points, makes it all the more click-baity and trollish.
 
So, since this topic is heavily discussed in both ResetEra's and NeoGAF's respective threads, although with a lot of trolling and half-knowledge, is anyone interested in discussing it here, perhaps a bit more civilly?

The only thing I'll say is: how can you make a comparison video with launch games that are often rushed and far from optimal for the new hardware?

He should revisit his comparison in 2023 with a multiplatform game like COD 2023. I bet the performance will be in favor of PS5.
 
The only thing I'll say is: how can you make a comparison video with launch games that are often rushed and far from optimal for the new hardware?

The same goes for PS5/XSX comparisons....

He should revisit his comparison in 2023 with a multiplatform game like COD 2023. I bet the performance will be in favor of PS5.

I have no clue how anyone could get the idea to compare a 2016 midrange system to a console launched at almost 2021. The only thing his video proves is that hardware hasn't moved all that fast, since that ancient system can still keep up with 2021 hardware, even though we're talking cross-gen games.

He could have gone with a 5700 XT if he wanted to compare rasterization performance (which is stupid anyway, since everything's RT these days). But no, let's go with a 1060 from 2016, coupled with some low-end CPU from the same era.
 
He could have gone with a 5700 XT if he wanted to compare rasterization performance (which is stupid anyway, since everything's RT these days). But no, let's go with a 1060 from 2016, coupled with some low-end CPU from the same era.
You have to be brainwashed by Nvidia to think that everything is RT these days. This is absolutely false. Yes, RT performance is important, but we are not yet at the point where RT is the most important metric, because in the PC space the share of RT-capable cards is, I believe, not even close to 20%. The most popular GPU in the Steam survey is the 1060, and a lot of gamers own systems around that performance level. I think some people on this forum are really out of touch with reality (especially the PC master race) if they think everything will be DirectStorage soon and everything will be about RT, when the tech itself is only accessible to a small number of gamers. With current GPU prices, if things don't go back to normal soon, that 1060 will probably still be the most popular GPU through 2021. PC is not like console: you can't simply pick the RTX series as the main point of comparison, since it doesn't represent how most people will play a certain game.
 
You have to be brainwashed by Nvidia to think that everything is RT these days. This is absolutely false. Yes, RT performance is important, but we are not yet at the point where RT is the most important metric, because in the PC space the share of RT-capable cards is, I believe, not even close to 20%. The most popular GPU in the Steam survey is the 1060, and a lot of gamers own systems around that performance level. I think some people on this forum are really out of touch with reality (especially the PC master race) if they think everything will be DirectStorage soon and everything will be about RT, when the tech itself is only accessible to a small number of gamers. With current GPU prices, if things don't go back to normal soon, that 1060 will probably still be the most popular GPU through 2021. PC is not like console: you can't simply pick the RTX series as the main point of comparison, since it doesn't represent how most people will play a certain game.

Your conclusion is false. The majority of console gamers are stuck on PS4/One S, even in 2021 and beyond, which is far below even a GTX 1060. On consoles, last-generation users (the biggest share) are left without ray tracing as well; in percentage terms that's probably below 20% even.
Everything about RT? Well, just about everything launching has the option for ray tracing, even on consoles. Not many have it yet, but it's there and it's being used by those who can use it. Even older games like Control get patched for RT on consoles.
Oh yes, I can 'simply pick an RTX/6000 series GPU', since it represents what PC gamers can experience just as much as the PS5/XSX does for console gamers.
 
As the next gen has now become the now gen, I have returned to my input latency testing across a selection of titles: next-gen, high frame-rate, cross-gen, BC and more.

Have the issues from last gen been solved here as well, and how much better are those 120fps titles?

FYI
As it seems some are getting needlessly 'confused': I did test all consoles with the relevant controller, which included the new SX controller and the DualSense. So any fears of a rookie mistake can be quelled, and the tests are in fact reflective of the new hardware, as I discussed. The intro was just to show my methodology of testing.
 
The only thing I'll say is: how can you make a comparison video with launch games that are often rushed and far from optimal for the new hardware?

He should revisit his comparison in 2023 with a multiplatform game like COD 2023. I bet the performance will be in favor of PS5.

That's how I've been looking at it; couldn't the specific games he tested be unoptimized to the point where, regardless of the hardware's peak potential, they're performing equivalent to a 3300 and a 1070 Ti? But that doesn't mean typical 3P games, even later this year, will run at that equivalent level once games optimize better for the hardware.

I.e. you'll need quite a bit more than a 3300 and a 1070 Ti to run those newer games at PS5-equivalent settings, and that bar should go up pretty swiftly this year, I'd say. Same with running those games at Series X equivalents on PC.


Good to hear he addresses that head-on, but in the spirit of good testing, I guess someone else in the community can test latency on those games and setups to see if they replicate the data.
 
I only skimmed through the video, but ... it is a bit of a shame they stuck with cross-platform games and didn't run their tests using first-party exclusive games on each platform, such as Gears 5 between the One X and Series X, since that's been a title touted for lower input latency.
 
I only skimmed through the video, but ... it is a bit of a shame they stuck with cross-platform games and didn't run their tests using first-party exclusive games on each platform, such as Gears 5 between the One X and Series X, since that's been a title touted for lower input latency.

Yeah, I think he wanted to stick to the comparative analysis theme of cross-platform titles. Maybe he'll do a future video covering only first-party software input latency.
 
As someone with a background in research, if you're getting outrageous results then 99.9% of the time it's because:

a) Your measuring equipment is flawed
b) Your measuring protocol is flawed
c) You're measuring the wrong variables


Steve may not be aware of exactly how outrageous his results are, but at the very least the results of his video, and the fact that he is apparently defending them on social media, make me lose a lot of confidence in Gamers Nexus in general.
The fact that they decided to write a sentence comparing the PS5 to >4-year-old Nvidia Pascal cards, when they definitely had RDNA1 and GCN comparison points, makes it all the more click-baity and trollish.

I am, like many, curious what they'll do with their updated video on the topic - they announced they'd make a deep dive into this, adding data for a 60fps comparison, for example. Will they acknowledge some of their mistakes, or will they double down?
This would be simple - they have proven to have a method that shows, on paper, that the PS5 is comparable to a GTX 1060. Regardless of what we might think about the credibility of their testing, if they think they're right they can use their method to amplify the first result..
 
Regarding the NXGamer video, I'm surprised the PS5 does so well... again, MS have been going on about their low latency, and the DS5 packing more tech might imply a whitewash.

But maybe that’s just my poor thinking on how these things work!?
 
MS have made some big latency improvements on their end, so it makes sense they'd talk about them. Sony have also made improvements it seems, but they aren't as big because they were already in a better place to begin with. Latency results can vary so much that it's probably best not to read too much into any one game's results, as it's easy to introduce something that increases latency in a particular mode, on a particular platform.

The Dirt 5 results are interesting (it's turning out to be a rather interesting game!) because 120 fps modes are tied in NX Gamer's results, but for 60 fps mode XS is worse than 60 fps mode on ... everything. Even X1S, which is quite something. Clearly this isn't the platform, but the game. 60 fps mode is doing something to add latency over even the X1S version. And what about dat AC:V 30 fps mode on PS5? Ouch. That won't be the PS5's fault either.

I also thought it was interesting that NX Gamer says that the PS5 just about pips the Series X, when I think his results show - if anything - that it's not possible to determine in any meaningful way that either platform is inherently better for next-gen game input latency.

AC: Valhalla
60 fps mode: XSX 6 ms faster
30 fps mode: XSX 38 ms faster (u wot m8?)

Watchdogs Legion
30 fps: XSX 10 ms faster

Dirt 5
120 fps mode: draw
60 fps mode: PS5 13 ms faster (XSX laggier than X1S == PS4)


COD: Cold War
120 fps mode: PS5 4 ms faster
60 fps mode: PS5 6 ms faster

Removing the obviously game related figures from Dirt 5 60 fps mode (XSX platform is laggier than X1S == PS5 == PS4), and also the single biggest delta of all in AC: Valhalla 30 fps mode (PS5 loses an additional 32 ms dropping to 30 fps mode, basically an entire additional frame over XSX), you end up with really small differences that can go either way.

Based on this (limited) data it's not possible to say that either platform has an advantage for next gen games. Both are similarly low when you remove obvious outliers like AC:V 30 fps on PS5.
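To make the "small differences either way" point concrete, here's a tiny sketch (plain Python) tabulating the deltas quoted above; the two excluded outliers are the same ones the post singles out as game-side issues:

```python
# Deltas from NX Gamer's figures as listed above.
# Positive = XSX faster by that many ms, negative = PS5 faster.
deltas_ms = {
    ("AC: Valhalla", "60fps"): 6,
    ("AC: Valhalla", "30fps"): 38,    # outlier: PS5 drops ~a full extra frame
    ("Watch Dogs Legion", "30fps"): 10,
    ("Dirt 5", "120fps"): 0,
    ("Dirt 5", "60fps"): -13,         # outlier: XSX laggier than even X1S
    ("COD: Cold War", "120fps"): -4,
    ("COD: Cold War", "60fps"): -6,
}

# Strip the two game-side outliers and look at what's left.
outliers = {("AC: Valhalla", "30fps"), ("Dirt 5", "60fps")}
remaining = [d for k, d in deltas_ms.items() if k not in outliers]

print(remaining)                        # [6, 10, 0, -4, -6]
print(sum(remaining) / len(remaining))  # 1.2 -> a ~1 ms average gap, i.e. a wash
```

With the outliers removed, the average gap is on the order of a single millisecond, far below the run-to-run noise of this kind of camera-based testing, which supports the "no inherent advantage" reading.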
 
The input lag in those Ubisoft games (or Rocksteady games) mostly comes from the engine. But I am quite impressed by the improvement seen in COD on PS5; I didn't think they could improve things here. From 53ms to 42ms - we are getting input lag on par with NES and SNES games (those running at 60fps) on a CRT monitor (40ms). It means they have reduced the input lag of the DualSense compared to the DS4, and the DS4 was already the wireless pad with the lowest input lag.
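For context, a quick frame-time conversion of those numbers (plain Python; the 53/42/40 ms figures are just the ones quoted above, and the old/new labels follow the poster's DS4-vs-DualSense interpretation):

```python
# Convert the quoted end-to-end latencies into 60 fps frame counts.
FRAME_MS = 1000 / 60  # ~16.67 ms per frame at 60 fps

for label, lag_ms in [("COD PS5, DS4 era", 53),
                      ("COD PS5, DualSense", 42),
                      ("CRT-era baseline", 40)]:
    print(f"{label}: {lag_ms} ms = {lag_ms / FRAME_MS:.1f} frames")
```

So the drop from 53 ms to 42 ms is roughly two thirds of a frame at 60 fps, which lands the new result within a couple of milliseconds of the CRT-era figure.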
 