Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

Status
Not open for further replies.
So according to GN, screen tearing is new to the console audience. Eh?? It's been there on consoles for 20 years, probably longer.

Edit: and I'm not comfortable with the way they're comparing PC at 240 hz to PS5 outputting at 120 hz and then captured at 240 hz. As a comparison of how the PS5 and PC experiences can differ there's something to be said, but as a means of comparing the raw performance of PS5 vs PC I think it's likely to skew results to some degree in favour of PC.

A machine limited to vsync at 8.3ms intervals like the PS5 - as opposed to 4.2 ms on a 240 hz PC - is naturally likely to have lower 0.1% and 1% lows as there will be more instances of longer waits for synchronisation - particularly if you're not triple (or more) buffered all the way through the pipeline. Likewise, average fps will be boosted because there will be instances where fps can go above 120 fps.

The only fair test is to test both on 120 hz displays IMO.
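The effect described above can be sketched with some rough numbers. This is a minimal, illustrative Python simulation (not data from the video), assuming simple double-buffered vsync that rounds each frame's present time up to the next refresh interval:

```python
import math

def vsync_frame_times(render_ms, refresh_hz):
    """Round each raw render time up to the next vsync interval
    (models double-buffered vsync with no triple buffering)."""
    interval = 1000.0 / refresh_hz
    return [math.ceil(t / interval) * interval for t in render_ms]

def avg_fps(frame_ms):
    return 1000.0 / (sum(frame_ms) / len(frame_ms))

def low_1pct_fps(frame_ms):
    """fps of the slowest 1% of frames (here: the single worst frame)."""
    worst = sorted(frame_ms)[-max(1, len(frame_ms) // 100):]
    return 1000.0 / (sum(worst) / len(worst))

# Identical render times fed to both displays:
# mostly ~4 ms frames with one 9 ms spike.
render = [4.0] * 99 + [9.0]

for hz in (120, 240):
    synced = vsync_frame_times(render, hz)
    print(hz, round(avg_fps(synced), 1), round(low_1pct_fps(synced), 1))
```

With the same underlying render times, the 120 hz display caps the average below 120 fps and punishes the spike with a full 16.7 ms wait (a 60 fps low), while the 240 hz display lets the average run far above 120 fps and catches the spike at 12.5 ms (an 80 fps low). Same GPU work, better averages and better lows at 240 hz - which is the skew being described.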
 
Their results are absolutely baffling. In what world is a GTX 1060 even remotely comparable to a PS5?

It's certainly possible with a slapped-together port barely utilizing the PS5. I expect this video will be the new "go-to" for certain members here, despite how incredibly poor and flawed the comparison is.
 
It's certainly possible with a slapped-together port barely utilizing the PS5. I expect this video will be the new "go-to" for certain members here, despite how incredibly poor and flawed the comparison is.

This would be quite stupid for PS/XB console warriors to 'war about' since Dirt 5, Devil May Cry 5 and Borderlands 3 performance and IQ are virtually identical between PS5 and XBSX. And PC warriors will be... well, PC warriors.
 
This would be quite stupid for PS/XB console warriors to 'war about' since Dirt 5, Devil May Cry 5 and Borderlands 3 performance and IQ are virtually identical between PS5 and XBSX. And PC warriors will be... well, PC warriors.

The video is shitty. Dirt 5 runs between 900p and 1440p in its 120 fps mode. DMC 5's 120 fps mode without raytracing runs at 4K checkerboard. And the two games are probably far from being CPU bound in this mode.
 
The video is shitty. Dirt 5 runs between 900p and 1440p in its 120 fps mode. DMC 5's 120 fps mode without raytracing runs at 4K checkerboard. And the two games are probably far from being CPU bound in this mode.

Yeah, even in the YT video it looks to me like the PS5 is running at a higher res in the side-by-side comparison of DMC5. I didn't watch the rest of the video, but I think he's got the resolution wrong there, which would explain the bizarre performance.
 
In what world is a GTX 1060 even remotely comparable to a PS5?

The 1060 was a midrange Pascal product from 2016; it's obsolete and can be considered low end by now. He could have gone with a 5700 XT/2070 if he wanted to compare rasterization performance.
 
Woah, the 10 series? He didn't mean the 20 series? And that's not taking RT performance into account? Wonder if he has done similar comparisons between Series X and PC and come across similar, well, surprising (to me anyway, unless he actually did mean 2060, 2070 etc.) performance results. o_O

EDIT: Nope, that's definitely the GTX 10 series in the results. So... did AMD overhype even some of the rasterization performance of RDNA2 here, or am I underselling the GTX 10 series in comparison to the RTX 20 series?
The whole thing makes little sense. It has already been ascertained that both consoles are in the region of "2060S/2070S/maybe 2080 here and there" performance.
 
Bizarre results - since when is a GPU on the RX 5700 XT's level comparable to a 1070, or even a 1060? ;d
 
It didn't fit the narrative he was trying to make

Thing is, he has 1.27m subscribers and the comments seem to agree (which doesn't say anything, but ok). LTT is in the same range at times. Then you have redgamingtech etc.
The only reliable, unbiased tech channel on YT remains Digital Foundry. All the others have some sort of bias towards one of the platforms.
 
I appreciate it when people show their hand like this. Between this video and the hot GDDR6 chip video (also with faulty methodology), I know he's the kind of clickbait-hungry fanboy not worth wasting my time on.

Before we hop on Steve, let's make sure he isn't misunderstanding something, or at least give him a chance to correct any errors. This is somewhat new territory for him, analyzing console hardware performance. I'm not making any excuses for him, but let's give him a chance before the pitchforks come out.
 
Before we hop on Steve, let's make sure he isn't misunderstanding something, or at least give him a chance to correct any errors. This is somewhat new territory for him, analyzing console hardware performance. I'm not making any excuses for him, but let's give him a chance before the pitchforks come out.

He continues to defend his video on twitter.
 
He continues to defend his video on twitter.

Being primarily a PC gamer, I'm just hoping Steve will make any necessary changes when things settle a bit. Sometimes in the heat of the moment, individuals want to be always right or defend their views to the death. I'll wait a bit.
 
Being primarily a PC gamer, I'm just hoping Steve will make any necessary changes when things settle a bit. Sometimes in the heat of the moment, individuals want to be always right or defend their views to the death. I'll wait a bit.

True, people are proud.

But I really don't think the platform on which you game matters. His bias doesn't reflect on anyone who uses the same platform. It only reflects on him.

Desperate YouTubers drumming up a bit of controversy to try and garner some traffic and subscribers is nothing new. Replace "YouTubers" with "journalists" and the statement is more widely applicable across a much greater timespan.

I've just seen this sort of behaviour for too many years and from too many different people. It's played out and it bores me quicker and quicker every time it becomes apparent.

Maybe he'll try to be more objective. But the algorithm rewards engagement. Irritation/annoyance drives engagement more than any other emotion. And since he doesn't seem to be much more knowledgeable than your average PC builder/modder, I suspect that, deep down, he knows he's clutching an empty sack. So if he wants to carry on being a YouTuber, his only choice is to be a gaming hardware equivalent of a celebrity gossip "newspaper."
 
Being primarily a PC gamer, I'm just hoping Steve will make any necessary changes when things settle a bit. Sometimes in the heat of the moment, individuals want to be always right or defend their views to the death. I'll wait a bit.

This is true, and I didn't get the feeling that he intended to create such an unbalanced piece of work, but he does need to fix this. That DMC5 test is pretty indefensible. According to his DMC5 tests the PS5 is performing somewhere below an RX570 or something ridiculous like that.

As far as we can tell, he's comparing PC at 1080p to PS5 at 4K chequerboard (so probably 2x the rasterised pixels) with some reconstruction overhead on top. Then he's benching the PC at 240 hz to further increase its 0.1% and 1% lows, and push average frame rates beyond where they could possibly be on PS5 (capped with vsync at 120 hz). Any time the PC is above 120 fps this will skew its average upwards. So altogether this is a pretty horrible test:

Oh my days.png

And I also think he has come into this console vs PC thing with a lack of awareness and some pre-existing biases. For example, thinking that tearing is new to console gamers, and not seeming able to comprehend why console gamers - particularly without adaptive sync - might want capped and more stable frame rates (with less variability and judder) rather than wildly variable uncapped rates.

Perhaps he's also not aware that not all TVs that support 120 hz can do 120 hz and 4K at the same time. :-?
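The "probably 2x the rasterised pixels" estimate above is just arithmetic, assuming checkerboard rendering shades half the pixels of the target resolution each frame:

```python
# Back-of-envelope pixel counts (illustrative; assumes checkerboard
# shades half the target resolution's pixels per frame).
uhd_pixels = 3840 * 2160       # native 4K: 8,294,400 pixels
cbr_pixels = uhd_pixels // 2   # 4K checkerboard: ~4.15M shaded pixels
fhd_pixels = 1920 * 1080       # 1080p: 2,073,600 pixels

ratio = cbr_pixels / fhd_pixels
print(ratio)  # → 2.0: 4K checkerboard shades ~2x the pixels of 1080p
```

This ignores the reconstruction pass itself, which adds further cost on top of the shaded pixels, so if anything 2x understates the gap between the two workloads.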
 
This is true, and I didn't get the feeling that he intended to create such an unbalanced piece of work, but he does need to fix this. That DMC5 test is pretty indefensible. According to his DMC5 tests the PS5 is performing somewhere below an RX570 or something ridiculous like that.

As far as we can tell, he's comparing PC at 1080p to PS5 at 4K chequerboard (so probably 2x the rasterised pixels) with some reconstruction overhead on top. Then he's benching the PC at 240 hz to further increase its 0.1% and 1% lows, and push average frame rates beyond where they could possibly be on PS5 (capped with vsync at 120 hz). Any time the PC is above 120 fps this will skew its average upwards. So altogether this is a pretty horrible test:


And I also think he has come into this console vs PC thing with a lack of awareness and some pre-existing biases. For example, thinking that tearing is new to console gamers, and not seeming able to comprehend why console gamers - particularly without adaptive sync - might want capped and more stable frame rates (with less variability and judder) rather than wildly variable uncapped rates.

Perhaps he's also not aware that not all TVs can support both 120 hz and 4K at the same time. :-?

I 100% agree, I'm just hoping he makes the necessary changes soon. If not, I'll probably stay away from posting his console works.
 
The whole thing makes little sense. It has already been ascertained that both consoles are in the region of "2060S/2070S/maybe 2080 here and there" performance.

IMO I don't think it's wrong to state that the PS5 is performing at 3300/1060 Ti level for certain games, if that's what the actual data bears out. But I think it's also very fair to say that these are just early cross-gen games, and later in the year we'll start seeing plenty of PS5 3P games that simply can't run on a PC with a 3300/1060 Ti or 1070 Ti equivalent, even for rasterization performance. The same can be said for Series X, in terms of 3P game performance later in the year compared to the PC setup he tested in this video.

Would've been nice if he'd specified something to that effect in the video itself, but it's not too hard to assume that to be the case, either. So while the PS5 is performing at this tested spec in these specific games, I think it'd be foolish to assume it will always perform at only 3300/1060 Ti level for games coming out even this year; expect that equivalent PC setup to increase in requirements this year and next year as well.

Thing is, he has 1.27m subscribers and the comments seem to agree (which doesn't say anything, but ok). LTT is in the same range at times. Then you have redgamingtech etc.
The only reliable, unbiased tech channel on YT remains Digital Foundry. All the others have some sort of bias towards one of the platforms.

It's always funny to me when people call DF shills, because the way I see it they're actually probably the most unbiased, and that's because of the pseudo checks-and-balances they seem to have: Rich leans towards Xbox, John leans towards PlayStation, and Alex leans towards PC.

At least, that's how I tend to see it. So that in a way creates a balance and I never see any of them having their preferences turn into outright bias against other platforms.

Before we hop on Steve, let's make sure he isn't misunderstanding something or at least given a chance on correcting any errors. This is somewhat new territory for him on analyzing console hardware performance. I'm not making any excuses for him, but let's give him a chance before the pitchforks come out.

Personally I don't think his results are wrong, but it's probably worth stating that the PC equivalent for performance comparable to these consoles WILL go up over time, even the next few months, as 3P games are able to start optimizing better for consoles.

Meanwhile, devs really don't "optimize" for very specific CPU/GPU combos on the PC side, because of the breadth of possible combinations and because that level of abstraction usually isn't possible. Stuff like SAM may help with this in the future, but that's a wait-and-see. If he were testing RT performance, the CPU/GPU combos he used for benchmarks would've automatically been superseded, since the GTX 10 series has no hardware-level RT support; replicating it in software would decimate their rasterization performance.

This is true, and I didn't get the feeling that he intended to create such an unbalanced piece of work, but he does need to fix this. That DMC5 test is pretty indefensible. According to his DMC5 tests the PS5 is performing somewhere below an RX570 or something ridiculous like that.

As far as we can tell, he's comparing PC at 1080p to PS5 at 4K chequerboard (so probably 2x the rasterised pixels) with some reconstruction overhead on top. Then he's benching the PC at 240 hz to further increase its 0.1% and 1% lows, and push average frame rates beyond where they could possibly be on PS5 (capped with vsync at 120 hz). Any time the PC is above 120 fps this will skew its average upwards. So altogether this is a pretty horrible test:


And I also think he has come into this console vs PC thing with a lack of awareness and some pre-existing biases. For example, thinking that tearing is new to console gamers, and not seeming able to comprehend why console gamers - particularly without adaptive sync - might want capped and more stable frame rates (with less variability and judder) rather than wildly variable uncapped rates.

Perhaps he's also not aware that not all TVs that support 120 hz can do 120 hz and 4K at the same time. :-?

When you put it that way, yeah, there were some bad takes in that video xD
 