Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

He's running at higher settings than console on PC if the Digital Foundry settings comparison is to be believed.

Also the dynamic res makes a head to head performance comparison problematic as it doesn't work as well on PC as console, again according to Digital Foundry.
Dynamic res doesn't work on some cards; on others it works.
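For anyone unfamiliar, dynamic resolution scaling generally works along these lines (a simplified sketch of the general technique, not Far Cry 6's or any particular engine's actual implementation; the thresholds and step size are made up):

```python
# Simplified dynamic resolution scaler: drop the render scale when frames
# run over budget, raise it when there's comfortable headroom.
TARGET_MS = 16.6          # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0

def update_render_scale(scale, last_frame_ms):
    if last_frame_ms > TARGET_MS:          # over budget: lower resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.9:  # headroom: raise resolution
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A heavy 20 ms frame pulls a full-res scale down a step:
print(update_render_scale(1.0, 20.0))  # 0.95
```

This is why head-to-head comparisons are tricky: two platforms running the "same" mode can sit at different resolutions depending on how hard each scene pushes them.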
 
On GAF, NXGamer is adamant about his FC6 analysis. According to him, the game almost never reaches 2160p during gameplay (just as the XSS almost never reaches 1440p), and in like-for-like sections he is seeing on average 5 to 10% higher resolution on XSX. Meaning there could be sections showing bigger gaps on Xbox, but those moments wouldn't be the norm.
 
I've arrived at the big part of the map now, and walking through a big valley with flower fields there is a lot of tearing as soon as I turn the camera.
 
I've arrived at the big part of the map now, and walking through a big valley with flower fields there is a lot of tearing as soon as I turn the camera.
What system are you playing on? Also, because this hasn't really been addressed in recent DF content, if you are playing on an Xbox and you have 120hz enabled, does it force vsync on? On Xbox One X that happened when you had 120hz enabled even if the games didn't support 120hz or 120fps.
 
What system are you playing on? Also, because this hasn't really been addressed in recent DF content, if you are playing on an Xbox and you have 120hz enabled, does it force vsync on? On Xbox One X that happened when you had 120hz enabled even if the games didn't support 120hz or 120fps.
You still have a 120Hz output and lower input lag because of this, but the game is still running at 60fps if it doesn't support higher framerates.
 
What system are you playing on? Also, because this hasn't really been addressed in recent DF content, if you are playing on an Xbox and you have 120hz enabled, does it force vsync on? On Xbox One X that happened when you had 120hz enabled even if the games didn't support 120hz or 120fps.
PS5
 
You still have a 120Hz output and lower input lag because of this, but the game is still running at 60fps if it doesn't support higher framerates.
Yeah, I get that. There was just a curious side effect on Xbox systems: enabling 120hz eliminated screen tearing, sometimes at the cost of performance, because it forced vsync on.
Bummer. I mean, cool that you got a PS5; I was just curious about the vsync thing.
 
Never heard of a game forcing VSync because of 120 Hz. Don't think they force VRR either. Forcing VSync in general, yes, but not because of certain refresh rates.

Edit: or were you thinking maybe they had a 40 FPS mode where it does vsync at 120 Hz?
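For what it's worth, the 40 FPS / 120 Hz pairing falls out of simple arithmetic: with vsync, a frame is held on screen for a whole number of refresh intervals, so the reachable locked framerates are the refresh rate divided by an integer. A quick sketch:

```python
# With vsync on, each frame is displayed for an integer number of refresh
# intervals, so the achievable locked framerates are refresh_hz / n.
def vsync_framerates(refresh_hz, max_divisor=6):
    return [refresh_hz / n for n in range(1, max_divisor + 1)]

print(vsync_framerates(60))   # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]
print(vsync_framerates(120))  # [120.0, 60.0, 40.0, 30.0, 24.0, 20.0]
```

Note that 40 fps only appears in the 120 Hz list (120 / 3), which is why 40 fps quality modes require a 120 Hz output.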
 
So how does the Xbox 360 game Frontlines: Fuel of War perform through backwards compatibility on a One X or Series X in 120hz mode?
That game has a menu option to disable vsync. Do the Xbox systems override that setting?
I don't have the equipment to measure the actual framerate of the game on a 120hz display.
 
That game has a menu option to disable vsync. Do the Xbox systems override that setting?

I think this is the order of precedence: whatever settings the user has explicitly set for that particular game, such as FPS Boost or Auto HDR; then general system-level settings; then whatever the game itself has set.

I don't think there is a general system-level setting for vsync, just settings for Variable Refresh Rate, other specific refresh rates, and FreeSync.
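That precedence is easy to sketch (the layer order is from the post above; the key names and values are made up purely for illustration, not actual Xbox system settings):

```python
# Hypothetical settings resolution: per-game user overrides beat
# system-level settings, which beat the game's own defaults.
def resolve_setting(key, per_game, system, game_default):
    for layer in (per_game, system, game_default):
        if key in layer:
            return layer[key]
    return None

per_game     = {"fps_boost": True}                 # user set for this game
system       = {"refresh_hz": 120, "vrr": True}    # console-wide settings
game_default = {"vsync": False, "refresh_hz": 60}  # what the game ships with

print(resolve_setting("refresh_hz", per_game, system, game_default))  # 120
print(resolve_setting("vsync", per_game, system, game_default))       # False
```

Under that model, a game-level vsync toggle would survive unless some higher layer explicitly overrode it.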
 
Well, it might not matter anyway.
It seems the Series S can only run that game in the mid-40s on average. I doubt the Series X CPU's extra speed is enough to push above 60fps.
 
Never heard of a game forcing VSync because of 120 Hz. Don't think they force VRR either. Forcing VSync in general, yes, but not because of certain refresh rates.

Edit: or were you thinking maybe they had a 40 FPS mode where it does vsync at 120 Hz?
No, the One X usually forces vsync in games when 120hz output is active (except in Siege). It can hurt performance in some titles, but it smooths out improper frame pacing in others.
 
I only skipped to the end because I was curious about his conclusions, and he says that in this particular title the PS5 behaves at around a 3070 level, even though he doesn't have a 3070. He compares the DE edition on PS5 to the regular edition on PC. He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some OC to the card, even though it's impossible to gain the 18% lead the actual 2070 Super has over the vanilla 2070. He then says "just imagine what it will be in a year's time". Is the hardware in the consoles not fixed? Will it be at 3080 levels in a year, or what exactly is he trying to say? Absolutely nothing will change in a year; the hardware will be exactly what it is now. He also claims that because the hardware in the consoles is AMD, it's going to benefit that particular architecture more, even though this is false right now and was false for the entire PS4 generation, which was also AMD.

He managed to put all these wrong claims into about a minute of video.
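On the overclock point, some rough spec-sheet arithmetic (Nvidia's reference figures, theoretical FP32 only; real-world gaps are smaller since performance doesn't scale purely with compute, so treat this as a sanity check rather than a benchmark):

```python
# A 2070 Super has more SMs, not just higher clocks, so a core-clock OC on
# a vanilla 2070 can't replicate it. Reference specs from Nvidia's sheets.
cores_2070,  boost_2070  = 2304, 1620   # CUDA cores, boost MHz
cores_2070s, boost_2070s = 2560, 1770

def tflops(cores, mhz):
    return cores * 2 * mhz / 1e6        # FP32 TFLOPs (2 ops/clock)

gap = tflops(cores_2070s, boost_2070s) / tflops(cores_2070, boost_2070) - 1
print(f"Super's theoretical FP32 lead: {gap:.0%}")   # 21%

# Clock a vanilla 2070 would need just to match the Super's throughput:
needed = boost_2070s * cores_2070s / cores_2070
print(f"Required 2070 clock: {needed:.0f} MHz")      # 1967 MHz
```

That's roughly 350 MHz over the reference boost needed just to match the Super on paper, which is well beyond what a modest OC adds.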
 
Naturally any video from NXGamer with PS5 vs PC in the title is going to have a foregone conclusion. I'm only halfway through at present, but the first point that immediately jumps out at me is how he's using the ghosting issue to downplay DLSS, even though that can be resolved by using a newer version of the DLSS dll. I haven't got to his claims about his slightly OC'd 2070 being a 2070 Super yet, but that claim has already been thoroughly pulled apart in this thread.

EDIT: So I'm getting a bit further into the video and I'm sorry, but his DLSS analysis is complete trash. Take 8:30, for example, where DLSS looks blatantly, obviously better when not in motion, but then in motion loses some ultra-fine detail that's only visible at 5x zoom. NXG uses this as justification for stating that 4K looks better, while the 4K image is clearly shimmering madly in comparison - something that would absolutely not require 5x zoom to see. He then compares that "4K advantage" to a far more obvious and significant blurring of the foreground grass at 4K that DLSS completely resolves, and makes out as if this is only a "partial" redemption in DLSS's favour.
 
Quite interesting: worse quality on Nvidia cards at 4K. Again the PS5 is above the performance of an RTX 2070 OC (up to 1.5x in some moments). I also like how he showed the impact of vsync on framerates.
He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some OC to the card, even though it's impossible to gain the 18% lead the actual 2070 Super has over the vanilla 2070.
Are you sure you watched the Death Stranding video? All I see is "RTX 2070 OC" in the description, or even just "RTX 2070" in the performance graph.
 
I only skipped to the end because I was curious about his conclusions, and he says that in this particular title the PS5 behaves at around a 3070 level, even though he doesn't have a 3070. He compares the DE edition on PS5 to the regular edition on PC. He keeps claiming that his vanilla 2070 is a 2070 Super because he applied some OC to the card, even though it's impossible to gain the 18% lead the actual 2070 Super has over the vanilla 2070. He then says "just imagine what it will be in a year's time". Is the hardware in the consoles not fixed? Will it be at 3080 levels in a year, or what exactly is he trying to say? Absolutely nothing will change in a year; the hardware will be exactly what it is now. He also claims that because the hardware in the consoles is AMD, it's going to benefit that particular architecture more, even though this is false right now and was false for the entire PS4 generation, which was also AMD.

He managed to put all these wrong claims into about a minute of video.
Yeah, a 2070 is missing quite a few hardware units for it to keep up with a Super. It's more than just core clocks.
 