Current Generation Games Analysis Technical Discussion [2022] [XBSX|S, PS5, PC]

So, I just upgraded to a 13900K, RTX 4090, and 32GB of DDR5 5600MHz from a 2080 Ti, 9900K, and 16GB of DDR4 3000MHz. The performance uplift is pretty insane, but once again, Spider-Man is another culprit of poor performance. It runs well, extremely well even, but that's a given considering the insane specs of my new PC. With Very High settings and RT on, it sometimes struggles to break past 100fps at 3440x1440. Once again, I see my 4090 sitting at around 75% utilization. Miles Morales runs much better than the original game. Whereas I saw my performance dip to 70-ish in Spider-Man at max settings, Miles Morales never dipped below 90, and that's with RT Shadows set to Very High. It's also consistently running at 115fps+.

The settings and resolution probably aren't overly relevant there, as you're very likely CPU limited even with that monster CPU. I'll bet if you activate DLAA or DLDSR you will see little to no performance loss. Or conversely, with DLSS 2, no performance gain.

I assume you aren't running FG there either, which I expect would nearly double your FPS.
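To illustrate why, here's a minimal sketch (with purely hypothetical numbers, not measurements from that rig) of how the delivered frame rate behaves when the CPU is the limit: render-resolution changes barely move it, while frame generation still helps:

```python
# Minimal CPU-vs-GPU limit model: the delivered frame rate is roughly capped
# by whichever side takes longer per frame. All numbers below are hypothetical.

def delivered_fps(cpu_fps, gpu_fps):
    """Frame rate is limited by the slower of the two pipelines."""
    return min(cpu_fps, gpu_fps)

cpu_fps = 100        # assumed CPU-limited rate while swinging through the city
gpu_native = 130     # assumed 4090 rate at native 3440x1440
gpu_dlss = 190       # assumed rate with DLSS 2 upscaling
gpu_dldsr = 105      # assumed rate with DLAA/DLDSR (higher internal cost)

print(delivered_fps(cpu_fps, gpu_native))   # 100 -> already CPU bound
print(delivered_fps(cpu_fps, gpu_dlss))     # 100 -> DLSS 2 gains nothing
print(delivered_fps(cpu_fps, gpu_dldsr))    # 100 -> DLAA/DLDSR costs little

# Frame generation inserts interpolated frames between CPU-paced ones,
# so the displayed rate can nearly double even at the same CPU limit.
print(delivered_fps(cpu_fps, gpu_dlss) * 2)  # ~200 displayed fps
```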
 
So why quote me?
Because you were asking where the I/O coprocessors are on the Xbox Series consoles.

The first I/O coprocessor is dedicated to memory mapping. The second I/O coprocessor's job is to direct SSD I/O, bypassing traditional file I/O. That's functionality traditionally handled by components like an I/O hub, which is present in the Series X. I/O hubs tend to house IMCs, which by definition are "processors".
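For what it's worth, here's a minimal software-side sketch in Python of the two ideas being contrasted, traditional buffered file I/O versus memory mapping ("assets.bin" is a hypothetical file); the console coprocessors do this in dedicated hardware, so this only illustrates the concept, not their actual interface:

```python
# Illustrative only: contrasts explicit read() calls with memory mapping.
# "assets.bin" is a hypothetical file; this is not how the console hardware works.
import mmap

# Traditional file I/O: each read() copies data through the OS into a user buffer.
with open("assets.bin", "rb") as f:
    header = f.read(4096)

# Memory mapping: the file is mapped into the address space and pages are
# pulled in on demand, skipping the explicit read/copy path.
with open("assets.bin", "rb") as f, mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
    header = m[:4096]
```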
 
So you were replying to me then, so why did you say you weren't?

And by the look of it, your reply even contains speculation on your part, so it isn't even useful.

Jesus!

He didn't say he wasn't replying to you, he said you weren't the only person that reply was meant for. I.e., you weren't targeted; you just happened to provide a post he could respond to in order to explain something to multiple people.

Basically, some people were asking similar things and your post just happened to be a convenient one for him to reply to.

Regards,
SB
 
Just saw the IGN PC performance review for Spider-Man: Miles Morales and I gotta say it is well done.

He pointed out the bottlenecks and how it performs on his end, even though it is hard to compare when there is dynamic resolution involved (which he pointed out as well).

Gotta give credit where it's due and not just criticize.

 
Just saw the IGN PC performance review for Spider-Man: Miles Morales and I gotta say it is well done.

He pointed out the bottlenecks and how it performs on his end, even though it is hard to compare when there is dynamic resolution involved (which he pointed out as well).

Gotta give credit where it's due and not just criticize.

Interesting that Insomniac's reconstruction is so much worse on PC. Why call it the same thing when there are surely differences in the implementation?
 
Interesting that Insomniac's reconstruction is so much worse on PC. Why call it the same thing when there are surely differences in the implementation?

The 4K Fidelity mode on PS5 is native 4K and IGTI is just used for anti-aliasing duties, which is why it looks so much better when compared to the PC using DRS in the video.
 
The 4K Fidelity mode on PS5 is native 4K and IGTI is just used for anti-aliasing duties, which is why it looks so much better when compared to the PC using DRS in the video.

All modes are actually dynamically adjusted:

Quality = 4K - 1512p

Performance = 4K - 1440p

Performance RT = 1440p - 1080p

So with Quality mode mostly hovering around the mid 40s even though in most cases it is near or at the max res, going all the way down to 1440p/1080p is the drastic drop needed to achieve a 60fps minimum, with some settings even lower than Quality.
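For a sense of scale, here's a quick pixel-count comparison of those DRS bounds (assuming 16:9 framebuffers at the quoted heights):

```python
# Rough pixel-count comparison of the DRS bounds quoted above (16:9 assumed).
resolutions = {
    "2160p (4K)": 3840 * 2160,
    "1512p":      2688 * 1512,
    "1440p":      2560 * 1440,
    "1080p":      1920 * 1080,
}
native_4k = resolutions["2160p (4K)"]
for name, pixels in resolutions.items():
    print(f"{name}: {pixels:>9,} px ({pixels / native_4k:.0%} of native 4K)")
# 1440p is ~44% of the pixels of 4K and 1080p is just 25%, so the swing
# between the top and bottom of those DRS windows is very large.
```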

Which is one of the points DF commented on as being tricky to benchmark, and that DRS behaves differently between the two.


Sources: both VGTech & DF.

 
Just saw the IGN PC performance review for Spider-Man: Miles Morales and I gotta say it is well done.

He pointed out the bottlenecks and how it performs on his end, even though it is hard to compare when there is dynamic resolution involved (which he pointed out as well).

Gotta give credit where it's due and not just criticize.


It's not a terrible review, but he really needs to be more scientific with his settings comparisons. For example, at 1:50 he shows the PC dipping below 60fps and uses this as an example of why the PC CPU is unable to keep up with the PS5 CPU. However, he doesn't clarify the settings being used. On screen it says "Fidelity match", which could mean either that the settings are equivalent to PS5's Fidelity mode (which runs at 30fps on PS5) or what he considers a match to PS5's Performance RT mode. In any case, it's not clear whether the PC is dropping below 60 there due to the CPU or the GPU. Because he's running at 4K DRS, and the PC DRS works differently to the PS5's, it's quite possible the PC is running much closer to actual 4K at the points where the frame rate drops while the PS5 is running at 1440p, and hence the bottleneck could actually be on the GPU. It would have been better to lock the resolution at 1440p on the PC to remove that possibility.

At 2:30 he does another direct side-by-side comparison with the PC, but this time bizarrely labels the PC as running at 1440p DRS with FSR 2.1 while the PS5 is running at 4K DRS (using IGTI). It's a bit all over the place. As a side note, he also states here that the PCIe bandwidth is being "flooded" with 18GB/s at times, which is in fact not all that much compared to the 32GB/s (each way) limit of the PCIe 4.0 graphics interface.

At 8:00 he starts to talk about the disadvantages of the RT shadows mode, which @Dictator also picked up on; however, we've since had a patch which improves the RT shadows. It's unclear whether this is one of the improved aspects, though. I think that patch landed on the same day as the NXG video, so just unlucky timing there.

The biggest issue by far for me was the IGTI comparison, where I think @davis.anthony is right on the money. The PS5 in Fidelity mode runs at 30fps and so should have little problem hitting native 4K, while the PC (I'm going to assume the 6800) is clearly targeting 60fps and not hitting it, thus DRS is going to be reducing the resolution, likely quite aggressively. I mean, looking at those image comparisons it should be blindingly obvious that the PC isn't running at the same internal resolution. That level of difference (it was frickin' huge) could not possibly be down to a less optimally implemented upscaling solution; it was far bigger than FSR 1 Performance vs DLSS Quality, for example! I mean, why the hell did he run this test with DRS on at all? It clearly has the ability to significantly skew the results.
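On the PCIe point, a quick back-of-the-envelope check of the Gen4 x16 limit for context against the quoted 18GB/s (standard link parameters, not figures from the video):

```python
# Rough PCIe 4.0 x16 bandwidth estimate (per direction).
lanes = 16
transfer_rate_gt = 16.0       # PCIe 4.0: 16 GT/s per lane
encoding = 128 / 130          # 128b/130b line encoding overhead
bits_per_byte = 8

per_lane_gb_s = transfer_rate_gt * encoding / bits_per_byte   # ~1.97 GB/s
link_gb_s = per_lane_gb_s * lanes                             # ~31.5 GB/s
print(f"~{link_gb_s:.1f} GB/s each way")
# So ~18 GB/s of observed traffic is a little over half of the link's capacity.
```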
 
It's not a terrible review, but he really needs to be more scientific with his settings comparisons. For example, at 1:50 he shows the PC dipping below 60fps and uses this as an example of why the PC CPU is unable to keep up with the PS5 CPU. However, he doesn't clarify the settings being used. On screen it says "Fidelity match", which could mean either that the settings are equivalent to PS5's Fidelity mode (which runs at 30fps on PS5) or what he considers a match to PS5's Performance RT mode. In any case, it's not clear whether the PC is dropping below 60 there due to the CPU or the GPU. Because he's running at 4K DRS, and the PC DRS works differently to the PS5's, it's quite possible the PC is running much closer to actual 4K at the points where the frame rate drops while the PS5 is running at 1440p, and hence the bottleneck could actually be on the GPU. It would have been better to lock the resolution at 1440p on the PC to remove that possibility.

At 2:30 he does another direct side-by-side comparison with the PC, but this time bizarrely labels the PC as running at 1440p DRS with FSR 2.1 while the PS5 is running at 4K DRS (using IGTI). It's a bit all over the place. As a side note, he also states here that the PCIe bandwidth is being "flooded" with 18GB/s at times, which is in fact not all that much compared to the 32GB/s (each way) limit of the PCIe 4.0 graphics interface.

At 8:00 he starts to talk about the disadvantages of the RT shadows mode, which @Dictator also picked up on; however, we've since had a patch which improves the RT shadows. It's unclear whether this is one of the improved aspects, though. I think that patch landed on the same day as the NXG video, so just unlucky timing there.

The biggest issue by far for me was the IGTI comparison, where I think @davis.anthony is right on the money. The PS5 in Fidelity mode runs at 30fps and so should have little problem hitting native 4K, while the PC (I'm going to assume the 6800) is clearly targeting 60fps and not hitting it, thus DRS is going to be reducing the resolution, likely quite aggressively. I mean, looking at those image comparisons it should be blindingly obvious that the PC isn't running at the same internal resolution. That level of difference (it was frickin' huge) could not possibly be down to a less optimally implemented upscaling solution; it was far bigger than FSR 1 Performance vs DLSS Quality, for example! I mean, why the hell did he run this test with DRS on at all? It clearly has the ability to significantly skew the results.
Well, in his defense, at 4:00 he did state the RX 6800 is around 33% faster than the PS5, and that is with RT shadows set to High.

So for argument's sake, let's say that's only another 5% to performance; then it would be around 40% faster, which is about what I'd expect from a 6800 comparatively.
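Spelling out that compounding (the 5% is an assumption, as stated):

```python
# Compound the measured gap with the assumed cost of the High RT shadows setting.
measured_gap = 1.33           # ~33% faster than PS5 with RT shadows on High
assumed_setting_gain = 1.05   # assumed ~5% from matching the PS5 setting
print(f"~{measured_gap * assumed_setting_gain - 1:.0%} faster")  # ~40%
```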



 
All modes are actually dynamically adjusted:

Quality = 4K - 1512p

Performance = 4K - 1440p

Performance RT = 1440p - 1080p

So with Quality mode mostly hovering around the mid 40s even though in most cases it is near or at the max res, going all the way down to 1440p/1080p is the drastic drop needed to achieve a 60fps minimum, with some settings even lower than Quality.

Which is one of the points DF commented on as being tricky to benchmark, and that DRS behaves differently between the two.


Sources: both VGTech & DF.

Which confirms it's native 4K most of the time, and unless you can prove the scenes in the video above (which my comment was about) were less than native 4K, there's no real counter-argument to what I said.
 
Which confirms it's native 4K most of the time, and unless you can prove the scenes in the video above (which my comment was about) were less than native 4K, there's no real counter-argument to what I said.
Well technically "most of the time" would indeed confirm that it's dynamic. And VGTech always supplies the frames he analyses for people to independently verify.

If VGTech says it's dynamic, it's dynamic (IMO their word is the gold standard)
 
It's not a terrible review but he really needs to be more scientific with his settings comparisons. For example, at 1:50 he shows the PC dipping below 60fps and uses this as an example of why the PC CPU is unable to keep up with the PS5 CPU. However He doesn't clarify the settings being used. On screen, it says "Fidelity match" which could mean either the settings are equivalent to PS5's Fidelity mode - which runs at 30fps on PS5, or what he considers to be a match to PS5 at Performance RT mode. In any case though, it's not clear the PC is dropping below 60 there due to the CPU or the GPU. Because he's running at 4K DRS, and the PC DRS works different to the PS5, it's quite possible the PC is running much closer to actual 4K at those points where the frame rate drops while the PS5 is running at 1440p, and hence the bottleneck could actually be on the GPU. It would have been better to lock the resolution at 1440p on the PC to remove that possibility.

At 2:30 he does another direct side by side comparison to the PC but this time bizarrely labels the PC as running at 1440p DRS with FSR 2.1 while the PS5 is running at 4K DRS (using IGTI). It's a bit all over the place. As a side note he also states here that the PCIe bandwidth is being "flooded" with 18GB/s at times which is in fact not all that much compared to the 32GB/s (each way) limit of the PCIe 4.0 graphics interface.

At 8:00 he starts to talk about the disadvantages of the RT shadows mode which @Dictator also picked up on, however we've since had a patch which improves the RT shadows. It's unclear whether this is one of the improved aspects though. I think that patch landed on the same day as the NXG video so just unlucky timing there.

The biggest issue by far for me was on the IGTI implementation where I think @davis.anthony is right on the money there. The PS5 in Fidelity mode runs at 30fps and so should have little problem hitting 4K native while the PC (I'm going to assume the 6800) is clearly targeting 60fps and not hitting it, thus DRS is going to be reducing the resolution, likely quite aggressively. I mean, looking at those image comparison it should be blinging obvious that the PC isn;t running at the same internal resolution. That level of difference (it was fricken huge) could not possibly be down to a less optimally implemented upscaling solution, it was far bigger than FSR1 performance vs DLSS Quality for example!. I mean, why the hell did he run this test with DRS on at all? It clearly has the ability to significantly skew the results.

No, PS5 runs Fidelity mode often in the 40s-50s.
 
Okay, a problem here.

From 2:27, he talks about the CPU-bound performance profile of the PS5 compared to the 5600X. In those scenes, his 5600X hovers between 60 and 80 FPS. Supposedly, ray tracing is turned off and everything is simply set to "High" in this specific comparison.

I replicated the same benchmark on my super low-end 2700X, which is clocked at 3.7 GHz at 1.1V.

This is the result.


I have no idea what is happening here. My 2700X literally performs similarly to the PS5, as one would expect. A 5600X is almost always 50% faster than a 2700X (at 4 GHz, to be exact), and sometimes nearly 60-80% faster in some scenarios. His 5600X literally becomes CPU-bound near 60 FPS right as Miles zips out from the market, whereas my 2700X is also CPU-bound near 60 FPS at the exact same place, with the exact same settings. (I also don't know whether PS5 actually uses High for crowd density even without ray tracing, but we would need visual comparisons to compare and contrast.) The 5600X in those scenes should've easily been getting 90-110 frames.
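To put rough numbers on that expectation (the scaling figures are the same assumed 50-80% gap quoted above, not measurements from this game):

```python
# If a 2700X is CPU-bound near 60 fps, scale by the typical 5600X advantage.
cpu_bound_fps_2700x = 60
for advantage in (1.5, 1.6, 1.8):   # assumed ~50-80% faster
    print(f"+{advantage - 1:.0%}: ~{cpu_bound_fps_2700x * advantage:.0f} fps expected")
# ~90-108 fps expected in the same spot, versus the ~60 fps shown in the review.
```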

Either the game is bottlenecking somewhere else that makes the 2700X and 5600X perform similarly, or there's something wrong on his end, or there's magic sauce on my end. Either way, I can clearly get a 60-80 FPS experience with a low-end CPU, dunno what else to say.


Sorry for the stupid DualSense adaptive trigger sounds, guys. ;(

So, can somebody with a 5600X do a similar test with everything set to High except ray tracing? I really want to see this analysed. There's something weird going on. Dunno. That does not look like how an optimal 5600X would perform.
 
No, PS5 runs Fidelity mode often in the 40s-50s.

Yes, but that's not 60fps, so claiming that the PC dropping below 60fps means it's slower than the PS5 wouldn't be accurate if the PC is running at the PS5's Fidelity mode settings.

It's not clear what settings he's running at, though.
 
Okay, a problem here.

From 2:27, he talks about the CPU-bound performance profile of the PS5 compared to the 5600X. In those scenes, his 5600X hovers between 60 and 80 FPS. Supposedly, ray tracing is turned off and everything is simply set to "High" in this specific comparison.

I replicated the same benchmark on my super low-end 2700X, which is clocked at 3.7 GHz at 1.1V.

This is the result.


I have no idea what is happening here. My 2700X literally performs similarly to the PS5, as one would expect. A 5600X is almost always 50% faster than a 2700X (at 4 GHz, to be exact), and sometimes nearly 60-80% faster in some scenarios. His 5600X literally becomes CPU-bound near 60 FPS right as Miles zips out from the market, whereas my 2700X is also CPU-bound near 60 FPS at the exact same place, with the exact same settings. (I also don't know whether PS5 actually uses High for crowd density even without ray tracing, but we would need visual comparisons to compare and contrast.) The 5600X in those scenes should've easily been getting 90-110 frames.

Either the game is bottlenecking somewhere else that makes the 2700X and 5600X perform similarly, or there's something wrong on his end, or there's magic sauce on my end. Either way, I can clearly get a 60-80 FPS experience with a low-end CPU, dunno what else to say.


Sorry for the stupid DualSense adaptive trigger sounds, guys. ;(

So, can somebody with a 5600X do a similar test with everything set to High except ray tracing? I really want to see this analysed. There's something weird going on. Dunno. That does not look like how an optimal 5600X would perform.

This would align with my theory above that it might be GPU-bound in those instances, with the 6800 running at 4K and DRS not kicking in sufficiently.
 
In regards to Miles Morales: the 8GB 3070 starts suffering at 1440p and above. At 4K it's much slower than the 3080 and even the 2080 Ti (which it matches at 1080p) and greatly suffers in its 1% lows.
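For reference, here's a quick sketch of how a "1% low" figure is typically derived from captured frame times (the exact method behind the charts below is assumed, and the example numbers are made up):

```python
# Sketch of a typical "1% low" calculation from a frame-time capture.
def one_percent_low_fps(frame_times_ms):
    """Average fps over the slowest 1% of frames in a capture."""
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 100)
    slowest = worst[:count]
    return 1000.0 / (sum(slowest) / len(slowest))

# Made-up capture: mostly 16.7 ms frames with occasional 40 ms spikes.
capture = [16.7] * 990 + [40.0] * 10
print(f"1% low: {one_percent_low_fps(capture):.1f} fps")  # ~25 fps
```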

1% lows, 1440p non-RT [chart].

1% lows, 4K/non-RT max settings [chart].

4K/RT max settings [chart].
 
Has anyone noticed that DLAA in Miles Morales works the opposite way to the OG game? The DLAA in Miles Morales yields much higher performance. Just in the main menu, going from TAA to DLAA makes the fps go from 68-69 to 90+ at 4K max settings. In Spider-Man it's not that dramatic, and TAA has slightly worse performance than DLAA, which is to be expected.
 