Current Generation Games Analysis Technical Discussion [2020-2021] [XBSX|S, PS5, PC]

https://www.guru3d.com/articles_pages/hitman_3_pc_graphics_perf_benchmark_review,7.html 4K ultra settings on the 5700 XT average 56 fps, so it would be very close to 60 with medium shadows. Hitman 3 is probably the first PS5 game that runs worse on it than on a 5700 XT (apparently, because we don't know the framerate at native 4K).

This is an average. The problem with Guru3D is that they never give the minimum and maximum frames. Outside of one cutscene the PS5 is locked at 60 fps and probably runs well over 60 fps on average.

Wait for @Dictator's video, it will be much more interesting.
 
This is an average. The problem with Guru3D is that they never give the minimum and maximum frames. Outside of one cutscene the PS5 is locked at 60 fps and probably runs well over 60 fps on average.

Wait for @Dictator's video, it will be much more interesting.
I'm curious how desktop cards perform in the scene that drops to 50 and 40 fps on XSX while the PS5 holds 60, using PS5 settings (1800p, medium shadows). Edit: or maybe not so interesting, as there are no drops on PS5, so we don't know how it would run without the 60 fps cap.
 
I don't know why the PS5 is running at a lower resolution, but it can outperform the Xbox Series X in Hitman 3, with up to ~46% higher performance in some cases.
Perhaps for the same reason the Series S is running at 1080p when it should obviously be capable of running well above 25% of the Series X's resolution.
I.e., IOInteractive didn't put that much effort into extensive performance profiling for selecting the ideal rendering resolution for each of the 7 consoles (or 9 if we count the PSVR modes).
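Quick napkin math on that, using the raw pixel counts and the official TF figures (peak-vs-peak paper numbers only):

# 1080p is exactly a quarter of the Series X's 2160p pixel count...
pixels_1080p = 1920 * 1080          # 2,073,600
pixels_2160p = 3840 * 2160          # 8,294,400
print(pixels_1080p / pixels_2160p)  # 0.25

# ...while on paper the Series S GPU is roughly a third of the Series X
# (4.0 TF vs 12.15 TF official figures), hence "well above 25%".
print(4.0 / 12.15)                  # ~0.33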


What's the rendering resolution on all the 8th-gens? Perhaps we could take some hints from there.
 
ElAnalistaDeBits also found a locked 60 fps on PS5. He even adds that they could have reached native 4K on PS5 with a bit more work.

PS5 perfectly supports constant 60fps. I think a little more work would have achieved native 4K.


Perhaps for the same reason the Series S is running at 1080p when it should obviously be capable of running well above 25% of the Series X's resolution.
I.e., IOInteractive didn't put that much effort into extensive performance profiling for selecting the ideal rendering resolution for each of the 7 consoles (or 9 if we count the PSVR modes).


What's the rendering resolution on all the 8th-gens? Perhaps we could take some hints from there.
The posted link gives you all the resolutions for the PlayStation versions.
 
ElAnalistaDeBits also found a locked 60 fps on PS5. He even adds that they could have reached native 4K on PS5 with a bit more work.

That's assuming that whatever caused the drops on the XSX would be the only bottleneck stopping the PS5 from running stably at 4K (the vast majority of the time).

That's a sizeable assumption to make. It could be true, or there could be additional factors in the decision, e.g. compute, bandwidth, or whatever.

It's worth noting that the frame drops in the field full of plants could have a different cause than the frame drops on the sniper rifle with the heavy DOF and alpha. And the cause of those could come down to the API, shader compiler, firmware, or profiler (or whatever combination), and only in part the underlying silicon.

Faring relatively better in a largely blending-bound scene might not help you in a shader-bound scene.
 
The posted link gives you all the resolutions for the PlayStation versions.

So the PS4 Pro gets the same 1440p30 as the One X despite the latter having a significantly faster GPU, more bandwidth and more VRAM?
 
ElAnalistaDeBits also found a locked 60 fps on PS5. He even adds that they could have reached native 4K on PS5 with a bit more work.
Any hardware can reach 4K; the problem is at what framerate. I suspect they went with 1800p due to an inability to hold a solid 60 fps everywhere (in comparison to the XSX, where it only drops in a single area). Considering that we know their engine does not support dynamic resolution (it was mentioned somewhere), that is probably why they went with the lower resolution.
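For context, a dynamic resolution scaler in most engines is just a feedback loop on GPU frame time, roughly like the sketch below (a generic illustration, not how IO's engine actually works). Without something like it, the fixed resolution has to be picked so that the heaviest scene still fits the budget:

# Generic dynamic-resolution heuristic: nudge the render scale based on
# measured GPU frame time against the 60 fps budget. Illustrative only.
TARGET_MS = 1000 / 60                  # 16.7 ms per frame at 60 fps
MIN_SCALE, MAX_SCALE = 0.7, 1.0        # e.g. ~1512p to 2160p on a 4K output

def update_render_scale(scale, gpu_frame_ms):
    if gpu_frame_ms > TARGET_MS:       # over budget: drop resolution a little
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.9:
        scale += 0.02                  # comfortably under budget: creep back up
    return max(MIN_SCALE, min(MAX_SCALE, scale))

print(update_render_scale(1.0, 18.0))  # a heavy 18 ms frame -> scale drops to 0.95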
 
I am not sure why he is making that statement when he has absolutely no authority or expertise to do so.

And WHY does that crap land in the Digital Foundry thread? I've watched other analyses where drops are found in the PS5 version as well, but sharing them here isn't a good idea. This thread is about DF's videos and the discussions around them.
 
I am not sure why he is making that statement when he has absolutely no authority or expertise to do so.
"I think a little more work would have achieved native 4K." - and why can't people have an opinion based on their own benchmarking experience? Is that forbidden now?
 
I am not sure why he is making that statement when he has absolutely no authority or expertise to do so.
I came here to say the same thing! "A little more work and it could [hit arbitrary target]" is complete bullshit any time the press says it. I think DF is guilty of this too sometimes, but not as badly. Rendering is about tradeoffs. Assuming your engine is well optimized and engineered, it's always going to perform worse for some workloads, data and platforms than others, and real life has a huge variety of workloads. Sometimes those tradeoffs are easier to swallow for a given game, but they're always there.

Everything we can see about Hitman shows it's a well optimized game using whatever* techniques under the hood the developers thought would work best for their very specific content demands (huge crowds, big glossy modern architecture, long sightlines). When a game is hitting all of its performance targets at a stable framerate across several platforms and running great on a wide range of PCs, we can pretty safely speculate that any remaining improvements would be a lot of work.

*AFAIK IO never gives talks at GDC or SIGGRAPH or anything like that; I don't know even the slightest thing about how their tech works.
 
https://www.guru3d.com/articles_pages/hitman_3_pc_graphics_perf_benchmark_review,7.html 4K ultra settings on the 5700 XT average 56 fps, so it would be very close to 60 with medium shadows. Hitman 3 is probably the first PS5 game that runs worse on it than on a 5700 XT (apparently, because we don't know the framerate at native 4K).

What's the source for the medium shadows? Are we sure everything else is set to Ultra?

This is an average. The problem with Guru3D is that they never give the minimum and maximum frames. Outside of one cutscene the PS5 is locked at 60 fps and probably runs well over 60 fps on average.

Wait for @Dictator's video, it will be much more interesting.

The 5700 XT manages 56 fps at 4K and 102 fps at 1440p. 1800p sits roughly halfway between the two, so taking a framerate halfway between them, about 79 fps, shouldn't be too far out.
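For what it's worth, interpolating on pixel count via frame time (which scales more linearly with resolution than fps does) lands in the same ballpark:

# Ballpark only: interpolate the Guru3D 5700 XT results by pixel count.
p1440, p1800, p2160 = 2560 * 1440, 3200 * 1800, 3840 * 2160
t = (p1800 - p1440) / (p2160 - p1440)        # ~0.45: 1800p is ~45% of the way to 4K

ms_1440, ms_2160 = 1000 / 102, 1000 / 56     # ~9.8 ms and ~17.9 ms frame times
ms_1800 = ms_1440 + t * (ms_2160 - ms_1440)  # ~13.4 ms
print(1000 / ms_1800)                        # ~75 fps, close enough to the 79 above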

Agreed that Alex's review would be the gold standard here, although given there are no drops on the PS5 to compare with, I'm not sure we'll get one. Perhaps the XSX would be possible though; one can hope.
 
https://www.guru3d.com/articles_pages/hitman_3_pc_graphics_perf_benchmark_review,7.html 4K ultra settings on the 5700 XT average 56 fps, so it would be very close to 60 with medium shadows. Hitman 3 is probably the first PS5 game that runs worse on it than on a 5700 XT (apparently, because we don't know the framerate at native 4K).
Hmm, to be thoughtful here on PC benchmarks: they are designed with systems that place the bottleneck on the GPU only. The GPUs are paired with obscene CPUs and memory to ensure they are measured in isolation.

The PS5 is a complete system: the CPU is not obscene, and it's paired with an equivalent GPU. It shares its power budget between the two, with an upper limit on power draw, and it shares its memory.

So while we can get an idea of where the PS5 might land, any game that is extremely CPU-taxing should be taken into consideration for its effect on the consoles, as they have no way to benchmark without it. More CPU load means more bandwidth taken away from the GPU on both consoles, and more CPU load on the PS5 means clockspeed taken away as well.

The 448 GB/s the 5700 XT has is all for itself, and its clockspeed is determined largely by its own ability to stay cool. It's hard to get a read, because if the PS5 drops ~10% in clockspeed (roughly 200 MHz), it's at approximately the same clockspeed as the 5700 XT, with fewer CUs and less bandwidth.

So IMO I don't think it's necessarily fair to state that the PS5 needs to perform at this or that GPU level. The PS5 likely performs within a range of values depending on how it's being used.

I think we've tried to explain this in the past, but people seem to only want to look at the best-case scenario for the PS5, i.e. 100% clock rate for CPU and GPU at all times and no bandwidth loss.
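To put some very rough numbers on that (back-of-the-envelope from public specs; the PS5's actual clock behaviour under load isn't public, so the 10% case is purely illustrative):

# Usual peak-compute formula: CUs * 64 lanes * 2 FLOP/cycle * clock (GHz) -> TFLOPS
def tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

print(tflops(36, 2.23))   # PS5 at its max boost:        ~10.3 TF
print(tflops(36, 2.03))   # PS5 shedding ~200 MHz:        ~9.3 TF
print(tflops(40, 1.905))  # 5700 XT at its rated boost:   ~9.75 TF
# And the 5700 XT keeps its full 448 GB/s to itself, while the PS5 shares
# its 448 GB/s pool with the CPU, so effective GPU bandwidth is lower.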
 
The 5700 XT manages 56 fps at 4K and 102 fps at 1440p. 1800p sits roughly halfway between the two, so taking a framerate halfway between them, about 79 fps, shouldn't be too far out.

I never caught why the 5700 XT is used as a PS5 measuring stick -- is it really similar hardware, or is it approximately the same power level but set up differently?
 
I never caught why the 5700 XT is used as a PS5 measuring stick -- is it really similar hardware, or is it approximately the same power level but set up differently?
It's a 40 CU, ~1900 MHz RDNA GPU that can boost above 1900.
Take away 4 CUs and crank the max boost up ~300 MHz and you're at the PS5,
with likely the same cache setup and the same memory setup.
There's no performance metric I currently understand that separates RDNA 1 from RDNA 2 CUs aside from feature set.

There's no closer GPU we can find unless they release a 6600 or something.
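Quick sanity check on that scaling, using peak paper numbers only:

cu_ratio    = 36 / 40          # drop 4 of the 40 CUs -> 0.90
clock_ratio = 2.23 / 1.905     # ~1900 MHz boost up to 2230 MHz -> ~1.17
print(cu_ratio * clock_ratio)  # ~1.05: roughly the same peak compute on paper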
 
With both consoles supporting VRR, hopefully we start seeing more uncapped modes, allowing for proper performance comparisons.
 