Digital Foundry Article Technical Discussion [2021]

"It doesn't need the next frame to interpolate, the interpolation algorithm data is entirely generative based largely on the last frame data."

If the software doesn't need a full set of rendered frames and can generate the data from the last interpolated frame, then the frame rate should always be 60fps, as the software would make up for whatever rendered-frame deficit there is. It certainly wouldn't be some random odd number.

"It's probably just better to look at frame rate as a function of the time it takes to produce a new frame, ie, 16.6ms is 60fps and 16.9ms is 59fps. for instance. So it just missed it's update window by 0.03ms after the frame interpolation. To lose 2 frames you need to be at 17.2ms."

But then we'd have a torn frame and we don't have that here.

"So basically there was a standard 33.33ms update with a frame injected inbetween to make it appear like a 16.66ms update. If PS4Pro misses a bit say 33.6ms from the previous update to the next it will appear at 59fps. To lose 2 frames, the next update would have been at 33.9ms. 0.6ms slower than the standard 33.33ms update."

That would surely lead to screen tearing, which we're not getting.
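
To make the arithmetic in those quotes concrete, here's a quick sketch (Python, with the frame times taken straight from the posts above, not from any capture), assuming a 33.33ms native update with one interpolated frame injected in between:

```python
# Frame-time <-> fps arithmetic from the quoted posts.
# Assumes a 33.33 ms native update with one interpolated frame injected
# in between, so the effective presentation interval is half the update time.

def fps_from_frame_time(ms: float) -> float:
    """Convert a frame time in milliseconds to frames per second."""
    return 1000.0 / ms

# Effective per-frame times quoted above
for ms in (16.6, 16.9, 17.2):
    print(f"{ms:5.1f} ms -> {fps_from_frame_time(ms):5.1f} fps")

# The same thing expressed as native (pre-interpolation) update times
for update_ms in (33.33, 33.6, 33.9):
    effective_ms = update_ms / 2  # one injected frame halves the interval
    print(f"update {update_ms:5.2f} ms -> effective {effective_ms:5.2f} ms "
          f"({fps_from_frame_time(effective_ms):5.1f} fps)")
```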

Sorry for the crap quoting, I'm on my phone while the baby sleeps nearby.
 
I suspect that if we were to count the total frames over the entire period of dropped frames, we'd likely get an even number, i.e. one divisible by 2.

That is, counting across all of the second-long intervals.
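
A minimal sketch of that check, with made-up per-second counts standing in for a real capture (the actual numbers would have to come from the source footage):

```python
# Hypothetical per-second frame counts across the stretch where drops occur.
# If every presented pair is one rendered frame plus one interpolated frame,
# the total over the whole dip should come out even.
frames_per_second = [59, 60, 59, 60]   # made-up counts, not a measurement

total = sum(frames_per_second)
print(f"{total} frames total, divisible by 2: {total % 2 == 0}")
```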
 
Yeah, possibly, good discussion. I think we'd need someone to go through the source captures to figure out what the 4Pro is doing in this case; we've no clue what YouTube is doing to the video, for instance, so from our distance there's no way to really tell.
 
Nice video, thanks DF: XSX stretching its longer legs, it seems. I guess the further we go into the gen, the more apparent the hardware differences will be as software matures and we leave the cross-gen stuff behind.

I agree, but I'm still a bit disappointed for the moment; I expected a bigger gap than the one between the PS4 and the XB1.

But let's see what happens in the future.
 
The maker of CapFrameX did, which is what got me thinking about it in the first place!

This is not really strange, is it? The BVH tree has to be generated somewhere, and there are reasons to choose either the GPU or the CPU. (Plus there's potentially additional processing going on to slice up parts of the map, or something like that, I guess?)
This talk on MW's shadows might interest you, although a lot of it went over my head (technical even for a technical talk). There are a few slides that show CPU and GPU costs.
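
For anyone curious what "generating the BVH somewhere" actually involves, here's a toy CPU-side build (median split over bounding-box centroids) purely as an illustration; it has nothing to do with how the game actually does it, and real builders (SAH, GPU builds, refitting) are far more involved:

```python
# Toy BVH build over axis-aligned bounding boxes, median split per axis.
# Illustration only; real ray-tracing BVH builders are much more elaborate.
from dataclasses import dataclass
from typing import Optional

@dataclass
class AABB:
    lo: tuple  # (x, y, z) min corner
    hi: tuple  # (x, y, z) max corner

@dataclass
class Node:
    box: AABB
    left: Optional["Node"] = None
    right: Optional["Node"] = None
    prims: Optional[list] = None     # leaf payload: primitive indices

def union(a: AABB, b: AABB) -> AABB:
    return AABB(tuple(map(min, a.lo, b.lo)), tuple(map(max, a.hi, b.hi)))

def centroid(box: AABB, axis: int) -> float:
    return 0.5 * (box.lo[axis] + box.hi[axis])

def build(prims: list, boxes: list, depth: int = 0) -> Node:
    box = boxes[prims[0]]
    for i in prims[1:]:
        box = union(box, boxes[i])
    if len(prims) <= 2:              # small leaf: stop splitting
        return Node(box, prims=prims)
    axis = depth % 3                 # round-robin split axis
    ordered = sorted(prims, key=lambda i: centroid(boxes[i], axis))
    mid = len(ordered) // 2
    return Node(box,
                left=build(ordered[:mid], boxes, depth + 1),
                right=build(ordered[mid:], boxes, depth + 1))
```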
 
"I agree, but I'm still a bit disappointed for the moment; I expected a bigger gap than the one between the PS4 and the XB1.

But let's see what happens in the future."

Why though? This is just going by paper specs, but things like the TF difference being around 18%-20% vs. 40% last gen, CPU clocks being essentially the same in SMT mode, etc., would suggest that in many ways these systems are much closer in capability than the XBO and PS4 were.

Some differences do have 8th-gen comparisons; bandwidth, for example, with the 10 GB pool on Series X mirroring the split between the PS4's RAM bandwidth and the XBO's DDR3 bandwidth (I know the eSRAM had a higher bandwidth of 218 GB/s, but that was for framebuffers only, and was barely enough for that in terms of capacity).

Overall there's no reason to expect such a large gap between the 9th-gen systems (outside of Series S). With Hitman 3 in particular, the only large difference is the 44% resolution advantage for Series X, but that matches the physical CU advantage (as a percentage) it has over the PS5, so this is probably one of the first 3P ports actually fully saturating the Series X GPU (saturating as in keeping the CUs busy, not necessarily pushing them to their limits; it's too early for that).
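
Back-of-the-envelope, using the commonly quoted paper specs (12.15 vs 10.28 TF, 52 vs 36 CUs, 1.84 vs 1.31 TF last gen, 2160p vs 1800p), those percentages check out:

```python
# Rough check of the percentages discussed above, from publicly quoted specs.

def pct_advantage(a: float, b: float) -> float:
    """How much larger a is than b, as a percentage."""
    return (a / b - 1) * 100

print(f"TF, this gen (XSX vs PS5 peak): {pct_advantage(12.15, 10.28):.0f}%")
print(f"TF, last gen (PS4 vs XBO):      {pct_advantage(1.84, 1.31):.0f}%")
print(f"CU count (52 vs 36):            {pct_advantage(52, 36):.0f}%")
print(f"Pixels (2160p vs 1800p, 16:9):  {pct_advantage(3840 * 2160, 3200 * 1800):.0f}%")
```

That prints roughly 18%, 40%, 44% and 44%, which is why the 1800p figure lines up with the CU count rather than the TF number.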
 
4K 60fps RT sounds amazing on paper for a console version, but I wasn't blown away playing the demo (saw the bricks in the first location and, comparing it to Demon's Souls, hmm...).
 
And Valhalla is mixed settings, with foliage higher than ultra on PS5, so I think adding console settings to the PC version would be beneficial, and the argument that in 2025 or 2026 they will be medium or lower doesn't make much sense to me ;d

Valhalla, that says it all. Watch Dogs paints a whole different picture. We're already starting to see altered settings compared to the highest PC presets, even in Godfall. It won't be 2025 when settings start to drop; it's already happening now, and will only happen more as we go further into the generation.
That's before talking about reconstruction and ray tracing.

"I agree, but I'm still a bit disappointed for the moment; I expected a bigger gap than the one between the PS4 and the XB1."

Yes, it's not running in BC mode, where the XSX can stretch its legs. It's a 44% difference, but indeed, in the future that gap might grow. It's not just a difference in raw TFs but more CUs, and a much higher bandwidth. A 100MHz higher CPU clock could potentially help out too, especially since it won't downclock.

Will surely try Resident Evil Village on PC when it comes; I've never played any Resident Evil since the PSX days.
 
"Valhalla, that says it all. Watch Dogs paints a whole different picture. We're already starting to see altered settings compared to the highest PC presets, even in Godfall. It won't be 2025 when settings start to drop; it's already happening now, and will only happen more as we go further into the generation.
That's before talking about reconstruction and ray tracing."
I know that in Watch Dogs the RT on consoles uses some very low settings, but the other settings? Where can I find this info? It doesn't make much sense; the 5700 XT is still probably faster than 90% of PC gamers' GPUs.
 
Most of the other settings are around Medium: quarter-res GI, the resolution of shadow maps, no car headlight shadows, etc.
They do mix in other things from higher presets though, like PCSS shadows from the sun and screen-space shadows.
 
Thanks for the info, but I think they intelligently chose what needs to be high in terms of the quality/performance ratio.
 

Well, duh :p No doubt they do. But we're already seeing settings being lowered across all titles, not only in ray tracing. Godfall on PS5, for example, is not running the highest possible (everything Epic), and that's a PS5/PC-only game. That's early days; as games require more and more and new GPUs release, that 5700 XT moves down the ladder.
Nor do I see the relevance of how many gamers own a 5700 XT-class GPU; there's a reason there are gobs of settings and scaling on PC. You get to choose how and what you play regarding settings, resolutions, framerates, ray tracing levels (or omitting it entirely), reconstruction tech like DLSS, etc.

A 5700 XT (or a bit above, if not OC) class GPU (PS5) isn't all that powerful anyway; it already gets limited if you really crave the highest settings. It fares quite badly in comparison to, say, a 6800/XT or 3070 and up.
 
Still, I'm sure many players would benefit from an option to use console settings in PC games, as most have no idea what they do and how performance-taxing they are.
 

No doubt about that. That's where Alex comes in with his best-performance-ratio videos. He basically does all the work (and he likes it); just take his settings and you'll probably get the best possible result. Devs could indeed do this, but somehow they don't.

I want to add that some might not always like what 'console' or 'best perf' settings do. By watching DF's videos, you get to learn what they all do and what impact they have. This way you can decide for yourself which settings are worth it to you and which are not.

That's not to say PC wouldn't benefit from such a setting, but I can also see the problems with it. The PC is ever evolving, and newer, faster, more advanced hardware keeps coming; then those best 'console-like' settings might not always apply anymore.
Rarely do PC gamers sit with, say, an RX 6800-class GPU for the whole generation... A simple GPU upgrade is a very nice thing to do some years in (and you can resell the old one).
 
Would a framerate-interpolated 60fps show worse input latency than a blue collar 60fps?

Is 1800p somehow better than an intermediate resolution closer to 2160p (and to the specs difference between the PS5 and XSX)? 44% seems like a big pixel difference even if Hitman is hitting the PS5’s GPU hard enough to make it clock down from its 10TF perch, unless something else is going on (API? simple ROP or even CU count, independent of bandwidth or clockspeed?).
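
Just to put rough numbers on that: if pixel cost scaled linearly with the raw compute gap rather than the CU-count gap (a big simplification), the intermediate resolution would land closer to 2160p than 1800p does. A quick sketch:

```python
# What vertical resolution (at 16:9) corresponds to dividing Series X's
# 2160p pixel count by a given performance gap? Assumes pixel cost scales
# linearly with compute, which is a simplification.
import math

xsx_pixels = 3840 * 2160
for gap in (1.18, 1.44):                 # ~TF gap vs ~CU/pixel gap
    pixels = xsx_pixels / gap
    height = math.sqrt(pixels * 9 / 16)  # keep the 16:9 aspect ratio
    print(f"{gap:.2f}x gap -> roughly {height:.0f}p")
```

With a 1.18x gap that comes out around 1988p, while 1.44x gives exactly 1800p, which is the mismatch being asked about.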
 