Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
You mean Lumen runs at 60fps in what is a very simple game (Fortnite).

And when it's pushed in an actual next gen looking environment it drops to 30fps.

Tells me everything.

Epic said they optimized Software Lumen to run at 60 fps on current-gen consoles. Also, go look at the Silent Hill 2 remake PC requirements for 1080p TSR at 60 fps and you'll understand that the PS5 and XSX GPUs are enough to run a UE5 game with Software Lumen at 60 fps.

Software Lumen was created for PC GPUs without ray tracing hardware, and for real-time GI at 60 fps on current-gen consoles.

Go read the Advances in Real-Time Rendering SIGGRAPH 2022 presentation about Lumen and all the optimizations for Software Lumen targeting 60 fps on current-gen consoles.



 
Games like Horizon FW and A Plague Tale: Requiem at 60fps are a good case of diminishing returns, where 60fps at lower image fidelity (helped by smart reconstruction techniques) actually looks better than 30fps with slightly better graphics to many if not most people. What we are seeing this generation is 40fps/60fps replacing 30fps gaming, and 120Hz (or uncapped + VRR) becoming the new 60fps. This is the reality of current next-gen-only games. Thinking 30fps gaming is going to replace what we have now is just FUD.

As stated by Shifty, there is also the fact that competition is fierce now and young gamers are used to 60fps gaming. As many studios, big or small, have access to engines and tools allowing them to create some of the best graphics around at 60fps or more, the studios used to 30fps gaming on consoles (like Bethesda with Starfield) will have to adapt... or die.
 
Jedi Survivor is the worst case, with HW RT GI and partial HW RT reflections, a dynamic minimum resolution going down to 648p, and some people still prefer to play the performance mode.

Right, and if people are happy with upscaled 648p, developers can choose to target that resolution and crank graphics even higher. If the feedback is that 648p looks great, why would they waste GPU resources going higher when they could spend them elsewhere?
 
Right, and if people are happy with upscaled 648p, developers can choose to target that resolution and crank graphics even higher. If the feedback is that 648p looks great, why would they waste GPU resources going higher when they could spend them elsewhere?

I doubt many devs will make the same choice as Respawn with HW RT for GI. Lumen has a software version, and EA SEED's R&D used in Frostbite uses a software GI solution. I think console games will use software GI and PC games RTX GI.
 
I doubt many devs will make the same choice as Respawn with HW RT for GI. Lumen has a software version, and EA SEED's R&D used in Frostbite uses a software GI solution. I think console games will use software GI and PC games RTX GI.

Sure but there are other ways to push graphics besides RT. The point is that devs can still choose to use whatever performance is left after cutting back resolution and opting for software RT. There’s no limit to the things they can do with the hardware.
 
Epic said they optimized Software Lumen to run at 60 fps on current-gen consoles.

That will still depend on what each game is doing, the technical ability of each development studio, and whether they're happy with the sacrifices and visual issues of SW Lumen.

Also, go look at the Silent Hill 2 remake PC requirements for 1080p TSR at 60 fps and you'll understand that the PS5 and XSX GPUs are enough to run a UE5 game with Software Lumen at 60 fps.
I've yet to see a recent PC requirements chart that's been anywhere close to accurate.

Software Lumen was created for PC GPUs without ray tracing hardware, and for real-time GI at 60 fps on current-gen consoles.

And with the number of PCs with RT-capable GPUs increasing every day, the importance of SW Lumen on PC is becoming less relevant.

Go read the Advances in Real-Time Rendering SIGGRAPH 2022 presentation about Lumen and all the optimizations for Software Lumen targeting 60 fps on current-gen consoles.

They're irrelevant (see my first point again).
 
That will still depend on what each game is doing, the technical ability of each development studio, and whether they're happy with the sacrifices and visual issues of SW Lumen.


I've yet to see a recent PC requirements chart that's been anywhere close to accurate.



And with the number of PCs with RT-capable GPUs increasing every day, the importance of SW Lumen on PC is becoming less relevant.



They're irrelevant (see my first point again).

If devs can go without real-time GI and use SSR and cubemaps, I think they will be OK with software GI in general.

For the third point, did I say something else? I said RTX will be used on PC...
 
Generally speaking, graphics are scalable enough to make a 60fps mode viable in basically any game, I think. You can always cut more resolution, more shadows, more RT, etc... If people aren't happy with those aesthetic compromises, then the 30fps mode is for them.

The CPU side is potentially a bit different, as you have AI, animation, and simulation systems that can be scaled, but beyond a certain point will just look wrong, or even break gameplay altogether. Cyberpunk's empty city streets on PlayStation are an example of the former.

I imagine something like GTA6 will have sufficiently dense world simulation and realistic animation as to make a 60fps mode undesirable versus the 30fps mode. That's assuming they target 30fps as the baseline, which I expect they will, reserving the fuller experience for the next generation of consoles.
 
Once physics, dynamic destruction, hair simulation, cloth simulation, fluid simulation, and advanced AI are added into the mix with complex graphics, 60 fps will become unattainable.

You can already see that in A Plague Tale: Requiem: they had to cut the rats' and NPCs' animation rates in half just to get the game running at more than 30fps. Imagine something more complex than rats; the cost will be heavier for sure.
 
If devs can go without real-time GI and use SSR and cubemaps, I think they will be OK with software GI in general.

If they're going to pay a huge performance cost and still not have the quality they want they won't be OK with it at all.

It has to make sense visually, artistically and performance wise.
 
But as we see on PC, cutting the resolution doesn't scale linearly in terms of performance gain.

And there's also a point where the IQ takes such a hit that it's just not worth it.

If a dev accepts dynamic resolution with sub-720p as the minimum because they use HW RT GI at 60 fps, then I think resolution is not a problem.

648p never forget.
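The dynamic-resolution idea being debated here can be sketched as a small feedback controller that adjusts render height from the last frame's GPU time, clamped to a 648p floor. This is a generic illustration, not any engine's actual implementation; the constants and the function name are made up, and the square-root step reflects the point above that resolution doesn't translate linearly into performance.

```python
# Hypothetical dynamic-resolution controller (illustrative only).
TARGET_MS = 16.7        # 60 fps frame budget in milliseconds
MIN_H, MAX_H = 648, 1440  # resolution floor (648p, as discussed) and ceiling

def next_render_height(current_h: int, frame_ms: float) -> int:
    """Pick the next frame's render height from the last frame's cost."""
    # GPU cost scales roughly with pixel count (height * width), so the
    # height correction is the square root of the budget/actual ratio.
    scale = (TARGET_MS / frame_ms) ** 0.5
    # Damp the correction by half to avoid oscillating between extremes.
    h = current_h * (1.0 + 0.5 * (scale - 1.0))
    return int(min(MAX_H, max(MIN_H, h)))
```

Under this kind of scheme the floor is a hard promise: no matter how heavy a scene gets, the controller never drops below 648p, and the frame rate eats the remaining overrun instead.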
 
Zelda sold 10 million copies within the first three days. Consoles exist for having fun, for playing a game without having to inspect every option.

Do you want better graphics, more frames, more control? Buy a PC. Making good games with good graphics is all that matters.
 
Once physics, dynamic destruction, hair simulation, cloth simulation, fluid simulation, and advanced AI are added into the mix with complex graphics, 60 fps will become unattainable.

You can already see that in A Plague Tale: Requiem: they had to cut the rats' and NPCs' animation rates in half just to get the game running at more than 30fps. Imagine something more complex than rats; the cost will be heavier for sure.
I agree with this, because it's obviously true. But what you are pointing out with the rats in PTR isn't just that they run at half the frame rate; it's that they run at the same framerate as the 30FPS version. It's the CPU/frame equivalent of VRS, where parts of the scene are at a different level of fidelity than the rest of the scene.

And honestly, it's not even a new thing. The original shipping versions of BioShock famously had a framerate limit for physics objects, which was updated when the PS4/XBO versions were released, and those updates were rolled back to the PC versions. Even older than that, Dark Forces 2 had small physics objects that you could Force-push around, and those did not run at the full render framerate either. Plenty of games today already use dynamic update rates on distant animation, and I'm willing to bet every game has at least one thing that isn't updated at the full framerate, even if it's just simple animations like leaves or grass.

Redfall (which obviously has technical issues beyond this) has what looks to me like quarter-rate animations for destruction on cult monoliths and nest hearts, but full-rate animation on lots of physics particles. Quarter rate might not be right; maybe it's just half, with a drop in the actual framerate when those objects crumble.

Regardless, I think if you are making the choice between 30FPS for everything or 30FPS for some things and 60FPS for most everything else, then 60 is the choice you should go for.
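The half-rate rat and NPC updates described above amount to a per-entity tick scheduler: high-priority entities update every frame, crowd entities every Nth frame, with their phases staggered so the crowd's cost is spread evenly rather than spiking on one frame. A minimal sketch of that idea, with all class and entity names hypothetical:

```python
# Hypothetical per-entity update-rate scheduler (illustrative only).
class Entity:
    def __init__(self, name: str, rate_divisor: int):
        self.name = name
        self.rate_divisor = rate_divisor  # 1 = full rate, 2 = half rate, ...

def should_update(entity: Entity, frame: int, phase: int) -> bool:
    # An entity ticks only on frames matching its phase within its divisor.
    return (frame + phase) % entity.rate_divisor == 0

def simulate(entities: list[Entity], frames: int) -> dict[str, int]:
    """Count how many times each entity ticks over a run of frames."""
    counts = {e.name: 0 for e in entities}
    for frame in range(frames):
        for i, e in enumerate(entities):
            # Stagger half-rate entities by index so roughly half of the
            # crowd ticks on even frames and the other half on odd frames.
            if should_update(e, frame, i % e.rate_divisor):
                counts[e.name] += 1
    return counts
```

Over 60 frames, a full-rate entity ticks 60 times and a half-rate one 30 times, which is exactly the trade-off in PTR: the rats animate at the 30fps cadence while the rest of the frame runs at 60.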
 
But we're not talking about a single genre, we're talking about all types of games.

If the next Horizon games comes out and it's 30fps with no 60fps options but looks vastly superior to Forbidden West I don't think people will complain about the frame rate.

I promise I will!
 