[Alpha Point] - UE5 Tech Demo for Xbox Series X|S

Ridiculous amount of detail. Final Fantasy: The Spirits Within who?
So good.

I'll be impressed if LOD2 looks good. In the sample MetaHumans they fall apart very fast, which makes me assume their hair implementation doesn't scale well.
 
Lumen doesn't really seem suited for 60 FPS; they have trouble hitting that target. I'm skeptical that 60 FPS can be achieved even with medium Lumen settings at all; I haven't been able to reach that performance even with medium settings in the UE5 demo. And it's kind of pointless if it runs slower with RT hardware acceleration.

In that case, I wonder if it doesn't make more sense to use Nvidia's RTXGI. It's much faster, as it's using proper HW acceleration with RT-enabled cards, perfect for a 60 FPS budget. And it looks pretty similar too. Lumen is just way too expensive; as soon as it's enabled, games run much slower than on UE4, which makes sense given it's using software RT.
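To put some numbers on the budget talk, here's a rough sketch of the arithmetic (the GI cost below is a made-up placeholder, not a measured figure):

```cpp
#include <cstdio>

int main() {
    const double budget30 = 1000.0 / 30.0; // ~33.3 ms per frame
    const double budget60 = 1000.0 / 60.0; // ~16.7 ms per frame
    const double giCostMs = 8.0;           // hypothetical GI pass cost, not measured

    std::printf("GI share of a 30 fps frame: %.0f%%\n", 100.0 * giCostMs / budget30);
    std::printf("GI share of a 60 fps frame: %.0f%%\n", 100.0 * giCostMs / budget60);
    // Halving the frame time doubles the relative cost of every pass:
    // 8 ms is ~24% of a 30 fps frame but ~48% of a 60 fps frame.
    return 0;
}
```

A pass that fits comfortably inside 33.3 ms can easily become the single biggest line item inside 16.7 ms.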
 
30fps games are gonna be a thing again on consoles. There's this big wave of '60fps or bust' for a lot of the enthusiast console gamers online right now, but they showed last generation they were just fine with 30fps. And honestly, 30fps *is* entirely playable. Had a great time beating TLOU2 on a base PS4 recently, for example.

All it will take is a developer showing again how much you can improve things when you have double the frametime budget. Something really extraordinary looking by a big-name dev/franchise that's also heavily utilizing reconstruction techniques and whatnot (so there's not a lot of room to just 'drop resolution' for 60fps scaling). People will drool all over it and line up to buy it and praise it.
 
30fps games are gonna be a thing again on consoles. There's this big wave of '60fps or bust' for a lot of the enthusiast console gamers online right now, but they showed last generation they were just fine with 30fps. And honestly, 30fps *is* entirely playable. Had a great time beating TLOU2 on a base PS4 recently, for example.

Not so much fine for most of the people I either know or watch. It was more along the lines of barely tolerable, but the game was worth it because there was no other way to play it. And then I'd hear the constant moaning about how they wish it was 60 FPS and how much better it would be even if graphics were toned down noticeably. So far this generation there's been renewed hope among them that at the very least there will always be a 60 FPS option with reduced graphical settings.

Give people that can stand it a 30 FPS mode, and then give people that want it a 60 FPS mode. It's been interesting on these forums seeing some people that were formerly fine with 30 FPS now choosing the 60 FPS option instead of the 30 FPS HQ mode that some games have. Also interesting to see the people that are still fine with 30 FPS and choose that instead of the 60 FPS option. :)

Regards,
SB
 
30fps games are gonna be a thing again on consoles. There's this big wave of '60fps or bust' for a lot of the enthusiast console gamers online right now, but they showed last generation they were just fine with 30fps. And honestly, 30fps *is* entirely playable. Had a great time beating TLOU2 on a base PS4 recently, for example.

All it will take is a developer showing again how much you can improve things when you have double the frametime budget. Something really extraordinary looking by a big-name dev/franchise that's also heavily utilizing reconstruction techniques and whatnot (so there's not a lot of room to just 'drop resolution' for 60fps scaling). People will drool all over it and line up to buy it and praise it.
Developers can revert to baked lighting if Lumen's performance doesn't improve.

Also, 40 FPS games might become a thing this gen, as John and Richard discussed on DF, for devs who want to push visual quality and keep input lag on the low side.
 
And then I'd hear the constant moaning about how they wish it was 60 FPS and how much better it would be even if graphics were toned down noticeably.
I genuinely never heard one console player in the world complain about Uncharted 4, God of War, Horizon Zero Dawn, Spiderman, Red Dead Redemption 2, Final Fantasy VII Remake, The Last of Us 2 etc etc only being 30fps. Literally not once. But what I did hear was endless praise for these games and how amazing they looked. The hype behind these games was built heavily on how amazing they looked as well, so it was a huge part of their success.

So I really don't know who you're talking about, but if they exist, they were a tiny, insignificant minority.

And the argument isn't that 60fps isn't better, it's that console users tend to be very happy with 30fps when the benefits of 30fps are actually put on display, even if the typical console user(or even enthusiast) doesn't really grasp the tradeoffs the devs have made.

Lastly, yes, of course it's easy to choose the 60fps option when the 30fps option gains you fairly minimal image quality and fidelity. My point is that this wouldn't be the case if a developer chose both a 30fps target *and*, say, a 1080p reconstruction->4k resolution to build their game off of as default. THIS is how a dev is really gonna make full use of these new consoles. It's going to cut away the ability to scale the resolution/graphics to a significant degree (which is what you need to get from 33.3ms to 16.6ms). And this is only considering graphics. If a developer wants to maximize the CPU capabilities of the new consoles with a 30fps target, that will put a much harder limit on practical scalability.
 
Developers can revert to baked lighting if Lumen's performance doesn't improve.

Also, 40 FPS games might become a thing this gen, as John and Richard discussed on DF, for devs who want to push visual quality and keep input lag on the low side.
40fps modes will be niche for many years if they become a thing at all. It requires a user to have a 120hz display. I know many (most?) of us here are PC gamers, so it feels more normal (though I still stick with 60hz myself for my own reasons), but it's definitely a lot less common in the console space. Devs can't rely on this at all.
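For context, the 120hz requirement is really just frame-pacing arithmetic: a 40fps frame (25 ms) is exactly 3 refresh intervals on a 120hz panel but 1.5 on a 60hz one, which judders. A quick sketch of that check (my own illustration):

```cpp
#include <cstdio>

// Even frame pacing needs the frame time to be a whole number of display
// refresh intervals. 40 fps = 25 ms, which is exactly 3 ticks of a 120hz
// panel (8.33 ms each) but 1.5 ticks of a 60hz one, hence the judder.
bool pacesEvenly(double fps, double displayHz) {
    double ticksPerFrame = displayHz / fps;
    return ticksPerFrame == (double)(long long)ticksPerFrame;
}

int main() {
    std::printf("40 fps on 120hz: %s\n", pacesEvenly(40, 120) ? "even" : "judder");
    std::printf("40 fps on  60hz: %s\n", pacesEvenly(40, 60)  ? "even" : "judder");
    return 0;
}
```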

I also think VRR is more likely to standardize quicker on TVs than 120hz. Certainly easier/cheaper for manufacturers to do.

As for Lumen, my argument is mainly that devs *will* be able to use Lumen. If it means they have to target 30fps, so long as they're pushing things in other areas and really showing that their game is a leap ahead of other 60fps titles, I think console users will be very interested.
 
I genuinely never heard one console player in the world complain about Uncharted 4, God of War, Horizon Zero Dawn, Spiderman, Red Dead Redemption 2, Final Fantasy VII Remake, etc etc only being 30fps.

That's because they didn't play those or even other games regularly at 60fps or higher. As a PC gamer, going to Spiderman is something you really have to adjust to. Look at Rift Apart: most opt for the 60fps mode there, it seems.
Framerate is part of the image quality.

Lastly, yes, of course it's easy to choose the 60fps option when the 30fps option gains you fairly minimal image quality and fidelity. My point is that this wouldn't be the case if a developer chose both a 30fps target *and*, say, a 1080p reconstruction->4k resolution to build their game off of as default. THIS is how a dev is really gonna make full use of these new consoles. It's going to cut away the ability to scale the resolution/graphics to a significant degree (which is what you need to get from 33.3ms to 16.6ms). And this is only considering graphics. If a developer wants to maximize the CPU capabilities of the new consoles with a 30fps target, that will put a much harder limit on practical scalability.

Well, it seems that 4k60 for just about all games, as hyped before, was never actually going to materialize. Now we still have to trade fidelity for framerates. Ugh.
 
I think for single player experiences 90% of console gamers are fine with 30 fps if it means better visuals. For multiplayer (CoD etc.) it's probably still 70% of users, but I could believe it's only 50%.

I believe single player is headed to 1440p reconstructed to 4K @ 30 fps on a lot of titles this generation. Maybe that's how they sell the mid-gen upgrades, if there are any: true 4K and 60 fps. :)

I also think that because of COVID we're in a 10-year generation, making a mid-gen Xbox in 2025 more of a real possibility: an XSX with better RT and more compute to hold 4K in DRS scenarios @ 60 fps, mainly for the hardware enthusiasts.
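As an aside, the pixel arithmetic behind "1440p reconstructed to 4K" (just counting pixels, no claims about any particular reconstruction technique):

```cpp
#include <cstdio>

int main() {
    const long long internalPx = 2560LL * 1440; // 1440p internal resolution
    const long long outputPx   = 3840LL * 2160; // 4K output resolution

    // Ratio is 2.25x: only ~44% of the 4K pixels get shaded natively each
    // frame; reconstruction fills in the rest, e.g. from previous frames.
    std::printf("internal: %lld px, output: %lld px, ratio: %.2fx\n",
                internalPx, outputPx, (double)outputPx / (double)internalPx);
    return 0;
}
```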
 
That's because they didn't play those or even other games regularly at 60fps or higher. As a PC gamer, going to Spiderman is something you really have to adjust to. Look at Rift Apart: most opt for the 60fps mode there, it seems.
Framerate is part of the image quality.
Rift Apart benefits fairly little from switching from 60fps to 30fps mode. Clearly the 1080p->4k reconstruction method Insomniac uses is very good.

But a developer would give themselves twice the frame budget if they targeted their game to do 30fps at 1080p->4k reconstruction. You can *always* do a lot more at 30fps relative to a given 60fps situation. Always. And I can never, ever see developers just giving up this competitive advantage. I think people underestimate just how much of an arms race exists between developers in the graphics area, as this totally sells games.

Maybe console gamers will just get so used to 60fps that they can never go back, but I really doubt it. Console gamers love to drool over graphics. And when even I can enjoy a 30fps game just fine, despite playing the vast majority of games at 60fps on PC, I'm pretty sure people who primarily play on consoles will be able to adjust back just fine.

Well, it seems that 4k60 for just about all games, as hyped before, was never actually going to materialize. Now we still have to trade fidelity for framerates. Ugh.
This was always a very silly expectation.

And you always have to trade fidelity for framerates. You can never have it all in a fixed-spec scenario. There is genuinely no way around this. It will be the same situation with the next generation in 2028 or whatever.
 
Rift Apart benefits fairly little from switching from 60fps to 30fps mode.

Well, even console gamers prefer 60 over 30 if possible. Console gamers mostly live with 30fps because that has been the norm there (the PS2 had many 60fps games though).

Maybe console gamers will just get so used to 60fps that they can never go back, but I really doubt it. Console gamers love to drool over graphics. And when even I can enjoy a 30fps game just fine, despite playing the vast majority of games at 60fps on PC, I'm pretty sure people who primarily play on consoles will be able to adjust back just fine.

True, I don't think 30fps will ever disappear entirely in the console space. There are hardware limitations with the machines and there will always be compromises to be made somewhere, and I don't think console gamers care anyway, especially in single player games.

This was always a very silly expectation.

Maybe. But it was generally accepted that 4k60 would be the norm due to a 'more powerful CPU'. People forgot games will evolve as well, perhaps even faster than the hardware does at this point.
 
40fps modes will be niche for many years if they become a thing at all. It requires a user to have a 120hz display. I know many (most?) of us here are PC gamers, so it feels more normal (though I still stick with 60hz myself for my own reasons), but it's definitely a lot less common in the console space. Devs can't rely on this at all.

I also think VRR is more likely to standardize quicker on TVs than 120hz. Certainly easier/cheaper for manufacturers to do.

As for Lumen, my argument is mainly that devs *will* be able to use Lumen. If it means they have to target 30fps, so long as they're pushing things in other areas and really showing that their game is a leap ahead of other 60fps titles, I think console users will be very interested.
It'll be 30/40 fps. 40 fps will work with VRR and then squeeze more performance from the GPU.
 
Me waiting for the GDC video

 
In that case, I wonder if it doesn't make more sense to use Nvidia's RTXGI. It's much faster, as it's using proper HW acceleration with RT-enabled cards, perfect for a 60 FPS budget.
Isn't RTXGI pretty much unusable on anything that isn't an RTX graphics card from Nvidia, meant to be coupled with DLSS that is also only available for RTX graphics cards from Nvidia?


How does that help the Series consoles with RDNA2 GPUs from AMD?
 
Isn't RTXGI pretty much unusable on anything that isn't an RTX graphics card from Nvidia, meant to be coupled with DLSS that is also only available for RTX graphics cards from Nvidia?


How does that help the Series consoles with RDNA2 GPUs from AMD?
Thankfully not; it works well on every DXR-capable GPU. It's cross-platform and optimized for every architecture, so it would work nicely on the consoles as well.

DLSS is not needed for RTXGI, as it runs fast even without it :) It also has some other advantages, like not being screen-space dependent and having lower latency (the former is also the case with Lumen, but Lumen tanks the framerate when enabling HW-RT at console target performance, unlike RTXGI, which benefits greatly from HW-RT).
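For anyone unfamiliar with how RTXGI/DDGI works at shading time: irradiance lives in a grid of probes (updated with a few traced rays per frame), and each surface point blends the probes around it. A heavily simplified sketch of that lookup (the names and layout are my own; real DDGI adds octahedral encoding plus normal and visibility weighting):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Simplified probe-grid lookup in the spirit of DDGI/RTXGI: trilinearly
// blend the 8 probes surrounding a shading point. Assumes the point lies
// strictly inside the grid (no bounds checks in this sketch).
struct ProbeGrid {
    Vec3  origin;           // world-space corner of the grid
    float spacing;          // distance between probes
    int   nx, ny, nz;       // probe counts per axis
    const Vec3* irradiance; // one pre-integrated irradiance value per probe

    Vec3 sample(Vec3 p) const {
        // Position in probe-index space.
        float fx = (p.x - origin.x) / spacing;
        float fy = (p.y - origin.y) / spacing;
        float fz = (p.z - origin.z) / spacing;
        int ix = (int)std::floor(fx), iy = (int)std::floor(fy), iz = (int)std::floor(fz);
        float tx = fx - ix, ty = fy - iy, tz = fz - iz;

        Vec3 result{0, 0, 0};
        for (int c = 0; c < 8; ++c) { // the 8 corner probes of the cell
            int px = ix + (c & 1), py = iy + ((c >> 1) & 1), pz = iz + ((c >> 2) & 1);
            float w = ((c & 1) ? tx : 1 - tx) *
                      (((c >> 1) & 1) ? ty : 1 - ty) *
                      (((c >> 2) & 1) ? tz : 1 - tz); // trilinear weight
            const Vec3& e = irradiance[(pz * ny + py) * nx + px];
            result.x += w * e.x; result.y += w * e.y; result.z += w * e.z;
        }
        return result;
    }
};
```

The per-frame cost is dominated by updating the probes, not the lookup, which is why it scales well with hardware RT.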
 
Isn't RTXGI pretty much unusable on anything that isn't an RTX graphics card from Nvidia, meant to be coupled with DLSS that is also only available for RTX graphics cards from Nvidia?

How does that help the Series consoles with RDNA2 GPUs from AMD?

Ask 4A Games. They used a similar technique to RTXGI.
 
Isn't RTXGI pretty much unusable on anything that isn't an RTX graphics card from Nvidia, meant to be coupled with DLSS that is also only available for RTX graphics cards from Nvidia?


How does that help the Series consoles with RDNA2 GPUs from AMD?

RDNA2 runs Exodus Enhanced just fine and I believe it uses RTXGI/DDGI.
 
The question would be what's faster: building a BVH for proxies + HW RT, or streaming SDFs + software tracing with approximated cone tracing.
And it depends on whether they need the SDF for other things as well (reflections, shadows), where RT would do worse for some reasons (like no cones). Having both BVH and SDF is maybe a bit bloated.
We know Lumen is slow, but we don't know how much of that time is taken by their (assumed) probe grid, or which other parts could be replaced by RT with a clear win.
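To make the software side of that comparison concrete, the SDF path boils down to sphere tracing with a widening cone, something like this toy sketch (my own illustration of the general idea, not Lumen's actual code):

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 rayAt(Vec3 o, Vec3 d, float t) {
    return {o.x + t * d.x, o.y + t * d.y, o.z + t * d.z};
}

// Toy cone trace through a signed distance field: march along the ray,
// and wherever the distance to the nearest surface dips below the cone
// radius at that point, accumulate occlusion. This cheap soft visibility
// is what SDF cone tracing buys you over a single hard ray.
float coneTraceOcclusion(Vec3 origin, Vec3 dir, float coneSlope,
                         float (*sdf)(Vec3), float maxDist) {
    float occlusion = 0.0f;
    float t = 0.1f; // small start offset to avoid self-intersection
    while (t < maxDist) {
        Vec3 p = rayAt(origin, dir, t);
        float d = sdf(p);                   // distance to nearest surface
        float coneRadius = coneSlope * t;   // cone widens with distance
        // Fraction of the cone cross-section blocked at this step.
        occlusion = std::max(occlusion,
                             std::clamp(1.0f - d / coneRadius, 0.0f, 1.0f));
        if (occlusion >= 1.0f) break;       // fully blocked, stop early
        t += std::max(d, 0.01f);            // sphere-tracing step size
    }
    return occlusion;
}
```

A single hard BVH ray can't give you that partial-occlusion estimate directly, which is one reason the cone support matters in the comparison.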
 