Ridiculous amount of detail. Final Fantasy: The Spirits Within who?
So good.
I'll be impressed if LOD2 looks good. In the sample MetaHumans they fall apart very fast, which makes me assume their hair implementation doesn't scale well.
30fps games are gonna be a thing again on consoles. There's this big wave of '60fps or bust' for a lot of the enthusiast console gamers online right now, but they showed last generation they were just fine with 30fps. And honestly, 30fps *is* entirely playable. Had a great time beating TLOU2 on a base PS4 recently, for example.
Developers can revert back to baked lighting if Lumen doesn't improve on performance.
All it will take is a developer showing again how much you can improve things when you have double the frame-time budget. Something really extraordinary looking by a big-name dev/franchise that's also heavily utilizing reconstruction techniques and whatnot (so there's not a lot of room to just 'drop resolution' for 60fps scaling). People will drool all over it and line up to buy it and praise it.
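For anyone who wants the "double the frame time" point made concrete, here's a minimal sketch of the arithmetic; it's nothing more than 1000ms divided by the target framerate:

```cpp
#include <cstdio>

int main() {
    // Frame budget is simply 1000 ms divided by the target framerate.
    const double budget30 = 1000.0 / 30.0; // ~33.3 ms per frame
    const double budget60 = 1000.0 / 60.0; // ~16.7 ms per frame

    std::printf("30fps budget: %.1f ms\n", budget30);
    std::printf("60fps budget: %.1f ms\n", budget60);
    std::printf("extra headroom at 30fps: %.1f ms (double the budget)\n",
                budget30 - budget60);
}
```

That extra ~16.7ms applies to every system on the frame, GPU and CPU alike, which is why a committed 30fps title can look so far ahead of its 60fps peers.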
I genuinely never heard one console player in the world complain about Uncharted 4, God of War, Horizon Zero Dawn, Spiderman, Red Dead Redemption 2, Final Fantasy VII Remake, The Last of Us 2, etc. only being 30fps. Literally not once. But what I did hear was endless praise for these games and how amazing they looked. The hype behind these games was built heavily on how amazing they looked as well, so it was a huge part of their success.

And then I'd hear the constant moaning about how they wish it was 60 FPS, and how much better it would be even if graphics were toned down noticeably.
Also, 40 FPS games might become a thing this gen, as John and Richard discussed on DF, for devs who want to push the visual quality and keep the input lag on the low side.

40fps modes will be niche for many years if they become a thing at all. It requires the user to have a 120Hz display. I know many (most?) of us here are PC gamers, so it feels more normal (though I still stick with 60Hz myself for my own reasons), but it's definitely a lot less common in the console space. Devs can't rely on this at all.
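The reason 40fps is tied to 120Hz displays is pure divisibility: a 40fps frame (25ms) lands cleanly on every third refresh of a 120Hz panel, while a 60Hz panel can't present it at an even cadence. A tiny sketch of that check:

```cpp
#include <cstdio>

// A framerate presents at an even cadence only when the display refresh
// rate is an integer multiple of it.
bool evenCadence(int refreshHz, int fps) {
    return refreshHz % fps == 0;
}

int main() {
    std::printf("40fps on 120Hz: %s\n", evenCadence(120, 40) ? "even (every 3rd refresh)" : "judder");
    std::printf("40fps on 60Hz:  %s\n", evenCadence(60, 40)  ? "even" : "judder (uneven cadence)");
    std::printf("30fps on 60Hz:  %s\n", evenCadence(60, 30)  ? "even (every 2nd refresh)" : "judder");
}
```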
Lastly, yes, of course it's easy to choose the 60fps option when the 30fps option gains you fairly minimal image quality and fidelity. My point is that this wouldn't be the case if a developer chose both a 30fps target *and* a, say, 1080p-reconstructed-to-4K resolution to build their game off of as the default. THIS is how a dev is really gonna make full use of these new consoles. It cuts away the ability to scale the resolution/graphics to the significant degree you need to get from 33.3ms to 16.6ms. And this is only considering graphics. If a developer wants to maximize the CPU capabilities of the new consoles with a 30fps target, that will put a much harder limit on practical scalability.
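To put rough numbers on that scalability argument (just pixel counts, nothing engine-specific): a 1080p internal resolution already shades a quarter of the pixels of native 4K, so a game built around reconstruction at 33.3ms has largely spent its cheapest scaling lever.

```cpp
#include <cstdio>

int main() {
    const long long native4k = 3840LL * 2160; // 8,294,400 shaded pixels
    const long long base1080 = 1920LL * 1080; // 2,073,600 shaded pixels

    // Reconstructing 4K from a 1080p base shades only a quarter of the pixels.
    std::printf("native 4K : %lld pixels\n", native4k);
    std::printf("1080p base: %lld pixels (%.0fx fewer)\n",
                base1080, (double)native4k / base1080);

    // If that 4x saving is already baked into hitting 33.3 ms at 30fps,
    // there's no comparable resolution lever left to reach 16.7 ms, and
    // CPU-bound work doesn't scale with resolution at all.
}
```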
That's because they didn't play those, or even other games, regularly at 60fps or higher. As a PC gamer, going to Spiderman is something you really have to adjust to. Look at Rift Apart, most opt for the 60fps mode here it seems.

Rift Apart benefits fairly little from switching from 60fps to 30fps mode. Clearly the 1080p->4K reconstruction method Insomniac uses is very good.
Framerate is part of the image quality.
Well, it seems that 4K60 for just about all games, as hyped before launch, was never actually going to materialize. Now we still have to trade fidelity for framerates. Ugh.
The XSX has better RT and more compute to hold 4K in DRS scenarios @ 60fps, mainly for the hardware enthusiasts.
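For anyone unfamiliar, DRS just means the engine watches GPU frame time and nudges the internal resolution up or down to protect the framerate target. A toy sketch of the idea; the thresholds, step size, and names here are illustrative, not any engine's actual controller:

```cpp
#include <cstdio>

// Toy dynamic-resolution (DRS) controller: nudge the internal resolution
// scale to keep measured GPU frame time near the target budget.
struct DrsController {
    double scale = 1.0; // fraction of native resolution per axis

    void update(double gpuMs, double budgetMs) {
        if (gpuMs > budgetMs * 1.05 && scale > 0.5)
            scale -= 0.05; // over budget: drop resolution
        else if (gpuMs < budgetMs * 0.85 && scale < 1.0)
            scale += 0.05; // plenty of headroom: raise it back up
    }
};

int main() {
    DrsController drs;
    const double budget = 1000.0 / 60.0;                     // 16.7 ms for 60fps
    const double samples[] = {18.0, 17.5, 16.0, 13.0, 12.5}; // fake GPU timings
    for (double gpuMs : samples) {
        drs.update(gpuMs, budget);
        std::printf("gpu %.1f ms -> scale %.2f\n", gpuMs, drs.scale);
    }
}
```

Real controllers filter the timings and step far more carefully, but this is the basic feedback loop that lets a console hold 60fps while resolution floats.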
Maybe console gamers will just get so used to 60fps that they can never go back, but I really doubt it. Console gamers love to drool over graphics. And when even I can enjoy a 30fps game just fine, despite playing the vast majority of games at 60fps on PC, I'm pretty sure people who primarily play on consoles will be able to adjust back just fine.
This was always a very silly expectation.
It'll be 30/40 fps. 40fps will work with VRR, and they'll squeeze more performance from the GPU.
I also think VRR is likely to standardize on TVs quicker than 120Hz. It's certainly easier/cheaper for manufacturers to do.
As for Lumen, my argument is mainly that devs *will* be able to use Lumen. If it means they have to target 30fps, so long as they're pushing things in other areas and really showing that their game is a leap ahead of other 60fps titles, I think console users will be very interested.
In that case, I wonder if it doesn't make more sense to use Nvidia's RTXGI. It's much faster, as it's using proper HW acceleration on RT-enabled cards; perfect for a 60 FPS budget.
Isn't RTXGI pretty much unusable on anything that isn't an RTX graphics card from Nvidia, meant to be coupled with DLSS, which is also only available for RTX graphics cards from Nvidia? How does that help the Series consoles with RDNA2 GPUs from AMD?

Thankfully not, it works on every DXR-capable GPU. It's cross-platform and optimized for every architecture, so it would work nicely on the consoles as well.
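"Every DXR-capable GPU" is a queryable property too: on PC, an engine can check the D3D12 raytracing tier at startup and pick a DXR-based GI path versus a fallback accordingly. A minimal sketch using the standard D3D12 feature query (device creation omitted; this only shows the capability check, not RTXGI itself):

```cpp
#include <windows.h>
#include <d3d12.h>

// Given an already-created ID3D12Device, report whether the GPU supports
// DXR at all (Tier 1.0+), which is what a DXR-based GI solution requires.
bool SupportsDxr(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```

Any GPU passing that check, RDNA2 included, can run a DXR workload; DLSS is a separate, genuinely RTX-only feature.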