Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

Status
Not open for further replies.
This is real-time, but I think Xbox needs to improve on this: in-game footage captured on Series X next time, for example.



Yes, but at least these are the assets they use for the game. Maybe the final version won't look as good on consoles, but a 4090 or a 5090 will be able to render it like the initial trailer.

In the Uncharted 4 case, the assets from the trailer were not in the final game.
I mean that's essentially what the developer is saying.

Anyway, I get that the game could come out and not look as good as stated but at least they are stating what it is at the moment. If they didn't state this we would be wondering if it was captured on PC.
 
I think Function means real-time during runtime, not simply captured footage with elements enhanced after the fact.

Remember the controversy surrounding Uncharted 4: A Thief's End announcement, and the actual final product?
They never said that U4 demo was running on PS4, though. I think it was in-engine.
 
You can run at higher settings and lower frame rates, capture it directly from the system, then play back at "normal" frame rates.

"Capturing" from a system does not preclude, for example, playing back at a different speed or pacing to how something was output by a system.

I am not saying they did this, and I think it's a 1:1 recording / playback. And I think this is probably what the developer is trying to convey. But "captured on" or "rendered on" can hide all kinds of shenanigans.
This is allegedly what happened with The Last Guardian when they showed it on PS3. The game was only running at 15fps and they captured it and played it back at twice the speed when they showed it at E3. Then they moved it to PS4.
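The retiming trick described above is simple to do in post. A minimal sketch, assuming nothing beyond the arithmetic: halving every frame's presentation timestamp makes a 15 fps capture play back at 30 fps (the `retime` helper here is purely illustrative, not anything a vendor has confirmed using).

```python
# Sketch: retime a slow capture so it plays back faster.
# Scaling presentation timestamps by 1/speedup doubles apparent
# playback speed when speedup=2.0, e.g. 15 fps capture -> 30 fps video.

def retime(timestamps_s, speedup=2.0):
    """Scale presentation timestamps by 1/speedup."""
    return [t / speedup for t in timestamps_s]

# A 15 fps capture emits one frame every 1/15 s.
capture = [i / 15 for i in range(4)]
playback = retime(capture)   # now one frame every 1/30 s
```

The same idea works in reverse, which is why "captured on hardware" alone doesn't pin down the real frame rate.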
 
I expect Fable will be lower resolution at launch. Looks great and very achievable, really not that different from what's in HFW aside from art style. Good to see an actual marquee AAA game coming out of MS!
 
Was there some debate on this? I mean, everything from what I could remember was labeled as "captured on Xbox/X."

I didn't doubt that it was captured on Xbox and that it was real-time. My issue was that we basically saw nothing about what the actual game is like, except maybe a few seconds of Fable. They were essentially just real-time cutscene reveals. At this point, I don't care about that stuff. If they're going to show the game, then show the gameplay.
 
I didn't doubt that it was captured on Xbox and that it was real-time. My issue was that we basically saw nothing about what the actual game is like, except maybe a few seconds of Fable. They were essentially just real-time cutscene reveals. At this point, I don't care about that stuff. If they're going to show the game, then show the gameplay.
This right here. Also, just because something is shown running on real hardware doesn't mean it will look like that in the final product. The first Uncharted 4 reveal was run on PS4, but the asset quality was higher than what we got in game, and the sand deformed under Nathan Drake when he got up in a way that I'm sure is possible on the hardware, but is absent from the game. It's much like Spider-Man's puddle-gate. Sure, what they showed was possible on the hardware, but that's not what we got in game. Without being overly critical of Digital Foundry, they essentially dismissed puddle-gate because the hardware was capable of using hand-placed, high-quality cube maps at the performance shown in the E3 demo, and it was simply a manpower issue in that it would take too much time to hand-place cube maps for every reflective surface. My opinion is that it's irrelevant whether the hardware can do it. If you demonstrate what a game is going to look like and it looks worse when we get it, that's setting false expectations.
 
Heard reports that The Outer Worlds: Spacer's Choice Edition's latest PC patch (1.3) had improved things substantially since launch, which had massive GPU demands for one, but the worst aspect was the horrific shader stutter. So I plunked down the upgrade price and gave it a shot on my system (32GB, 3060, i5-12400f).

I only played for an hour last night, so this isn't an exhaustive look, but I'm happy to report that, at least in my short experience, it's night and day since release. First impressions aren't the best at launch, since they made the UX mistake of not communicating to the user what it's doing: there's a black screen with a 'press button/key' prompt, but nothing registers. The reason is that it's churning away at shaders in the background, great! But they just need to pop up a little bubble saying "Compiling Shaders"; I thought it had crashed, since my CPU was basically pegged doing the compiling and not registering my button presses. Still, it's only around a one-minute wait; the process is at least well multi-threaded. :)
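The boot-time flow described above (precompile every shader on a pool of worker threads, and tell the user what's happening) can be sketched as follows. This is a hedged illustration, not the game's actual code: `compile_shader` is a stand-in for the real driver-side compile, and the names are hypothetical.

```python
# Sketch: precompile shaders on a thread pool at boot and surface
# progress instead of an unresponsive black screen.

from concurrent.futures import ThreadPoolExecutor, as_completed

def compile_shader(name):
    # Placeholder for real compilation work (driver / graphics API call).
    return f"{name}.bin"

def precompile(shaders, workers=8):
    done, results = 0, {}
    with ThreadPoolExecutor(max_workers=workers) as pool:
        futures = {pool.submit(compile_shader, s): s for s in shaders}
        for fut in as_completed(futures):
            results[futures[fut]] = fut.result()
            done += 1
            # The "little bubble" the post asks for:
            print(f"Compiling shaders... {done}/{len(shaders)}")
    return results

cache = precompile(["ui", "terrain", "water"])
```

Even a plain counter like this tells the player the game is working rather than hung.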

But what a difference that one minute makes. At release, every action you took would produce a shader stutter. Look in this direction: stutter. Fire your gun: stutter. Explosion? Stutter. That's all gone now. Again, after only an hour I can't guarantee their compiling step is fully comprehensive, but the only frame drops I had were due to obvious GPU demands. Ultra on everything can be more taxing than the original game due to the new lighting, but it also scales well. I play with Ultra textures at a mixture of medium/high settings, 4K with FSR Performance, and get a mostly locked 60. Cards like a 3070 should be able to do Ultra 4K with FSR Quality for 60+ fps, I reckon.

There's a frustrating lack of DLSS; it's FSR 2 only, which does a 'decent' job, but DLSS would of course be appreciated. There are still some odd artistic choices, like wildly oversaturated colours at points, but the better textures and lighting in other scenes are definitely noticeable. So it may still not be to everyone's tastes, but at least on a frame-rate consistency basis (again, the caveat being this was a short run) this is a huge improvement, and it's more stable than even the original now.
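For reference, the FSR 2 modes mentioned above correspond to fixed per-axis render-scale factors in AMD's published presets (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x), so "4K with FSR Performance" renders internally at 1080p. A small sketch of that arithmetic:

```python
# FSR 2 per-axis scale factors from AMD's published presets.
FSR2_SCALE = {"Quality": 1.5, "Balanced": 1.7,
              "Performance": 2.0, "Ultra Performance": 3.0}

def internal_res(out_w, out_h, mode):
    """Internal render resolution for a given output size and FSR 2 mode."""
    s = FSR2_SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
```

That internal resolution is what the GPU actually shades, which is why Performance mode buys so much headroom.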

As for the console versions, here's a video of the PS5 version with Patch 1.3. There are definitely some improvements, and a good chunk of it can sit in your VRR window now, it seems. But considering how well the game scales on my system, it's strange: there should be plenty of opportunity to just lower the resolution/settings further and get a solid 60. So it's less impressive on the console front at the moment.
 
This right here. Also, just because something is shown running on real hardware doesn't mean it will look like that in the final product. The first Uncharted 4 reveal was run on PS4, but the asset quality was higher than what we got in game, and the sand deformed under Nathan Drake when he got up in a way that I'm sure is possible on the hardware, but is absent from the game. It's much like Spider-Man's puddle-gate. Sure, what they showed was possible on the hardware, but that's not what we got in game. Without being overly critical of Digital Foundry, they essentially dismissed puddle-gate because the hardware was capable of using hand-placed, high-quality cube maps at the performance shown in the E3 demo, and it was simply a manpower issue in that it would take too much time to hand-place cube maps for every reflective surface. My opinion is that it's irrelevant whether the hardware can do it. If you demonstrate what a game is going to look like and it looks worse when we get it, that's setting false expectations.
Didn't ND come out and state that the original Uncharted 4 trailer wasn't running on a single PS4 but on more than one of them linked together in a server-farm configuration?
 

Great article. I particularly liked the confirmation of UE4's issues with multithreading and how they overcame them for Hi-Fi Rush.

Also great (if somewhat obvious) advice there to direct our displeasure at poor ports towards the publishers that forced them out before they were ready, rather than the devs who were likely working their asses off to get it as ready as they could in too small a timeframe. And of course, don't buy day 1 unless you're already sure the game is in a good state. Voting with your wallet is the best way to get things changed. I really want Jedi Survivor, but I'm waiting until it's fixed.
 
Great article. I particularly liked the confirmation of UE4's issues with multithreading and how they overcame them for Hi-Fi Rush.

Also great (if somewhat obvious) advice there to direct our displeasure at poor ports towards the publishers that forced them out before they were ready, rather than the devs who were likely working their asses off to get it as ready as they could in too small a timeframe. And of course, don't buy day 1 unless you're already sure the game is in a good state. Voting with your wallet is the best way to get things changed. I really want Jedi Survivor, but I'm waiting until it's fixed.

Agreed. To that:

Digital Trends said:
I reached out to the development team to get an idea about what went wrong, and although they were initially receptive, EA stepped in and stopped returning my emails.
 
The two most popular commercial 3D game engines, Unity and Unreal, both seem to suffer in multithreading performance, especially for streaming, as a lot of deserialization APIs are only callable on the main game thread (Unity is probably even worse than UE in this respect).
Putting PC gaming aside, loading stutters are far more frequent in many 3D mobile games, and so far I've seen very little effort on that front either (I noticed this long before the recent lousy-PC-port trend). I'm not talking about simple polygon games, but big-budget 3D mobile games. I've seen many cases where the game just halts for a second when loading a cutscene or landing the first hit VFX.
I guess this may be a good reason why some studios still choose to build in-house engines: they want to escape the legacy design of the commercial ones.
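A common workaround for the main-thread-only deserialization constraint described above is to do the expensive parsing on a worker thread and leave only the (cheap) engine-object creation on the main thread. A minimal sketch under that assumption; all names here are hypothetical, and `json.loads` stands in for whatever heavy deserialization the engine forbids off-thread:

```python
# Sketch: parse asset blobs on a worker thread, then hand parsed
# results to the main thread, which only has to instantiate objects.

import json
import queue
import threading

ready = queue.Queue()

def worker(raw_blobs):
    for blob in raw_blobs:
        ready.put(json.loads(blob))   # heavy parse, off the main thread

def main_thread_pump(expected):
    created = []
    while len(created) < expected:
        data = ready.get()            # cheap: just create the engine object
        created.append({"entity": data["name"]})
    return created

blobs = ['{"name": "tree"}', '{"name": "rock"}']
threading.Thread(target=worker, args=(blobs,), daemon=True).start()
entities = main_thread_pump(len(blobs))
```

The main-thread step stays small and bounded, so frame pacing survives even while assets stream in.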
 
Final Fantasy 16's performance is rather awful: resolutions as low as 720p, FSR 1, unstable fps. This is frankly getting tiring. This is basically Forspoken again, but with better art. Setting aside the quality of the game as shown by reviews, how can this keep happening? They even bake their lighting, which should save performance, and somehow they're delivering Switch-level resolutions/framerates. smh
 
The game looks great, but it takes the hardware a long time to render. Quality mode is the default setting here and locks to its frame-rate target.
The game is inconsistent. Sometimes it looks good, sometimes it's very mid, and that usually occurs during daylight. Also, 1080p in its "quality mode", at 30 fps? I really do wish there was a PC port, because it's sinking to levels that are unacceptable for a PS5 game. Again, for a cross-gen game like this to be running at these resolutions/performance is very concerning.
 
The game is inconsistent. Sometimes it looks good, sometimes it's very mid, and that usually occurs during daylight. Also, 1080p in its "quality mode", at 30 fps? I really do wish there was a PC port, because it's sinking to levels that are unacceptable for a PS5 game. Again, for a cross-gen game like this to be running at these resolutions/performance is very concerning.
It's not a cross-gen game. Looking at the shadows and VFX, it's not surprising that it's GPU-bound and needs to run at 1080p.
 