Digital Foundry Article Technical Discussion [2023]

Would be surprising if PS5 has more usable RAM than Series X. I thought PS5 allocated more for OS functions than Series X did.
 
Nanite without VSMs is often a performance disaster with no real upside... the two were designed to work together.

Something heavy-handed like turning off GI entirely might be interesting from a technical user's point of view, but just flicking a switch would probably make the game completely unplayable, with regions that are pitch black or otherwise broken. Sure, you could probably make it look like a PlayStation 2 game by putting in some constant ambient term, but no self-respecting game dev would want their art portrayed in that manner.

Really what people would "expect" in such a version is a version with baked lighting which - as noted explicitly in the interview - would be a whole lot of work... effectively re-doing the graphics and lighting largely from scratch.

Personally, I'd rather they go in the other direction if anything in future content: you're paying for dynamic lighting, so I'd love to see them make more use of it. More moving/animating lights and so on. That said, if they can find more console performance, it's probably best spent on resolution on those platforms.


You've posted that a few times, but that's specifically from the "the path" sequence in the crazy dreamscape type world, right? If you look at the skybox there I can only assume the weird flat look is intentional. Could be some sort of bug but the same assets look regularly lit and fine when they are in the context of the real world.

Interesting interview in any case, thanks for doing that DF. Curious to see where they go from here!

I'm not suggesting they allow you to turn off VSMs and replace them with baked shadows or some other shadowing. Just turn them off. Same with Lumen GI and Lumen reflections: just allow people to turn them off, or at most replace them with screen-space reflections and whatever AO Fortnite uses when you turn off Lumen. That's basically how I played Fortnite for a while. I turned off VSMs so there were no shadows at all, and I turned Lumen off completely. That way I could get something like 170fps in most areas for my 240Hz display. Sure, it makes the game look kind of terrible, but some people just want the gameplay, and it's a pretty nice demonstration of what those graphics options are giving you when you turn them on.
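For anyone curious how those toggles map under the hood, stock UE5 exposes them as engine console variables. A minimal Engine.ini sketch (the cvar names below are real UE5 ones; whether Fortnite's settings menu wires its toggles to exactly these is my assumption):

[SystemSettings]
r.DynamicGlobalIlluminationMethod=0  ; 0 = no dynamic GI (Lumen GI off)
r.ReflectionMethod=2                 ; 2 = screen-space reflections (0 = none, 1 = Lumen)
r.Shadow.Virtual.Enable=0            ; disable virtual shadow maps entirely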

Here's Fortnite with Nanite ON, comparing VSM/Lumen GI/Lumen reflections OFF (AO and SSR not enabled) vs EPIC, for people that want to see.
Fortnite   2023-08-28 6_41_19 PM.jpg

Fortnite   2023-08-28 6_41_54 PM.jpg

Here's another
Fortnite   2023-08-28 6_52_03 PM.jpg
Fortnite   2023-08-28 6_52_44 PM.jpg
 
IoA developer confirms PS5 is running at higher settings. This is taken from the game's subreddit.


View attachment 9479
Why would you quote me in this? My response had nothing to do with any of this. My comment was about your ridiculous claim that the difference between the XSX and PS5 versions was as visually different as the difference between the XSS version and the top versions.

It's becoming quite clear you're really only here as a dishonest platform warrior.
 
The PS5 is a little better so they pushed the fidelity a little more but it performs worse...

Is it just me, or does that not make sense?
 
But it doesn’t make sense. You’d think they’d push higher fidelity but keep the frame rate. If they push higher fidelity but the fps tanks, what’s the point?

Exactly. If you believe you have more headroom on one platform, so you up the visual quality, but the end result is lower performance than on the platform you felt was more constrained, then you didn't in fact have that headroom.
 
Why would you quote me in this? My response had nothing to do with any of this. My comment was about your ridiculous claim that the difference between the XSX and PS5 versions was as visually different as the difference between the XSS version and the top versions.

I said the resolution and/or image clarity difference between PS5 and Series X is almost as noticeable as the difference between Series S and X, and I stand by this. Your reply and tone suggested I was way off base. The developer's commentary and Tom's subsequent addendum to his tech analysis and review support my initial observation.

My point is that next to the PC running at native 1440p, the PS5 does not look like an FSR2 4K output upscaled from a 720p base. And if it is, then I'd say that's a relatively excellent result. I also didn't see any image-sharpening artifacts in the DF video or this one. Curious as to what's truly going on here. To me the PS5/XSX difference is almost as noticeable as the resolution difference between the Series S and Series X.
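For scale, assuming the reported pixel counts are accurate: 720p to 2160p is a 3x upscale per axis (9x the pixel count), which matches FSR 2's Ultra Performance ratio, i.e. its most aggressive reconstruction. If the PS5 image genuinely holds up next to a native 1440p PC image from that base, that's a strong result for FSR2.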



It's becoming quite clear you're really only here as a dishonest platform warrior.

Dishonest? Lol. There was nothing within this exchange where you can label me a warrior.

I really hope mods clean this up. People shouldn't be permitted to recklessly throw out "platform warrior" accusations ESPECIALLY when I'm quoting the developer explanation on the differences between PS5 and Series consoles.
 
What is confusing about this? Graphics settings and performance typically have an inverse relationship.
Again, use your brain. PS5 should perform better with the same settings, or perform the same with higher settings, if it has more headroom. Where is the headroom if it has better settings…but performs worse? That's just the normal settings/performance trade-off. And it's not just 5-10% worse. It's up to 25%.
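To put rough numbers on that 25% (my arithmetic, assuming a 60fps target, not anyone's measurement): 60fps is 1000/60 ≈ 16.7ms per frame, while 25% fewer frames (45fps) is 1000/45 ≈ 22.2ms. That's about 5.5ms of extra cost per frame, which is a lot to attribute to "headroom".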
 
Again, use your brain. PS5 should perform better with the same settings, or perform the same with higher settings, if it has more headroom. Where is the headroom if it has better settings…but performs worse? That's just the normal settings/performance trade-off.

I wasn't disrespectful to you, so what's with this language?

The developer has chosen to push visual fidelity at a performance ceiling/floor of their choosing. For example, if you choose to turn on Max RT in Cyberpunk, those who appreciate the added fidelity over performance will keep the setting active, while those who deem the performance hit too great will pass on Max RT. The same concept is going on here, except on console the developer makes the choice of performance profile for you.
 
I wasn't disrespectful to you, so what's with this language?

The developer has chosen to push visual fidelity at a performance ceiling/floor of their choosing. For example, if you choose to turn on Max RT in Cyberpunk, those who appreciate the added fidelity over performance will keep the setting active, while those who deem the performance hit too great will pass on Max RT. The same concept is going on here, except on console the developer makes the choice of performance profile for you.
Yeah, so where is that extra headroom? It performs significantly worse but looks better? Usually having headroom means they'd end up equal or in the same ballpark.

The conclusion would have made sense in one of the scenarios I mentioned.

The interview also mentioned "performance parity" but the performance isn’t on par. Did these guys even test their own game?
 
Yeah, so where is that extra headroom? It performs significantly worse but looks better? Usually having headroom means they'd end up equal or in the same ballpark.

The conclusion would have made sense in one of the scenarios I mentioned.

The interview also mentioned "performance parity" but the performance isn’t on par. Did these guys even test their own game?

The extra headroom was for visual fidelity, and it's not all tied to GPU compute. For example, the texture streaming pool is going to be VRAM dependent, and the developer says PS5 has an advantage here, so asset quality can be pushed higher. And of course async compute is another PS5 advantage he mentions. Also, as I explained in my first reply, developers will have their own tolerance levels for the fidelity vs performance trade-off. Remember, far more often than not PS5 and Series X are running at 60fps, while PS5's higher settings are always in effect. In the heaviest situations, where PS5 reveals the framerate discrepancy vs Series X, it may come down to the developer's willingness to accept the lower framerate for the sake of keeping PS5's extra visual bells and whistles active.
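On the streaming pool point specifically: in UE5 that budget is a per-platform tunable, so it's cheap for a developer to raise it wherever more memory is free. A hedged illustration (r.Streaming.PoolSize is the real UE5 cvar, sized in MB; the value below is invented, not IoA's actual config):

[SystemSettings]
r.Streaming.PoolSize=3072  ; bigger texture streaming pool where VRAM allows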

I hope this explanation is clear. Not sure why this has to be contentious; it's great that the developer is sharing how the sauce was made in such detail. We're learning much more than we bargained for, and with all due respect to DF (it was still great coverage), it goes much further than their reporting in some significant ways.
 
The knowledge drops continue. This is truly awesome stuff. According to the developer, Nanite really favors PS5 throughput. I recall going back and forth with some other members on this topic, so it's nice to have confirmation from the developer of the first game to use all of UE5's key features.

IoA 3.JPG
 
The knowledge drops continue. This is truly awesome stuff. According to the developer, Nanite really favors PS5 throughput. I recall going back and forth with some other members on this topic, so it's nice to have confirmation from the developer of the first game to use all of UE5's key features.

View attachment 9493
Sorry, I'm not really following. Doesn't XSX have better throughput in terms of the specs he posted? I remember XSX has 10GB of fast GDDR6 (320-bit bus, 560GB/s). Obviously you're not going to use the 6GB of slower RAM for graphics, so how is it "higher bandwidth GDDR6"?
I'm just kinda confused here.
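For the raw numbers: both consoles use 14Gbps GDDR6, so XSX's 320-bit bus works out to (320/8) × 14 = 560GB/s on the 10GB fast pool (the other 6GB sees 336GB/s), while PS5's 256-bit bus gives (256/8) × 14 = 448GB/s across the full 16GB. On paper XSX has the higher peak, just not uniformly across all of its memory.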
 
I'm confused. Is this "Leather-tomorrow" person verified to be a dev working on Immortals? His statements are a bit strange.
It's reddit, so would the people verifying them even be verified themselves lol.

It's like the PS5 was the lead platform for Immortals, so it got more polish vs the Xbox version.
 
IoA developer confirms PS5 is running at higher settings. This is taken from the game's subreddit.


View attachment 9479

That's certainly interesting, but the original debate wasn't about graphical settings or even texture resolution, but rather base resolution, which you claimed was likely 1080p on PS5 vs 720p on XSX despite Tom re-confirming the 720p pixel counts on both.

The above quote still doesn't support your assertion, although it does seem to suggest the PS5 is running with higher texture resolution. That in itself is very strange though, given basically no other game behaves this way, which you'd expect to be more common if the PS5 really does have so much more available memory.

The knowledge drops continue. This is truly awesome stuff. According to developer, Nanite really favors PS5 throughput. I recall going back and forth with some other members on this topic so it's nice to have confirmation from developer of the first game to use all UE5 key features.

View attachment 9493

The quote you post doesn't support your statement.

As noted already, the PS5's GDDR6 speeds are quite low compared to both XSX and many PC configs, while its 256-bit bus is a standard feature in other systems too.

Of course we know Nanite requires an SSD and would very likely favour the NVMe standard, but the 5.5GB/s raw throughput on the PS5 has not been shown to offer any advantage in this respect, since streaming requirements, whenever measured, fall far, far below it. Also, 5.5GB/s isn't actually that fast anymore, so if that's the reason why Nanite "favours the PS5" (which he didn't actually state) then it would likely favour even faster systems more.
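As a back-of-envelope illustration (the 500MB/s figure is a hypothetical chosen for scale, not a measurement): even if Nanite sustained 500MB/s of geometry streaming, that's under 10% of PS5's 5.5GB/s raw and around 20% of XSX's 2.4GB/s raw, so neither drive should be anywhere near its limit.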
 
But it doesn’t make sense. You’d think they’d push higher fidelity but keep the frame rate. If they push higher fidelity but the fps tanks, what’s the point?
There have been many games that run at higher resolution on XSX and perform worse than on PS5, so I'm not sure why this is such a big surprise (just the opposite scenario here, with settings instead of resolution). IMO games aren't tested as deeply as they should be, and that's the reason.
 