Digital Foundry Article Technical Discussion [2020]

Insomniac is of a higher caliber IMO, with the added benefit of being able to focus completely on PlayStation.
I think a lot of Spiderman's non-RT design decisions are in play here. Like @jlippo said, the game uses a static time of day between missions. It is unmoving; indirect lighting is not changing in real time. WDL, on the other hand, has a real-time time of day with indirect lighting that changes along with it. From what I can see in the config files for WDL, its indirect lighting is different from its predecessors', and it is pretty expensive from what I can see on PC.
Also, Spiderman MM makes compromises there that WDL does not: Spiderman MM has no RT on water, no reflections in reflections, and the geometry in its reflections is a lower LOD than the primary view. None of that is done in WDL, so WDL makes compromises in other directions instead.

But the most important reason these two RT reflection implementations can make different trade-offs, even though both are open-world games, is that WDL has a much higher base frametime cost without RT. Without RT, WDL would quite obviously be targeting 30 fps, based on this game's performance on PC. Spiderman without RT, on the other hand, is targeting 60 fps... it has a lot more frametime to throw into its RT reflections, presumably because it has a lighter GPU load in the first place.
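A back-of-the-envelope way to see the budget argument. Every number here is a made-up assumption for illustration (the game names and baseRenderMs figures are hypothetical, not measured values from either title):

```cpp
// Frame-budget arithmetic: a 30 fps target leaves ~33.3 ms per frame, a 60 fps
// target ~16.7 ms. Whatever the base (non-RT) pass doesn't consume is what is
// left over for RT. All baseRenderMs values below are invented for illustration.
#include <cstdio>

int main() {
    const double kMsPerSecond = 1000.0;

    struct Game { const char* name; double targetFps; double baseRenderMs; };
    const Game games[] = {
        {"WDL-like (30 fps base, heavy base pass)",       30.0, 27.0},
        {"Spiderman-like (60 fps base, light base pass)", 60.0, 11.0},
    };

    for (const Game& g : games) {
        double budgetMs = kMsPerSecond / g.targetFps;    // total frame budget
        double rtHeadroomMs = budgetMs - g.baseRenderMs; // left over for RT
        std::printf("%s: %.1f ms budget, %.1f ms base, %.1f ms headroom for RT\n",
                    g.name, budgetMs, g.baseRenderMs, rtHeadroomMs);
    }
    return 0;
}
```

The point is just that RT headroom is frame budget minus base pass, so a lighter base pass can matter more than a longer budget.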
 

Its indirect lighting is not producing an impressive visual result even by current-gen console standards, judging by your PC deep dive. Are those sacrifices not present in the console version, or just in the PC version at ultra? Why is the game so heavy to begin with? Judging from the end result alone, it seems poorly optimized, or Ubisoft made poor choices about where to spend performance. Lastly, where on earth is all that CPU power going in the PC version?
 
The GI in WDL is very much producing impressive results. I think you are just not seeing them in the media you look at, or your eye is not catching them. Its quality is why people on PC have been able to make shots like these:

I have even seen shots of alleyways where buildings produce very noticeable, lined indirect shadows from the sky lighting. Almost like the game is using a type of SVOGI.
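For what it's worth, if it really is something SVOGI-like, the core idea is cone tracing through a prefiltered voxelization of the scene. A minimal toy sketch of that idea below; the grid, the slab "building", and all numbers are invented, and this is definitely not Ubisoft's actual implementation:

```cpp
// Toy sketch of the cone-marching idea behind SVOGI-style GI: march a cone
// through a voxel occupancy grid, widening the sample footprint with distance,
// accumulating occlusion front-to-back. Illustrative only.
#include <algorithm>
#include <cstdio>

const int N = 32;
float grid[N][N][N]; // 0 = empty, 1 = solid (prefiltered occupancy in a real SVO)

float sampleOcclusion(float x, float y, float z) {
    int i = (int)x, j = (int)y, k = (int)z;
    if (i < 0 || j < 0 || k < 0 || i >= N || j >= N || k >= N) return 0.0f;
    return grid[i][j][k];
}

// How much sky light survives along one cone (1 = fully unoccluded).
float coneVisibility(float px, float py, float pz,
                     float dx, float dy, float dz, float halfAngleTan) {
    float transmittance = 1.0f;
    float t = 1.0f; // start one voxel out to avoid self-occlusion
    while (t < (float)N && transmittance > 0.05f) {
        float radius = halfAngleTan * t; // cone footprint grows with distance
        float occ = sampleOcclusion(px + dx * t, py + dy * t, pz + dz * t);
        transmittance *= 1.0f - occ;     // front-to-back accumulation
        t += std::max(radius, 1.0f);     // step size tracks the footprint
    }
    return transmittance;
}

int main() {
    // A partially occluding "building" slab above the sample point, as in the
    // alleyway example (0.5 = prefiltered/partial occupancy).
    for (int i = 10; i < 20; ++i)
        for (int j = 22; j < 28; ++j)
            for (int k = 10; k < 20; ++k)
                grid[i][j][k] = 0.5f;

    // One upward cone from street level, partly shadowed by the slab.
    std::printf("sky visibility: %.2f\n",
                coneVisibility(15.0f, 2.0f, 15.0f, 0.0f, 1.0f, 0.0f, 0.3f));
    return 0;
}
```

The widening footprint is what would produce those soft, lined indirect shadows from sky lighting: occluders high above still register, but with blurred edges.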
 

Spiderman needs this raytracing solution because of the verticality of the game; if it were structured like Watch Dogs Legion, they would have made a different choice. Different choice, different compromise. My only complaint with WDL: I would prefer if dynamic objects disappeared further into the distance inside the reflection.

My major complaint with Spiderman is the lack of reflections in reflections, but that is probably impossible this generation; maybe the next one (see the sketch below for why depth matters).

EDIT:

Not the case for the PC version, just the console versions (PS5 and XSX).
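On the reflections-in-reflections point: it amounts to a ray recursion depth of 2+ instead of 1, and every extra bounce is another full scene traversal per pixel. A toy sketch of the scaling (renderer-agnostic, not any shipping engine's code; Hit, traceRay and shade are all hypothetical stand-ins):

```cpp
// Why "reflections in reflections" cost so much: each extra bounce is another
// full trace, so cost grows with the recursion depth cap. Illustrative only.
#include <cstdio>

struct Hit { bool mirror; };

// Stand-in for a real ray cast against the scene (assumption: it always hits
// a mirror here, to show the worst case).
static int g_raysTraced = 0;
Hit traceRay() { ++g_raysTraced; return Hit{true}; }

void shade(const Hit& h, int depth, int maxDepth) {
    // Base shading happens here; then, optionally, recurse for the reflection.
    if (h.mirror && depth < maxDepth) {
        Hit next = traceRay();      // one more full traversal per bounce
        shade(next, depth + 1, maxDepth);
    }
}

int main() {
    for (int maxDepth = 1; maxDepth <= 3; ++maxDepth) {
        g_raysTraced = 0;
        Hit primary = traceRay();   // primary reflection ray
        shade(primary, 1, maxDepth);
        std::printf("maxDepth=%d -> %d rays per pixel (worst case)\n",
                    maxDepth, g_raysTraced);
    }
    return 0;
}
```

And that is the friendly case of one reflection ray per hit; glossy surfaces that need several rays per bounce grow the cost exponentially with depth, which is why depth 1 is the common cut.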

Seems like the only hotspot is the LAN port, which gets a few degrees warmer than the rest. But overall it seems cool and quiet.

OK, what is with the terrible Casio accompaniment music playing in the background throughout the video?

It's awful. :yep2:
 
Yeah, it's the design of the PS5: the outer fins create an additional air pocket outside the plastic shell, and that pocket has to heat up before the outer fins do, making it impossible to hold the console and feel the heat. Removing the fins and measuring the black shell underneath would be an interesting experiment, though not reflective of how people would actually use it. That said, don't use it with the fins off, IMO.

I do wish the power draw limit were higher than 200W.
 
Very interesting. It seems power consumption tops out at 200-205W, as Richard suggested, putting both consoles close to their maximum power consumption.

I assume this is why MS didn't go for variable frequencies. Basically, both targeted power and cooling specifications for ~200W of total power consumption, except Sony had a reason to go variable and MS didn't.

In Sony's case, with 36 CUs, capping it at say 2.0GHz would probably still result in spikes around 200W, if not now then certainly in the future, as Cerny said. So they would rather cap the power supply at the targeted 200W and downclock when a few frames push past it, than go with fixed clocks 10% lower that would still require a similar cooling and power supply in case of spikes at that wattage.

For MS, they probably searched for the sweet spot (which seems to be 1.8-1.9GHz, per AMD's game clocks for RDNA2), because they also aimed at 200W of total system power but have a considerably larger chip. If they went variable at 2.0-2.1GHz, they would have to provision more like 240-250W of sustained power for the SoC, and that would be way, way overboard (it would practically require a redesign of the entire cooling and power supply solution).

Basically, it makes perfect sense why Sony went with variable frequency on a 36CU chip and why MS shot for peak perf/watt with a 52CU chip. Both had the same power consumption target, but whereas MS's big chip was already at the limit at its best perf/watt clocks, Sony's narrow chip would have left a lot on the table if they didn't boost clocks to the maximum.
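Rough numbers for that perf/watt argument, using the usual dynamic-power rule of thumb P ∝ (CU count) × f × V², with a made-up but plausibly shaped voltage/frequency curve. None of these figures are real console measurements; voltageAt and the constants in it are assumptions:

```cpp
// Perf/watt illustration: dynamic power scales ~ (CU count) * f * V^2, and
// voltage must rise with frequency. All numbers are invented for illustration.
#include <cstdio>

// Assumed voltage/frequency curve (made up, but monotonic like real silicon).
double voltageAt(double ghz) {
    return 0.7 + 0.25 * (ghz - 1.0); // e.g. 1.8 GHz -> 0.90 V, 2.23 GHz -> ~1.01 V
}

double relativePower(int cus, double ghz) {
    double v = voltageAt(ghz);
    return cus * ghz * v * v; // arbitrary units; only the ratios are meaningful
}

int main() {
    double narrowFast = relativePower(36, 2.23);  // PS5-like: narrow, high clock
    double wideSlow   = relativePower(52, 1.825); // XSX-like: wide, sweet-spot clock
    double wideFast   = relativePower(52, 2.1);   // hypothetical: wide AND fast

    std::printf("36 CU @ 2.23 GHz : %.1f (units)\n", narrowFast);
    std::printf("52 CU @ 1.825GHz : %.1f (%.0f%% of the narrow/fast config)\n",
                wideSlow, 100.0 * wideSlow / narrowFast);
    std::printf("52 CU @ 2.1  GHz : %.1f (%.0f%% -- wide+fast blows the budget)\n",
                wideFast, 100.0 * wideFast / narrowFast);
    return 0;
}
```

Under these assumptions the wide chip at sweet-spot clocks lands in the same ballpark as the narrow fast chip, while a wide and fast chip overshoots by roughly a quarter, which is the redesign-everything scenario described above.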
 

Except we don't have a title that really maxes out the Xbox's feature set, so the power draw could get even higher. With the PS5 it should be a bit easier to reach the system's maximum power draw, because we know the PS5 clocks up when the power window allows it and down when it doesn't (see the sketch below).
It is really a shame that MS didn't have a Turn 10 title or something similar at launch that really stresses the GPU and CPU with all the new features. Gears is more or less just using the GCN feature set and raw power.
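For the clock-up/clock-down behaviour, loosely as Cerny described it (deterministic, workload-based, power-capped), here is a toy governor sketch. The power model, the cap, and every constant are invented for illustration; this is not Sony's actual algorithm:

```cpp
// Toy power-capped variable clocking: each "frame" the governor picks the
// highest clock whose modeled power stays under the cap. Illustrative only.
#include <cstdio>
#include <initializer_list>

// Assumed activity-based power model: heavier workloads draw more per GHz
// (made-up V/f curve and scaling, arbitrary units).
double modeledPower(double ghz, double activity /* 0..1 */) {
    double v = 0.7 + 0.25 * (ghz - 1.0);
    return 36 * ghz * v * v * (0.5 + activity);
}

double pickClock(double activity, double powerCap) {
    double best = 1.5; // assumed floor clock
    for (double f = 1.5; f <= 2.23; f += 0.01)
        if (modeledPower(f, activity) <= powerCap) best = f;
    return best;
}

int main() {
    const double cap = 95.0; // arbitrary cap in the same made-up units
    for (double activity : {0.2, 0.5, 0.8, 1.0})
        std::printf("activity %.1f -> %.2f GHz\n",
                    activity, pickClock(activity, cap));
    return 0;
}
```

Light workloads run at the top clock, heavy ones get throttled a notch, and total draw never exceeds the cap, which is why measured wattage hovers near the same ceiling across very different games.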
 
I am pretty sure ~200W is close to the maximum they aimed for. Gears 5 at 60fps in 4K will definitely produce a heavy workload, as the new Ampere cards show.

Also, in the DF video you can see DMC Special Edition actually maxing out at 202W, the same as Spiderman MM, which everyone would assume pushes the console much harder. Even Astrobot can reach 196W in that video.
 
The GI in WDL is very much producing impressive results. Its quality is why people on PC have been able to make shots like these: [...]

Thanks for those. I was completely unimpressed with WDL until I saw that. The lighting there is far more impressive than the reflections in either this game or Spiderman MM, IMO. Reflections in games still have a way to go before they stop looking wrong. OTOH, lighting is now getting to the point where it's harder to notice when it isn't quite right.

Hopefully we may soon see the demise of AO (crossing fingers), as that's one of the things that makes lighting in games look so wrong to me much of the time. This assumes that developers start to focus more on indirect lighting and shadows.

Speaking of shadows, they are also finally getting to the point where I don't immediately turn them off because they don't look right. Unfortunately, it looks like reflections may now be the thing I turn off immediately in many games. :p

Regards,
SB
 
I fully expect that VRR and 8K are just a firmware update away ... hard to imagine anything else, really, as it even says 8K on the box. But you never know, of course.

I also don't quite follow the logic of that RDNA2 comment, but we'll see. It's easy to make that legally correct ... if one customization by Sony deviates from the RDNA2 spec, like say a custom geometry engine, then it's legally true even if it means nothing in terms of performance or feature set? Or am I missing something?

It will be interesting to see some real head-to-heads show up at some point, though!
 

Same with Sony adding ALLM (auto low latency mode). Though I don't know if they plan to support a 1440p output mode.
 
Yeah, I mean, my Samsung goes into game mode automatically with the PS4 Pro, and so did my Sony TV before that, I think, so you'd assume something like that should be easy enough with the PS5.

The resolution I don't know about either. They have no precedent there, I think.
 
I also don't quite follow the logic of that RDNA2 comment, but we'll see. It's easy to make that legally correct ... if one customization by Sony deviates from the RDNA2 spec, like say a custom geometry engine, then it's legally true even if it means nothing in terms of performance or feature set? Or am I missing something?
A spec being customised to the point that it is considered not part of the spec?

I think it's more likely that AMD used MS-patented VRS for Tier 2, and that Sony is not getting Tier 2 support (a rough sketch of the tier split as exposed by D3D12 follows below).

That doesn't mean they couldn't have their own way of doing something similar, or that they simply felt a software approach was good enough.
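For reference, here is roughly what the Tier 1 vs Tier 2 split looks like through D3D12's public API. A sketch, assuming an already-created device, command list, and (for Tier 2) a shading-rate image; error handling omitted:

```cpp
// Sketch of the D3D12 VRS tier split. Tier 1: one coarse rate per draw.
// Tier 2: a screen-space image additionally controls the rate per tile.
#include <d3d12.h>

void applyShadingRate(ID3D12Device* device,
                      ID3D12GraphicsCommandList5* cmdList,
                      ID3D12Resource* shadingRateImage /* Tier 2 only */) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {};
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                &opts6, sizeof(opts6));

    if (opts6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_2) {
        // Tier 2: combine the per-draw rate with the shading-rate image
        // (here: take the coarser of the two).
        D3D12_SHADING_RATE_COMBINER combiners[2] = {
            D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // per-draw vs per-primitive
            D3D12_SHADING_RATE_COMBINER_MAX          // vs the shading-rate image
        };
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners);
        cmdList->RSSetShadingRateImage(shadingRateImage);
    } else if (opts6.VariableShadingRateTier == D3D12_VARIABLE_SHADING_RATE_TIER_1) {
        // Tier 1: only one coarse rate for the whole draw, no image.
        cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    }
    // Otherwise: no hardware VRS; a "software" equivalent would have to vary
    // shading cost inside the shaders themselves.
}
```

Tier 2's screen-space shading-rate image is the part the post above speculates could be patent-encumbered; a software equivalent would branch inside the shaders instead of using the rasterizer.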

Hopefully they have a bit more actual insight into what the differences are. Maybe they have dev docs or have spoken to devs; otherwise I'm not sure we're going to get anything more than what we already know, i.e. nothing.

Regarding ALLM, VRR and resolution support, I think it's more that it's a surprise they're not there from the start than that they won't get support at all. They may even arrive in a day 1 patch.
 
I see my Q60 got an update yesterday, firmware 1371. It installed after I booted up the TV. I wonder what has changed.

I see no information on the official webpage; it doesn't even list 1371.

https://www.samsung.com/za/support/model/QA65Q60RAKXXA/#downloads
 