Digital Foundry Article Technical Discussion [2024]

Also the use of those reflections that completely disappear once you move the camera down... Ugh. Please just don't use those! It's terrible.
If you mean SSR, those are about as bad as it gets. Very ugly effect. Some time ago I played the Xbox 360 version of Half-Life 2 on XSX. That game has amazing reflections in water. Just amazing. It's very sad that so many new games use SSR. You play a game with tons of polygons, advanced lighting, very realistic materials, moving grass, great particle effects... and SSR on water. All the graphics break when you move the camera up and down.
 
They went on record that they skipped amplification shaders, and that they intend to use them in their next major release.

It would fall under sparse textures for OGL. Tiled resources under DX.

Right. So if you don’t have the hardware for VRS, you have to implement it in compute shaders for it to run at all. There's a chance here to get better, or at least equivalent, performance with more control.

Trying to emulate VRS on the 3D pipeline is unlikely to be as performant as VRS hardware running on the 3D pipeline.

If they made VRS on compute, they likely would be running that version for all deployments.
This has been discussed to death here already, but:

"“Software-Based Variable Rate Shading in Call of Duty” presented at SIGGRAPH 2020 (http://advances.realtimerendering.com/s2020/index.htm) has some interesting thoughts on this topic as well. They present a method leveraging how console hardware handles MSAA to emulate VRS on platforms without hardware VRS support and extra flexibility such as smaller tile sizes. In addition, they present an optimized way to apply VRS to compute shaders that uses ExecuteDispatchIndirect to ensure only waves with actual work are dispatched in contrast to our brute force method. However, Software-Based VRS also has some trade-offs including implementation complexity and the overhead of a de-blocking pass. One possibility is to use a hybrid of both techniques, switching between VRS techniques based on the characteristics of the rendering pass."

So yeah, it seems that software VRS can be as performant as HW VRS, but it doesn't have to be. We don't know. I wouldn't call a software VRS solution superior anyway.
This is from 2020 and a lot could have changed since then; the HW understanding is much better now. But if they managed to develop their own software VRS solution that is as good as HW or better, then I agree all versions would have it.
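For anyone who hasn't read the paper, the core trick behind coarse-rate shading is easy to sketch. Here's an illustrative Python toy (not the Call of Duty implementation, which runs as GPU shaders using the MSAA trick described in the quote): a stand-in "shader" function is invoked once per 2x2 tile instead of once per pixel, and the result is broadcast across the tile, roughly quartering the shading work.

```python
# Toy model of 2x2 coarse-rate shading. "shade" stands in for an expensive
# per-pixel shader; the counter tracks how many times it actually runs.

W, H = 8, 8
invocations = 0

def shade(x, y):
    global invocations
    invocations += 1
    return (x * 31 + y * 17) % 256  # arbitrary placeholder result

# Full rate: one invocation per pixel.
full = [[shade(x, y) for x in range(W)] for y in range(H)]
full_cost, invocations = invocations, 0

# 2x2 coarse rate: shade each tile's top-left pixel, broadcast to the tile.
coarse = [[0] * W for _ in range(H)]
for ty in range(0, H, 2):
    for tx in range(0, W, 2):
        c = shade(tx, ty)
        for dy in range(2):
            for dx in range(2):
                coarse[ty + dy][tx + dx] = c
coarse_cost = invocations

print(full_cost, coarse_cost)  # 64 vs 16 shader invocations
```

The 4x invocation saving is the upside; the visible 2x2 blocks in `coarse` are exactly the artefact the paper's de-blocking pass exists to hide.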
 
Putting it another way, you could claim PS3 and PS4 were a huge regression on PS2 because they lacked its particle drawing power. They sacrificed effects for other rendering strengths.
This is very true, man! I remember the WRC games on PS2: when you drove on sand or gravel there was a dust trail behind the car for some 50 m, and it looked especially great in replays. Then came the PS3/360-gen Dirt games, with dust trails of maybe 10 m.

BTW, the PS5 can produce great RT reflections. Look at Ratchet & Clank and Spider-Man 2. Reflections in those games are top tier.
 
So there are separate discussions happening here wrt Space Marine: firstly, is there even VRS? Secondly, is it hardware or software VRS?

I should be clear, I’m looking for reasons why Series consoles are behaving out of the normal band that we expect. We haven’t confirmed anything yet, except I’m just looking at an older developer interview that indicates that they have full intention of putting these features into their engine.

Technically, you are correct. But then I wouldn’t call VRS a new feature, since you can backport it as far back as DX9? Whereas they were explicitly talking about adding the new features, which would be hardware VRS.

I really don’t have a hard stance here on the performance aspects; we’ve long had a great back-and-forth in b3d on whether custom software solutions are more performant than the equivalent hardware ones (with good backing info that they can be). Performance isn’t in question for me, given what has been said. However, there isn’t a strong reason for the Series consoles to outperform their line of best fit here, given the history of releases. More bandwidth and more compute wouldn’t make up a 20% frame rate difference, so I’m looking for evidence of something else. PS5 and PC appear to be in line, so Xbox is the outlier here. But before we start talking about the performance of such features, we need to verify whether they’re even present in Space Marine.

That means looking through PC and XSX footage for those shading differences, and seeing if there are options to disable software VRS, and so forth.
 
More, about 6 or 7 currently. Call of Duty Modern Warfare 3, Call of Duty Warzone, Cyberpunk 2077, Alan Wake 2, Portal RTX, The First Descendant and Star Wars Outlaws.

Soon there will be: Indiana Jones, Half-Life 2 RTX, Black Myth: Wukong and Naraka: Bladepoint.


It's getting there, but slowly. So far, we have Star Wars Outlaws, Avatar Frontiers of Pandora, Forza Horizon 5 and Returnal.
From memory, they were using some kind of ray-traced audio in Killzone Shadow Fall. I may be wrong.
Well, the gist of that Call of Duty SIGGRAPH paper, when I read it, was that their custom solution was better because of its very effective de-blocking pass, which made the result nearly indistinguishable from no VRS at all. For me, the blocking artefacts are way too visible with hardware VRS. Textures are basically destroyed and replaced by cheaply compressed ones: PNG vs. low-quality JPEG. The difference is striking in every game comparison I have seen and done. Unsurprisingly, proper comparisons are not done by the supplier of RDNA2 VRS or by Microsoft the way they were done in that SIGGRAPH paper; the rare ones I have seen are carefully selected, with dark images and no zooming, rather than proper zoomed-in comparisons of textures. You may gain 5% of performance, but what's the point with destroyed assets? What's the point of paying artists to make those textures in the first place?
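The de-blocking idea itself is simple to illustrate. A hedged 1D toy in Python (not the actual shader pass from the paper): assume a row shaded at half rate, so values come in duplicated pairs, and blend the pixels on either side of each block boundary toward the boundary average to soften the step.

```python
# Minimal 1D "de-blocking" filter over a half-rate-shaded row.
# Each value appears in pairs (block size 2); pixels adjacent to a block
# boundary are pulled halfway toward the boundary average.

def deblock(row, block=2):
    out = list(row)
    for i in range(block - 1, len(row) - 1, block):
        # i is the last pixel of a block; i + 1 starts the next block
        edge = (row[i] + row[i + 1]) / 2
        out[i] = (row[i] + edge) / 2
        out[i + 1] = (row[i + 1] + edge) / 2
    return out

coarse_row = [10, 10, 50, 50, 30, 30]   # half-rate shading: duplicated pairs
print(deblock(coarse_row))               # [10, 20.0, 40.0, 45.0, 35.0, 30]
```

The hard step of 10 → 50 at the first boundary becomes 20 → 40: the block edge is still there, but at half the contrast, which is the whole point of the pass.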
 
I'd wait on making any big speculation about Space Marine 2. Games generally get patched nowadays. Wouldn't be shocked to see a patch to bring PS5 in line.
I agree, but it's usually not PS5 that needs to get patched into line, lol. Xbox is usually the non-lead console, so it's often Xbox being patched into line.
 
0:00:00 Introduction
0:00:41 News 1: Sony leaks PS5 Pro design
0:13:42 News 2: Concord pulled from sale, servers close
0:37:11 News 3: Astro Bot review reaction!
0:53:23 News 4: RTX 5000 series performance and power rumoured
1:05:51 News 5: Bloodborne PC progresses
1:14:53 News 6: John’s Windows handheld impressions!
1:27:04 Supporter Q1: What should we make of the closure of AnandTech?
1:39:09 Supporter Q2: Where will Microsoft be in 15 years?
1:46:14 Supporter Q3: Could an RTX 6090 equipped Xbox explain Microsoft’s largest generational leap claims?
1:52:44 Supporter Q4: Could Ridge Racer make a return in the near future?
1:57:58 Supporter Q5: If Alan Wake 2 isn’t profitable, what hope do other original single player games have?
2:03:28 Supporter Q6: Will we really see ray tracing on Switch 2?

 
I should be clear, I’m looking for reasons why Series consoles are behaving out of the normal band that we expect.
Is it out of line with what we expected? PlayStation 5 and Xbox have some advantages over each other, but not more than about 20%, which falls in line with the performance delta from what I've seen. If it were over 20%, that's when I'd think something was really up.
 
Though, 20% more compute has never led to 20% more framerate. The extra compute can only relieve a single bottleneck, and the same goes for bandwidth. 20% more compute and bandwidth can buy 20% more resolution, but a higher frame rate means the whole GPU is being hit 20% harder. And that’s a big deal because, outside of compute, bandwidth and feature set, PS5 has the rest of the hardware pipeline beat!

Furthermore, ~33% more framerate (45 vs. 60) also means the CPU is processing more frames, which eats up even more of that shared bandwidth.

I’m leaning towards the XSX doing less work somehow.
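As a sanity check on the numbers: 45 vs. 60 fps is actually a ~33% difference in frames per second, which is the figure a compute/bandwidth advantage would have to cover. A quick calculation:

```python
# Frame-rate delta and per-frame time budgets for the 45 vs 60 fps case.

ps5_fps, xsx_fps = 45, 60
frame_delta = xsx_fps / ps5_fps - 1
print(f"{frame_delta:.0%}")  # ~33% more frames per second

budget_ps5 = 1000 / ps5_fps  # ms available per frame at 45 fps
budget_xsx = 1000 / xsx_fps  # ms available per frame at 60 fps
print(f"{budget_ps5:.1f} ms vs {budget_xsx:.1f} ms")  # 22.2 ms vs 16.7 ms

# A ~20% compute/bandwidth advantage alone cannot buy a 33% frame-rate
# lead if every stage (CPU, fixed-function, caches) must also go faster.
```

In other words, every part of the frame, not just shading, has to fit in 16.7 ms instead of 22.2 ms, which is why a raw compute delta rarely translates 1:1 into framerate.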
 

@Dictator Curious, but do the differences in IQ only come down to sharpening on PS5 vs PC? They both run at 1080p but look quite different. Even the balanced mode on PS5 doesn’t look the same. Is it caused only by a sharpening filter or is there something else at play? I saw YouTube comments stating that the PS5’s image in its performance profile looks even better than in its balanced profile, which doesn’t compute.
 
Yeah, the performance mode has the sharpening turned up to 11, while the balanced mode looks a lot better with lower levels of sharpening.

The performance mode in this game is so bad 😅
 
The difference in performance seems to be CPU heavy sections, no? So the TF difference shouldn't even matter here.

Curious if Xbox offering a 'no hyperthreading' mode makes a difference here, maybe? We know some games benefit from this to varying degrees, and Xbox also gets a 300 MHz boost over PS5's CPU in this mode. 15-20% is a lot to explain, but there's also not a lot that could otherwise explain it...
 
Yes and agreed, small caveat here but we cannot say for sure because frame rate is capped at 60.

During cutscenes that difference is there as well. Not sure what is causing it.

DF will have to speak with them to figure it out.
 
Series X's fastest memory pool has 25% more bandwidth than PS5's, the TF advantage can't really hurt, and the CPU is 100-300 MHz faster than the maximum of PS5's variable clock; plus PS5 has cut-back FPUs. Further, on my PC, my CPU runs roughly 5-10°C hotter in this game than in most others, leading me to believe it draws more power, perhaps causing PS5's power cap to lower the clocks of either the CPU or the GPU if it's at its limit. I'm not the only person to notice the increased CPU temperature, but I haven't watched the frequency closely enough to see whether there's any difference.
15-20% is the advantage you'd expect of any platform with a 20-25% advantage in the areas that are actually bottlenecking performance. I know we don't have 100% accurate performance metrics, but didn't Cerny mention in the Road to PS5 that AVX workloads consume a lot of power? And we know the FPUs on PS5's CPU are smaller. On the CPU side, Series X has an FPU advantage (not sure how much), a clock speed advantage (100-300 MHz over PS5's max, and we don't know if PS5 is hitting that in this game), and a memory bandwidth advantage (+25%). This game is largely CPU-limited by all accounts, and concentrating on just those numbers shows the deficit PS5 could have, which falls in line with the deficit it does have. Remember that PS5 isn't really running 20% slower across the board; it's just about 20% lower when it's stressed.

Also, on my PC with a Ryzen 5700X3D and an RTX 2070 Super, my GPU sits at 99% utilization playing this game at 1440p dynamic res @ 60 fps. I don't think it ever hits full resolution, and benchmarks show a fair amount of scaling above my GPU, which I'd put in the range of the PS5/Series consoles most of the time. I've seen benchmarks of a 4090 getting double the performance of a 3080 at 1080p, so it isn't like the game has no GPU scaling at all.
 