Digital Foundry Article Technical Discussion [2023]

It's not impossible that a slightly higher FSR base resolution than 1440p could be used, something like 1600p or so. That usually does make a difference with FSR.

Yeah, that would have been a nice bump to IQ. I don't get the impression that would change Dudditz's opinion though, as his complaint seems to be that the game looks very similar on both aside from resolution.
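
On the 1440p vs ~1600p point, some rough pixel math (a quick sketch; the exact 16:9 widths here are my assumption, not figures from the article):

# Rough pixel-count comparison for candidate FSR base resolutions (16:9 assumed).
def pixels(width, height):
    return width * height

base_1440p = pixels(2560, 1440)   # common 1440p base
base_1600p = pixels(2844, 1600)   # hypothetical ~1600p base at 16:9
output_4k = pixels(3840, 2160)

print(f"1600p has {base_1600p / base_1440p:.2f}x the pixels of 1440p")
print(f"area scale factor to 4K from 1440p: {output_4k / base_1440p:.2f}x")
print(f"area scale factor to 4K from 1600p: {output_4k / base_1600p:.2f}x")

That ~23% extra pixel count is the kind of bump that tends to be visible when upscaling to 4K.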
 
Starfield seems very compute bound in its current state. Here's a 720p capture from my 2080 Ti.

[image: 720p GPU capture]



Framerate then scales almost linearly with GPU clock rate even at this low resolution. I don't have an AMD card to test with but I've seen similar behavior in some other DX12 games on this card.

[images: framerate at different GPU clock rates]
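
A rough way to quantify that claim (a sketch, not the poster's method; it assumes you can fix the core clock, e.g. with nvidia-smi's lock-gpu-clocks option, and read an average framerate from a capture tool):

# Check how much of an observed fps change is explained by a clock change.
def clock_scaling_ratio(fps_low, fps_high, clock_low_mhz, clock_high_mhz):
    """Returns ~1.0 when framerate scales linearly with GPU core clock."""
    return (fps_high / fps_low) / (clock_high_mhz / clock_low_mhz)

# Hypothetical numbers, not readings from the captures above:
print(clock_scaling_ratio(fps_low=30, fps_high=42, clock_low_mhz=1350, clock_high_mhz=1890))  # ~1.0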
 
I disagree. The game varies from mediocre to good looking, but the variance between Series S and X has to be the smallest I've seen from any game released this gen, whether cross-gen or current-gen only. Also, the term "higher" is subjective; I expected a difference akin to PC min vs PC Ultra between S and X. I'm just not seeing that here.
The Series S was always pitched as a product that would deliver the same (or "similar") graphics at a lower resolution. That's what we have consistently seen from first party titles.

I find it funny how MS delivering on their promises is suddenly a bad thing. How dare Bethesda find smart optimisations that allow Series S owners to have a good looking experience!
 
It's crazy how after the DF video we all thought and expected the game to be CPU limited, only for it to turn out it's actually GPU limited.
 
Yeah, that would have been a nice bump to IQ. I don't get the impression that would change Dudditz's opinion though, as his complaint seems to be that the game looks very similar on both aside from resolution.
The Series S was always pitched as a product that would deliver the same (or "similar") graphics at a lower resolution. That's what we have consistently seen from first party titles.

I find it funny how MS delivering on their promises is suddenly a bad thing. How dare Bethesda find smart optimisations that allow Series S owners to have a good looking experience!
Well, from the discussion here, I raised that the XBSS was presented as just a lower-resolution XBSX. However, replies point out that RAM and bandwidth requirements don't necessarily scale linearly with resolution (rough numbers sketched after this post). As a result, titles that make full use of the XBSX's RAM and bandwidth should be able to scale up more than just resolution relative to the XBSS.

I think that's an interesting posit, and a general look at the state of titles on the two platforms should be an indicator. DF should have a good feel for this. My poorly informed understanding at the moment is that XBSS tends to perform proportionally worse than its hypothetical performance delta, but I don't know how that is distributed across framerate and content/features. If devs prioritise framerate, do games scale more linearly? Is the difference in content/features that Dudditz is expecting more associated with lower-framerate (edit: plus less stable) titles budgeting performance differently?
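
To put rough numbers on the RAM point above (a back-of-the-envelope sketch; the per-pixel byte counts are my assumption for a generic deferred renderer, not Starfield's actual layout):

# Very rough render-target memory at two resolutions (assumed formats, not measured).
BYTES_PER_PIXEL = 4 * 4 + 8   # e.g. four RGBA8 G-buffer targets plus a depth/stencil target
def render_target_mb(width, height):
    return width * height * BYTES_PER_PIXEL / (1024 ** 2)

print(f"1440p render targets: ~{render_target_mb(2560, 1440):.0f} MB")
print(f"2160p render targets: ~{render_target_mb(3840, 2160):.0f} MB")
# The delta (~105 MB here) is tiny next to the several extra GB of game-usable RAM on
# Series X, which is why assets, LODs and caches can scale beyond resolution alone.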
 
Starfield seems very compute bound in its current state. Here's a 720p capture from my 2080 Ti.

[image: 720p GPU capture]



Framerate then scales almost linearly with GPU clock rate even at this low resolution. I don't have an AMD card to test with but I've seen similar behavior in some other DX12 games on this card.

[images: framerate at different GPU clock rates]

It’s likely scaling with clock speed because 720p isn’t providing enough work to fill the GPU. Based on the trace, GPU utilization is very low across the board. I’m not seeing the usual signs of a compute bottleneck. Occupancy, L1 throughput and SM throughput are all extremely low.
 
It’s likely scaling with clock speed because 720p isn’t providing enough work to fill the GPU. Based on the trace, GPU utilization is very low across the board. I’m not seeing the usual signs of a compute bottleneck. Occupancy, L1 throughput and SM throughput are all extremely low.

In the screens the GPU is at 99% utilization, but I don’t know how that metric is calculated. I’m guessing it could read 99% on a long-running shader that utilizes the GPU poorly, as long as the CPU was ready for the next frame before the shader finished?
 
In the screens the GPU is at 99% utilization, but I don’t know how that metric is calculated. I’m guessing it could read 99% on a long-running shader that utilizes the GPU poorly, as long as the CPU was ready for the next frame before the shader finished?

Yeah, the GPU utilization stat in performance overlays is likely based on the “GPU active” status, which means the GPU is working on “something”, not that the entire GPU is being used. It’s only useful for spotting CPU bottlenecks; it doesn’t really tell you anything about how well the game actually uses the GPU.
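
That matches how the counter is exposed at the driver level. A minimal sketch of reading it through NVML (assuming an NVIDIA card and the pynvml package; this is the same coarse "busy" percentage overlays tend to show, not a measure of occupancy or throughput):

# Query the coarse "GPU busy" percentage that overlays typically report.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
util = pynvml.nvmlDeviceGetUtilizationRates(handle)
# util.gpu: % of the sample window in which at least one kernel/draw was executing.
# It says nothing about occupancy or SM/L1 throughput, so 99% here can coexist
# with a mostly idle chip internally.
print(f"GPU busy: {util.gpu}%  memory controller busy: {util.memory}%")
pynvml.nvmlShutdown()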
 
It’s likely scaling with clock speed because 720p isn’t providing enough work to fill the GPU. Based on the trace, GPU utilization is very low across the board. I’m not seeing the usual signs of a compute bottleneck. Occupancy, L1 throughput and SM throughput are all extremely low.
Consider the 'Compute in flight' metric, which is quite full and means it's waiting for compute dispatches to finish. Here's 3440x1440 for reference.

Both resolutions scale almost linearly with GPU frequency, though at a lower ~34fps at 3440x1440 vs ~42fps at 720p. This would be atypical behavior if pixel shading or raster were the primary bottleneck (especially at 720p), as those are more parallelizable than compute. Running at 720p would traditionally show the GPU idling and reducing its clock rate.
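
Working backwards from those averages with a two-point fixed-plus-per-pixel cost model (rough arithmetic on the quoted numbers, and obviously a simplification of real GPU behavior):

# Fit frame_time = fixed_ms + per_pixel_ms * pixels to the two quoted data points.
px_720p = 1280 * 720            # ~0.92 MP
px_uw1440 = 3440 * 1440         # ~4.95 MP
t_720p = 1000 / 42              # ~23.8 ms
t_uw1440 = 1000 / 34            # ~29.4 ms

per_pixel_ms = (t_uw1440 - t_720p) / (px_uw1440 - px_720p)
fixed_ms = t_720p - per_pixel_ms * px_720p

print(f"fixed cost ~{fixed_ms:.1f} ms, resolution-dependent cost at 720p ~{per_pixel_ms * px_720p:.1f} ms")
# ~5.4x the pixels only costs ~5.6 ms more per frame, so most of the frame time is not
# tied to resolution, consistent with it tracking clock speed rather than pixel count.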
 
But there isn't the power to do a >2.5x bump in base resolution, plus shadow, cubemap and grass bumps etc., and also go from PC-equivalent min to PC-equivalent max settings. I don't think any game has pulled this off.

What are you thinking they should have done instead?


Yeah, PC min vs max was a very poor description on my part. In my mind I'm thinking about the lowest vs highest PC card within a particular generation (7600 vs 7900), or better yet the console deltas we've seen this generation between Series S and PS5/XSX. The Forza Horizon 5 quality mode comparison is a great example where Series X takes advantage of its much larger RAM and GPU with higher-quality textures, parallax occlusion, etc. MS Flight Sim had clear LOD differences in addition to the resolution gap. These differences didn't stop Series S from having a great experience, but the difference was still much more obvious.



Also, it seems to me 3rd party is more willing to widen the gap than first party. Metro Exodus is a great example of this. You see a wider image resolution difference, much more aggressive pop-in and less asset density on Series S, but it's a good compromise for maintaining full RT and 60fps on the smaller console. Still an overall great experience on S because the focus is RT.


 
The Series S was always pitched as a product that would deliver the same (or "similar") graphics at a lower resolution. That's what we have consistently seen from first party titles.

I find it funny how MS delivering on their promises is suddenly a bad thing. How dare Bethesda find smart optimisations that allow Series S owners to have a good looking experience!

See my comment above. I didn't say Series S can't have a great experience. We've seen developers provide good experience on Series S while also pushing Series X further so both win.
 
Also, it seems to me 3rd party is more willing to widen the gap than first party. Metro Exodus is a great example of this. You see a wider image resolution difference, much more aggressive pop-in and less asset density on Series S, but it's a good compromise for maintaining full RT and 60fps on the smaller console. Still an overall great experience on S because the focus is RT.
I think these are very different phenomena. Forza and Flight Simulator are more graphically impressive games than Starfield, and they have more eye-catching high-end features to turn on rather than just dials to turn up. Metro scales fairly poorly on Series S -- stuff like totally killing the draw distance on grass and details is an uncomfortable sacrifice. It's an older game than the other two, with very expensive features stapled on top. I expect 4A's next game will look better on Series S, as they shift to more modern techniques.
 
Bethesda RPGs aren’t known for cutting edge tech so no surprise there.
Definitely, but I was hoping for something like fast loading or use of DirectStorage. We already knew there would be no ray tracing or other modern rendering features.

The budget is reportedly astronomical, so I'm not sure what the reason is there. Bethesda should be held to a higher standard than "Well, it's Bethesda, so whatever." They're a flagship studio from a tech giant.
 