Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
People calling this a 'disaster' just keep pushing the hyperbolic, cynical nature of PC gamers these days.
Imo, it's been 6 years of development since FM7, and nearly 14 years when you factor in that they've known about more cores and more CPU threads since the Xbox One.

The next Xbox will have even more CPU cores (12-16 likely), but it's never going to get above 60fps if your code won't scale above 6.

I would be surprised if it was never brought up as a potential bottleneck for performance. The decision-making is baffling here.
 
I would be surprised if it was never brought up as a potential bottleneck for performance. The decision-making is baffling here.
Not all workloads are equally amenable to multi-threading; it doesn't surprise me that a game CPU-bound across ~14 cars with detailed physics simulations scales worse than an RTS or something. I'm sure they could do better, but your "disaster" is way out of whack here; we'd need to see the game in a profiler to draw any real conclusions about whether it's "baffling".
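To put a rough number on that scaling intuition, here's a minimal Amdahl's-law sketch (purely illustrative, not based on anything in Forza's actual code): whatever fraction of the frame is serial caps the benefit of extra cores, regardless of how many the next Xbox ships with.

```python
# Illustrative only: Amdahl's law for a frame with a serial fraction s.
# With s of the work serial, n cores give a speedup of 1 / (s + (1 - s) / n).
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even a modest 30% serial portion flattens the curve quickly:
for n in (6, 12, 16):
    print(f"{n:2d} cores -> {amdahl_speedup(0.3, n):.2f}x")
# 6 cores -> 2.40x, 12 cores -> 2.79x, 16 cores -> 2.91x
```

Going from 6 to 16 cores buys barely 20% extra here, which is the shape of the problem being described, and why a profiler showing where that serial fraction lives would matter.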
 
Do not forget: the XSX is using reflection and cubemap settings below what you can set on PC. But yes, I think GPU scaling is probably less than favourable... but the biggest issue is the CPU.

Also worth noting that Oliver's count was a rough average, and the benchmark you're using is more of a worst-case scenario that would likely see resolution drops below the average on XSX. Nevertheless, I doubt the XSX is dropping anywhere near 830p in similar situations.

DLSS does add overhead though, so perhaps from a performance perspective it's more like comparing to 900p-1080p.

Ideally we'd get a pixel count of the PC benchmark scenario running on the XSX, then compare that to the same resolution on the PC with TAA. It wouldn't be perfect, but it should give a decent indication.
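For reference on the resolution comparison above: DLSS renders internally at a fixed fraction of the output resolution per axis before upscaling (roughly 2/3 for Quality, 0.58 for Balanced, 0.5 for Performance), so the internal heights work out as:

```python
# Approximate per-axis DLSS scale factors for each quality mode.
DLSS_SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_height(output_height: int, mode: str) -> int:
    """Internal render height before DLSS upscales to the output height."""
    return round(output_height * DLSS_SCALES[mode])

print(internal_height(1440, "Balanced"))     # ~835p internal at 1440p output
print(internal_height(1440, "Performance"))  # 720p internal at 1440p output
```

That ~835p internal figure for Balanced at 1440p is why it sits so close to the ~830p console number quoted in the thread, before accounting for DLSS's own upscaling cost.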
 
I haven't watched Alex's video yet, but has anyone tried deleting dstorage.dll and dstoragecore.dll to see if there's any difference?
 
Not all workloads are equally amenable to multi-threading; it doesn't surprise me that a game CPU-bound across ~14 cars with detailed physics simulations scales worse than an RTS or something. I'm sure they could do better, but your "disaster" is way out of whack here; we'd need to see the game in a profiler to draw any real conclusions about whether it's "baffling".
I don't think a profiler will help us learn anything here. The game doesn't scale above 6 cores; we will just see a graph where 6 cores are loaded and everything else is empty.

The graphical settings also have a heavy performance hit as Alex moves from low to ultra. For that, a profiler would help us learn some things.

As for "baffling" and "disaster": FM7 and Turn 10 in general have always been well tuned and up to date with the latest technology. After 6 years of development, I am disappointed that some core pillars of technology aren't present. I don't mind if it's not the latest and greatest RT etc., but I was expecting more along the lines of: this is how we handle millions of draw calls and draw a complex scene without crapping out; we developed a system that can handle 24 AI cars on the track and spread it over unlimited cores to take advantage of future hardware. Because that would set them up for the next title. Do the hard plumbing work now, and they can dial up the graphics for the next title.

I expect more from Turn 10, because they are iterating on an already strong base of technology. 6 years is a lot of time.
 
Also worth noting that Oliver's count was a rough average, and the benchmark you're using is more of a worst-case scenario that would likely see resolution drops below the average on XSX. Nevertheless, I doubt the XSX is dropping anywhere near 830p in similar situations.

DLSS does add overhead though, so perhaps from a performance perspective it's more like comparing to 900p-1080p.

Good points. DLSS overhead can be somewhat high; my suspicion was that 1440p Balanced was probably very close to 1080p native performance, so I tried a quick test in Alan Wake Remastered:

1440p, DLSS Balanced: 160fps

1080p native: 158fps
 
Good points. DLSS overhead can be somewhat high; my suspicion was that 1440p Balanced was probably very close to 1080p native performance, so I tried a quick test in Alan Wake Remastered:

1440p, DLSS Balanced: 160fps

1080p native: 158fps

Nice! And your GPU is very comparable to the 2070S too, so that's a good comparison. The question then would be: will Forza drop as low at 1080p in Performance RT mode in that scenario? Possible, I guess, but an assumption too far at this stage.

Obviously we would expect the Series X to be faster than a 2070S anyway, but not by too much. And although RT is a factor, I suspect that in Performance RT mode on the Series X it's very light.
 

Pretty disappointing final result after so many years in development. The game doesn't look bad by any means, but I think everybody expected more.
It's mind-boggling that FH5 looks so noticeably better and is a cross-gen title (developed in just 3 years!). Something must have gone Halo Infinite-wrong with this title.
Even the preview version had some hilarious bugs (triangle-shaped crowds, for example); this really doesn't sound like a full 6-7 years were spent developing this title. But hey, by 2023 standards it's f... flawless ;)
 
Yeah, I was wondering why there wasn't any mention of the native resolution the SX was working from if we're trying to go for optimized X settings; I had to go to Oliver's video to see what that was. For a 2070 Super to not maintain 60fps with DLSS Balanced at 1440p and optimized settings is pretty brutal. We're in The Last of Us comparative GPU scaling territory here.
I heard DLSS and FSR are broken in this game.


Native 1440p: 42fps
1440p w/ FSR Performance: 46fps

That's on a 7800X3D.
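A back-of-the-envelope check on those numbers (my own rough model, assuming frame cost scaled purely with internal pixel count): FSR Performance renders at 0.5x per axis, i.e. a quarter of the pixels, so a purely pixel-bound frame would predict far more than 46fps.

```python
# Rough sanity check, not a real profiler: if the 42fps native-1440p frame
# were limited purely by pixel work, quartering the internal pixel count
# (FSR Performance: 0.5x per axis) would predict ~4x the frame rate.
native_fps = 42
pixel_ratio = 0.5 * 0.5  # FSR Performance renders 1/4 of the native pixels
predicted_fps = native_fps / pixel_ratio
print(predicted_fps)  # 168.0, versus the 46fps actually measured
```

The measured 46fps is nowhere near that, which points at a bottleneck other than raw pixel throughput: CPU, fixed per-frame GPU work, or a broken upscaler path, as suggested above.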
 

Pretty disappointing final result after so many years in development. The game doesn't look bad by any means, but I think everybody expected more.
It's mind-boggling that FH5 looks so noticeably better and is a cross-gen title (developed in just 3 years!). Something must have gone Halo Infinite-wrong with this title.
Even the preview version had some hilarious bugs (triangle-shaped crowds, for example); this really doesn't sound like a full 6-7 years were spent developing this title. But hey, by 2023 standards it's f... flawless ;)
From a gameplay perspective, they made a major move compared to the last Forza.

More information is changing the narrative on Forza, so I will hold off until we get more data points. The data suggests it's not looking CPU-limited but more GPU-limited now, which makes me happy.

Curious to see how this changes over the month
 

Their 'first looks' are more comprehensive than most outlets' deep dives. Looking forward to Alex's coverage on image quality. Indeed, what I noticed particularly in that video is that FSR2 is not really playing nicely with DOF: almost like when PC DLSS mods can't separate out the DOF pass, and upscaling gets those firefly/bloom low-res artifacts popping in and out as a result. I wonder if that's just an FSR implementation issue in Immortals.

FSR3 aside, I'm happy that Radeon at least now has what appears to be a functional equivalent of Reflex with Anti-Lag+, with nice latency reductions. FSR3's current lack of improvement with it notwithstanding, it's just good to see a rather important Nvidia feature no longer exclusive; I hope support for it becomes commonplace.
 

Pretty disappointing final result after so many years in development. The game doesn't look bad by any means, but I think everybody expected more.
It's mind-boggling that FH5 looks so noticeably better and is a cross-gen title (developed in just 3 years!). Something must have gone Halo Infinite-wrong with this title.
Even the preview version had some hilarious bugs (triangle-shaped crowds, for example); this really doesn't sound like a full 6-7 years were spent developing this title. But hey, by 2023 standards it's f... flawless ;)

Real-time global illumination and RT are expensive. It's going to be difficult to make something that looks better than static baked lighting if the baking was done well. RT can potentially look as good as good baked lighting in a static scene, but at a significant performance cost. And we all know the consoles aren't capable of that level of global hardware RT illumination with current-gen geometry, hence why they mostly focus on reflections and RT AO. On PC this means a relatively small graphical uplift at a large performance cost compared to the previous game, in order to move from static baked lighting to real-time lighting + RT.

So, for games that are going from good static lighting to real-time lighting (like Halo and Forza Motorsport), the graphical uplift is going to be limited depending on what you expect.

For games that had already taken the real-time lighting hit (for example, a lot of open-world games), improvements gen on gen will be more noticeable.

Regards,
SB
 
Real-time global illumination and RT are expensive. It's going to be difficult to make something that looks better than static baked lighting if the baking was done well. RT can potentially look as good as good baked lighting in a static scene, but at a significant performance cost. And we all know the consoles aren't capable of that level of global hardware RT illumination with current-gen geometry, hence why they mostly focus on reflections and RT AO. On PC this means a relatively small graphical uplift at a large performance cost compared to the previous game, in order to move from static baked lighting to real-time lighting + RT.

So, for games that are going from good static lighting to real-time lighting (like Halo and Forza Motorsport), the graphical uplift is going to be limited depending on what you expect.

For games that had already taken the real-time lighting hit (for example, a lot of open-world games), improvements gen on gen will be more noticeable.

Regards,
SB
Agreed. It's a shame that what released was so downgraded compared to the reveal, but maybe they'll add higher PC options in the future, beyond just RTGI.
 
This is pretty awesome. Sounds like we'll be getting a lot more PC performance comparisons at console-matched settings moving forward, particularly as part of GPU reviews.

Curious. Why run at console settings on a PC? Doesn't that defeat the point of running on PC, namely that you can do so much more with midrange or lower PCs that launched at the same time as the console?
 
Curious. Why run at console settings on a PC? Doesn't that defeat the point of running on PC, namely that you can do so much more with midrange or lower PCs that launched at the same time as the console?

Because it allows us to gauge the true performance of various PC GPUs in relation to the consoles. We all love seeing how various GPUs compare to each other; why wouldn't we also want to know how the two far-and-away biggest-selling GPUs on the market fit into that performance stack?
 
Real-time global illumination and RT are expensive. It's going to be difficult to make something that looks better than static baked lighting if the baking was done well. RT can potentially look as good as good baked lighting in a static scene, but at a significant performance cost. And we all know the consoles aren't capable of that level of global hardware RT illumination with current-gen geometry, hence why they mostly focus on reflections and RT AO. On PC this means a relatively small graphical uplift at a large performance cost compared to the previous game, in order to move from static baked lighting to real-time lighting + RT.

So, for games that are going from good static lighting to real-time lighting (like Halo and Forza Motorsport), the graphical uplift is going to be limited depending on what you expect.

For games that had already taken the real-time lighting hit (for example, a lot of open-world games), improvements gen on gen will be more noticeable.

Regards,
SB

Yup. And to add to that, IMO...

The performance cost of going "real time" in things like lighting, reflections, animation etc. will ultimately be more than worthwhile. The fewer situations in which everything breaks, and the less time and money you have to spend baking everything, the more freedom developers will have to experiment and take risks, and the more freedom and variety gamers are likely to experience in games (time of day, weather, semi-random NPC behaviour, user-creatable levels or objects, outrageous unforgettable unique situations, etc.).

So a 4090 or RX 7900 XTX is having to sweat doing more of these things in real time? Good. That's its job. It's a graphics card. Do some graphics.

And I say this as a current 4790K and RTX 3060 gamer. I'm not afraid of my hardware struggling; I'm afraid of missing out on gaming innovations that come from being able to do more stuff in real time.
 