Digital Foundry Article Technical Discussion [2023]

S.T.A.L.K.E.R. uses the Open Dynamics Engine (ODE) for physics. It's open source, and used in a few other high-profile games like Titan Quest, Call of Juarez and Assetto Corsa. I think World of Goo used it as well. I don't think any game that uses ODE leveraged complex physics as much as the S.T.A.L.K.E.R. games do. I'm surprised more games didn't use it, because it's old enough to have been available during the rise of PhysX and Havok, and it's been shown to be powerful, free, open source, and not tied to a hardware company.
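
For anyone curious what the library actually looks like, here's a minimal sketch of spinning up a world and stepping a single falling body with ODE's stock C API (purely illustrative; not how X-Ray or any of those games integrates it):

```cpp
// Minimal ODE sketch: create a world, drop a sphere, step at 60 Hz.
#include <ode/ode.h>
#include <cstdio>

int main() {
    dInitODE();

    dWorldID world = dWorldCreate();
    dWorldSetGravity(world, 0, 0, -9.81);

    // One rigid body: a sphere starting 10 units up.
    dBodyID ball = dBodyCreate(world);
    dMass m;
    dMassSetSphere(&m, /*density*/ 1.0, /*radius*/ 0.5);
    dBodySetMass(ball, &m);
    dBodySetPosition(ball, 0, 0, 10);

    // Step the simulation and print the height each frame.
    for (int i = 0; i < 120; ++i) {
        dWorldQuickStep(world, 1.0 / 60.0);
        const dReal *pos = dBodyGetPosition(ball);
        std::printf("t=%.2fs z=%.2f\n", (i + 1) / 60.0, (double)pos[2]);
    }

    dWorldDestroy(world);
    dCloseODE();
    return 0;
}
```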
 
That's still all nVidia, though, isn't it? The Tegra might license ARM technology, but AMD licenses x86 from Intel as well. Tegra is an nVidia product, just like the APUs in PS5 and Series are AMD products.
I'm not sure I follow the relevance. AMD license the ISA from Intel (and don't forget Intel license x64 from AMD), but your AMD CPU isn't the same as an Intel CPU. Whereas I believe the Tegra X1 has stock ARM A57 cores. What does it matter that Microsoft and Sony used APUs with custom Zen 2 CPUs and the Switch uses a stock ARM core?
 

I think that's his point. We don't call an AMD SOC an x86 (Intel) + AMD SOC/APU/CPU. So why wouldn't we call an ARM + NV SOC just an NV SOC/APU/CPU?

Calling the Switch ARM + NV is like calling the PS5/XBS consoles Intel + AMD. I guess it gets confusing as ARM is both the name of the licensing company as well as the architecture.

So, just be consistent.

Regards,
SB
 
Oh I see, sure. I mean I wasn't trying to draw some weird distinction, only observing that despite the narrative that everybody went for the same tech choices, Nintendo didn't. It used neither x86/x64 for the CPU nor AMD graphics. But to follow the implied logic, given Nvidia have made efforts for their discrete GPUs to play nice with ARM, a viable option could be a discrete ARM CPU and a discrete Nvidia GPU.

I do not think Microsoft and Sony are necessarily railroaded into AMD APUs. Modern abstraction techniques are crazy good; it's only when you're trying to emulate something utterly alien that it gets complicated. E.g. emulating Cell. Emulating the PowerPC core is easy, but the SPEs and their ring bus? Yeah... no thanks. It's similar to why it remains a challenge to emulate ancient Cray X-MP/Y-MP supercomputers: their fundamental architecture makes them difficult to emulate performantly even on modern hardware many times more powerful. Anyway... sorry... tangent! :runaway:
 
The new trees also use billboards by the look of it, just a shit ton more of them.

So that's disappointing, as I was expecting actual 3D foliage (like in the old SpeedTree demos) after the fuss the developers made of it.
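
(For anyone unfamiliar with the term, here's a rough illustration of what a billboard tree is: a flat quad rotated every frame to face the camera instead of real 3D branch and leaf geometry. Plain vector math, no particular engine assumed; the Y-up convention and function names are mine, not from any Forza or SpeedTree code.)

```cpp
// Cylindrical billboard: the quad spins around its vertical axis to face the camera.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return {v.x / len, v.y / len, v.z / len};
}
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}

// Returns the quad's horizontal "right" axis so the tree always faces the camera
// while its trunk stays upright (Y-up assumed).
static Vec3 billboardRight(Vec3 treePos, Vec3 cameraPos) {
    Vec3 toCamera = sub(cameraPos, treePos);
    toCamera.y = 0.0f;                      // ignore vertical offset: keep trunk upright
    Vec3 forward = normalize(toCamera);
    Vec3 up = {0.0f, 1.0f, 0.0f};
    return normalize(cross(up, forward));   // right = up x forward
}

int main() {
    Vec3 right = billboardRight({10, 0, 5}, {0, 2, 0});
    std::printf("billboard right axis: (%.2f, %.2f, %.2f)\n", right.x, right.y, right.z);
    return 0;
}
```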
 
Yea, I noticed this in the DF video. Never noticed it before then.

This game has honestly disappointed me. It had everything lining up for it to be awesome..
-New generation of consoles
-Ray Tracing
-New/Enhanced engine
-Taking extra time between releases

I expected a new enhanced engine and brand new built-from-the-ground-up car models, lighting, and materials... putting everything else before it to shame... and while yes it does look better than FM7, it should have been better than it is. I don't care if the number of cars drops from 500 to 150... These games can build up car lists over time. It was SUPPOSED to be Forza's time to shine and drop these old car models they've been using for the past 3 Motorsport and Horizon games...
 
I didn't expect much from it.

Forza Horizon 6 however, hoo boy. That's going to be groundbreaking.
 
The new trees also use billboards by the look of it, just a shit ton more of them.

So that's disappointing, as I was expecting actual 3D foliage (like in the old SpeedTree demos) after the fuss the developers made of it.
Maybe a racing game with Nanite will do this.
But at what framerate...
 
Great analysis. Sadly the game is severely CPU limited.

Most of these 'settings confusion' aspects are nothing new. Forza Motorsport 7 had many of these same issues. I'm guessing Alex (and most people) never really spent much, if any, time with it, though. Turn 10 does amazing work optimizing for fixed-spec Xbox hardware, so there's no accusing them of being lazy or incompetent or anything, but I think there are definitely some continuing issues optimizing around DX12 on PC. Even Playground Games seems to have related issues with the PC versions of the Horizon games.

People calling this a 'disaster' just keep feeding the hyperbolic, cynical nature of PC gamers these days, though. Obviously there's a lot that should be better here, but it's not a 'disaster' by any means. The fact that it's so scalable in both GPU and CPU demands makes it more accessible than most recent releases.
 

I agree the game seems far from a 'disaster', but I might quibble on CPU scaling. If there's little to be gained from anything more than a mere 6 cores, then I'd argue it's probably not scaling across CPUs that well. There's also the issue of very poor Radeon performance, so GPU scaling seems tied to your vendor at the moment too.
 
GPU performance looks to be pretty terrible on PC too. Here we see a 2070 Super averaging below 60fps at XSX settings and a much lower internal resolution (1440p DLSS Balanced, ~836p internal, vs a 1584p average with DRS on XSX according to Oliver's video). And that is with ray tracing enabled.

 
Yeah, I was wondering why there wasn't any mention of the native resolution the Series X was working from if we're trying to go for optimized Series X settings; I had to go to Oliver's video to see what that was. For a 2070 Super to not maintain 60fps with DLSS Balanced at 1440p and optimized settings is pretty brutal. We're in The Last of Us territory of comparative GPU scaling here.
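
For reference, the ~836p figure lines up with DLSS Balanced's usual 58% per-axis render scale; a quick sanity check (the 0.58 factor is an assumption about the preset, not something stated in the video):

```cpp
// Rough check of the "~836p internal" figure for 1440p output with DLSS Balanced.
#include <cstdio>

int main() {
    const int output_w = 2560, output_h = 1440;   // 1440p output resolution
    const double balanced_scale = 0.58;           // assumed DLSS Balanced per-axis scale
    std::printf("internal: %dx%d\n",
                static_cast<int>(output_w * balanced_scale),   // ~1484
                static_cast<int>(output_h * balanced_scale));  // ~835
    return 0;
}
```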
 
Got the Ratchet and Clank issue with overcommitting VRAM when changing settings too, it seems, requiring a game restart. Damn, that's annoying.
Yeah, always annoying, but a lot of this is on the OS to improve/fix, unfortunately. Games can provide various hints and try to adjust things, but in these sorts of situations the OS sometimes ends up making catastrophically bad decisions about certain resources/allocations despite all the hinting/feedback, and it never recovers. We need WDDM to advance beyond what it was effectively doing decades ago, when GPUs didn't even have virtual memory.
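
For what it's worth, the "hinting/feedback" side on Windows is basically the DXGI video-memory budget machinery. A bare-bones sketch of querying it (assuming a D3D12-era Windows machine and just taking the first enumerated adapter; error handling kept minimal):

```cpp
// Query the OS-provided local VRAM budget and current usage via DXGI.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    // The OS reports how much local (dedicated) video memory the app should stay under.
    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    std::printf("budget: %llu MiB, current usage: %llu MiB\n",
                static_cast<unsigned long long>(info.Budget >> 20),
                static_cast<unsigned long long>(info.CurrentUsage >> 20));

    // A game over budget is expected to scale back (e.g. drop texture mips);
    // the OS can still make poor paging decisions regardless, which is the complaint above.
    return 0;
}
```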

Settings not sticking or silently changing other settings definitely sucks though and needs to get sorted out imo.
 