Digital Foundry Article Technical Discussion [2023]

Alan Wake 2 also conveniently has VERY limited draw distances for the vast majority of the game. This is undoubtedly helping quite a bit in this regard.

Also, speaking of, Rich (DF) did testing on non-mesh-shader GPUs and found that while the 5700XT does struggle, it seems able to manage a *nearly* playable level of performance, while a 1080Ti is just basically unusable trash, barely scraping 20fps. Thought that was interesting, as I'm not sure what would be causing that discrepancy. He also tests a 1660Ti, which is the non-RT Turing part, and that does seem to have mesh shader support and runs OK as a result.
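For anyone wondering what "mesh shader support" actually means at the API level, here's a minimal sketch of the capability query a game or test tool can run. This is just my own illustration (assuming a Windows/D3D12 environment with a recent SDK and d3d12.lib linked), not DF's or Remedy's actual code; Turing parts like the 1660Ti report a mesh shader tier, while Pascal reports it as not supported:

```cpp
// Minimal sketch: ask the default D3D12 adapter whether it exposes mesh shaders.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

int main()
{
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12 feature level 12_0 device found.\n");
        return 1;
    }

    // OPTIONS7 carries the mesh shader (and sampler feedback) tiers.
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    const bool meshShaders =
        SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier != D3D12_MESH_SHADER_TIER_NOT_SUPPORTED;

    std::printf("Mesh shaders: %s\n", meshShaders ? "supported" : "not supported");
    return 0;
}
```

Whether a card passes that check is a hard architectural line; it says nothing about how fast the fallback path runs, which is where the 5700XT vs 1080Ti gap comes in.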

By draw distance do you mean lighting effects or texture quality? Because the latter has been turned down in the latest Cyberpunk updates and it looks quite bad now, even at 4K.
 
Got it in 2019 when it was still the best GPU available. In benchmarks without RT it was better even than the early RTX GPUs. Still kind of a shame. Still has a lot of performance in there.


But what to do? This is how business works.
for now it's just one game, so if you don't want to play AW2 on PC, you can keep it and it will still be pretty good in other games
 
By draw distance do you mean lighting effects or texture quality? Because the latter has been turned down in the latest Cyberpunk updates and it looks quite bad now, even at 4K.
I mean the overall draw distance of the environments in general. The game is constantly enveloping the scene in fog and whatnot, allowing them to do a lot more with the more limited environmental detail that has to be drawn in. Not saying it's a bad thing, it's just undoubtedly helping with things like eliminating pop-in (and of course pushing fidelity).

The 1080Ti had fulfilled its purpose, lasting from 2017 to 2023; 6 years is more than enough. In the times of old, the 1080Ti would've become obsolete within a couple of years. We quickly transitioned from DX5 in 1997 to DX11 in 2009, meaning six major DirectX versions in the span of about 12 years, each version obsoleting the one before it. DX9 itself had multiple revisions obsoleting each other: when DX9.0c came with its Shader Model 3, it obsoleted the DX9.0b cards before it (with their Shader Model 2). We've had a good hiatus with DX11 and DX12, when DX11 cards extended their support into DX12. That hiatus should have ended with the release of DXR and DX12U after it, but it didn't, and now it's happening. People shouldn't feel surprised; instead they should prepare for this, as we've had enough of a transition period already.

At any rate, the 1080Ti and the Pascal architecture in general should be compared to their period-correct competitors, which are Vega and Polaris. They all lack mesh shaders, so which one would come out on top?
A non-mesh shader GPU doing badly is not surprising. But a 5700XT was originally thought to not have any kind of mesh shader support, while it turns out it does. There was a whole lot of talk recently with this game about how the 5700XT had become 'obsolete' because of this.
 
Got it in 2019 when it was still the best GPU available. In benchmarks without RT it was better even than the early RTX GPUs. Still kind of a shame. Still has a lot of performance in there.


But what to do? This is how business works.
The 2080 Ti came out in Sept 2018, as did the 2080. The 1080 Ti was never the best available card in 2019.
 
So much hate on the 1080ti, why?

Put some respect on its name, as it was the first 'real' 4K GPU in my eyes and it has held its own.

In fact trying to find a 'modern' GPU that's aged as well as it has is incredibly difficult.
what? there is no hate. it just doesn't have mesh shaders, so a game that needs them won't be performant on the 1080 ti
but most games don't require mesh shaders, so where is the hate?
 
No hate, it's still in the recommended settings for COD: MW3

 
what? there is no hate. it just doesn't have mesh shaders, so a game that needs them won't be performant on the 1080 ti
but most games don't require mesh shaders, so where is the hate?
It was a piece of trash on another page.

I've just looked at modern GPU reviews on Techpowerup and was surprised to see it averaging around RTX2080 performance.

So around console level which is impressive.
 
It was a piece of trash on another page.

I've just looked at modern GPU reviews on Techpowerup and was surprised to see it averaging around RTX2080 performance.

So around console level which is impressive.
well it's "trash" in aw2 but it isn't in other games, so if you only play aw2 and you have a 1080 ti, you will be pretty annoyed at its performance i would say, but that is a ridiculous scenario
 
It was a piece of trash on another page.
Holy crap dude, CONTEXT.

I didn't say the 1080Ti was a piece of trash in general, I just said it performed like a piece of trash in this specific game, and I was referencing a comparison to a 5700XT, which it turns out does seem to have some level of mesh shader support, since it's like 50% faster than the 1080Ti when in no other situation should that be the case.
 
Holy crap dude, CONTEXT.

I didn't say the 1080Ti was a piece of trash in general, I just said it performed like a piece of trash in this specific game, and I was referencing a comparison to a 5700XT, which it turns out does seem to have some level of mesh shader support, since it's like 50% faster than the 1080Ti when in no other situation should that be the case.

I don't think the 5700XT has any mesh shader support. It's just running the fallback path, which I guess would be vertex-shader based, much faster than Pascal does. Any card that gets the popup warning about mesh shaders not being supported is not running mesh shaders in Alan Wake 2.

Edit: The dev who had originally posted about the mesh shader requirements said there was a vertex fallback that they were working on but basically abandoned because the performance is so bad. It's obviously still in there because the game runs, but it can basically be considered unoptimized/unfinished and not representative of anything useful.
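To put that in concrete terms, the choice between the mesh shader path and the fallback is effectively made once per device, and anything reporting no mesh shader tier lands on the vertex path regardless of how fast that path happens to be on a given architecture. A rough, hypothetical sketch of that branch (the names are mine, not Remedy's):

```cpp
// Hypothetical sketch of how a renderer might pick its geometry path at startup.
// GeometryPath/ChooseGeometryPath are illustrative names, not engine code.
#include <d3d12.h>

enum class GeometryPath { MeshShaders, VertexFallback };

GeometryPath ChooseGeometryPath(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &opts7, sizeof(opts7))) &&
        opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
    {
        return GeometryPath::MeshShaders;     // e.g. Turing, RDNA2 and newer
    }
    // Cards with no mesh shader tier (Pascal, RDNA1) end up here: the game
    // still runs, but on the unoptimized vertex-shader path behind the popup.
    return GeometryPath::VertexFallback;
}
```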
 
A non-mesh shader GPU doing badly is not surprising. But a 5700XT was originally thought to not have any kind of mesh shader support, while it turns out it does. There was a whole lot of talk recently with this game about how the 5700XT had become 'obsolete' because of this.
I'm hoping that is not the take away from my post as it was more of a question rather than a confirmation. Even if it turns out that AMD's version of Mesh shaders are just Primitive shaders I'm sure there was a great deal of optimization and changes made to them by the time RX6000 came out to make them more suitable for Mesh Shading.
 
Great job @oliemack and @Dictator

Appreciate the 1-2 combo you both bring here, giving a full in-depth run-through on both PC and console.

Series S is performing very well considering its rendering power. The X is also doing well; the PS5 may need some tweaking, but if not, we're seeing an indication of GPU limits if it's behaving like that (or the lack of feature compliance may have an effect here).

Loading differences are significant, however: nearly 50% faster on PS5.
 
Great job @oliemack and @Dictator

Appreciate the 1-2 combo you both bring here, giving a full in-depth run-through on both PC and console.

Series S is performing very well considering its rendering power. The X is also doing well; the PS5 may need some tweaking, but if not, we're seeing an indication of GPU limits if it's behaving like that (or the lack of feature compliance may have an effect here).

Loading differences are significant, however: nearly 50% faster on PS5.
Given that it's still the same Northlight engine which, even at the start of the gen when XSX XDK wasn't up to par, was faster on the XSX, I think it's simply the XSX GPU advantage bearing out. It is a very GPU/compute-heavy game and the PS5 GPU operates a little above the 2080 GPU on PC in this game.
 