Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

I also saw the Gamers Nexus CPU scaling video. This is a confusing game. I can't figure out what it scales with.

The GN CPU video's comparison between DDR5 kits looks pretty unimpressive, likely because the game cares more about latency/timings than raw bandwidth.
The three kits tested differ meaningfully in bandwidth, but their latencies/timings in the absolute sense aren't far off, and so neither are the scores.
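A quick back-of-the-envelope sketch of why absolute latency can stay flat while rated bandwidth climbs (the kit specs below are hypothetical, not the ones GN tested):

```python
# First-word latency in ns: CL cycles divided by the memory clock. The clock
# is half the MT/s figure (DDR transfers twice per clock), so:
#   latency_ns = CL / (MT/s / 2) * 1000 = 2000 * CL / MT/s
def cas_latency_ns(mts: int, cl: int) -> float:
    return 2000.0 * cl / mts

# Hypothetical DDR5 kits: rated bandwidth spreads ~14%, absolute latency doesn't.
for mts, cl in [(5600, 28), (6000, 30), (6400, 32)]:
    print(f"DDR5-{mts} CL{cl}: {cas_latency_ns(mts, cl):.1f} ns")  # all 10.0 ns
```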


Folks with prebuilts or custom builds who messed up their RAM install and have both sticks on the same channel are going to have a bad time.
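For scale, a minimal sketch of what single- vs dual-channel does to peak theoretical bandwidth (assuming standard 64-bit channels; DDR5-5600 is just an example speed):

```python
# Peak theoretical bandwidth of 64-bit DDR channels: MT/s x 8 bytes per transfer.
def peak_bandwidth_gbs(mts: int, channels: int) -> float:
    return mts * 8 * channels / 1000.0

for channels in (1, 2):
    print(f"DDR5-5600, {channels} channel(s): {peak_bandwidth_gbs(5600, channels):.1f} GB/s")
# Both sticks on one channel halves the peak: 44.8 vs 89.6 GB/s.
```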
 
I also saw the Gamers Nexus CPU scaling video. This is a confusing game. I can't figure out what it scales with.
On the AMD side, it looks like L3 cache has an effect. The more midrange parts do well: the 5800X3D, with its larger L3 cache, outperforms both the 5950X (more cores, same base clock, higher boost) and the 5800X (higher frequency across the board, same number of cores).

I do think, at this point, a lot of reviews are benchmarking different spots in the game. I haven't played it yet, but the HUB video looks to be from further into the game than most. They say they found the worst-performing spot they could, to show a worst-case scenario, and while I do think that's a great way to do things... this is a Bethesda game. If history has shown anything, it's that their games have weird performance that isn't necessarily the same on all hardware.

HUB clearly states that the Intel parts are faster at the high end, and has the numbers to support that, but I wouldn't be surprised if plenty of other parts of the game perform better on AMD hardware. That said, my hypothesis about L3 cache is equally shaky; there are likely parts of the game that want more frequency.
 
Is it a coincidence that DLSS gets added to the game just a short while after AMD suddenly breaks silence? It happened with Avatar: Frontiers of Pandora too.

And similarly with Avatar, DLSS support was treated as a side note (while touting the amazingness of FSR). If I were to hazard a guess, I'd say AMD has now changed its contractual terms to allow DLSS, but only on the condition that it's not promoted, merely announced. If so, I've no problem with that, though.
 
Is it a coincidence that DLSS gets added to the game just a short while after AMD suddenly breaks silence? It happened with Avatar: Frontiers of Pandora too.

Hopefully this opens the floodgates. Dead Island 2 up next please, maybe Borderlands 3/Tiny Tina's Wonderlands. Hell, Resident Evil 4 would be the best (especially as the DLSS mod has problems with bloom), but this being Capcom, I'm not holding my breath.

It was likely either a timed exclusivity deal, or the heat finally got too much and AMD has had to retroactively scrub the restriction from its agreements (edit: as already noted by @pjbliverpool). John from DF's post should finally end this 'debate' once and for all, although to anyone with any degree of objectivity it should have been plainly obvious after AMD's first non-response.
 
I also saw the Gamer's Nexus CPU scaling video. This is a confusing game. I can't figure out what it scales with.
Bandwidth/latency? (See "[SPECULATIVE RAMBLING] Starfield seems to be very RAM bandwidth limited" by Actually Hardcore Overclocking.) From the video: "In looking at this [PCGamesHardware.de] chart, it really looks like Starfield may have the performance scaling of the AIDA bandwidth benchmark."

Might explain the 30fps limit on the Xbox Series consoles, as they run higher-latency GDDR6 rather than desktop DDR.

A comment under the video:
@Me1as said:
Hi Buildzoid, I have a 13900KS with two 24GB sticks of RAM running at 8000 MT/s and an RTX 4090.

Since I saw the video and had the right setup to try this, I did.
I looked at how the article came up with those numbers and downloaded their savefile to do the same thing.
I had the CPU at stock, the GPU power limit at 600W, rBAR enabled, and ran Windows 11.

With RAM speed set to 5600 MT/s (same as the 13900K in that article) I got an average framerate of 109fps, so by all means close enough to their 110-111. GPU usage was at 85%, every P-core at 100% on only one of its two threads, and E-cores at around 80%.

With 8000 MT/s I managed to get an average of 122fps, but GPU usage was at 98% (GPU-limited for sure when using the same settings as the article), P-cores were still at 100% and E-cores now at around 90%.

Since the second result managed to get GPU-limited, I changed the Shadow Quality to Low and re-ran the tests.
5600 MT/s achieved 113fps, with GPU usage now at 75%
8000 MT/s achieved 135fps, with GPU usage now at 84%
so that is not the "perfect" scaling I was expecting, but it is still a lot for a game (1.19x the fps for a 1.43x higher data rate).

Adding some more of my test results (with Shadow Quality set to Low), maybe useful for a graph?
6000 MT/s achieved 117fps
6400 MT/s achieved 121fps
6800 MT/s achieved 129fps
7200 MT/s achieved 131fps
7600 MT/s achieved 133fps
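Feeding the averages reported above into a quick script gives a feel for the slope (a rough sketch; fps figures exactly as quoted):

```python
# Reported Starfield averages vs RAM data rate (Shadow Quality Low),
# from the comment quoted above.
data = [(5600, 113), (6000, 117), (6400, 121), (6800, 129),
        (7200, 131), (7600, 133), (8000, 135)]

base_mts, base_fps = data[0]
for mts, fps in data[1:]:
    fps_gain = fps / base_fps - 1.0
    rate_gain = mts / base_mts - 1.0
    # Fraction of the data-rate increase that shows up as extra fps.
    print(f"{mts} MT/s: {fps} fps (+{fps_gain:.0%} fps for +{rate_gain:.0%} rate, "
          f"ratio {fps_gain / rate_gain:.2f})")
```

The ratio hovers around 0.5 (0.45 to 0.66 across the steps), i.e. roughly half of each extra percent of data rate shows up as extra frames, consistent with the 1.19x-for-1.43x figure above.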
Edit: Didn’t mean to quote your post back to you, homerdog.
 
Surprised this thread is still going.
From the start everyone knew DLSS and XeSS were going to get modded in pretty much immediately. Would it have been nice for them to be first-class citizens? Sure, but I'm pretty sure PC gamers understand that. Everywhere that notes the game doesn't support DLSS also mentions the mod.

An FOV slider would probably be even easier to implement, but for whatever reason it was deemed low priority, even though you can change the FOV in an ini file.
That shows decisions can be made irrespective of contracts; it doesn't mean a marketing association can't sway how much of a priority something is, though.
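For reference, a minimal sketch of the ini route, assuming the StarfieldCustom.ini path and the fFPWorldFOV/fTPWorldFOV keys that community guides circulate (none of this is officially documented):

```python
# Sketch: append the community-documented FOV overrides to StarfieldCustom.ini.
# Path and key names are as circulated in community guides, not official docs.
from pathlib import Path

ini = Path.home() / "Documents" / "My Games" / "Starfield" / "StarfieldCustom.ini"
ini.parent.mkdir(parents=True, exist_ok=True)
with ini.open("a", encoding="utf-8") as f:
    f.write("[Camera]\nfFPWorldFOV=100\nfTPWorldFOV=100\n")
```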
 
Surprised this thread is still going.

The game was just made available to the general public; its release confirms that DLSS isn't included, as many feared; and mods have already been released. I mean, why wouldn't this thread 'still' be going at this point? Odd comment.

From the start everyone knew DLSS and XeSS were going to get modded in pretty much immediately.

Would it have been nice for them to be first-class citizens? Sure, but I'm pretty sure PC gamers understand that. Everywhere that notes the game doesn't support DLSS also mentions the mod.

I fail to see how this is relevant at all. The thread was not born out of a simple technical question of how to add it; it was about the corporate politics involved in preventing developers from adding it themselves. It's not a nothingburger simply because modders can technically add it; the ease with which they did just further underscores that the decision not to include it was indeed political.

An FOV slider would probably be even easier to implement, but for whatever reason it was deemed low priority, even though you can change the FOV in an ini file.
That shows decisions can be made irrespective of contracts; it doesn't mean a marketing association can't sway how much of a priority something is, though.

Oh please, this is truly ridiculous at this point. It was never about 'time constraints' or 'priorities being swayed'. The actual reason was always obvious, was made even clearer when a modder added DLSS within two hours, and now we have confirmation from John of DF that working implementations in other games were actually removed due to AMD's contracts.

It wasn't included because AMD didn't want it included; after the huge backlash they were forced to put the onus on Bethesda, likely modifying the contract to remove that exclusionary wording. It is not a coincidence that Jedi: Survivor just got a DLSS patch after all this, either.

[attached screenshot]
 
Amazingly enough, Star Wars Jedi: Survivor didn't just add DLSS 2, they also added DLSS 3 Frame Generation!!

This patch adds support for both DLSS 3 Super Resolution and DLSS 3 Frame Generation. As such, PC gamers won’t have to use PureDark’s DLSS 3 Mod. Since the game now officially supports DLSS 3, we’ll be sure to re-test it, so stay tuned for more!

Gameplay videos with the official DLSS 3 implementation: [embedded videos]
 
Yes, but also cache and bandwidth. Plus it's really heavy on the GPU. It's like it's limited by everything all at once.
Yes, but that's because of how things are tested. Every GPU benchmark uses the fastest available CPU and memory. Every CPU benchmark uses the fastest GPU and memory.
 
Yes, but also cache and bandwidth. Plus it's really heavy on the GPU. It's like it's limited by everything all at once.

I feel this might be a showcase of the limitations of single-scene/scenario testing, especially for deeper-dive analysis.

My feeling is that different scenarios place a heavy emphasis on different aspects of the hardware here, and that a single scenario run, averaged out, might not convey the entire picture.
 
Would be very interesting to see what they do to make it so heavy.
I kind of understand why it's heavy on the CPU/memory side, being a massive Bethesda game, although it's so heavily segmented by loading screens that I'm not sure that argument makes a ton of sense. But the way it smashes the GPU is baffling to me.
 
The game was just made available to the general public; its release confirms that DLSS isn't included, as many feared; and mods have already been released. I mean, why wouldn't this thread 'still' be going at this point? Odd comment.



I fail to see how this is relevant at all. The thread was not born out of a simple technical question of how to add it; it was about the corporate politics involved in preventing developers from adding it themselves. It's not a nothingburger simply because modders can technically add it; the ease with which they did just further underscores that the decision not to include it was indeed political.



Oh please, this is truly ridiculous at this point. It was never about 'time constraints' or 'priorities being swayed'. The actual reason was always obvious, was made even clearer when a modder added DLSS within two hours, and now we have confirmation from John of DF that working implementations in other games were actually removed due to AMD's contracts.

It wasn't included because AMD didn't want it included; after the huge backlash they were forced to put the onus on Bethesda, likely modifying the contract to remove that exclusionary wording. It is not a coincidence that Jedi: Survivor just got a DLSS patch after all this, either.

[attached screenshot]

I think this thread should remain open at least until Bethesda officially adds DLSS to the game, now that they are allowed to do so 😉
 