Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

Man, I'm glad I wasn't the only one thinking that. Even with a GTX 1070, I was wishing that game offered TressFX, because it was far less demanding on the system than Hairworks. TressFX might have actually been usable in game, versus the slideshow the game became with Hairworks (what a piece of dung, IMO).

I sometimes wonder if NV deliberately made Hairworks perform badly so that NV users would have to upgrade their graphics cards in order to use it.

Regards,
SB
I think they just picked the solution that would result in the largest performance gap vs AMD in benchmarks.
 
I think they just picked the solution that would result in the largest performance gap vs AMD in benchmarks.
I also feel like a lot of the Gameworks games used fallbacks that were less than optimal in some cases. The fallback hair on creatures in The Witcher 3 doesn't look as good as it could, I think. Same with the smoke in Arkham Knight. So it was a double-edged attack: Gameworks leveraged the advantage of nVidia hardware, of course, but the difference between on and off also had to be large enough to be desirable.
 
I also feel like a lot of the Gameworks games used fallbacks that were less than optimal in some cases. The fallback hair on creatures in The Witcher 3 doesn't look as good as it could, I think. Same with the smoke in Arkham Knight. So it was a double-edged attack: Gameworks leveraged the advantage of nVidia hardware, of course, but the difference between on and off also had to be large enough to be desirable.

I don't really agree with that. The non-Gameworks (and hardware/GPU PhysX) effects were identical to the console versions. Essentially, without those features the PC side would've just been identical to the console (and, I feel, comparable to other games at the time). This, I felt, was something that was misunderstood/misrepresented about those features (from either vendor). The GPU gap (and PC hardware gap) opened up much faster against the consoles during the PS3 and PS4 generations, when multiplatform games became the norm. It's the same story as now: developers aren't likely to want to pour heavy resources into PC-specific visual improvements or optimizations. So it was either diminishing-returns (and arguably inefficient) visual add-ons relative to the performance requirements, or basically nothing at all on the PC side aside from higher resolutions and higher frame rates (and I guess better AA).

As an aside, this is also why I feel ray tracing is important for the PC side going forward, due to its scalability relative to the development resources required. Without things like that, the primary visual advantage of the PC over the consoles is just going to be higher resolutions/framerates.
 
I also feel like a lot of the Gameworks games used fallbacks that were less than optimal in some cases. The fallback hair on creatures in The Witcher 3 doesn't look as good as it could, I think. Same with the smoke in Arkham Knight. So it was a double-edged attack: Gameworks leveraged the advantage of nVidia hardware, of course, but the difference between on and off also had to be large enough to be desirable.
Outside of HBAO+, Gameworks effects were often quite useless. More often than not they were completely broken and/or looked worse than the stock effects while being much slower. Quite glad that era is over and done with. AMD has now taken up that mantle with their awful FSR 2 and CACAO etc.
 
It's part of the reason I upgraded from an AMD card to nVidia: PhysX and Hairworks. They were locked to a vendor, and there wasn't an option for something close when you had a card from the other vendor.

Hairworks in Witcher3 often looked worse than without it, though; it would often completely flip out when animated, and it was a huge performance hog on Nvidia too - any Witcher3 optimization guide said "Turn Hairworks off, of course". It was often disabled even in GPU reviews, from my recollection.

The point about 'no one' asking "Where is my TressFX?" is that there was no expectation that this particular implementation would be a standard feature for games moving forward, like there is now with DLSS and reconstruction in general. Both TressFX and Hairworks are relatively uncommon in games (albeit it's kind of tough to say, as TressFX was made open source and, to my understanding, has been heavily modified in other engines).

When the Witcher3 came out, how many games were there with advertised TressFX support before? Tomb Raider, and...? Like yes, I get that somewhere, someone on the planet likely asked that question. The context of the statement, though, was what is a reasonable expectation with regard to included game features that the public expects, and hence why I think the dust kicked up by this particular FSR/DLSS situation is happening now. A physics hair simulation incorporated a handful of times (and before Witcher3, "handful" may be generous) is very different from a reconstruction method used in literally hundreds of games before this kerfuffle.

I don't really agree with that. The non-Gameworks (and hardware/GPU PhysX) effects were identical to the console versions. Essentially, without those features the PC side would've just been identical to the console (and, I feel, comparable to other games at the time). This, I felt, was something that was misunderstood/misrepresented about those features (from either vendor). The GPU gap (and PC hardware gap) opened up much faster against the consoles during the PS3 and PS4 generations, when multiplatform games became the norm. It's the same story as now: developers aren't likely to want to pour heavy resources into PC-specific visual improvements or optimizations. So it was either diminishing-returns (and arguably inefficient) visual add-ons relative to the performance requirements, or basically nothing at all on the PC side aside from higher resolutions and higher frame rates (and I guess better AA).

Exactly, which is why "Ultra settings are a waste" is an adage now. More often than not, the PC-specific enhancements you get in games are largely just upping the fidelity of effects, especially these days when texture quality is often not improved due to the ample memory the consoles have. It's not that these can't still end up being significant: better LOD and even common stuff like anisotropic filtering can have a noticeable impact, and of course on a PC you can tailor these options for your framerate, whereas a console may disable certain things like SSR for a game's 'performance mode'.

But they've never been particularly well optimized; it's just turning the dials up to 11 and letting performance be damned - there's an "Ultra" setting, look PC gamers, we support you! Effects that were actually constructed in a completely different way and didn't exist on the consoles were usually just not in the cards unless Nvidia/AMD provided the code for them.
 
I'm assuming on console it's gonna be FSR2, with Series X getting 1440p upscaled to 4K with dynamic resolution, and Series S going from 1080p with dynamic res upscaled to 1440p.
 
I'm assuming on console it's gonna be FSR2, with Series X getting 1440p upscaled to 4K with dynamic resolution, and Series S going from 1080p with dynamic res upscaled to 1440p.
John L estimated some scenes from the Series X footage were lower than 1440p (1296p), so the native resolution is also dynamic. So it'll be interesting to see how low Series S may drop when things get crazy.
 
John L estimated some scenes from the Series X footage were lower than 1440p (1296p), so the native resolution is also dynamic. So it'll be interesting to see how low Series S may drop when things get crazy.
That's what I meant, yeah. The image is only getting smart-upscaled, so it's not really 4K or 1440p to begin with. I only have a Series S, but I hope image quality and performance are decent at 30 fps.
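
As a rough aside, here's a minimal sketch of the render-to-output ratios being discussed, using the numbers from the posts above. The 0.9 dynamic-res scale behind the ~1296p figure is my own back-of-envelope assumption for illustration, not anything confirmed.

```python
# Back-of-envelope look at the upscaling ratios discussed above.
# Nominal render resolutions (1440p on Series X, 1080p on Series S) and
# outputs (4K / 1440p) come from the thread; the 0.9 DRS scale is assumed.

def scaled(res, drs_scale):
    """Apply a per-axis dynamic resolution scale to a (width, height) pair."""
    return round(res[0] * drs_scale), round(res[1] * drs_scale)

def upscale_ratio(render, output):
    """Per-axis factor the upscaler has to cover from render to output."""
    return round(output[0] / render[0], 2), round(output[1] / render[1], 2)

series_x_nominal = (2560, 1440)   # nominal Series X render resolution
series_x_output  = (3840, 2160)   # 4K output
series_s_nominal = (1920, 1080)   # nominal Series S render resolution
series_s_output  = (2560, 1440)   # 1440p output

print(upscale_ratio(series_x_nominal, series_x_output))               # (1.5, 1.5)
print(scaled(series_x_nominal, 0.9))                                  # (2304, 1296) -> the ~1296p case
print(upscale_ratio(scaled(series_x_nominal, 0.9), series_x_output))  # (1.67, 1.67)
print(upscale_ratio(series_s_nominal, series_s_output))               # (1.33, 1.33)
```

So at nominal resolutions the Series X sits around a 1.5x per-axis ratio and drops to roughly 1.67x in the 1296p case, while the Series S is around 1.33x, at least under these assumed numbers.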
 
That's what I meant, yeah. The image is only getting smart-upscaled, so it's not really 4K or 1440p to begin with. I only have a Series S, but I hope image quality and performance are decent at 30 fps.
Bethesda Game Studios has two months to do more optimisation. You have to believe that the game and content are feature complete, so now all that is left is squashing bugs and eking every last ounce of performance out of the two Xbox consoles.
 
...and eking every last ounce of performance out of the two Xbox consoles.

This is Bethesda; there's only so much that Microsoft can help them with in avoiding the bad side of Bethesda. :p I'm expecting continued performance optimization for the next 1-2 or more years. :) And that would be a good thing.

Regards,
SB
 
This is Bethesda; there's only so much that Microsoft can help them with in avoiding the bad side of Bethesda. :p I'm expecting continued performance optimization for the next 1-2 or more years. :) And that would be a good thing.
I'm quietly optimistic about Starfield. Fallout 4 (2015) showed a massive improvement in performance, hardware utilisation and stability over Skyrim. They leaned heavily on other expertise, like id on getting weapons to feel better, and I would bet they have been doing the same with their whole tech stack.

Let's be honest, Bethesda Game Studios games have looked great for open-world RPGs with millions of objects, but they've never been lookers. Starfield changes that; it looks fantastic, with the exception of Bethesda-face. Nothing that space helmets cannot hide :runaway:
 
For Skyrim, which had the 360 as lead platform, I always thought the performance on that SKU was pretty good. It was the PS3 that was a big problem, and the PC, which didn't have its hardware taken advantage of. Hopefully this is closer to 360 Skyrim.
 
A large part of Skyrim's problem was the 2GB memory limit, and mods were only able to increase that to 4GB. The PC's higher settings really pushed into those memory limits. Fallout 4 (and Skyrim SE) being 64-bit is likely the single biggest reason for the better optimization/stability on the PC.

The double-edged sword here is DX12. On paper, DX12 seems like it would help Bethesda's games. However, DX12 also has a poor track record, especially on first implementation, and this is their first DX12 game. Memory-usage issues on the PC also seem to be an ongoing problem with DX12, which is pertinent given Bethesda's track record with this on the PC.
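
As a side note on the 2GB/4GB point above: the community "4GB patches" for 32-bit executables worked by flipping the Large Address Aware bit in the EXE's PE/COFF header, which lets a 32-bit process use more than 2GB of address space on 64-bit Windows. Here's a minimal sketch of checking that flag; the TESV.exe path is just illustrative.

```python
# Sketch: check whether a Windows EXE has the Large Address Aware flag set.
# This is the bit the old "4GB patch" style tools toggled.
import struct

IMAGE_FILE_LARGE_ADDRESS_AWARE = 0x0020  # PE/COFF Characteristics flag

def is_large_address_aware(exe_path: str) -> bool:
    with open(exe_path, "rb") as f:
        f.seek(0x3C)                           # e_lfanew: offset of the PE signature
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset)
        assert f.read(4) == b"PE\x00\x00"      # PE signature
        # COFF header layout: Machine(2) NumSections(2) TimeDateStamp(4)
        # PtrToSymTable(4) NumSymbols(4) SizeOfOptionalHeader(2) Characteristics(2)
        f.seek(pe_offset + 4 + 18)             # jump to Characteristics
        characteristics = struct.unpack("<H", f.read(2))[0]
        return bool(characteristics & IMAGE_FILE_LARGE_ADDRESS_AWARE)

print(is_large_address_aware("TESV.exe"))      # hypothetical path to a 32-bit EXE
```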
 
I don't really agree with that. The non-Gameworks (and hardware/GPU PhysX) effects were identical to the console versions.
I'm saying that I think they should look better than the console versions, though. Anything that has a Gameworks option would have been on PC, and the fact that the best they could do without Gameworks was console quality tells you everything you need to know. It could have looked better, because there are settings that make it look better. They are just locked to one vendor.

Look at any game with PhysX from back then. If you enabled hardware PhysX, it didn't just add more accurate physics; it added so many more particles. It's the same with Hairworks. Some of the monsters hardly look like they have fur compared to their lush Hairworks-on counterparts. There was obviously a half-step in between.
Hairworks in Witcher3 often looked worse than without it, though; it would often completely flip out when animated, and it was a huge performance hog on Nvidia too - any Witcher3 optimization guide said "Turn Hairworks off, of course". It was often disabled even in GPU reviews, from my recollection.
Yeah, it was glitchy on characters, but it looked great on the monsters.
I'm expecting continued performance optimization for the next 1-2 or more years.
I'll see you guys in the Legendary Edition.
 
This is Bethesda; there's only so much that Microsoft can help them with in avoiding the bad side of Bethesda. :p I'm expecting continued performance optimization for the next 1-2 or more years. :) And that would be a good thing.

Regards,
SB

Starfield has at least one expansion planned, and I wouldn't doubt there are more in the pipeline, since it's an easy way to monetize the game in addition to Game Pass. Have the base game free on Game Pass, but hey, here's the first expansion, it's $20; then 6-12 months later here's the second expansion and it's $20, and right now you can buy expansions 1 and 2 for $30, and oh, here comes expansion 3. It would be smart of them to continue making expansions until an ES6 release. Skyrim could have been an eternal breadwinner for them if they had continued this way over the last, what, 12 years of that title's life.

At the same time, the more they optimize the Creation Engine, the better results ES6 and Fallout 5 will have.


As for FSR vs DLSS: I prefer true resolution, but in all honesty, if this gets FSR 2.0 or 2.1 it's good enough vs DLSS, and FSR works on AMD/Intel/Nvidia. If this were a DLSS-only title, then I'd be SOL on my Steam Deck. My 3080 will be just fine using FSR2.
 
Bethesda Game Studios has two months to do more optimisation. You have to believe that the game and content are feature complete, so now all that is left is squashing bugs and eking every last ounce of performance out of the two Xbox consoles.

It's likely more than 2 months. I am sure the footage from the trailer was captured in April/May of this year, giving them time to go through and pick what was, to them, the best footage to fit the narrative they presented. Remember, that was shown June 12th, so even from there they had almost 3 months before the game would release.
 
There's a massive chasm in my mind, though, between giving engineering resources to a studio to add an additional graphical effect that runs best on your GPU architecture and never would have existed in the game without that marketing deal, and putting in language that specifically restricts commonly used features.

There were two incidents in the past regarding this. With the release of Batman: Arkham Asylum (sponsored by NVIDIA) in 2009, the game locked MSAA to NVIDIA GPUs only; AMD cards couldn't access that feature. AMD complained about this move in a blog post and declared their GPUs perfectly capable of running the new MSAA. Following that, there was a huge uproar on the internet, enough that NVIDIA and the developer caved in and allowed MSAA to run on AMD cards with no problems.

The logic behind this was a bit unusual, though. NVIDIA came forth and stated they helped develop the MSAA implementation in the game: the game used UE3 with deferred rendering, which is incompatible with MSAA, and the developer couldn't make traditional MSAA work with the game, so NVIDIA stepped in and helped them build it; the developer then locked it to NVIDIA's GPUs. The developer reiterated this narrative as well, stating they probed AMD on the matter before the game's launch but AMD didn't care enough to make MSAA work on their hardware.

The second incident happened in 2008, with the release of Assassin's Creed. The game was sponsored by NVIDIA and supported DX10, but then Ubisoft added DX10.1 support in a later patch. At that time DX10.1 was a rare occurrence, and it was only supported on AMD's HD 3000 series GPUs.

With DX10.1 the game ran faster on AMD GPUs, but then Ubisoft stepped in and suddenly removed DX10.1 in a following patch. There was another uproar on the internet, and people pointed fingers at NVIDIA, claiming they had pressured Ubisoft to remove DX10.1. Ubisoft denied the allegations, and so did NVIDIA. It was later revealed that DX10.1 altered the image quality of the game and removed some post-processing effects, which is why it ran faster on AMD GPUs; this was reproduced by testing from independent media. Ubisoft removed DX10.1 but never cared to fix it or add it back later. The entire ordeal was transparent from start to finish, though, with NVIDIA and Ubisoft responding directly to the press and denying any kind of deal, and AMD never made any accusations or complaints about the matter.

In fact, in both incidents NVIDIA and the developers came forth and made official statements that clarified their positions, unlike the radio silence we have now despite the massive uproar.
 