Ultra Settings are Dumb

FC6 doesn’t disappoint.

Roughly 35% better performance at medium than at ultra (without RT), with no discernible IQ upgrade from cranking it up (e.g., if ultra runs at 60 fps, medium lands around 81 fps).

 
Wow... this thread is depressing. How about we talk about games that have the biggest visual difference between high and very high/ultra?

Crysis (2007) comes to mind; the difference between high and very high is insane!

You are misremembering. The major upgrade was going from medium to high. Very high over high was more refinement than anything else.

Isn't that a consequence of the tech maturing as much as anything? The baseline for geometry/lighting/shadows is so much higher than in 2007. Cranking them up just doesn't net you as much anymore, or much at all.

Maybe it changes when we see games actually developed properly for this gen? Ultra reduces light propagation lag? More dynamic and varied foliage? More of the environment reflected?

Is that enough for people to stroke their $3000 PC with loving satisfaction?

Tech maturation is certainly an aspect, but I feel the homogenization of game development around console capabilities is by far the leading factor. It simply costs too much money to offer the PC platform any significant differentiation in today's games. The gap between console and PC hardware has also been consistently shrinking each generation as silicon scaling becomes less beneficial and takes longer to achieve.
 
I don't understand; surely labelling something as 'ultra' is like turning your amp up to 11.
It's just a term.
 
Btw, is the lack of quality boost with ultra more due to developers' wizardry in making lower-quality settings look similar to ultra? Or is ultra just a lazy way to let people show off, with placebo doing the rest? Or is it a mix of both?
 
The gap between console and PC hardware has also been consistently shrinking each generation as silicon scaling becomes less beneficial and takes longer to achieve.

Seems like it's the other way around; the hardware gap is way larger now than it was with the first Xbox in 2001, for example. Console settings are usually already around medium these days, and RT and DLSS only widen that gap.
 
In my experience, they are usually a resource hog with little to no tangible advantage.

This was less noticeable on my GTX 1080 than it was on my GTX 1060 3GB, but it was still noticeable.
 
Sometimes I wish there were fewer options in PC games: just resolution, framerate, field of view, and post-processing options, and the game should take care of the rest...
I don't see why I should set texture or shadow quality, or draw distance, when the game can check available memory and run a quick benchmark for shadows (or better, use virtual shadow maps), and do the same for draw distances.
It feels more like we have plenty of boxes to check than anything actually useful.
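
To make the idea concrete, here's a minimal sketch of that auto-detect pass in Python. Everything in it is hypothetical: query_vram_gb() and benchmark_shadows_ms() are stand-ins for engine calls, and the tier thresholds are invented for illustration.

[code]
import time

def query_vram_gb() -> float:
    # Hypothetical placeholder: a real engine would ask the graphics API
    # (DXGI, Vulkan, etc.) for dedicated VRAM.
    return 8.0

def benchmark_shadows_ms() -> float:
    # Hypothetical placeholder: a real version would render a few frames of a
    # shadow-heavy test scene; a dummy CPU workload stands in for it here.
    start = time.perf_counter()
    sum(i * i for i in range(1_000_000))
    return (time.perf_counter() - start) * 1000.0

def auto_configure() -> dict:
    """Pick texture/shadow/draw-distance tiers from measured headroom."""
    vram = query_vram_gb()
    shadow_ms = benchmark_shadows_ms()
    return {
        # Texture quality is keyed off memory, not compute.
        "textures": "ultra" if vram >= 10 else "high" if vram >= 6 else "medium",
        # Shadow quality is keyed off the micro-benchmark.
        "shadows": "high" if shadow_ms < 4.0 else "medium",
        # Draw distance needs both memory and compute headroom.
        "draw_distance": "far" if vram >= 8 and shadow_ms < 4.0 else "near",
    }

print(auto_configure())
[/code]

The point is just that the inputs (available memory, a timed test scene) are things the game already knows how to measure.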

My only worry there is that, in modern games, aesthetics can get in the way of gameplay or visibility. In lots of games you need to be able to read a situation quickly and react, but busy visuals or layers of post-processing make that difficult. Sometimes low/medium is the best way to play.
 
 
There's a fear of missing out; I know I fall into this category. I've known about the diminishing returns of ultra/max settings since well before this video, but I'm still, admittedly, psychologically fixated on max settings or bust (well, for the most part). Why? Given the heavy amount of content available these days relative to the time available, SP games (or at least story-driven ones) are mostly relegated to one-and-done affairs for me. That means I feel pressured into getting the maximum experience on that one and only playthrough.

Or at least the optimal experience. Which can be a problem in itself, as it might take time to find which settings have a high performance impact for low visual gain. Another issue is that settings don't have uniform impact in every scene, so testing only at the opening may not tell the full picture. Ultimately this is again a fear-of-missing-out and lack-of-time issue; it's just simpler to set everything to max and not worry about it.
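
If someone did want to find the expensive settings systematically, a one-at-a-time sweep is about all it takes. A minimal sketch, assuming a hypothetical measure_frame_time_ms() harness; the per-setting costs in the stub are invented, and as noted above a real sweep should cover several scenes, not just the opening.

[code]
import random

SETTINGS = ["textures", "shadows", "volumetrics", "reflections", "foliage"]

def measure_frame_time_ms(overrides: dict) -> float:
    # Hypothetical placeholder: a real harness would apply the overrides
    # in-game and time a fixed camera path. Costs here are invented.
    cost = {"textures": 0.4, "shadows": 2.1, "volumetrics": 1.6,
            "reflections": 1.2, "foliage": 0.6}
    ultra_cost = sum(cost[s] for s, v in overrides.items() if v == "ultra")
    return 10.0 + ultra_cost + random.uniform(-0.1, 0.1)

def sweep() -> None:
    """Raise one setting at a time from medium to ultra and log the delta."""
    baseline = measure_frame_time_ms({s: "medium" for s in SETTINGS})
    for setting in SETTINGS:
        overrides = {s: "medium" for s in SETTINGS}
        overrides[setting] = "ultra"
        delta = measure_frame_time_ms(overrides) - baseline
        print(f"{setting:12s} +{delta:5.2f} ms per frame")

sweep()
[/code]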

Btw, is the lack of quality boost with ultra more due to developers' wizardry in making lower-quality settings look similar to ultra? Or is ultra just a lazy way to let people show off, with placebo doing the rest? Or is it a mix of both?

Minimal and low settings can often have a massive visual drop-off alongside relatively small performance gains. It's essentially the same diminishing-returns issue, but at the other end of the scale.

It's really the "middle" settings (maybe labeled differently) that are optimal from a visual-quality-to-performance standpoint. Why? They tend to be the optimization target from a design standpoint.

With multiplatform games especially, developers know there are certain hardware targets applicable to the majority of their audience, so from a design/art standpoint they optimize what they make with those in mind. The software side tells them what limitations they have to work with when designing content. Ultimately it's the art design/content that has by far the most visual impact and shapes how one perceives the game's looks. Simply adjusting the fidelity at which that content is displayed (which is what PC settings enable) is pretty limited in how much it can actually improve the game's visual appeal.
 
Maximum settings are often not dumb. In some titles they don't improve much, but I've played games where I set most settings to the highest level and just wanted to play that way.
It depends on the game.
 