Ultra Settings are Dumb

Everyone wants to play the latest titles using Ultra settings, everyone wants to crank stuff up to the max, and buy graphics cards that can do this -- but what we’ve discovered after testing dozens of games over the past few years is that Ultra settings are actually pretty dumb.
https://www.techspot.com/article/2338-ultra-vs-high-settings/
In my experience, most of the time they're a resource hog with no real tangible benefit.

This was less obvious on my GTX 1080 than on my GTX 1060 3GB, but still quite obvious.
 
I begrudgingly agree with Ultra being mostly stupid. However, I do feel there are individual settings (not the generic "ultra" mode) where it can be useful. Examples near and dear to my heart would be shadow and reflection quality sliders. Now, the other nonsense like film grain, chromatic aberration, and motion blur? Turn it all off.
 
I feel pity for deluded gamers who feel they must run Ultra settings regardless of how powerful their PC is. I pretty much never use the graphics presets (low, medium, high, very high, whatever) in PC games. I always go in and adjust individual settings to get consistent performance.

Ultra is just there if you have the hardware to support it. It isn't something that should be enabled by default. It's the "Hey, look, we have an Ultra setting for you PC users, so pay no attention to the fact that the game's graphics were designed with consoles in mind. See how hard it pushes your PC? Yup, PC all the way!"

It'd be more interesting if settings were renamed to something like:
  • low renamed to Potato PC
  • medium renamed to Default
  • high renamed to Only run this if you have really good hardware that is better than a game console.
  • ultra renamed to You're stupid if you run this and you don't have top of the line hardware, but this is proof that we love PC gamers!
:p

Regards,
SB
 
medium renamed to Default

Medium is the console setting these days, at least in the eyes of devs/studios. Anyway, whether Ultra is stupid is in the eye of the beholder. It's just as stupid as '4K' or any visual improvement, really. There are people who want the best image quality, the higher settings, framerates, and sometimes even faster loading times. What some forget is that you don't have to use Ultra settings.
It's good to have a choice ;) Some do appreciate higher-fidelity settings.

ultra renamed to You're stupid if you run this and you don't have top of the line hardware, but this is proof that we love PC gamers!

Maybe, but I wouldn't call someone running Ultra stupid. The love for PC gamers would be more evident in having more than medium settings: good kb/m support, DLSS, better-specced RT implementations, as well as ultrawide screen support, higher resolutions, and solid support for higher framerates without making things look bad.
 
Maybe, but I wouldn't call someone running Ultra stupid. The love for PC gamers would be more evident in having more than medium settings: good kb/m support, DLSS, better-specced RT implementations, as well as ultrawide screen support, higher resolutions, and solid support for higher framerates without making things look bad.

Reread what I wrote. The "stupid" people are the ones running Ultra without hardware capable of running Ultra at the framerate they want, i.e. the ones complaining that they can't run Ultra. :p

Smart person (IMO) - If your machine can't handle Ultra, don't run it at Ultra.
Stupid person (IMO) - If your machine can't handle Ultra, continue to use Ultra settings and complain that Ultra runs badly.

I love developers that put in settings that can't reasonably be run on current-generation hardware (it almost never happens anymore). Crysis is the poster child for this.

I think people who insist on running those settings even though their hardware can't handle them, and then complain about it, are stupid. :p This also applies to high or very high settings. If your hardware can't handle those settings, then don't run at those settings.

Regards,
SB
 
I feel pity for deluded gamers who feel they must run Ultra settings regardless of how powerful their PC is. I pretty much never use the graphics presets (low, medium, high, very high, whatever) in PC games. I always go in and adjust individual settings to get consistent performance.

Ultra is just there if you have the hardware to support it. It isn't something that should be enabled by default. It's the "Hey, look, we have an Ultra setting for you PC users, so pay no attention to the fact that the game's graphics were designed with consoles in mind. See how hard it pushes your PC? Yup, PC all the way!"
I think this type of content highlighting these things is good for PC gamers who don't operate as you describe. I'd like to think that most here, especially long-term PC gamers, adjust things similarly to you.

I get frustrated at my wife, who simply loads up games after installing and starts playing... Who does that?? I try to intercept and check out her settings first.
 
Reread what I wrote. The "stupid" people are the ones running Ultra without hardware capable of running Ultra at the framerate they want, i.e. the ones complaining that they can't run Ultra. :p

Smart person (IMO) - If your machine can't handle Ultra, don't run it at Ultra.
Stupid person (IMO) - If your machine can't handle Ultra, continue to use Ultra settings and complain that Ultra runs badly.

I love developers that put in settings that can't reasonably be run on current-generation hardware (it almost never happens anymore). Crysis is the poster child for this.

I think people who insist on running those settings even though their hardware can't handle them, and then complain about it, are stupid. :p This also applies to high or very high settings. If your hardware can't handle those settings, then don't run at those settings.

Regards,
SB

Yeah, well, if they run Ultra and then complain, I agree. :p I meant that if someone runs Ultra and trades away performance (below 30 fps) for it, that's their choice; it doesn't mean they're stupid. If said person then complains about performance, yeah, it's kinda stupid. :p
While I don't think normal rasterization has reached its limits (I think we're very far from that), ray tracing really murders performance if you want everything maxed, and it seems many titles do go for ray tracing with different settings to choose from.

I get frustrated at my wife, who simply loads up games after installing and starts playing... Who does that?? I try to intercept and check out her settings first.

Lol, I recognize that. :p
 
Sometimes I wish there were fewer options in PC games: just resolution, framerate, field of view, and post-processing options, with the game taking care of the rest...
I don't see why I should set texture quality, shadow quality, or draw distance when the game could check the available memory and run a quick benchmark for shadows (or better, use virtual shadow buffers), and do the same for draw distance.
It feels more like we have plenty of options for the sake of ticking boxes than anything useful.
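Something like that auto-tuning isn't hard to sketch, either. Here's a minimal, hypothetical C++ version of the idea -- benchmark each shadow-quality level against a frame budget and keep the highest one that fits. RenderTestFrame is a stand-in, not any real engine call:

```cpp
// Hypothetical auto-tuning sketch: benchmark each shadow-quality level
// against a frame budget and keep the highest level that still fits.
#include <chrono>
#include <cstdio>
#include <thread>

// Stand-in for rendering a test frame at the given quality level;
// here it just simulates heavier work at higher levels.
static void RenderTestFrame(int quality) {
    std::this_thread::sleep_for(std::chrono::milliseconds(2 * (quality + 1)));
}

// Returns the highest quality level (0 = low .. 3 = ultra) whose
// test frame still fits inside the frame budget.
static int PickShadowQuality(double frameBudgetMs) {
    int best = 0;
    for (int quality = 0; quality <= 3; ++quality) {
        auto start = std::chrono::steady_clock::now();
        RenderTestFrame(quality);
        double ms = std::chrono::duration<double, std::milli>(
                        std::chrono::steady_clock::now() - start).count();
        if (ms <= frameBudgetMs) best = quality;
    }
    return best;
}

int main() {
    // A 16.6 ms budget roughly corresponds to 60 fps.
    std::printf("Chosen shadow quality: %d\n", PickShadowQuality(16.6));
    return 0;
}
```

A real implementation would obviously render representative scenes rather than sleep, but the loop is the whole idea.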
 
Sometimes I wish there were fewer options in PC games: just resolution, framerate, field of view, and post-processing options, with the game taking care of the rest...
I don't see why I should set texture quality, shadow quality, or draw distance when the game could check the available memory and run a quick benchmark for shadows (or better, use virtual shadow buffers), and do the same for draw distance.
It feels more like we have plenty of options for the sake of ticking boxes than anything useful.

At one point I was excited that some games were coming out with hardware auto-detection that would supposedly set graphics options to a level appropriate for that hardware.

However, that soon soured, as whatever settings they assigned to the detected hardware never gave me a satisfying level of performance. So I now basically ignore auto-detected settings and adjust everything manually.

This is especially true since auto-settings almost always leave DoF and motion blur enabled, and I cannot stand either of those in games. They are always eyestrain-inducing, to the point of giving me headaches.

Regards,
SB
 
Something tells me Ultra settings are a concept that's here to stay. I think they help give games that "future tech" vibe precisely because they run too slow on the supposedly amazing hardware of the day. That buzz probably helps sales. Everybody gets excited, even if it is kinda dumb. And in the end, this is primarily a pretty-pixels arms race.

Anyway, it's been discussed ad nauseam even on here. Meh.

Even consoles have Ultra settings now in some cases, because of 4K and RT options that they aren't really powerful enough to run but need to have anyway because of spin.
 
Wow... this thread is depressing. How about we talk about games with the biggest visual difference between high and very high/Ultra?

Crysis (2007) comes to mind; the difference between high and very high is insane!
 
Wow... this thread is depressing. How about we talk about games with the biggest visual difference between high and very high/Ultra?

Crysis (2007) comes to mind; the difference between high and very high is insane!

Isn't that a consequence of the tech maturing as much as anything? The baseline for geometry/lighting/shadows is so much higher than in 2007. Cranking them up just doesn't net you as much anymore, or much at all.

Maybe it changes when we see games actually developed properly for this gen? Ultra reduces light propagation lag? More dynamic and varied foliage? More of the environment reflected?

Is that enough for people to stroke their $3000 PC with loving satisfaction?
 
Isn't that a consequence of the tech maturing as much as anything? The baseline for geometry/lighting/shadows is so much higher than in 2007. Cranking them up just doesn't net you as much anymore, or much at all.

Maybe it changes when we see games actually developed properly for this gen? Ultra reduces light propagation lag? More dynamic and varied foliage? More of the environment reflected?

Is that enough for people to stroke their $3000 PC with loving satisfaction?

I would argue it's more a consequence of hardware and API fragmentation making it difficult to have a consistent baseline across all platforms.
 
Not really sure what you mean with regard to today's hardware landscape?

Take ray tracing as an example...

  • Not every GPU can do it.
  • Nvidia and AMD do it differently.
  • DXR does it differently than OpenGL.
  • XSX likely does it differently than PC-based DXR.
  • PS5's API does it differently than DXR and OpenGL.

That much fragmentation slows down adoption and limits what features games have.

It's the same with mesh shaders and DirectStorage.
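To illustrate just one slice of that: below is a minimal sketch (assuming Windows and the D3D12 SDK) of how a game might query DXR support before even offering RT settings. Every other API on that list needs its own, different version of this check:

```cpp
// Sketch: query DXR (ray tracing) support via D3D12 on Windows.
// Vulkan and the console APIs each need their own, different check,
// which is exactly the fragmentation described above.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    Microsoft::WRL::ComPtr<ID3D12Device> device;
    // nullptr adapter = default GPU.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device available.");
        return 1;
    }
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    bool dxr = SUCCEEDED(device->CheckFeatureSupport(
                   D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))) &&
               opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
    std::puts(dxr ? "DXR supported: RT settings can be offered."
                  : "No DXR: hide or grey out RT settings.");
    return 0;
}
```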
 
Take ray tracing as an example...

  • Not every GPU can do it.
  • Nvidia and AMD do it differently.
  • DXR does it differently than OpenGL.
  • XSX likely does it differently than PC-based DXR.
  • PS5's API does it differently than DXR and OpenGL.

That much fragmentation slows down adoption and limits what features games have.

It's the same with mesh shaders and DirectStorage.

Devs seem to be doing just fine with ray tracing? The Ultra model is all the features, and console/mid-range is just some.

DirectStorage isn't here yet, and mesh-shader-like features will be as soon as we have true XSX/PS5 titles. What does Ultra look like for either of those, though?
 
Devs seem to be doing just fine with ray tracing? The Ultra model is all the features, and console/mid-range is just some.

DirectStorage isn't here yet, and mesh-shader-like features will be as soon as we have true XSX/PS5 titles. What does Ultra look like for either of those, though?

I think you’re right about the baseline simply being higher. IQ improvements don’t jump off the screen as easily as back in 2007. We’re definitely in diminishing returns territory when it comes to the type of graphics rendering that we are accustomed to.

When developers start rendering environments that can only be done with RT, Nanite, better particle systems, etc., then we will feel the difference. But even then, Ultra will probably still be useless.
 
I can't live without volumetrics on high, and I think per-object motion blur does wonders for the fluidity of animations :D
For example, DLSS in Cyberpunk does an excellent job with everything else except volumetrics (in Performance or Balanced mode), where lights look pixelated and out of place due to the lower resolution.
It's very apparent to my eyes... (but I don't really have a choice about using it)
NVIDIA, train your algorithms to smooth volumetrics! (something tells me it's not that simple) :p

But I agree that cranking everything to the max is not an efficient way to use your GPU.
And for some years now, medium/high settings have delivered great image quality.
 