Wow.....this thread is depressing. How about we talk about games that have the biggest visual difference between high and very high/Ultra?
Crysis (2007) comes to mind; the difference between High and Very High is insane!
Isn't that a consequence of the tech maturing as much as anything? The baseline for geometry/lighting/shadows is so much higher than in 2007. Cranking them up just doesn't net you as much anymore, or much at all.
Maybe it changes when we see games actually developed properly for this gen? Ultra reduces light propagation lag? More dynamic and varied foliage? More of the environment reflected?
Is that enough for people to stroke their $3000 PC with loving satisfaction?
The gap between console and PC hardware has also been consistently shrinking each generation as silicon scaling becomes less beneficial and takes longer to achieve.
Sometimes I wish there were fewer options in PC games: just resolution, framerate, field of view, and post-processing, and the game should take care of the rest...
I don't see why I should set texture quality, shadow quality, or draw distance when the game could check available memory and run a quick benchmark for shadows (or better, use virtual shadow buffers), and do the same for draw distance.
It feels more like we have plenty of options just so there are boxes to tick than anything genuinely useful.
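Something like this rough sketch is what I'm picturing. The engine hooks (QueryVideoMemoryBytes, RenderShadowBenchmarkFrame) are made-up stand-ins rather than any real API, just to show the idea of checking VRAM for the texture pool and timing a quick shadow pass for everything else:

```cpp
// Hypothetical sketch of "the game picks quality tiers for you":
// VRAM check for textures, a timed mini-benchmark for shadows.
// The two engine hooks below are stubs for illustration, not a real API.
#include <chrono>
#include <cstdint>
#include <initializer_list>
#include <iostream>
#include <thread>

enum class Quality { Low, Medium, High, Ultra };

// Stub: pretend the driver reported 8 GiB of VRAM.
uint64_t QueryVideoMemoryBytes() { return 8ull << 30; }

// Stub: pretend one benchmark frame takes longer at higher shadow quality.
void RenderShadowBenchmarkFrame(Quality q) {
    std::this_thread::sleep_for(
        std::chrono::milliseconds(1 + static_cast<int>(q)));
}

Quality PickTextureQuality() {
    // More VRAM -> larger texture pool; thresholds are made up for illustration.
    const uint64_t gib = QueryVideoMemoryBytes() >> 30;
    if (gib >= 12) return Quality::Ultra;
    if (gib >= 8)  return Quality::High;
    if (gib >= 4)  return Quality::Medium;
    return Quality::Low;
}

Quality PickShadowQuality(double frameBudgetMs = 8.0) {
    // Try each tier from the top; keep the first one that fits the time budget.
    for (Quality q : {Quality::Ultra, Quality::High, Quality::Medium}) {
        auto t0 = std::chrono::steady_clock::now();
        RenderShadowBenchmarkFrame(q);
        auto t1 = std::chrono::steady_clock::now();
        double ms = std::chrono::duration<double, std::milli>(t1 - t0).count();
        if (ms <= frameBudgetMs) return q;
    }
    return Quality::Low;
}

int main() {
    std::cout << "texture tier: " << static_cast<int>(PickTextureQuality()) << "\n";
    std::cout << "shadow tier:  " << static_cast<int>(PickShadowQuality()) << "\n";
}
```

Obviously a real engine would measure actual GPU frame time instead of sleeping, but that's the whole "auto" setting I wish more games shipped with.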
In my experience, they are usually a resource hog with little to no tangible advantage.
This was less noticeable on my GTX 1080 than it was on my GTX 1060 3GB, but it was still noticeable.
Btw, is the lack of a quality boost with ultra more down to developers' wizardry in making the lower settings look similar to ultra? Or is ultra just the lazy way to let people show off, plus placebo? Or is it a mix of both?