Minecraft would be a great outlier. It would be interesting if someone took the PC version with VRR on and made a comparison video on a specific card: one video with default settings, and another with 20% fewer FLOPS and settings tweaked a bit so the FPS is the same but fidelity is lower. Would the average person notice the difference between the videos, and would the difference be significant?
That's sort of been the challenge of every discussion here with regard to graphical fidelity.
We choose a baseline, say Gears 5, crank it to ultra at 4K60, and say: well, this is all the hardware is capable of, so it can't possibly do more, because neither of these consoles should be able to outperform a 2080 Ti.
Except that consoles have never run ultra settings, ever. Most of the time they run low/medium, and sometimes high. And @DavidGraham put out some good posts showing that most people were completely unable to spot the differences between low, medium, high, and ultra settings, while ultra settings nuked performance by insane amounts for visually marginal increases in fidelity.
If any of this sounds familiar, and it should, then suddenly RT makes more sense, not less. If you're going to take a big performance hit anyway, you may as well take it without having to hack every single scene to make it look right. Without the hacks you can spend your budget elsewhere: games can budget accordingly and prioritize their effort better, and world builders don't need to fiddle with a bunch of things to make sure everything looks 'right'. Instead of high-budget teams being the only ones able to output high-fidelity graphics because they have the resources (money, time, and hardware) to constantly fiddle, re-bake, and redo areas and scenes over and over again to mimic lighting, you may as well just run real lighting, toss all that rework out the door, and focus on something else.
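To make the contrast concrete, here's a rough compile-able sketch (every name and value is invented for illustration, not any engine's actual API). Baked lighting is a cheap lookup into data that has to be regenerated offline whenever artists move a light or a wall; ray-traced lighting answers the visibility question live, so there's nothing to re-bake:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Baked path: runtime cost is one texture fetch, but lightmap[] hides an
// offline bake that must be redone after every lighting or geometry edit.
float bakedLighting(const float* lightmap, int lmWidth, float u, float v) {
    int x = (int)(u * (lmWidth - 1));
    int y = (int)(v * (lmWidth - 1));
    return lightmap[y * lmWidth + x];
}

// Stand-in for a real ray query (a BVH walk or hardware TraceRay): is the
// straight line from p to lightPos blocked? Hard-coded to "no" in this sketch.
bool occluded(Vec3 /*p*/, Vec3 /*lightPos*/) { return false; }

// Ray-traced path: more expensive per pixel, but correct for any scene edit,
// with no per-level hand-tweaking and no re-bake step in the pipeline.
float rayTracedLighting(Vec3 p, Vec3 lightPos, float nDotL) {
    if (occluded(p, lightPos)) return 0.0f;
    return std::max(nDotL, 0.0f);
}
```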
I call this the equalizer: studios with insane budgets and support suddenly won't be as special, because someone like FYQD can come along solo, with some contract help and asset purchases, and make something like Bright Memory Infinite.
But yeah, Minecraft is a big outlier.
It's path traced. I'm not expecting many games this coming gen to have any sort of path tracing, except for stylized games in that vein.
For me, ray tracing is the path I want games to take. Anyone who is a small/solo/indie developer is looking at it and seeing an all-new way to showcase games, and basically all-new puzzle systems, because the physics of light is being simulated... and realizing they can be competitive with the bigger studios on virtually no budget with stylized art. So many indies right now make basic games like Meat Boy, TowerFall, etc., and even making those bitmaps is time-consuming. It takes years to make those types of games.
It would take someone a fraction of the time to put out blocky, ray-traced, low-poly models, and people could still really enjoy the graphics. I'm seriously eyeing UE4 right now. Unity is also a thing. Rolling your own ground-up RT engine is also very much a real possibility now; see the sketch below.
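As a hint of how small "ground up" can be, here's a complete toy CPU ray tracer in plain C++ (scene, names, and constants all invented for illustration, nobody's shipping code): a ball on a floor, one point light, hard shadows via a shadow ray, written out as out.ppm. A real engine adds acceleration structures, materials, and bounces, but the core loop really is this short:

```cpp
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
static Vec add(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static Vec sub(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec mul(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
static double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec norm(Vec a) { return mul(a, 1.0 / std::sqrt(dot(a, a))); }

struct Sphere { Vec c; double r; };
// A ball resting on a huge "floor" sphere -- the entire hand-authored scene.
static const Sphere scene[2] = {{{0, 0, -3}, 1.0}, {{0, -101, -3}, 100.0}};

// Ray-sphere intersection: distance along the ray, or -1 on a miss.
static double hitSphere(const Sphere& s, Vec ro, Vec rd) {
    Vec oc = sub(ro, s.c);
    double b = dot(oc, rd);
    double disc = b * b - (dot(oc, oc) - s.r * s.r);
    if (disc < 0) return -1;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1;
}

// Nearest hit over the whole scene; returns the sphere index or -1.
static int trace(Vec ro, Vec rd, double& t) {
    t = 1e30;
    int id = -1;
    for (int i = 0; i < 2; ++i) {
        double d = hitSphere(scene[i], ro, rd);
        if (d > 0 && d < t) { t = d; id = i; }
    }
    return id;
}

int main() {
    const int W = 256, H = 256;
    const Vec lightP = {2, 2, 0};
    std::FILE* f = std::fopen("out.ppm", "wb");
    if (!f) return 1;
    std::fprintf(f, "P6\n%d %d\n255\n", W, H);
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Pinhole camera at the origin, looking down -z.
            Vec rd = norm({(x - W / 2.0) / W, -(y - H / 2.0) / H, -1});
            double shade = 0.1; // background / ambient term
            double t;
            int id = trace({0, 0, 0}, rd, t);
            if (id >= 0) {
                Vec p = mul(rd, t);
                Vec n = norm(sub(p, scene[id].c));
                Vec toL = sub(lightP, p);
                double distL = std::sqrt(dot(toL, toL));
                Vec l = mul(toL, 1.0 / distL);
                // The shadow ray is just the same trace() aimed at the light:
                // no baked shadow maps, no per-scene tweaking.
                double ts;
                bool blocked =
                    trace(add(p, mul(n, 1e-3)), l, ts) >= 0 && ts < distL;
                if (!blocked) shade = 0.1 + 0.9 * std::fmax(0.0, dot(n, l));
            }
            unsigned char g = (unsigned char)(shade * 255.0);
            unsigned char px[3] = {g, g, g};
            std::fwrite(px, 1, 3, f);
        }
    }
    std::fclose(f);
    return 0;
}
```

That's the whole "engine": one intersection routine reused for both camera rays and shadow rays. Swapping the brute-force loop in trace() for a BVH or a hardware ray query is what scales it up, but the lighting logic itself doesn't change.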