Games can offer both fidelity and performance, and that is what games are doing. Why are publishers making an effort to include a 60fps mode?
It's an absolute mystery.
I really shouldn't have to explain that this isn't how it works. If it were that simple, then every game in the XB1/PS4 generation could have had a 30fps and a 60fps mode as well, but they didn't. It's not like nobody had thought of this ("Whoa, why don't we just also make the game 60fps?!"). It's that you're going to build your game differently if you're targeting 30fps versus having to target 60fps.
We are not just talking fidelity here, either. Again, I shouldn't have to explain that there's more to a game's processing demands than just basic graphics features. Obviously, any game with very low CPU demands can scale graphics and performance more easily.
Please don't waste both of our time here by making me explain how games aren't infinitely scalable, and that a 30fps/33ms frametime target fundamentally lets them do more than a 60fps/16ms target. This isn't some small difference; that's a really big boost in frametime headroom to play with.
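To put rough numbers on that headroom (simple arithmetic, not tied to any particular engine): the per-frame budget is 1000 ms divided by the target framerate, so 1000 / 30 ≈ 33.3 ms at 30fps versus 1000 / 60 ≈ 16.7 ms at 60fps. That's roughly double the time each frame for simulation, AI, physics, and rendering work.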
Disagree. If your observations are meaningful, then observing the choices of thousands and thousands of console gamers will be more meaningful. Seeing the performance/quality choices console gamers are making in games that support them will give a little useful insight into what gamers prefer.
They aren't my observations. This isn't some personal anecdote. We can both observe that console gamers are entirely fine with 30fps gaming. So much so that even today, a 30fps game like Tears of the Kingdom can be considered one of the very best games ever made by critics and gamers alike. At absolutely no point has anybody said that not being 60fps is some mark against the game, or that it doesn't deserve its unabashed praise. This is on top of countless other examples of 30fps games being completely beloved. I keep bringing up TotK because it's a 2023 game, which shows there hasn't actually been any large shift in standards when it comes down to it.
You are proposing something different: some kind of 'measured test' that can say something about what people prefer. But I've spent the last several responses pointing out the many reasons such a test doesn't work and why you can't measure this. I don't want to keep talking in circles here, but I don't know how else to get that across. This isn't measurable. If a test is flawed to such a degree that even tentative conclusions can't be drawn, you discard the test; you don't say, "Well, it's the best we have."
I guess it's gonna sound quite arrogant to say, but you can't 'prove' me wrong here. It'll sound even more arrogant when I say that even most console gamers who say "60fps or bust" are fooling themselves and would still be fine playing 30fps games when it comes down to it. Because it's 100% possible for people to build up an idea of their standards rather than their actual, real standards. Input lag is another area where I think many gamers massively overestimate their sensitivity (though that actually would be testable). I think there's a reason we see most of these sorts of comments coming from online enthusiast communities (basically any online gaming forum where people love to talk about games), which don't represent the average gamer any more than Twitter represents the average voter.