I don't have any single answer.
On PC, I'm usually aiming for a consistent 60fps, but I rarely freak out if I can't get there. I'll take 60fps with drops easily enough in most games, and if the drops are too bad, I'm sometimes okay with 30fps if the game isn't hindered by it too much: controller-based action/adventure games, turn-based RPGs, strategy games, etc.
For console, it again depends on the game, but for the most part I'm pretty tolerant of 30fps. If the 30fps is consistent and the game is good, I'll quickly forget about performance altogether once I'm playing. And if a game is really good, I'll even grudgingly tolerate 30fps with regular drops; not being able to would have meant missing out on some of my all-time favorite games. I do have limits, though. While I can stomach 30fps with drops, I'm not happy when performance frequently sinks to a sustained 20-25fps. I remember having to shelve GTA V on the PS3 after a couple of hours because the super low framerate, combined with the already laggy controls, crossed into 'unplayable' territory for me.
I know some will see all this as having inherently low standards, but I think it's more of a perspective thing. I very much appreciate nice, fast, consistent performance; I even had a 144Hz monitor for a while. But as much as I love learning about and discussing the more technical side of gaming and hardware, I go out of my way to regularly remind myself that gaming should ultimately be about having fun. So I've made deliberate choices to keep my standards from reaching a point where being too picky about technical matters would hinder my ability to enjoy games. That would be a shame. It also keeps things affordable, since I don't need particularly high-end hardware or frequent upgrades.