Exactly that. I'll give you HDR, but for everything else I'm confused why you'd think monitors underperform TVs. Monitors tend to offer lower latency, higher refresh rates, and faster pixel response than TVs, plus support for full RGB. Image processing on TVs is geared toward watching media, while monitors are tuned for performance. There's certainly a case that a quality TV offers a superior movie-watching experience, and it might even win in image quality for games, but its latency is much higher. There are tons of monitors on the market that advertise 1 ms response times, while I don't think I've ever seen a TV under 10 ms. 240 Hz monitors are also common; 120 Hz TVs are only just becoming mainstream.
HDR on monitors (and in the PC space in general) is only just starting to get good, but image quality and latency are already great on monitors right now. And latency is one of the most important things for games.
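To put rough numbers on the refresh-rate side (a minimal sketch; the "refresh + response" sum is a simplification, since a real input-lag chain has more stages than these two, and the response times are the advertised figures from above, not measurements):

```python
# Frame-time arithmetic for the refresh rates mentioned above.
# frame time = 1000 / Hz; adding panel response gives a rough,
# simplified lower bound on display-side delay, not a measured spec.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

# (refresh rate, advertised pixel response): monitor-ish vs TV-ish specs
for hz, response_ms in [(240, 1), (120, 10)]:
    ft = frame_time_ms(hz)
    print(f"{hz} Hz: {ft:.1f} ms per refresh, "
          f"~{ft + response_ms:.1f} ms with pixel response added")
```

That works out to roughly 5 ms for the 240 Hz / 1 ms monitor case versus roughly 18 ms for the 120 Hz / 10 ms TV case, which is the gap I'm talking about.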
Console owners expect games at 30 fps to be fully loaded (as in, you get all the best graphics features maxed out in exchange for the 30 fps), but that's not how it works, imho. I've played plenty of games at 60 or more fps on "ultra" settings.
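The tradeoff people are imagining is really about frame budget, so just to make the numbers explicit (trivial arithmetic, nothing assumed beyond the fps targets above):

```python
# Frame budget: the time the GPU gets to render one frame at a target rate.
budget_30 = 1000 / 30   # ~33.3 ms per frame
budget_60 = 1000 / 60   # ~16.7 ms per frame
print(f"30 fps budget: {budget_30:.1f} ms, 60 fps budget: {budget_60:.1f} ms")
print(f"60 fps leaves {budget_60 / budget_30:.0%} of the render time per frame")
```

Ultra at 60 fps just means the hardware does the same work in half the time, which strong PC hardware can, so "max settings implies 30 fps" isn't a law of nature.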
Years ago I remember how we thought 720p or 1080p with 8x AA would be the solution to everything and there would be no need for anything else. But that's not the case. Even with 8x MSAA plus TAA, and FXAA forced on via the nVidia control panel, a game rendered at 1080p still has jaggies that could cut diamond if you run it on a 1440p or 4K monitor.
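Part of the reason is the scale factor (a quick sketch, pure arithmetic, nothing vendor-specific): 1080p on a 1440p panel is a non-integer ratio, so each rendered pixel gets smeared across a fractional number of physical pixels, which blurs edges and makes leftover aliasing more visible.

```python
# Scale factors when a 1080p render is displayed on higher-resolution panels.
# Non-integer ratios mean each source pixel maps to a fractional number of
# physical pixels; 4K at least divides evenly into 1080p.
render_h = 1080
for panel_h in (1440, 2160):  # 1440p and 4K panel heights
    scale = panel_h / render_h
    kind = "integer" if scale.is_integer() else "non-integer"
    print(f"1080p on {panel_h}p: {scale:.3f}x ({kind} scaling)")
```

That prints 1.333x (non-integer) for 1440p versus an even 2.000x for 4K, which is why 1080p tends to look worst of all on a 1440p screen.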
But on consoles, people want games fully loaded.