Yes, what people seem to mix up here is the assumption that if you take a 60fps game and scale it down to 30fps, the extra frame time would automatically be put to use, maxing out the system. That's not the case: much of the CPU would sit idle, because the underlying game systems and datasets were designed with 16ms frame times in mind.
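To make the frame-budget arithmetic concrete, here's a minimal sketch (the 14ms per-frame workload is a made-up illustrative number, not from any real game): a CPU cost sized to fit the 60fps budget fills only a fraction of the 30fps budget.

```python
def frame_budget_ms(fps: float) -> float:
    """Time available per frame at a given target framerate."""
    return 1000.0 / fps

# Hypothetical per-frame CPU cost, tuned to fit under the 60fps budget.
work_ms = 14.0

for fps in (60, 30):
    budget = frame_budget_ms(fps)
    utilization = work_ms / budget
    print(f"{fps}fps: budget {budget:.1f}ms, CPU busy {utilization:.0%}")
```

Dropping to 30fps doubles the budget to ~33ms, but the workload stays at 14ms, so CPU utilization falls from roughly 84% to roughly 42% instead of the system being "maxed out".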
So, such a scaled-back game would indeed look somewhat better at 30fps than at 60fps, but it would still be constrained by the trade-offs that were made to hit the higher framerate in the first place. It would therefore always be inferior to a game designed to run at 30fps from the start, while losing its only real advantage (speed) as well. Personally, this choice makes no sense to me.