I just don't get it and hope for a better explanation:
Most PC games offer a 30fps/60fps option. Thus, technically, this can't be a big problem.
Of course, you'd need to start with a 60fps console game and cap it at 30fps. This should give you headroom on the GPU and thus allow better graphics (and although Sebbi said that upping GFX can/may have an impact on CPU usage as well, I still think there should be room for noticeable(!) improvement, e.g. switching from adaptive resolution to a fixed, higher resolution).
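The headroom argument can be put in numbers. Here's a minimal sketch with made-up figures, assuming GPU cost scales roughly linearly with pixel count (a simplification; function name and the 22 ms example are my own, not from any real game):

```python
# Hypothetical illustration of the 30fps-headroom argument.
# Assumption: GPU frame time scales ~linearly with rendered pixel count.

def max_resolution_scale(gpu_ms_at_native, target_fps):
    """Largest fraction of native pixels that fits the GPU
    frame budget for the given target frame rate."""
    budget_ms = 1000.0 / target_fps
    return min(1.0, budget_ms / gpu_ms_at_native)

# A scene costing 22 ms of GPU time at native resolution:
# the 60fps budget (16.7 ms) forces a drop to ~76% of native pixels,
# while the 30fps budget (33.3 ms) holds full native resolution.
print(max_resolution_scale(22.0, 60))
print(max_resolution_scale(22.0, 30))
```

So the same GPU that needs adaptive resolution at 60fps can sit at fixed native resolution at 30fps, which is the whole point.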
It can't cost too much money, either. We all moan that no money is invested in the PC versions of games, yet most of them still feature fps options... so it can't be an expensive/difficult feature.
(It seems it's mostly console-only devs who have problems supporting different frame rates on PC... like the infamous Dark Souls 1 port.)
Console games are all CPU limited? Fine, but then notice that of two consoles with nearly the same CPU, the one with the better GPU often/mostly renders at a higher resolution!? So being CPU limited doesn't seem to be a problem at all (on the contrary, imo!).
This directly supports the idea that it should be possible to turn a 60fps game with dynamic resolution into a 30fps variant with a steady, higher resolution.
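For what it's worth, dynamic resolution is typically just a per-frame feedback loop on GPU time, so "lock to 30fps" mostly means doubling the budget the loop targets. A naive sketch (the function, step size, and clamp values are all invented for illustration):

```python
# Naive dynamic-resolution controller sketch (hypothetical parameters).

def update_scale(scale, gpu_ms_last_frame, target_fps,
                 step=0.05, min_scale=0.6, max_scale=1.0):
    """One controller step: shrink the render-resolution scale when
    the last frame blew the GPU budget, grow it when there is slack."""
    budget_ms = 1000.0 / target_fps
    if gpu_ms_last_frame > budget_ms:
        scale -= step
    elif gpu_ms_last_frame < 0.9 * budget_ms:
        scale += step
    return max(min_scale, min(max_scale, scale))
```

With a 60fps target the budget is 16.7 ms and a 20 ms frame pushes the scale down; with a 30fps target the budget is 33.3 ms, the same frame has slack, and the controller just sits pinned at max scale, i.e. the "steady higher resolution" variant.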
This is exactly what ThePissartist suggested, iirc, and he got brutally attacked in this thread, up to accusations of whatever being bent over something something...
It makes sense to me that this is possible and would be a very good option for future games (design for 60fps with adaptive resolution, add an option to lock to 30fps with higher GFX). Other than people shouting 'not possible' from the rooftops and 'H5 looks great, what's the problem?', no one has offered a real argument except myself (of course!): game devs just don't care to give console gamers an/this option.
Is the QA fundamentally more expensive for such games, and thus a nightmare wrt time management and money? If so, how/why did ND manage to include it in their remaster?!