Because the LG FreeSync one, for example, doesn't seem to have a full-scene frame buffer on board.
Why do you say that?
I would hardly call an 'enhanced premium' chip a "normal" part of monitor design.
If you have HDMI in your monitor, it's likely it has memory (because scaling and de-interlacing are required on progressive-scan monitors). Low-quality scaling and de-interlacing are possible without memory - emphasis on "low quality". I merely linked a document that showed memory is a real thing inside monitors.
Scalers are normally part of monitors. Those that are memory-based seem to add something like 10-20ms of latency. The majority of monitors with tested lag results show approximately this value of lag, implying they're memory-based. The reference for latency is usually a CRT.
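As a sanity check on that figure (a back-of-the-envelope sketch, not something from the reviews): buffering one full frame costs roughly one refresh interval, and at a typical 60Hz input that's about 16.7ms, right in the same ballpark as the 10-20ms measured for memory-based scalers.

```python
# Sketch: a scaler that buffers one full frame adds roughly one
# refresh interval of latency. Frame time in ms = 1000 / refresh rate.
def frame_time_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 120, 144):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per buffered frame")
# 60 Hz -> 16.7 ms, 120 Hz -> 8.3 ms, 144 Hz -> 6.9 ms
```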
Screens like the Dell P2714H
http://www.tftcentral.co.uk/reviews/dell_p2714h.htm
don't seem to have a framebuffer-based scaler (or they have a super-fast framebuffer scaling algorithm).
The G-Sync-based Acer XB270HU obviously does have a framebuffer, but we don't know whether that's used for scaling. With G-Sync off it has very low lag:
http://www.tftcentral.co.uk/reviews/acer_xb270hu.htm
But the lag increases, "a few milliseconds at most", when G-Sync is turned on. That could imply that it doesn't use the framebuffer for scaling, that it has a super-fast scaling algorithm, or that the scaler is only active at non-native resolutions (most monitors have the scaler permanently active).
Tech Report showed that lag is the same on the G-Sync and the Adaptive Sync monitors it compared. That might indicate that the Adaptive Sync monitor has no scaler (very doubtful), that it has a framebuffer-less scaler, that it falls back to framebuffer-less scaling when Adaptive Sync is turned on (to reduce lag), or that the scaler is disengaged entirely (again, to reduce lag). Can't tell. If the latter, then it would seem that a whole load of monitor functionality is turned off while in Adaptive Sync mode, presumably by bypassing the components that do things like scaling and response-time compensation. The chip I linked earlier does RTC, so it's likely that there are other cases where scaling and RTC are within a single chip.
Complete guess: BenQ (and the other monitor makers?) chose to do "gaming mode with Adaptive Sync" by bypassing the chip that does framebuffer scaling and RTC (or the bit of the chip that does those things?). If they're all using the same chip with Adaptive Sync support and that's the way that chip works, then whoops.
Apart from that, their thought process seems to be "we can't do variable overdrive RTC, which is the ideal approach for framerates between 40 and 144fps, so we'll just turn it off. At 144Hz, RTC isn't that important." Of course, at lower frame rates RTC becomes more important, as ghosting becomes more obvious.
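A rough illustration of why ghosting gets more obvious at low frame rates (the 1000 px/s object speed is just an assumption for the sake of the arithmetic): the per-frame motion step grows as the frame rate drops, so the ghost of the previous frame is displaced further and easier to see.

```python
# How far an object moves between frames, i.e. how far the previous
# frame's ghost trails behind it. speed_px_s is a hypothetical value.
def step_px(speed_px_s: float, refresh_hz: float) -> float:
    return speed_px_s / refresh_hz

for hz in (144, 60, 40):
    print(f"{hz:>3} Hz: ghost offset ~{step_px(1000, hz):.0f} px")
# 144 Hz -> ~7 px, 60 Hz -> ~17 px, 40 Hz -> 25 px
```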
It's pretty stupid to turn off RTC just because Adaptive Sync is on. Maybe the chip works this way, so the monitor makers are forced to use this compromise?
And citing yet-to-come products with projected premium price tags as examples doesn't really help it either.
I listed those simply to demonstrate that non-G-Sync monitors have memory.
When did PiP start to become a "normal" feature of PC monitors?
Some time after 2007?
http://www.hdtvsolutions.com/BenQ_FP241WZ_LCD_Monitor_Review.htm
And I still don't see any kind of self-refresh functionality in the features list.
Neither do I. I merely suggested that, since memory is frequently a part of monitors, it is possible.