This isn't much of an emulation. The addresses formerly mapped to the eSRAM are mapped to the GDDR5 in Scorpio. That's all. I wouldn't know what a hardware solution for this should do. It's naturally something MS will do in the firmware/OS of Scorpio.
Perhaps Microsoft found there weren't many situations where the eSRAM was being used in a DRAM-unfriendly way, or they're counting on the larger number of channels and the depth of their buffers to absorb it. If the hardware isn't provisioned to do so, it would be a complex undertaking to reorder commands or dynamically patch a particularly bad access sequence into a less pathological one.
If a resource were marked as read-only, perhaps an optimization at the system level would be to elide copy operations and just alias?
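A minimal sketch of what that system-level elision might look like, assuming a hypothetical resource-staging layer (all names here are illustrative, not any real console API). The idea is simply that with everything backed by the same GDDR5, a "copy into eSRAM-mapped addresses" of a read-only resource can be replaced by an alias of the original memory:

```cpp
#include <cstdint>
#include <cstring>
#include <vector>

// Hypothetical resource wrapper: with a unified address space, a read-only
// "copy" into the former eSRAM region can be elided and aliased instead.
struct Resource {
    const uint8_t* data;   // points at backing storage
    size_t         size;
    bool           owned;  // false when aliasing another resource's memory
};

Resource stage_resource(const std::vector<uint8_t>& src, bool read_only) {
    if (read_only) {
        // Elide the copy: both address ranges resolve to the same GDDR5 anyway.
        return Resource{src.data(), src.size(), /*owned=*/false};
    }
    // Writable resources still get a private copy to keep semantics intact.
    uint8_t* copy = new uint8_t[src.size()];
    std::memcpy(copy, src.data(), src.size());
    return Resource{copy, src.size(), /*owned=*/true};
}
```

The catch, of course, is that the system has to trust the read-only marking; anything that writes through an alias breaks the assumption.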
What interests me in Scorpio's case is whether Durango drew a strong distinction for CPU accesses to the eSRAM. There was a disparity, but it was never made clear what hoops the CPU had to jump through. For most software it wouldn't matter if Scorpio dropped that hardware-created distinction. However, when platform-level assumptions like "this hardware surely can't access X in this way" are challenged, I'm curious whether buggy or perhaps malicious code could have some fun.
These days I'd consider code that breaks on timing changes (to mostly variable-latency operations in the first place!) a bug, or at least badly programmed. We aren't in the old console days anymore, when everything had a fixed latency and you could count clock cycles to the next line drawn on screen. You have to do proper synchronization anyway, which means games are less likely to break (in my opinion they shouldn't break at all).
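To make the distinction concrete, here is a toy sketch (illustrative, not any platform's actual API): instead of assuming "the result is ready after N cycles", which is exactly what breaks when latencies change between hardware revisions, the consumer waits on an explicit fence the producer signals:

```cpp
#include <atomic>
#include <thread>

// Proper synchronization: the consumer waits on a fence rather than counting
// cycles, so it is correct on any hardware timing.
std::atomic<bool> fence{false};
int result = 0;

void producer() {
    result = 42;                                   // do the work...
    fence.store(true, std::memory_order_release);  // ...then signal completion
}

int consumer() {
    // Spin (or yield) until the fence is signaled; no timing assumptions.
    while (!fence.load(std::memory_order_acquire))
        std::this_thread::yield();
    return result;
}
```

Code written this way doesn't care whether the producer got faster or slower between console revisions.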
One consideration is that even if the games themselves are coded against race/timing conditions, the layers below them are also subject to those conditions: kernel functions, firmware, or the hardware itself. Assumptions the games don't make explicitly might be wrapped up in the functions they assume just work.
Perhaps one area that wouldn't be publicly discussed for the Pro is what system methods or functions are deprecated or de-prioritized in Pro mode. Those might be OS hooks or system blocks that were only validated for the BC mode. Boost mode might be the result of them finally being validated, but it could also put a layer of hacks/hangs in place for when they don't work.
Naughty Dog's job system mostly uses atomic counters, falling back to OS lock routines only when things get potentially too complex or when the OS needs to do things like invert priorities. It's a leap of faith that the system functions, or the hardware they plug into, still behave when the design changes.
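A hedged sketch of the atomic-counter pattern (roughly as described for Naughty Dog's fiber-based job system at GDC 2015; the types and names below are my own, and real workers would switch fibers rather than yield): a batch of jobs shares a counter, each job decrements it on completion, and the waiter spins until it hits zero, with no OS lock on the fast path:

```cpp
#include <atomic>
#include <functional>
#include <thread>
#include <vector>

// Shared counter for a batch of jobs; the waiter blocks on it reaching zero.
struct JobCounter {
    std::atomic<int> remaining;
    explicit JobCounter(int n) : remaining(n) {}
    void signal_done() { remaining.fetch_sub(1, std::memory_order_acq_rel); }
    void wait() {
        while (remaining.load(std::memory_order_acquire) > 0)
            std::this_thread::yield();  // a real system would switch fibers here
    }
};

void run_jobs(std::vector<std::function<void()>>& jobs, JobCounter& counter) {
    std::vector<std::thread> workers;
    for (auto& job : jobs)
        workers.emplace_back([&counter, &job] {
            job();
            counter.signal_done();  // each job decrements on completion
        });
    counter.wait();                 // returns only when every job has finished
    for (auto& w : workers) w.join();
}
```

The point of the design is that the common case never touches the kernel, which is exactly why it quietly depends on the OS fallback paths still behaving as expected.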
An old example I can't find the link for was id Software's experience developing a game on the PS3 (Rage?) where HDD performance tanked on specific console revisions.
For a more modern example of foundational assumptions breaking, we can look to the PS4 Pro apparently having problems withstanding the assault of performance bruisers like Stardew Valley...
http://www.playstationlifestyle.net...pring-says-stardew-valley-developer/#/slide/1
It may well be coincidence that the supposed fix for that problem is in the same firmware that introduces boost mode, though I'd expect getting the PS4 Pro to not fail at being a PS4 Pro to come before handling the hybrid case.