It *does* seem plausible that a game gets uninterrupted access to the resources it uses, though. Nobody likes frame drops caused by a background app fetching data. The HDD has more capacity, so perhaps devs would prefer it over the somewhat limiting 8GB flash for game-data storage. A game already has a lot of memory available, so how much performance could really be gained by using the flash as additional storage?
I agree that it seems like a Bad Idea(tm) to ever have the OS or some app steal hardware resources away from a game in the middle of play. Trouble is, it seems like MS has already committed to something like that happening. I'm speaking of that Game DVR feature. If you're continuously recording, you must be accessing the HDD. Unless you've got the game recording to RAM, which is of course an even more precious resource than the hard disk. You certainly can't use the flash as a video buffer, since the continuous writes would eat it alive.
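To put a rough number on "eat it alive," here's a back-of-envelope wear estimate. Every figure in it is an assumption on my part (recording bitrate, flash capacity, and P/E endurance are guesses, not known specs for this console):

```python
# Back-of-envelope flash wear estimate for a continuous game-DVR buffer.
# All inputs are assumed values, not actual console specs.

BITRATE_MBPS = 15    # assumed 720p recording bitrate, in megabits/s
FLASH_GB = 8         # assumed flash capacity
PE_CYCLES = 3000     # assumed program/erase endurance per cell

bytes_per_day = BITRATE_MBPS / 8 * 1e6 * 86400        # bytes written per day
drive_writes_per_day = bytes_per_day / (FLASH_GB * 1e9)
days_to_wear_out = PE_CYCLES / drive_writes_per_day   # assumes ideal wear leveling

print(f"~{bytes_per_day / 1e9:.0f} GB/day written, "
      f"{drive_writes_per_day:.1f} full-drive writes/day, "
      f"flash dead in ~{days_to_wear_out:.0f} days")
```

Even with perfect wear leveling, those assumed numbers give you roughly twenty full-drive writes per day and a dead flash chip in under half a year, which is why it seems like a non-starter.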
Maybe the video gets written to a buffer over in the OS side of the memory. That would leave less memory for keeping non-running apps resident, which in turn makes keeping those apps "installed" in the flash a better option. That might make sense.
As for the "how much can a small flash cache gain you in terms of apparent HDD performance" question, I have no idea. It seems like it might help if developers could isolate "troublesome" data and move it into flash at install time. For instance, they might notice that the game hitches every time the player leaves the forest and takes in a glorious mountain vista. They could then tag those mountain textures as ones that need to be installed to flash. Would that help much? I dunno.
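For whatever it's worth, here's a sketch of what that dev-facing tagging might look like as an install manifest. Every name, size, and budget in it is invented by me; nothing like this is confirmed to exist in any actual SDK:

```python
# Hypothetical sketch: a game's install manifest pins latency-sensitive
# assets to flash, spilling to HDD once a per-game flash budget is used up.
# All names, sizes, and the budget are made up for illustration.

FLASH_BUDGET_BYTES = 2 * 1024**3   # assumed per-game flash allowance: 2 GB

# (asset path, size in bytes, tagged-for-flash?)
assets = [
    ("textures/mountain_vista.pak", 900 * 1024**2, True),   # known hitch point
    ("audio/ambient_forest.pak",    300 * 1024**2, False),
    ("textures/town_lods.pak",      700 * 1024**2, True),
]

def plan_install(assets, flash_budget):
    """Split assets between flash and HDD, falling back to HDD over budget."""
    to_flash, to_hdd, used = [], [], 0
    for path, size, wants_flash in assets:
        if wants_flash and used + size <= flash_budget:
            to_flash.append(path)
            used += size
        else:
            to_hdd.append(path)
    return to_flash, to_hdd

flash, hdd = plan_install(assets, FLASH_BUDGET_BYTES)
print("flash:", flash)
print("hdd:  ", hdd)
```

The appeal of doing it at install time is that the writes happen once, so the wear problem from the DVR scenario mostly goes away, as long as you aren't reinstalling constantly.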
Probably doesn't matter, since it would kill the flash if you switched games too often.