Considering the games operate under the assumption that those textures are already in memory when they need them, I'd be very surprised if any amount of technical wizardry could make this happen transparently for older games without directly rewriting how textures are managed in those games.
On PC, drivers do have some latitude in deciding whether or not to keep unused textures in memory (an 11 GB graphics card using 8 or 9 GB isn't an indication that the game actually needs those 8 or 9 GB of data resident, for example), so that if the GPU driver sees a request for that memory, it doesn't have to go and fetch the data from PC storage.
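To make that "keep it resident unless something else needs the space" idea concrete, here's a rough sketch of an LRU eviction policy over a fixed memory budget. The class, the texture names, the sizes, and the budget are all invented for illustration; real drivers are obviously far more sophisticated than this.

```cpp
// Toy sketch of the kind of policy a driver could apply: textures stay
// resident until a memory budget is exceeded, then the least recently
// used ones are evicted first. Everything here (names, sizes, budget)
// is made up for illustration.
#include <cstdint>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

class ResidencyCache {
public:
    explicit ResidencyCache(uint64_t budgetBytes) : budget_(budgetBytes) {}

    // Mark a texture as used; evict least-recently-used entries if over budget.
    void Touch(const std::string& name, uint64_t sizeBytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {            // already resident: refresh its position
            used_ -= it->second->second;
            lru_.erase(it->second);
        }
        used_ += sizeBytes;
        lru_.push_front({name, sizeBytes});
        index_[name] = lru_.begin();
        while (used_ > budget_ && lru_.size() > 1) {
            const auto& victim = lru_.back();  // least recently used texture
            std::cout << "evict " << victim.first << "\n";
            used_ -= victim.second;
            index_.erase(victim.first);
            lru_.pop_back();
        }
    }

private:
    uint64_t budget_;
    uint64_t used_ = 0;
    std::list<std::pair<std::string, uint64_t>> lru_;  // front = most recently used
    std::unordered_map<std::string,
        std::list<std::pair<std::string, uint64_t>>::iterator> index_;
};

int main() {
    ResidencyCache cache(6ull << 30);          // pretend the budget is 6 GiB
    cache.Touch("rock_albedo",   2ull << 30);
    cache.Touch("terrain_atlas", 3ull << 30);
    cache.Touch("sky_cubemap",   2ull << 30);  // over budget: rock_albedo gets evicted
}
```

The point is just that "VRAM in use" and "VRAM actually needed right now" are different numbers, and the driver can reclaim the gap between them when something else asks for memory.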
The best I could see is somehow detecting exactly what part of a texture is needed (without the game telling the system which part it's likely to use) and loading that part into memory first. But I'm not sure that's a win, or worth the effort, since you'd end up loading the entire texture anyway because you don't know if, or more importantly how, the game plans on using it.
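As a rough picture of what "loading only the part you expect to need first" could mean, one reading is streaming a texture's small mip levels before the large ones, something along these lines. The 4-bytes-per-texel format, the resolution, and the guessed mip level are all placeholder assumptions for the sketch, not anything a real engine or driver necessarily does.

```cpp
// Rough sketch of "load the part you expect to need first": stream a
// texture's small mip levels before its large ones and stop at a guessed
// level of detail. The 4-bytes-per-texel assumption, the resolution and
// the guessed level are placeholders chosen for illustration.
#include <cstdint>
#include <iostream>
#include <vector>

struct MipLevel {
    uint32_t width, height;
    uint64_t bytes;
};

std::vector<MipLevel> BuildMipChain(uint32_t w, uint32_t h) {
    std::vector<MipLevel> chain;
    while (w > 0 && h > 0) {
        chain.push_back({w, h, uint64_t(w) * h * 4});  // assume 4 bytes per texel
        w /= 2;
        h /= 2;
    }
    return chain;
}

int main() {
    auto chain = BuildMipChain(4096, 4096);
    uint64_t total = 0;
    for (const auto& m : chain) total += m.bytes;

    int guessLevel = 2;                        // pretend we detected 1024x1024 is enough
    uint64_t loaded = 0;
    // Walk from the smallest mip up to the guessed level, leaving the big
    // top-level mips in storage unless we later learn they're needed.
    for (int i = int(chain.size()) - 1; i >= guessLevel; --i) {
        loaded += chain[i].bytes;
        std::cout << "load mip " << i << " (" << chain[i].width << "x"
                  << chain[i].height << ")\n";
    }
    std::cout << "resident: " << loaded / (1024 * 1024) << " MiB of "
              << total / (1024 * 1024) << " MiB full chain\n";
}
```

Which just restates the problem above: without a hint from the game, that guessed level is only a guess, and if it's wrong you end up paying for the full chain anyway.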
Regards,
SB