ShootMyMonkey
Veteran
Been there. We usually avoid it by just having textures layer on top of each other, emulating a few Photoshop blend modes in the process to cover things up. The only thing is you can't really do that too much on characters because they're so dense with geometry and material parameters to begin with. That's why I found it odd how localized things looked, as if they were layered on, and thought "damn, how many textures is he throwing on there?" even with the realtime model.

They are both using the same textures, however, not the same material. The material in the render is specific to mental ray, and the in-game material is a proprietary skin shader. This will definitely make it look a little different.
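For reference, the Photoshop-style blend modes mentioned above are simple per-channel formulas, which is why they're cheap to emulate in a shader when layering textures. A minimal sketch (values assumed to be floats in [0, 1]; function names are illustrative, not from any particular engine):

```python
def multiply(base, layer):
    # Darkens: result is never brighter than either input.
    return base * layer

def screen(base, layer):
    # Brightens: the inverse of multiply on inverted inputs.
    return 1.0 - (1.0 - base) * (1.0 - layer)

def overlay(base, layer):
    # Multiplies in the shadows, screens in the highlights.
    if base < 0.5:
        return 2.0 * base * layer
    return 1.0 - 2.0 * (1.0 - base) * (1.0 - layer)
```

In a real shader these run per channel on the texture samples, with the layer's alpha used to lerp between base and blended result.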
There is also texture compression, as you stated, which kinda mushes a lot of the detail together, resulting in less variation and an almost "flatter" look. It sucks, but it's games... something we all have to deal with to an extent.
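The "flatter" look from compression has a concrete cause: BC1/DXT-style block compression stores only two endpoint colors per 4x4 block plus 2-bit indices, so every texel snaps to one of four values along a single line in color space. A rough sketch of the idea on a grayscale block (simplified endpoint choice; real encoders search for better endpoints):

```python
def compress_block(texels):
    """Quantize a block of grayscale texels to a BC1-style 4-entry palette."""
    lo, hi = min(texels), max(texels)
    # Two stored endpoints plus two interpolated values between them.
    palette = [lo, lo + (hi - lo) / 3.0, lo + 2.0 * (hi - lo) / 3.0, hi]
    # Each texel snaps to the nearest palette entry (its 2-bit index).
    return [min(palette, key=lambda p: abs(p - t)) for t in texels]

# Sixteen distinct values in, at most four distinct values out.
block = [0.10, 0.12, 0.35, 0.40, 0.41, 0.55, 0.60, 0.62,
         0.63, 0.70, 0.72, 0.80, 0.82, 0.90, 0.95, 1.00]
flattened = compress_block(block)
```

Subtle per-texel variation inside a block collapses onto those four levels, which is exactly the loss of variation described above.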
I've often found that among the hardest things to do, though, is just making things accessible to artists on both ends. Things like PTMs (polynomial texture maps) are not efficiently authorable or modifiable without a lot of work and a long testing cycle for minor tweaks. Conversely, things that make artists' lives a lot easier, like bidirectional constraining, are harder to implement in practice without creating a mess of problems and affecting a million systems at once.
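The PTM authoring problem is easy to see from the model itself: each texel stores six fitted coefficients, and the lit value is a biquadratic polynomial in the projected light direction. A sketch of the standard PTM evaluation (coefficient values below are hypothetical, just to show the shape of the data):

```python
def eval_ptm(coeffs, lu, lv):
    """Evaluate one PTM texel for projected light direction (lu, lv)."""
    a0, a1, a2, a3, a4, a5 = coeffs
    return a0 * lu * lu + a1 * lv * lv + a2 * lu * lv + a3 * lu + a4 * lv + a5

# Hypothetical fitted coefficients for a single texel. Tweaking the look
# means re-fitting all six per texel from a set of captured/rendered
# samples, not painting a value an artist can eyeball in Photoshop.
texel = (0.1, 0.1, 0.0, 0.3, 0.3, 0.2)
lit = eval_ptm(texel, 0.5, 0.5)
```

With light straight on (lu = lv = 0) only the constant term a5 survives, so even the "base color" is tangled up with the fit rather than living in an editable map, which is why minor tweaks mean a long round-trip.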