As for the OP: yes, but I wager not anytime soon.
Is it ever going to be good for eye health to focus on screens inches from the eye for hours at a time?
The best system I can think of would work something like this:
1. two curved panels sit in front of your eyes, covering most of your field of vision, curved so that pixels look the same size no matter where your eye moves.
2. the display tech of whatever device is sending the image feed knows about your eyes and pre-distorts its image so that any refractive error is compensated for. My guess is that this should be possible in theory at least: your glasses refract light in a way that compensates for your eyes' problems, so why not make display panels that send light out in the corrected way?
3. the VR glasses come with a camera for each eye that you look at the world through, and the feed of these cameras can be mixed with or modified by the computer-generated graphics. Crucially, this feed is also digitally corrected for your particular eyesight.
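Point 1 has a neat property worth a back-of-the-envelope sketch: if the panel is a section of a sphere centered on the eye's rotation point, every pixel subtends the same angle regardless of gaze direction, so the required pixel pitch is just the radius times the angle per pixel. All the numbers below are my own assumptions, not specs of any real headset:

```python
import math

def curved_panel_specs(radius_m: float, pixels_per_degree: float, fov_deg: float):
    """For a hypothetical spherical panel centered on the eye's rotation
    point, every pixel subtends the same angle wherever the eye looks.
    Returns (pixel pitch in metres, pixel count across the FOV)."""
    angle_per_pixel_rad = math.radians(1.0 / pixels_per_degree)
    pitch_m = radius_m * angle_per_pixel_rad        # arc length of one pixel
    pixels_across = fov_deg * pixels_per_degree     # uniform angular sampling
    return pitch_m, pixels_across

# Assumed numbers: panel 50 mm from the eye's centre of rotation,
# 60 pixels/degree (roughly the eye's resolving limit), 180 degree FOV.
pitch, n = curved_panel_specs(0.050, 60, 180)
print(f"pixel pitch ~ {pitch * 1e6:.1f} um, {n:.0f} pixels across")
```

So even under these rough assumptions you'd need a roughly 14.5 µm pitch and over ten thousand pixels per row per eye, which hints at why I don't expect this to be cheap soon.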
I don't know how easy it would be for these panels to show a stereo image that your eyes can deal with properly, but I reckon it should be possible. To be honest, I've never heard of any display technology that does something like this, but I can't immediately think of why it would be impossible.
On the other hand, these days it's fairly easy to quickly and automatically determine your eyesight deficiencies, so this part could be handled automatically, if you'd rather not type the correction values into the device's settings by hand.
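Typing the values in wouldn't even need to be complicated; a prescription is just a handful of numbers per eye. Here's a hypothetical sketch of what such a settings entry might look like, plus the standard spectacle-magnification formula M = 1/(1 − d·F) a renderer could use to keep the two eyes' image sizes matched. The class and field names are mine, not from any real device:

```python
from dataclasses import dataclass

@dataclass
class EyeCorrection:
    sphere_d: float      # spherical power in diopters, e.g. -4.0 for myopia
    cylinder_d: float    # astigmatism power in diopters
    axis_deg: float      # astigmatism axis, 0-180 degrees

def spectacle_magnification(sphere_d: float, vertex_m: float = 0.012) -> float:
    """Classic spectacle-magnification approximation M = 1 / (1 - d * F):
    how much a lens of power F (diopters) worn at vertex distance d (metres)
    scales the retinal image. A renderer could apply the inverse scale so
    both eyes see same-sized images."""
    return 1.0 / (1.0 - vertex_m * sphere_d)

left = EyeCorrection(sphere_d=-4.0, cylinder_d=-0.5, axis_deg=90.0)
m = spectacle_magnification(left.sphere_d)
print(f"a -4 D lens at 12 mm scales the image to {m:.3f}x")
```

The point is just that the correction data is tiny; the hard part is the optics and display hardware, not the settings screen.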
But I don't expect something like this to be ready for affordable mass production anytime soon. And the fact that the army says they expect this to be usable next year is generally an indication that consumer-grade, affordable stuff is still a ways off.