The Nyquist rate tells you the minimum sampling frequency at which you can, in principle, reconstruct the original signal. However, a signal sampled at the Nyquist rate is not itself identical to the original signal - it's still just a set of samples until you run it through a reconstruction filter. So the Nyquist rate alone doesn't answer the question "at what resolution does aliasing cease to be a problem?"
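Here's a rough numerical sketch of the two separate points above, with made-up numbers: (1) sampling below the Nyquist rate aliases one frequency onto another, and (2) even above the Nyquist rate, the raw samples (what a naive "one sample = one pixel" display shows) are not the original signal until reconstructed.

```python
import numpy as np

fs = 8.0                # Hz, sampling rate -> Nyquist frequency is 4 Hz
n = np.arange(32)       # sample indices
t = n / fs

# (1) Aliasing: a 7 Hz cosine is above the 4 Hz Nyquist frequency, so its
# samples are indistinguishable from those of a 1 Hz cosine.
print(np.allclose(np.cos(2 * np.pi * 7 * t),
                  np.cos(2 * np.pi * 1 * t)))   # True

# (2) A 3 Hz sine is below Nyquist, so it is reconstructable in principle,
# but the sample values themselves, held flat across each sampling interval
# (like square pixels), still differ noticeably from the continuous signal.
dense_t = np.linspace(0, t[-1], 1000)
hold_idx = np.searchsorted(t, dense_t, side="right") - 1
zero_order_hold = np.sin(2 * np.pi * 3 * t)[hold_idx]
print(np.max(np.abs(zero_order_hold - np.sin(2 * np.pi * 3 * dense_t))))  # large
```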
You could instead ask at what resolution/viewing distance the angle subtended by each pixel on the retina becomes so small that the human eye stops perceiving aliasing. There's probably good research on that, but note that Apple's "retina" definition is probably inadequate, since it assumes that the source content is already antialiased. E.g., text looks crisp on a retina display, but that's because of font antialiasing (e.g., ClearType). Photos and videos are naturally antialiased because camera sensors accumulate light over an area rather than at a single point. In contrast, in computer graphics each sample represents an infinitesimally small point, whose final color value is blown up to fill an entire pixel on screen. Without any AA filtering, that s*** is going to be visible to the naked eye at any "retina" setup.
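For a sense of scale, here's a back-of-the-envelope sketch of the "angle per pixel" question. The pixel pitch, viewing distance, and the ~1 arcminute acuity figure are all assumptions for illustration, not measured values.

```python
import math

pixel_pitch_mm = 25.4 / 326   # ~0.078 mm, roughly a 326 ppi phone screen
viewing_distance_mm = 300.0   # ~30 cm, a typical phone-holding distance

# Angle subtended by one pixel at the eye, in arcminutes.
angle_rad = 2 * math.atan(pixel_pitch_mm / (2 * viewing_distance_mm))
angle_arcmin = math.degrees(angle_rad) * 60
print(f"{angle_arcmin:.2f} arcmin per pixel")   # ~0.89 arcmin
```

A common rule of thumb puts normal visual acuity around 1 arcminute, which is roughly what the "retina" figure is based on. Whether aliasing artifacts (crawling, shimmering edges in unfiltered renders) stop being noticeable at exactly that threshold is a separate question, which is the point above.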
(This was all a layman's take, so I apologize to any experts in the audience who may be able to explain things more precisely).