No, I get it exactly, but I must not have been clear in the way I described it. Shifty seems to have the same idea I do, because he re-explained what I was trying to say here:

Albuquerque, I don't think you've understood what Xmas has been saying.
The intrinsic nature of the upscale from a single sample to an area of light on an LCD doesn't have any obvious bearing on the conversion from an array of samples to a larger array of samples. The LCD upscale is out of our hands. We can only work with the array of samples that will be transmitted to the display device.
We can only work with the display technology that we have. Arguing about pixel size is a semantic argument at best: pixel sizes are what they are, but that's not what we're talking about.
We're talking about a non-native-res frame being resampled. I'm pretty sure that's been the discussion point Xalion has been making this entire time, even though a few people have now tried to derail it into something else.
Rescaling of the digital image is the discussion, not the pixels. So again, show me a method of rescaling an image that doesn't do exactly what Xalion has described, and I'll finally understand what we're trying to get at. But for now, all I see are red herrings...
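Just to put something concrete behind what I mean by "rescaling an image", here's a minimal sketch of a bilinear upscale. The function name, the bilinear kernel, and the 720-to-1080 example are my own choices for illustration; swap in bicubic or Lanczos weights and the structure is the same: an array of samples goes in, a larger array of samples comes out, and only then does the display's own pixel grid get involved.

```python
import numpy as np

def bilinear_upscale(src: np.ndarray, new_h: int, new_w: int) -> np.ndarray:
    """Resample a 2D array of samples to a larger array via bilinear interpolation."""
    old_h, old_w = src.shape
    # Map each destination sample centre back into source coordinates.
    ys = (np.arange(new_h) + 0.5) * old_h / new_h - 0.5
    xs = (np.arange(new_w) + 0.5) * old_w / new_w - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, old_h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, old_w - 1)
    y1 = np.clip(y0 + 1, 0, old_h - 1)
    x1 = np.clip(x0 + 1, 0, old_w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :]
    # Each output sample is a weighted blend of its four nearest source samples.
    top = src[y0][:, x0] * (1 - wx) + src[y0][:, x1] * wx
    bot = src[y1][:, x0] * (1 - wx) + src[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Example: a 720-line luma plane resampled to a 1080-line array before transmission.
frame = np.random.rand(720, 1280)
upscaled = bilinear_upscale(frame, 1080, 1920)
print(upscaled.shape)  # (1080, 1920)
```

Whatever filter you pick, the output is still just a bigger grid of samples handed to the display, which is exactly the conversion Xalion has been describing.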