Alright, so looking at normal 6800 XT performance numbers with Godfall, I think I can reconstruct the upscale settings as something like 2:1 (might be a bit bigger, even), 3:1, 4:1, and 5:1, and FSR performance looks really quite good, but only on RDNA2 so far (the performance deficit versus those normal resolutions is small).
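Roughly the reasoning: if frame cost scales more or less with pixel count and FSR's own pass is cheap, the fps uplift you observe implies the pixel-count ratio of the upscale. A quick sketch of that back-of-the-envelope step; the example numbers in the call at the bottom are purely hypothetical, not the actual Godfall figures:

```python
# Back-of-the-envelope step: assuming frame cost scales roughly linearly with
# pixel count and the FSR pass itself adds little overhead, the observed fps
# uplift implies the pixel-count ratio of the upscale.

def implied_pixel_ratio(fps_native, fps_fsr, fsr_overhead_ms=0.0):
    """Estimate (output pixels / internal pixels) from frame rates.

    fsr_overhead_ms is the assumed fixed cost of the upscale pass itself.
    """
    native_ms = 1000 / fps_native
    fsr_render_ms = 1000 / fps_fsr - fsr_overhead_ms  # time spent actually rendering
    return native_ms / fsr_render_ms

# Purely hypothetical example numbers, just to show the shape of the estimate.
# (Fixed per-frame costs like CPU work mean this tends to underestimate the true ratio.)
print(implied_pixel_ratio(fps_native=40, fps_fsr=75, fsr_overhead_ms=0.5))
```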
Thus the screenshot example for the 1060 would have to be upscaled from about 810p, which, well, no wonder it's so blurry. And as with DLSS, it'll get less and less effective the lower the final resolution is. Maths! Scene complexity is fixed: when you already have a good set of samples of that complexity, reconstructing a better version is relatively easy. Once you start dropping important parts of the scene, it gets progressively harder to guess what they were, and your effectiveness goes down. I.e. reconstructing to 4K will give better results than reconstructing to 1440p at the same quality settings.
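To put numbers on that, here's a quick sketch mapping those guessed pixel-count ratios to internal render resolutions. The ratios are my guesses from above, and the 1440p target for the 1060 screenshot is an assumption on my part, not a confirmed figure:

```python
# Rough sketch of the ratio guesswork above. The pixel-count ratios (2:1 .. 5:1)
# are my guesses from the Godfall numbers, and the 1440p target for the 1060
# screenshot is an assumption, not something confirmed.

def source_resolution(target_w, target_h, pixel_ratio):
    """Internal render resolution for a given output and pixel-count ratio.

    pixel_ratio = (target pixels) / (source pixels), so each axis shrinks
    by sqrt(pixel_ratio).
    """
    scale = pixel_ratio ** 0.5
    return round(target_w / scale), round(target_h / scale)

for name, (w, h) in {"1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    for ratio in (2, 3, 4, 5):
        sw, sh = source_resolution(w, h, ratio)
        print(f"{name} target, {ratio}:1 pixels -> renders at ~{sw}x{sh}")

# A 1440p target at roughly 3:1 comes out near 1478x831, i.e. the ~810p-ish
# ballpark I'm guessing the 1060 screenshot was reconstructed from.
```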
If "Ultra" really is fairly close to native quality for 4k then I can see it becoming pretty popular with small to medium studios as a PC option, and especially for use on the new consoles. I would guess DLSS 2.0 has better image quality results, but also just takes more performance to run, limiting the comparative benefit. Glancing at benchmarks on say, a 2080ti and balanced settings, gaining a 35% performance boost on Watchdogs Legion for a hit to image quality with DLSS may not be a whole lot different from gaining say a 50% performance boost and a slightly worse hit to image quality with FSR. Of course, this assumes Ultra quality 4k really is at least a decent approximation of native 4k.