Those viewing distances have been tested with film material (and most 4K films are actually 2K upscales; ideally you'd need uncompressed 4K sources). Sorry, but pretty much every time people post that stuff, all I can think is that probably none of you have ever properly tested those kinds of sizes, distances, and resolutions. I sit about 3 meters away from my 75" TV and I greatly appreciate native 4K resolution. I like to think my eyes are pretty good, but I don't think they are exceptional or anything like that. Those viewing-distance charts are just BS, or at least they don't translate well to real moving images. 4K brings out a ton more detail much sooner than your examples suggest.
I just started playing Arkham Knight on PS4 Pro, but apparently it doesn't have a Pro patch, and 1080p with jaggies produces an absolutely terrible-looking image in that game, even though the assets themselves are pretty good. I need to play that on a PC. 1440p is good, but I'm not a huge fan of checkerboarding and similar methods; they seem to fall apart in many situations.
In VFX we have hundreds of rays per pixel; some difficult scenes can go up to thousands of rays per pixel to deal with specular shimmering.
In games you have very, very few samples per pixel. I see mentions of intermediate buffers at half res, AA that reduces detail in a bid to reduce jaggies, etc. If you compare a 1.84 TF console with Jaguar cores to a 10 TF+ PC, it's not the raw resolution of the frame buffer that's being compared.
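A rough back-of-the-envelope illustration of that gap (my own numbers for resolution and frame rate, only the 1.84 TF and 10 TF figures come from the post): dividing peak compute by pixels per second shows how thin the per-pixel budget gets on the console, before you even think about pushing it to 4K.

```python
# Rough per-pixel compute budget: peak FLOPS / (pixels per frame * frames per second).
# Illustrative only -- ignores architecture, bandwidth, and efficiency differences.

def flops_per_pixel(peak_tflops, width, height, fps):
    return (peak_tflops * 1e12) / (width * height * fps)

console = flops_per_pixel(1.84, 1920, 1080, 30)   # base console target: 1080p30 (assumed)
pc_4k   = flops_per_pixel(10.0, 3840, 2160, 60)   # 10 TF PC pushing 4K60 (assumed)

print(f"Console, 1080p30: ~{console:,.0f} FLOPs per pixel per frame")
print(f"PC, 4K60:         ~{pc_4k:,.0f} FLOPs per pixel per frame")
```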
Any technique reusing previous frame data is multiplying the effective samples per pixel. It's ludicrous not to use those techniques, just like it was stupid not to use compression when media bandwidth was the bottleneck. We have a severe compute bottleneck compared to what brute-force rendering would need. Full-scene native 4K is just wasting power that could have been used to improve lighting quality instead.
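A minimal sketch of the idea (my own simplified example, not any particular engine's TAA): blending each frame's single noisy sample into a history buffer accumulates information over time, so the effective sample count per pixel ends up far higher than what you pay per frame.

```python
import numpy as np

# Minimal temporal accumulation sketch: 1 new noisy sample per pixel per frame,
# blended into a history value. Real TAA also reprojects the history with
# motion vectors and rejects/clamps stale history; this only shows how the
# effective sample count builds up across frames.

ALPHA = 0.1  # weight of the new frame; the history keeps (1 - ALPHA)

def accumulate(history, current, alpha=ALPHA):
    """Exponential moving average of per-pixel samples across frames."""
    return alpha * current + (1.0 - alpha) * history

rng = np.random.default_rng(0)
truth = 0.5                           # value each pixel "should" converge to
history = rng.normal(truth, 0.2)      # first noisy frame
for _ in range(60):
    current = rng.normal(truth, 0.2)  # one noisy sample this frame
    history = accumulate(history, current)

# In variance terms an exponential blend behaves like ~(2 - alpha) / alpha
# independent samples, i.e. on the order of 1 / alpha.
print(f"after 60 frames: {history:.3f} (target {truth})")
print(f"effective samples per pixel: ~{(2 - ALPHA) / ALPHA:.0f}")
```

With a blend weight of 0.1, one sample per frame behaves roughly like 10-20 samples per pixel once the history has settled, which is the whole point of reusing previous frame data instead of brute-forcing native resolution.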