Resolving Resolution


Obviously scaling up an image with heavy aliasing is going to look terrible.
If I take the 1080p shot, downscale it to 720p, and then resample it back to 1080p, it looks much better:
[image: ssjlz.jpg]
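For anyone wanting to try the same trick at home, here's a minimal Pillow sketch of the downscale-then-upscale step. The filenames are just placeholders and the exact filters are my own choice, not necessarily what was used for the shot above:

```python
# Downscale with an averaging (box) filter, then resample back up.
# Averaging 1080p down to 720p acts like cheap supersampling of the
# aliased source, which is why the round trip looks smoother.
from PIL import Image

src = Image.open("shot_1080p.png")                   # placeholder filename
small = src.resize((1280, 720), Image.BOX)           # downscale, averaging pixels
back = small.resize((1920, 1080), Image.BILINEAR)    # resample back to 1080p
back.save("shot_1080p_roundtrip.png")
```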


Next gen we should expect decent antialiasing, so the scaling to 1080p will look more like the RE image, Gubbi's Bulletstorm image, or my seaside image, i.e. a bit soft and lacking some of the fine detail of native 1080p, but perfectly fine for most people (and pretty much imperceptible to a lot of people with smaller sets or who sit quite far from their TV).

Using an intelligent scaling algorithm that preserves sharp edges, you can get really good results.
I redid the Bulletstorm image using a smart algorithm, and I don't think anyone can say it looks 'horrible' compared to 1080p.
[image: FDsL7.jpg]

[image: 1080p1_pc.jpg.jpg]
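If you want to see what difference the scaling kernel alone makes, a rough sketch like the one below is enough to compare a few of them side by side. A true edge-directed scaler (the kind of 'smart algorithm' used for the image above, or things like NEDI/xBR) isn't in Pillow, so Lanczos just stands in as the sharper option; the filenames are placeholders:

```python
# Upscale the same 720p frame with three different kernels for comparison.
from PIL import Image

src = Image.open("frame_720p.png")        # placeholder filename
target = (1920, 1080)

src.resize(target, Image.NEAREST).save("up_nearest.png")    # blocky
src.resize(target, Image.BILINEAR).save("up_bilinear.png")  # soft
src.resize(target, Image.LANCZOS).save("up_lanczos.png")    # sharper edges
```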
 
Pure 720p and 1080p are both wasteful (a simple regular sampling grid). Brute-force processing of a million (or two) pixels isn't the most clever thing to do. If you could position the sampling points perfectly (where they matter the most), you would need fewer samples than 720p to provide more quality than 1080p.

Blu-ray movies look good. But Blu-ray still stores chroma at half resolution. And it stores most frame data as a low-resolution (compressed) approximation. Bandwidth is spent in the areas that need it the most. It's all about human perception. You don't need to render more than maybe 10%-20% of the pixels of each frame at full resolution to make the image look like real 1080p. The real questions are: which pixels, and how to generate the others efficiently?
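To make the chroma point concrete, here's a small Pillow sketch of the 4:2:0-style idea: keep luma at full resolution, store the two chroma channels at half resolution, and reconstruct. This is just an offline illustration of the principle rather than a renderer technique, and the filename is a placeholder:

```python
# Keep luma (Y) at full resolution, chroma (Cb/Cr) at half resolution.
from PIL import Image

frame = Image.open("frame_1080p.png").convert("YCbCr")   # placeholder filename
y, cb, cr = frame.split()

half = (frame.width // 2, frame.height // 2)
cb_small = cb.resize(half, Image.BOX)      # chroma kept at a quarter of the pixels
cr_small = cr.resize(half, Image.BOX)

# Reconstruct: full-res luma plus upsampled chroma. Most viewers won't
# notice the difference, which is the perceptual trick Blu-ray relies on.
cb_up = cb_small.resize(frame.size, Image.BILINEAR)
cr_up = cr_small.resize(frame.size, Image.BILINEAR)
Image.merge("YCbCr", (y, cb_up, cr_up)).convert("RGB").save("chroma_halved.png")
```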

Are you thinking about adaptively distributing the pixel density? If you use a fixed raster, say X times Y pixels, you can use smooth mappings to distribute the pixels. In our community (numerical simulation of e.g. fluids) we call this 'r-adaptivity'.

The answer to where to distribute your pixels shouldn't be that difficult, I guess?

If you want better anti-aliasing -> go for the gradients and distribute pixels there.

If you want more detail, search for high-frequency areas and make the pixels denser there.

What we sometimes do is use wavelet analysis of the data to detect strong gradients and/or high frequencies... not sure if this technique is appropriate for real-time applications though...
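As a toy version of the 'which pixels matter' question, something like the sketch below ranks tiles of a frame by local gradient energy and flags the top 20% for full-resolution treatment. It uses plain finite differences rather than a real wavelet transform, and the tile size and threshold are arbitrary choices, so treat it purely as an illustration of the idea:

```python
# Rank 16x16 tiles by gradient energy; refine the most detailed 20%.
import numpy as np
from PIL import Image

img = np.asarray(Image.open("frame_1080p.png").convert("L"), dtype=np.float32)

gy, gx = np.gradient(img)                   # simple finite-difference gradients
energy = gx * gx + gy * gy

T = 16                                       # tile size (arbitrary)
h, w = (img.shape[0] // T) * T, (img.shape[1] // T) * T
tiles = energy[:h, :w].reshape(h // T, T, w // T, T).mean(axis=(1, 3))

threshold = np.quantile(tiles, 0.8)          # keep the top 20% of tiles
refine_mask = tiles >= threshold             # these get full-resolution shading
print(f"{refine_mask.mean():.0%} of tiles flagged for full resolution")
```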
 
Obviously scaling up an image with heavy aliasing is going to look terrible.
Sadly, downscaling and then upscaling an aliased image effectively supersamples the low-resolution image and gives it quite a nice boost.

It would be better to start with both images having comparable AA and then upscale the 720p one.
 
It seems like a number of you are forgetting that TVs do a lot of post-processing of their own, and most of them have both lag and ghosting, so your nice crisp framebuffer shots won't survive intact. Factor in the distance from which you play and you should realize it's not going to look anywhere near as good as those still shots viewed at short range on your computer screen.
Computer screens don't post-process images, but they still have display lag and ghosting, although you can obviously purchase the best ones, which exhibit little of both and are also well calibrated.

Point is, comparing still shots is irrelevant IMO.
 
Up-scaled graphics are horrible...
A lot depends on the content and context. I play 720p games upscaled to 768p and they look great. I play 720p games upscaled to 1680x945, and they look great. The upscaling isn't turning the graphics horrible - it's often not perceptibly different. Remember people are playing games rather than looking at stills, and the TV is adding ghosting and the like.

It's also important to remember that our brains don't see stills, but accumulate information and build up a mental picture. So your example with the lost eyebrows won't be so clear-cut in motion, because changes in the image data are interpreted as eyebrows. We've all experienced this to remarkable effect on CRT SDTVs, where we could make out individual hairs from that blurry source material. Higher resolution improves clarity, but the gains are, as ever, diminishing with effort. So 720p at 60 Hz will quite probably look better to most people than 1080p30. Certainly not 'horrible' though. Maybe 'not quite as sharp', if you picked a less sensationalistic description.
 
QFT.

Especially when you factor in FXAA, which looks its best on still shots and not so much in motion.

Cheers

Well, from my experience, while it's inferior to MSAA its quality varies... in some games it does a pretty good job, in others not so well, mostly because of the art style, but it's a pretty good and cheap solution.

That being said, I'm hoping for a jump in res for next gen; the difference between 720p and 1080p in clarity is quite big even with a great upscale method. Sure, devs can go for 720p with a whole lot of AA (MSAA+FXAA?) and avoid jaggies with a smooth IQ, but that doesn't change the fact that a 1080p framebuffer will be a lot cleaner.

To be honest if next gen consoles go the route of 720p with prettier graphics I'll probably build a gaming PC rig for the first time in my life. :p
 
Given how much DOF is used nowadays, they could also consider rendering stuff that falls within the focal plane at full res, while everything whose Z falls into the out-of-focus areas is rendered at lower res. That would especially help with cutscenes, which tend to be DOF heavy; it would let them bump up the render quality there.
DOF is nice, and motion blur is even better. Surfaces that are not moving (or are moving slowly) can be easily reprojected (last frame's data reused). The faster a surface moves, the more it is blurred, and the less per-pixel quality is required to render it. Either way you can save a huge amount of processing cycles.

It's a huge waste to render everything first in full detail, and then blur a huge portion of the sharp image (DOF / motion blur). It's an even larger waste to render everything again every frame (especially in 60 fps games), knowing that the difference between two consecutive frames is minimal. The compression algorithms used in movies (Blu-ray 1080p) exploit both of these facts, and are heavily optimized towards reusing last frame's data and storing low-frequency (blurred) data in less space. Blu-ray quality is enough for most of us. Games should start using similar tricks to reach 1080p / 60 fps.
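A minimal sketch of the reuse idea, assuming per-pixel motion vectors are available from the renderer: gather last frame's colour along the motion vector and only flag fast-moving pixels for re-shading. The array names and the speed threshold are my own placeholders, not any engine's actual interface:

```python
# Reproject the previous frame along motion vectors; re-shade only fast movers.
import numpy as np

def reproject(prev_frame, motion_x, motion_y, max_reuse_speed=2.0):
    """prev_frame: HxWx3 floats; motion_x/motion_y: HxW motion in pixels."""
    h, w = motion_x.shape
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(np.round(xs - motion_x).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys - motion_y).astype(int), 0, h - 1)

    reused = prev_frame[src_y, src_x]            # history fetch (nearest sample)
    speed = np.hypot(motion_x, motion_y)
    # Fast movers can't reuse history reliably, so re-shade them - cheaply,
    # since they end up motion blurred anyway. Slow pixels keep the history.
    needs_reshade = speed > max_reuse_speed
    return reused, needs_reshade
```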
 
The problem with that is that compression has the two final images and compares the delta, whereas a delta renderer would need to predict how much something is going to change and render the changes accordingly. That's akin to building a camera that records only the parts of a scene that are going to change in the next frame. ;) A rotation on the spot in an FPS is 100% pixel change - how do you determine which pixels will be sufficiently unchanged that they can just be translated instead of rerendered?

Another solution is perhaps to go with partial renders of every frame, rendering some surfaces anew and reusing old frame data for others? So one frame, render the left rock; the next frame, shift the left rock according to the camera change and update the right rock. I imagine that might look pretty wonky! But at higher framerates, it might work and give the impression of a faster framerate overall.

I also wondered about rendering interlaced, seeing CON on PS2 again recently. That'd halve the pixel workload every frame, so you could render the game at 60 fps, rendering a half-res field one frame and stretching it to a full field, and then rendering with a single-pixel offset for the next field and stretching that (a toy sketch of the bookkeeping is at the end of this post). The TV would receive a 60 fps progressive image, but the image would be interlaced in having a one-pixel offset every other frame. There'd be a 30 Hz vertical shimmer. Of course, that wouldn't help with vertex savings, but it sounds like it could work to me.

The main advantage of the current brute-force systems is that they are straightforward and easily scalable. Developing clever delta renderers would be a hell of a lot more work and probably not very portable between game types, as the types of changes could be very different from game to game.
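Here's a toy numpy sketch of the field-offset bookkeeping mentioned a couple of paragraphs up: shade only every other scanline, alternate which set you shade each frame, and stretch the shaded field to fill the frame. The render_halfres callback is purely hypothetical; the point is only to show how simple the compositing step is:

```python
# Alternate between even and odd scanline fields each frame, then stretch
# the shaded field to full height by duplicating neighbouring lines.
import numpy as np

def compose_field_frame(render_halfres, frame_index, height=1080, width=1920):
    """render_halfres(rows) is a hypothetical callback returning the shaded
    scanlines as a (len(rows), width, 3) array."""
    offset = frame_index % 2                     # even field, then odd field
    rows = np.arange(offset, height, 2)          # scanlines actually shaded
    field = render_halfres(rows)

    frame = np.empty((height, width, 3), dtype=field.dtype)
    frame[rows] = field
    missing = np.arange(1 - offset, height, 2)   # the unshaded scanlines
    neighbours = np.clip(missing - 1 + 2 * offset, 0, height - 1)
    frame[missing] = frame[neighbours]           # stretch field to full height
    return frame
```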
 
You are aware that exactly zero broadcasts are in 1080p? They are either 1080i or 720p.

SKY Sports is broadcast in 1080i even though 720p would be a much better fit.

Cheers

WRONG.

Some PPV movies on DirecTV are broadcast in 1080p at 24 fps.

The jump in clarity is mind-numbingly obvious. Check out The Avengers in 1080p on PPV.

As for 720p broadcasts... Fox has dedicated 720p broadcasts like American Idol.
 
Pure 720p and 1080p are both wasteful (a simple regular sampling grid).

Serious question:

Xbox 360

Trials Evolution = 1152x648 (FXAA)
Trials HD = 1280x680 (no AA, black borders)


Would there be any chance you could go to your dev kit, start up "Trials HD" and/or "Trials Evolution", take some 1080p PNG screenshots of it rendered in full 1920x1080 and come back here and post those screenshots :mrgreen:?

Thanks in advance :mrgreen:;).
 
Would there be any chance you could go to your dev kit, start up "Trials HD" and/or "Trials Evolution", take some 1080p PNG screenshots of it rendered in full 1920x1080 and come back here and post those screenshots :mrgreen:?
Only if you then go and view the 1080p and 720p images on the same TV at the same distance. Oh, and as they are stills, they aren't a fair comparison.

You're better off trying games that allow you to choose. I've switched PixelJunk Monsters and Age of Booty between 720p and 1080p on PS3, and although there's a little clarity improvement on a 40" 1080p TV viewed from pretty close, it's not a massive difference; I couldn't tell just by looking whether I was seeing the game in 720p or 1080p. I'd have to go out of my way to look for which version I was playing, or maybe if I had 1080p and 720p side by side, I could notice one was a bit sharper.
 
WRONG.

Some PPV movies on DirecTV are broadcast in 1080p at 24 fps.

The jump in clarity is mind-numbingly obvious. Check out The Avengers in 1080p on PPV.

Since every TV sold in the past six years has reverse 3:2 pulldown functionality, the difference between 1080p24 and 1080i50/60 is zero.

Bitrate makes for a bigger difference between PPV and broadcasts.

Cheers
 
Since every TV sold in the past six years has reverse 3:2 pulldown functionality, the difference between 1080p24 and 1080i50/60 is zero.

Bitrate makes for a bigger difference between PPV and broadcasts.

Cheers


There's less blurring when viewing on an LCD in 1080p24 mode. But then that's more of an inherent LCD problem when viewing 24 fps material on a 60 Hz set.
 
You're better off trying games that allow you to choose. I've switched PixelJunk Monsters and Age of Booty between 720p and 1080p on PS3

Have you seen post #47 for another example ;)?

But to simplify matters (and to add 720p screenshots as well):

Xbox 360

Ridge Racer 6 (demo) = 1440x810 (no AA)
Playstation 3

Ridge Racer 7 (demo) = 1920x1080 (no AA)



720p screenshot (Source: http://www.eurogamer.net/articles/ridge-racer-6-and-7-comparison-720p-screenshot-gallery):

[image: Ridge_Racer_9_720p.jpg]




1080p screenshot (Source: http://www.eurogamer.net/articles/ridge-racer-6-and-7-comparison-1080p-screenshot-gallery):

[image: Ridge_Racer_9_1080p.jpg]



;)
 
Static screenshots aren't indicative of how the thing looks and feels when being played. Texture resolution is somewhat moot when scenery is whizzing past at a rate of knots, for example. Besides which you're not even comparing like-for-like art here.
 
Static screenshots aren't indicative of how the thing looks and feels when being played. Texture resolution is somewhat moot when scenery is whizzing past at a rate of knots, for example. Besides which you're not even comparing like-for-like art here.

Shifty, I'm sure a man such as yourself can see how blurry the upscaled shot looks without resorting to excuses such as 'it's not the same art'.

The top, upscaled shot is obviously worse and blurrier; the art has nothing to do with it.
 