PS3 and 360: Would lower resolutions allow for photo-realistic graphics?

deepbrown

Veteran
http://insomnia.ac/commentary/not_powerful_enough/

Here's an interesting article about whether pushing for 720p and 1080p on today's consoles is counterproductive.

" Consider this: when you are watching a movie on your TV you are seeing a standard resolution image; assuming you are in an NTSC country that would be 640x480i. When you are playing an Xbox/PS2/GameCube game on the same TV you are seeing the exact same resolution.

Which looks better, the movie or the game? The movie of course.

The truth is that, contrary to what Sony and Microsoft would have you believe, resolution is not the most important factor in graphics quality. As anyone who is into first person shooters on the PC will tell you, the effects are much more important.

For games running at 1080p half of that power will always be spent in order to show fine, miniscule details that simply aren't there, because the remaining power isn't enough to produce them. Can you imagine the kind of effects, or the number of polygons, developers could utilize if they were allowed to design a PS3 game running at the normal 480i resolution? I am certain they could come very close to the level of detail seen in a DVD movie..."

What do you think?
 
The truth is that, contrary to what Sony and Microsoft would have you believe, resolution is not the most important factor in graphics quality. As anyone who is into first person shooters on the PC will tell you, the effects are much more important.

The most important thing is lighting. That is why a normal TV show looks real even at low resolution. Even the PS3 and X360 wouldn't come close to what is needed to render realistic lighting even at low resolutions, so they pump up the resolution in an attempt to make the graphics more realistic in a way that is possible with today's hardware.
 
No, they couldn't do photorealistic at SDTV res. Not even close! First you'd need something like 16x FSAA (preferably supersampling to resolve textures, or some other texture filtering if it exists). Then you'd need realtime global illumination lighting. But this topic has been discussed before and I'm sure you could search for the previous discussion.
 
I'm sorry... it was a new article, so I felt it was worth a post, and I thought it touched on a way of thinking that wasn't on this site yet. And I agree, I do think he is overlooking things.
 
The thing is that nowadays many people have 30-50" LCD TVs, and SD res won't cut it. The bigger the screen, the more blown-up the image will look. Scaling can help a bit, but there is still a huge difference in IQ between a DVD movie and the same film on HD-DVD/BD-ROM. This will be even more noticeable in games.
 
TV/movie footage is sampling from real life, which is of 'infinite resolution'. Games are not: the resolution you render at is the resolution of the world you're sampling, and if that is low, you have problems.
 
Doesn't seem interesting to me at all... It's footage of real life compared to computer-generated images! Silly comparison IMO.
 
It's also absurd to think that just because there are 1/3 the pixels (compared to 720p), you can draw 3x the polygons, 3x the light sources, 3x the illumination model complexity, move 3x closer to convergence of the rendering equation, or pack 3x the information into each pixel (as if all this came for free and there were no other limitations whatsoever). He also keeps pretending that the movies he's watching at SD resolution were actually filmed at SD resolution, or that there isn't more information present in the image than simply having 150k pixels. Yeah, final image resolution isn't everything, but that doesn't mean source resolution is meaningless or that simply reducing the final resolution will solve everything.
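A quick back-of-envelope check of those ratios (my own arithmetic, not from the article):

```python
# Pixel counts for the resolutions being argued about. 480i is counted
# per field (640x240), since only half the lines are drawn per refresh.
resolutions = {
    "480i (per field)": (640, 240),
    "480p": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

base = resolutions["720p"][0] * resolutions["720p"][1]
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>17}: {pixels:>9,} pixels ({pixels / base:.2f}x 720p)")
```

480p really is 1/3 of 720p, but only the costs that scale per pixel (fill rate, pixel shading, framebuffer bandwidth) shrink with the pixel count; per-vertex work, animation, physics and so on don't, which is exactly why "1/3 the pixels" doesn't buy 3x of everything.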
 
You guys can correct me if I'm wrong, but part of the reason TV shows and CG movies look so good on your SDTV is that they are recorded from an extremely high resolution source, which would be either real life or extremely high resolution CG. The CG movies in games are usually rendered at a resolution so high that even a 1080p TV can't display it, then scaled down in the recording process so they can be shown on your SDTV. So you would still need a high-res source to make games look that good on SD. I never quite got why devs don't do this for the 360: use a high-res 4:3 mode and have the 360 scale it down. But I'm sure they have a good reason.
 
4:3? Wouldn't that look fugly for people who have 16:9 HDTVs? :|

I'm talking about SDTV. They already do a good job of making sure people get the best graphics they possibly can when viewing their games in HD.

I was just wondering: if you send, say, an 800x600 image to the scaler to downscale, couldn't you possibly get a better quality image? I'm not sure if all games are doing it, but there are many games where, when the 360 is set to 480p, the game only uses 640x480 as its res before heading to the scaler.
 
Epic claims to render Gears of War at 960x720 for SD mode (Rein mentioned this on their forums); that's 1.5x in each dimension, i.e. 2.25x supersampling relative to 640x480. I don't know of any other claims or confirmations, though.
 
So you would still need a high-res source to make games look that good on SD. I never quite got why devs don't do this for the 360: use a high-res 4:3 mode and have the 360 scale it down. But I'm sure they have a good reason.
This is supersampling. CG movies are actually rendered with lots of AA, and not just at super high resolutions. They are downscaled for TV, but even if shown on TV at native resolution they'll look fantastic. To get a reasonable IQ from supersampling, to get close to TV quality so most folks won't notice the difference, you'd need 16x supersampling. That's rendering the screen 4x wider and 4x higher, and then downsampling. At SDTV, that'd need rendering to a 2560 x 1920 image...

These consoles are a million miles away from being able to render at those resolutions! The best you can hope for is rendering at an HD res like 720p with MSAA, and downsampling that. That's better IQ than rendering to SDTV res, but it's not going to produce TV quality shapes. And that ignores the lighting and shading requirements of real-life footage.
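To make the resolve step concrete, here's a minimal sketch of the ordered-grid downsample described above (my own illustration; no console actually exposes anything like this API):

```python
import numpy as np

def box_downsample(img: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor-by-factor block of samples into one output
    pixel: the resolve step of ordered-grid supersampling."""
    h, w, c = img.shape
    return img.reshape(h // factor, factor,
                       w // factor, factor, c).mean(axis=(1, 3))

supersampled = np.random.rand(1920, 2560, 3)  # stand-in for a 2560x1920 render
sd_frame = box_downsample(supersampled, 4)    # resolve 16 samples per pixel
print(sd_frame.shape)                         # (480, 640, 3)
```

Just holding that 2560x1920 colour buffer at 32 bits per pixel is about 19 MB before you add a Z buffer, which gives some sense of why these machines are nowhere near it.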
 
So in conclusion, this guy has no idea what he's talking about. That makes a change.
 
When talking about the different resolutions, I understand the difference between interlaced and progressive, but not how it relates to consoles. Don't consoles just produce an image 30 or 60 times a second, which the TV then displays however it was designed to? I mean, when a game supports, let's say, 480p, what is it doing differently than if it were 480i?
 
It depends, but generally, nothing. If you render interlaced images at full frame rate, you can render only every other line, doing half the work per frame. When rendering at 30 fps, you render the whole buffer and just output alternating fields at 60 fps. Few games use 60 fps interlaced rendering; they'll render 60 fps progressive, full-screen frames, and for interlaced output send only every other line.
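A small sketch of what "output every other line" amounts to, assuming the frame is stored as a line-major array (my own illustration):

```python
import numpy as np

def fields_from_frame(frame: np.ndarray):
    """Split one progressive frame into its two interlaced fields:
    the even field is lines 0, 2, 4, ...; the odd field is 1, 3, 5, ..."""
    return frame[0::2], frame[1::2]

frame = np.zeros((480, 640, 3), dtype=np.uint8)  # one full 480-line frame
even_field, odd_field = fields_from_frame(frame)
print(even_field.shape, odd_field.shape)         # (240, 640, 3) each
# A 30 fps renderer sends both fields of the same frame on successive
# refreshes; a 60 fps progressive renderer sends one field per frame
# and simply discards the other half of its work.
```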
 
High resolution and AA are the least important effects. Textures, lighting, geometric complexity and framerate are far more important. The brain has a much easier time abstracting away a lack of sharpness and aliasing artifacts than wonky textures and coarse Gouraud lighting.
That said, there are plenty of "good enough" AA hacks that don't involve expensive oversampling, but rather softening of the edges (preferably conditioned by the slant of the poly edge) and jittering of the pixel grid.
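For the pixel-grid jittering, a minimal sketch of the idea (the `render` callback here is hypothetical; a real engine would nudge its projection matrix by the subpixel offset):

```python
import numpy as np

def render(jitter, h=480, w=640):
    """Hypothetical render callback: stands in for a renderer whose
    sampling grid has been shifted by the given subpixel offset."""
    rng = np.random.default_rng(abs(hash(jitter)) % (2**32))
    return rng.random((h, w, 3))

# A 2x2 jitter pattern: four quarter-pixel offsets of the sampling grid.
offsets = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# Averaging the jittered renders approximates 4x supersampling without
# ever allocating a buffer larger than the output frame.
frame = np.mean([render(o) for o in offsets], axis=0)
print(frame.shape)  # (480, 640, 3)
```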
 
TV/movie footage is sampling from real life, which is of 'infinite resolution'. Games are not: the resolution you render at is the resolution of the world you're sampling, and if that is low, you have problems.
Games may be of infinite resolution if their images are made up of geometric, easily characterizable shapes. The portion of each shape covering each pixel can be calculated analytically to construct the anti-aliased image (as I'm sure you all know, since it's graphics 101).
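For one edge crossing one pixel, that coverage term is just a trapezoid area. A minimal sketch, assuming the edge spans the pixel from left to right with both intercepts inside it:

```python
def edge_coverage(y_left: float, y_right: float) -> float:
    """Fraction of a unit pixel lying below a polygon edge that crosses
    it at height y_left on the left side and y_right on the right side
    (both in [0, 1]). The region under the edge is a trapezoid, so the
    area is exact."""
    return (y_left + y_right) / 2.0

# An edge entering 20% up the left side and leaving 70% up the right
# side covers 45% of the pixel, so the polygon's colour gets blended
# in with weight 0.45.
print(edge_coverage(0.2, 0.7))  # 0.45
```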
 