PS3 and 360: Would lower resolutions allow for photo-realistic graphics?

Games can, in principle, be of infinite resolution if their images are made up of easily characterizable geometric shapes. The portion of each pixel covered by each primitive is calculated to construct the anti-aliased image (as I'm sure you all know, since it's graphics 101).
I think Titanio was clearly referring to the rasterization step, which produces a finite-resolution image from the underlying geometry. We usually think about this step without AA being implied. Cameras, in a simplistic view, achieve essentially infinite SSAA for each pixel represented (not really, but close enough). In rasterizing, calculating the coverage of a pixel by each primitive is of course the tricky and expensive part. The devil is in the details.
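To make that concrete, here's a little brute-force sketch (purely illustrative, not from any real rasterizer; the insidePrimitive test and the pixelCoverage name are hypothetical stand-ins, e.g. for a triangle's edge functions) showing how coverage of a single pixel can be estimated by sub-sampling, which is essentially what SSAA approximates:

Code:
#include <functional>

// Hypothetical inside-test, e.g. the edge functions of a triangle.
using InsideTest = std::function<bool(float, float)>;

// Estimate the fraction of pixel (px, py) covered by a primitive by testing
// a regular grid of sub-samples inside the pixel (brute-force supersampling).
float pixelCoverage(float px, float py, int grid, const InsideTest& insidePrimitive)
{
    int hits = 0;
    for (int j = 0; j < grid; ++j)
        for (int i = 0; i < grid; ++i)
        {
            float sx = px + (i + 0.5f) / grid;  // sub-sample position within the pixel
            float sy = py + (j + 0.5f) / grid;
            if (insidePrimitive(sx, sy))
                ++hits;
        }
    return static_cast<float>(hits) / (grid * grid);  // 0..1 coverage estimate
}

The more sub-samples per pixel, the closer you get to the "true" coverage, which is why the cost of good AA climbs so quickly.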
 
Could the X360 and PS3 do photo-realistic graphics at lower resolutions / SDTV res? No way, and neither could the future X720 or PS4.

However, I believe many areas of graphics could be improved. Going down to 480p / 480i resolution would lift much of the bandwidth and fillrate burden, allowing for constant 60 fps framerates in games that run at 30 fps at HDTV resolutions, and perhaps also somewhat more complex graphics and effects thanks to the bandwidth, fillrate and RAM that would otherwise be chewed up reaching HDTV resolutions.
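Just to put a rough number on that saving (pure pixel-count arithmetic, ignoring vertex work, CPU load and everything else that doesn't scale with resolution):

Code:
#include <cstdio>

int main()
{
    const long hd = 1280L * 720;  // 921,600 pixels per frame at 720p
    const long sd = 640L * 480;   // 307,200 pixels per frame at 480p
    std::printf("720p has %.1fx the pixels of 480p\n",
                static_cast<double>(hd) / sd);  // prints 3.0
    return 0;
}

So anything that scales per pixel (pixel shading, color/Z bandwidth, framebuffer RAM) has roughly a 3x bigger budget per pixel at 480p, which is where the hoped-for 60 fps and extra effects would come from.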


I think that, given enough CPU/GPU power and RAM, a machine could render in realtime something that rivals movie, videogame or television-show quality CGI and display it on SDTVs. It's a problem of computing performance and rendering power; resolution is only a small part of it. Sure, you'd need to render internally at higher than SDTV resolution, but that's just one piece.

CGI looks so good because the vertex/poly counts in the models and environments are orders of magnitude higher than what is used in games. There is far more AA sampling than in any game -- probably 64x AA or more. The textures and shaders are far more detailed and complex, much better lighting models are used (often global illumination, even without raytracing), plus motion blur, lots of post-processing effects, and probably other things I'm not aware of.

Suppose there were a console that had, say, ~50 times what Xbox 360 or PS3 can do in terms of graphics. Developers could then probably do the CGI found in the best videogame intros and cutscenes in realtime. If devs had ~500 to ~5000 times the performance of Xbox 360 or PS3, they could reach movie-quality CGI in realtime (sans raytracing). I'm just guesstimating how much power is needed, but the point is, given enough CPU/GPU power and RAM, at 480i / SDTV res game graphics could look VASTLY superior to what we are currently seeing at 1080p, 1080i or 720p on PS3 and X360. Though we're never going to see truly photorealistic graphics in realtime in our lifetimes; what will happen is that we will see games that seem to lean more and more towards looking photorealistic, but we will never see truly photorealistic games.
 
Though we're never going to see truly photorealistic graphics in realtime in our lifetimes; what will happen is that we will see games that seem to lean more and more towards looking photorealistic, but we will never see truly photorealistic games.

I agree, and think about the work the artists have to do to try to model things as detailed as the real world...
 
Suppose there were a console that had, say, ~50 times what Xbox 360 or PS3 can do in terms of graphics. Developers could then probably do the CGI found in the best videogame intros and cutscenes in realtime. If devs had ~500 to ~5000 times the performance of Xbox 360 or PS3, they could reach movie-quality CGI in realtime (sans raytracing). I'm just guesstimating how much power is needed, but the point is, given enough CPU/GPU power and RAM, at 480i / SDTV res game graphics could look VASTLY superior to what we are currently seeing at 1080p, 1080i or 720p on PS3 and X360.
Given that much more power, at HDTV res, game graphics would look VASTLY superior to what we are currently seeing at 1080p, 1080i or 720p! You don't have to go down to SDTV res to benefit from 50 to 5000x the power! Personally, I'm not sure whether the lower demands of SDTV, allowing more work per pixel, will be offset by the lower image quality of rendering at that resolution. Will photoaccurate lighting, shading and modelling with shimmer and aliasing look better than just-under-photoaccurate rendering with higher image quality? I don't think so. I guess the main concern is how big people's TVs are and how rough SD looks at their viewing distances. For a lot of folk, there's probably not much visible difference between SDTV and HDTV on their AV hardware!
 
Textures and model detail are still problems on PS3 and X360... resolution-wise they both look super sharp, but the models and textures still look too basic.:cry: I agree that resolution needs to be balanced with proper models and textures... oh, and jaggies are unacceptable in 2007, yet I see them on both systems...:devilish:
 
Textures and model detail are still problems on PS3 and X360... resolution-wise they both look super sharp, but the models and textures still look too basic.:cry: I agree that resolution needs to be balanced with proper models and textures... oh, and jaggies are unacceptable in 2007, yet I see them on both systems...:devilish:

True. I refuse to buy any PC game that won't run with AA on. It makes a hell of a difference.
 
You will get jaggies! Even at 4xAA, and usually you'll be looking at 2xAA. Jaggies are reduced, but we won't be rid of them for a while yet. Model detail I think is very good. People forget how much of the triangle power of these machines goes towards behind-the-scenes stuff. 2+ million triangle scenes aren't to be sniffed at!
 
The little screenshots I have seen for Lost Odyssey seem to be shooting for a very realistic style. At 400x300, the tiny pictures look like real-life photos to me.

But I have a feeling it will have a very static camera system like the old RE games.

So far the most photo-realistic-at-SD game to me is GTHD.
 
Given that much more power, at HDTV res, game graphics would look VASTLY superior to what we are currently seeing at 1080p, 1080i or 720p! You don't have to go down to SDTV res to benefit from 50 to 5000x the power! Personally, I'm not sure whether the lower demands of SDTV, allowing more work per pixel, will be offset by the lower image quality of rendering at that resolution. Will photoaccurate lighting, shading and modelling with shimmer and aliasing look better than just-under-photoaccurate rendering with higher image quality? I don't think so. I guess the main concern is how big people's TVs are and how rough SD looks at their viewing distances. For a lot of folk, there's probably not much visible difference between SDTV and HDTV on their AV hardware!

I agree you don't have to go down to SDTV resolutions to get the benefit from 50 to 5000x power. I was actually talking about two different things in one post.

With that much more power, there would be no need to go down to SDTV resolutions. But with the current consoles, Xbox 360 and PS3, it would benefit graphics to use SDTV resolutions, at least to a certain extent (framerate for sure).

What I was saying about SDTV resolution is that it is possible to have CGI-quality graphics displayed at that resolution -- pop in a DVD or even a VHS tape of your favorite CGI movie and you'll see graphics that are orders of magnitude beyond any X360/PS3 game at HDTV resolution. So SDTVs are still capable of displaying incredible graphics; realtime graphics have not come close to maxing out what can be shown on an SDTV. It's a matter of CPU/GPU power and RAM.
 
You will get jaggies! Even at 4xAA, and usually you'll be looking at 2xAA. Jaggies are reduced, but we won't be rid of them for a while yet.

I agree. To really get rid of jaggies, games need at least 8x AA; to make things super smooth, 16x AA would be great. I think it would be a waste of processing power, bandwidth, etc. to go beyond 16x AA in games, but I certainly believe that 4x AA is not enough to completely eliminate jaggies, much less make things look really smooth like CGI.

If CGI uses 32x or 64x AA and above, I think realtime should use 8x or 16x AA.
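For a sense of why sample counts beyond 8x or 16x get expensive, here's a rough upper-bound estimate of multisample buffer sizes at 720p, assuming a naive uncompressed buffer with 32-bit color and 32-bit Z per sample (real GPUs compress these buffers, so treat the numbers as worst case):

Code:
#include <cstdio>

int main()
{
    const long w = 1280, h = 720;
    const int sampleCounts[] = {2, 4, 8, 16, 32, 64};
    for (int s : sampleCounts)
    {
        long bytes = w * h * s * (4 + 4);  // 4 bytes color + 4 bytes depth per sample
        std::printf("%2dx AA at 720p: ~%ld MB\n", s, bytes / (1024 * 1024));
    }
    return 0;
}

That works out to roughly 56 MB at 8x and 112 MB at 16x before compression, which on a console with a few hundred MB of RAM is a big chunk of the budget.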
 
If CGI uses 32x or 64x AA and above, I think realtime should use 8x or 16x AA.
There's a simple reason why 16xAA in realtime will probably never happen -- it's a moving target. When hardware has the bandwidth and the ability to do it effectively at 640x480 on just about any content, sampling god knows how much stuff that many more times (even if it's straight-up SSAA), everybody wants to play at 1280x1024. When we get the bandwidth to do it at 1280, everybody will want to play at 2048x1536. Ready for 2048? Oops... sorry, the standard PC gaming resolution is 3840x2160 now. And so on, and so on, until a physical on-display pixel is a single quantum dot crystal... and people will still complain about jaggies.

Besides which, lack of AA killing the immersion is a far cry from the notion that AA will provide it. I daresay it's one of the more insignificant variables in achieving realism. Though I must say, it's rather amusing hearing everybody talk as if pixelated images had done them some grievous personal wrong.
 
There's a simple reason why 16xAA in realtime will probably never happen -- it's a moving target. When hardware has the bandwidth and the ability to do it effectively at 640x480 on just about any content, sampling god knows how much stuff that many more times (even if it's straight-up SSAA), everybody wants to play at 1280x1024. When we get the bandwidth to do it at 1280, everybody will want to play at 2048x1536. Ready for 2048? Oops... sorry, the standard PC gaming resolution is 3840x2160 now. And so on, and so on, until a physical on-display pixel is a single quantum dot crystal... and people will still complain about jaggies...

But do you think this holds true for the console space? It seems to me that 1920 X 1080 is the max resolution needed for consoles for quite some time. With current seating distance to screen size ratios in current homes, there doesn't seem to be a big need to go higher than 1080p for displays. I suppose at some point, when OLED screens can wallpaper an entire wall (a la Total Recall; remember the scene where he is eating breakfast 3 feet from a roughly 12' x 15' screen), there will be a push for higher and higher resolutions.
Until then, it seems like 1080p may be a ceiling for home displays. I would anticipate next-gen consoles (X720, PS4) will be targeting 1080p, and that few eyebrows would be raised at the odd game developer here or there deciding to do 720p with more eye candy.
No?
 
There's a simple reason why 16xAA in realtime will probably never happen -- it's a moving target. When hardware has the bandwidth and the ability to do it effectively at 640x480 on just about any content, sampling god knows how much stuff that many more times (even if it's straight-up SSAA), everybody wants to play at 1280x1024. When we get the bandwidth to do it at 1280, everybody will want to play at 2048x1536. Ready for 2048? Oops... sorry, the standard PC gaming resolution is 3840x2160 now. And so on, and so on, until a physical on-display pixel is a single quantum dot crystal... and people will still complain about jaggies.

Well, I sort of agree, in that the above certainly described the past reality of the situation. However, our eyes do not have infinite resolution, and even 1080p TVs at "further than optimal distance" from the viewer are approaching the viewing angle per pixel that our eyes can resolve. It is likely that the next step in HDTV ('UHDTV' or whatever) will reach or be very near that limit for almost all normal installations. For home viewing on a PC, and I haven't done the math for a 22" screen at normal seating distances, it may take 3840x2160 or such... but the point is that there is an upper limit to what resolution is humanly discernible at a distance.

Which means two things... first, within our lifetimes we'll see a halt to consumer resolution increase and both displays and media will stabilize on that "good enough" resolution. Second, if a single pixel is the discernible limit, AA will be much less beneficial, though to be honest I don't know if the current studies arrived at that limit using high contrast neighboring pixels or typical program information (which is essentially anti-aliased out the wazoo). So, depending on how the viewing angle acuity has been measured, I'd expect a move away from AA at those uber-high resolutions, or for developers to actually target less than display maximum resolutions with MSAA or other "smart" AA technique such that the result was still near the range of our acuity but saved bandwidth by not rendering at full resolution (and of course using that power for other gain).

[edit: looks like acuity research uses high contrast "pixels" so to speak, so our acuity would be about there for worst case gaming, and much reduced for typical recorded images]
 
Well, I sort of agree, in that the above certainly described the past reality of the situation. However, our eyes do not have infinite resolution, and even 1080p TVs at "further than optimal distance" from the viewer are approaching the viewing angle per pixel that our eyes can resolve. It is likely that the next step in HDTV ('UHDTV' or whatever) will reach or be very near that limit for almost all normal installations. For home viewing on a PC, and I haven't done the math for a 22" screen at normal seating distances, it may take 3840x2160 or such... but the point is that there is an upper limit to what resolution is humanly discernible at a distance.
It's gonna be higher than that, something like ~11000x8000; the human eye goes down to 600 dpi.
True, for TVs it's not so important. Then again, I've seen kids sitting on the carpet like 2 meters away from a huge screen whilst playing games (I believe when people play games they tend to sit closer to the TV screen than when they watch TV).
 
It's gonna be higher than that, something like ~11000x8000; the human eye goes down to 600 dpi.
True, for TVs it's not so important. Then again, I've seen kids sitting on the carpet like 2 meters away from a huge screen whilst playing games (I believe when people play games they tend to sit closer to the TV screen than when they watch TV).
Using a popular number for 20/20 visual acuity of the eye, being about one minute of arc of resolution (1/60th of a degree), even a 60" diagonal 1080p HDTV at just 8 feet viewing distance is at that limit.
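Just to show the arithmetic behind that (my own back-of-the-envelope, assuming a 16:9 panel and looking only at the horizontal pixel pitch):

Code:
#include <cmath>
#include <cstdio>

int main()
{
    const double pi = 3.14159265358979323846;
    const double diagonal = 60.0;   // inches
    const double distance = 96.0;   // 8 feet, in inches
    // Width of a 16:9 panel from its diagonal: ~52.3"
    const double width = diagonal * 16.0 / std::sqrt(16.0 * 16.0 + 9.0 * 9.0);
    // Horizontal field of view subtended by the screen: ~30.5 degrees
    const double fovDegrees = 2.0 * std::atan(width / (2.0 * distance)) * 180.0 / pi;
    const double arcminPerPixel = fovDegrees * 60.0 / 1920.0;
    std::printf("arc minutes per pixel: %.2f\n", arcminPerPixel);  // ~0.95, right at 20/20 acuity
    return 0;
}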

More recent work including post-processing retinal and neuronal effects has pushed the maximum acuity of the human eye closer to 1/150th of a degree of arc resolution (obviously some people have better than 20/20 vision). That's a little better than double the above, so yeah... I think one more major step that about doubles 1920x1080 HDTV (maybe a little more if we could get there in one step and be done with it) to establish a UHDTV resolution should be pretty sweet. Most people probably don't sit much closer than 8 feet to a 60" HDTV... even your example of kids on the floor was over 6 feet. We don't need 19000 x 10000 resolution in consumer TVs... our eyes just aren't that good. There are some arguments that resolutions beyond what our retinal cells and processing can differentiate (using the Rayleigh criterion) still provide more information to our eye, resulting in a less "annoying" picture owing to better stochastic sampling (a sort of neuronal AA) by the eye, even if we can't consciously tell a tester whether there are two dots or one.

Perhaps, but I think there are more important things to worry about once we get to about double HDTV... like adding the critical 3rd dimension. Of course, if you have a 10 foot diagonal projection setup and sit 10 feet away, I'm sure you could use a little more than double HDTV, but in that case you are way, way past the recommended viewing angle for optimal enjoyment anyway.

BTW, I'm not sure what it means that our eyes "go down to 600dpi"... at what viewing distance? That's kinda important.
 
But do you think this holds true for the console space? It seems to me that 1920 X 1080 is the max resolution needed for consoles for quite some time. With current seating distance to screen size ratios in current homes, there doesn't seem to be a big need to go higher than 1080p for displays.
That's probably true for consoles, but I don't think bandwidth or "smarter" AA techniques will scale up quickly enough to make a difference before the whole UHDTV transition happens either. There's probably a point where it will all come to a head in terms of the displays, but there will be other failings in the hardware architectures before then (e.g. rasterization is insufficient altogether), and there will probably have to be a transition on a different level.

Well, I sort of agree, in that the above certainly described the past reality of the situation. However, our eyes do not have infinite resolution, and even 1080p TVs at "further than optimal distance" from the viewer are approaching the viewing angle per pixel that our eyes can resolve.
Yeah, but I don't think that will actually stop people's complaining. When you get down to that sort of resolution, "jaggies" won't be a result of resolution but of bad filtering. I don't think there will ever be a day when we can render something that isn't an instantaneous time-slice (and I'm including the fact that motion blur attempts will never look convincing), and that means hard edges, and that will still have the psychosomatic effect of jaggies or ringing or other such artifacts being present whether they're really physically discernible or not.

I'd expect a move away from AA at those uber-high resolutions, or for developers to actually target less than display maximum resolutions with MSAA or other "smart" AA technique such that the result was still near the range of our acuity but saved bandwidth by not rendering at full resolution (and of course using that power for other gain).
Well, I still find it funny that people think that reducing resolution will account for so much on the rendering side. If everything in all of graphics was fillrate limited, then sure, but that's not really the case. It won't mean much to vertex processing or lighting and shadows and a lot of other things that account for a whole lot more in bringing you closer to "photorealistic graphics". Artistically, it can mean a few things, namely where things like texel fillrate and blending operations all play a role, but that's more on the side of "aesthetically pleasing" than "realistic." On the software side, well, that's a different story (assuming it frees up memory and time per frame), but it's all indirect and not guaranteed to be significant.
 
That's probably true for consoles, but I don't think bandwidth or "smarter" AA techniques will scale up quickly enough to make a difference before the whole UHDTV transition happens either...

Well, I still find it funny that people think that reducing resolution will account for so much on the rendering side. If everything in all of graphics was fillrate limited, then sure, but that's not really the case.
Those statements seem a bit at odds to me. Maybe I just didn't understand you completely. For one, would reducing resolution not reduce the number of pixels going through shader routines?

But my point was that even if the architecture supported fillrate and shader power for UHDTV, devs might choose a lower resolution with MSAA or similar to leave a bit of spare bandwidth and computational resources for other effects. Not that the extra effects would be dramatic, or that the extra bandwidth would make or break the game, but simply that if UHDTV was beyond our visual acuity then a lower (but still high) resolution + AA would still be near our limits and it might leave a little breathing room. As you said, you don't think hardware will scale quickly enough before we make that next jump, so devs might well be faced with that decision (much like current ones are about 720p + AA vs. 1080p and no AA).

Yeah, but I don't think that will actually stop people's complaining. When you get down to that sort of resolution, "jaggies" won't be a result of resolution but of bad filtering. I don't think there will ever be a day when we can render something that isn't an instantaneous time-slice (and I'm including the fact that motion blur attempts will never look convincing), and that means hard edges, and that will still have the psychosomatic effect of jaggies or ringing or other such artifacts being present whether they're really physically discernible or not.
That's probably true. I would hope that a forward looking UHDTV resolution would include support for framerate increases as the display and broadcast technologies matured without breaking backwards compatibility. UHDTV @ 120Hz would probably make most people very happy... I know it would me!

Then perhaps we could get started adding back that whole extra dimension we have neglected in displays for decades. :)
 
Remember back in the old days when some games supported the ability to render at a higher resolution than displayed? Why can't the hardware render at some stratospheric resolution (basically SSAA, right?) and then resize the picture down to the display resolution? The PR people could use the Resolution Independence buzzword.
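That is basically ordered-grid SSAA, and the resize step itself is nothing exotic. Here's a minimal sketch of the idea (a plain CPU box filter over a single-channel image, purely illustrative; the downsample name is my own, and real hardware would do the resolve with dedicated, fancier filtering):

Code:
#include <cassert>
#include <vector>

// Average each factor x factor block of the high-resolution image down to one
// output pixel; 'src' holds one intensity value per pixel, row-major.
std::vector<float> downsample(const std::vector<float>& src,
                              int srcWidth, int srcHeight, int factor)
{
    assert(static_cast<int>(src.size()) == srcWidth * srcHeight);
    const int dstWidth = srcWidth / factor;
    const int dstHeight = srcHeight / factor;
    std::vector<float> dst(static_cast<size_t>(dstWidth) * dstHeight);
    for (int y = 0; y < dstHeight; ++y)
        for (int x = 0; x < dstWidth; ++x)
        {
            float sum = 0.0f;
            for (int j = 0; j < factor; ++j)          // sum the factor x factor block
                for (int i = 0; i < factor; ++i)
                    sum += src[(y * factor + j) * srcWidth + (x * factor + i)];
            dst[y * dstWidth + x] = sum / (factor * factor);
        }
    return dst;
}

The catch, as noted above, is that rendering at 2x or 4x the display resolution costs 4x or 16x the fillrate, bandwidth and framebuffer memory, which is exactly why hardware moved to MSAA-style tricks instead.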
 
BTW, I'm not sure what it means that our eyes "go down to 600dpi"... at what viewing distance? That's kinda important.
Very close :) i.e. a piece of paper that people can discern details on.
BTW, sorry if my post above wasn't so clear. I did say "true for TVs", but I, and I'm sure lots of others, don't have TVs but monitors, where the viewing distance is usually less than a meter (there's nothing like watching horror films at that distance).
 