PS3 and 360: Would lower resolutions allow for photo-realistic graphics?

Remember back in the old days when some games supported the ability to render at a higher resolution than displayed? Why can't the hardware render at some stratospheric resolution (basically SSAA, right?) and then resize the picture down to display resolution? The PR people can use the Resolution Independence buzzword.

Because SSAA is much more expensive than MSAA. If that were an easy thing to do, devs wouldn't have to make the decision between 1080p and 720p today... they'd just render at 2560p and resample the image (SSAA).
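To put rough numbers on that cost, here's a back-of-the-envelope sketch in Python (the resolutions are just illustrative, not any dev's actual targets):

```python
# Pixel counts: shading and fill cost scale roughly linearly with the
# number of pixels rendered, so supersampling multiplies the whole
# per-pixel workload.
resolutions = {
    "720p":      (1280, 720),
    "1080p":     (1920, 1080),
    "2560x1440": (2560, 1440),  # render here, downsample to 720p = ~4x SSAA
}

base = 1280 * 720  # 720p as the baseline
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>9}: {pixels:>9,} pixels ({pixels / base:.2f}x 720p's shading work)")
```

So downsampling from 2560x1440 would cost roughly four 720p frames' worth of per-pixel work, which is exactly the trade MSAA exists to avoid.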
 
That, and you'd find your available RAM approaching zero a lot quicker. Zomg, 64x64 resolution textures! :eek:
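A quick worked example of the memory side (assuming a plain double-buffered 32-bit colour + 32-bit depth/stencil setup; real consoles complicate this with eDRAM, tiling, etc.):

```python
def framebuffer_mb(w, h, color_buffers=2, bytes_color=4, bytes_depth=4):
    """Colour buffers plus one depth/stencil buffer, in MB."""
    return w * h * (color_buffers * bytes_color + bytes_depth) / (1024 ** 2)

for name, (w, h) in [("720p", (1280, 720)),
                     ("1080p", (1920, 1080)),
                     ("2560x1440", (2560, 1440))]:
    print(f"{name:>9}: ~{framebuffer_mb(w, h):.0f} MB of framebuffer")
```

On a console with 512 MB total, going from ~11 MB to ~42 MB of framebuffer is memory that comes straight out of the texture budget.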
 
Those statements seem a bit at odds to me. Maybe I just didn't understand you completely. For one, would reducing resolution not reduce the number of pixels going through shader routines?

Well, I wasn't disputing that you can do more because there are fewer pixels to draw and fewer pixels to shade.

I was basically saying that getting greater realism includes a hell of a lot of things that aren't necessarily affected by fillrate. e.g. being able to process denser meshes, being able to move more information per vertex (i.e. more interpolators), more lights and shadows per pass, being able to sample information about the environment from every point, temporal antialiasing...

Things that very clearly benefit at lower resolutions include more texture layers and texture blending (texel fillrate), alpha blending (pixel fillrate), and cases where you're clearly pixel-shading limited (and not hitting limits on anything else, because pixel shading is by far the greatest cost).
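A crude way to see the pixel-shading-limited case (all the numbers here are made up purely for illustration, not any real GPU's specs):

```python
# If pixel shading dominates, frame time is roughly:
#   shaded pixels * shader cost per pixel / shader throughput
def shading_time_ms(w, h, overdraw=2.5, cycles_per_pixel=40,
                    pixel_pipes=8, clock_hz=500e6):
    shaded_pixels = w * h * overdraw
    return shaded_pixels * cycles_per_pixel / (pixel_pipes * clock_hz) * 1000

print(f"720p : {shading_time_ms(1280, 720):.0f} ms/frame (~43 fps)")
print(f"1080p: {shading_time_ms(1920, 1080):.0f} ms/frame (~19 fps)")
```

Dropping the resolution scales that cost directly, which is why the purely pixel-shading-limited case benefits so cleanly.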
 
I really think that high resolution came a little late to consoles.
I have been playing at resolutions higher than 1280x1024 on PC for many years now, and I really think it makes a huge difference. When you put an Xbox game that is as good as the PC version on the highest settings next to it (it's rare, but it happens), you will see that the high resolution on PC makes all the difference, even if the textures, effects, etc. are the same.

I guess it depends a lot on how good your eyes are, but consoles still should have used 720p as standard last gen (of course they would have had to come out with more horsepower), which they could have. Higher definition is just a part of games' evolution like anything else.

When I bought my 360 and started playing on an SDTV, with Gears of War as my first game, I could only see the high-resolution textures when I would stand right next to a wall and zoom in with the pistol. That's when I decided I should buy a VGA adapter to connect it to my monitor and play at 720p. The result was all that detail visible from big distances, and it really made a big difference.

Though 1920x1080 is still too much for this gen's hardware, at least 720p should be standard.
 
I don't think Sony could have done a 720p console seven years ago and still made it somewhat affordable. Besides that, HDTVs have only become somewhat standard in the last few years, so for most of that time 99% of people wouldn't even have had a use for a 720p console.
 
One thing to remember here when doing comparisons between HD and SD is to use the proper TV.

A 480i CRT will look an order of magnitude better at SD resolution than an LCD made to show 480i.

I have a 720p plasma and a 720p CRT HDTV, and neither of them comes close to a 480i SD TV on native SD material.

The point I'm trying to make is that I see a lot of comparisons between games at HD res and SD res, but on the same TV. This is wrong. An SD game will look much better on a proper old SD TV.

Also, many games right now use only 2x MSAA or no AA at 720p or higher, but at SD res they use 4x, and some of them also get some SSAA via the downscaler. These look much cleaner on an SDTV than when shown on an HDTV, at the expense of some detail, which IMO would also be hard to spot if you view the TV from a proper distance, especially for a casual gamer.
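(That downscaler trick is essentially a box-filter supersample: the scaler averages neighbouring rendered pixels into each output pixel. A toy sketch of the idea, not any console's actual scaler:)

```python
def box_downsample(img, factor=2):
    """Average each factor x factor block into one output pixel; the
    averaging is what smooths edges (effectively ordered-grid SSAA)."""
    out = []
    for y in range(0, len(img) - len(img) % factor, factor):
        row = []
        for x in range(0, len(img[0]) - len(img[0]) % factor, factor):
            block = [img[y + dy][x + dx]
                     for dy in range(factor) for dx in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

# A hard white/black edge that doesn't land on a block boundary comes
# out as an intermediate grey, i.e. an antialiased edge:
frame = [[255, 255, 255, 0]] * 4
print(box_downsample(frame))  # [[255.0, 127.5], [255.0, 127.5]]
```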

Just my opinion after swapping between some four types of TVs in a matter of days.
 
If not HD, then at least ED, like 1024x768. So many "weak" PCs at the time of the PS2's release could run way superior games at way higher res than consoles.
They could have left 480i behind and moved to something at least a little better, with a little more power in the hardware and a little less profit for Sony; I think they made quite enough with the PS2.
 
Given the abundance of SDTVs, rendering at a higher res would have been of dubious value, and the bigger framebuffer would have left even less memory for everything else.
 
Like Al said, what is the use of that when you don't have TVs that use that resolution? Not to mention that PCs capable of doing those resolutions were a lot more expensive than Sony/MS/Nintendo were willing to pay for a console (and, in the end, us).
 
Well, games these days really are not so much render output processor (ROP) bound as they are bound by the pixel/vertex/texture hardware instead. Of course you CAN see framerate jumps at smaller resolutions, which does mean the ROPs are having their workload lifted.

Call of Duty 2, for instance, can run in DX9 mode quite nicely at 640x480 on my GeForce Go 7200 powered lappy, but the framerate goes to hell with each level I up the resolution. CoD2 just seems to be very ROP bound.
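A simple way to sanity-check that kind of diagnosis (the frame times below are made up, just to show the method): if frame time scales almost linearly with pixel count, you're bound somewhere in the per-pixel back end (ROPs, fill, or pixel shading); if it barely moves, you're vertex- or CPU-bound.

```python
samples = [  # (width, height, measured ms per frame) -- hypothetical numbers
    (640, 480, 16.0),
    (800, 600, 24.2),
    (1024, 768, 40.1),
]

base_px = samples[0][0] * samples[0][1]
base_ms = samples[0][2]
for w, h, ms in samples:
    print(f"{w}x{h}: pixels x{w * h / base_px:.2f}, frame time x{ms / base_ms:.2f}")
```

Near-1:1 scaling like that is consistent with a fill/ROP (or pixel-shader) bottleneck; separating the ROPs from the shaders would then take a second test, e.g. varying shader complexity at a fixed resolution.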
 
Lower resolutions do reduce the load on the pixel shader hardware too. Fewer pixels to affect with the shader programs.
 
I would imagine so, but the G72 core in the GeForce Go 7200 only has 2 ROPs, so I think you can get a much better idea of the issue with this GPU core.
 