720p vs 1080p performance hit?

Welcome to the wonderful world of the PC, where you have to target your graphics well below what's possible because of the split userbase.

OK, it wouldn't be as bad, but one of the reasons console games can look so good on the hardware they have is that developers can pick a single target and optimise for it.

If frame rate isn't a consideration, as in most PC games, then sure, you could do it.
I don't think it's an accurate comparison.

They target graphics below what's possible because of the variability in the hardware specs each gamer has. Not everyone has the same, or equally powerful, hardware.

On the other hand, all console owners own the exact same hardware.

So unlike PCs, on consoles it's purely a matter of choice. It has nothing to do with some groups not having the appropriate hardware and thus being forced to play with lower graphics.

Also, in PC games users tend to choose both high resolution and high detail when they have the hardware for it, or both low resolution and low detail when they don't (or other variations). The console example works differently though: it says developers will use the same hardware either for more detail at a lower resolution, or for more resolution with less detail.
 
Looking at the graphics subsystem only, is it a fair generalisation to say that, in an eye-candy-filled game, the likely bottleneck for 1080p with 4xAA at 60fps is pixel shading rather than rendering/ROP bandwidth?

At some level it's a self-fulfilling prophecy...
If I'm ROP or destination bandwidth limited, adding complexity to my pixel shaders is free, so I will...

Having said that, there is no general 'limited by X' scenario. Except in extremely simple scenarios, there will be parts of a frame where the game is pixel limited, parts where it's ROP limited and parts where it's vertex limited. It's one of the reasons that measuring and optimisation are so hard: you have to understand what the bottlenecks are for the various tasks, how to measure them and how to address them. There is no one single bottleneck for most games.
 
Nesh said:
I don't think it's an accurate comparison.
It's accurate enough.
You want to introduce additional variables into the mix, and that invariably leads either to increased time and cost of development or to targeting the lowest common denominator, where the additional variables have reduced or no noticeable impact.

Or in the worst case, one option will be offered as a complete afterthought, with little or no testing/work done to ensure its quality. History has already shown countless examples of the latter (PAL conversions, most 16:9 support, ProScan support, and in the new generation, SDTV support on some 360 games...).

DarkRage said:
So what about a not-so-complex choice between 720p@60fps and 1080p@30fps?
At the very least, 1080p consumes more memory (memory which then isn't available in the 720p mode either), so you're already looking at a lowest-common-denominator case. With two differently performing modes, chances are you'd also have to run complete QA for both, which means most publishers would never go along with the idea in the first place.
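To put a rough number on the memory point, here's a back-of-the-envelope sketch (the bytes-per-pixel and AA figures are just illustrative assumptions, not any real title's render-target setup):

```python
# Rough framebuffer memory comparison, 720p vs 1080p.
# Assumes 4 bytes/pixel colour + 4 bytes/pixel depth/stencil (illustrative only).

def framebuffer_mb(width, height, aa_samples=1, bytes_per_pixel=4 + 4):
    """Very rough framebuffer footprint in MB (colour + depth, multiplied by AA samples)."""
    return width * height * aa_samples * bytes_per_pixel / (1024 * 1024)

for label, w, h in [("720p", 1280, 720), ("1080p", 1920, 1080)]:
    for aa in (1, 4):
        print(f"{label} {aa}xAA: {framebuffer_mb(w, h, aa):.1f} MB")
```

That's roughly 7 MB vs 16 MB without AA, and around 28 MB vs 63 MB with 4xAA, before textures and everything else; the exact split obviously depends on the formats and on things like the 360's eDRAM tiling.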
 
I am hoping MS will give us more details on how they will accomplish this at X06, and maybe a few titles to showcase 1080p.
 
So what about a not-so-complex choice between 720p@60fps and 1080p@30fps?
1080p@30fps is harder than 720p@60fps,
so a game running at 720p@60fps is not guaranteed to run at 1080p@30fps.
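A quick bit of arithmetic shows why; this only counts raw pixels pushed per second and ignores bandwidth, memory and where the per-frame versus per-pixel work sits:

```python
# Pixels pushed per second at each mode - raw fill/shading work only.
modes = {
    "720p@60":  (1280, 720, 60),
    "1080p@30": (1920, 1080, 30),
    "1080p@60": (1920, 1080, 60),
}
for name, (w, h, fps) in modes.items():
    print(f"{name}: {w * h * fps / 1e6:.1f} million pixels/s")
# 1080p@30 already needs ~12% more pixel throughput per second than 720p@60:
# the frame time doubles, but the per-frame pixel count grows by 2.25x.
```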

I am hoping MS will give us more details on how they will accomplish this at X06
I believe MS only mentioned this so it looks good in a bullet-point comparison with the PS3; expect only a few token simple games at most.
(Similar to backwards compatibility - it looks good in a comparison.)

PS3 - is BC with PS1/PS2 titles ------ X360 - is backward compatible with Xbox 1 titles

This is plainly not true: the Xbox 1 titles that run on the X360 are ports.
You don't hear people say that just because RE4 was later available on the PS2, the PS2 is BC with the GC :LOL:
 
I have a question.

If you have a 720p game with appropriately sized textures (in terms of resolution) and then decide to go to 1080p, do you have to literally double the size of your textures to keep the same sharpness and detail? If so, does this literally mean that the footprint of the textures in RAM will actually double?

Basically, I ask this because many people have said one potential problem with 1080p games would be the large framebuffer. However, if you have to increase the size, and therefore the memory footprint, of your textures, I could see that creating even more of a *potential* problem than the larger framebuffer.
 
I have a question.

If you have a 720p game with appropriately sized textures (in terms of resolution) and then decide to go to 1080p, do you have to literally double the size of your textures to keep the same sharpness and detail? If so, does this literally mean that the footprint of the textures in RAM will actually double?
No, visually the textures' appearance will be practically the same; what increasing the resolution buys you is less aliasing.

Actually, you can try this yourself: take a screenshot of a game (on the PC) at 640x480, then choose 1280x960 in the menu and take another screenshot, resize the images so they're both the same size, and then compare them.
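If you want to make that comparison a bit more systematic, a small sketch along these lines (using the Pillow imaging library; the filenames are placeholders) resizes the high-res shot down to the low-res size and puts the two side by side:

```python
# Compare two screenshots of the same scene taken at different resolutions.
# Filenames are placeholders - substitute your own captures.
from PIL import Image

low = Image.open("shot_640x480.png")
high = Image.open("shot_1280x960.png")

# Downscale the high-res shot to the low-res size so the texture detail is comparable;
# mainly the edge/aliasing quality should differ noticeably.
high_resized = high.resize(low.size, Image.LANCZOS)

# Paste the two images side by side for easy comparison.
combined = Image.new("RGB", (low.width * 2, low.height))
combined.paste(low, (0, 0))
combined.paste(high_resized, (low.width, 0))
combined.save("comparison.png")
```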
 
Hello Zed,

Thanks for responding to my question. I am not quite sure that I am asking it correctly. For example, let's say I have a 512x512 texture. If I double the resolution of the game, don't I have to double the size of that texture to make sure it looks just as sharp?

For example, let's say you have one set of textures for a game that's running at 720p, and then without changing them at all you increase the resolution to 1080p. Are you saying the textures will actually look the same at 1080p? I really don't understand how that could be the case. It seems to me that a lower resolution texture would start looking blurry.
 
Are you saying the textures will actually look the same at 1080p? I really don't understand how that could be the case.
Because of the screen size. Doubling the resolution doesn't double the size of the polygon you see. Imagine a 40" 720p LCD set displaying a 512x512 texture up close, so that 128 texels fill the vertical. You're going to see texture filtering blurring the oversized texels. Now if you output the same image to a 40" 1080p set, the texels stay the same physical size on screen; there are just more pixels within each one.

1080p gets you higher resolution, which means higher fidelity both on polygon edges and on textures whose texels are smaller than a pixel. You don't need to increase the texture resolution unless you want higher quality texturing. e.g. if a 512x512 texture in your game gives you a 1:1 texel-to-pixel mapping at 720p, then to get a better image at 1080p you'd want higher resolution textures. If you keep the same textures, quality will be no worse than at 720p.

1080p does cost shader performance though. If you have to shade 2.25x as many pixels, you have correspondingly less shader time to spend on each one. It also impacts bandwidth, and you might find you can't handle as many particles, for example. Jaggies can be reduced with AA at 720p while still leaving room for more effects.

I'd have to see 1080p and 720p games next to each other to know which I'd prefer.
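To put a rough number on the 'costs shader performance' point above, the pixel-count ratio falls straight out of the resolutions (illustrative arithmetic only, assuming a fixed 60fps target):

```python
# Pixel count ratio between 1080p and 720p, and what it does to the per-pixel budget.
pixels_720p = 1280 * 720      # 921,600
pixels_1080p = 1920 * 1080    # 2,073,600

ratio = pixels_1080p / pixels_720p
print(f"1080p has {ratio:.2f}x the pixels of 720p")           # 2.25x

# At a fixed 60fps, the per-pixel shading/bandwidth budget shrinks by the same factor.
frame_time_ms = 1000 / 60
print(f"Per-pixel time at 720p60:  {frame_time_ms / pixels_720p * 1e6:.1f} ns")
print(f"Per-pixel time at 1080p60: {frame_time_ms / pixels_1080p * 1e6:.1f} ns")
```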
 
This final question needs to be answered for you to understand the former. On a fixed-resolution display like an LCD screen, each pixel is a discrete entity, a little square of light. When you render a picture, if the difference between some pixels and others is large enough, you'll notice a distinct stepping, or aliasing. Games suffer from this because every pixel on the screen is a single sample from the game: you take a point in the game (a pixel on the screen), determine what colour it should be, and put that on the display.

In a TV picture, a pixel isn't one sample but lots and lots. Consider the case of being inside a car, with dark window borders and a bright outside. Rendered in a game, one pixel will be near black and the next near white, depending on whether that pixel lands on frame or window. In a TV picture, each pixel contains varying amounts of frame and window. You might have a pixel that is half filled with frame, half filled with window; the colour for that pixel is then an average, halfway between dark and light. The pixel carries more information than just one sample. It's this averaging of more than one point that creates 'antialiasing'. Adding more samples when you render a game produces more in-between values, which decreases the contrast between adjacent pixels and so decreases aliasing.

As for 1080p not having aliasing, that's a matter of pixel size. If you had 1080p resolution in a 4" display at arm's length, you wouldn't need AA, as you couldn't notice the individual pixels. If you had a 1080p display that was 87" across, each pixel would be a millimetre in size, and depending on the distance you sit from the screen, that may be noticeable. Generally speaking, we're not, and likely won't be for decades, at a point where pixel resolution is fine enough to remove the need for AA. Jaggies will always be present, because as resolution increases, so does display size. A 14" 1080p display would look pretty jaggy-free at a comfortable viewing distance, but one doesn't exist and probably never will.
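A toy illustration of that 'average of more than one sample' idea: supersampling a hard black/white edge produces the in-between greys that soften the staircase (purely illustrative, and nothing to do with how any particular GPU implements its AA):

```python
# Toy supersampling of a diagonal black/white edge across a row of pixels.
# Each pixel is either point-sampled once (aliased) or averaged over an NxN grid of samples.

def coverage(px, py, samples):
    """Fraction of an N x N sample grid inside the 'white' side of the edge y < 0.3 * x."""
    inside = 0
    for sy in range(samples):
        for sx in range(samples):
            # Sample position at the centre of each sub-cell within the pixel.
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y < x * 0.3:
                inside += 1
    return inside / (samples * samples)

row = 3  # look at one row of pixels crossing the edge
print("1 sample/pixel :", [round(coverage(x, row, 1), 2) for x in range(8, 16)])
print("4x4 samples    :", [round(coverage(x, row, 4), 2) for x in range(8, 16)])
# The single-sample row jumps straight from 0.0 to 1.0 (a hard jaggy step);
# the supersampled row passes through intermediate greys, which is what AA gives you.
```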


I understand what you are saying, but the fact remains that the jaggies at 1080p are a lot less jagged than at 480p or 720p, which must mean it is less of a problem if you are viewing on the same size screen. Also, as you get closer to the pixel resolution of the screen there is usually some degree of pixel averaging due to bandwidth limitations of the display electronics. This will tend to blur the picture slightly, giving a similar effect to sitting further back from the display. What I am wondering is whether this sort of pixel averaging, or software or hardware colour dithering techniques, or some other sort of 2D hardware or software blurring that averages already-rendered pixels (rather than increasing the number of sampling points, which is computationally expensive), might be an acceptable alternative to AA at 1080p. On photos, for example, dithering techniques do a good job of breaking up colour banding and other patterns.

The other reason to have AA is to avoid pattern aliasing (although texture sampling may take care of this). You get this effect on TV when someone wearing a striped shirt is filmed. I understand that TV companies get around it by telling people to avoid wearing striped or regularly patterned shirts in interviews etc. Can't the same be done in games, for example by avoiding regularly spaced patterns or objects in textures that might cause problems? Alternatively, can't textures be substituted at a distance with lower-detail versions so pattern aliasing won't arise? GT4 is a good example of what not to do with regard to pattern aliasing, with lots of wire mesh and overhead cables.
 
Shifty Geezer,

Thank you very much for that explanation. I understand what you were trying to get across now. It's good to know that a 1080p resolution for a game would not automatically increase its texture footprint by requiring higher resolution textures. At least that is one thing we can strike out as NOT a performance hit of 1080p.
 
Having said that, there is no general 'limited by X' scenario. Except in extremely simple scenarios, there will be parts of a frame where the game is pixel limited, parts where it's ROP limited and parts where it's vertex limited. It's one of the reasons that measuring and optimisation are so hard: you have to understand what the bottlenecks are for the various tasks, how to measure them and how to address them. There is no one single bottleneck for most games.

Well, let's limit the discussion to just the performance hit from 720p to 1080p, and not general performance optimisation, which I know is a difficult topic.

So let's assume the game (say, a multiplatform racing game for PC, X360 and PS3) is running great at 720p60 with 4xAA, looking pretty on all platforms. Marketing calls and insists on 1080p. On PC you can let the gamers buy another card; obviously you can't do that on consoles.

Knowing the profile of the game inside out for both X360 and PS3, what factors do you consider when assessing the possibility of 1080p60 while keeping the same features on each console? If you're mostly vertex shader limited, with pixel shading, fillrate and memory to spare, would you consider saying yes to the marketing guy? If not, what kind of conditions would you need before being optimistic about such a request?

On X360, what impact does the increased resolution have on the overall load balancing of vertex and pixel shaders on that unified shader architecture? Does it take a lot of re-engineering to load-balance the game again, or is it a trivial matter? What about increasing the number of tiles to fit the eDRAM? Is that trivial as well when you already have tiling working?
 
I realise my post here is very on-topic in this thread:

http://www.beyond3d.com/forum/showthread.php?p=838266#post838266

The point is that if developers can push higher resolutions like 1080p, it most probably means RSX has time to spare (waiting for data to come from Cell - so it could also be a bandwidth issue, like you say). So instead of leaving the RSX idle, they just go for 1080p.

*snip*

This is all my opinion, based on info we've had around here; I'm sure real developers could explain this much better than I can.

I think it's much simpler than that. Most good developers always take care to develop for an understated performance target. While this holds doubly for games, I myself prefer to develop even a basic office application on an old piece-of-crap PC, so that I can get a good feel for where my application is hitting bottlenecks. If there are insurmountable ones on that PC, fine, I can still fall back on higher-performance PCs afterwards, but most slow bits usually come down to sloppy/inefficient coding.

In the games market, performance is everything and you'll want to use all the available performance you have, but if you don't know what you'll have exactly in the end, you use a fair margin of error. If early developers for the PS3 did the same, the final hardware should nearly always perform better than their worst case scenario, in which case they have performance to spare.

Now comes the point at which they can figure out the best way to spend the extra performance. One really easy way would be to render at a higher target resolution. If you can achieve this without sacrifices, it is typically an easier way to scale your application's performance upwards than adding new effects.

But make no mistake, there will also be developers who make the choice early on between targeting 1080p and 720p. They will weigh the benefits of either against the number of effects they need and so on, and the improved looks that 1080p affords. You can say whatever you like, but with most display areas these days being fairly large, 1080p can indeed make a lot of difference. Someone argued that the amount of memory it takes is 2.25x that of 720p - well, that also means the on-screen detail is that much higher. Now let's put up a little formula to quantify this.

Let's take a few example screens. I have two wide-screen displays that I use frequently for gaming.

One is my PSP, the other is my 82cm Philips 100Hz widescreen SDTV. Let's pick up on the relevant factors here:

Size of screens
PSP: 9.5 x 5.4 = 51.3 cm²
TV: 68 x 38 = 2584 cm²

Resolutions
PSP: 480 x 272 = 130,560 pixels
TV 720p: 1280 x 720 = 921,600 pixels
TV 1080p: 1920 x 1080 = 2,073,600 pixels

Pixels per cm²
PSP: 2545
TV 720p: 357
TV 1080p: 802

So, viewed from the same distance, the PSP's resolution per cm² is about 7 times that of the regular (82cm rated) TV at 1280x720. At 1080p, that goes down to about 3.1 times.
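For anyone who wants to check the arithmetic, the table above falls straight out of the screen dimensions quoted (rounding aside):

```python
# Pixel density per square centimetre for the screens in the post above.
screens = {
    "PSP (9.5x5.4cm)":  (9.5, 5.4, 480, 272),
    "82cm TV at 720p":  (68.0, 38.0, 1280, 720),
    "82cm TV at 1080p": (68.0, 38.0, 1920, 1080),
}

density = {}
for name, (w_cm, h_cm, w_px, h_px) in screens.items():
    density[name] = (w_px * h_px) / (w_cm * h_cm)
    print(f"{name}: {density[name]:.0f} pixels/cm^2")

print("PSP vs TV@720p :", f"{density['PSP (9.5x5.4cm)'] / density['82cm TV at 720p']:.2f}x")
print("PSP vs TV@1080p:", f"{density['PSP (9.5x5.4cm)'] / density['82cm TV at 1080p']:.2f}x")
```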

Now you say that the PSP is typically viewed from much closer than a TV, and of course, that's true. In my particular case, I tend to watch my TV from about 3 meters (300cm), and my PSP from about 30cm. That's 10x the distance. That should make up for the difference, right?

Here is where the percentage of field of view comes in as a useful parameter. At these distances, surprisingly to some I'm sure, the PSP and the TV take up very similar amounts of my full field of view. There's no easier test than holding my PSP at 30cm and seeing how much it covers of my TV 270cm behind it (it helps to close one eye while doing so). The difference turns out to be negligible! This explains why people who watch a UMD or play a game on a PSP can easily forget it is such a small screen: in terms of percentage of field of view, the difference is negligible.

What does this mean for resolution?

At the distances of 30cm and 300cm, the tables turn. Now, suddenly, the PSP is relatively low-resolution, and both 720p and 1080p look comparatively sharper (about 7x at 720p already).

However, for gaming I often sit at 150cm from that TV. At this distance, relative to my field of view, my PSP's screen fits into the TV's screen 6 times. At this distance, 720p looks just slightly worse than my PSP, but 1080p still looks twice as good as my PSP.

Now, I've just been talking about an 82cm widescreen TV (this equates to about 32"). These days, that's certainly not the biggest TV anymore. Game developers and show floors especially tend to work with larger screens at times, on which the increased resolution can become significantly more visible.

I personally think that with the way TVs are made (particularly those with fixed pixels, like LCDs), and with most HD video being natively encoded at 1080p (according to the HD DVD and Blu-ray specs), 1080p may look a lot better also because upscaling from 720p may not look so hot. The whole 720p/1080i situation is especially hard on computer games, which is why 1080p is more likely to become a settled-on resolution for a while than either of those two, imho, but we'll see. Depending on how it goes, I wouldn't be surprised if a number of games support both 1080p and 720p natively, because in the end that may look much better than anything up- or downscalers can accomplish in theory, let alone in practice. Similar benefits may exist for displaying games in SD: making 720p look good on an SDTV is harder than making a 1080p game look good on SD.

In short, I think there's little argument that 1080p can bring a significant visual improvement over 720p in many real-life cases, looking potentially 2.25x as good. Whether that improvement benefits a particular game depends on where the bottleneck of that game is, but also on how well the hardware deals with upscaling/downscaling 720p/1080p signals.

So when Laa-Yosh says 'at a price', well, let's just say that at one extreme it comes at the price of 2.25x the animation / pixel shading detail / framerate of a game, but at the other extreme it gives 2.25x better looks and potentially (i.e. depending on how well TVs deal with up/downscaling) even more. A simple example: games like Fl0w or even Virtua Tennis 3 aren't really going to require many more effects during normal gameplay, but will benefit much more from increased detail. In Virtua Tennis, for instance, you want to view the whole court, and even if you zoom in on a character, you won't see much more than just that character - and tennis doesn't go on during rain either, so rain or mud effects just aren't going to happen.

Nuance is a good thing.
 
Wow nice post Arwin!
However I must slightly disagree with that pixels per centimeter thing...
I don't think it is that simple; I believe that to a degree the human eye notices the number of pixels displayed even if they are spread over a larger area. For example, in a movie theater the picture still looks fairly good even though there are very few pixels per cm of screen. A movie theater I often go to in Finland has about a 185m² screen, so the number of pixels per centimeter is very, very low, but the picture is still OK. I think the same applies at home: my 61" at 720p looks better from 2.5 meters than a 30" at 480p, even though the 61" screen is four times bigger.
At least that's how I feel about this.
 
Wow nice post Arwin!
However I must slightly disagree with that pixels per centimeter thing...
I don't think it is that simple; I believe that to a degree the human eye notices the number of pixels displayed even if they are spread over a larger area. For example, in a movie theater the picture still looks fairly good even though there are very few pixels per cm of screen.

Ah, but you seem to have completely missed my point about percentage of field of view... Corrected for percentage of field of view - well, just bring your PSP to the cinema, hold it at 30cm in front of you, and see what it looks like against the cinema screen. The difference may be much smaller than you expect. Perhaps the cinema screen is, relatively, only 4-6x as big as your PSP.

Also, you have to remember that the technologies used in cinemas are different from LCD screens. Even older TVs can look rather nice because their pixels blur into each other. And modern digital cinema typically uses something like a 1600x1200 resolution, but it is still a projection system that gets more diffuse depending on how far the screen is from the projector.
 
Pixel blending and visual acuity, gentlemen.

See this viewing distance calculator: http://www.myhometheater.homestead.com/viewingdistancecalculator.html
If you are at a distance greater than the visual acuity distance, the pixels will gradually blend together.
For my 17" LCD the visual acuity distance is 2 feet.

I personally like to place myself at the visual acuity point because of the natural filtering it gives, and because the smaller viewing angle results in less eye movement and more of the screen in full focus.
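For anyone curious about the rule of thumb behind that sort of calculator: the usual assumption is that 20/20 vision resolves roughly one arcminute, so pixels start to blend at about the distance where a single pixel subtends less than that. A sketch - the acuity figure and the example screen sizes are my assumptions, and the linked calculator may use different constants:

```python
import math

# Distance beyond which adjacent pixels blend together, assuming ~1 arcminute visual acuity.
# The acuity figure and the example screens are assumptions, not values from the linked calculator.

def blend_distance_cm(screen_width_cm, horizontal_pixels, acuity_arcmin=1.0):
    """Distance at which one pixel subtends the given visual acuity angle."""
    pixel_cm = screen_width_cm / horizontal_pixels
    acuity_rad = math.radians(acuity_arcmin / 60.0)
    return pixel_cm / math.tan(acuity_rad)

# Example: a 17" 1280x1024 LCD is roughly 33.8cm wide.
print(f"17\" LCD: {blend_distance_cm(33.8, 1280) / 30.48:.1f} ft")
# Example: the 82cm-class widescreen TV from earlier, ~68cm wide, at 720p and 1080p.
for px in (1280, 1920):
    print(f"68cm-wide TV at {px}px across: {blend_distance_cm(68.0, px) / 100:.1f} m")
```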
 
I'll just throw my piece...

I have a Nokia N80 - crap phone, but the screen... my god... It's a small screen (big for a mobile, but still small of course) and it has a resolution of around 450x350 (can't remember the real numbers). At that size, it really is difficult to see the individual screen pixels. When I first got the phone, I couldn't believe my eyes; I was really having a hard time distinguishing the pixels. Amazing dot pitch.

If someone were to create TVs with such a dot pitch, and the TV did everything right (contrast, black levels etc.), it would look like a window. I think that from a distance all the detail would be lost anyway; it's already hard to see the pixels at a 10cm distance...

Or maybe I'm just getting old and my vision is failing me... err
 
Yeah, I'd love one of those at monitor size as well.
I just did some quick calcs with the Nokia N80: its DPI is ~250.
Current monitors are in the region of ~80-120 dpi.
I believe the human eye can't distinguish above 600 dpi.

So we are close to matching the eye's resolution; the problem is getting the contrast, which for screens will need to increase at least 100-fold (imagine the power used in one).
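For anyone who wants to redo the calc, DPI follows straight from the pixel dimensions and the diagonal size. A sketch - the N80 figures here (352x416 on roughly a 2.1" panel) are from memory, so treat them as assumptions:

```python
import math

# Dots per inch from pixel dimensions and diagonal screen size.
def dpi(width_px, height_px, diagonal_inches):
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_inches

# Assumed figures: the N80 panel is believed to be 352x416 at about 2.1";
# the other two are just typical examples for comparison.
print(f"Nokia N80 (352x416, 2.1\"): {dpi(352, 416, 2.1):.0f} dpi")
print(f"19\" 1280x1024 desktop LCD: {dpi(1280, 1024, 19):.0f} dpi")
print(f"40\" 1080p TV:              {dpi(1920, 1080, 40):.0f} dpi")
```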
 