720p vs 1080p performance hit?

slapnutz

Regular
OK, so the 360 will apparently be able to output 1080p natively with this patch coming out.

My question is: just say GameX runs at 60fps at 720p on the 360. Now if devs were to simply bump the resolution up to 1080p for GameX, what would the fps drop down to?

Yes, I know it depends on a whole bunch of shizz... just curious.
 
If it were maxing out the fillrate at that 60fps, you'd get ~26fps; there are 2.25 times as many pixels in a 1920x1080 framebuffer. But other than that, what ERP said...
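
A back-of-the-envelope sketch of that scaling, assuming frame time grows linearly with pixel count (i.e. you're purely fill-limited, which is rarely exactly true):

```python
# Fill-limited scaling estimate: assumes frame time is proportional to pixel count.
def scaled_fps(base_fps, base_res, new_res):
    base_pixels = base_res[0] * base_res[1]
    new_pixels = new_res[0] * new_res[1]
    return base_fps * base_pixels / new_pixels

print(scaled_fps(60, (1280, 720), (1920, 1080)))  # ~26.7 fps (2.25x the pixels)
```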
 
Assuming fillrate isn't a problem, the bigger consideration would be memory usage.

Say you were trying for 1920x1080 with 4xAA. (If I could choose between 1280x720 with 4xAA or 1920x1080 without, I'd go for 1280x720, so I'm considering AA.)

Consider: you need the original render target, 1920x1080 x (4-byte colour + 4-byte depth) x 4 samples. You need the buffer to downsample into, 1920x1080 x 4-byte colour, and possibly the front buffer too (same size). You may also want to keep a copy of the depth buffer handy as well.

Add that up: roughly 63MB for the render target, and about 8MB each for the (possibly three) other buffers. That adds up really, really quickly.

Streaming content around is already one of the big hurdles of modern games, so this is cutting into your memory pool for streamed textures and so on, putting further pressure on IO and your streaming code.

The point I find interesting, however, is that the eDRAM on the 360 acts as a temporary render target, so the first (63MB) does not need to be allocated in main memory. I'd be interested in knowing how the PS3 deals with this issue.

~80MB is a lot.
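
To make the arithmetic above explicit, here's a rough sketch of where those figures come from, assuming the formats quoted (4-byte colour, 4-byte depth, 4 samples per pixel; MB here meaning MiB):

```python
# Rough buffer-size arithmetic for 1920x1080 with 4xAA.
W, H = 1920, 1080
MiB = 1024 * 1024

msaa_target = W * H * (4 + 4) * 4   # colour + depth, 4 samples per pixel
resolve     = W * H * 4             # buffer to downsample into
front       = W * H * 4             # front buffer (assumed same format)
depth_copy  = W * H * 4             # optional copy of the resolved depth

print(msaa_target / MiB)   # ~63.3 MB for the render target
print(resolve / MiB)       # ~7.9 MB for each of the other buffers
print((msaa_target + resolve + front + depth_copy) / MiB)   # ~87 MB if you keep all three
```

With only two of the extra buffers you land at roughly 79MB, which is where the ~80MB figure comes from.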
 
The point I find interesting, however, is that the eDRAM on the 360 acts as a temporary render target, so the first (63MB) does not need to be allocated in main memory.
Uh, the eDRAM can't even fit a full 1080P color/Z frame without antialiasing. Much less one with it. So yes, you DO need to allocate it in main memory, and you need to tile it while rendering it...

One positive side-effect of MS allowing 1080P games might actually be that more games start using 4xAA at 720P, rather than 1080P without any AA, since you'd have to tile regardless to render at either frame format. Or well, that's what I'm hoping anyway! :D
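
A quick sanity check of what fits in the 360's 10MB of eDRAM, assuming 4-byte colour plus 4-byte depth per sample:

```python
# Does a colour+Z render target fit in 10MB of eDRAM?
EDRAM = 10 * 1024 * 1024

def target_bytes(w, h, samples):
    return w * h * (4 + 4) * samples   # 4B colour + 4B depth per sample

for label, w, h, s in [("720p, no AA",  1280, 720, 1),
                       ("720p, 4xAA",   1280, 720, 4),
                       ("1080p, no AA", 1920, 1080, 1),
                       ("1080p, 4xAA",  1920, 1080, 4)]:
    size = target_bytes(w, h, s)
    print(label, round(size / (1024 * 1024), 1), "MB",
          "fits" if size <= EDRAM else "needs tiling")
```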
 
Uh, the eDRAM can't even fit a full 1080P color/Z frame without antialiasing. Much less one with it. So yes, you DO need to allocate it in main memory, and you need to tile it while rendering it...
Storing a complete copy of the backbuffer, at 8 bytes per sample, for...... what?
 
How come there is a "sudden" influx of PS3 games @ 1080p? Did they up the RSX speed or what?

I mean, they all look impressive, but would they have looked a lot better at 720p?
 
Fillrate is not PS3's bottleneck apparently. ;)

On the other hand, short term I wonder what would be smarter: more FX (+ higher framerate) at 720p, or going to 1080p? I mean, most people won't own a 1080p set for some time...

Having said that, if there's a bottleneck we don't know of, maybe they can use everything they've been aiming for and still go to 1080p without a problem. So in that case it's an obvious move.

Nice anyway.
 
Graham said:
is that the eDRAM on the 360 acts as a temporary render target, so the first (63MB) does not need to be allocated in main memory.
You'll be trading off display list memory for it though (unless you want your tiling to get ... expensive).

That said, 1080p 4xAA backbuffers ARE a bit of an extreme example for a brute-force approach (in terms of memory storage), and predicated tiling is not necessarily something exclusive to the Xbox 360.
 
Uh, the eDRAM can't even fit a full 1080P color/Z frame without antialiasing. Much less one with it. So yes, you DO need to allocate it in main memory, and you need to tile it while rendering it...

One positive side-effect of MS allowing 1080P games might actually be that more games start using 4xAA at 720P, rather than 1080P without any AA, since you'd have to tile regardless to render at either frame format. Or well, that's what I'm hoping anyway! :D

Oops. I thought I mentioned tiling when I originally wrote that. My mistake.
It's still an interesting possibility.

I would still rather have 1280x720 with 4xAA than 1920x1080 without. Although without AA, 1920x1080 will require only 2 tiles, not 3. Then again, I'd still imagine it to be significantly slower.
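
Under the same assumptions (4-byte colour + 4-byte depth per sample, 10MB of eDRAM), the minimum tile counts work out like this; real tile splits have alignment constraints, so treat these as lower bounds:

```python
import math

EDRAM = 10 * 1024 * 1024

def min_tiles(w, h, samples):
    # Minimum number of predicated-tiling tiles: ceil(colour+Z size / eDRAM size).
    return math.ceil(w * h * (4 + 4) * samples / EDRAM)

print(min_tiles(1280, 720, 4))   # 3 tiles: 720p with 4xAA
print(min_tiles(1920, 1080, 1))  # 2 tiles: 1080p without AA
print(min_tiles(1920, 1080, 4))  # 7 tiles: 1080p with 4xAA
```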
 
Some questions: would you want to anti-alias at 1920x1080, and wouldn't the physical pixels on the HDTV screen prevent anti-aliasing from having any effect other than blurring the image? Anti-aliasing surely only works when you have screen pixel resolution higher than the displayed pixel resolution.

Think of what happens when you see a grille or a striped shirt on TV: you get jaggies due to the screen pixel size itself. Would a simple and cheap blurring effect, rather than anti-aliasing, suffice? Also bear in mind that at 1080p the human eye (and maybe the effects of codec loss of information) will effect some blurring, which may hide the jaggies.

Final question: why should games at 1080p be any different from TV at 1080p? If we see jaggies in games at this resolution, will we see jaggies in TV programs/video at this resolution?
 
Some questions: would you want to anti-alias at 1920x1080, and wouldn't the physical pixels on the HDTV screen prevent anti-aliasing from having any effect other than blurring the image? Anti-aliasing surely only works when you have screen pixel resolution higher than the displayed pixel resolution.

Final question: why should games at 1080p be any different from TV at 1080p? If we see jaggies in games at this resolution, will we see jaggies in TV programs/video at this resolution?
This final question needs to be answered for you to understand the former. On a fixed-resolution display like an LCD screen, each pixel is a discrete entity, a little square of light. When you render a picture, if the difference between some pixels and others is large enough, you'll notice a distinct stepping, or aliasing. Games suffer from this because every pixel on the screen is a single sample from the game. You take a point in the game (a pixel from the screen), determine what colour it should be, and put that on the display.

In a TV picture, a pixel isn't one sample but lots and lots. Consider the case of being inside a car, with the back window's borders and a bright outside. Rendered in a game, one pixel will be near black and the next near white, depending on whether that pixel lands on frame or window. In a TV picture, each pixel contains varying amounts of frame and window. You might have a pixel that is half filled with frame, half filled with window. The colour for that pixel is then an average, halfway between dark and light. The pixel has more information than just one sample. It's this average of more than one point that creates 'antialiasing'. Adding more samples when you render a game produces more in-between values, which decreases the contrast between adjacent pixels and decreases aliasing.

As for 1080p not having aliasing, that's a matter of pixel size. If you had 1080p resolution in a 4" display at arm's length, you wouldn't need AA as you couldn't notice the individual pixels. If you had a 1080p display that was 87" across, each pixel would be a millimetre in size, and depending on the distance you sit from the screen, that may be noticeable. Generally speaking, we're not, and likely won't be for decades, at a point where pixel resolution is fine enough to remove the need for AA. Jaggies will always be present, because as resolution increases, so does display size. A 14" 1080p display would look pretty jaggie-free at a comfortable viewing distance, but doesn't exist and probably never will.
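
A toy illustration of that averaging (the edge position and the 0/255 values are made up purely for the example):

```python
# One pixel straddling a dark/bright edge: left half "frame", right half "window".
def shade(x, y):
    return 0 if x < 0.5 else 255

single_sample = shade(0.5, 0.5)   # one centre sample: 255, a harsh step next to a dark neighbour
sub_samples = [shade(x, y) for x in (0.25, 0.75) for y in (0.25, 0.75)]
antialiased = sum(sub_samples) / len(sub_samples)   # 127.5, an in-between grey

print(single_sample, antialiased)
```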
 
One question for developers...

How feasible do you think it is to offer both options to the player? For example, maximum detail at 720p and medium detail at 1080p.

I know it means more effort for testing, balancing, etc., but... does it make any sense for you? Do you know if some studios are considering offering both options instead of sticking to just one?

Because that would be great from a gamer's point of view.
 
Huh?

You're not making any sense.
Why would you store the complete backbuffer, which would be upwards of 8 times larger in size than the resolved frontbuffer, in main memory? You don't necessarily save memory, but as long as we're pointing out the need for tiling, might as well add that complete copies of the backbuffer in main memory are going to be less common than just the resolved tiles.

But perhaps this is just a case of not reading everybody's posts entirely.

I agree, though, on wanting 720p w/ 4xAA over 1080p.
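
For concreteness, the size ratio in question, using the per-pixel formats assumed earlier in the thread:

```python
# Per-pixel storage: 4xAA backbuffer (colour + Z per sample) vs resolved frontbuffer.
backbuffer_per_pixel = (4 + 4) * 4   # 32 bytes: 4B colour + 4B depth, 4 samples
frontbuffer_per_pixel = 4            # 4 bytes: resolved colour only

print(backbuffer_per_pixel / frontbuffer_per_pixel)   # 8.0 - eight times larger
```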
 
One question for developers...

How feasible do you think it is to offer both options to the player? For example, maximum detail at 720p and medium detail at 1080p.

I know it means more effort for testing, balancing, etc., but... does it make any sense for you? Do you know if some studios are considering offering both options instead of sticking to just one?

Because that would be great from a gamer's point of view.

Welcome to the wonderful world of the PC, where you have to target your graphics well below what's possible because of the split user base.

OK, it wouldn't be as bad, but one of the reasons console games can look so good on the hardware they have is that developers can pick a single target and optimise for it.

If frame rate isn't a consideration, like in most PC games, then sure, you could do it.
 
Welcome to the wonderful world of the PC, where you have to target your graphics well below what's possible because of the split user base.

OK, it wouldn't be as bad, but one of the reasons console games can look so good on the hardware they have is that developers can pick a single target and optimise for it.

If frame rate isn't a consideration, like in most PC games, then sure, you could do it.

Thanks.

So what about a not-so-complex choice between 720p@60fps and 1080p@30fps?

I'm all for 720p (that's what my TV can show anyway), but it could be interesting for some guys.
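
For what it's worth, the raw pixel throughput of those two options is closer than it looks; a naive pixels-per-second count (ignoring everything that doesn't scale with resolution):

```python
# Naive pixel throughput comparison of the two options.
print(1280 * 720 * 60)    # 55,296,000 pixels/s at 720p@60
print(1920 * 1080 * 30)   # 62,208,000 pixels/s at 1080p@30, about 12% more
```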
 
Looking at the graphics subsystem only, is it a fair generalisation to say that, in an eye-candy-filled game, the likely bottleneck for 1080p 4xAA @ 60fps is pixel shading rather than rendering/ROP bandwidth?
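
To put a rough number on the shading side, a back-of-envelope budget; the 500MHz clock and 48 ALUs below are assumed purely for illustration, and this ignores vertex work, overdraw, and the fact that 4xAA multiplies ROP/bandwidth work rather than shading work:

```python
# Hypothetical per-pixel ALU budget at 1080p@60 (assumed GPU numbers, for scale only).
gpu_clock = 500e6            # assumed 500MHz core clock
alus = 48                    # assumed 48 shader ALUs
pixels_per_second = 1920 * 1080 * 60

print(round(gpu_clock * alus / pixels_per_second))   # ~193 ALU-cycles per shaded pixel
```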
 