720p vs 1080p performance hit?

Discussion in 'Console Technology' started by slapnutz, Sep 22, 2006.

  1. slapnutz

    Regular

    Joined:
    Jul 2, 2006
    Messages:
    504
    Likes Received:
    0
    Ok so the 360 will apparently be able to output 1080p natively with this patch coming out.

    My question is just say... GameX runs at 60fps on 720p on the 360. Now if devs were to simply bump up the resolution to 1080p for GameX, what would the fps drop down to?

    Yes, I know it depends on a whole bunch of shizz... just curious.
     
  2. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Answered your own question.

    Depends if it's pixel/vertex or CPU bound and how much of each one at which points in the frame.
     
  3. mech

    Regular

    Joined:
    Feb 12, 2002
    Messages:
    535
    Likes Received:
    0
    If it were maxing out the fillrate at that 60fps, you'd get ~27fps. There are 2.25x as many pixels in a 1920x1080 framebuffer as in 1280x720. But other than that, what ERP said...
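
    mech's estimate can be checked with a quick back-of-the-envelope script (hypothetical figures: a game that is purely fill-rate bound at 60fps in 720p):

```python
# Pixel counts for the two framebuffer sizes discussed in the thread.
pixels_720p = 1280 * 720      # 921,600 pixels
pixels_1080p = 1920 * 1080    # 2,073,600 pixels

ratio = pixels_1080p / pixels_720p   # 2.25x as many pixels
fps_720p = 60.0

# If the game were purely fill-rate bound, frame rate scales
# inversely with pixel count:
fps_1080p = fps_720p / ratio
print(ratio, round(fps_1080p, 1))    # 2.25 26.7
```

    In practice no game is 100% fill-bound, so this is a floor on the damage, not a prediction.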
     
  4. Graham

    Graham Hello :-)
    Moderator Veteran Subscriber

    Joined:
    Sep 10, 2005
    Messages:
    1,479
    Likes Received:
    209
    Location:
    Bend, Oregon
    Assuming fillrate isn't a problem, the bigger consideration would be memory usage.

    Say you were (for example) trying for 1920x1080 with 4xAA. (If I could choose between 1280x720 with 4xAA or 1920x1080 without, I'd go for 1280, so I'm considering AA.)

    Consider,

    you need the original render target: 1920x1080 x (4-byte colour + 4-byte depth) x 4 samples,
    you need the buffer to down-sample to: 1920x1080 x 4-byte colour. And possibly the front buffer too (same size?). You may also want to keep a copy of the depth buffer handy.

    Add that up: ~63mb for the render target, ~8mb each for the (possibly 3) other buffers. That adds up really, really quickly.

    Streaming content around is already one of the big hurdles of modern games, so this is cutting into your memory pool for streamed textures, etc., putting further pressure on IO and your streaming code.

    The point I find interesting, however, is that the EDRAM on the 360 acts as a temporary render target, so the first (63mb) does not need to be allocated in main memory. I'd be interested in knowing how the PS3 deals with this issue.

    ~80mb is a lot.
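
    The budget above can be sketched in a few lines (my arithmetic, same assumptions as the post: 4-byte colour, 4-byte depth, 4 samples, and keeping all three extra buffers):

```python
# Framebuffer memory budget for 1920x1080 with 4xAA.
MB = 1024 * 1024
w, h = 1920, 1080
samples = 4

render_target = w * h * (4 + 4) * samples   # multisampled colour + depth
resolve_buffer = w * h * 4                   # buffer to down-sample into
front_buffer = w * h * 4                     # possibly a separate front buffer
depth_copy = w * h * 4                       # optional copy of the depth buffer

total = render_target + resolve_buffer + front_buffer + depth_copy
print(round(render_target / MB))   # 63
print(round(total / MB))           # 87
```

    Whether you actually keep all three extra buffers is the "possibly" in the post, which is why the total lands anywhere from ~70mb to ~87mb.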
     
  5. Guden Oden

    Guden Oden Senior Member
    Legend

    Joined:
    Dec 20, 2003
    Messages:
    6,201
    Likes Received:
    91
    Uh, the eDRAM can't even fit a full 1080P color/Z frame without antialiasing. Much less one with it. So yes, you DO need to allocate it in main memory, and you need to tile it while rendering it...

    One positive side-effect of MS allowing 1080P games might actually be that more games start using 4xAA at 720P, rather than 1080P without any AA, since you'd have to tile regardless to render at either frame format. Or well, that's what I'm hoping anyway! :D
     
  6. TurnDragoZeroV2G

    Regular

    Joined:
    Nov 14, 2005
    Messages:
    583
    Likes Received:
    23
    Location:
    Who knows...
    Storing a complete copy of the backbuffer, at 8 bytes per sample, for...... what?
     
  7. Guden Oden

    Guden Oden Senior Member
    Legend

    Joined:
    Dec 20, 2003
    Messages:
    6,201
    Likes Received:
    91
    Huh?

    You're not making any sense.
     
  8. -tkf-

    Legend

    Joined:
    Sep 4, 2002
    Messages:
    5,633
    Likes Received:
    37
    How come there is a "sudden" influx of PS3 games @ 1080p? Did they up the RSX speed or what?

    I mean, they all look impressive, or would they have looked a lot better at 720p?
     
  9. pipo

    Veteran

    Joined:
    Jun 8, 2005
    Messages:
    2,624
    Likes Received:
    28
    Fillrate is not PS3's bottleneck apparently. ;)

    On the other hand, short term I wonder what would be smarter. More FX (+higher framerate) on 720p or going to 1080p? I mean, most people won't own a 1080p set for some time...

    Having said that, if there's a bottleneck we don't know of, maybe they can use everything they've been aiming for and still go to 1080p without a problem. So in that case it's an obvious move.

    Nice anyway.
     
  10. inefficient

    Veteran

    Joined:
    May 5, 2004
    Messages:
    2,121
    Likes Received:
    53
    Location:
    Tokyo
    Obviously they have switched to the G80!! :lol:
     
    London-boy likes this.
  11. Fafalada

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    2,773
    Likes Received:
    49
    You'll be trading off display list memory for it though (unless you want your tiling to get ... expensive).

    That said, 1080p x 4xAA backbuffers ARE a bit of an extreme example for a brute-force approach (in terms of memory storage), and predicated tiling is not necessarily something exclusive to Xbox 360.
     
    #11 Fafalada, Sep 22, 2006
    Last edited by a moderator: Sep 22, 2006
  12. Graham

    Graham Hello :-)
    Moderator Veteran Subscriber

    Joined:
    Sep 10, 2005
    Messages:
    1,479
    Likes Received:
    209
    Location:
    Bend, Oregon
    Oops. I thought I mentioned tiling when I originally wrote that. My mistake.
    It's still an interesting possibility.

    I would still rather have 1280x720 with 4xAA than 1920x1080 without. Although without AA, 1920 will require only 2 tiles, not 3. Then again I'd still imagine it to be significantly slower.
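
    For the curious, those tile counts follow from the commonly cited 10MB of EDRAM on Xenos. A rough sketch (real predicated tiling splits the screen into rectangles, so exact counts can vary):

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024   # commonly cited Xenos EDRAM size

def tiles_needed(w, h, samples):
    # 4-byte colour + 4-byte depth per sample.
    frame_bytes = w * h * (4 + 4) * samples
    return math.ceil(frame_bytes / EDRAM_BYTES)

print(tiles_needed(1280, 720, 1))    # 1
print(tiles_needed(1280, 720, 4))    # 3
print(tiles_needed(1920, 1080, 1))   # 2
print(tiles_needed(1920, 1080, 4))   # 7
```
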
     
  13. SPM

    SPM
    Regular

    Joined:
    Dec 18, 2005
    Messages:
    639
    Likes Received:
    16
    Some questions - would you want to anti-alias at 1920 x 1080, and wouldn't the physical pixels on the HDTV screen prevent anti-aliasing having any effect beyond blurring the image? Anti-aliasing surely only works when the rendered sample resolution is higher than the displayed pixel resolution.

    Think of what happens when you see a grille or a striped shirt on TV - you get jaggies due to the screen pixel size itself. Would a simple and cheap blurring effect rather than anti-aliasing suffice? Also bear in mind that at 1080p the human eye (and maybe codec loss of information) will effect some blurring which may hide the jaggies.

    Final question - why should games at 1080p be any different from TV at 1080p? If we see jaggies in games at this resolution, will we see jaggies in TV programs/video at this resolution?
     
  14. Shifty Geezer

    Shifty Geezer uber-Troll!
    Moderator Legend

    Joined:
    Dec 7, 2004
    Messages:
    43,382
    Likes Received:
    15,836
    Location:
    Under my bridge
    This final question needs to be answered for you to understand the former. On a fixed-resolution display like an LCD screen, each pixel is a discrete entity, a little square of light. When you render a picture, if the difference between some pixels and others is large enough, you'll notice a distinct stepping or aliasing. Games suffer from this because every pixel on the screen is a single sample from the game. You take a point in the game (a pixel from the screen), determine what colour it should be, and put that on the display.

    In a TV picture, a pixel isn't one sample but lots and lots. Consider the case of inside a car, with dark window frames and a bright outside. Rendered in a game, one pixel will be near black and another near white, depending on whether that pixel lands on frame or window. In a TV picture, each pixel contains varying amounts of frame and window. You might have a pixel that is half filled with frame, half filled with window. The colour for that pixel is then an average, halfway between dark and light. The pixel has more information than just one sample. It's this average of more than one point that creates 'antialiasing'. Adding more samples when you render a game produces more in-between values, which decreases the contrast between adjacent pixels and decreases aliasing.
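
    The window-frame example can be made concrete with a toy calculation (made-up colour values, just to illustrate the averaging):

```python
# A pixel half covered by a dark window frame and half by bright sky.
frame_colour = 0.1   # near black
sky_colour = 0.9     # near white

# One sample per pixel: the whole pixel takes whichever surface the
# sample happened to land on, so adjacent pixels jump from 0.1 to 0.9.
one_sample = frame_colour

# Four samples per pixel, two landing on each surface, then averaged:
samples = [frame_colour, frame_colour, sky_colour, sky_colour]
multi_sample = sum(samples) / len(samples)

print(one_sample, multi_sample)   # the average is an in-between grey, 0.5
```
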

    As for 1080p not having aliasing, that's a matter of pixel size. If you had 1080p resolution in a 4" display at arm's length, you wouldn't need AA as you couldn't notice the individual pixels. If you had a 1080p display that was 87" across, each pixel would be about a millimetre in size, and depending on the distance you sit from the screen, that may be noticeable. Generally speaking, we're not, and likely won't be for decades, at a point where pixel resolution is fine enough to remove the need for AA. Jaggies will always be present, because as resolution increases, so does display size. A 14" 1080p display would look pretty jaggie-free at a comfortable viewing distance, but doesn't exist and probably never will.
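
    The pixel-size-versus-viewing-distance point can be sketched numerically (hypothetical panel sizes and distances; ~1 arcminute is the resolving limit often quoted for normal vision):

```python
import math

def pixel_arcminutes(diagonal_inches, distance_inches, h_pixels=1920):
    # Width of a 16:9 panel derived from its diagonal.
    width = diagonal_inches * 16 / math.hypot(16, 9)
    pixel_size = width / h_pixels
    # Angle one pixel subtends at the viewer's eye, in arcminutes.
    return math.degrees(math.atan2(pixel_size, distance_inches)) * 60

# A (hypothetical) 14" 1080p panel at ~24": pixels below ~1 arcmin,
# so individual pixels are hard to resolve.
print(round(pixel_arcminutes(14, 24), 2))

# An 87" 1080p panel viewed from ~8 feet: pixels above ~1 arcmin,
# so stepping remains visible.
print(round(pixel_arcminutes(87, 96), 2))
```
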
     
  15. DarkRage

    Newcomer

    Joined:
    Jul 25, 2005
    Messages:
    70
    Likes Received:
    1
    Location:
    Spain
    One question for developers...

    How feasible do you think it is to offer both options to the player? For example, maximum detail at 720p and medium detail at 1080p.

    I know it means more effort for testing, balancing, etc, but... does it make any sense to you? Do you know if some studios are considering offering both options instead of sticking to just one?

    Because that would be great from a gamer's point of view.
     
  16. TurnDragoZeroV2G

    Regular

    Joined:
    Nov 14, 2005
    Messages:
    583
    Likes Received:
    23
    Location:
    Who knows...
    Why would you store the complete backbuffer, which would be upwards of 8 times larger than the resolved frontbuffer, in main memory? You don't necessarily save memory, but as long as we're pointing out the need for tiling, we might as well add that complete copies of the backbuffer in main memory are going to be less common than just the resolved tiles.

    But perhaps this is just me not reading everybody's posts entirely.

    I agree, though, on wanting 720p w/ 4xAA over 1080p.
     
  17. ERP

    ERP Moderator
    Moderator Veteran

    Joined:
    Feb 11, 2002
    Messages:
    3,669
    Likes Received:
    49
    Location:
    Redmond, WA
    Welcome to the wonderful world of the PC, where you have to target your graphics well below what's possible because of the split userbase.

    OK, it wouldn't be as bad, but one of the reasons console games can look so good on the hardware they have is that developers can pick a single target and optimise for it.

    If frame rate isn't a consideration, like in most PC games, then sure, you could do it.
     
  18. DarkRage

    Newcomer

    Joined:
    Jul 25, 2005
    Messages:
    70
    Likes Received:
    1
    Location:
    Spain
    Thanks.

    So what about a not-so-complex choice between 720p@60fps and 1080p@30fps?

    I'm all for 720p (that's what my TV can show anyway) but it could be interesting for some guys.
     
  19. Megadrive1988

    Veteran

    Joined:
    May 30, 2002
    Messages:
    4,667
    Likes Received:
    186
    all they need to do is put back in the 256-bit bus and 8 of the rops that they (on paper) ripped out of RSX :)
     
  20. V3

    V3
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    3,304
    Likes Received:
    5
    Looking at the graphics subsystem only, is it a fair generalisation to say that, in an eye-candy-filled game, the likely bottleneck for 1080p 4xAA @ 60fps is pixel shading rather than rendering/ROP bandwidth?
     
