10 MB eDRAM OK for 1080i?

skilzygw

Newcomer
I saw someone post this in another thread: 10 MB eDRAM can't even fit ONE 1080p frame without tiling.

I was wondering if this same problem existed with 1080i?

Thanks.
 
Yes, because 10 MB eDRAM isn't enough for even 720p (unless you have no AA, etc.). 1080i needs less storage than 1080p as you only need to render every other line in 1080i. Hence 1080i (1920x540) is similar in requirements to 720p (1280x720).

But none of them will fit into 10 MB eDRAM if you want the trimmings, which is why the XB360 uses tiled rendering.
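
As a rough sanity check, here's a Python sketch, assuming 32 bpp colour plus a 32-bit Z/stencil buffer, with storage scaling linearly with the MSAA sample count:

Code:
# Rough buffer sizes vs. the 10 MB eDRAM budget. Assumes 32 bpp colour
# plus a 32-bit Z/stencil buffer, scaled by the MSAA sample count.
EDRAM = 10 * 1024 * 1024  # 10,485,760 bytes

def framebuffer_bytes(width, height, aa_samples=1):
    return width * height * (4 + 4) * aa_samples  # colour + Z/stencil

for name, w, h in [("720p", 1280, 720),
                   ("1080i field", 1920, 540),
                   ("1080p frame", 1920, 1080)]:
    for aa in (1, 2, 4):
        size = framebuffer_bytes(w, h, aa)
        verdict = "fits" if size <= EDRAM else "needs tiling"
        print(f"{name} {aa}xAA: {size:>10,} bytes -> {verdict}")

Both a single 720p frame and a single 1080i field squeeze in only with no AA, which is the point above.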
 
As I understand it, I think it depends on how you're rendering.

I believe there's one mechanism that allows you to render only alternate lines at any one time, which would be little different from 720p in terms of capacity requirements. You could fit that into eDRAM without AA.

The other way is to render the full 1920x1080, which wouldn't fit into eDRAM, no.
 
skilzygw said:
I saw someone post this in another thread: 10 MB eDRAM can't even fit ONE 1080p frame without tiling.

I was wondering if this same problem existed with 1080i?

Thanks.

The 360 GPU was designed to tile, so it's a non-issue.
 
Can somebody explain how tiled rendering works? Don't they get a performance loss or anything at all? How much can they fit with tiling?
 
weaksauce said:
Can somebody explain how tiled rendering works? Don't they get a performance loss or anything at all? How much can they fit with tiling?

Read Dave's Xenos review; there's a link on the main page.

But in principle it works like this:

Render a Z-only pass over the entire frame. This inserts predicates into the display list that are used for clipping primitives against the tiles; it also eliminates a significant portion of the subsequent overdraw.

Then, once for each tile, resubmit the display list with the predicates set, and resolve (apply the AA filter and copy) the result back to main memory.
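
To make the flow concrete, here is a rough Python sketch of those two passes; the function names and structure are illustrative stand-ins, not a real console API:

Code:
# A rough sketch of the two-pass predicated tiling flow described above.
# The rasterisation and resolve steps are placeholders.

def z_prepass(display_list, tiles):
    # Z-only pass over the whole frame: fills the Z-buffer (killing most
    # of the later overdraw) and records, per primitive, which tiles it
    # touches -- the predicates inserted into the display list.
    return {prim: list(tiles) for prim in display_list}  # placeholder

def render_frame(display_list, tiles):
    predicates = z_prepass(display_list, tiles)
    for tile in tiles:
        # Resubmit the whole display list once per tile; the predicates
        # skip primitives that don't touch this tile. A polygon straddling
        # a tile edge is transformed once per tile it overlaps (the
        # redundant work mentioned below).
        for prim in display_list:
            if tile in predicates[prim]:
                pass  # rasterise prim into the eDRAM tile here
        # Resolve: apply the AA filter and copy the finished tile from
        # eDRAM back to main memory.
        print(f"tile {tile} resolved to main memory")

render_frame(["mesh_a", "mesh_b"], [0, 1, 2])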

There will be some wasted work, where some polygons at tile edges are transformed multiple times, but it's likely that with good scene organisation and triangle batching this will be minimal. And in exchange for this redundant work, you are guaranteed that bandwidth will never bottleneck your rendering.

Exactly how you split the screen into tiles will matter, and I can probably construct cases where 3 tiles are actually faster than 2 because of the amount of redundant work.

If you were writing a 1080i game on either X360 or PS3, it is EXTREMELY likely that you would choose to render a full 1920x1080 framebuffer. If you render a half-height buffer, you would have to be able to guarantee a constant 60fps, and frankly I think that would be difficult on either platform at that resolution.
 
So more likely than not, 1080i games will have 2xAA, and 720p games will have 4xAA?

1080i w/ 4xAA would require 7 tiles; is that a big problem in your opinion?
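
That 7 falls straight out of the arithmetic, assuming a full 1920x1080 target with 32 bpp colour plus a 32-bit Z/stencil buffer, both scaled by the sample count:

Code:
import math

# Tile counts for a 10 MB eDRAM budget, assuming 32 bpp colour plus a
# 32-bit Z/stencil buffer, both scaled by the MSAA sample count.
EDRAM = 10 * 1024 * 1024

def tiles_needed(width, height, aa_samples):
    size = width * height * (4 + 4) * aa_samples
    return math.ceil(size / EDRAM)

print(tiles_needed(1280, 720, 4))   # 720p 4xAA -> 3
print(tiles_needed(1920, 1080, 2))  # 1080 2xAA -> 4
print(tiles_needed(1920, 1080, 4))  # 1080 4xAA -> 7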
 
ERP said:
If you were writing a 1080i game on either X360 or PS3, it is EXTREMELY likely that you would choose to render a full 1920x1080 framebuffer. If you render a half-height buffer, you would have to be able to guarantee a constant 60fps, and frankly I think that would be difficult on either platform at that resolution.

Do modern GPUs even support rendering alternating fields? I was under the impression that this kind of functionality died with software renderers. For that matter, does the video out chip have a mode for outputting separate fields as opposed to whole frames?
 
skilzygw said:
I saw someone post this in another thread: 10 MB eDRAM can't even fit ONE 1080p frame without tiling.

I was wondering if this same problem existed with 1080i?

Thanks.

The term "frame" might be misleading to use. With interlaced, the number of "fields" per second to draw a frame might be more relevant.

Both 1080p and 1080i are exactly the same resolution: 1920 x 1080. The difference is progressive vs. interlaced, or another way of putting it, how the 1920 x 1080 image is scanned by the display. It's either interlaced or progressively scanned.

Interlaced takes that 1920 x 1080 image and displays the picture in two separate fields: one with the odd-numbered lines, the other with the even-numbered lines.
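
A minimal Python illustration of that frame-to-field split:

Code:
# A full frame's 1080 lines split into two 540-line fields that are
# displayed on alternate refreshes.
lines = list(range(1080))          # line numbers of one full frame
field_a = lines[0::2]              # lines 0, 2, 4, ... (one field)
field_b = lines[1::2]              # lines 1, 3, 5, ... (the other field)
print(len(field_a), len(field_b))  # 540 540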

Things get more complicated, in a good way, with modern displays and their fancy electronics for deinterlacing an image. The point being, a high-end HDTV takes a 1080i signal, and after it's filtered through the set's electronics you get a phenomenal image displayed. A lesser-quality HDTV might have lower-quality deinterlacing technology, lowering the fidelity of the image.

Quite frankly, I don't understand why Microsoft, ATI, and Samsung haven't teamed up their PR departments to explain/diagram a 1080i XB360 image being displayed on a specific Samsung HD display, like the new LED DLP. Just make a graphically pretty but well-designed flow chart, starting with the XB360 hardware, and show the signal chain all the way until it reaches the Samsung DLP.

There is so much confusion over what HD is, yet Microsoft keeps spouting "HD era" over and over again. 1080i is the dominant worldwide HD signal standard, with very few broadcasters supporting 720p.
 
scooby_dooby said:
So more likely than not, 1080i games will have 2xAA, and 720p games will have 4xAA?
More likely than not, developers won't render anything at 1080i, aside from maybe a few games with very simple graphics where they figure they might as well just tile 1920x1080 since the 360 would have the power to spare.
 
kyleb said:
More likely than not, developers won't render anything at 1080i, aside from maybe a few games with very simple graphics where they figure they might as well just tile 1920x1080 since the 360 would have the power to spare.

I already have 3 out of 7 of my games that render at 1080i. COD2, for example, is clearly labelled on the back of the box as 720/1080i, while most games are simply labelled 720p.
 
scooby_dooby said:
I already have 3 out of 7 of my games that render at 1080i. COD2, for example, is clearly labelled on the back of the box as 720/1080i, while most games are simply labelled 720p.

What makes you think that this means it is rendering at 1080x? I really doubt that any games are rendering at anything higher than 720p right now, and would personally ignore that as simply a more explicit way of saying "supports scaling to 1080i, duh". Unless I see some actual evidence to the contrary, that is.
 
kyleb said:
More likely than not, developers won't render anything at 1080i, aside from maybe a few games with very simple graphics where they figure they might as well just tile 1920x1080 since the 360 would have the power to spare.

As time goes on, I think it will end up like the PC market, but with two main resolution choices: 1024 x 768 or 1920 x 1080. You make your choice based on the native resolution of your display. If you have a 720p display you obviously want 1024 x 768; if you have a 1080p display you would want 1920 x 1080.

A resolution of 1024 x 768 will have AA while 1920 x 1080 might not, but since 1920 x 1080 has over two and a half times the pixels, that will make the jaggies smaller anyway.
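
For reference, the raw pixel counts behind that comparison:

Code:
# Pixel counts of the two resolutions being compared.
print(1024 * 768)                  # 786,432
print(1920 * 1080)                 # 2,073,600
print(1920 * 1080 / (1024 * 768))  # ~2.64x as many pixels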
 
Bohdy said:
What makes you think that this means it is rendering at 1080x? I really doubt that any games are rendering at anything higher than 720p right now, and would personally ignore that as simply a more explicit way of saying "supports scaling to 1080i, duh". Unless I see some actual evidence to the contrary, that is.

Why would they need a more explicit way of saying it scales to 1080i? Every single game on the system 'scales' to 1080i, so it would be pointless.

No, it seems to me these labels mean something; otherwise, what would be the point of them at all?
 
Technically speaking, 10 MB is more than enough to hold a 1920x1080 backbuffer:

10 MB = 10 * 1024 * 1024 = 10,485,760 bytes
1920 x 1080 x 16 bpp = 4,147,200 bytes
1920 x 1080 x 32 bpp = 8,294,400 bytes

With 16 bpp you could do a YUV backbuffer, or with 32 bpp you could do ARGB with 8/8/8/8 or 10/10/10/2. But of course, that excludes any AA, stencil, or Z-buffer. Add in any of those and you probably don't have enough room.
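
To put a number on that: adding just a 32-bit Z/stencil buffer to the 32 bpp case already blows the budget (a quick check, with a D24S8-style depth format assumed):

Code:
# 32 bpp colour plus a 32-bit Z/stencil buffer at 1920x1080.
EDRAM = 10 * 1024 * 1024            # 10,485,760 bytes
colour = 1920 * 1080 * 4            # 8,294,400 bytes
z_stencil = 1920 * 1080 * 4         # 8,294,400 bytes
print(colour + z_stencil)           # 16,588,800 -> well over 10 MB
print(colour + z_stencil <= EDRAM)  # False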

But it doesn't really matter. Nobody is going to use a 1920x1080 backbuffer, on PS3 or 360. The bandwidth requirements are simply too high when evaluated against the tradeoffs in graphical returns. A 720p backbuffer scaled to 1080p is going to look almost as good.

Also, while technically 1080i only displays every other line each frame, in reality you'd still need a full 1920x1080 backbuffer.
 
scooby_dooby said:
I already have 3 out of 7 of my games that render at 1080i. COD2 for example clearly labelled on the back of the box as 720/1080i, while most games are simply labelled 720p.
CoD2 renders exactly the same whether the 360 is set to output 720p or 1080i, which makes it a good example of how the stuff on the back of the box is often misleading.
 
Sethamin said:
Technically speaking, 10 MB is more than enough to hold a 1920x1080 backbuffer:

10 MB = 10 * 1024 * 1024 = 10,485,760 bytes
1920 x 1080 x 16 bpp = 4,147,200 bytes
1920 x 1080 x 32 bpp = 8,294,400 bytes

With 16 bpp you could do a YUV backbuffer, or with 32 bpp you could do ARGB with 8/8/8/8 or 10/10/10/2. But of course, that excludes any AA, stencil, or Z-buffer. Add in any of those and you probably don't have enough room.

But it doesn't really matter. Nobody is going to use a 1920x1080 backbuffer, on PS3 or 360. The bandwidth requirements are simply too high when evaluated against the tradeoffs in graphical returns. A 720p backbuffer scaled to 1080p is going to look almost as good.

Also, while technically 1080i only displays every other line each frame, in reality you'd still need a full 1920x1080 backbuffer.
I see what you are saying, but isn't it a requirement to have the Z-buffer along with the pixel depth? Why was it excluded from the equation?
 