Does the Xbox 360 have 4x AA enabled at all times?

If 2x is for free, that would tend to indicate that the eDRAM stores 2 samples per pixel. It would also make interior edge pixels compressible in the 4x case.
 
http://www.firingsquad.com/features/xbox_360_interview/default.asp

FiringSquad: You said earlier that EDRAM gives you AA for free. Is that 2xAA or 4x?

ATI: Both, and I would encourage all developers to use 4x FSAA. Well I should say there’s a slight penalty, but it’s not what you’d normally associate with 4x multisample AA. We’re at 95-99% efficiency, so it doesn’t degrade it much is what I should say, so I would encourage developers to use it. You’d be crazy not to do it.
 
Jawed said:
The AA samples aren't stored in EDRAM. They're stored in system memory.

The framebuffer is split, so that only the filtered pixels are stored in EDRAM. When a filtered pixel isn't finalised, the pixel isn't a colour, but a pointer to the AA samples (to a memory address, forming the start of a linked list for the entire set of AA samples that make up the pixel).

This isn't the case, evidently.

The EDRAM stores the upsampled values, and these are downsampled on the way out to system RAM before display. However, the graphics hardware and API automatically tile the screen, so that if a triangle submitted for rendering falls outside the area currently being processed, the hardware will discard it and recall it later when it processes the area that triangle resides in. This means that when a resolution/AA depth/bit depth combination requires more memory than the EDRAM provides, the system tiles the screen: it renders the first tile into EDRAM, and when that is done it downsamples it, passes it out to system RAM, and moves on to the next tile.

So, upsampled pixels (AA resolution) pixels only reside in EDRAM and downsampled only reside in system RAM.
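To put rough numbers on the tiling described above, here's a back-of-the-envelope sketch. It assumes 32-bit colour plus 32-bit depth/stencil per sample (my assumption, not from the posts), and the 10MB eDRAM figure from the thread:

```python
import math

EDRAM_BYTES = 10 * 1024 * 1024  # 10 MB of eDRAM
BYTES_PER_SAMPLE = 4 + 4        # assumed: 32-bit colour + 32-bit depth/stencil

def tiles_needed(width, height, aa_samples):
    """Number of eDRAM-sized tiles required to render one frame."""
    frame_bytes = width * height * aa_samples * BYTES_PER_SAMPLE
    return math.ceil(frame_bytes / EDRAM_BYTES)

for aa in (1, 2, 4):
    print(f"720p with {aa}x AA -> {tiles_needed(1280, 720, aa)} tile(s)")
# 1x fits in a single tile (~7 MB); 2x needs 2 tiles and 4x needs 3,
# which is why the automatic tiling described above exists at all.
```

Under these assumptions, only the no-AA case fits in eDRAM in one pass at 720p; any AA mode triggers the tiling path.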

As a side note, I mentioned before that 1080p is not a function of the display logic. What I didn't realise until now is that ATI aren't responsible for the display - it doesn't exist in C1/Xenos, but is something that Microsoft has done themselves.
 
It sounds like the eDRAM is pretty flexible. I'm curious what FP blending the R500 has and whether the card will be fast enough to do quality HDR effects.
 
What impact does having 192 FPUs inside the eDRAM have? :oops:

I thought it was just memory, but it seems more like it's an active processor with 10MB to work with!
 
The really interesting bit from Dave B.'s earlier post is that the XBOX 360 GPU doesn't handle the display output. So even though the frame buffer is in EDRAM, the pixels get written to main memory (i.e., the GDDR3 RAM) and then sent to a Microsoft-designed output chip (e.g., RAMDAC, video scaler functions). And the Microsoft chip only supports a single display at 1080i max (i.e., no 1080p). It's a weird design choice by Microsoft, since it would seem to be a less efficient design, but perhaps they were really focused on lowest cost and this was the best way to achieve it.
 
MS obviously had to make many decisions here, and I think they've found the better compromises.
How much of their market needs 1080p output? How much of their market needs full backwards compatibility? How many users would buy a second HDTV?
And how much would they have to pay in manufacturing costs and hardware capabilities if they'd make the above the more important features?
 
DaveBaumann - that's fantastic news. The R500 looks to be an extremely elegant design.

One question: that block diagram - is it from ATI or did FS cook it up on their own?

Regards,
Serge
 
mech said:
I don't think there's enough RAM to do 4x FSAA at 720p. Just 2x.

I should have clarified this - I meant enough RAM to do 4xFSAA for free at 720p.

And it looks like I was correct.
 
ATI mentioned a 2-5% hit to performance. Nice trade off... I would take 55fps with 4x AA over 60fps with 2x AA.
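The arithmetic behind that trade-off is simple enough to sketch (the 2-5% figure is from the quoted interview; the rest is just multiplication):

```python
def fps_after_hit(base_fps, hit_fraction):
    """Frame rate after a fractional performance hit."""
    return base_fps * (1 - hit_fraction)

for hit in (0.02, 0.05):
    print(f"{hit:.0%} hit: {fps_after_hit(60, hit):.1f} fps")
# 2% hit: 58.8 fps; 5% hit: 57.0 fps -- both above the 55 fps
# mentioned above, if the game is limited by this in the first place.
```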
 
psurge said:
One question: that block diagram - is it from ATI or did FS cook it up on their own?

That's a "dumbed down" version for the purposes of an ATI press conference. I was shown a more detailed schematic that is probably a little more representative of the actual structure/flow, but wasn't allowed to take a copy of it. Again, it now looks like I'll be having a CC with them later on to get a little more info.
 
Luminescent said:
Acert93 said:
ATI mentioned a 2-5% hit to performance. Nice trade off... I would take 55fps with 4x AA over 60fps with 2x AA.
And that would only be the case if the title was bandwidth/ROP limited.

FWIW, the overhead they're talking about isn't from the ROP/bandwidth...
In the case where it's entirely ROP/bandwidth limited, 4xAA is entirely free...
 
Acert93 said:
ATI mentioned a 2-5% hit to performance. Nice trade off... I would take 55fps with 4x AA over 60fps with 2x AA.

Well, since we're being picky, and we're gonna look at the games through TVs or monitors, it would sound more like "55fps with 4xAA and screen tearing, 30fps with 4xAA (or more, since now they have double the time to process pixels) without tearing, or 60fps with 2xAA and obviously no tearing"...

I'd take one of the last two, 'cause I cannot stand screen tearing for the life of me. You can't just have 55fps on a display: the screen either tears or it just clocks down to 30fps (if VSync is enabled).

But since 2-5% of 60 is a tiny 1.2-3, they could just drop a frame or two every second without us noticing too much.
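The quantization I'm describing can be sketched like this (assuming a 60Hz display and classic double buffering, where a missed refresh means waiting for the next one):

```python
import math

REFRESH_HZ = 60
REFRESH_MS = 1000.0 / REFRESH_HZ  # ~16.67 ms per refresh

def vsynced_fps(render_ms):
    """Effective frame rate with vsync + double buffering:
    each frame occupies a whole number of refresh intervals."""
    intervals = math.ceil(render_ms / REFRESH_MS)
    return REFRESH_HZ / intervals

# A frame taking 18 ms (~55 fps unsynced) misses the refresh and
# displays at 30 fps; a 16 ms frame comfortably hits 60 fps.
print(vsynced_fps(18.0))  # 30.0
print(vsynced_fps(16.0))  # 60.0
```

Triple buffering relaxes this, which is likely why some PC games appear to run at intermediate rates without tearing.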
 
london-boy said:
Acert93 said:
ATI mentioned a 2-5% hit to performance. Nice trade off... I would take 55fps with 4x AA over 60fps with 2x AA.

Well, since we're being picky, and we're gonna look at the games through TVs or monitors, it would sound more like "55fps with 4xAA and screen tearing, 30fps with 4xAA (or more, since now they have double the time to process pixels) without tearing, or 60fps with 2xAA and obviously no tearing"...

I'd take one of the last two, 'cause I cannot stand screen tearing for the life of me. You can't just have 55fps on a display: the screen either tears or it just clocks down to 30fps (if VSync is enabled).
Screen tearing is the most annoying experience ever.
 
london-boy said:
Acert93 said:
ATI mentioned a 2-5% hit to performance. Nice trade off... I would take 55fps with 4x AA over 60fps with 2x AA.

Well, since we're being picky, and we're gonna look at the games through TVs or monitors, it would sound more like "55fps with 4xAA and screen tearing, 30fps with 4xAA (or more, since now they have double the time to process pixels) without tearing, or 60fps with 2xAA and obviously no tearing"...

I'd take one of the last two, 'cause I cannot stand screen tearing for the life of me. You can't just have 55fps on a display: the screen either tears or it just clocks down to 30fps (if VSync is enabled).

But since 2-5% of 60 is a tiny 1.2-3, they could just drop a frame or two every second without us noticing too much.

This just doesn't sound right. I've played games that ran at around 40fps on my computer monitor and there was no screen tearing...
 