Here we go again with our HDR/AA topic of the week
Some interesting info here, I hope this has not been posted already:
(From Bit-Tech interview)
Using AA with HDR
For those of you with super-duper graphics cards, you may have come across a problem: you can't use Anti-Aliasing together with HDR lighting, for example in Far Cry. In those cases you have to choose one or the other. Why is this, and when is the problem going to get solved?
"OK, so the problem is this. With a conventional rendering pipeline, you render straight into the final buffer - so the whole scene is rendered straight into the frame buffer and you can apply the AA to the scene right there."
"But with HDR, you render individual components from a scene and then composite them into a final buffer. It's more like the way films work, where objects on the screen are rendered separately and then composited together. Because they're rendered separately, it's hard to apply FSAA (note the full-screen prefix - not composited-image AA! -Ed). So traditional AA doesn't make sense here."
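One way to see why a hardware resolve clashes with HDR compositing: the MSAA resolve averages raw sample values, but with HDR the displayed colour only exists after a tone-mapping pass over the intermediate buffer, and averaging-then-tone-mapping gives a different answer than tone-mapping-then-averaging. A minimal sketch, with made-up radiance values and a simple Reinhard-style tone-map operator (our choice, not from the interview):

```python
import numpy as np

# Two HDR subsamples covering one pixel on a bright/dark edge
# (hypothetical linear radiances, well above displayable range)
samples = np.array([16.0, 0.25])

def tonemap(x):
    # Reinhard-style operator: maps [0, inf) HDR radiance into [0, 1)
    return x / (1.0 + x)

# What a hardware MSAA resolve does: average the raw samples, THEN tone-map
resolved_then_mapped = tonemap(samples.mean())

# What the composited image needs: tone-map each sample, THEN average
mapped_then_resolved = tonemap(samples).mean()

print(resolved_then_mapped)   # ~0.8904: edge pixel stays near-white
print(mapped_then_resolved)   # ~0.5706: a properly blended edge
```

The two results differ substantially, which is why resolving the samples before the compositing step can't produce the image the engine intends.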
So if it can't be done in existing hardware, why not create a new hardware feature of the graphics card that will do both?
"It would be expensive for us to try and do it in hardware, and it wouldn't really make sense - it doesn't make sense, going into the future, for us to keep applying AA at the hardware level. What will happen is that as games are created for HDR, AA will be done in-engine according to the specification of the developer.
"Maybe at some point, that process will be accelerated in hardware, but that's not in the immediate future."
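"In-engine" AA of the kind described above can be as simple as supersampling under the engine's control: render the HDR scene at a higher resolution, tone-map/composite, and only then filter down to display size, so the averaging happens on final colours. A toy sketch of that ordering (the scene, resolutions, and box filter are all illustrative assumptions):

```python
import numpy as np

def tonemap(x):
    # Reinhard-style operator: HDR radiance -> displayable [0, 1)
    return x / (1.0 + x)

def engine_ssaa(render, width, height, factor=2):
    """Engine-side supersampling: render HDR at factor x the target
    resolution, tone-map first, then box-filter down to display size."""
    hi = render(width * factor, height * factor)   # HDR, linear radiance
    ldr = tonemap(hi)                              # composite/tone-map first
    # Average each factor x factor block down to one display pixel
    return ldr.reshape(height, factor, width, factor).mean(axis=(1, 3))

def toy_scene(w, h):
    # A hard bright/dark vertical edge, deliberately off the block grid
    img = np.full((h, w), 0.25)
    img[:, (w // 2) + 1:] = 16.0
    return img

out = engine_ssaa(toy_scene, width=4, height=2, factor=2)
print(out[0])   # the edge pixel gets an intermediate, anti-aliased value
```

Because the engine controls when the filtering happens, it works with any compositing scheme - at the cost of rendering (and storing) many times more pixels.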
But if the problem is the size of the frame buffer, wouldn't the new range of 512MB cards help this?
"With more frame buffer size, yes, you could possibly get closer. But you're talking more like 2GB than 512MB."
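A rough back-of-envelope shows where a figure on that order comes from: HDR buffers are wide (FP16 per channel) and brute-force supersampling multiplies pixel count by the square of the factor. The resolution and per-pixel sizes below are our assumptions, not from the interview:

```python
# Memory cost of one supersampled HDR colour+depth buffer.
# Assumptions: 1600x1200 display, FP16 RGBA colour (8 bytes/pixel),
# 32-bit depth (4 bytes/pixel).
width, height = 1600, 1200
bytes_color = 8   # FP16 RGBA
bytes_depth = 4   # 32-bit Z

def buffer_mb(ss_factor):
    pixels = (width * ss_factor) * (height * ss_factor)
    return pixels * (bytes_color + bytes_depth) / (1024 ** 2)

for f in (1, 2, 4, 8):
    print(f"{f}x{f} supersampling: {buffer_mb(f):7.0f} MB")
```

At 8x8 this single buffer already passes 1.4 GB, before counting the extra intermediate render targets an HDR pipeline composites from - so a 512MB card doesn't get close.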