Why Can't Nvidia cards do..

HDR+AA. Look, we know it does HDR fine .. and we know it does AA fine .. so why can't it do HDR+AA together? Is the setting just being disabled in the drivers, or is the hardware simply unable to do it?

Doesn't the 7800 run HDR+AA in HL2? I thought I saw a mention of it doing so in a review.

Since many people don't think HDR+AA is gonna be fixed in the G71 .. I want to know what Nvidia will need to do to fix it for the G80.

Regards,

US
 
You can run HDR+AA in HL2: Lost Coast (and presumably any other Source engine game), and there's a patch to enable it in 'Far Cry', too (though it may not work with Nvidia cards). There aren't that many other games that really use it. I guess it comes down to how individual developers implement HDR.
 
But if it is capable of rendering HDR+AA, why don't developers take advantage of this? And why couldn't Futuremark do something to get it working in 3DMark06? Yes, it's a bit of work, but dammit, make it work. Or does Nvidia just tell developers not to?

It seems stupid to me.

US
 
Unknown Soldier said:
But if it is capable of rendering HDR+AA, why don't developers take advantage of this? And why couldn't Futuremark do something to get it working in 3DMark06? Yes, it's a bit of work, but dammit, make it work. Or does Nvidia just tell developers not to?

It seems stupid to me.

US

It's just not that simple. Remember that developers have to decide what form of HDR would work best in their engine... not just add HDR for the sake of it.

If you use NVidia's method of FP16 render targets... you cannot use HDR+AA:

http://www.beyond3d.com/previews/nvidia/nv40/index.php?p=12

Although most of the pipeline operations work under the OpenEXR format, at present the FSAA multisampling scheme does not.

ATI's method doesn't exhibit this limitation:

http://www.beyond3d.com/reviews/ati/r520/index.php?p=06

Although ATI have provided these HDR blending capabilities in the R520 architecture they haven't removed any orthogonality, meaning that all modes of operation that run through the ROPs work with one another; the net result is that all of the FSAA options that are provided under standard blending buffers are equally supported under any of the HDR blending modes. However, the costs associated with the 64-bit modes and FSAA enabled are likely to be fairly high due to the bandwidth utilisation and memory space requirements.

The simple fact is that it depends on how you are rendering... apparently ATI's method of getting HDR+AA is much more flexible. Futuremark uses the FP16 render target method, and is thus unable to use AA with it on NVidia hardware.
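
For what it's worth, an engine doesn't have to hard-code vendor paths for this - it can just ask the API at startup whether the combination exists. A minimal C++/Direct3D9 sketch (the helper function is my own, just for illustration):

```cpp
// Minimal C++/Direct3D9 sketch (helper name is mine): ask the runtime
// whether a multisampled FP16 render target can actually be created.
// On NV4x/G7x the multisample check fails for D3DFMT_A16B16G16R16F,
// which is exactly the "HDR or AA, pick one" situation described above.
#include <d3d9.h>

bool SupportsFp16Msaa(IDirect3D9* d3d, UINT adapter)
{
    // Can the format be used as a render target at all?
    HRESULT hr = d3d->CheckDeviceFormat(
        adapter, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        D3DUSAGE_RENDERTARGET, D3DRTYPE_SURFACE, D3DFMT_A16B16G16R16F);
    if (FAILED(hr))
        return false; // no FP16 render targets, no FP16 HDR at all

    // If so, can that render target be multisampled?
    DWORD qualityLevels = 0;
    hr = d3d->CheckDeviceMultiSampleType(
        adapter, D3DDEVTYPE_HAL, D3DFMT_A16B16G16R16F,
        FALSE /* fullscreen */, D3DMULTISAMPLE_4_SAMPLES, &qualityLevels);
    return SUCCEEDED(hr) && qualityLevels > 0;
}
```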
 
It isn't capable of doing so with a floating-point frame buffer format like the OpenEXR format that Far Cry is using.

This is from an interview with David Kirk (he works for NVidia; the interview is from July 2005):

For those of you with super-duper graphics cards, you will have come across a problem: you can't use Anti-Aliasing when using HDR lighting, for example in Far Cry. In these cases, it's a situation where you have to choose one or the other. Why is this, and when is the problem going to get solved?

"OK, so the problem is this. With a conventional rendering pipeline, you render straight into the final buffer - so the whole scene is rendered straight into the frame buffer and you can apply the AA to the scene right there."


"But with HDR, you render individual components from a scene and then composite them into a final buffer. It's more like the way films work, where objects on the screen are rendered separately and then composited together. Because they're rendered separately, it's hard to apply FSAA (note the full-screen prefix, not composited-image AA! -Ed) So traditional AA doesn't make sense here."

So if it can't be done in existing hardware, why not create a new hardware feature of the graphics card that will do both?

"It would be expensive for us to try and do it in hardware, and it wouldn't really make sense - it doesn't make sense, going into the future, for us to keep applying AA at the hardware level. What will happen is that as games are created for HDR, AA will be done in-engine according to the specification of the developer.

"Maybe at some point, that process will be accelerated in hardware, but that's not in the immediate future."

But if the problem is the size of the frame buffer, wouldn't the new range of 512MB cards help this?

"With more frame buffer size, yes, you could possibly get closer. But you're talking more like 2GB than 512MB."

link: http://www.bit-tech.net/bits/2005/07/11/nvidia_rsx_interview/3.html

HL2 uses an integer format and some tricks to do decent HDR with plain old MSAA.
 
soylent said:
It isn't capable of doing so with a floating-point frame buffer format like the OpenEXR format that Far Cry is using.

OpenEXR really has nothing to do with not being able to render... it's just a storage format.
 
The use of FP16 render targets is but one method for achieving "HDR".

Far Cry is the most famous example of a game using this technique (added as a patch after the game was released), but it's also used in Splinter Cell: Chaos Theory.

NVidia hardware can't calculate the blending of MSAA samples into final pixels when a render target is in FP16.

In the same scenario, ATI hardware that supports FP16 render targets also supports AA - but apparently only up to 4xAA. That goes for all X1K cards.

One issue with MSAA support for FP16 HDR is that any tone-mapping pass that's performed on a render target after the AA samples have been resolved (i.e. once rendering is complete) will introduce subtle errors in the edge pixels. These errors are similar to the errors introduced when gamma-correct AA is not performed:

http://www.beyond3d.com/forum/showpost.php?p=541268

(that thread is a general discussion of Shader AA.)
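
To make that error concrete, here's a toy C++ illustration - the Reinhard-style operator in it is just my stand-in for any non-linear tone map, not anything from that thread:

```cpp
// Toy illustration (not engine code) of the edge-pixel error: a
// non-linear tone map applied after the MSAA resolve computes
// tonemap(average(samples)), which differs from the correct
// average(tonemap(samples)) because the operator is non-linear.
#include <cstdio>

static float Tonemap(float hdr) { return hdr / (1.0f + hdr); }

int main()
{
    // An edge pixel straddling a bright object and a dark background.
    float s0 = 8.0f, s1 = 0.1f;

    // Correct order: tone map each sample, then resolve (average).
    float perSample = 0.5f * (Tonemap(s0) + Tonemap(s1));

    // What a post-process tone map on the already-resolved buffer computes.
    float postResolve = Tonemap(0.5f * (s0 + s1));

    std::printf("tonemap-then-resolve: %.3f\n", perSample);   // ~0.490
    std::printf("resolve-then-tonemap: %.3f\n", postResolve); // ~0.802
    return 0;
}
```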

Additionally, there've been rumblings from NVidia that programmable ROPs are coming. That's a broad concept, but if NVidia has its sights on this kind of solution (something similar to texture sampling/filtering, which could also become entirely programmable), then it seems conceivable that ROPs as fixed-function units will eventually disappear.

Jawed
 
Deathlike2 said:
OpenEXR really has nothing to do with not being able to render... it's just a storage format.

Yes it does. It's a floating-point storage format, and NV's hardware can't do AA on FP render targets.
 
Alright, I don't know if I'm a bit out in left field here, but just from skimming over some of this, and from my somewhat foggy recollection of Matrox's AA...


Couldn't the AA that the Parhelia uses (where it analyzes the scene and doesn't AA every single element) be applied in tandem with the HDR rendering? (So it writes to the final buffer with the AA calculations already performed, and it's not AA'ing everything, just the portions it deems "AA-needed".)

Not entirely sure if it'd even work, just wondering really
 
Does anyone else find this state of affairs over the best way to implement HDR ironic compared to when the GF6 was released?
 
Unknown Soldier said:
HDR+AA. Look, we know it does HDR fine .. and we know it does AA fine .. so why can't it do HDR+AA together? Is the setting just being disabled in the drivers, or is the hardware simply unable to do it?

Doesn't the 7800 run HDR+AA in HL2? I thought I saw a mention of it doing so in a review.

Since many people don't think HDR+AA is gonna be fixed in the G71 .. I want to know what Nvidia will need to do to fix it for the G80.

Regards,

US

Well, it can... it just can't accelerate it in the ROPs.

Are you sure they are going to "fix" HDR+AA rather than abandon AA-specific hardware acceleration entirely in favor of something pixel-shader-based? At least that would seem to be the logic of David Kirk's statement... though I suppose if the transistor cost isn't too great they could also keep it around for legacy purposes.
 
Xmas said:
There is no logic in David Kirk's statement.

I'll play it straight, in spite of feeling impish. ;) The man is CTO -- we have to take him seriously when he says stuff... right up until it doesn't pan out, then we get to hammer him. That's the way it works. :LOL:
 
Unknown Soldier said:
But if it is capable for it to render HDR + AA why don't developers take advantage of this?

Valve doesn't use FP16, which is why they can still have AA with HDR.
 
soylent said:
Yes it does. It's a floating-point storage format, and NV's hardware can't do AA on FP render targets.

OpenEXR != FP16. It's just a file format. The hardware can't read it directly; it has to be decoded to FP16 on the CPU before being uploaded to the GPU. OpenEXR has as much to do with rendering as JPEG does.
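
If it helps, "FP16" here is just the 16-bit half-float encoding (1 sign bit, 5 exponent bits, 10 mantissa bits) that OpenEXR files store and that the GPU's FP16 surfaces hold. A minimal C++ conversion sketch (deliberately ignoring NaN/infinity/denormal handling) to show it's purely a data encoding:

```cpp
// Minimal float -> half conversion sketch. "Half" is the same 16-bit
// layout used by OpenEXR's half type and by FP16 render target formats
// such as D3DFMT_A16B16G16R16F. No NaN/infinity/denormal handling.
#include <cstdint>
#include <cstring>

uint16_t FloatToHalf(float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof bits);                   // raw IEEE-754 bits
    uint16_t sign     = (bits >> 16) & 0x8000;             // sign bit
    int32_t  exponent = ((bits >> 23) & 0xFF) - 127 + 15;  // rebias 8 -> 5 bits
    uint16_t mantissa = (bits >> 13) & 0x03FF;             // top 10 mantissa bits
    if (exponent <= 0)  return sign;                       // underflow: flush to zero
    if (exponent >= 31) return sign | 0x7C00;              // overflow: clamp to infinity
    return sign | static_cast<uint16_t>(exponent << 10) | mantissa;
}
```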
 
It's debatable what to call Valve's implementation.

I'd call it HDR-S, i.e. high dynamic range storage, instead of high dynamic range rendering.
The thing is, they still render in 8 bits/channel, but do so using some higher-range textures. Actually, it's probably lower dynamic range rendering, as I get the feeling you only see the 0-127 colour range, with the higher 128-255 range being used for the ugly bloom filter.
The result is that while you can still compute the overall brightness of the screen (for the next frame's tone-mapping value), that value isn't anywhere near as accurate, so the dynamic tone-mapping exposure effect isn't as effective as it should be. For example, very small but very bright objects (the sun) don't contribute to the final brightness as much as they should. Simple point: they still use a lens flare effect to make the sun 'bloomy'; had they not done this and you looked at the sun, it would look overcast. They simply don't have the dynamic range.

Therefore MSAA does work in Valve's implementation, because the tone-mapping step isn't a post-processing effect, it's an end-of-every-shader effect. So the MSAA errors that Jawed mentioned should still occur (although I feel that image is a bit misleading, Jawed. :p )
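
To illustrate the "end-of-every-shader" idea (this is my own C++ sketch of the concept, not Valve's actual shader code):

```cpp
// My own illustration of the scheme described above: each pixel shader
// squashes its HDR result into [0,1] before it's written to a plain
// 8-bit integer render target, so every MSAA sample stored in the
// target is already display-ready and the ordinary hardware resolve
// can average them like any LDR frame.
#include <algorithm>
#include <cstdint>

// Exposure scale, derived from the previous frame's measured average
// screen brightness (the feedback loop described above).
float g_exposure = 1.0f;

// Conceptually the last few instructions of each pixel shader.
uint8_t ShadeAndTonemap(float hdrLuminance)
{
    float exposed = hdrLuminance * g_exposure; // apply exposure
    float ldr = exposed / (1.0f + exposed);    // non-linear squash into [0,1)
    return static_cast<uint8_t>(std::min(ldr, 1.0f) * 255.0f + 0.5f);
}
```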

Valve could have helped the effect by storing a non-linear value in the colours above 127, but then blending wouldn't work correctly, and it would still be 8-bit. Real FP16-based bloom (subtle bloom!) looks absolutely bucketloads better than 8-bit bloom.


Hope that helps clear up why being able to do MSAA with HDR is up to the developer when it comes to Nvidia cards. As with all things, it's a quality trade-off.
 
If I may ask, how does the X8xx do HDR? Nuke, Militia and DOD:S have HDR (bloom) running on those cards, yet the cards aren't meant to do HDR since they aren't SM3.0. And the thing is, HDR looks fantastic on those cards.

US
 
Unknown Soldier said:
If I may ask, how does the X8xx do HDR? Nuke, Militia and DOD:S have HDR (bloom) running on those cards, yet the cards aren't meant to do HDR since they aren't SM3.0. And the thing is, HDR looks fantastic on those cards.

US
Consider yourself a victim of NVidia's FUD.

Jawed
 