HDR+AA...Possible with the R520?

DaveBaumann said:
Vysez said:
Personally, the question I'd like to ask is :"Does R520 have a 16fp or 10fp precision for the framebuffer"?

Would it need to be an either/or situation?
Probably, since having two separate paths would cost a fair few transistors.
 
"Path"? These are different pixel formats that the ROP's would be supporting (with conversion down from the PS precision).
 
DaveBaumann said:
"Path"? These are different pixel formats that the ROP's would be supporting (with conversion down from the PS precision).
So you're suggesting they support AA via shaders?
 
DaveBaumann said:
Vysez said:
Personally, the question I'd like to ask is :"Does R520 have a 16fp or 10fp precision for the framebuffer"?

Would it need to be an either/or situation?

Dave, it would be great if R520 supported FP10 & FP16.

1. Mid-level cards (your 9500/9600 and X700 level cards) are probably not going to be fast enough, or have enough memory bandwidth, to do FP16 at solid framerates. FP10 gives a nice tradeoff of IQ for speed.

2. The high-end cards could use FP16 at first, and later in their life scale down to FP10 to lengthen the useful life of the card.

Both of these options are great from a consumer standpoint.

3. MS continues to push the "MS Gaming Platform" approach and wants to see more 360 games on the PC and more PC games on the 360. As it seems FP10 will be the primary HDR format for most 360 games, supporting this format would be a good move by ATI. Sharing a baseline feature set with the consoles--especially if it comes with a minimal speed penalty--could be a big perk for ATI.

Overall I really like the idea of supporting FP10, especially if it has a minimal transistor hit AND we can get AA with it.
 
DaveBaumann said:
Vysez said:
Personally, the question I'd like to ask is :"Does R520 have a 16fp or 10fp precision for the framebuffer"?

Would it need to be an either/or situation?

Well, you already know the answer to that ;) so I would just add that I see no reason why a 16fp precision framebuffer cannot hold fp10. 16 bits per component = 5 bits of exponent and 10 bits of mantissa, right?
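
For what it's worth, a quick way to see how the two formats compare on paper is to tally the bits. The sketch below assumes IEEE-style half precision for FP16 (1 sign, 5 exponent, 10 mantissa bits) and the commonly quoted unsigned 7e3 layout for each FP10 component (3 exponent, 7 mantissa bits); the exact R520 encoding is an assumption here, not a confirmed spec.

```python
# Bit-budget comparison of FP16 vs FP10 colour components.
# Assumes IEEE-style half floats (s10e5) and an unsigned 7e3 layout for
# FP10 (3 exponent, 7 mantissa bits) -- treat the FP10 details as an
# assumption, not a confirmed R520 spec.

formats = {
    "FP16 (s10e5)": {"sign": 1, "exponent": 5, "mantissa": 10},
    "FP10 (7e3)":   {"sign": 0, "exponent": 3, "mantissa": 7},
}

for name, f in formats.items():
    total = f["sign"] + f["exponent"] + f["mantissa"]
    # Relative spacing between adjacent representable values (~precision).
    step = 2.0 ** -f["mantissa"]
    print(f"{name}: {total} bits/component, relative step ~{step:.3%}")
```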
 
It wouldn't make any sense for an FP16 framebuffer to hold FP10 data.

Dave's suggesting that the chip can support both formats. But I suspect that FP10 will be used most due to memory space constraints (though I still have very serious doubts as to its quality).
 
Well it's about time we had an FP10 HDR versus FP16 HDR thread, just like we've had all those discussions about FP24 versus FP16.

For a start I'd be interested in seeing an image quality comparison...

Jawed
 
Jawed said:
Well it's about time we had an FP10 HDR versus FP16 HDR thread, just like we've had all those discussions about FP24 versus FP16.

For a start I'd be interested in seeing an image quality comparison...

Jawed

The moment I read about FP10 FB on R520 I started laughing. I really want to read what all those people who complained that NV40 only had FP16 FB support will say now. Remember all those comments all around the Web? How FP16 is not "real HDR" and you really need FP32? Now watch how smoothly FP10 suddenly becomes "just right". LOL.

Of course I am not saying FP10 is not good enough. I am talking strictly about all the online noise about it.
 
I think you are getting framebuffer format arguments confused with internal precision arguments. I don't recall people saying you need external FP32 formats.
 
DaveBaumann said:
I think you are getting framebuffer format arguments confused with internal precision arguments. I don't recall people saying you need external FP32 formats.

Well, I hope that is what the "Anti-Nvidia fan club" was thinking when they decided that not only is HDR without FSAA worthless, but it needs to be FP32 (because FP16 is the "bad bad bad and cheating!" OpenEXR format :p). However, I have seen so many posts claiming that NV40's HDR (FB blending) is just some "half-baked version of the true FP32 one". I am not talking just about this forum; I am talking in general.
 
Isn`t that what`s always happening with new generations?Anyway, perhaps the R520 is tailored to do FSAA only with FP10, not FP16 to alleviate the bandwitdth hit....who knows?We`ll have to wait and see
 
wireframe said:
I have seen so many posts claiming that NV40's HDR (FB blending) is just some "half-baked version of the true FP32 one". I am not talking just about this forum; I am talking in general.
Well, that's just ignorant. FP24 vs. FP16 was at least somewhat of a legitimate discussion/argument, as we had hardware doing both. There's no FP32 HDR in sight, and nV's the only one doing FP16 HDR, so I just don't see how people can argue that at all--especially since ATI's been at FP24 for the past, what is it now, three years?

</obviousness>
 
wireframe said:
DaveBaumann said:
I think you are getting framebuffer format arguments confused with internal precision arguments. I don't recall people saying you need external FP32 formats.

Well, I hope that is what the "Anti-Nvidia fan club" was thinking when they decided that not only is HDR without FSAA worthless, but it needs to be FP32 (because FP16 is the "bad bad bad and cheating!" OpenEXR format :p). However, I have seen so many posts claiming that NV40's HDR (FB blending) is just some "half-baked version of the true FP32 one". I am not talking just about this forum; I am talking in general.

Hmmm, since most here at B3D are excited about HDR and floating-point blending, I think you are overstating the issue a bit. Just because one or two people say something does not make it a consensus. I actually have not read a single complaint about the G70 supporting FP16. So if it was ever an issue, no one is complaining now.

As an NV40 owner, your point would be more relevant if the NV40 could even do FP16 in a modern game at solid framerates. It cannot, and thus it is one of those overblown features that takes a refresh/new generation to become usable.

As for current HDR implementations, they do stink. We may indeed find that higher levels of precision ARE needed. I was out hanging laundry with my wife this morning and the sun was very hot. The roof next door was shining brightly, and it hurt to look up because of the sun, yet my wife's skin was not all overexposed with light. It had very fine highlights, but definitely not the radiating look we find in games. Even her hair, which had very, very bright highlights from the sun, did not have that "UBER GLOWING" look.

I am excited about HDR in general. GPUs are taking many SMALL steps toward outputting realistic images, but the closer we get, the clearer it becomes how far there is still to go. HDR, and lighting in general, has a LONG way to go.

The important thing is that IHVs are offering features that improve the IQ in games while keeping the framerates stable. Raw speed with no IQ jump is not acceptable, but neither are nice new features that no one can use because their performance sucks eggs.

And with that I must say I am happy that the couple of games that use HDR (FC, SC:CT) seem to be able to output stable frame rates with HDR enabled.

The sad part, especially from an IQ perspective, is that it comes at the sacrifice of AA. Because HDR looks so poor in FC and seems underwhelming in SC:CT, it is really not a choice--AA all the way. But I must say that, in general, solid AA does more for a game's image than any HDR I have seen.

Although I do expect some next-gen games to change that opinion. So far HDR has a lot of potential and not-so-good implementations. That will change... and when it does it will suck for G70 owners, because they are forced to choose between AA and HDR.
 
Acert93 said:
Even her hair, which had very, very bright highlights from the sun, did not have that "UBER GLOWING" look.
That's just a result of people trumping up the blurriness to make sure you see that they have it in there. It's a combination of that and the fact that the scatter in your eye that causes the blurriness isn't Gaussian in shape; it's a much spikier point-spread function. But give it a year or so and they'll tune it down to a reasonable level. It's a lot like how per-pixel specular bump mapping made everything look like it was coated in floor wax at first, because they wanted to make sure you knew they had it.
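
As a rough illustration of that shape argument, here is a small sketch with two purely hypothetical 1-D kernels (not taken from any shipping bloom pass), comparing how much energy a Gaussian keeps in the centre tap versus a spikier point-spread function:

```python
import numpy as np

# Hypothetical 1-D bloom kernels, just to illustrate the shape argument:
# a Gaussian spreads energy broadly, while a spikier point-spread function
# keeps most of it in the centre tap. Neither kernel is taken from any
# actual game's bloom pass.

x = np.arange(-8, 9, dtype=np.float64)

gaussian = np.exp(-(x ** 2) / (2.0 * 2.0 ** 2))   # sigma = 2
spiky = 1.0 / (1.0 + np.abs(x)) ** 3              # sharp central peak, thin tails

# Normalise both so overall brightness is preserved.
gaussian /= gaussian.sum()
spiky /= spiky.sum()

centre = len(x) // 2
print(f"energy in the centre tap: gaussian {gaussian[centre]:.2f}, spiky {spiky[centre]:.2f}")
```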
 
Geeforcer said:
First of all, the bandwidth between the eDRAM and the GPU in Xenos is 32GB/s. Second, Dave seems to have been dropping hints of late that R520 will indeed allow for HDR and AA.

DaveBaumann said:
Irrespective of whether it works on R5x0 or not, I'd say it's a shame because a.) this is one of the most frequent queries I noticed prior to the release of G70, b.) developers have to take into account this loss of orthogonality in the ROPs between HDR and MSAA, and c.) it's a hard trade-off of one IQ feature against another - inevitably end users will be trading something off for performance, but if it were available they might find they can strike a balance they like between res, AA depth and HDR.

Inevitably it will go away for everybody, as the decision is purely down to transistor budgets.
Uh oh. Looks like you're right, Geeforcer. Nuts.
 
Acert93 said:
As an NV40 owner, your point would be more relevant if the NV40 could even do FP16 in a modern game at solid framerates. It cannot, and thus it is one of those overblown features that takes a refresh/new generation to become usable.

I don't see how this is true when I can run both Far Cry and Splinter Cell Chaos Theory at 1280*1024 with HDR enabled on the 6800 Ultra. It may not be bragging-rights FPS territory, but it is certainly playable. And then I thought you agreed because you say this:
And with that I must say I am happy that the couple of games that use HDR (FC, SC:CT) seem to be able to output stable frame rates with HDR enabled.

Is that supposed to imply that only just now with G70 is this possible? I already said I play both at above 1024*768 with HDR so you know my position.

The sad part, especially from an IQ perspective, is that it comes at the sacrifice of AA. Because HDR looks so poor in FC and seems underwhelming in SC:CT, it is really not a choice--AA all the way. But I must say that, in general, solid AA does more for a game's image than any HDR I have seen.

This is something I think needs thinking about. Sure, we are accustomed to FSAA and we relate it to good IQ, but... if all the lights in a scene are dynamic and computed/blended, and objects are given just that little bit of glow, you get something like FSAA without having it as a geometric function. Look at bloom in HDR-enabled games. Think of a nasty-looking aliased tree or something. Once the bloom kicks in, you cannot see the harsh jagged lines of the tree geometry because it has been blended with the backlight. Sure, FSAA and then the blending would be even better, but I think if a game used a very sophisticated lighting scheme with sub-surface scattering and auras, you would see most of the harsh lines disappear.

On the other hand, I think people may react negatively to this at first because they are used to seeing very distinct 'digital' lines. But the real world is not this sharply defined. We only see these things when we stare at objects; most of what you see with your eyes is actually fuzzy, while the brain knows to think of it as flat or straight because that observation (bias?) has already been made in something we could call a full-scene scan. Of course, we often make the mistake of thinking things are straight when they are not, but that is purely a failing of our own computational/observational powers.

This is something I have been thinking about for quite some time, but I was strongly reminded of it while playing through Metal Gear Solid 3 on the PS2. Now, I actually have no idea what computational capabilities the PS2 has, but it seems to lack FSAA and AF. However, in the cleverly made cutscenes for MGS3 they employ this very technique of blending lights, and there are moments when everything looks anti-aliased.

One consideration here is the detail of the blending. In some titles, or in areas within one title, you can see the blended blocks around geometry. This can look anywhere from outright bad to hardly noticeable, but it is certain that as the detail of the blending increases, the aliasing from the geometry vanishes more and more. Essentially you are reducing jaggedness, not by sampling more of one geometry to "fill in the blanks," but by taking the whole image in a "full scene lit context," and suddenly those jaggies become less pronounced.
 
I think FP10 blending is going to be artifact central. You already get artifacts with 8-bit precision, and going from 8 to 7 bits makes things much worse. Remember, we used to harp about artifacts with 16-bit color, when the color channels had only 5-6 bits of precision. With FP16 shaders, we were seeing artifacts with just a 10-bit mantissa on *short* shaders, which closely resemble what a few HDR framebuffer blends will give you in terms of accumulated error. Moreover, without a sign bit, you can't use an FP buffer for RTT operations that create vertices or particles (well, you can, but you have to do the biasing yourself in the vertex shader). I would hope that both FP10 and FP16 are supported. FP10 is a hack right now because of bandwidth issues; it trades off precision for range, but both are needed for next-gen titles. It is not a good tradeoff like FP32->FP24 was, IMHO.
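
To put a rough number on the accumulated-error argument, here is a toy model (my own sketch, not a simulation of any real ROP) that repeatedly blends a small value additively while rounding the running total back to 7 and 10 mantissa bits:

```python
import math

# Toy model of error accumulation from repeated additive framebuffer blends.
# After every blend the running total is rounded back to N mantissa bits,
# loosely mimicking FP10 (7-bit mantissa) vs FP16 (10-bit mantissa) storage.
# This is an illustrative sketch, not a model of any real ROP.

def quantize(value, mantissa_bits):
    """Round to the nearest float with `mantissa_bits` explicit mantissa bits."""
    if value == 0.0:
        return 0.0
    m, e = math.frexp(value)                 # value = m * 2**e, 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)       # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

def accumulate(src, count, mantissa_bits):
    dst = 0.0
    for _ in range(count):
        dst = quantize(dst + src, mantissa_bits)
    return dst

src, count = 0.013, 200                      # e.g. 200 faint particles blended over each other
exact = src * count
for bits in (10, 7):
    result = accumulate(src, count, bits)
    print(f"{bits}-bit mantissa: {result:.4f} vs exact {exact:.4f} "
          f"({abs(result - exact) / exact:.2%} error)")
```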
 
Besides all the technical aspects of comparing FP10 and FP16, are there any demos or other means to compare the output images that each produces? I don't know of any offhand, but there's gotta be something...

It just seems like it would put some things to rest if we knew approximately what we were losing or gaining in terms of IQ with HDR alone before even arguing about the AA factor.
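
Short of a proper demo, one crude offline approximation (same mantissa-bit assumptions as the earlier sketch, and a synthetic luminance ramp instead of a real HDR capture) would be to quantize the same values to both precisions and measure the error:

```python
import numpy as np

# Crude offline stand-in for the image-quality comparison asked about above:
# quantize a synthetic HDR luminance ramp to FP16-like (10 mantissa bits)
# and FP10-like (7 mantissa bits) precision and measure the error.
# A real comparison would use actual HDR frame captures and the exact
# hardware encodings, which this sketch does not have.

def quantize(values, mantissa_bits):
    m, e = np.frexp(values)                  # values = m * 2**e, 0.5 <= m < 1
    scale = 2.0 ** (mantissa_bits + 1)       # +1 for the implicit leading bit
    return np.ldexp(np.round(m * scale) / scale, e)

hdr = np.linspace(0.01, 16.0, 100_000)       # synthetic luminance range

for name, bits in (("FP16-like", 10), ("FP10-like", 7)):
    err = np.abs(quantize(hdr, bits) - hdr) / hdr
    print(f"{name}: max relative error {err.max():.3%}, mean {err.mean():.3%}")
```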
 