DemoCoder said: I think the point of HDR is to be used in render to texture in most cases (as I said before). Valve did an HDR demo and it seemed to run just fine BTW.
Mintmaster said: Does anyone know if R420 supports:
- FP16 blending?
- FP32 (well actually, FP24 internally) blending?
- Blending with other formats, like I16?
- FP filtering?

I'm sure the R420 only supports the pixel formats the R3xx supported.
DemoCoder said: Render to texture doesn't do AA. MSAA render targets aren't supported, nor is FP texture filtering.
DaveBaumann said: I know - they didn't use a floating point frame-buffer, which is what this topic is about. (And a render to texture would still operate with MSAA as well)
Zeno said:
I think there is a misconception here. As far as I know, there is no such thing as a "floating point frame buffer" in any of the cards we're talking about. When doing HDR, you have to set up a high precision offscreen buffer and render to it. It is this buffer that you will now be able to blend into on the 6800.
After you have your scene rendered into your high precision buffer, you then have to do a final fragment program pass to map the 16-bit floats into the 8-bit fixed range of the frame buffer. This is usually referred to as "exposure" or "tone mapping".
It would be great to have additional precision in the frame buffer, but I don't think normal computer displays can handle 16 bits of dynamic range.
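As a rough illustration of that final exposure/tone-mapping pass, here is a minimal C++ sketch of the per-channel math such a pass might perform. The exponential operator, the exposure value and the 2.2 gamma are assumptions for illustration only, not what any particular demo uses, and on real hardware this step would run as a pixel shader over the full screen.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <cstdio>

// Per-channel sketch of the "exposure"/tone-mapping step: the scene sits in
// a high-precision (e.g. FP16) off-screen buffer, and a final pass maps each
// value into the 8-bit fixed range of the display buffer. Operator, exposure
// and gamma are illustrative assumptions.
static std::uint8_t tonemapChannel(float hdrValue, float exposure)
{
    float mapped = 1.0f - std::exp(-hdrValue * exposure);   // [0, inf) -> [0, 1)
    float gammaCorrected = std::pow(mapped, 1.0f / 2.2f);   // rough display gamma
    return static_cast<std::uint8_t>(
        std::clamp(gammaCorrected, 0.0f, 1.0f) * 255.0f + 0.5f);
}

int main()
{
    // A value of 4.0 is "brighter than white" in the HDR buffer but still
    // lands inside 0-255 after the exposure pass.
    std::printf("HDR 4.0 -> %u\n", tonemapChannel(4.0f, 1.0f));
    return 0;
}
```

The point is that any blending has to happen before this step, on the HDR values; once everything has been squeezed into 0-255, the extra range is gone.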
DaveBaumann said: No, the render to texture operation doesn't, but the rest of the operation can. For instance, the rthdribl demo uses render to texture when FP render targets are supported and will still utilise MSAA.
bloodbob said: Older-style monitors are analog, so the number of bits isn't a problem; the range is, though.
WaltC said: Mintmaster said: It seems everyone has been caught up in either performance numbers or shader models. However, even Chalnoth was saying that FP-blending is likely the most important feature to be introduced this generation.
Does anyone know if R420 supports:
- FP16 blending?
- FP32 (well actually, FP24 internally) blending?
- Blending with other formats, like I16?
- FP filtering?
If they are supported, has anyone tested them?
But then I worry that nVidia had no official demos lined up to illustrate all of this amazingly important stuff like PS3.0 and fp blending, so that reviewers could run them, analyze them, and praise them, if deserving of it. I mean, if it's so "important" and so forth, then WHERE ARE THE IN-HOUSE nVIDIA DEMOS to illustrate it???? I think a bit of empirical evidence is definitely in order.
Tic_Tac said: It makes me suspect that there is no significant qualitative or performance advantage in using SM3.0 over SM2.0 at this point.
Hyp-X said: Mintmaster said: Does anyone know if R420 supports:
- Blending with other formats, like I16?
Well, since R300 supports that, I assume R420 does too.

Are you sure? I've read many times that you couldn't. I know I16 filtering is supported, but I didn't think blending was.
Both the HDR launch demo and Valve's HDR implementation are using A16R16G16B16 buffers - NOT float buffers.

Yeah, I knew that, but I didn't know what they were doing for blending. I assumed they just came up with a compromise, like doing alpha blending afterwards into the final X8R8G8B8 display buffer. You can't get HDR reflections (e.g. with glare, bloom, lens flares, etc.) off transparent glass, as one example of a drawback. You could counter this by drawing only the reflections into another I16 (or FP) buffer, but now we're getting into nasty ad-hoc solutions which Valve might be willing to go through, but most wouldn't.
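A tiny numeric sketch of that drawback, with made-up values and the destination term of the blend ignored for brevity:

```cpp
#include <algorithm>
#include <cstdio>

// A bright HDR reflection seen through 50%-transparent glass should still
// come out "hot" (~2.0 here) so a later tone-mapping/bloom pass can treat
// it as a highlight. If the blend instead happens in the final X8R8G8B8
// buffer, the source is clamped to 1.0 before blending and the extra range
// is lost. All numbers are illustrative.
int main()
{
    const float reflection = 4.0f;  // HDR intensity of the reflection
    const float glassAlpha = 0.5f;  // glass is 50% transparent

    float hdrBufferBlend = reflection * glassAlpha;                 // = 2.0
    float ldrBufferBlend = std::min(reflection, 1.0f) * glassAlpha; // = 0.5

    std::printf("blend in HDR buffer: %.2f, blend in 8-bit buffer: %.2f\n",
                hdrBufferBlend, ldrBufferBlend);
    return 0;
}
```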
Agreed, but I'm not asking for PS 3.0. VS 3.0 would be nice, but so long as some sort of uberbuffers type of capability (allowing render to vertex array) is standardized, I'll be happy.

joe emo said: "This sucks. R300->R420 is almost as bad as GF3->GF4."
I'm not so sure that's a bad thing. As I recall, when the NV30 was launched, the general consensus was that NVIDIA had hit one out of the park with the GF4 but struck out with the GF-FX. The GF4 was a damn fine product for its time -- the Radeon 8500, while supporting more features, was slower than the GF4 and therefore did not have the same "warm" market response the GF4 did.
DaveBaumann said: FP frame buffers. With NV40, fill-rate and bandwidth will be halved. There is no AA either. I suspect that these will be the long-term limiting factors.

So what? Why do we need such insane framerates or super high resolutions anyway? They're a luxury because we have nothing else to do with our graphics cards. I want the best image possible at 60 fps, but that doesn't necessarily mean 1600x1200 at 6xAA. 800x600 is already better than DVD quality. I'll play a game at 1024x768, 800x600 or even 640x480 if it looks as good as the rthdribl demo.
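Rough arithmetic behind the "halved" remark, with an arbitrary resolution and frame rate just to put numbers on it; Z and texture traffic are ignored:

```cpp
#include <cstdio>

// An FP16 colour target is 64 bits per pixel versus 32 bits for X8R8G8B8,
// so every framebuffer write (and blend read) moves twice the data.
int main()
{
    const double pixels   = 1024.0 * 768.0;
    const double fps      = 60.0;
    const double ldrBytes = 4.0;   // X8R8G8B8:      32 bits per pixel
    const double hdrBytes = 8.0;   // A16B16G16R16F: 64 bits per pixel

    double ldrMBs = pixels * fps * ldrBytes / (1024.0 * 1024.0);
    double hdrMBs = pixels * fps * hdrBytes / (1024.0 * 1024.0);

    std::printf("colour writes: %.0f MB/s (8-bit) vs %.0f MB/s (FP16)\n",
                ldrMBs, hdrMBs);
    return 0;
}
```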