PS3's Inability to perform HDR + FSAA

london-boy said:
Actually no, you're missing it. We're not discussing how good a game will ultimately look; we all know anything will look gorgeous. But this is a technical forum (the console forum has been divided into 3 entities for a reason, after all), so we're discussing technicalities. Mostly useless technicalities, you'll argue, but that's the point here.

The possibilities.

1. PS3 can do both
2. The games that use HDR have jaggies because of:
A. limitations of the dev kits, or
B. PS3 being unable to do both
3. If devs want HDR + FSAA, they can use the same method the HS team is using: integer HDR.
 
!eVo!-X Ant UK said:
You missed the point
As LB has said, no I haven't. What you're posting is relevant to a 'do games need AA and HDR to look good?' thread, but not to a 'can this hardware achieve these graphical effects simultaneously?' thread. Regardless of whether games look good with or without HDR and AA, the question being asked and answered is one of hardware capabilities.
 
Oh erm.. i can rep again... everyone shiver under the mighty repper!!! :LOL:

In the end, as I've stated many times, AA is not all about jaggies, which on an HDTV at a distance will barely be seen. AA is about all the other little flaws that aliasing carries with it, which also show in that picture you posted, Evo. See the moiré pattern on the very finely detailed textures on Snake.

Jaggies are the least of our problems at HD resolutions, at a certain distance.
 
Shifty Geezer said:
Regardless of whether games look good with or without HDR and AA, the question being asked and answered is one of hardware capabilities.

With shaders, hardware capability is a rather fuzzy thing. RSX is capable of HDR with AA, regardless of whether there is hardware support or not.

The hardware support that is there is FP16 buffers with blending, but without MSAA.

That does not preclude HDR with MSAA via software. And that's not some fringe experimental case, either, as we're hearing of its use now in a production title (Heavenly Sword - although I guess everything is subject to change until it goes gold!).

Your HDR with AA options on RSX basically boil down to:

HDR on a FP16 buffer with SSAA
HDR via shaders with MSAA

Of the two, the latter would probably be more feasible in most cases.

Bandwidth is an issue, of course, but it's very hard to quantify the requirement across the possible techniques. One can presume from HS's experience, with the option it took, that it's not a deal-breaker since they're apparently saving bandwidth this way.
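The bandwidth caveat above is easy to put rough numbers on. A back-of-the-envelope comparison of colour-buffer footprints at 720p, assuming FP16 RGBA at 8 bytes/pixel and RGBA8 at 4 bytes/pixel (illustrative arithmetic only; real hardware adds Z traffic and typically compresses MSAA colour):

```python
# Colour-buffer footprint comparison, ignoring Z and compression.
# An FP16 RGBA target is 8 bytes/pixel; an RGBA8 target (as used by
# an integer HDR encoding) is 4 bytes/pixel.

def framebuffer_bytes(width, height, bytes_per_pixel, msaa_samples=1):
    """Raw colour-buffer size for one frame."""
    return width * height * bytes_per_pixel * msaa_samples

W, H = 1280, 720                                  # 720p
fp16_no_aa = framebuffer_bytes(W, H, 8)           # FP16 RGBA, no MSAA
int8_no_aa = framebuffer_bytes(W, H, 4)           # RGBA8, no MSAA
int8_4xaa  = framebuffer_bytes(W, H, 4, 4)        # RGBA8 with 4x MSAA

print(f"FP16,  no AA : {fp16_no_aa / 2**20:.1f} MiB")   # 7.0 MiB
print(f"RGBA8, no AA : {int8_no_aa / 2**20:.1f} MiB")   # 3.5 MiB
print(f"RGBA8, 4xAA  : {int8_4xaa / 2**20:.1f} MiB")    # 14.1 MiB
```

The encoded RGBA8 buffer halves per-pixel traffic versus FP16; MSAA multiplies the raw sample storage, but MSAA colour traffic is the case hardware compression is designed for.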
 
Well, David Kirk breezily says this is a developer's problem. With NV providing both the GPU and the tools for PS3, I wonder if they did anything with Cg to help a brother out? I suppose that would make too much sense. . .

Edit: I'm going to expand on this, because while Kirk's answer in this area annoyed me somewhat at the time, I hadn't given any consideration to the relationship to PS3. So now I've gone from annoyed to really annoyed. :LOL:

Let's consider a couple facts:

1). David Kirk says that for the foreseeable future he sees this problem as a developer problem. Maybe --maybe-- some day in the future, when a developer consensus/technique is arrived at, NV will look at hardware accel for that solution.

2). NV is the graphics heart --both GPU and dev tools-- for PS3. PS is the console flagship and reigning champ. Just like in boxing, you're the champ until someone puts you on your backside. Will that happen this time? I dunno, but I wouldn't bet on it. And I'm damn sure that NV isn't betting on PS being dethroned this time around.

So you want to talk about leadership? Kirk and NV are free to advocate pushing AA+HDR over the wall into the software side. What they aren't free to also do, given they own the dev tools for what they presume will be the leading console for the next five years, is then walk off snickering to themselves and call it a day.

They have a responsibility to take a leadership role in getting it solved on the software side: consulting with devs, building a consensus, doing a kick-ass demo showing that solution, providing code samples for how to do it, and making it as painless as possible to implement that code in Cg.

And then, having solved it for console, given the presumption that RSX is close to G70, transferring that solution back to the PC side.

So whatcha gonna do NV?
 
To the topic in general, talk is that the GTX@90nm will address and improve NV's AA situation, as well as allow for HDR+AA at the same time. That's all talk for the moment, but if we assume RSX has more in common with the 90nm GTX than with the straight G70, it might be the case that this has been addressed in RSX as well. And of course, if RSX deviates from the G70 more than is currently suspected/assumed, then one has to assume the HDR+AA problem will be addressed in that architecture too.

Of course as has been said I think we're going to be seeing a lot more done in software on these consoles than the typical PC dev would be used to doing, which might take some of the edge off as the gen progresses.
 
Shifty Geezer said:
Copy nAo's 5-instruction colourspace shader code and include it with the dev tools ;)

Sorry, I've been a bit out of the loop lately, but can you point me to where he explains that? I tried to search for it but obviously nothing useful came up... :D
 
london-boy said:
Sorry, I've been a bit out of the loop lately, but can you point me to where he explains that? I tried to search for it but obviously nothing useful came up... :D

You'll want to start here if you want the full understanding, and then basically read every post by either Deano or nAo from there until the end of the thread. You don't really need to read the rest, as the important points are normally quoted in their responses. I figure filtering out the rest will speed up your read. ;)
 
5-6 clocks, not 5 instructions :)
Btw, usually when the code is appended to a complex shader, the shader itself gets 3-4 cycles slower...
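For context, nAo's colourspace trick was later described as a LogLuv-style encoding: log2 luminance split across two 8-bit channels, with chromaticity in the other two. A minimal Python sketch of that idea follows; the matrix, log range, and scale factors are assumptions for illustration, not the actual Heavenly Sword shader:

```python
import math

# Illustrative LogLuv-style packing of an HDR RGB colour into four
# 8-bit channels, in the spirit of the NAO32 format discussed here.
# All constants (matrix, log range, scale) are assumptions for this
# sketch, not the real shader code.

# Approximate sRGB -> CIE XYZ matrix and its inverse.
M = [(0.4124, 0.3576, 0.1805),
     (0.2126, 0.7152, 0.0722),
     (0.0193, 0.1192, 0.9505)]
M_INV = [( 3.2406, -1.5372, -0.4986),
         (-0.9689,  1.8758,  0.0415),
         ( 0.0557, -0.2040,  1.0570)]

def encode(rgb):
    """HDR linear RGB -> (L_hi, L_lo, u, v), four 8-bit values."""
    x, y, z = (sum(m * c for m, c in zip(row, rgb)) for row in M)
    s = max(x + y + z, 1e-10)
    u, v = x / s, y / s                       # chromaticity coordinates
    # 16-bit log2 luminance over an assumed [-16, 16) stop range.
    code = int((math.log2(max(y, 1e-10)) + 16.0) * 2048.0)
    code = max(0, min(65535, code))
    return (code >> 8, code & 0xFF,
            int(u * 255.0 + 0.5), int(v * 255.0 + 0.5))

def decode(packed):
    """Inverse of encode(); recovers linear RGB approximately."""
    hi, lo, ui, vi = packed
    y = 2.0 ** (((hi << 8) | lo) / 2048.0 - 16.0)
    u, v = ui / 255.0, max(vi / 255.0, 1e-10)
    x = u * y / v                             # X = u * (X+Y+Z)
    z = (1.0 - u - v) * y / v                 # Z = (1-u-v) * (X+Y+Z)
    return tuple(sum(m * c for m, c in zip(row, (x, y, z)))
                 for row in M_INV)
```

Because the packed result fits an ordinary RGBA8 target, MSAA stays available; the cost is the encode/decode shader work nAo quotes above, plus the blending caveats raised later in the thread.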
 
Right, I read pretty much the whole thread - big mess - but I have to ask: if it's that simple, why is everyone so bothered about FP16 and FP32, and all these transistors "wasted", so to speak, on them, when you can achieve pretty much the same results with INT8?

I bet NV and ATI are kicking themselves right about now, with all that silicon wasted on FP16 and FP32 while people are getting better results without them...
 
Oh, I dunno. David Kirk might have a rather smug grin on his face right now. :LOL:

Any feel, Marco, for whether this is something that would be amenable to hardware accel down the road? By that I mean specifically, rather than "yeah, make shaders run faster and it will go faster." :LOL: Or is it still a little too early to tell whether this is the final form, or whether something even better might turn up?
 
london-boy said:
Right, I read pretty much the whole thread - big mess - but I have to ask: if it's that simple, why is everyone so bothered about FP16 and FP32, and all these transistors "wasted", so to speak, on them, when you can achieve pretty much the same results with INT8?

I bet NV and ATI are kicking themselves right about now, with all that silicon wasted on FP16 and FP32 while people are getting better results without them...
FP16 blending is the big feature that NVidia brought to the table with the 6 series - this allows semi-transparent objects to be rendered using FP16 precision.

The height of the argument in favour of FP16 blending is that it makes the following easy:
  • particle effects combined with HDR lighting (e.g. explosions lighting up smoke where FP16-blending directly supports the calculation of the lighting within the smoke)
  • HDR effects viewed through transparent surfaces (whether that's "glass" or blended foliage)
  • computation of motion-blurred light from: light-sources, transparencies or reflections
When you use alternative techniques to simulate HDR (instead of a backbuffer format that directly supports the precision and range you require, e.g. FP16), then you need to derive work-arounds for cases such as those above, as your backbuffer format no longer automates the results.
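The blending point can be made concrete with a toy example: in a linear FP16 buffer the hardware blend averages light correctly, while the same 50/50 blend applied to a log-encoded 8-bit value averages the *codes* instead. The single-channel log encoding here is invented purely for illustration:

```python
import math

# Toy demonstration of why non-linear HDR encodings break fixed-
# function blending: averaging log codes yields the geometric mean of
# the luminances, not the arithmetic mean that light actually obeys.

def encode_log(lum):
    """Luminance -> 8-bit log code (illustrative encoding)."""
    return round((math.log2(lum) + 8.0) * 16.0)

def decode_log(code):
    """8-bit log code -> luminance."""
    return 2.0 ** (code / 16.0 - 8.0)

bright, dark = 16.0, 0.25              # two HDR luminances, blended 50/50

linear_blend = 0.5 * bright + 0.5 * dark
code_blend = decode_log(round(0.5 * encode_log(bright)
                              + 0.5 * encode_log(dark)))

print(linear_blend)   # 8.125 -- the physically correct result
print(code_blend)     # 2.0   -- the geometric mean, far too dark
```

This is the gap the work-arounds above have to fill when the backbuffer format no longer does the blend in linear light.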

The big issue I have with this talk of HDR is that FP16 isn't high dynamic range, at all. FP32 is the starting point. So every implementation of "HDR" is some kind of simulation that takes shortcuts, rather than using the "mathematically straightforward" approach of an FP32 backbuffer with full blending and AA support - something that no GPU offers anyway.

Jawed
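Jawed's range argument is easy to quantify: IEEE-754 half precision tops out at 65504, and its step size grows with magnitude, so precision is already coarse well inside the range games use. A quick check using Python's built-in half-precision struct format (a modern convenience for illustration):

```python
import struct

# FP16's limits, which drive the "FP16 isn't really high dynamic
# range" argument: the largest finite value is 65504, and above 2048
# the representable step is already 2.

def to_fp16(x):
    """Round-trip a Python float through IEEE-754 half precision."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

def to_fp32(x):
    """Round-trip through single precision, for comparison."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

print(to_fp16(65504.0))   # 65504.0 : largest finite half-precision value
print(to_fp16(2049.0))    # 2048.0  : above 2048 the step size is 2
print(to_fp32(2049.0))    # 2049.0  : FP32 holds it exactly
```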
 
London Boy said:
Right, I read pretty much the whole thread - big mess - but I have to ask: if it's that simple, why is everyone so bothered about FP16 and FP32, and all these transistors "wasted", so to speak, on them, when you can achieve pretty much the same results with INT8?
I've argued this before, but IMO the R&D directions in certain fields/products are often influenced by marketing - and in some cases what you get are features that are more marketable rather than actually more usable.
IMO there are extreme cases of this in some products which I will not name here, but for what it's worth I feel that the HDR issue is at the very least touched by this as well (not only the push toward the brute-force approach, but also comments such as the one from Mr. Kirk about the issue being a 'software problem').

In the end it could all be considered fluff, though - there's no such thing as an ideal hw design, there will always be tradeoffs, and the software we write is designed to push the boundaries of what can be achieved, so it will inevitably collide with those tradeoffs.
There's a good deal of subjectivity on the matter as well - what's considered 'more efficient' by some may very well look 'needlessly constrained and obtrusive' to someone else.
 
Well, I guess this whole thing settles the issue of PS3 being able to do HDR and AA at the same time... and even the little "SSAA hack" with the 1080p internal renders. Obviously different studios will take different approaches, but at least now we know PS3 games can have both HDR and AA more or less "easily"...
 
Jawed said:
The big issue I have with this talk of HDR is that FP16 isn't high dynamic range, at all. FP32 is the starting point. So every implementation of "HDR" is some kind of simulation that takes shortcuts, rather than using the "mathematically straightforward" approach of an FP32 backbuffer with full blending and AA support - something that no GPU offers anyway.

Strictly speaking, all that HDR - high dynamic range - means is that colour values can extend well beyond the 0 to 1 range. Any implementation that provides this should suffice, whether it uses FP16, FP32, FP10, or NAO32. Precision and performance are a different matter, and they also depend on what compromises the hardware manufacturer or the software developer makes.
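As a concrete illustration of "values above 1.0 that get mapped down for display", here is a minimal Reinhard-style tone-mapping operator; the curve is a textbook example, not any particular game's implementation:

```python
# A scene can carry unbounded luminance internally; a tone-mapping
# pass compresses it into the displayable [0, 1) range at the end of
# the frame. Reinhard's simple operator: L_display = L / (1 + L).

def reinhard(lum):
    """Map an unbounded scene luminance into [0, 1) for display."""
    return lum / (1.0 + lum)

for lum in (0.5, 1.0, 4.0, 100.0):
    print(f"scene {lum:7.1f} -> display {reinhard(lum):.3f}")
# scene 1.0 maps to 0.500; scene 100.0 maps to 0.990
```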
 
london-boy said:
at least now we know PS3 games will have both HDR and AA more or less "easily"...

Not all games though. Some devs might end up using FP16 only, some might not use HDR at all...
 