nVIDIA RSX - Native RGB color space output?

It would be an idiotic implementation to have Folding running on 1 SPU and ignore the rest! The FLOPS count reported for the PS3 is evidence enough that all the SPUs are being used. You aren't going to get that performance from a gimped PPC core or a single SPU, and you wouldn't have a single SPU working on the job either.
 
It would be an idiotic implementation to have Folding running on 1 SPU and ignore the rest! The FLOPS count reported for the PS3 is evidence enough that all the SPUs are being used. You aren't going to get that performance from a gimped PPC core or a single SPU, and you wouldn't have a single SPU working on the job either.

Yes, it would be an idiotic implementation, but do we know for sure that it's not happening (PPE and 1 SPE, that is)? I'm just going by what the article said. Do you have another article stating something more conclusive? What is the recent FLOPS count from the project?
 
The FLOPS rating for Folding@Home is moderated for various reasons. You can read the F@H PS3 FAQ to find out more.

According to this item, the F@H app consumes a similar amount of power to running a game (200W on a launch PS3, 115W on a 40GB PS3). One of the Japanese system software developers also commented about the F@H load last year:
http://www.neowin.net/forum/index.php?showtopic=567411

Nothing yet comes close to the load that Folding@Home applies. And I can see the possibility of a fanless PS3 in the future, just like there was for the PS2.

This was before applications like BD-Live were released. Although the relative ranking may have changed, it is unlikely F@H runs only on 1 SPU.
 
Many higher-end flat screens from Pioneer and SONY have already started accepting Deep Colour and xvColour via HDMI 1.3. The processing units, and indeed the panels themselves, also both seem able to display at least 10 bits per colour channel.

This is of course still pretty much useless without an actual true 10bit source, but that is where the PS3 games come in. First of all, does anyone know for a fact that PS3 games can support outputting colours at bit depths higher than 8 per channel?

If so, since the whole 1080p thing turned out to be a bust, deep colour could potentially become an actual exclusive advantage for PS3 games.

I assume the RSX is already doing internal calculations on pixels at bit depths higher than 8, for example for HDR, but the real question is whether that precision can be maintained all the way until the pixel is displayed on newer LCD/plasma screens.

If the answer is yes, then does anyone know what the cost would be? Rendering games at 1080p resolution is possible, but most developers do not see the added cost as worth the benefit. What about Deep Colour, or even just xvColour?

I don't know much about the inner workings of the RSX, but would there be a performance penalty if a game uses a bit depth of 10 instead of 8 per channel? The only thing I can think of is the frame buffer needing more memory.
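
To put rough numbers on the framebuffer memory side, here is a back-of-the-envelope sketch of my own (not taken from any official RSX documentation):

/* Back-of-the-envelope framebuffer sizes -- illustrative only. */
#include <stdio.h>

static double frame_mb(int width, int height, int bytes_per_pixel)
{
    return (double)width * height * bytes_per_pixel / (1024.0 * 1024.0);
}

int main(void)
{
    /* RGBA8 = 4 bytes/pixel, packed 10-10-10-2 = also 4 bytes/pixel,
     * FP16 RGBA = 8 bytes/pixel */
    printf("720p  RGBA8 / RGB10A2: %.1f MB\n", frame_mb(1280, 720, 4));
    printf("720p  FP16 RGBA:       %.1f MB\n", frame_mb(1280, 720, 8));
    printf("1080p RGBA8 / RGB10A2: %.1f MB\n", frame_mb(1920, 1080, 4));
    printf("1080p FP16 RGBA:       %.1f MB\n", frame_mb(1920, 1080, 8));
    return 0;
}

Note that a packed 10-10-10-2 buffer is the same 4 bytes per pixel as 8-bit RGBA, so the extra memory only really bites if a full FP16 target has to be kept around until the end of the pipeline.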

Of course, if it is possible and the costs are negligible, then the next question is why nobody is using it. Too niche, maybe? "Supports Deep Colour" would be a neat little checkbox to have on the back of your game box, I think, niche or not.
 
First of all, does anyone know for a fact that PS3 games can support outputting colours at bit depths higher than 8 per channel?
Yes.. it can display an FP16 buffer.

Problem is, most games don't use FP16 for final display buffers. They use kludged HDR using 8 bit buffers. Titles that do use FP16 render targets tend to do a resolve to 8-bit way before the end of the display pipeline.
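
To make that resolve step concrete, here is a generic sketch of what it boils down to (not from any particular engine; real titles do this on the GPU in a pixel shader):

/* Generic sketch of an FP16 -> 8-bit resolve (tonemap + quantise).
 * CPU-side illustration only. */
#include <stdint.h>

static uint8_t resolve_channel(float hdr)     /* linear HDR value, >= 0 */
{
    float ldr = hdr / (1.0f + hdr);           /* simple Reinhard-style tonemap */
    int   q   = (int)(ldr * 255.0f + 0.5f);   /* quantise to 8 bits: this is   */
    return (uint8_t)(q > 255 ? 255 : q);      /* where the extra precision goes */
}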

Never mind that your title would have to support displays that do not handle that kind of input. I mean, my KDL40X3000 would handle it, but my mate's KDL40W2000 wouldn't.. 'tis a support/testing nightmare.

Oh, and tell a developer he has to support two output pixel formats (and any frame rate variance that may involve), *in addition to dealing with NTSC/PAL SD resolutions and HD resolution support*, and you're likely to get a copy of the PPC reference manual (Book 4) thrown at your chops.

Cheers,
Dean
 
Yes.. it can display an FP16 buffer.

Problem is, most games don't use FP16 for final display buffers. They use kludged HDR using 8 bit buffers. Titles that do use FP16 render targets tend to do a resolve to 8-bit way before the end of the display pipeline.

Excellent, thanks for finally settling this for me. Now, to spread the word so people can start pestering developers about it.

Never mind that your title would have to support displays that do not handle that kind of input. I mean, my KDL40X3000 would handle it, but my mate's KDL40W2000 wouldn't.. 'tis a support/testing nightmare.

Now, I don't know much about HDMI, but I think sending bit depths higher than 8 (i.e. deep colour) is only part of the HDMI 1.3 specs, and so is only supported when whatever sits on the other side is also HDMI 1.3 and such information is exchanged during the HDMI handshake process.

So it should be a simple matter of disabling the resolve-to-8-bit step when higher bit depths can be output, or in other words only sending out deep colour when connected to an HDMI 1.3 device. Secondly, I am pretty sure that HDMI 1.3 panels that can receive but not display or process more than 8 bits per channel automatically just truncate the 10-or-more-bit colours down to 8. So the developer should, and I say should, not have to worry about anything there. Everything happens automatically; it is part of the HDMI specifications, I believe.
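
In spirit it would be something along these lines (the struct and function names are entirely made up for illustration; this is not a real PS3 or HDMI API):

/* Hypothetical sketch: choose the output depth from what the sink
 * advertised during the HDMI handshake.  All names are made up. */
typedef struct {
    int hdmi_minor_version;    /* 3 would mean HDMI 1.3 */
    int supports_deep_colour;  /* as reported in the sink's EDID */
} sink_caps_t;

static int choose_output_bits_per_channel(const sink_caps_t *caps)
{
    if (caps->hdmi_minor_version >= 3 && caps->supports_deep_colour)
        return 10;   /* keep the extra precision; skip the resolve to 8-bit */
    return 8;        /* legacy path: resolve/truncate to 8 bits per channel */
}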

Oh, and tell a developer he has to support two output pixel formats (and any frame rate variance that may involve), *in addition to dealing with NTSC/PAL SD resolutions and HD resolution support*, and you're likely to get a copy of the PPC reference manual (Book 4) thrown at your chops.

Come now, we both know code monkeys do what they are told or they get the stick. I really don't see supporting FP16 frame buffers as such a huge hassle. I don't think SD resolutions can even support deep colour, so it is certainly moot to try there. So, really, that only leaves the 720p resolution, where you simply avoid truncating the colour bits down under one very specific condition. Not that much work, I imagine, especially since you get to write "Supports Deep Colour" in bold letters on the back of the box, and people will read it and be impressed, none the wiser that their panels may not even be able to display it. Also, the marketing guys always love adding another checkmark on the spec sheet for some product, eh?
 
So the developer should, and I say should, not have to worry about anything there. Everything happens automatically; it is part of the HDMI specifications, I believe.
Mmmm.. I don't believe you are correct. The reason HDMI negotiation takes place is so you send the right data. I'd be interested to know where you read that this happens, btw?

I really don't see supporting FP16 frame buffers as such a huge hassle

If you're in development, and feel that it's trivial, then I look forward to seeing your game! :) But I would say that changes such as this require careful use so as to not affect GPU performance, or increase the burden of testing caused by such a thing (memory footprint/usage would be different, for example). In my experience, there's enough work to do without adding this to the mix. And the developers I work with are the people who would decide this kind of support.. technical experts, if you will. They're not monkeys who just do as they're told.

Also, the marketing guys always love adding another checkmark on the spec sheet for some product, eh?

A checkmark that most people do not understand would likely not be worth the additional development time, IMHO.

Your mileage may vary, of course.

Cheers,
Dean
 
Mmmm.. I don't believe you are correct. The reason HDMI negotiation takes place is so you send the right data. I'd be interested to know where you read that this happens, btw?

I read it somewhere on AVSForum. HDMI 1.3-capable devices accept higher bit-depth colours, or Deep Colour; it is in the specifications, so they have to, I think. What the device does with those extra bits is up to the makers, though. If the device only works with 8 bits, then it is a simple matter of either shifting the register down or wiring it so only the highest 8 bits are read in, right?
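
A minimal sketch of the truncation I mean, assuming a 10-bit channel value:

#include <stdint.h>

/* Keep only the top 8 of the 10 bits; the low 2 bits are simply dropped. */
static uint8_t truncate_10_to_8(uint16_t c10)   /* c10 in 0..1023 */
{
    return (uint8_t)(c10 >> 2);
}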

If you're in development, and feel that it's trivial, then I look forward to seeing your game! But I would say that changes such as this require careful use so as to not affect GPU performance, or increase the burden of testing caused by such a thing (memory footprint/usage would be different, for example). In my experience, there's enough work to do without adding this to the mix. And the developers I work with are the people who would decide this kind of support.. technical experts, if you will. They're not monkeys who just do as they're told.

The games I work on are small and 2D, thank god. I have mostly just myself to base this on, but programmers are lazy. If people left such decisions up to me I would always do whatever is easiest to implement, not what would give the best result. That is why you should never leave game design in the hands of programmers.

A checkmark that most people do not understand would likely not be worth the additional development time, IMHO.

A checkmark that people don't understand is the best kind of checkmark, just ask TV manufacturers.
 
The games I work on are small and 2D, thank god. I have mostly just myself to base this on, but programmers are lazy.
You might want to better explain that, 'coz programmers who work long shifts and contribute to huge crunch efforts, often being led by moronic managers butt-kissing sales-teams, aren't going to appreciate being called lazy. If you don't mean that, you'll need to clarify exactly what you mean by 'lazy' - implementing a feature-set that'll satisfy the majority of the market, and withholding development that only serves a tiny niche, would be considered by plenty of folk to be a sensible economy.
 
Kamiboy said:
I have mostly just myself to base this on, but programmers are lazy. If people left such decisions up to me I would always do whatever is easiest to implement, not what would give the best result.
That's the exact opposite of many programmers I know (including myself, for that matter). Many among us have a near self-destructive tendency towards perfectionism (I believe DeanoC called it the mad-scientist syndrome) where we desire to pursue solutions purely based on how challenging they are (the more the better), rather than concerning ourselves with unimportant details such as, eh, implementation time.
The severity of this condition does tend to improve with age/experience though.

That is why you should never leave game design in the hands of programmers.
I completely agree with that, but for the opposite reason to yours ;)
 
You might want to better explain that, 'coz programmers who work long shifts and contribute to huge crunch efforts, often being led by moronic managers butt-kissing sales-teams, aren't going to appreciate being called lazy. If you don't mean that, you'll need to clarify exactly what you mean by 'lazy' -

Ok, I'll make it more clear: I am a programmer and I am lazy, and so are most of my co-workers. If our job is to program from A to B, then we will want to do it in a straight line unless instructed otherwise, and even then we will fight to get our way.

implementing a feature-set that'll satisfy the majority of the market, and withholding development that only serves a tiny niche, would be considered by plenty of folk to be a sensible economy.

Back in the day I appreciated it a lot when some last-gen game supported 480p, especially if it was a PS2 game. It was a very niche feature, few even knew what it was, and I imagine implementing it was not trivial. But nevertheless games did come out that supported it, and the few people who used the feature appreciated it a lot.
 
Did 480p inclusion sell you the title? Or would you have bought it without? That's the economy developers (and mostly those funding them) need to consider. I expect DeepColor is going to be far more niche at this time than progressive scan was last gen. If it was just a matter of adding a couple of lines of code and enabling a few hardware flags, it'd maybe be worth doing (how much difference is DeepColour going to make to games, though, especially games where lots of compromises are already being made? DeepColor output of limited HDR renderings with notable banding doesn't make sense to me!). I don't think it's that trivial, and DeanA is in a better position to state that than I am.
 
Yes.. it can display an FP16 buffer.

Problem is, most games don't use FP16 for final display buffers. They use kludged HDR using 8 bit buffers. Titles that do use FP16 render targets tend to do a resolve to 8-bit way before the end of the display pipeline.

So why not resolve to 10-bit?
You can obtain info over HDMI about what the TV can do (deep color, xvColor, resolutions, audio formats...) and then send the appropriate format to the TV (8-bit/10-bit). It's the same with audio formats, and it works fine.
 
So why not resolve to 10-bit?
Because you then have to deal with a 10-bit pipeline from render target down to the end of the pipeline, which isn't natively supported in hardware. The hardware out there just isn't designed to work with 10-bit colour. You have to work in 16-bit and resolve to 10-bit as the final step, or work in 8-bit, where 10-bit makes no odds.
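
To make "resolve to 10-bit as the final step" concrete, the 16-bit working values would have to be packed down into something like a 10-10-10-2 word at the very end. A generic sketch (the bit layout here is arbitrary and this is not RSX-specific code):

/* Pack three 10-bit channels plus 2 bits of alpha into one 32-bit word.
 * Inputs are assumed to be already tonemapped/clamped to [0,1]. */
#include <stdint.h>

static uint32_t pack_rgb10a2(float r, float g, float b)
{
    uint32_t ri = (uint32_t)(r * 1023.0f + 0.5f);
    uint32_t gi = (uint32_t)(g * 1023.0f + 0.5f);
    uint32_t bi = (uint32_t)(b * 1023.0f + 0.5f);
    return (ri << 22) | (gi << 12) | (bi << 2) | 0x3u;   /* alpha forced opaque */
}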
 
Back in the day I appreciated it a lot when some last-gen game supported 480p, especially if it was a PS2 game. It was a very niche feature, few even knew what it was, and I imagine implementing it was not trivial. But nevertheless games did come out that supported it, and the few people who used the feature appreciated it a lot.

+1

for example PAL/NTSC/progressive on Soul Calibur 2
 
Because you then have to deal with a 10-bit pipeline from render target down to the end of the pipeline, which isn't natively supported in hardware. The hardware out there just isn't designed to work with 10-bit colour. You have to work in 16-bit and resolve to 10-bit as the final step, or work in 8-bit, where 10-bit makes no odds.

OK, so do all the work in 16-bit and the final step will be resolving to 10-bit or 8-bit. The Deep Color specification goes up to 16-bit as well, but I don't know which TVs support that. LCDs with 10-bit panels and Deep Color support are fairly common these days.
 
But then you pay higher processing/bandwidth costs, which is why devs make the choice to work in lower bit depths in the first place!
 
But then you pay higher processing/bandwidth costs, which is why devs make the choice to work in lower bit depths in the first place!

Anyone know anything concrete about these costs? Like, how sizeable they might be? Surely some PS3 games must exist that, for the sake of colour fidelity, do FP16 all the way to the last step. Uncharted, maybe? That game is already about as state of the art in every way imaginable as anything I have ever seen. The lighting in the game was fantastic, and the colours! Finally a game that used colours and not just dark browns.
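
For a sense of scale, here is a rough back-of-the-envelope calculation for writing the final colour buffer alone (my own numbers, nothing measured on real hardware; overdraw and alpha blending, which also reads the target, multiply these figures):

/* Rough bandwidth arithmetic for the final colour buffer at 720p/60. */
#include <stdio.h>

int main(void)
{
    const double pixels = 1280.0 * 720.0;   /* 720p */
    const double fps    = 60.0;
    printf("RGBA8 (4 B/px): %.2f GB/s\n", pixels * 4.0 * fps / 1e9);
    printf("FP16  (8 B/px): %.2f GB/s\n", pixels * 8.0 * fps / 1e9);
    return 0;
}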

If I had to guess, I would say Uncharted is the likeliest candidate for a game that could have benefited from outputting 10-bit colours, assuming of course that it used FP16 internally.
 
PS3 Video

There is an updated section on the PlayStation 3 Secrets webpage detailing PS3 video standards.

http://www.edepot.com/playstation3.html

The new sections deal with x.v.color and deep color, along with super white and full RGB. I think it is fairly accurate, except for the clipping or scaling part. Does the PS3 scale or clip? I am thinking it clips to 16-235, because everything from 0-15 would collapse to one and the same color value, while everything above 235 would collapse to another single color value. If the PS3 scales it, then you have the potential of two different color gamuts for the same source (one mapped to 0-255 and another to 16-235). So it is clipped, right?
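
For clarity, the two behaviours being asked about would look roughly like this for an 8-bit full-range value (illustrative only; I don't know which one the PS3 actually does):

/* Clip vs. scale for a full-range 8-bit value v (0..255). */
#include <stdint.h>

static uint8_t clip_to_video_range(uint8_t v)    /* clamp into 16..235 */
{
    if (v < 16)  return 16;
    if (v > 235) return 235;
    return v;
}

static uint8_t scale_to_video_range(uint8_t v)   /* remap 0..255 onto 16..235 */
{
    return (uint8_t)(16 + (v * 219 + 127) / 255);
}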
 