How long till 30bit Deep-Color becomes common?

Discussion in '3D Hardware, Software & Output Devices' started by gongo, Jan 19, 2014.

  1. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    10 bits per channel... 1.07 billion colors... nice?
    How long has 24-bit True Color lasted...? Very long, no?
    IIRC AMD GPUs since the HD 4870 have been able to process 30-bit color... but it's only enabled in the Catalyst Pro / FirePro drivers. Is 30-bit color so expensive to implement?

    How will PC games look with billions of colors?
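    For reference, the color counts quoted above fall straight out of the channel depths; a quick Python sanity check:

        # Total colors = (levels per channel) ^ 3 for an RGB pixel.
        colors_24bit = (2 ** 8) ** 3    # 16,777,216 (~16.8M, "True Color")
        colors_30bit = (2 ** 10) ** 3   # 1,073,741,824 (~1.07B, "Deep Color")
        print(f"24-bit: {colors_24bit:,} colors")
        print(f"30-bit: {colors_30bit:,} colors")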
     
  2. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,380
    Don't you need a very good and expensive LCD monitor, and very good eyes, to see the difference between 8 and 10 bits?
     
  3. jlippo

    Veteran Regular

    Joined:
    Oct 7, 2004
    Messages:
    1,425
    Likes Received:
    536
    Location:
    Finland
    It certainly depends on the scene; 256 possible levels in a gradient is not enough.
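    To put a rough number on that (a back-of-envelope sketch, assuming a full black-to-white ramp spread across a 3840-pixel-wide screen rather than any specific game scene):

        # Width of each flat "band" in a full-screen gradient.
        screen_width_px = 3840           # assumed 4K-wide display
        print(f"8-bit:  ~{screen_width_px / 2**8:.1f} px per level")    # ~15 px bands
        print(f"10-bit: ~{screen_width_px / 2**10:.1f} px per level")   # ~3.8 px bands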
     
  4. MfA

    MfA
    Legend

    Joined:
    Feb 6, 2002
    Messages:
    7,011
    Likes Received:
    537
    With dithering it becomes pretty hard to see... in my opinion it's more important in storage formats (dithering doesn't compress well) than for display.
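    A minimal numpy sketch of what dither-on-quantization does to a smooth ramp being reduced from 10 to 8 bits (purely illustrative, not any particular game's or driver's pipeline):

        import numpy as np

        rng = np.random.default_rng(0)

        # One scanline of a smooth 10-bit ramp (codes 0..1023) being reduced to 8 bits.
        ramp_10bit = np.linspace(0, 1023, 3840)

        # Plain truncation: every 4 input codes collapse onto one output code,
        # giving flat ~15-pixel-wide bands.
        banded = (ramp_10bit // 4).astype(np.uint8)
        band_edges = np.flatnonzero(np.diff(banded))
        print(f"average band width without dither: {np.diff(band_edges).mean():.1f} px")

        # Dither: add ~one output code of noise before rounding, so neighbouring
        # pixels straddle two codes and their average preserves the lost precision.
        noise = rng.uniform(-0.5, 0.5, ramp_10bit.shape)
        dithered = np.clip(np.round(ramp_10bit / 4 + noise), 0, 255).astype(np.uint8)
        print("dithered line still averages to the original ramp:",
              np.allclose(dithered.mean(), ramp_10bit.mean() / 4, atol=0.5))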
     
  5. DavidGraham

    Veteran

    Joined:
    Dec 22, 2009
    Messages:
    3,257
    Likes Received:
    3,468
    Even with dithering, some things in current games still look bad. For example, smoke always appears with color banding, and some water surfaces and skies show the same thing, even in AAA titles with extreme graphics. To name a few: Battlefield, Crysis and ARMA.
     
  6. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    That's more a function of rendering precision issues (like integer-math alpha blending, etc.), not the bit resolution of the color channels. 24-bit color is right on the edge of human perception; the reason it has lasted as long as it has is that for most intents and purposes it is HUGELY "good enough".

    Most - as in the vast majority of - color LCD monitors today are edge-lit with cheap white LEDs, and those just don't have the spectrum needed for deep color. People in general also won't spot the difference - or at least not without side-by-side comparisons - and the difference is going to be either inconsequential to them or actually worse (deep color displays tend to look oversaturated when displaying sRGB source material), so it won't be worth the extra expense to them.

    This is not a big deal; there are much more important IQ issues to be concerned with, like better AA and texture filtering techniques.
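    To sketch the integer-math blending point above with toy numbers (a made-up source value, alpha and layer count, not any engine's actual blend path): when each layer's contribution is less than half an 8-bit code, requantizing after every blend can erase it entirely.

        # Compositing many faint layers of a very dark source (code 1 of 255):
        # dst = src*a + dst*(1-a), repeated per layer.
        src, alpha, layers = 1, 0.3, 50

        dst_float = 0.0
        dst_8bit = 0
        for _ in range(layers):
            dst_float = src * alpha + dst_float * (1 - alpha)       # keep full precision
            dst_8bit = round(src * alpha + dst_8bit * (1 - alpha))  # requantize to an integer code each pass

        print(f"float blending : {dst_float:.3f}")  # approaches 1.0
        print(f"8-bit blending : {dst_8bit}")       # stuck at 0: each contribution rounds away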
     
  7. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,567
    Likes Received:
    652
    Location:
    WI, USA
    The majority of LCDs aren't even 8 bits per channel. They dither, and it isn't always adequate to avoid banding. And then games have their own banding problems on top of that.
     
  8. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    Correct me if I got it wrong, but higher-end HDTVs use 30-bit deep color panels in conjunction with Blu-ray? That's where the Deep Color thing comes from for most people in general?

    Also, don't plasma TVs advertise as having billions of colors? What about Sony's Triluminos tech?

    I keep getting confused between color depth and color gamut...?
     
  9. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    Color depth is the number of levels of the individual color channels; gamut is the range of the spectrum the display is capable of reproducing... at least as I understand it. I.e., the two aren't directly connected.

    You might compare color depth to the digital volume control of a HiFi amplifier, and gamut to the frequency range it handles. Someone, please correct me if I'm wrong here. :D
     
  10. imaxx

    Newcomer

    Joined:
    Mar 9, 2012
    Messages:
    131
    Likes Received:
    1
    Location:
    cracks
    ...that's pretty amazing, really. It reminds me of their glory days, the Trinitron stuff...
     
  11. Exophase

    Veteran

    Joined:
    Mar 25, 2010
    Messages:
    2,406
    Likes Received:
    430
    Location:
    Cleveland, OH
    If you're seeing banding in something with dithering, then it's not due to lack of precision in the display. Banding will never be apparent when dithering is applied. You might be able to pick out the dithering itself, which will make the picture look grainier, but it will always remove the hard transitions.
     
  12. Thowllly

    Regular

    Joined:
    Feb 6, 2002
    Messages:
    551
    Likes Received:
    4
    Location:
    Norway
    They are connected in the sense that if you increase the gamut, you stretch the RGB color cube out to cover a larger part of the color space. That increases the distance between the color points in the cube if you don't also increase the bit depth, making banding more obvious.
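    A rough numpy illustration of that stretching, working in linear light only and using rounded versions of the published sRGB and Rec.2020 RGB-to-XYZ matrices (treat the exact coefficients as assumptions): the same one-code step covers more distance in XYZ once the primaries move outward, and 10 bits brings the step back down.

        import numpy as np

        # Approximate linear RGB -> XYZ matrices (D65), coefficients rounded.
        M_SRGB = np.array([[0.4124, 0.3576, 0.1805],
                           [0.2126, 0.7152, 0.0722],
                           [0.0193, 0.1192, 0.9505]])
        M_2020 = np.array([[0.6370, 0.1446, 0.1689],
                           [0.2627, 0.6780, 0.0593],
                           [0.0000, 0.0281, 1.0610]])

        def step_size(matrix, code=128, bits=8):
            """XYZ distance between two adjacent red codes at a given bit depth."""
            lo = matrix @ np.array([code / (2**bits - 1), 0.5, 0.5])
            hi = matrix @ np.array([(code + 1) / (2**bits - 1), 0.5, 0.5])
            return np.linalg.norm(hi - lo)

        print(f"sRGB      8-bit red step: {step_size(M_SRGB):.5f}")
        print(f"Rec.2020  8-bit red step: {step_size(M_2020):.5f}")  # larger: same codes, wider cube
        print(f"Rec.2020 10-bit red step: {step_size(M_2020, code=512, bits=10):.5f}")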
     
  13. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    Yeah, so it's not a direct connection, like I said. Anyway, most people would be hard pressed to discover much in the way of banding in 24-bit color, so this is not the most pressing CG IQ issue, IMO anyway. :)
     
  14. gongo

    Regular

    Joined:
    Jan 26, 2008
    Messages:
    582
    Likes Received:
    12
    OK... I come clean, Dave... the reason I started this thread is that I am planning to get a real 10-bit monitor soon... I have a 290(X)... a Radeon that can output 10-bit color... in the FirePro drivers... just not in Catalyst for us normal consumers...

    So if you could kindly get your Catalyst team to enable 10-bit color for Windows, it would be nice of you.
     
  15. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,380
    Even with 8-bit output, there is a benefit to having a 10-bit monitor if all internal monitor calculations are done at more than 8 bits (which I'm sure happens in 99% of all monitors).

    E.g. if the monitor does gamma correction, or even contrast settings etc., it will avoid different 8-bit input values getting squeezed onto the same output values.

    If the GPU does gamma correction too, then the same obviously applies there, and 10-bit support will indeed help, even if the applications only work with 8 bits.

    I'm wondering if the banding that some see with 8 bits is caused by that: GPU color transformations?
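    A small sketch of that squeezing, using a generic 1/2.2 power curve as a stand-in for whatever correction a monitor or GPU actually applies:

        import numpy as np

        codes = np.arange(256)
        gamma = 2.2

        # Apply the adjustment entirely in 8 bits: several distinct input
        # codes collapse onto the same output code near the top of the curve.
        adjusted_8bit = np.round(255 * (codes / 255) ** (1 / gamma)).astype(int)
        print("unique 8-bit outputs :", len(np.unique(adjusted_8bit)))   # < 256: codes squeezed together

        # Same adjustment with a 10-bit output stage (monitor with >8-bit
        # internal processing): every input code keeps its own output level.
        adjusted_10bit = np.round(1023 * (codes / 255) ** (1 / gamma)).astype(int)
        print("unique 10-bit outputs:", len(np.unique(adjusted_10bit)))  # 256: nothing collapses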
     
  16. UniversalTruth

    Veteran

    Joined:
    Sep 5, 2010
    Messages:
    1,747
    Likes Received:
    22
    I doubt they would do it, because at some point they could forecast potential financial losses or something... it is all about money, sadly.

    But there is something that I don't understand. In your discussion you haven't mentioned 'True colour (32-bit)' a single time. Why?

     
  17. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    8,567
    Likes Received:
    652
    Location:
    WI, USA
    32-bit color depth is 24-bit color depth, but with an additional 8 bits for alpha.
     
  18. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,174
    Location:
    La-la land
    Also, 32 bits don't divide evenly into the standard three color channels... They could have kludged something way back in the early days when this tech first became widespread, a la the 16-bit-per-pixel display modes that use 5-6-5 bits for R, G and B respectively (due to the human eye's greater sensitivity to green - I assume because of foliage and such, so that we'd identify prey and predators more easily as hunter-gatherers).

    However, it seems that a straight 8 bits per channel was the most effective choice, not just from an engineering standpoint (plain multiples of 8 are easier to build tech for) but also in terms of usefulness. 16.8M colors really is quite sufficient for nearly all usage scenarios.
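    A sketch of the 5-6-5 packing described above (bit replication on expansion is one common convention, not the only one):

        def pack_rgb565(r: int, g: int, b: int) -> int:
            """Pack 8-bit-per-channel RGB into a 16-bit 5-6-5 word (green gets the extra bit)."""
            return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

        def unpack_rgb565(pixel: int) -> tuple[int, int, int]:
            """Expand back to 8 bits per channel by replicating the top bits."""
            r5 = (pixel >> 11) & 0x1F
            g6 = (pixel >> 5) & 0x3F
            b5 = pixel & 0x1F
            return (r5 << 3) | (r5 >> 2), (g6 << 2) | (g6 >> 4), (b5 << 3) | (b5 >> 2)

        # A mid-gray survives only approximately: 5/6 bits per channel can't hit every 8-bit value.
        print(unpack_rgb565(pack_rgb565(200, 200, 200)))   # -> (206, 203, 206): close, but not exact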
     
  19. Babel-17

    Veteran Regular

    Joined:
    Apr 24, 2002
    Messages:
    1,009
    Likes Received:
    248
    The setting in MPC Home Cinema to force 10-bit RGB input/output is useless without the right drivers? Or is it that the setting has nothing to do with this?

    https://trac.mpc-hc.org/wiki/New_Renderer_Settings

    Edit: For some reason I only thought of watching video as a reason for wanting Deep Color. Sorry about that. So even if MPC-HC could force Deep Color, it wouldn't help with anything else.
     
    #19 Babel-17, Jan 23, 2014
    Last edited by a moderator: Jan 23, 2014
  20. vjPiedPiper

    Newcomer

    Joined:
    Nov 23, 2005
    Messages:
    90
    Likes Received:
    46
    Location:
    Melbourne Aus.
    10-10-10-2 is a pretty common image format in image processing and film production, i.e. 10 bits each for R, G and B, with the last 2 bits ignored. This works fine for an image that has no alpha associated with it.

    However, this is mostly used as a file format and not a display format.

    One could easily say that 30-bit colour is already in widespread use in film production, just not in the last step, which is display...
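    A minimal sketch of how such a 10-10-10-2 word can be laid out; the channel order here is just one convention, and actual file and GPU formats differ.

        def pack_rgb10a2(r10: int, g10: int, b10: int, a2: int = 0) -> int:
            """Pack three 10-bit channels plus a 2-bit spare/alpha field into one 32-bit word."""
            assert 0 <= r10 < 1024 and 0 <= g10 < 1024 and 0 <= b10 < 1024 and 0 <= a2 < 4
            return (a2 << 30) | (b10 << 20) | (g10 << 10) | r10

        def unpack_rgb10a2(pixel: int) -> tuple[int, int, int, int]:
            return pixel & 0x3FF, (pixel >> 10) & 0x3FF, (pixel >> 20) & 0x3FF, (pixel >> 30) & 0x3

        # 30 bits of colour fit in the same 32-bit word a normal RGBA8 pixel uses.
        word = pack_rgb10a2(1023, 512, 0)
        print(hex(word), unpack_rgb10a2(word))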
     