NVIDIA patenting displays technology

nAo

Nutella Nutellae
Veteran
Why are they doing that? BTW, it seems they patented, more or less, the same method BrightSide uses for their HDR displays.
VARIABLE BRIGHTNESS LCD BACKLIGHT

A display for a computer system, such as an LCD, is configured to consume less power when compared to conventional designs. The display includes a screen and at least one backlight configured to illuminate the screen. An input to the at least one backlight is adjustable to produce a desired level of brightness. The input may be computed based on a generated source image and a defined constraint. An input to the display is computed based on the input to the at least one backlight and the source image. The input to the display modifies the level of brightness provided by the at least one backlight to produce a viewable image.
 
I haven't read the patent, but from this description it seems to be just a power-saving feature that auto-adjusts the brightness control based on what is on screen. Isn't BrightSide tech much more elaborate than this?

"The input may be computed based on a generated source image and a defined constaint."

I am not really sure what this means. With BrightSide tech, the information for adjusting the backlight arrays comes from the extra bits stored per-pixel, right?
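For what it's worth, the abstract reads like a global backlight-dimming scheme: pick the lowest backlight level that can still reproduce the current frame, then scale the LCD pixel values up to compensate. Here's a minimal sketch of that idea; the function name, the `min_backlight` lower bound (standing in for the abstract's unspecified "defined constraint"), and the single global backlight are all my assumptions, not details from the patent:

```python
def dim_backlight(frame, min_backlight=0.1):
    """Sketch of global backlight dimming with pixel compensation.

    frame: 2D list of luminance values in [0.0, 1.0].
    min_backlight: hypothetical 'defined constraint' -- a floor on
    how far the backlight may be dimmed.
    Returns (backlight level, compensated panel values) such that
    panel * backlight approximately reproduces the original frame.
    """
    # The backlight only needs to be as bright as the brightest pixel.
    peak = max(max(row) for row in frame)
    backlight = max(peak, min_backlight)
    # Drive the LCD harder so pixel * backlight stays unchanged.
    panel = [[min(v / backlight, 1.0) for v in row] for row in frame]
    return backlight, panel

frame = [[0.1, 0.4], [0.2, 0.5]]
level, panel = dim_backlight(frame)
```

A dark frame like this one lets the backlight drop to 50%, which is where the power saving comes from; BrightSide's scheme is the same dual-modulation idea but with an array of individually controlled LEDs instead of one global backlight, which is why it can also extend dynamic range rather than just save power.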
 
I haven't read the patent, but from this description it seems to be just some power saving feature by auto adjusting the brightness control based on what is on screen. Isn't brightside tech much more elaborate than this?
It is.. but the main idea is the same. In fact, BrightSide tech is good(tm) for power consumption as well (hey.. where is my HDR laptop LCD screen!? ;) )
 
Oh, I missed this thread.

Interesting. Had a convo a while back in the IRC chan where I was advocating the graphics IHVs getting more active in display technology, and pushing that envelope. We more or less agreed that the display people have generally not been pushing the envelope for the high-end gamer at the kind of pace we'd like to see. And at the end of the day, those pixels have to be displayed on something, don't they?

I'd also like to see better communication between displays and cards, and a much friendlier configuration process for getting the appropriate settings dialed in on both. Gamma, temperature, and all that. I go to those "set your monitor" sites and it just looks way too freaking arcane to me. The card and the monitor ought to work it out, and not ask me more than a question or two (showing examples to help) on the way to perfection.

Which is a long way of saying it wouldn't hurt my feelings one little bit to see an NV or ATI branded high-end display for gamers. Tho of course I'd prefer to see some industry standards for card/display communication in place, rather than even more pure platformization (i.e. requires an NV card to work optimally with an NV display).
 