Seeing that people buy LCDs that can't display millions of colours, it seems they don't care about "quality" so much... maybe Nvidia was right after all - 16bpp is enough.

K.I.L.E.R said: When will current generation monitors need to be replaced?
What limitations do they currently have in relation to 3D graphics?
Is the range of colours of traditional CRTs and LCDs posing a future problem?
What could eventually replace traditional monitors?
JHoxley said: I think, if anything, the limiting factor is still internal - the linear 8-bit scale doesn't really do images justice these days.
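To see why that linear 8-bit scale runs out of steam, here's a quick back-of-the-envelope sketch (my own illustration, not from the thread): each code step is a fixed absolute increment, so the *relative* jump is enormous in the shadows.

```python
# Relative brightness jump between adjacent codes on a linear 8-bit scale.
# Steps of ~1% or more are visible as banding; the shadows are far worse.
for code in (1, 10, 100, 254):
    jump = 100 * ((code + 1) / code - 1)
    print(f"code {code:3d} -> {code + 1:3d}: {jump:5.1f}% brightness jump")

# code   1 ->   2: 100.0% brightness jump  (obvious banding in shadows)
# code  10 ->  11:  10.0% brightness jump  (still clearly visible)
# code 100 -> 101:   1.0% brightness jump  (around the visibility threshold)
# code 254 -> 255:   0.4% brightness jump  (invisible)
```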
L233 said: There seems to be a new CRT tech by Samsung IIRC where every pixel has its own electron ray or something weird like that. Supposedly, this tech enables really flat CRTs with much higher luminance and no geometric distortions. Maybe this will be a good compromise for computer displays.
Xmas said: Surface-conduction electron-emitter displays (SED), developed by Toshiba and Canon. IIRC they recently started production of 50" panels for TVs, but it's still years away from mass availability.
doob said: Neither CRT nor TFT/LCD variants will be a dead tech over the next 20 years or more.
David Kirk: I think that High Dynamic Range Lighting is going to be the single most significant change in the visual quality over the next couple of years. It's almost as big as shading.
The reason for this is that games without HDR look flat. They should, since they are only using a range of 256:1 in brightness—a small fraction of what our eyes can see. Consequently, low-dynamic-range imagery looks flat and featureless, no highs, and no detail in the shadows, the lows. If you game using a DFP (LCD display), you probably can't tell the difference anyway, since most LCD displays only have 5 or 6 bits of brightness resolution—an even narrower 32:1 or 64:1 range of brightness. On a CRT, you can see a lot more detail, and on the newer high-resolution displays, you can see not only the full 8 bits, but even more. There are new HDR displays that can display a full 16-bit dynamic range, and I can tell you that the difference is stunning. When these displays become more affordable in the next year or two, I don't know how we'll ever go back to the old way.
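The arithmetic behind those figures is simple (my own sanity check, not part of the interview): n bits of brightness resolution gives roughly a 2^n:1 range.

```python
import math

for bits in (5, 6, 8, 16):
    levels = 2 ** bits
    print(f"{bits:2d} bits -> {levels:6d}:1 range ({math.log2(levels):.0f} stops)")

#  5 bits ->     32:1 range (5 stops)    cheap LCD panels
#  6 bits ->     64:1 range (6 stops)
#  8 bits ->    256:1 range (8 stops)    typical CRT / framebuffer
# 16 bits ->  65536:1 range (16 stops)   the HDR displays Kirk mentions
```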
Sounds cool.

Graham said: This has convinced me to stop lurking and finally register:
At SIGGRAPH this year, the company BrightSide were showing off their HDR-HD displays: full 1080p resolution, 16 bits/channel. They claim a 300,000:1 contrast ratio, and I believe it. Of everything I saw there, this was by far the tech I wanted the most. The images were simply stunning (despite most of the content they were showing being filtered 8-bit).
They did have some 12-bit content to show, which was impressive. Interestingly, they were showing a 12-bit movie from EA's Fight Night Round 3 (PS3/Xbox 360 game). I asked a few questions, and it turned out that it wasn't actually an in-game movie running in HDR: they had rendered the background in HDR separately, then composited it with the 8-bit foreground for the movie.
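That composite is presumably just ordinary "over" blending once the 8-bit layer has been linearised; a guess at the maths in NumPy (the function and parameters are my own, nothing confirmed by BrightSide or EA):

```python
import numpy as np

def composite_ldr_over_hdr(fg8, alpha, bg_hdr, gamma=2.2):
    """Guess at EA's trick: linearise the 8-bit foreground, then do
    standard alpha compositing over the separately rendered HDR
    background. fg8/bg_hdr are HxWx3, alpha is HxWx1 in 0..1."""
    fg_lin = (fg8.astype(np.float32) / 255.0) ** gamma  # undo display gamma
    return fg_lin * alpha + bg_hdr * (1.0 - alpha)      # standard "over"
```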
Perhaps the most impressive aspect of this was that you got naturally occurring image bloom. No more post-processing needed - in fact, the blooming used in the EA demo actually made it look bad.
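For comparison, the fake bloom games bolt on in post looks roughly like this (a sketch of the standard bright-pass trick, not anything from the demo); on a display that can actually hit those intensities, the optics of your eye produce the halo for free:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def post_process_bloom(img, threshold=1.0, sigma=8.0, strength=0.5):
    """Classic fake bloom on LDR output: isolate the bright pixels,
    blur them, and add the halo back on top of the image (HxWx3)."""
    bright = np.maximum(img - threshold, 0.0)                # bright-pass
    halo = gaussian_filter(bright, sigma=(sigma, sigma, 0))  # blur x/y only
    return img + strength * halo
```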
Of all the videos they showed, one in particular was impressive. It was still 8-bit, but had a filter running in the TV's hardware to convert it to 16-bit depending on the gradients in the image (I would guess). Anyway, at one point in this video, a 3D genie fired some magic spell at you. There is a brief but *incredibly bright* flash. It was amazing.
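Nobody outside BrightSide knows what that filter really does; a deliberately dumb version of the idea might look like this (pure speculation on my part, and keyed off intensity rather than the gradients Graham guessed at):

```python
import numpy as np

def naive_expand_to_hdr(img8, gamma=2.2, knee=0.8, boost=4.0):
    """Speculative 8-bit -> HDR expansion: linearise, then smoothly
    push pixels above a 'knee' towards display peak, so near-white
    flashes (like that genie spell) become genuinely bright."""
    lin = (img8.astype(np.float32) / 255.0) ** gamma      # undo gamma
    hot = np.clip((lin - knee) / (1.0 - knee), 0.0, 1.0)  # 0..1 above knee
    return lin * (1.0 + (boost - 1.0) * hot ** 2)         # boost highlights
```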
The company were also talking about their HDR-JPEG and HDR-MPEG formats, which sound promising as they are apparently backwards compatible.
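If this is Greg Ward's JPEG-HDR work (which I believe is what BrightSide were building on), the backwards compatibility comes from splitting the image in two: a normal tone-mapped JPEG that any old decoder shows as-is, plus a ratio layer tucked into the metadata that an HDR-aware decoder multiplies back in. A sketch of the idea (my reading of it, not the actual spec):

```python
import numpy as np

def split_backwards_compatible(hdr, tonemap):
    """Store a normal 8-bit tone-mapped image plus the log-ratio layer
    that records what the tone mapper threw away."""
    base = np.clip(tonemap(hdr), 1e-3, 1.0)              # plain LDR picture
    log_ratio = np.log2(np.maximum(hdr, 1e-6) / base)    # lost range
    return (base * 255).astype(np.uint8), log_ratio

def recombine(base8, log_ratio):
    """HDR-aware decode: multiply the ratio layer back in."""
    return (base8.astype(np.float32) / 255.0) * np.exp2(log_ratio)
```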