When will traditional monitors come to a point where they are infeasible?

MfA said:
Wasn't some company (Philips?) developing LCD monitors which turn off backlighting during switching to get rid of blur?

That still leaves black level as a problem, but the LED-based backlighting from the HDR displays can help quite a bit there ... not cheap though.
Samsung has a new LED backlight for mass production in the second half of 2006: http://www.samsung.com/us/Products/Semiconductor/USNews/TFTLCD/TFTLCD_20051012_0000200301.asp

Mass production of 32-inch panels without color filters is scheduled to begin in the second half of 2006.

Specifications
Resolution: 1366 x 768
Color Saturation: 110% (NTSC)
Contrast: 1000:1
Response Time: 5 ms
Power Consumption: 82 W
Brightness: 500 nits
Aperture Ratio: 78%
 
It doesn't sound like the backlight can adapt to the image, though; they'd probably need a patent license to do that.
 
Nappe1 said:
Of course I tuned it with the Nokia Monitor Tester. (Surprisingly, the auto setup did a splendid job on the contrast/brightness setup. Brightness was correct to start with, and contrast needed to come down 2 notches.)
After watching The Lion King and Spirit: Stallion of the Cimarron, there's a huge difference from a CRT in animation color reproduction. (These two movies especially, because both still have several different drawing techniques applied. CG animations look really synthetic compared to these. Not that that makes them worse, just different.)

Also, I was positively surprised by the effect that the active backlight gives. The TV has a detector for ambient light in the room and changes the brightness of the backlight according to room lighting conditions. (The more expensive version even has color temperature adjustment, but I'm happy with this already. :) And of course you can turn it off if you don't like it.)

Yeah, plus feeding ONE image source to all the TVs through a SCART splitter doesn't help the image quality either.
That light detector is pretty spiffy :D
 
ChronoReverse said:
Speaking of the scaling problems with LCDs: is there any reason why there hasn't been a hardware scaler that utilizes something like a bicubic or Lanczos resize? I'm pretty sure the scaling wouldn't be nearly as blurry if something like that were available.
There are quite a few on the market. Most are bundled in a signal format converter. Only problem is that most target videophiles and cost 10x more than your monitor.
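For reference on what such a scaler actually computes: below is a minimal 1-D Lanczos-3 resampling sketch in plain NumPy. It's only illustrative (the function names are made up for the example, and real scaler chips work on 2-D images in fixed-point hardware, widening the kernel when downsizing), but it shows the windowed-sinc weighting that makes Lanczos sharper than bilinear.

import numpy as np

def lanczos_kernel(x, a=3):
    # Lanczos window: sinc(x) * sinc(x/a) for |x| < a, else 0.
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / a)
    return np.where(np.abs(x) < a, out, 0.0)

def lanczos_resize_1d(samples, new_len, a=3):
    # Upscale a 1-D signal to new_len samples with Lanczos-a interpolation.
    old_len = len(samples)
    scale = old_len / new_len
    out = np.empty(new_len)
    for i in range(new_len):
        center = (i + 0.5) * scale - 0.5      # position in source space
        lo = int(np.floor(center)) - a + 1    # first of the 2*a contributing taps
        idx = np.arange(lo, lo + 2 * a)
        weights = lanczos_kernel(idx - center, a)
        idx = np.clip(idx, 0, old_len - 1)    # clamp taps at the edges
        out[i] = np.dot(weights, samples[idx]) / weights.sum()
    return out

# Upscale an 8-sample "scanline" to 13 samples; note the slight ringing
# (overshoot) near edges, which is characteristic of windowed-sinc filters.
line = np.array([0, 0, 255, 255, 0, 0, 128, 128], dtype=float)
print(np.round(lanczos_resize_1d(line, 13)))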
 
I use my computer on and off for the entire morning/day/evening. I have a CRT monitor, and I will not give up multiple resolutions or 10-bit depth, which leads to more precise colour accuracy; the end result is how graphics are displayed, i.e. RGB/alpha, strong, luminous gamma, whiter whites, blacker blacks, gray scales, etc. A CRT needs way less dithering than an LCD to achieve the same colour output.

This is why I choose ATI (Matrox is the best, though 3D performance sucks) and my CRT. I guess it's difficult to vary colour reproduction with LCD technology.

My Viewsonic 19" CRT is 6yrs old and still plugging away; and once 0 response time LCDs or newer/thinner/better CRT comes out, I'll hold on to this behemoth..
 
L233 said:
Everyone who buys your average TN-panel "gamer" TFT obviously doesn't give a shit about image quality. It's interesting to see that some people seem to be unable to perceive the lack of color fidelity, rough gray scale gradient and shitty blacks of TN-panel TFTs. These damn things, which are usually recommended by hardware test websites and magazines, are a huge step back from a CRT in terms of image quality.
I didn't know about the difference between 6-bit and 8-bit panels when I bought my Samsung 172x. Believe me, lesson learned... :cry:
 
squarewithin said:
There are quite a few on the market. Most are bundled in a signal format converter. Only problem is that most target videophiles and cost 10x more than your monitor.

What does the Viewsonic N6 have? It seems to fare worse than the built-in scaler in my monitor.

I didn't know about the difference between 6-bit and 8-bit panels when I bought my Samsung 172x. Believe me, lesson learned...

I can tell the difference, if the software is capable of displaying it. The Xbox ports that have made up PC games for the last several years don't show the limitations of a 6-bit display.
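To put that 6-bit vs 8-bit difference in numbers, here's a tiny illustrative NumPy sketch (variable names made up for the example): truncating an 8-bit gradient to a 6-bit panel leaves only 64 gray levels, in steps of 4 codes, which is exactly the banding a smooth 8-bit gradient exposes.

import numpy as np

gradient_8bit = np.arange(256, dtype=np.uint8)   # 256 distinct gray levels
panel_6bit = (gradient_8bit >> 2) << 2           # drop the 2 LSBs: 64 levels

print("levels in the 8-bit source:", np.unique(gradient_8bit).size)                 # 256
print("levels on a 6-bit panel:   ", np.unique(panel_6bit).size)                    # 64
print("step between displayed levels:", int(np.diff(np.unique(panel_6bit)).max()))  # 4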
 
Chalnoth said:
You're not getting 10-bit color, friskyolive.
Why not? Surely if you have a 10bpc framebuffer, through a good DAC, any CRT will display that analogue signal properly? I ask that in all seriousness, my knowledge of displays is pretty thin in places. If the DAC is capable and preserves the colour data, why isn't it 10bpc?
 
Fox5 said:
What does the Viewsonic N6 have? It seems to fare worse than the built-in scaler in my monitor.
No idea, I've never used it. I'm not very familiar with them myself, but I generally know what the good ones cost.

Rys said:
Why not? Surely if you have a 10bpc framebuffer, through a good DAC, any CRT will display that analogue signal properly? I ask that in all seriousness, my knowledge of displays is pretty thin in places. If the DAC is capable and preserves the colour data, why isn't it 10bpc?
Numerous reasons. First, while Windows supports 10-bit framebuffers, it only does so in fullscreen mode, and none of the native GUI elements make use of it. Because it steals the extra 6 bits from the alpha channel, you can't use it for games, and no games I know of implement it. Unless you are running some very specialty applications, you aren't using 10 bits.

Assuming you are, and you have a magical DAC that does the 10-bit conversion properly (which is probably untrue for a consumer-level card), you still have everything down the line. Signal noise in the cable, noise on the other end, and power fluctuations all contribute to loss of quality. The circuitry in your monitor probably isn't exact enough to preserve all 1024 discretizations. Generally, "10 bits" means "we aren't using circuitry that quantizes to 8 bits", not "we've tested everything to make certain you get 10 bits end to end". There's a lot of loss along the way. From my discussions with various people in film production and the medical display community, you end up keeping something like 80% of the bits over an analog chain. Guys at (I think) Disney said they measured their end-to-end setup and got somewhere between 5 and 6 bits out of 8 in, though they had more intermediate steps. So, at best, given 10 bits going in, you could realistically expect 8 bits from a top-end ($100k) system, and I'm pretty sure you're getting less.
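As a rough sketch of the alpha-channel trade-off described above, here's one possible 10-10-10-2 packing plus the bit-retention arithmetic from the post. Purely illustrative: the channel order varies by API, pack_rgb10a2 is a made-up helper name, and the 80% figure is just the estimate quoted above.

def pack_rgb10a2(r, g, b, a):
    # Pack 10-bit R/G/B (0-1023) and 2-bit alpha (0-3) into one 32-bit word;
    # the three colour channels take 30 bits, leaving only 2 bits for alpha.
    assert 0 <= r < 1024 and 0 <= g < 1024 and 0 <= b < 1024 and 0 <= a < 4
    return (a << 30) | (b << 20) | (g << 10) | r

bits_in = 10
retention = 0.8   # the "~80% of the bits survive an analog chain" estimate above
print(f"{bits_in}-bit source -> roughly {bits_in * retention:.0f} effective bits end to end")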
 
squarewithin said:
Numerous reasons.
Excellent info, cheers. Chalnoth, is what squarewithin says what you were getting at?
 
The main reasons I switched to an LCD are that it is

1. lighter
2. better in geometry
3. at a point where the quality is acceptable in terms of PQ.

ViewSonic VX924 is not a bad display at all.
 
friskyolive said:
This is why I choose ATI (Matrox is the best, though 3D performance sucks) and my CRT. I guess it's difficult to vary colour reproduction with LCD technology.

I don't think it's clear that an ATI-based card is best (excluding Matrox) anymore. I think it's really more a matter of who put your card together. My PowerColor X800 was not really that good and my Leadtek 6200 is actually much clearer :p That's a sad truth for me, since one was $420 and the other was $50.
 
friskyolive said:
Enlighten me, then, please. ?? :) Tell me it is not ATI drivers and RAMDAC; or God forbid, my monitor? :(
Because there is no support for a 10-bit-per-channel framebuffer. Basically, it's a software issue.
 
Sxotty said:
I don't think it's clear that an ATI-based card is best (excluding Matrox) anymore. I think it's really more a matter of who put your card together. My PowerColor X800 was not really that good and my Leadtek 6200 is actually much clearer :p That's a sad truth for me, since one was $420 and the other was $50.
Heheh, I've just remembered: when purchasing my 8500, there were comparisons of each Radeon in terms of IQ, FPS and overall function; blast it, I don't recall which board and/or website, 'twas a loong time ago. In a nutshell, it depends on the company, I suppose, though I don't think there's much deviation from spec today. I've had my Sapphire X800 XT PE for approximately a year, and I've stuck with either ATI or Sapphire... well, at least PowerColor is out of the question. :D

Chalnoth said:
Because there is no support for a 10-bit-per-channel framebuffer. Basically, it's a software issue.
Are you suggesting that it's the game code and/or driver? Bah, all I know is that, in direct comparison, I am able to see the difference, and can see the ghosting/blur in motion (similar to a wave/rolling effect) and texture discrepancies/anomalies. It's very difficult to detect, though I can; yet, for the life of me, I couldn't tell the difference between 40 and 60 fps/Hz. Whoever feels happy with their purchase, that's all that matters.
 
Fox5 said:
What does the Viewsonic N6 have? It seems to fare worse than the built-in scaler in my monitor.

I can tell the difference, if the software is capable of displaying it. The Xbox ports that have made up PC games for the last several years don't show the limitations of a 6-bit display.
Well, believe me, now that I use one I can tell the difference. The dithering artifacts drive me insane! :mad: I ordered my 172x based just on word of mouth without much research. I didn't know about the different panel types, TN and PVA/MVA. I thought an LCD was just an LCD...
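For anyone wondering what those dithering artifacts are: a 6-bit panel approximates the two missing bits by flickering or patterning neighbouring pixels between adjacent levels. Here's a hedged NumPy sketch of a simple ordered (spatial) dither; real panels use their own FRC/temporal schemes, and the function here is only illustrative.

import numpy as np

BAYER_2x2 = np.array([[0, 2],
                      [3, 1]]) / 4.0   # thresholds, in units of one 6-bit step

def dither_8bit_to_6bit(img_8bit):
    # Quantize an 8-bit grayscale image to 6 bits with a 2x2 ordered dither.
    h, w = img_8bit.shape
    thresh = np.tile(BAYER_2x2, (h // 2 + 1, w // 2 + 1))[:h, :w]
    levels = np.floor(img_8bit / 4.0 + thresh).clip(0, 63)   # 64 levels max
    return (levels * 4).astype(np.uint8)                     # back to 8-bit codes

ramp = np.tile(np.arange(256, dtype=float), (4, 1))   # smooth 8-bit ramp
print(np.unique(dither_8bit_to_6bit(ramp)).size)      # 64 distinct output levels

The checkerboard-like patterning this produces in smooth gradients is what the complained-about artifacts look like.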
 
friskyolive said:
Are you suggesting that it's the game code and/or driver? Bah, all I know is that, in direct comparison, I am able to see the difference, and can see the ghosting/blur in motion (similar to a wave/rolling effect) and texture discrepancies/anomalies. It's very difficult to detect, though I can; yet, for the life of me, I couldn't tell the difference between 40 and 60 fps/Hz. Whoever feels happy with their purchase, that's all that matters.
That's just the CRT acting as a low-pass filter, since its pixel boundaries aren't discrete. It can look smoother, since it's ever so slightly blurry, but that's hardly evidence of 10 bits.
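A quick illustration of that point, under the assumption that the CRT beam can be modelled as a Gaussian spot: convolving a hard-banded ramp with such a spot produces plenty of intermediate intensities, so banding looks softer even though no extra bit depth actually reaches the screen. The numbers here are made up for the example.

import numpy as np

banded = np.repeat(np.arange(0, 256, 32), 16).astype(float)   # coarse 8-step ramp

x = np.arange(-8, 9)
spot = np.exp(-x**2 / (2 * 3.0**2))   # Gaussian "beam profile", sigma = 3 samples
spot /= spot.sum()

smoothed = np.convolve(banded, spot, mode="same")

print("distinct levels before the beam:", np.unique(banded).size)              # 8
print("distinct levels after the beam: ", np.unique(np.round(smoothed)).size)  # many more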
 