LCD with widescreen

MPI said:
I must say I snickered a bit when G.Oden trots out the power draw argument against CRTs... when he totally dismisses the same argument when it comes to Intel CPUs. Which, incidentally, is of roughly the same magnitude, btw. Heh.

TFT and CRT have their advantages and disadvantages. But saying that TFT is superior in all aspects gets you tagged as a nincompoop in my book, because that's easily proven NOT TRUE. It's not a matter of opinion, it's FACT. Those incontrovertible FACTS may not MATTER to you, however. That you _prefer_ to live with the drawbacks of TFTs over the drawbacks of CRTs, now that's an opinion.

END OF FUCKING STORY ALREADY! :devilish:
Aye.. missed that.
He's an interesting fellow.
My 3200+ Winchester OCed to 3500+ levels idles at 30C and hits 43C at full load, in an open case with no fans, btw.
Too bad my system's borked.
 
radeonic2 said:
Power is cheap btw.. I doubt 150 watts will really be noticeable on your power bill.
It doesn't take long at all to set up a CRT; it only takes a long time if you try to get it perfect.
LCDs are inferior to CRTs in all but a few ways. Here's what I'll admit: they're sharper, brighter, have better geometry (though I wouldn't call it perfect) and take up less desk space.
I'm more than confident that most decently calibrated CRTs would do better than all but the most expensive LCDs in a typical test suite.

If you honestly think LCDs are anywhere near 800:1 contrast then you are helplessly ignorant of how LCD makers fudge numbers.
Ya know how much backlight bleeding I notice?
NONE, because I don't use a display technology that doesn't suit the needs of watching movies or gaming in the dark.
LCDs are great for reading text (sharp and bright).. but for anything else, no thanks.
As I was saying in my first post, you fall under the LCD snob camp, claiming not to notice some of the downfalls of LCD technology.
That's great for you, but my eyes function pretty well, save being slightly nearsighted (20/25).
So are LCDs superior for multiple resolutions? Ya?.. I think not.
Even with my 7800 GT / 2 GHz A64 I can't run all games maxed at 1600x1200 (20" LCD res).
SS2, for example, runs well at 1280x960 maxed (with HDR on) in all but a few areas.
So if you have a 1600x1200 native display and a 600 dollar CPU/mobo/graphics card (approx. what I paid for my system) that doesn't have enough oomph for 1600x1200, you're forced to drop down to 1280x1024, which results in the wonders of scaling.. ya know, where it takes that nice sharp LCD you had at its native res and makes it blurry? Ya, that...
Of course you could get a 1280x1024 native res LCD.. but with my CRT I have the option of 1600x1200.
So excuse me for clinging to an ancient technology that still suits my needs better- better blacks, does multiple resolutions better, has better colors/gamma curves, suffers zero ghosting (yes, your LCD ghosts... you just can't notice it or have gotten used to it) and is also less expensive.

Power is not cheap, at least where I live, where prices will be increasing as much as 40% this year due to increased natural gas prices. It all adds up, you know.
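For what it's worth, here's a quick back-of-the-envelope sketch of what an extra ~150 W costs; the rate and hours of use below are illustrative assumptions, not anyone's actual figures:

    # Rough annual cost of an extra 150 W of draw.
    # Rate and hours/day are assumed purely for illustration.
    extra_kw = 150 / 1000          # 0.15 kW
    hours_per_year = 8 * 365       # assuming 8 hours/day of use
    rate = 0.10                    # assumed dollars per kWh; varies by region

    kwh = extra_kw * hours_per_year
    print(f"{kwh:.0f} kWh/year -> ${kwh * rate:.2f}/year")
    # 438 kWh/year -> $43.80/year under these assumptions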

Sure, CRTs don't take that long to set up, but LCDs require no setup at all. Now let's talk about why CRTs are inferior. They emit radiation (long term use can adversely affect your eyes), and they are big, heavy and take up desk space. Their picture quality is a long way from perfect, since the digital signal must be converted to analog and then scanned onto a nonlinear surface. LCDs are straight digital to digital and pixel to pixel, meaning the picture is close to perfect, which is why a CRT's geometry and sharpness are no match. Brightness on CRTs fades over time; while this happens on LCDs as well, there it requires a simple and cheap lamp replacement, whereas a CRT cannot be fixed short of replacing the tube, which costs as much as getting a new one anyway. Which brings me to my next point: CRTs are polluting devices.

The only true advantages a CRT holds over an LCD are contrast ratio, like I mentioned, and resolution scaling. The validity of the claimed contrast ratio on LCDs depends on the manufacturer; some of the newer ones do indeed approach what they claim, and those coming out within the next year or so will be using LEDs, so I consider it a moot point.

As far as being an "LCD snob," I used CRTs for many years, then I noticed LCDs and watched as they became better; now I think their advantages far outweigh their disadvantages. There's nothing snobbish about it. When I upgraded from my CRT to my current LCD about a year back, I was more than happy with the result and I do not regret doing so one bit. No, motion blur is not an issue at all. The only time I see it is on text when I scroll down a webpage very fast, and like I said, there are others with much better response times than mine now. As far as multiple resolutions, well, I have a 17", which means 1280x1024, so I haven't had too much trouble running games. Even when I do have to run a lower resolution I turn on AA, and it isn't that bad imo. Lower resolutions look bad in the OS UI, but not so much in games.

LCDs are superior to CRTs in every way save two areas, for now; if you cannot see that then it is you who are helplessly ignorant.

I must say I snickered a bit when G.Oden trots out the power draw argument against CRTs... when he totally dismisses the same argument when it comes to Intel CPUs. Which, incidentally, is of roughly the same magnitude, btw. Heh.

That's a bit of a broad statement. I've got a P4 (Northwood) overclocked to 3.4 GHz that currently runs at 35 C idle and never goes higher than 43 C under load using stock cooling. Then there are the Pentium M and the Core Duo, which run cooler than anything AMD has.
 
ANova said:
Power is not cheap, at least where I live, and where prices will be increasing as much as 40% this year due to increased natural gas prices. It all adds up you know.
Too bad for you then.. if you have an Intel Prescott like Golden does, you have no room to argue.

Sure, CRTs don't take that long to set up, but LCDs require no setup at all.
Ya, you're right, they always have washed out blacks and look worse at non-native resolutions.
Now let's talk about why CRTs are inferior. They emit radiation (long term use can adversely affect your eyes), and they are big, heavy and take up desk space.
Eww, radiation you say? Like my microwave? Neato, Mr. I-have-to-blow-things-out-of-proportion-in-order-to-argue-my-point.
Their picture quality is a long way from perfect, since the digital signal must be converted to analog and then scanned onto a nonlinear surface.
Yet with this new technology the displays themselves are still the problem.. great, you've upgraded the signal path while leaving inferior technology at the receiving end.
LCDs are straight digital to digital and pixel to pixel, meaning the picture is close to perfect, which is why a CRT's geometry and sharpness are no match.
Yes.. LCDs are sharper and have better geometry.. we've already discussed this.
Brightness on CRTs fades over time; while this happens on LCDs as well, there it requires a simple and cheap lamp replacement, whereas a CRT cannot be fixed short of replacing the tube, which costs as much as getting a new one anyway. Which brings me to my next point: CRTs are polluting devices.
My HP CRT from 1998 is still bright, and imo you really shouldn't have either display long enough for it to wear out.. just like you upgrade your computer, you should update your display when the time comes.. my 8 year old CRT is still doin' great :D

The only true advantages a CRT holds over an LCD are contrast ratio, like I mentioned, and resolution scaling. The validity of the claimed contrast ratio on LCDs depends on the manufacturer; some of the newer ones do indeed approach what they claim, and those coming out within the next year or so will be using LEDs, so I consider it a moot point.
Don't forget that contrast is a measure of the difference between the blackest blacks and the brightest whites; they may be reaching these high contrast ratios by using brighter backlights while still having the washed out blacks which CRT users like myself despise.
I've said this before.. I will buy an LED-backlit LCD as soon as it becomes affordable... like under 300 for a 1600x1200 display that can also run at 1280x960 if I need to (that's the res I should drop to on a 1600x1200 display since it's 4:3, right?).

As far as being an "LCD snob," I used CRTs for many years, then I noticed LCDs and watched as they became better; now I think their advantages far outweigh their disadvantages. There's nothing snobbish about it. When I upgraded from my CRT to my current LCD about a year back, I was more than happy with the result and I do not regret doing so one bit. No, motion blur is not an issue at all. The only time I see it is on text when I scroll down a webpage very fast, and like I said, there are others with much better response times than mine now. As far as multiple resolutions, well, I have a 17", which means 1280x1024, so I haven't had too much trouble running games. Even when I do have to run a lower resolution I turn on AA, and it isn't that bad imo. Lower resolutions look bad in the OS UI, but not so much in games.
That's your opinion and you're entitled to it, just as I am entitled to think you're full of shit. :D
No, actually I just think you like LCDs because you can afford the compromise.
I play games at 1280x960 minimum; I aim for 1280x960 with 4x FSAA, and if the game is running at like 70 fps then I'll bump it up to 1600x1200 and enjoy even fewer jaggies, while you're stuck at 1280x1024, which btw isn't a proper 4:3 res.

LCDs are superior to CRTs in every way save two areas, for now; if you cannot see that then it is you who are helplessly ignorant.
CRTs are superior to LCDs save a few areas, for now; if you cannot see that then it is you who are helplessly ignorant.

That's a bit of a broad statement. I've got a P4 (Northwood) overclocked to 3.4 GHz that currently runs at 35 C idle and never goes higher than 43 C under load using stock cooling. Then there are the Pentium M and the Core Duo, which run cooler than anything AMD has.
Wow, that's impressive.. 3.4 GHz you say?
Even more impressive is how your CPU fails to outperform an A64 several hundred MHz slower while consuming more power. While I don't care about power draw much, you do, so perhaps you should trade in that power hog for something that's faster and consumes less power? How bout it, mate?
You didn't say what HSF you run; my A64 3200+ Winchester's temp probe claims it idles at 30C and goes up to 37C under load with the thermal pad on.
As for the Core Duo, you won't hear any arguments from me.
It's what Intel's high end desktops should be using instead of the 65nm P4s, which still run hot and consume excessive power.
 
Too bad for you then.. if you have an Intel Prescott like Golden does, you have no room to argue.

I don't, as you should know. The Northwood draws about the same amount of power as an equivalent A64. Btw, the power draw difference between a Prescott and an A64 isn't nearly as large as that between a CRT and an LCD.

Ya, you're right, they always have washed out blacks and look worse at non-native resolutions.

No, they don't. It depends on the LCD; some are better than others. I originally bought a Sony which claimed the same contrast ratio as my Viewsonic. This was clearly not the case, as the Viewsonic looked substantially better. As far as looking bad at non-native resolutions, yeah, you've beaten this horse enough. LCDs have native resolutions because of their precise pixel to pixel mapping; they are digital devices like the computers they interface with. At any rate, I fail to see how this relates to setup time.
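To put the native resolution point in concrete terms, here's a small sketch of the scaling arithmetic (plain math, no assumptions beyond the resolutions already discussed):

    # Why 1280x1024 looks soft on a 1600x1200 native panel:
    # the scale factors are non-integer (and unequal per axis),
    # so the panel must interpolate between physical pixels.
    native_w, native_h = 1600, 1200
    src_w, src_h = 1280, 1024

    print(native_w / src_w)   # 1.25
    print(native_h / src_h)   # 1.171875
    # A CRT redraws its beam at whatever resolution it is fed,
    # so there is no equivalent interpolation step.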

Eww, radiation you say? Like my microwave? Neato, Mr. I-have-to-blow-things-out-of-proportion-in-order-to-argue-my-point.

Just because you are ignorant of the situation does not mean it does not exist. Microwave radiation has a long wavelength, which is why the mesh screens on the doors of microwaves can block it. The thick glass on CRTs does block a large majority of the radiation emitted, but not all of it. Studies have shown that staring at such screens for long periods of time can affect your vision. Oh, and I forgot the fact that CRTs scan the image onto the phosphor; this refresh is also not good for the eyes unless you run at least 80Hz.
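The refresh arithmetic itself is simple; a quick sketch (treat 80Hz as a rule of thumb, since flicker sensitivity varies from person to person):

    # Time between CRT redraws at common refresh rates.
    for hz in (60, 75, 85, 100):
        print(f"{hz} Hz -> {1000 / hz:.1f} ms per refresh")
    # 60 Hz -> 16.7 ms, 100 Hz -> 10.0 ms.
    # LCD pixels hold their state between updates, so there is
    # no phosphor-style flicker regardless of refresh rate.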

My HP CRT from 1998 is still bright, and imo you really shouldn't have either display long enough for it to wear out.. just like you upgrade your computer, you should update your display when the time comes.. my 8 year old CRT is still doin' great

Way to be contradictory; first you claim we should update our displays, and yet you yourself haven't updated in 8 years.

Don't forget that contrast is a measure of the difference between the blackest blacks and the brightest whites; they may be reaching these high contrast ratios by using brighter backlights while still having the washed out blacks which CRT users like myself despise.
I've said this before.. I will buy an LED-backlit LCD as soon as it becomes affordable... like under 300 for a 1600x1200 display that can also run at 1280x960 if I need to (that's the res I should drop to on a 1600x1200 display since it's 4:3, right?).

Have you looked at some of the newer ones, or are you too closed-minded to even attempt it? It's not that bad; in fact the only time I notice it is when I watch movies, like I said.

That's your opinion and you're entitled to it, just as I am entitled to think you're full of shit.
No, actually I just think you like LCDs because you can afford the compromise.
I play games at 1280x960 minimum; I aim for 1280x960 with 4x FSAA, and if the game is running at like 70 fps then I'll bump it up to 1600x1200 and enjoy even fewer jaggies, while you're stuck at 1280x1024, which btw isn't a proper 4:3 res.

Now that's a good one. Proper res? What constitutes a proper res? 1280x1024 is one resolution of many. I could just as well say 16:9 is the "proper" shape, not 4:3, since our vision is wide after all, and the market seems to agree now that everything is going widescreen. Running at a higher resolution does not eliminate jaggies, it just makes them smaller and harder to notice unless you sit very close; turning on AA is what matters.
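For the record, the aspect ratio arithmetic behind this spat (pure math, using only the resolutions already mentioned in the thread):

    from fractions import Fraction

    # Aspect ratios of the contested resolutions.
    for w, h in [(1600, 1200), (1280, 960), (1280, 1024), (1920, 1080)]:
        print(f"{w}x{h} -> {Fraction(w, h)}")
    # 1600x1200 -> 4/3
    # 1280x960  -> 4/3
    # 1280x1024 -> 5/4  (square pixels on a 5:4 native LCD,
    #                    slightly stretched on a 4:3 CRT)
    # 1920x1080 -> 16/9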

CRTs are superior to LCDs save a few areas, for now; if you cannot see that then it is you who are helplessly ignorant.

You already said I was helplessly ignorant; now you're just repeating yourself. At any rate, it's commonplace to at least explain why CRTs are superior. I mentioned at least 10 reasons why LCDs are better; you mentioned two.

Wow, that's impressive.. 3.4 GHz you say?
Even more impressive is how your CPU fails to outperform an A64 several hundred MHz slower while consuming more power. While I don't care about power draw much, you do, so perhaps you should trade in that power hog for something that's faster and consumes less power? How bout it, mate?
You didn't say what HSF you run; my A64 3200+ Winchester's temp probe claims it idles at 30C and goes up to 37C under load with the thermal pad on.
As for the Core Duo, you won't hear any arguments from me.
It's what Intel's high end desktops should be using instead of the 65nm P4s, which still run hot and consume excessive power.

You can spout off as much bs as you like. I've benchmarked my CPU with various applications and compared the results with various AMD equivalents, and it is hardly a bad performer; sometimes it loses, sometimes it wins depending on the application. I told you I'm running a Northwood, which draws little to no more power than an A64. I also said I was running stock cooling, meaning a stock Intel HSF. If I moved to one of the better HSFs, like the Zalman CNPS7000, then it would go down to the low 30s, but I don't really need it. Thanks for confirming my original point regarding Intel CPUs not being power hogs in general, though.
 
ANova said:
You can spout off as much bs as you like. [...] Thanks for confirming my original point regarding Intel CPUs not being power hogs in general, though.
I see I'm wasting my time here, so tell you what.
I agree to disagree with you, ok?
 
Brightness: 300 cd/m2 (typ)
Contrast Ratio: 500:1 (typ)

The higher the value the better???
 
mito said:
Brightness: 300 cd/m2 (typ)
Contrast Ratio: 500:1 (typ)

The higher the value the better???
Generally yes, but don't forget that companies will inflate their contrast ratio figures to look better on paper when the set can't actually reach them. Or they do that by turning the brightness right up to get a bigger ratio, but then the blacks are not really black..
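A quick sketch of why the quoted ratio alone can mislead; the luminance numbers here are made up purely for illustration:

    # Contrast ratio = white luminance / black luminance (cd/m2).
    # Both hypothetical panels below quote 500:1, but the second
    # has blacks twice as bright; the ratio alone hides that.
    def contrast(white, black):
        return white / black

    print(contrast(200.0, 0.4))   # 500.0 -> dimmer panel, deep blacks
    print(contrast(400.0, 0.8))   # 500.0 -> brighter backlight, greyer blacks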
 