Graphics card for Photoshop etc.

_xxx_, Matrox doesn't have any image sharpness advantage anymore.

A Radeon has sharper image quality than my G200. A Radeon also has superior TV output compared to my G400 MAX. For CRT output, I cannot tell the difference between a G400 and a Radeon 7500, 8500, or 9700, even at high resolutions. Except that (shock) the Radeons are faster at 2D.

Once we got past NVIDIA allowing AIB vendors to put horrible filters on their boards (ATI was always quite good), we got past Matrox being better for 2D. I haven't seen a blurry analog image come out of a card in years. Well, except for the VGA output on a Sony Vaio notebook (R9200) I had a few years back. Even some onboard IGP video can be decent.

Matrox is dead. They are trying to create some odd image of value with their super ridiculous prices these days, and it's all fake. Their cards are broken hardware with bad drivers at exorbitant pricing. Don't buy one. If you can get past the cloud-nine conversations at MURC, you can find plenty of issues with the cards.

I'm not sure you'd want to buy a bottom-of-the-barrel card from any vendor, though. The board circuitry gets awfully thin on some of those. Some of the super cheap Radeon 9000 boards with the VGA output hanging off a ribbon cable come to mind, lol.
 

The stuff I read at the time actually said Nvidia was fastest in 2D.

And xxx, I had a Leadtek GeForce4 that had better analog quality than any Matrox card I have ever seen, though I never had a Parhelia. Anyway, I would not say that all cards have good analog output. My Sapphire 9800 Pro had pretty crappy analog out, actually; the PowerColor X800 was better, and the GeForce 6200 was better than either of the previous ATI cards. Now I have an LCD and use DVI, so I don't know about my X1900.
 
Well, in the end, use what makes you happy. I stick with Matrox for the convenience of the package as a whole. Also, my eyes do see the difference in the IQ. But if you're satisfied with nV or ATI for work, go for it. My work notebook has the Radeon 7000 too, and it's OK for office work and viewing web pages, but you'd have to use serious force to make me use it for, say, a huge Simulink model or serious schematic/circuit layout work. That's what the other machine in my lab is used for, with the trusty old G400 in it.

And I prefer CRT to LCD for work any day; that's a total no-brainer for me. Though they will soon disappear one way or the other. Oh well.
 
Yeah, most people really can't see the many advantages that CRTs do have. The problem is that LCDs offer more tangible benefits for most people (weight, size, looks, brightness, perceived clarity, etc.).

Their inability to display true black, and the color banding they all suffer from, in addition to being stuck at 60 Hz for games, cause problems that you don't notice until you've used one for a while.
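The banding part is easy to see on paper: a panel that only resolves 6 bits per channel (as many cheap LCDs of that era did, via truncation rather than dithering) collapses a smooth 8-bit gradient into far fewer distinct steps. A rough sketch, assuming simple bit truncation:

```python
def quantize(value: int, bits: int) -> int:
    """Truncate an 8-bit channel value to its top `bits` bits."""
    shift = 8 - bits
    return (value >> shift) << shift

gradient = list(range(256))                  # smooth 8-bit grey ramp
panel6 = [quantize(v, 6) for v in gradient]  # what a 6-bit panel shows

print(len(set(gradient)))  # 256 distinct shades in the source
print(len(set(panel6)))    # 64 distinct shades -> visible bands
```

Four source shades map to each displayed shade, so a smooth ramp turns into 64 flat bands.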

I own Dell's 2405FPW and 2005FPW, and while I do love those monitors, I can see their weaknesses and still like using my old 19" Samsung SyncMaster occasionally. My friends just balk at CRTs, though. How people respond to the CRT vs. LCD debate really tells you how much they know about computers.

I am an old Matrox fan myself, _XXX_. I own the Mystique & 220, Millennium II, G200, and G400 MAX. I gave up on them after that because they stopped caring about games and their pricing didn't reflect that; the G450 and G550 were slower than the G400. Honestly, I have a hard time believing their analog output is superior enough to be worth messing with when ATI & NV offer so many other advantages. You basically pay exorbitant sums for a potentially unnoticeable gain in analog quality while giving up far more 3D capability (even if it's useless for your task). A modern Parhelia can't hold up to a Radeon 9000. I also found it fascinatingly cheap of them to chop the old Parhelia in half to build the current models.
 
Re: RAMDACs

An ancient R8500 card has a 10-bit RAMDAC; it is not some speciality of expensive Matrox cards.
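For what it's worth, the practical difference a 10-bit RAMDAC makes is just level count: more output levels per channel means finer steps on the analog ramp. A back-of-the-envelope sketch (generic DAC math, not any specific card's spec sheet):

```python
def dac_levels(bits: int) -> int:
    """Distinct output levels for an n-bit DAC channel."""
    return 2 ** bits

for bits in (8, 10):
    levels = dac_levels(bits)
    step = 100.0 / (levels - 1)  # smallest step, as % of full-scale output
    print(f"{bits}-bit: {levels} levels, ~{step:.3f}% per step")
```

So a 10-bit DAC has four times the levels of an 8-bit one (1024 vs. 256), which mostly matters for gamma correction headroom rather than visibly smoother gradients.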
 
But there is the possibility that OLED or LCD may someday rival a CRT's dynamic range... I hope.
 
SEDs, you mean?

BTW, _xxx_, I have a suggestion for you to look into.

Find a nice CRT that has DVI inputs. They do exist, and that might really be the ticket to free you from legacy hardware while still giving you excellent clarity. It's just an idea, but I think it might be worth it. I am not really sure how they work, though; perhaps they just use the analog signal that is present on the DVI cable, to be honest. But if it uses the digital link, even if the monitor converts it back to analog internally, you would still end up with consistent results instead of playing the lottery each time you bought a card, which is how I felt.
 
Simply take a look at the "workstation graphics cards" offered by HP and the like for that class of machine, meant for CAD/CAM and Photoshop work.

They're just cheap nV/ATi chips with a very hefty price tag and special drivers.

If money is no object, buy one of those. Otherwise, just buy the cheapest nV/ATi card with at least 128 MB of RAM. You won't notice any difference unless you do professional offline rendering.
 