Graphics card for Photoshop etc.

Tokelil

I'm getting more and more questions from friends/coworkers about what to buy when the primary use of the graphics card is to do Photoshop (and similar 2D image editing). And since I have no skills for those things myself, I rarely touch those programs and have very little knowledge about how they work and are optimized...

So what is recommended these days? Good image quality is ofc. a consideration, but what about speed? Is there even a difference? (I'm guessing that the programs aren't using the 3D part of the graphics card anyway.) Is there any gain in going from 128 to 256 MB? (Since the programs are 2D, I would think it doesn't matter either.)

Thanks
 
Get a Matrox, there's nothing like a Matrox for 2D-only work. Sure, it's expensive, but it's one of the best possible alternatives.
 
Any current ATi or NV card will do the job.
Of course, proper calibration is still needed (especially if doing pre-press work, RGB<->CMYK, etc.).

Matrox R.I.P.
 
Anything Matrox. Neither nV nor ATI even comes near Matrox chips (even those from 4-5 years ago) regarding 2D quality. More RAM is always better, too.
 
Anything Matrox. Neither nV nor ATI even comes near Matrox chips (even those from 4-5 years ago) regarding 2D quality. More RAM is always better, too.

Prove it, then, that what you say is not f-a-n-b-o-y-ism :p

Matrox has better 2D quality? In what aspect? Crisper image? Faster 2D operations (like showing an image in 0.01s instead of 0.02s)?
A few months ago I looked at the output of a G400. It was OK, no more, no less. Yes, it was connected to a very good CRT monitor which cost ~1000 US$ 5-6 years ago.
 
Put two identical monitors beside each other, one with Matrox and one with nV or ATI and see for yourself. Worlds of difference. Better image, nicer colours, better scaling, you name it. And surely I'm not a "fan" of whatever (except for old Kramer guitars). And their multimonitor support literally kills everything else on the market as well.
 
Put two identical monitors beside each other, one with Matrox and one with nV or ATI and see for yourself. Worlds of difference. Better image, nicer colours, better scaling, you name it. And surely I'm not a "fan" of whatever (except for old Kramer guitars). And their multimonitor support literally kills everything else on the market as well.
Ok, I'll try... although the last time I compared a Matrox G100 with a Savage3D, the S3 won :D

One more point on why buying an expensive card would be pointless: I bet the guys who use Photoshop use LCD monitors... and probably one of those with super-ultra-low response times. Do you expect the card to somehow overcome the colour problems on those monitors?!
In the end, it's the human behind the monitor who makes things look good; the card and the monitor can help, nothing more. ;)
 
10-bit RAMDACs alone are enough to give you a crisper picture. Then you have stuff like glyph AA, multi-display zoom, separate ICC profiles for the two (or three) displays, etc. Really no comparison. Oh, and they have a dedicated card (Parhelia HR256) for 9 MP LCD displays (3840 x 2400) as well.
 
Bullshit.

Don't bother with an expensive Matrox card.

This is from someone who creates ads for magazines, brochures, the web, etc. all day.

Advise them to go with a last-generation card, like the Radeon 9200; unless there is a distinct need for dual DVI or something, they'll be fine.
 
Put two identical monitors beside each other, one with Matrox and one with nV or ATI and see for yourself. Worlds of difference. Better image, nicer colours, better scaling, you name it. And surely I'm not a "fan" of whatever (except for old Kramer guitars). And their multimonitor support literally kills everything else on the market as well.
Sorry, but I have almost every Matrox graphics card from the Millennium I to the G400MAX, and I even used a Parhelia AGP 8x (the newer core revision), and I can't say that today's high-end graphics cards from good vendors have worse analog output quality; in other words, I can't say that Matrox has a crisper image. The image is as crisp as it can be.
 
From my experience, 2D IQ doesn't really change from card to card at all. Get something with the outputs you need and you should be set. Even something from the GF3 era would be fine IMO. Going with a Matrox offering for their supposedly better IQ would be a total waste of cash.
 
Bullshit.

Don't bother with an expensive Matrox card.

This is from someone who creates ads for magazines, brochures, the web, etc. all day.

Advise them to go with a last-generation card, like the Radeon 9200; unless there is a distinct need for dual DVI or something, they'll be fine.

Your opinion is just that, your opinion. My eyes see the difference easily; it's as big as mp3 compared to regular CDs.
 
10-bit RAMDACs alone are enough to give you a crisper picture. Then you have stuff like glyph AA, multi-display zoom, separate ICC profiles for the two (or three) displays, etc. Really no comparison. Oh, and they have a dedicated card (Parhelia HR256) for 9 MP LCD displays (3840 x 2400) as well.

10-bit RAMDACs mean that you get fewer quantization artifacts when using gamma ramps; they have nothing to do with crispness. Crispness of an image requires the response time of the RAMDAC to be low.
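To illustrate the quantization point, here's a rough sketch in Python. It assumes a plain 2.2 gamma curve and 256 input levels; the function and numbers are illustrative, not any driver's actual LUT. An 8-bit ramp collapses several input levels onto the same output code (visible as banding), while a 10-bit ramp keeps every level distinct.

```python
# Sketch: how many of 256 input levels survive as distinct output codes
# after a gamma ramp is quantized to 8 vs 10 bits. (Assumed 2.2 gamma.)

def gamma_ramp(bits, gamma=2.2, levels=256):
    maxv = (1 << bits) - 1
    # Map each input level through x^(1/gamma), then round to the DAC's bit depth.
    return [round(((i / (levels - 1)) ** (1 / gamma)) * maxv) for i in range(levels)]

distinct_8 = len(set(gamma_ramp(8)))    # fewer than 256: some levels collide
distinct_10 = len(set(gamma_ramp(10)))  # all 256 remain distinct
print(distinct_8, distinct_10)
```

The collisions in the 8-bit case happen at the bright end, where the gamma curve is flattest, which is exactly where gradient banding tends to show up.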

ICC profiles actually have very little to do with the graphics card. All the ICC stuff is handled by Windows itself.

I haven't seen a Matrox card recently, but they certainly used to have nicer image quality.

CC
 
ICC: dunno if it's a difference in the drivers or whatever, but I haven't seen the possibility to handle these separately for two displays anywhere else. Though I didn't tinker around with that much anyway.
 
It may be that they supply an application that lets you fiddle with ICC profiles.

For serious photoshop work you should be using a custom ICC profile anyway, the devices that you use to generate one of these normally come with software to select the ICC profile.

I am sat here with two monitors running on a Radeon 9600 with separate profiles for each.

CC
 
I don't work with Photoshop or the like, but with lots of CAD stuff. I couldn't care less about colours, but the IQ is important. And the scaling (when zoomed) on nV and ATI is abysmal (artefacts, extreme aliasing, etc.). Hell, even zooming a normal JPG in ACDSee gives you a blocky mess with those two.
 
Thank you for your input, all. :) Sounds like it is as I was thinking and there is no speed advantage in going with fast 3D cards or huge amounts of GPU RAM. (Too little system memory is probably the biggest performance killer.)

Image quality is ofc. debatable, but most people I know use LCDs these days anyway.

Btw, 10-bit RAMDACs were mentioned for the Matrox cards. Don't the X1xxx cards have those as well?
 
From B3D Avivo article:

Although we've been used to 10-bit per component RAMDACs in previous graphics parts, this was limited to just the DAC; with Avivo products, however, there are dual 10-bit per component pipelines prior to the DACs, operating at 1.07 billion colours as opposed to 16.7 million on standard 8-bit display pipelines. The pipelines perform the following functions at 10-bit precision, and they apply to both video playback and standard 2D or 3D operations where needed:

* Gamma Correction
* Colour Correction
* Video Scaling

Each of the 10-bit display pipelines has the complete set of post-processing functionality, matched to the properties of the particular display output it is ultimately driving. A single video source can be displayed on two independent outputs, each with its own scaling, gamma correction and positioning, again tailored to the output each is displaying on.
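The "16.7 million" and "1.07 billion" figures in the quoted article are just the channel depth cubed, i.e. 2 raised to the total bits per RGB pixel:

```python
# Colours representable with 3 channels at n bits each: (2**n)**3 == 2**(3*n)
colours_8bit = (2 ** 8) ** 3    # 2**24 = 16,777,216  (~16.7 million)
colours_10bit = (2 ** 10) ** 3  # 2**30 = 1,073,741,824 (~1.07 billion)
print(colours_8bit, colours_10bit)
```

So the 10-bit pipeline carries 64x as many distinct colours through its processing stages, even though the final output may still be quantized by the display.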
 
When using analog signals, the sharpness of the picture is mostly a function of the analog parts on the card and your monitor cable, not the chip. At least at very high resolutions/refresh rates. I have experienced a notable increase in image quality by replacing the standard D15-D15 cable with a high-quality D15-BNC cable. But this only matters when the monitor ships with a substandard cable.

And Matrox cards had consistently very high 2D quality. There were, for example, a large number of TNT cards that really couldn't reasonably do much more than 1024x768. Some were good, of course. But it was hard to tell without either trying a particular brand and model yourself, or finding a review that was interested in more than FPS in Quake. Plus, PowerDesk, with its colour correction, per-resolution refresh rate settings and image calibration, was pretty sweet. It was much easier to just pick up a Matrox than to guess whether some other card was any good, if all you cared about was 2D.

All this is a moot point with DVI, of course. And I think that even the cheap cards or integrated solutions of today no longer totally suck. I might be wrong, though.
 