NV30 / 9700 PRO Image quality

John Reynolds said:
Galilee said:
Either way, the IQ was pretty much the same. The colors were just as good, and aside from the edges it looked very similar. But one thing was different: the R300 pictures are a little bit sharper. Not much. I guess it's because of the one level higher aniso.

I tend to disagree a little. That pic's edge AA is noticeably better than what I can get out of Morrowind at 16x12 with even 4xS enabled. I play the game at 1280x960 with 2x AA and 4x AF (in-game options maxed, except for shadows, which are at half) and get decent performance on my P4 2.4B with 1066 RDRAM. I do have a 19x14 shot I use as a desktop image.

Yes, I was thinking mainly about the colors. On my 64MB card I can't use 4X at 1600*1200 :(
 
I think full screen shots should be included too... that's what we look at, correct?
Sure, there is a place for these thumbnails, but not showing the overall screen shot takes away from the quality of the review.
 
I disagree with Galilee's opinion on the GF4 image quality as many of the units I've tried and tested have been utter crap.

My biggest beef with GeForce cards has been the lack of any baseline of quality components: buying, say, a Gainward versus a Leadtek versus an Asus (or whatever), your 2D and overall signal quality can be night and day between the bunch.

I think the 'reference' design should set a standard for quality and/or ratings of individual components; 3rd party manufacturers can then either meet or exceed this 'reference' standard (or fall short and be unable to earn the 'reference' rating). This will go a long way toward assuring at least some baseline is met, while still giving 3rd parties flexibility in shopping for individual components.

I went through a dozen different GF4's before finally settling on a VisionTek Ti4600, which still isn't 100% as far as sharpness and signal quality go, but is better than many I installed and tested. It was the same case on the GF3 with the Leadtek WinFast GF3, and only after unboxing and testing six different GF3's as well. Obviously little to no difference can be detected if one tries hard to mask it, such as by testing only at lower resolutions or refresh rates, in non-colorful conditions, or on non-high-end monitors. At 1024x768x32 @ 60Hz on a Viewsonic/ADP/Relisys with Quake2, there is no discernible difference. At 1280x1024 @ 100Hz on a Sony or NEC with Morrowind or Dungeon Siege, the difference is night and day.

As far as rasterization methods on the chipsets themselves, you can argue this until you are blue in the face. The fact of the matter is that ATI and NVIDIA chipsets do things differently. Mipmapping, texturing, stenciling, alpha blending: they are very, very different. Subjective opinion is what reigns at the end of the day, but sane/fair/non-fanboyish scrutiny can easily pick out the differences. IMO, alpha blending in OGL is very clean on the GF4 yet ugly on the 8500, while mipmapping in D3D is very clean on the 8500 yet ugly on my GF4. You can pick out case-by-case examples that illustrate the different methods these chipsets use for various effects, so there is no sweeping "XYZ is bettah!" since both have different methods and different goals in mind.
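For what it's worth, the alpha blending mentioned above is the standard "over" operation on both vendors' parts; the visible differences tend to come from internal precision and rounding rather than from the formula itself. A minimal sketch of the math, in plain Python (the rounding step is just one illustrative choice, not either chip's actual behavior):

```python
def alpha_blend(src, dst, alpha):
    """Standard 'over' alpha blend: out = src*a + dst*(1-a).

    src and dst are single 0-255 channel values; alpha is in [0, 1].
    Hardware implements this same equation, but differs in internal
    precision and rounding, which is one place per-chip differences
    in blended output can creep in.
    """
    return round(src * alpha + dst * (1.0 - alpha))

# Blending a quarter-opaque bright pixel over a mid-gray background:
print(alpha_blend(200, 100, 0.25))  # 125
```

Run the same blend at different internal precisions (e.g. truncating to fewer bits before the add) and you can reproduce the kind of banding differences people report between cards.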
 
Maybe, but most GF4 reviews seem to think that the 2D is just as good as on Radeon cards. Maybe you have a better eye.

Personally, I bought a Gainward GF4 since I have had very good experiences with them earlier. So far I think the 2D is very good (at least at 1600*1200; I can't go higher).
 
I give the Gainward GF4 very high kudos, as it had 2D/IQ about as good as the VisionTek I wound up with. It was actually a pretty close tie between them.

My deciding factor between those two was the VisionTek Ti4600 bundle, as PowerDirector Pro (with a cheap $25 upgrade to "full", which only added MPEG-2 format encoding) was included, and I really like this program for capture and editing. :)

Maybe, but most GF4 reviews seem to think that the 2D is just as good as on Radeon cards.

Yeah, and there are also R300 reviews now that somehow think 6xRGAA and 16xAF look just as good as 4xOGMS + 4xAF. What the monkeys at websites have to say about image quality is of zero value, as it can easily be disproven by plugging a couple of cards into some high-end monitors.
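The rotated-grid vs ordered-grid point can be made concrete: on a near-horizontal edge, a rotated pattern spreads its samples across more distinct rows, so it produces more intermediate coverage levels. A quick sketch (the sample positions are illustrative placeholders, not the actual NV30/R300 patterns, and both are shown at 4x to keep the comparison simple):

```python
# 4x ordered grid: samples share rows, so a horizontal edge sweeping
# through the pixel crosses only 2 distinct sample heights.
ORDERED_GRID_4X = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x rotated grid: every sample sits on its own row.
ROTATED_GRID_4X = [(0.375, 0.125), (0.875, 0.375),
                   (0.125, 0.625), (0.625, 0.875)]

def edge_gradient_levels(pattern):
    """Count the coverage steps a horizontal edge can produce in one pixel.

    Each distinct sample y-offset adds one intermediate coverage value;
    +1 accounts for the fully uncovered state.
    """
    distinct_rows = {y for _, y in pattern}
    return len(distinct_rows) + 1

print(edge_gradient_levels(ORDERED_GRID_4X))  # 3 levels: 0, 1/2, 1
print(edge_gradient_levels(ROTATED_GRID_4X))  # 5 levels: 0 to 1 in quarters
```

More gradient steps on near-axis edges is exactly where rotated-grid modes look smoother than ordered-grid modes of the same sample count, which is why comparing "6x" to "4x" by number alone misses the point.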
 
It might be that you are absolutely right, but like most people I have no way of showing you otherwise. Most users can't test ten different video cards before they buy something. I will have to put my trust in reviews, and when AnandTech compares five GF4 cards and rates them Very Good at 1600*1200, I would have to trust them. The other solution is to trust you, but I don't see why I should trust you instead of five or six reviews.

The only card I have compared mine to is a Matrox card, and at 1600*1200 the quality was the same. Maybe I am just lucky with my card.

The other factor is: are the customers happy? Well, besides you I have yet to meet a GF4 user who is unhappy with the 2D quality. That might not prove anything, because as everyone knows: Nvidia users don't have a clue about quality.
 