Which has better image quality, the Kyro or the Radeon?

Pottsey

Newcomer
I know the Kyro and Radeon have better IQ than a GF2, but where does the Kyro come in the list? Is it above the Radeon, the same, or below? How does the GF4 fit into this?

What I want to know is: if you picked almost any random game and ran it at 1024x768x32 with all settings maxed, could you tell which was a Kyro II SE and which was a Radeon from an IQ point of view, as long as nothing special was turned on in the drivers like FSAA or fast colours etc.?

What about if you just pulled an average game player off the street? Would he be able to say which card looked better, or would both cards be too close?

For the above I don’t want to include games with pixel shaders or anisotropic filtering, as it’s clear the Radeon would win. Also, games in 16-bit colour only should not be counted, as it’s clear the Kyro wins there.

How do TV-out quality and FSAA factor into this?

The only way I can think of to test any of this is for someone to send me screenshots from both cards; I would host them and make a new post with both sets of shots. Then I’d ask everyone to guess which is the Kyro and which is the Radeon, something like the rough script below.
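Just as an illustration (the filenames here are made up for the example), a little Python sketch like this could copy the shots under anonymous names and keep the answer key in a separate file so nobody can peek:

```python
import random
import shutil

# Hypothetical filenames -- whatever the two screenshots end up being called.
shots = {"kyro2se_1024.png": "Kyro II SE", "radeon_1024.png": "Radeon"}

# Shuffle so the posted order gives nothing away.
order = list(shots)
random.shuffle(order)

with open("answer_key.txt", "w") as key:
    for n, src in enumerate(order, start=1):
        dst = f"shot_{n}.png"                 # anonymous name that gets posted
        shutil.copy(src, dst)
        key.write(f"{dst} = {shots[src]}\n")  # keep the answers to reveal later
```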
 
Just to chime in here about this 'IQ' or picture quality rubbish.

Here at work, we recently upgraded a lot of our computers from Dell GX1 desktops with integrated ATI video to new DIY computers with Radeons.

The Radeons' output on our 20-inch monitors was WORSE than the crappy ATI Rages integrated on the GX1s. So much worse that most of us went out and splurged on $50 GF2 MXs, which made a world of difference in clearing up the blur.

Bleh!
 
About Kyro vs Radeon...I doubt an "average gamer off the street" would be able to tell the difference...AFAIK they're pretty close.
Granted I didn't have my Kyro for very long, but I liked the IQ...I also liked it on my Radeon...

Russ...I swapped from a GF2 MX (Hercules) to a Radeon DDR (OEM) and there's no way in H I'll go back to GF2 graphics...(19" monitor)
Fallout, for instance, never looked as good as it does now with my Radeon...
We all base our opinions on our own personal experiences and hence I will gladly recommend it whereas you won't...(just like I'll never recommend an Abit MoBo due to my experiences with them...)
 
Hmmm, maybe it was a Radeon chip, but from a different board manufacturer. That would certainly explain it not living up to its reputation and looking worse than Dell's integrated ATI video.

Anyways, it all goes to show that the real blame lies with the manufacturer of the card, not the chip IHV. Speaking from experience, there's only so much you can do to steer an OEM toward your recommendations (short of not selling them the chip).
 
No, the fault lies in the reference design. ATI has very strict requirements for their reference design, and since they changed their business model there have been no complaints about 2D on Club 3D, Hercules, or Gigabyte Radeon 8500 cards.
Nvidia even admits that the 2D has issues stemming from their reference design.
http://www.tomshardware.com/graphic/02q2/020522/ti4400_4600-02.html

NVIDIA is aware of this problem and has promised to address it through stricter reference design requirements.

We have Dell Optiplexes where I work with the Intel integrated chipset, and the image quality on that chip is very bad too. I think Dells are garbage anyway...my wife works for them and I get all the juicy stories...lots of issues.
 
I think both the Kyro and the Radeon offer about the same IQ, texture compression included; they're very close. The Kyro cards really surprised me on performance; it has to be the best 'bang for buck' card on the market.
 
I can comment on this from personal experience. You see, I upgraded from a Kyro 2 to a Radeon 8500 and gave my friend my "old" Kyro. Just for fun we set the two up next to each other and played/tested many games for several hours.

This is the truth.

The Radeon 8500 looks *considerably* better in 32-bit color. There is simply no contest. In 16-bit color the Kyro looks better...UNTIL you crank the aniso on the Radeon; then they are very close.

The biggest specific differences:

1. The Radeon has MUCH better depth precision. Several games with long view distances get pretty shaky on the Kyro, most notably Tribes 2 and NOLF. The Radeon is simply massively more accurate; the Kyro image gets jaggy, with texture popping and tearing.

2. The Radeon's colors simply have a much deeper, more vibrant feel. Colors on the Kyro seem a bit pale when the two are set side by side.

3. The overall sharpness and precision is crisper on the Radeon.


Here, however, is the BIGGEST problem in comparing the two...

You simply *can't* play games on the Kyro with "everything cranked"; it simply dogs the card to death. Whereas you can crank the details, and I mean EVERYTHING, with trilinear and 16x aniso on the Radeon. Do that on a Kyro, even with reduced settings, and you are playing a slide show in many, many of today's games.

In all fairness, there are a couple of games that actually play faster on the Kyro than on the 8500. Anarchy Online, for example, is much smoother on the Kyro.

The bottom line is this:

With all the settings both cards are capable of completely MAXXED, and all game settings (in general) completely MAXXED, the Radeon 8500 simply looks better. More than that, it's playable, usually more than playable, even at higher resolutions (note: excluding 6x SV). The Kyro under the same conditions becomes very unstable, and at anything above 1024x768x32 it's basically a slide show.


These are my personal findings based on an actual several-hours-long, side-by-side comparison.
 
Reverend said:
I know the Kyro and Radeon have better IQ than a GF2...
In what way?


Hmmm, texture compression and LOD come to mind right away. For example:

Geforce 2 GTS
1010979849aGDrY8relO_3_1_l.jpg


Radeon 32 Meg

1010979849aGDrY8relO_3_2_l.jpg


http://www.hardocp.com/article.html?art=MTEzLDM=

Geforce 2 No Texture Compression
q3_1_geforce.png


Geforce 2 Texture Compression Enabled

q3_1_geforce_comp.png


q3_2_geforce_comp.png


Radeon No Texture Compression
q3_1_radeon.png


Radeon Texture Compression Enabled
q3_1_radeon_comp.png

q3_2_radeon_comp.png


Even with no texture compression enabled the image quality is better, and LOD is better, but there is something else...not sure what it is??
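For anyone wondering what S3TC/DXT1 actually does to a texture, here is a minimal Python sketch of how a single 4x4 block decodes (just an illustration, not any particular card's hardware path). The precision used for the two endpoint colours and the interpolated in-between colours is one place where implementations can legitimately diverge, which is my guess at where differences like the Quake3 sky banding come from:

```python
import struct

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1/S3TC block into a list of 16 RGB texels (4x4)."""
    c0, c1, bits = struct.unpack("<HHI", block)

    def rgb565(c):
        # Expand a 16-bit 5:6:5 colour to 8 bits per channel.
        r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
        return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

    p0, p1 = rgb565(c0), rgb565(c1)
    if c0 > c1:
        # Four-colour mode: two colours interpolated between the endpoints.
        # How precisely this interpolation is carried out is implementation-dependent.
        palette = [p0, p1,
                   tuple((2 * a + b) // 3 for a, b in zip(p0, p1)),
                   tuple((a + 2 * b) // 3 for a, b in zip(p0, p1))]
    else:
        # Three-colour + transparent-black mode.
        palette = [p0, p1,
                   tuple((a + b) // 2 for a, b in zip(p0, p1)),
                   (0, 0, 0)]

    # Each of the 16 texels picks a palette entry with a 2-bit index.
    return [palette[(bits >> (2 * i)) & 0x3] for i in range(16)]
```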
 
"I know the Kyro and Radeon have better IQ then a GF2...

In what way?"

I have only compared two GF2 MXs next to the Kyro, and I guess the best way to describe the GF2 MX is that the colors looked washed out and pale on the walls in the Lobby test in 3DMark 2001. Most games suffer from that.

Perhaps we just had bad GF2 MXs, but most users I talk to who upgraded from an MX to a Kyro say the Kyro II looked far better. So I took that as normal.

I have also been told that the GF2 MX looks bad in games which use more than 2 texture passes, as on the second pass the frame buffer is lowered to 16-bit. Is that true? If it is, then that explains why the GF2 MXs in my house look so bad in 3DMark 2001.
 
Doomtrooper said:
Even with no texture compression enabled the image quality is better, and LOD is better, but there is something else...not sure what it is??

The original Radeon supported the exact same anisotropic filtering as seen in the Radeon 8500. It was on by default.
 
Chalnoth said:
Doomtrooper said:
Even with no texture compression enabled the image quality is better, and LOD is better, but there is something else...not sure what it is??

The original Radeon supported the exact same anisotropic filtering as seen in the Radeon 8500. It was on by default.


No it didn't; the 8500's anisotropic filtering is much improved over the original Radeon's...anisotropic filtering is not on by default, you have to enable it in the OGL control panel. I own the cards ;)

The article said all settings were the SAME, 32-bit color, etc....no anisotropic.

opengl.gif
 
Typedef Enum said:
As a point of reference...

I looked at the Quake3 shot and attempted to replicate it on a GF4.

shot0002.jpg

Looks better than a Geforce 2 :p

I assume texture compression is on; are there tweaks being used here, like forcing lower compression??
 
I have a Videologic Vivid! (KYRO 1) 32 MB card, and I recently bought my brother a Radeon 7200 64 MB card. I installed the Radeon 7200 in my machine and tested all my games on it, and this is what I observed:

1) the Kyro has much better 16-bit output; I did not like the Radeon's 16-bit output at all and told my brother to play only in 32-bit mode
2) both cards are roughly equal in 32-bit output, but the graphics output of the two cards looks different

Performance was roughly the same too, as I ran a few benches.
 
Doomtrooper said:
No it didn't; the 8500's anisotropic filtering is much improved over the original Radeon's...anisotropic filtering is not on by default, you have to enable it in the OGL control panel. I own the cards ;)

How is it improved? And there is no possible way to improve texture LOD without increasing aliasing artifacts, unless some degree of anisotropic filtering is enabled.
 
The Quake3 shot (red sky) looks familiar...I know I took some screenshots for my review w/ the red-sky.

The shot I took was with TC enabled...10x7x32, etc.

The Unreal stuff never exhibited serious S3TC issues, AFAIK...about the only thing that ever stood out was banding in/around bright light sources.

I think the amount of compression error is directly related to the usage/selection of different textures. Quake3 has always had some areas where it tends to be more visible than others.
 