Which has better image quality, the Kyro or the Radeon?

The screenshots don't lie; the LOD bias was lower on the GTS shot, hence the blurriness.
As for the anisotropic improvement, ATI has improved its adaptive approach to anisotropic filtering vs. the original Radeon, as seen here in this forum with screenshots from Bambers. The 8500 still has trouble with 45-degree angles, but you know this :LOL:
 
Here is a shot on my 8500 with the same settings, S3TC enabled and some light sources..
shot0012.jpg
 
The GF2 MX does seem to have very washed-out colours in my experience. I noticed the 8500 was more colourful than my GeForce256 SDR, but the GF2 MX is far below both of these cards. It just looks dull.
 
Try taking a shot of the sun in KGalleon.

Here are two shots on a GeForce DDR:
UTS3TC.jpg

UTNOS3TC.jpg


I'd give you a shot of it on my GeForce4, but I have since copied the uncompressed textures for that sky from the original CD, eliminating the problem.
 
Doomtrooper said:
Kgalleon it is, who plays Deathmatch anymore...boring ...

Yes, the sky definitely looks much better than it does on any of the GeForce boards (btw, I chose that map because that particular section actually looks much worse on GeForce4 boards than on previous ones). Note that it is fixable. For those GeForce users out there, some of the image quality problems can be removed by replacing Indus6.utx, city.utx, and GenFX.utx with the uncompressed versions from the first UT CD. I don't believe any of those files include high-res compressed textures, and if any of you continue to have problems, you can always edit the level that has the image quality problems and swap out the offending texture file (just don't forget that this is all or nothing: if that texture file also includes high-res compressed textures, you'll lose some of those...).
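If it helps, here's a rough sketch of that swap as a little Python script; the CD drive letter and the UT install path below are just placeholders for wherever yours actually live:

```python
# Rough sketch only: the CD drive letter and UT install path are guesses,
# adjust them to your own setup before running.
import shutil
from pathlib import Path

CD_TEXTURES = Path(r"D:\Textures")                   # assumed: first UT CD in drive D:
UT_TEXTURES = Path(r"C:\UnrealTournament\Textures")  # assumed: default UT install path

for name in ("Indus6.utx", "city.utx", "GenFX.utx"):
    installed = UT_TEXTURES / name
    if installed.exists():
        # keep the compressed original around in case you want it back
        shutil.copy2(installed, installed.with_name(name + ".bak"))
    # drop in the uncompressed version from the CD
    shutil.copy2(CD_TEXTURES / name, installed)
    print(f"replaced {installed}")
```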

However, I would like to say that I did notice another problem in that shot. The wood of the 'railing' of the ship looks far, far less clear than it does on my machine. Perhaps your Radeon is only displaying 512x512 textures for some reason?

Here's a comparison shot that *should* bring out the differences:
utworld.jpg


This is from CTF-Face (but you knew that, right? :) ). Please be sure to get the spotted clouds on the top right in the shot, thanks...
 
Doomtrooper said:
CTF face shot...I thought your shots that were posted the wood look like crap :p

Well, that was because I took those shots at low res (800x600, I'm pretty sure).

But, those world shots do look identical, so I guess the problem was probably just my imagination.
 
Heh, I nearly fell off my chair when I saw the title..

I had the KYRO II in my system for over 1 year, and it was great.

However, when I chucked an ATI retail card into the machine... all I can say is WOW.

Heh, I even chuckle now when I play games like MOH:AA, Jedi Knight 2, Unreal, GTA3, SOF2... need I go on?

The KYRO II is owned completely by the Radeon 8500.

This is through a gamer's eyes, and I think a lot of gamers would agree.

Oh, and btw, why not do an R300 vs. KYRO III thread on which has the better IQ? Oh yes, we can do that in a few months when the R300 is released, but looking at the situation with IMGTEC I doubt we'll be able to do that with the KYRO III.
 
I see the original question as being about image quality between a Radeon and a KYRO; where did he ask about performance? Funny how the original poster didn't even mention a specific model, yet the immediate assumption goes to the R200.

Apples to apples would be R100 vs. KYRO II, based on specifications and price. Of course, if it makes sense to compare a 4-pipe/2-TMU 275 MHz card with a card with half its specs and retail price, so be it.

Those who understood the initial question and mindset of Pottsey have already answered his question adequately. He's a smart enough guy to know that the K2 is last year's budget card and the 8500 is, even today, a high-end card.

I have, in the meantime, a card in my rig that's on some occasions up to 3x faster than a K2, yet I don't need to bend issues out of shape... sheesh.
 
Ailuros said:
I see the original question being about image quality between a Radeon and a KYRO, where did he ask about performance?

You cannot separate image quality and performance, period.

The two are intimately intertwined. Higher performance allows you to turn on more image quality features (aniso, FSAA, higher res). Higher framerate is also a type of better image quality.
 
I was only talking about the VGA output signal quality.

Strangely, my Dell box with integrated ATI video has better output than an add-in radeon card.
 
You cannot separate image quality and performance, period.

The two are intimately intertwined. Higher performance allows you to turn on more image quality features (aniso, FSAA, higher res).

Another blanket statement.

Performance and quality relative to another part aren't always linked in the same way; it depends on the implementation, so sometimes you have to think about them independently. For instance, if a low-speed tiler were introduced with no FSAA cost, it might be considered a slow part without FSAA, yet when comparing it to another part with FSAA enabled it may outperform it; likewise, the Radeon 8500 may perform worse than some GF4s without aniso enabled, but enable it and that can be turned around (and then you have the argument of which looks better). So a higher baseline performance doesn't necessarily lead to higher quality; it depends on the implementation.
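To put some purely made-up numbers on it (these are illustrative only, not benchmarks of any real card), here's a quick sketch of how the ranking can flip once FSAA is switched on:

```python
# Purely illustrative numbers, not measurements of any real card.
# Card A: fast baseline, but FSAA halves its framerate.
# Card B: a slower tiler whose FSAA is nearly free.
a_base, a_fsaa_hit = 100.0, 0.50
b_base, b_fsaa_hit = 70.0, 0.05

a_fsaa = a_base * (1 - a_fsaa_hit)   # 50 fps
b_fsaa = b_base * (1 - b_fsaa_hit)   # 66.5 fps

print(f"No FSAA  : A = {a_base:.0f} fps, B = {b_base:.0f} fps  -> A leads")
print(f"With FSAA: A = {a_fsaa:.0f} fps, B = {b_fsaa:.1f} fps  -> B leads")
```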

Higher framerate is also a type of better image quality.

I think it's fairly obvious that this is not what is being asked in the context of this thread.
 
“Those who understood the initial question and mindset of Pottsey have already answered his question adequately. “

Thanks to all of you. Sometimes I wish I could build my own lab and figure this stuff out for myself. But the usual problem of lack of money and time pops up.

The only question that I think got missed was

“I have also been told that the GF2MX looks bad in games when more than 2 texture passes are used, as on the second pass the frame buffer is lowered to 16-bit. Is that true?”
I looked for evidence of this, and it looks true in a few screenshots from a few games, but I cannot find any articles on the subject and I do not want to believe it's true based on just a few screenshots.
 
That can't be true. If you drop to 16 bit at the second pass, you would overwrite an already 32 bpp frame buffer with 16 bit values. That would give weird results, to put it mildly. Unless of course the MX was already working at 16 bpp. I guess it's possible that the driver could force 16 bpp if more than one pass is needed, but that wouldn't really make sense either.
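For anyone curious what that loss actually looks like in numbers, here's a quick sketch (assuming RGB565 as the 16-bit format) that rounds an 8-bit-per-channel colour down to 16-bit and expands it back:

```python
# Quick sketch: quantise 8-bit-per-channel colours to RGB565 (a common 16-bit
# framebuffer format) and expand back, to show the precision a 16-bit write
# would throw away from a 32 bpp buffer.
def through_rgb565(r, g, b):
    r5, g6, b5 = r >> 3, g >> 2, b >> 3      # keep only the top 5/6/5 bits
    return ((r5 << 3) | (r5 >> 2),           # expand back by bit replication
            (g6 << 2) | (g6 >> 4),
            (b5 << 3) | (b5 >> 2))

for colour in [(200, 100, 50), (17, 33, 65), (255, 254, 253)]:
    print(colour, "->", through_rgb565(*colour))
```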
 
Chalnoth said:
Ailuros said:
I see the original question being about image quality between a Radeon and a KYRO, where did he ask about performance?

You cannot separate image quality and performance, period.

The two are intimately intertwined. Higher performance allows you to turn on more image quality features (aniso, FSAA, higher res). Higher framerate is also a type of better image quality.

Apart from Wavey's adequate reply to that one already, I've never digested selective quoting very well. There's more in that post than just the one sentence above, but I really don't have any intention of taking this one any further.
 
Pottsey,

The usual highest number of textures used in recent games should be around 4, or sometimes even 5 (SS/SS2). That number of textures/blending operations isn't enough to suggest a loss of accuracy on cards like the GF2 MX.

I'm not 100% sure, but I think the KYRO drivers limit it to 4 textures per pass in OpenGL vs. 8 per pass in D3D.
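If anyone wants to check what their own driver reports, a quick PyOpenGL sketch like the one below will print the OpenGL per-pass texture unit count (it assumes PyOpenGL and GLUT are installed; D3D would need its own caps query):

```python
# Quick sketch (assumes PyOpenGL + GLUT): create a GL context and ask the
# driver how many texture units it exposes per pass in OpenGL.
import sys
from OpenGL.GL import glGetString, glGetIntegerv, GL_RENDERER, GL_MAX_TEXTURE_UNITS
from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGB

glutInit(sys.argv)
glutInitDisplayMode(GLUT_RGB)
glutCreateWindow(b"texture unit query")   # a current GL context is needed before querying

renderer = glGetString(GL_RENDERER).decode()
units = glGetIntegerv(GL_MAX_TEXTURE_UNITS)
print(f"{renderer}: {units} texture unit(s) per pass")
```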
 
That can't be true. If you drop to 16 bit at the second pass, you would overwrite an already 32 bpp frame buffer with 16 bit values. That would give weird results, to put it mildly. Unless of course the MX was already working at 16 bpp.

Heh, I don't think that's what's being suggested. I think what he's asking is whether the frame buffer on an MX is being defaulted to 16-bit regardless of whether the user selects 32-bit or not; this would be most evident in any multipass operations and alpha-blended ops.

In reality this would actually be a good way of keeping speed up on these low-end parts. I mean, anyone who has one, are they really going to be studying the nuances between a 16-bit framebuffer and 32-bit? Likewise, reviewers couldn't give two shits about a crappy part such as this because they're too busy drooling over the high-end parts, leaving this type of stuff to fly under the radar.
 
iRC said:
Heh, I don't think that's what's being suggested. I think what he's asking is whether the frame buffer on an MX is being defaulted to 16-bit regardless of whether the user selects 32-bit or not; this would be most evident in any multipass operations and alpha-blended ops.

In reality this would actually be a good way of keeping speed up on these low-end parts. I mean, anyone who has one, are they really going to be studying the nuances between a 16-bit framebuffer and 32-bit? Likewise, reviewers couldn't give two shits about a crappy part such as this because they're too busy drooling over the high-end parts, leaving this type of stuff to fly under the radar.

I *seriously* doubt this is the case. Firstly, all multipass ops will definitely drop some color depth. Secondly, my little brother's nForce (he's using integrated video) looks exactly like my old GeForce DDR, and I'm certain that my GF DDR uses a full 32-bit frame buffer.

The only possibility is the use of a 16-bit Z-buffer, though I really doubt this is the case...
 