ms/nv/ati=bad iq for games; ?'s

Status
Not open for further replies.
4870 questions

does increasing the lod bias in ati tray tools purge the shimmering caused by the 4870's filtering optimizations?

its af pattern in af tester is pretty far from perfect. i believe they'd need to match nvidia's hq mode or better to purge the texture aliasing.

was asking because ati's maximum aa is actually acceptable to me, but their hq af has always been too far from perfect for me. with nvidia the aa isn't acceptable to me, but their filtering is near-perfect. so i was willing to suffer a little blurriness from turning up the texture lod bias to get ati's superior aa, as long as it purges the aliasing caused by angle variance.
 
I think older games in general look as good as they ever did.
All I can think of are 3 exceptions: System Shock 2, Thief 2 and Crimson Skies.
I can't think of any more if you're just including DX5 and afterwards.
What examples do you have?
 
Last edited by a moderator:
Where exactly do you see texture shimmering on a 4870?
 
I don't know what that Genesis emulator is doing (can we have some screenshots?), but by the sound of it you're comparing an old console that did everything by plotting little 2D pictures onto a framebuffer with current tech, which consists mainly of rasterizing 3D polygons (triangles). The entire framebuffer then has to be processed (resolved) to get an AA frame on the screen.


Tell me what good trilinear filtering does with polygons that are parallel to the screen? And how the hell is this supposed to help AA?


And why exactly is MSAA currently not 100% compatible? Sure, there are some techniques that currently don't work with it, or require some hacking, but by and large it works. And don't mix in the Voodoo 5 and how well it did and looked back in the day, because the techniques developers used back then were a lot different than what they use now.


So you basically have a problem in some older game and now you're throwing out the "IQ is unacceptable because of the optimizations" line. And even though an ATI driver guy tells you that these optimizations have nothing to do with quality, you now claim that they break compatibility with older games. Do you know that the w-buffer was ditched altogether?


DX10 doesn't have distance fog anyway. Developers have to do this in the shaders. So you are comparing what some developer using OpenGL has done and what some other developer using DX10 has done and drawing conclusions while totally missing the point.

OpenGL isn't magical in removing aliasing either.
Actually, it doesn't make sense. If the developer has concluded that an effect only needs certain precision, why would the driver change that?

Such as? Direct3D and OpenGL are largely the same when it comes to rasterization.
to look better. for example, if a game programmed for 16 bit color and textures was forced by the driver to run in 32 bit color and textures, then the artifacts due to 16 bit color would be gone.

i never said opengl was magical in removing aliasing.

i said opengl games have always lacked distance fog, or had much less of it in general (except the nv distance fog used to reduce rendering load, which can be turned off), and have had much longer draw distances with more precision in the distance. it's safe to conclude that dx has been lacking something opengl always had, due to dx games' excessive fog and much lower precision far away.

msaa is not 100% compatible because not every single game works with it and some require hacks. ati's best aa is 100% compatible.

i thought that the ms ref rasterizer used a certain type of distance fog, and that all dx10 hardware used it and that it was forced.

yes, i do know the w-buffer was ditched altogether. but that doesn't mean the hardware isn't capable of replicating it perfectly, because forcing an fp32 z buffer, inverting the values and disabling distance fog would give you a perfect equivalent to the w-buffer, if the drivers would just allow it.

finally, in dmc 4, the latest pc game i've played, the drivers could've made it so that there was no fog in the distance even though the game called for distance fog. but they didn't. so it didn't look as good as was possible.
 
I think older games in general look as good as they ever did.
All I can think of are 3 exceptions: System Shock 2, Thief 2 and Crimson Skies.
I can't think of any more if you're just including DX5 and afterwards.
What examples do you have?
16 bit color is awful on dx10 hardware, so: every game that runs in 16 bit color.

specific games that i've tried, rayman 2 and pop 3d, look awful on dx10 hw just like ss 2 and thief do.

but, the driver could easily take care of that if nvidia and ati cared.
 
Where exactly do you see texture shimmering on a 4870?

with pretty much every game on an x1k series card, with the highest quality setting on everything, including a.i. disabled. i read in the hd 2k series' analysis that the af was the same, so that means there's texture shimmering if it's the same.
 
to look better. for example, if a game programmed for 16 bit color and textures was forced by the driver to run in 32 bit color and textures, then the artifacts due to 16 bit color would be gone.
And could open a whole new class of bugs as well.
i said opengl games have always lacked distance fog, or had much less of it in general (except the nv distance fog used to reduce rendering load, which can be turned off), and have had much longer draw distances with more precision in the distance. it's safe to conclude that dx has been lacking something opengl always had, due to dx games' excessive fog and much lower precision far away.
It's not safe to conclude that unless you found something in the spec.
msaa is not 100% compatible because not every single game works with it and some require hacks. ati's best aa is 100% compatible.
Your two statements are contradictory to say the least. Developers need to support AA. Relying on the drivers for forcing AA makes for bugs.
yes, i do know the w-buffer was ditched altogether. but that doesn't mean the hardware isn't capable of replicating it perfectly, because forcing an fp32 z buffer, inverting the values and disabling distance fog would give you a perfect equivalent to the w-buffer, if the drivers would just allow it.
It's not equivalent at all! W-buffer is linear, Z is not. Inverted float Z can be good, but it's not a W-buffer.
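To make the non-equivalence concrete, here is a small numeric sketch in Python (the near/far plane values are made up for illustration) comparing the standard hyperbolic Z-buffer mapping with a linear w-buffer:

```python
near, far = 0.1, 1000.0  # assumed clip planes, for illustration only

def z_buffer(d):
    # standard perspective depth: hyperbolic in view-space distance d,
    # maps d = near to 0 and d = far to 1
    return (far / (far - near)) * (1.0 - near / d)

def w_buffer(d):
    # w-buffer: simply linear in view-space distance
    return (d - near) / (far - near)

for d in (1.0, 10.0, 100.0, 1000.0):
    print(f"d={d:7.1f}  z={z_buffer(d):.6f}  w={w_buffer(d):.6f}")
```

By d = 10 (1% of the way to the far plane) the Z value is already above 0.99, while the w value is still below 0.01. Inverting Z and storing it as fp32 redistributes the precision, but it does not make the mapping linear.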

Again, it's up to the apps to support this.
finally, in dmc 4, the latest pc game i've played, the drivers could've made it so that there was no fog in the distance even though the game called for distance fog. but they didn't. so it didn't look as good as was possible.
Now you want drivers to ignore fogging when apps request it? That doesn't make sense.
 
i said opengl games have always lacked distance fog, or had much less of it in general (except the nv distance fog used to reduce rendering load, which can be turned off), and have had much longer draw distances with more precision in the distance. it's safe to conclude that dx has been lacking something opengl always had, due to dx games' excessive fog and much lower precision far away.

No, Direct3D isn't lacking anything here. It's all in the hands of the developers how they handle the depth precision problem.

msaa is not 100% compatible because not every single game works with it and some require hacks. ati's best aa is 100% compatible.

Every AA method can be broken if it is forced behind the game's back.

i thought that the ms ref rasterizer used a certain type of distance fog, and that all dx10 hardware used it and that it was forced.

This is only true if the game uses fixed-function distance fog. But that was defined a long time ago, and current hardware only supports it for compatibility.

yes, i do know the w-buffer was ditched altogether. but that doesn't mean the hardware isn't capable of replicating it perfectly, because forcing an fp32 z buffer, inverting the values and disabling distance fog would give you a perfect equivalent to the w-buffer, if the drivers would just allow it.

finally, in dmc 4, the latest pc game i've played, the drivers could've made it so that there was no fog in the distance even though the game called for distance fog. but they didn't. so it didn't look as good as was possible.

Today's games have their fog inside the shader code. No driver would be able to find those parts in the shader code and remove them. You need to blame the developers, not the hardware manufacturers.
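To illustrate why a driver can't just strip the fog out: classic linear distance fog is nothing but ordinary blend arithmetic. Here is a rough Python sketch of the math (the function name and parameters are my own, not taken from any real shader):

```python
def apply_linear_fog(color, dist, fog_color, fog_start, fog_end):
    # classic linear fog factor: 1.0 at fog_start, 0.0 at fog_end, clamped
    f = max(0.0, min(1.0, (fog_end - dist) / (fog_end - fog_start)))
    # blend the surface color toward the fog color; in a compiled shader
    # this is just generic multiply-add code with nothing marking it as "fog"
    return tuple(f * c + (1.0 - f) * fc for c, fc in zip(color, fog_color))
```

Once this is compiled into a shader, the driver sees a handful of anonymous multiply-adds; there is no reliable way to tell the "fog" instructions apart from any other lighting or blending math.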

16 bit color is awful on dx10 hardware, so: every game that runs in 16 bit color.

specific games that i've tried, rayman 2 and pop 3d, look awful on dx10 hw just like ss 2 and thief do.

but, the driver could easily take care of that if nvidia and ati cared.

I believe you are referring to the missing 16 bit dithering. But this isn't something new with Direct3D 10 hardware. I can't remember the exact chip generation that removed it, but I know for sure that the GeForce 6 series doesn't support dithering anymore.
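For anyone who hasn't seen what that dithering did, here is a tiny ordered-dither sketch in Python (a 4x4 Bayer matrix, reducing an 8-bit channel to the 5 bits of an RGB565 red or blue channel; the threshold scaling is my own simplification):

```python
# 4x4 Bayer threshold matrix, values 0..15
BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def quantize5(value, x, y, dither=True):
    # reduce an 8-bit channel to 5 bits; with dithering, a position-
    # dependent offset (scaled to the 8-level quantization step) makes
    # flat areas alternate between adjacent levels instead of banding
    if dither:
        value = min(255, value + BAYER4[y % 4][x % 4] // 2)
    return value >> 3
```

On a flat region of value 100, the undithered result is 12 everywhere (visible banding at the next gradient step); dithered, it alternates between 12 and 13, which the eye averages back toward the original shade.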

Sure, it would be easy to force 32 bit instead of 16 bit, but as someone who has done this for old games I have to say that this is only the beginning. Many of these old games do very bad things with the frame buffer. One common technique at the time was locking the back buffer and writing text directly into video memory. This fails totally if the buffer is no longer 16 bit. Sometimes I am even surprised that new drivers can still handle this bad programming from those old days.
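A sketch of why that forcing breaks things, with Python standing in for the old lock-and-poke code (the helper function is hypothetical): the game computes byte offsets assuming 2 bytes per pixel, so if the driver silently hands it a 32-bit surface, every write lands on the wrong bytes.

```python
import struct

def poke_pixel_565(buf, pitch, x, y, r8, g8, b8):
    # old-school direct write: pack RGB into a 16-bit 565 value and
    # store it at an offset hard-coded for 2 bytes per pixel
    pixel = ((r8 >> 3) << 11) | ((g8 >> 2) << 5) | (b8 >> 3)
    struct.pack_into('<H', buf, y * pitch + x * 2, pixel)

# a 4-pixel-wide, 1-row "surface" in 16 bit: 8 bytes, pitch 8
surf16 = bytearray(8)
poke_pixel_565(surf16, 8, 3, 0, 255, 0, 0)   # red into the last pixel

# the same surface forced to 32 bit is 16 bytes with pitch 16; the
# game's x * 2 offset now lands in the middle of pixel 1, while the
# intended pixel 3 (bytes 12..15) is never touched
surf32 = bytearray(16)
poke_pixel_565(surf32, 16, 3, 0, 255, 0, 0)
```

The write itself doesn't crash; the corruption only shows up on screen, which is exactly the class of bug that is hard for a driver to paper over.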
 
This has got to be one of the craziest trolls we've had on this board.

The audio comment is especially ludicrous. Uncompressed audio is a waste of space. You can losslessly compress audio and halve the space usage. If you use a lossy codec at a high quality setting, you save even more. Leaving audio uncompressed gains you nothing.
 
The audio comment is especially ludicrous. Uncompressed audio is a waste of space. You can losslessly compress audio and halve the space usage. If you use a lossy codec at a high quality setting, you save even more. Leaving audio uncompressed gains you nothing.
I'm sure he's complaining about games where he's heard audio compression artifacts. It's rather rare, but I've heard it too. Doom3 and UT2003/04 are good examples. The moment you start up Doom3 you hear artifacting in the music at the menu. UT2003/4 have numerous music files as 64kbps OGGs. Actually, the X2 and X3 games have 128kbps MP3s. That's shoddy stuff in this day and age.

KOTOR 2 shipped with a mono MP3 soundtrack encoded with an 11KHz lowpass filter. They actually released a free "music pack" download after players bitched enough. That was insanity. Maybe they were going for the old B&W movie soundtrack feel. LOL.
 
Last edited by a moderator:
Games released in 2008 do both dynamic clip planes (texkill) and distance fog in shaders. Both OpenGL and DirectX allow you to do these things in full 32 bit floating point precision (on all recent hardware).

Z-buffer precision is identical in OpenGL and DirectX. Assuming you create your projection matrix the same way and use the same Z-buffer formats.

Hierarchical Z-buffer, stencil compression and multisample AA color compression do not affect image quality at all. They are only performance optimizations. Because of the better performance, you can have more demanding graphics in games --> better image quality.

Compressed sound is often better quality than uncompressed sound if both take the same space: you can have a much higher sampling rate and bit depth in the compressed version. Also, there are several completely lossless codecs that offer 1:1, bit-by-bit perfect sound quality. Uncompressed sound is just a waste of space and bandwidth.
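The "bit-by-bit perfect" claim is easy to demonstrate. A quick Python sketch using zlib as a stand-in for a real lossless audio codec (FLAC, WMA Lossless, etc.):

```python
import math
import struct
import zlib

# synthesize one second of 16-bit mono PCM: a 440 Hz sine at 44.1 kHz
RATE = 44100
samples = [int(20000 * math.sin(2 * math.pi * 440 * n / RATE))
           for n in range(RATE)]
pcm = struct.pack('<%dh' % RATE, *samples)

packed = zlib.compress(pcm, 9)      # generic lossless compression
restored = zlib.decompress(packed)

assert restored == pcm              # bit-for-bit identical to the input
print(f"{len(pcm)} bytes -> {len(packed)} bytes, round-trip exact")
```

zlib is a generic byte compressor, so its ratio on audio is far worse than a dedicated codec's, but the round trip is still exact. That exactness, not the bitrate, is what "lossless" means.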

It's true that the new DX10 hardware does not support 16 bit dithering anymore. But that's completely irrelevant for games released in 2008, as nobody uses 16 bit color anymore.
 
well thanks for informing about everything.

so i have another question: what purpose does distance fog serve if highly programmable fp32 z buffers exist? wouldn't it have been possible for dmc4 to have had no distance fog at no cost?

i guess it's safe to say that backward compatibility will only be any good if it's done through shaders one day in the far future.

the wma format used in x360 and pc games does not allow for highs or lows as good as redbook did. ever since the redbook standard died when "lossless" audio took over, games have sounded much flatter.

i don't know how 2 channel uncompressed pcm 24/96 wouldn't sound any better than the highest bitrate lossless wma.

i'm sure if a game was encoded in it, then people probably wouldn't be saying it's a waste of space.

also, try to answer my lod question. i know i can't be the only one who notices shimmering in many cases with ati's hq af.
 
the wma format used in x360 and pc games

Show me one game that uses WMA Pro.

i don't know how 2 channel uncompressed pcm 24/96 wouldn't sound any better than the highest bitrate lossless wma.

Because lossless is LOSSLESS. WMA lossless sounds exactly the same as the original PCM counterpart.
 
the voodoo5 worked fine with any of the optimizations i mentioned. and it had the highest image quality for its day.

BAH! Anyone who agrees with this statement also gets a huge BAH! All 3dfx AA did was make the whole scene look washed out and blurred compared to the same scene without AA.
 
BAH! Anyone who agrees with this statement also gets a huge BAH! All 3dfx AA did was make the whole scene look washed out and blurred compared to the same scene without AA.
BAH!!!!!!!!! I disagree, cuz I was using one the other day for a few games and its AA is really quite impressive. I recently played through Jedi Knight partly with the V5 5500 and partly with a GeForce FX 5950. The V5's AA is superior to anything the FX 5950 can put out unless you use RivaTuner and enable the hidden 8X mode (I don't know exactly what that is).

see the mysterious "8X" setting in the attached screenshot (grimaafix002.gif)



The Voodoo 5 may not support AF, but the ground textures were being sharpened by something and were surprisingly low on aliasing.

;)
 
Last edited by a moderator:
Show me one game that uses WMA Pro.



Because lossless is LOSSLESS. WMA lossless sounds exactly the same as the original PCM counterpart.
there are many different formats called "lossless."

the option to rip cd tracks to lossless wav (almost 500 kbps higher than the best wma) wouldn't be in windows xp if wma lossless was as good.

940kbps isn't enough, no matter how "lossless" it may be.

dynamic range in x360 games isn't very good, due to the low bit rate of wma lossless.
 
well thanks for informing about everything.

so i have another question: what purpose does distance fog serve if highly programmable fp32 z buffers exist?

[image: 800px-32919747_ae077d271d_o.jpg]


It's not like fog or haze, artificial or atmospheric, is unheard of in the real world so I don't see why we should be at some point where the hardware will forcibly eject the programmer from his chair if he tries to add fog to a game.


the wma format used in x360 and pc games does not allow for highs or lows as good as redbook did. ever since the redbook standard died when "lossless" audio took over, games have sounded much flatter.

Do you happen to own any Monster brand cables, wooden volume knobs or other types of holistic audio enhancing equipment?
 
there are many different formats called "lossless."

the option to rip cd tracks to lossless wav (almost 500 kbps higher than the best wma) wouldn't be in windows xp if wma lossless was as good.

940kbps isn't enough, no matter how "lossless" it may be.

dynamic range in x360 games isn't very good, due to the low bit rate of wma lossless.
holy lol, I cannot believe this statement. mind is blown, and now I have to think this is some game dev fucking with us.
 