Which card is the king of the hill (NV40 or R420)?

  • NV40 wins
  • They are equally matched

  • Total voters
    415
Status
Not open for further replies.
DemoCoder said:
The question is, how many 2004 games will be playable at hi-res and 6xMSAA?

Enough people complain about the lack of SSAA (unusable in modern games) that I think it is something worth noting, new games or no.

I agree with the point you made regarding the focus on AA. I think we're in danger of replacing FPS with things like AA and AF numbers and images. There are other paths to be trod.
 
I would truly expect 6X MSAA to be very usable at decent resolutions on the X800 series..... and to some it makes a huge difference..... I HATE jaggies..... even itty-bitty ones.....
 
Why is it so hard to understand that determining which one looks better *solely* depends on the monitor and gamma settings you are using, and it can look entirely different for someone else?
On my laptop display and my own desktop gamma settings (which are per-channel adjusted to suit my eyes), I'd give ATI a slight edge. However, with my 3D gamma settings (which are quite high), NVidia's 4x looks better and almost approaches ATI's 6x, because the gamma "correction" actually has an adverse effect in that case.
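A toy numeric sketch of the effect Xmas is describing, assuming an idealized gamma-2.2 display (the function names and the 2.2 figure are mine, purely illustrative): a pixel on a white-on-black edge with 50% coverage only *looks* half as bright as white if the blend happens in linear light before encoding.

```python
# Toy model: values stored in the framebuffer are gamma-encoded, and the
# monitor raises them back by GAMMA when emitting light. GAMMA = 2.2 is
# an assumption, not a measured value from the thread.

GAMMA = 2.2

def to_display(linear):
    """Encode a linear-light value (0..1) for a gamma display."""
    return linear ** (1.0 / GAMMA)

def emitted(stored):
    """Light the monitor actually emits for a stored value (0..1)."""
    return stored ** GAMMA

coverage = 0.5  # a white-on-black edge half covering the pixel

# Naive blend: average the stored values of black (0.0) and white (1.0).
naive = coverage * 1.0 + (1.0 - coverage) * 0.0        # stored 0.5
# Gamma-aware blend: average in linear light, then encode for display.
corrected = to_display(coverage * 1.0 + (1.0 - coverage) * 0.0)

print(emitted(naive))      # ~0.22 of white: the edge reads too dark
print(emitted(corrected))  # 0.5 of white: the edge reads as intended
```

Blended in stored (gamma-encoded) values, the same pixel emits only about 22% of white's light, which is why AA gradients shift so visibly when you drag the gamma slider around.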
 
martrox said:
I would truly expect 6X MSAA to be very usable at decent resolutions on the X800 series..... and to some it makes a huge difference..... I HATE jaggies..... even itty-bitty ones.....

You know, I hardly ever noticed them until I started getting into 3D hardware. Now .. they drive me up the wall. I can barely play a game on the xbox without scowling about it.
 
Well, firstly I suspect 95% of people never touch the gamma sliders, and secondly since there are a lot of reviews across many different screens, and none come down on the side of nVidia, I'd say that in itself is telling regarding how big an effect the screen has.

That's just me though.

Going back to AA, considering how CPU limited we are, I wonder if 6x AA really will be unusable in new games?
 
Xmas said:
Why is it so hard to understand that determining which one looks better *solely* depends on the monitor and gamma settings you are using, and it can look entirely different for someone else?
On my laptop display and my own desktop gamma settings (which are per-channel adjusted to suit my eyes), I'd give ATI a slight edge. However, with my 3D gamma settings (which are quite high), NVidia's 4x looks better and almost approaches ATI's 6x, because the gamma "correction" actually has an adverse effect in that case.

Yep... I just turned my gamma outrageously high, from the default level of 1 to 3 (blinding). Not that that setting is necessary to see what you are talking about. It does indeed affect the shot drastically... Thanks for that.

EDIT: In fact, turning the gamma up by 0.5 is enough to see a huge difference.

EDIT2: It also makes a difference in the 3DM2003 screenshots. On that note, I suggest to everyone who could not see the difference to turn down their gamma settings and take a look at the AA shots to see what the hell I was talking about.

EDIT3: heh, sets gamma back to default levels.. rubs eyes.
 
Sabastian said:
You know, I hardly ever noticed them until I started getting into 3D hardware. Now .. they drive me up the wall. I can barely play a game on the xbox without scowling about it.
How very true. Never train yourself to spot bad things; you will spot them all the time. I've got away with it in 3D mostly; it's just the glaringly obvious (the PS2's shocking texture aliasing, the PS1's jiggly jiggly vertex snapping) that annoys me.

I made the mistake of doing this with MP3s. Spent two days compressing at different bitrates and with different codecs. Result: trained my ear to spot the tiniest MP3 artifact at a hundred yards. Now even good MP3s grate a bit. Don't do it, kids.
 
DemoCoder said:
The question is, how many 2004 games will be playable at hi-res and 6xMSAA? Looks to me like it's only an option on older games.

Define older. CS isn't a good example, but an extreme one to prove a point. I find Far Cry with AA on high settings fairly CPU limited on my system.

Look at these Hexus.net numbers on UT2003 and X2; the R420 has plenty of headroom for 6xAA at 1280x1024.

http://www.hexus.net/content/reviews/review.php?dXJsX3Jldmlld19JRD03NTgmdXJsX3BhZ2U9MTk=
 
Dio said:
Sabastian said:
You know, I hardly ever noticed them until I started getting into 3D hardware. Now .. they drive me up the wall. I can barely play a game on the xbox without scowling about it.
How very true. Never train yourself to spot bad things; you will spot them all the time. I've got away with it in 3D mostly; it's just the glaringly obvious (the PS2's shocking texture aliasing, the PS1's jiggly jiggly vertex snapping) that annoys me.

I made the mistake of doing this with MP3s. Spent two days compressing at different bitrates and with different codecs. Result: trained my ear to spot the tiniest MP3 artifact at a hundred yards. Now even good MP3s grate a bit. Don't do it, kids.

Too late, Dio. Many of us are at a point of no return. And frankly, it's in the best interest of any IHV that sells PC graphics solutions. Once a user is used to antialiased and filtered scenery at resolution X, it's only a matter of time before newer, more demanding games come out and you suddenly have to tune down to X-1 or X-2, and that's exactly the point where you storm into the next best hardware store waving plastic before you even enter the door.... :LOL:


****edit: there are certain cases where the human eye can in fact see aliasing in real time........ you don't want to know what the first spontaneous thought is until I snap back into reality hehehe...
 
Ailuros said:
Too late, Dio. Many of us are at a point of no return.

I agree. I keep telling myself I have no need for a 6800 or X800; I don't know how long I can hold out..
 
I'll wait another 6 months or so, until prices become more reasonable and all the dust from the different releases has somewhat settled.
 
Ailuros said:
I'll wait another 6 months or so, until prices become more reasonable and all the dust from the different releases has somewhat settled.

That's my intention. Oh, and after the refreshes would be good too.
 
Quitch said:
Well, firstly I suspect 95% of people never touch the gamma sliders, and secondly since there are a lot of reviews across many different screens, and none come down on the side of nVidia, I'd say that in itself is telling regarding how big an effect the screen has.

That's just me though.

While it might be true that most don't even touch the gamma slider, that is not the case for me. I don't even like having the +0.5 difference from the default level; it is just too bright. I can see why Xmas would have his turned up, considering he has to deal with an LCD monitor.

EDIT: I wonder if NV cards have a slightly higher default gamma level?
 
On most LCD monitors, gamma will vary greatly depending on where you look on the screen and also on the viewing angle, so the AA will look great at the top and not so great at the bottom.

Plus, on LCDs you get flickering colors due to the emulation of some colors with temporal dithering.

On the other hand, on an LCD you could play with some variation of ClearType if it were implemented in 3D engines.
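A rough sketch of the temporal dithering (frame rate control) trick mentioned above, assuming a panel with 64 native levels per channel; the function and frame count are illustrative, not how any particular panel actually does it.

```python
# Sketch: a panel with limited native levels fakes an in-between shade by
# flickering between the two nearest levels it can show, so the average
# over many frames approximates the target. Rapid flicker like this is
# what can become visible as shimmering color on some LCDs.

def temporal_dither(target, levels=64, frames=60):
    """Emit `frames` panel levels whose average approximates `target` (0..1)."""
    step = 1.0 / (levels - 1)
    lo = int(target / step) * step       # nearest displayable level below
    hi = min(lo + step, 1.0)             # ...and the one just above
    frac = 0.0 if hi == lo else (target - lo) / (hi - lo)
    out, err = [], 0.0
    for _ in range(frames):              # simple error diffusion over time
        err += frac
        if err >= 1.0:
            out.append(hi)
            err -= 1.0
        else:
            out.append(lo)
    return out

seq = temporal_dither(0.5)               # a mid-gray the panel lacks
print(len(set(seq)))                     # flickers between 2 nearby levels
print(sum(seq) / len(seq))               # averages out very close to 0.5
```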
 
LeGreg said:
On the other hand, on an LCD you could play with some variation of ClearType if it were implemented in 3D engines.
Impossible to do in a 3D engine, and hardly possible in hardware.


Actually, I have gamma turned down on the desktop, but up in (most) games. Too many dark corners ;)
 
Xmas said:
Impossible to do in a 3D engine

I have some questions about your mipband images:
What do the weird colors mean? (Does it have something to do with LOD level selection or not?)
 
ClearType techniques for 3D are not impossible to do.

Anyway, about gamma: many LCDs get adjusted automatically by install disks, which copy ICC profiles for the monitor and tweak the gamma.
 
DemoCoder said:
ClearType techniques for 3D are not impossible to do.
I didn't say that. But you either need hardware support or do the rendering in software. And there are several issues with blending, depth buffer, multisampling, etc. that you cannot trivially fix. ClearType is great, but I don't think it has a place in 3D graphics.
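For readers unfamiliar with the idea being debated, here is a toy sketch of the ClearType-style subpixel trick applied to edge coverage: render at 3x horizontal resolution and map the three sub-samples onto the R, G, and B stripes of one LCD pixel. The function is hypothetical, and it deliberately ignores the blending, depth buffer, and multisampling problems raised above.

```python
# Toy subpixel packing: each LCD pixel is three vertical stripes (R, G, B),
# so three horizontal coverage samples can steer one pixel's channels
# independently, tripling effective horizontal edge resolution for
# white-on-black content. Color fringing handling is omitted entirely.

def subpixel_row(coverage_3x):
    """Pack a row of 3x-horizontal-resolution coverage (0..1) into RGB pixels."""
    pixels = []
    for i in range(0, len(coverage_3x) - 2, 3):
        r, g, b = coverage_3x[i:i + 3]   # one sample per colored stripe
        pixels.append((r, g, b))
    return pixels

# A vertical white-on-black edge falling two thirds of the way into a pixel:
print(subpixel_row([1.0, 1.0, 0.0]))     # [(1.0, 1.0, 0.0)]
```

Only the blue stripe goes dark, which is finer positioning than whole-pixel AA can express; it also hints at why naive use causes color fringes.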
 
LeGreg said:
I have some questions about your mipband images:
what do the weird colors mean (it has something to do with lod level selection or not ?)
OT: each mip level has a different color, so you can derive from the color of a pixel which mip level (or, with trilinear, which blend of two mip levels) is used for texturing that pixel.
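A minimal sketch of that debug-coloring technique, with an arbitrary color table and helper name of my own; a real implementation would upload these as the texture's actual mip levels.

```python
# Build a mip chain where every texel of level n is one flat debug color,
# so the rendered color of a surface reveals which mip level (or blend of
# two, under trilinear filtering) the sampler chose for each pixel.

MIP_COLORS = [
    (255, 0, 0),    # level 0: red
    (0, 255, 0),    # level 1: green
    (0, 0, 255),    # level 2: blue
    (255, 255, 0),  # level 3: yellow
]

def make_colored_mip_chain(base_size):
    """Return a list of square levels (base_size down to 1x1), each one flat color."""
    chain, size, level = [], base_size, 0
    while size >= 1:
        color = MIP_COLORS[level % len(MIP_COLORS)]
        chain.append([[color] * size for _ in range(size)])
        size //= 2
        level += 1
    return chain

chain = make_colored_mip_chain(8)   # levels: 8x8, 4x4, 2x2, 1x1
print(len(chain))                    # 4 levels
print(chain[1][0][0])                # (0, 255, 0): level 1 is green
```

A pixel that comes out, say, cyan would then indicate a trilinear blend between the green and blue levels.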
 