X800 texture shimmering: FarCry video

jvd said:
1080i, I believe.

Mine is far below that and still looks way better than any game at 1600*1200 :)

Look at it this way. Doom 3 is capable of running on a GeForce 2 at 640x480 at 15fps or so. How many people do you know who would have bought this game, back when the GeForce 2 was top of the line, if they were going to have problems running it at 640x480?

Or how about when the GeForce 3 first came out? How many people would have been happy playing it at 800x600?

I've tried the beta, and at 800*600 it looks way better than Quake 3 at 1600*1200.

Fact is that many people do want higher res, since higher res = better image quality.

Of course, everybody wants higher res. But Quake 3 graphics at 6000*5000 wouldn't make anybody happy either.
 
Mine is far below that and still looks way better than any game at 1600*1200

Yes, but does your NTSC 640x480 interlaced look better than my 1080i? I think not.

Of course images of real people will look better than CGI. That's like saying The Terminal looks more realistic than Shrek.

But when comparing Quake 3 to Quake 3, it will look a hell of a lot better at 1600x1200 than at 640x480.

I don't know why we are arguing that point.

And as time goes on, resolutions get higher, and once a person is used to them it's even harder to go back. You can only go higher. No one wants a new generation of games to come out and have to go from 1024x768 and higher with high FSAA back to 640x480. No one. Unless their monitor can only do 640x480.



I've tried the beta, and at 800*600 it looks way better than Quake 3 at 1600*1200.

That may be the case. But will Doom 3 at 800x600 look better than Far Cry at 1600x1200?



Of course, everybody wants higher res. But Quake 3 graphics at 6000*5000 wouldn't make anybody happy either.

I'd be happy just having a monitor capable of that res. But even so, I'd rather have Quake 3 at 6000x5000 than reduced to 640x480. Then, after I got used to 6000x5000, I don't think I'd be happy going back to 1600x1200.
 
I have a question for you, jvd, just out of sheer curiosity. If you had to turn down effects to maintain a higher resolution, would you opt for the higher resolution with AA/AF and lower settings, or for a lower resolution with AA/AF and maximum effects?
 
ChrisRay said:
I have a question for you, jvd, just out of sheer curiosity. If you had to turn down effects to maintain a higher resolution, would you opt for the higher resolution with AA/AF and lower settings, or for a lower resolution with AA/AF and maximum effects?

First I would turn down AA. Once AA is gone, I won't turn anything else down. I will simply not play the game and upgrade my video card to something that can play it the way I want, or I would wait for however long that would take.
 
BRiT said:
radar1200gs said:
BRiT said:
Sandwich said:
radar1200gs said:
You can of course back your claims up with proof?

We have the shader replacements in 3DMark.

And Doom 3...

3DMark03 is irrelevant.

Doom 3 is not a DX9 title.

Why the requirement of being a DX9 title when you asked for proof about Nvidia's shader replacements? Why is 3DMark03 irrelevant?

Because you can't play 3DMark03.

Because OpenGL (and Doom 3 itself) doesn't prohibit the use of FP16 or lower, and Doom 3's shaders are written by Carmack, not nVidia.
 
jvd said:
radar1200gs said:
I'm not talking about the vertex units, I'm talking about the pixel shaders, and they either up-convert lesser values to FP24 or truncate FP32 to FP24 before operating on the data.

Search the forums. One of the ATi guys explained exactly how it works a while back.
What does it matter how it's done? FP24 is full precision. It doesn't matter if they do it internally at 128-bit and then downsample to FP24, as long as it's FP24, which is full precision in DX9.

It matters because R3xx truncates the value DX/the app hands it. If that's not forced partial precision, I'd love to know what is.
 
radar1200gs said:
jvd said:
radar1200gs said:
I'm not talking about the vertex units, I'm talking about the pixel shaders, and they either up-convert lesser values to FP24 or truncate FP32 to FP24 before operating on the data.

Search the forums. One of the ATi guys explained exactly how it works a while back.
What does it matter how it's done? FP24 is full precision. It doesn't matter if they do it internally at 128-bit and then downsample to FP24, as long as it's FP24, which is full precision in DX9.

It matters because R3xx truncates the value DX/the app hands it. If that's not forced partial precision, I'd love to know what is.

Want to know what forced partial precision is, radar?

Well, first you need to know what partial precision is. That would be FP16.

Then you have to look at the one company capable of FP16: Nvidia.

Then you have to look at 3DMark03 and you find out what forced partial precision is. Or Far Cry. Or Half-Life 2.

There you go, radar.
 
You cannot change formats at will unless you have explicit permission to do so, regardless of whether the format you are changing to is full or partial precision, supported or unsupported.

You are required to work on the data presented to you by the system, not change it as you see fit.
 
If the spec states that full precision is a minimum of FP32, then ATI is using partial precision. If the spec states that full precision is a minimum of FP24, then ATI is using full precision. In the case of the latter, if FP32 data is being truncated to FP24, it is being truncated to what is still full precision.
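As a rough aside on what "truncating FP32 to FP24" actually costs, here is a small Python sketch (purely illustrative, not anything from a driver or the DX spec) that drops mantissa bits the way the three formats are usually described: 23 mantissa bits for FP32, 16 for ATI's FP24, 10 for FP16.

```python
import math

# Purely illustrative: approximate the effect of storing a value with fewer
# mantissa bits. Assumed mantissa widths: FP32 = 23, FP24 (R3xx) = 16, FP16 = 10.
def truncate_mantissa(value: float, mantissa_bits: int) -> float:
    """Keep roughly the top `mantissa_bits` of the mantissa, drop the rest."""
    if value == 0.0 or not math.isfinite(value):
        return value
    mantissa, exponent = math.frexp(value)      # value = mantissa * 2**exponent
    scale = 2 ** mantissa_bits
    return math.ldexp(math.trunc(mantissa * scale) / scale, exponent)

x = 0.123456789
for name, bits in [("FP32", 23), ("FP24", 16), ("FP16", 10)]:
    q = truncate_mantissa(x, bits)
    print(f"{name}: {q:.9f}  (relative error ~ {abs(q - x) / x:.1e})")
```

The spec argument above is only about where the "full precision" line is drawn; the sketch just shows that FP24 carries roughly two more decimal digits of precision than FP16 and about two fewer than FP32.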
 
radar1200gs said:
BRiT said:
radar1200gs said:
3DMark03 is irrelevant.

Doom 3 is not a DX9 title.

Why the requirement of being a DX9 title when you asked for proof about Nvidia's shader replacements? Why is 3DMark03 irrelevant?

Because you can't play 3DMark03.

What a joke! It's pointless arguing any further with this guy. He's not willing to accept anything.
 
Bjorn said:
jvd said:
1080i, I believe.

Mine is far below that and still looks way better than any game at 1600*1200 :)

The images on the telly don't suffer from aliasing.

You could have very complex 3D scenes on your computer, but they're not going to look as smooth as real images. Now, I've never had a 6800 to play with 8x AA, but I'm sure the kind of anti-aliasing you'd need to match TV quality is far beyond the capabilities of any 3D card.
I suppose you wouldn't quite need AA to match the grain of the film used, but I'd guess something like 16x SSAA at 1024x768 minimum.
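To put a back-of-the-envelope number on that guess (assuming plain ordered-grid supersampling, where every pixel is shaded and stored N times, this is just a rough sketch):

```python
# Rough cost of the "16x SSAA at 1024x768" guess, assuming ordered-grid
# supersampling where every pixel is shaded/stored N times.
def samples_per_frame(width: int, height: int, ssaa_factor: int = 1) -> int:
    return width * height * ssaa_factor

plain_1600 = samples_per_frame(1600, 1200)        # 1600x1200, no AA
ssaa_16x   = samples_per_frame(1024, 768, 16)     # 1024x768 with 16x SSAA

print(f"1600x1200, no AA  : {plain_1600:,} samples")
print(f"1024x768, 16x SSAA: {ssaa_16x:,} samples (~{ssaa_16x / plain_1600:.1f}x the work)")
```

That works out to roughly 12.6 million samples per frame versus 1.9 million for plain 1600x1200, which is why that level of supersampling was out of reach for the cards being discussed.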
 
Sandwich said:
Bjorn said:
jvd said:
1080i, I believe.

Mine is far below that and still looks way better than any game at 1600*1200 :)

The images on the telly don't suffer from aliasing.

You could have very complex 3D scenes on your computer, but they're not going to look as smooth as real images. Now, I've never had a 6800 to play with 8x AA, but I'm sure the kind of anti-aliasing you'd need to match TV quality is far beyond the capabilities of any 3D card.
I suppose you wouldn't quite need AA to match the grain of the film used, but I'd guess something like 16x SSAA at 1024x768 minimum.

A TV image is interlaced, which blurs the image a lot. But if you plug in a game you can see just how ugly the image is. Put a PS2 in there and be ready to throw up. It's like leaping back six years or so in time.
 
Sandwich said:
The images on the telly don't suffer from aliasing.

You could have very complex 3D scenes on your computer, but they're not going to look as smooth as real images. Now, I've never had a 6800 to play with 8x AA, but I'm sure the kind of anti-aliasing you'd need to match TV quality is far beyond the capabilities of any 3D card.
I suppose you wouldn't quite need AA to match the grain of the film used, but I'd guess something like 16x SSAA at 1024x768 minimum.


You obviously never hooked up a console to a computer monitor :)
 
jvd said:
But when comparing Quake 3 to Quake 3, it will look a hell of a lot better at 1600x1200 than at 640x480.

I don't know why we are arguing that point.

Because you seem to be implying that higher res, better FSAA and so forth are the only things that determine IQ. Of course higher res is going to look better if we're talking about the same game or the same feature set. And I'm saying that I'll gladly sacrifice higher res for, e.g., more realistic lighting and shadowing, instead of hearing things like "feature X for card Y is unusable, it only gets 22 fps at 1600*1200", which I think is a bit ridiculous.
 
So what was the conclusion of this? Is only the X800 showing artifacts, or are they there in the R3xx series too? What about the NV40 (running the DX9 path)?

Has this been submitted to ATI yet?
 
weeds said:
Well, congratulations jvd, you have successfully derailed this thread to be about Nvidia, as you've tried to do with every thread about ATI filtering...
If you want to talk about DX9 compliance, start a new thread. If you want to talk about Nvidia AA, there's already a thread for that as well.

HAHA, sorry, that was a good line. JVD is very fanatical and has to defend ATI to the death and disparage Nvidia, so don't fret too much about it. Just pretend he isn't posting and continue with whatever discussion you were having. He will never concede anything, so arguing or cajoling him is pointless.

edit:
Sorry if that sounds like an attack; it doesn't in any way mean he is a bad person or anything, just that arguing about it with him is unproductive. It is like trying to convince many of our fellows in the political forum of different ideas: it will never happen. If you guys enjoy endless debate, though, by all means continue.
 
radar1200gs said:
It matters because R3xx truncates the value DX/the app hands it. If that's not forced partial precision, I'd love to know what is.

DX precisions are just a command to tell the pipeline what format to process in (no command = high, PP = low); there is no "truncation" occurring. What you also seem to be forgetting is that with ATI hardware, all those apps that have PP hints will be processed with greater precision.
 
kihon said:
So what was the conclusion of this? Is only the X800 showing artifacts, or are they there in the 3xx series too? What about the NV40 (running DX9 path)?

Has this been submitted to ATI yet?

AFAIK ATI have looked at Far Cry and this appears to be an application-specific issue. As a result of tEd's other thread, they have identified an issue with Max Payne, and I believe it is being addressed.
 
DaveBaumann said:
radar1200gs said:
It matters because R3xx truncates the value DX/the app hands it. If that's not forced partial precision, I'd love to know what is.

DX precisions are just a command to tell the pipeline what format to process in (no command = high, PP = low); there is no "truncation" occurring. What you also seem to be forgetting is that with ATI hardware, all those apps that have PP hints will be processed with greater precision.

I agree that for _PP and PS1.4 and lower, the translation to FP24 will not hurt - you can't hurt quality by increasing precision, only by decreasing it.

However, in the case of a DX9 pixel shader without _PP hinting, the data enters the chip in FP32 format and is truncated in hardware by the GPU to FP24.
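As a side note, the behaviour these last few posts describe can be summed up in a tiny sketch. This is a simplified model only, assuming (as stated in the thread, not quoted from any spec) that NV3x-style hardware offers FP16 and FP32 register formats while R3xx-style hardware computes everything at FP24:

```python
# Simplified model of how a shader's requested precision maps onto what the
# hardware actually computes in, per the behaviour described above.
def effective_precision(requested_bits: int, hw_formats: list[int]) -> int:
    """Use the smallest hardware format that satisfies the request; if none
    is wide enough, fall back to the widest available (i.e. truncate)."""
    wide_enough = [f for f in hw_formats if f >= requested_bits]
    return min(wide_enough) if wide_enough else max(hw_formats)

NV3X_FORMATS = [16, 32]   # assumed: FP16 and FP32
R3XX_FORMATS = [24]       # assumed: FP24 only

# No _PP hint = full precision requested (FP32); _PP hint = FP16 acceptable.
for label, requested in [("_PP hint (FP16 ok)", 16), ("no hint (FP32)", 32)]:
    nv = effective_precision(requested, NV3X_FORMATS)
    r3 = effective_precision(requested, R3XX_FORMATS)
    print(f"{label:20s} -> NV3x: FP{nv}, R3xx: FP{r3}")
```

In this model, the _PP case on R3xx is the harmless up-conversion conceded above, and the no-hint case is the FP32-to-FP24 step being called forced partial precision.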
 