Another site reviewing with 3DMark and 53.03

Long time follower, first time poster... :)

Regarding the Far Cry benchmarks...

I'm doing a review of a PowerColor 9600XT, comparing it with an FX5700U and also a Ti4200 (128MB version), and the 9600XT is a lot faster than the FX5700U... I'm comparing the low frame rates using FRAPS, and the 9600XT is about 40% faster than the FX5700U in a manual run. Benchmarks at 1024x768, highest detail... at medium detail the gap between the two cards is smaller.
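For what it's worth, the "40% faster" figure is just the relative difference between the two minimum-FPS readings. A tiny sketch of that arithmetic (the helper name and the sample numbers are made up for illustration, not from the post):

```python
def pct_faster(fps_a: float, fps_b: float) -> float:
    """Percentage by which FPS reading A beats reading B."""
    return (fps_a - fps_b) / fps_b * 100.0

# Hypothetical minimum-FPS readings from two manual FRAPS runs:
print(f"{pct_faster(28.0, 20.0):.0f}% faster")  # -> 40% faster
```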

I tried to force PS2.0 on the FX and it gave the same results... I also tried forcing only PS1.1 on the 9600XT and got the same result.

But there must be some reason the engine forces only PS1.1 on the FX... of course it could be the performance penalty of using PS2.0, but maybe this demo doesn't need PS2.0 at all...

Using PS2.0 doesn't give any image quality enhancement or reduce the number of passes...

About Halo...

I'm also using a manual run with FRAPS, and the 9600XT is way faster... interestingly, forcing only PS1.1 on the FX doesn't give it a large boost.

Well, sorry for the grammar; my mother language is Portuguese and I have some difficulty handling English.

Thanks
 
Veridian3 said:
OK,

I've spent this evening testing Far Cry again...

From a framerate point of view here are the findings:

Review Scores:
Avg: 29.640 - Min: 1 - Max: 50

With dev mode/Pixel Shader 2.0 command:
Avg: 28.933 - Min: 1 - Max: 49

To me those scores are within the margin of error. I have taken some screenshots, and to the naked eye the quality appears to be the same (to me, a family member, and a friend who was over). When I have time I'll upload them and post the links here or add them to DH's review; however, it's not too high a priority for me as there is really nothing to see.


Thanks for taking the time. However, that does not jibe with what the folks in the NV thread saw (and I would tend to have more confidence when more than one person reports the same thing). Not saying your results were wrong, just that these compared to the others are inconsistent. Still, you took the time to look at them and I thank you for that. Did you try the same spots that they showed in the NV thread (I have not seen the images yet)? Thanks again!

Brent, find anything on this yet?
 
I've been looking around the net on this issue and also playing with the game a little more when I had time. I'm wondering now if it's lighting settings/detail rather than pixel shaders that's causing the IQ differences. The demo seems to have trouble maintaining the user's custom settings in the menu, and I'm thinking this may account for the lack of comparable IQ... rather than the pixel shader hack. (Less lighting = looks like lower reflections/pixel shaders???)

What I really need to do is spend time in the ini with all the tweaks from Brent's link and set up a manual quality setting for the game, which will hopefully stick better than the menu settings. I'm a bit busy over the next week on other stuff... but I'll try to find time.
 
Veridian3 said:
What I really need to do is spend time in the ini with all the tweaks from Brent's link and set up a manual quality setting for the game, which will hopefully stick better than the menu settings.
Once you get the ini the way you like it, change its attribute to "read only"... it's helped me convince a few stubborn programs to quit playing with my settings. ;)
 
Thanks for the post, crusher_pt. Your English was clear. Maybe a few too many ...'s for my taste, but otherwise very understandable. :)
 
StealthHawk said:
The Baron said:
GLSL support in ATI drivers is somewhat odd... GLSL in the NVIDIA developer drivers is most odd. I don't think it's really surprising.

Can you elaborate on what "odd" things happen?
According to Humus, in ATI, some GLSL functions/calls simply don't behave as expected (i.e., how they should according to the spec). I've also heard that NV GLSL support is blatantly experimental (the same type of problems as ATI, only much worse).
 
The Baron said:
StealthHawk said:
The Baron said:
GLSL support in ATI drivers is somewhat odd... GLSL in the NVIDIA developer drivers is most odd. I don't think it's really surprising.

Can you elaborate on what "odd" things happen?
According to Humus, in ATI, some GLSL functions/calls simply don't behave as expected (i.e., how they should according to the spec). I've also heard that NV GLSL support is blatantly experimental (the same type of problems as ATI, only much worse).

Yeah, I remember reading that. But does it behave unexpectedly in the same way every time you do "X"? I thought we were talking about repeatability.

Regarding the NV35 on pre-44.03 drivers: sometimes a certain test would run just fine, with no visual problems and reproducible results. Sometimes it would have visual glitches (and the score would diverge from the runs without glitches). This was the type of odd behavior to which I was referring.
 
Kkevin666, it's gone. I posted another thread and will see how long it lasts.

Edit
How sad it is to argue with a person that believes in NV PR. :oops: :rolleyes: :cry:
Someone posting in the thread there is named Allanpan.
 
crusher_pt said:
Long time follower, first time poster... :)

Regarding the Far Cry benchmarks...

I'm doing a review of a PowerColor 9600XT, comparing it with an FX5700U and also a Ti4200 (128MB version), and the 9600XT is a lot faster than the FX5700U... I'm comparing the low frame rates using FRAPS, and the 9600XT is about 40% faster than the FX5700U in a manual run. Benchmarks at 1024x768, highest detail... at medium detail the gap between the two cards is smaller.

I tried to force PS2.0 on the FX and it gave the same results... I also tried forcing only PS1.1 on the 9600XT and got the same result.

But there must be some reason the engine forces only PS1.1 on the FX... of course it could be the performance penalty of using PS2.0, but maybe this demo doesn't need PS2.0 at all...

Using PS2.0 doesn't give any image quality enhancement or reduce the number of passes...

I have a TON of pics with a 9800XT and a 5900NU at 5950 speeds with the Far Cry beta 2. Twelve with each card, in as close to the same spot as I could get, and FRAPS running. Same system, same game resolution, and the same game settings. The results are pretty surprising, although not to some, I guess.

I'm just not sure if I can post them here. :?:
 
I took a lot of pics too; they'll be in an upcoming review.

I have confirmed the Far Cry demo is running at PS 1.1 on GeForce FX cards.

Check this out from the Far Cry log file:

Driver description: NVIDIA GeForce FX 5950 Ultra
Full stats: HAL (pure hw vp): NVIDIA GeForce FX 5950 Ultra
Hardware acceleration: Yes
Full screen AA: Disabled
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1792x1344
Maximum Texture size: 4096x4096 (Max Aspect: 4096)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for NV3x GPUs
Pixel shaders usage: Replace PS.2.0 to PS.1.1
Shadow maps type: Mixed Depth/2D maps

Check out the second line from the bottom there; get a load of them apples.

It'll all be in the review.

Oh, and this is using a new driver, 56.56.
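If you want to sanity-check a whole batch of these logs rather than eyeball them, that telltale line is easy to pull out with a script. A minimal sketch (the helper is my own, not an official tool), fed the two relevant lines from the log above:

```python
from typing import Optional

def pixel_shader_usage(log_text: str) -> Optional[str]:
    """Return the value of the 'Pixel shaders usage' line, if present."""
    for line in log_text.splitlines():
        if line.strip().startswith("Pixel shaders usage:"):
            return line.split(":", 1)[1].strip()
    return None

nv_log = "Pixel Shaders version 2.0\nPixel shaders usage: Replace PS.2.0 to PS.1.1\n"
print(pixel_shader_usage(nv_log))  # -> Replace PS.2.0 to PS.1.1
```

The same helper run on the ATI log below would return "PS.2.0 and PS.1.1", making the difference between the two cards obvious at a glance.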
 
fallguy said:
crusher_pt said:
Long time follower, first time poster... :)

Regarding the Far Cry benchmarks...

I'm doing a review of a PowerColor 9600XT, comparing it with an FX5700U and also a Ti4200 (128MB version), and the 9600XT is a lot faster than the FX5700U... I'm comparing the low frame rates using FRAPS, and the 9600XT is about 40% faster than the FX5700U in a manual run. Benchmarks at 1024x768, highest detail... at medium detail the gap between the two cards is smaller.

I tried to force PS2.0 on the FX and it gave the same results... I also tried forcing only PS1.1 on the 9600XT and got the same result.

But there must be some reason the engine forces only PS1.1 on the FX... of course it could be the performance penalty of using PS2.0, but maybe this demo doesn't need PS2.0 at all...

Using PS2.0 doesn't give any image quality enhancement or reduce the number of passes...

I have a TON of pics with a 9800XT and a 5900NU at 5950 speeds with the Far Cry beta 2. Twelve with each card, in as close to the same spot as I could get, and FRAPS running. Same system, same game resolution, and the same game settings. The results are pretty surprising, although not to some, I guess.

I'm just not sure if I can post them here. :?:

You can easily make it so you can take screenshots in the same spot each time.

Go to where you want to take the screenshot, bring down the console in the game with the ~ key, and type \save_game xxx, where xxx is the name of your saved game.

Then, any time you want to go back to that spot to take another screenshot, just bring down the console and type \load_game xxx.

It'll plop you right back in that exact spot, and you can make as many saved games as you want.

I've taken 2 water shots, 2 land shots, and 5 shader shots to put in my review for comparison: ABIT 9800XT (Cat 4.2) vs. 5950U (ForceWare 56.56).
 
{Sniping}Waste said:
Nice work Brent! Do you have the same info on the ATI cards?

Just ran Far Cry on my personal system here with a 9800 Pro and Cat 4.2 drivers:

Driver description: RADEON 9800 PRO
Full stats: HAL (pure hw vp): RADEON 9800 PRO
Hardware acceleration: Yes
Full screen AA: Enabled (2 samples)
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1792x1344
Maximum Texture size: 2048x2048 (Max Aspect: 2048)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for ATI R300 GPU
Pixel shaders usage: PS.2.0 and PS.1.1
Shadow maps type: Mixed Depth/2D maps

I have 2xAA enabled from the ATI control panel. But in Far Cry, AA does not work on R3xx cards, just FYI.

I will have all this information in an upcoming review. I've got screenshots of this information from in-game via the console.
 
You can get AA to work on an R3xx card in Far Cry; I'll poke around and see if I can find the link.

I liked the game a LOT better with AA. If I recall, it was forcing AA through the control panel and setting AF to application preference.
 
digitalwanderer said:
You can get AA to work on an R3xx card in Far Cry; I'll poke around and see if I can find the link.

I liked the game a LOT better with AA. If I recall, it was forcing AA through the control panel and setting AF to application preference.

I tried many combinations of forcing AA, and I got it to work by forcing AA from the config file in Far Cry, but the graphics are horribly corrupt with AA enabled and the game is not playable because of the corruption.

I've tried many combos: through the control panel, through the in-game settings, and through the config file in the game's directory.

All with AF disabled ("application preference").

All with the same conclusion: basically, AA no worky.

Post the link here if you find it, and I'll give it another shot.

BTW, using Cat 4.2.
 