The review was done with the FarCry demo.

{Sniping}Waste said:
http://www.digit-life.com/articles2/gffx/gffx-40.html

If I remember right, all Nvidia cards convert PS2 to PS1.1 in this demo.
So claiming that PS2 is being done with an NV3X is not true. :oops:
Either you're remembering it right or I'm misremembering it too; I'll go see if I can hunt up a thread about it.

EDITED BITS: Gads, they didn't even know how to enable AA & AF in the demo! :(

Digit-life said:
Unfortunately, the AA and anisotropy modes don't work correctly in this DEMO, and they were disabled. We are waiting for a patch for the DEMO or the release of the game itself.
 
The FarCry demo does indeed run at PS 1.1 by default on GFFX cards, and at PS 2.0 by default on R3xx cards. And even with that, R3xx cards still perform better.
 
{Sniping}Waste said:
http://www.digit-life.com/articles2/gffx/gffx-40.html

If I remember right, all Nvidia cards convert PS2 to PS1.1 in this demo.
So claiming that PS2 is being done with an NV3X is not true. :oops:
It's the game that is using PS 2.0 or PS 1.1 depending on what card is installed.
 
OpenGL guy said:
{Sniping}Waste said:
http://www.digit-life.com/articles2/gffx/gffx-40.html

If I remember right, all Nvidia cards convert PS2 to PS1.1 in this demo.
So claiming that PS2 is being done with an NV3X is not true. :oops:
It's the game that is using PS 2.0 or PS 1.1 depending on what card is installed.

"Or"? That means never mixed, right?
 
OpenGL guy said:
{Sniping}Waste said:
http://www.digit-life.com/articles2/gffx/gffx-40.html

If I remember right, all Nvidia cards convert PS2 to PS1.1 in this demo.
So claiming that PS2 is being done with an NV3X is not true. :oops:
It's the game that is using PS 2.0 or PS 1.1 depending on what card is installed.
Maybe he assumed it was due to the unified compiler? ;)
 
T2k said:
"Or"? That means never mixed, right?
I would assume there's certainly going to be SOME mixing... and that the game defaults to the higher-end quality level it feels certain cards can handle.
 
But it all boils down to the final output, and on that score one card is rendering a higher level of image quality than the other by default; there are clear differences between the two cards in this game.
 
T2k said:
OpenGL guy said:
{Sniping}Waste said:
http://www.digit-life.com/articles2/gffx/gffx-40.html

If I remember right, all Nvidia cards convert PS2 to PS1.1 in this demo.
So claiming that PS2 is being done with an NV3X is not true. :oops:
It's the game that is using PS 2.0 or PS 1.1 depending on what card is installed.
"Or"? That means never mixed, right?
It could be mixed for this game; I have no idea. However, it's been shown that there are apparent rendering differences between the modes, and that's the main point.
 
OpenGL guy said:
It could be mixed for this game; I have no idea. However, it's been shown that there are apparent rendering differences between the modes, and that's the main point.
True, but just to get a kick in before the horse is fully dead, let me point this out from that thread Snipe found, in a post of Brent's about the Far Cry log:

Brent's Far Cry log in that thread Snipe found said:
Driver description: NVIDIA GeForce FX 5950 Ultra
Full stats: HAL (pure hw vp): NVIDIA GeForce FX 5950 Ultra
Hardware acceleration: Yes
Full screen AA: Disabled
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1792x1344
Maximum Texture size: 4096x4096 (Max Aspect: 4096)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for NV3x GPUs
Pixel shaders usage: Replace PS.2.0 to PS.1.1
Shadow maps type: Mixed Depth/2D maps
(emphasis mine)

I take "Replace" to imply the "or" option much more than any mixed mode.
 
From all the info I have seen, the NV3X renders in PS1.1 only, while ATI renders in PS2 and PS1.1. I have heard that many have tried to force PS2 on the GFFX and it still goes back to PS1.1: it will show PS2 but render the same image as PS1.1. Has anybody been able to force PS2 on a GFFX yet?
 
digitalwanderer said:
OpenGL guy said:
It could be mixed for this game; I have no idea. However, it's been shown that there are apparent rendering differences between the modes, and that's the main point.
True, but just to get a kick in before the horse is fully dead, let me point this out from that thread Snipe found, in a post of Brent's about the Far Cry log:

Brent's Far Cry log in that thread Snipe found said:
Driver description: NVIDIA GeForce FX 5950 Ultra
Full stats: HAL (pure hw vp): NVIDIA GeForce FX 5950 Ultra
Hardware acceleration: Yes
Full screen AA: Disabled
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1792x1344
Maximum Texture size: 4096x4096 (Max Aspect: 4096)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for NV3x GPUs
Pixel shaders usage: Replace PS.2.0 to PS.1.1
Shadow maps type: Mixed Depth/2D maps
(emphasis mine)

I take "Replace" to imply the "or" option much more than any mixed mode.
That appears to be the case... You'd think that in a benchmark mode (or are people using Fraps here?) the game would create a level playing field when possible...
 
OpenGL guy said:
You'd think that in a benchmark mode (or are people using Fraps here?) the game would create a level playing field when possible...
I would think that too; I wonder what could have made them not do something like that... :?




;)
 
My log shows:

Full stats: HAL (pure hw vp): RADEON 9800 XT
Hardware acceleration: Yes
Full screen AA: Enabled (2 samples)
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1280x1024
Maximum Texture size: 2048x2048 (Max Aspect: 2048)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for ATI R300 GPU
Pixel shaders usage: PS1.1 only
Shadow maps type: Mixed Depth/2D maps

I think Brent's showed 2.0 being used :?: The demo, and the first and second betas, all show this same log for me.
 
digitalwanderer said:
OpenGL guy said:
You'd think that in a benchmark mode (or are people using Fraps here?) the game would create a level playing field when possible...
I would think that too; I wonder what could have made them not do something like that... :?


Well, that's not been the case in the past. Remember Serious Sam? When you ran it in benchmark mode, it still rendered things differently based on which card was used. That's why we needed the B3D scripts to force the workload to be the same. And when the full game ships, I bet you will need a script as well to ensure FarCry does the same.
 
fallguy said:
My log shows:

Full stats: HAL (pure hw vp): RADEON 9800 XT
Hardware acceleration: Yes
Full screen AA: Enabled (2 samples)
Stencil type: Two sided
Projective EMBM: enabled
Detail textures: Yes
Z Buffer Locking: Yes
Use multitexture mode: Yes (8 texture(s))
Use bumpmapping : Yes (DOT3)
Use paletted textures : No
Current Resolution: 1024x768x32 Full Screen
Maximum Resolution: 1280x1024
Maximum Texture size: 2048x2048 (Max Aspect: 2048)
Texture filtering type: TRILINEAR
Use 32 bits textures
Gamma control: Hardware
Vertex Shaders version 2.0
Pixel Shaders version 2.0
Use Hardware Shaders for ATI R300 GPU
Pixel shaders usage: PS1.1 only
Shadow maps type: Mixed Depth/2D maps

I think Brent's showed 2.0 being used :?: The demo, and the first and second betas, all show this same log for me.

I've checked and rechecked mine, and it's correct.

What version of the game are you using?

The file version shown in the top-right corner inside the game says FC 5104B1120.

In the logfile it says:

FileVersion: 1.1.5.1120
ProductVersion: 1.1.5.1120

This is the latest publicly released demo.

BTW I'm using Cat 4.2 on the 9800XT
 
I'm not at home right now; I will be in about an hour or so and can tell you then.

I had my 5900NU in my main system, where my 9800XT is now. I thought perhaps the log had kept the settings from it, so I downloaded the demo, and the log says the same thing. The first beta, the second beta, and now the demo all tell me the same thing. I don't have any third-party tweakers installed or anything, just the 4.2 Cats.

When I get home, just to make sure, I'll back up my log and then delete it, and hopefully the game will make a new one. I have not had the 5900NU in that system since I did a format, so there are zero nVidia drivers or anything else on the system. So if it makes a new log, it should be correct.

Anybody with an R3xx core should be able to just look at their log in the main dir of the demo and see what it says.
 