X800 texture shimmering: FarCry video

radar1200gs said:
No, resolution certainly won't hold developers back.

They don't care what resolution you play their game in, they only care that you are able to play their game in the first place.

I disagree. Devs do care. All the effects in the world don't matter if a person is stuck at 640x480 when other games using older engines will run at a higher res and thus have better image quality :) but you believe what you want, radar.
 
radar1200gs said:
It's fine to have an opinion, but the fact is NV3x does run FarCry (and any other DX9-ish game you care to mention) and does so while remaining compliant with DX9 specs.

That's with 24-bit to 16-bit shader replacements, though. Technically, the FXes are "compliant" because Microsoft said so, easing the specs a little for NVIDIA.

I seem to recall DX9 originally called for 8x1 or nx1 pipelines, not 4x2 or 2x2, and I'm pretty sure the aim was to have 24-bit or 32-bit shaders as a useful feature, not a checkbox feature.
If Microsoft had wanted to, they could have flexed their muscles a little and denied NVIDIA the right to label the FX DX9 compliant.

So yes, the FX is just barely DX9 compliant, and this IS holding DX9 back.
 
jvd said:
I disagree. Devs do care. All the effects in the world don't matter if a person is stuck at 640x480 when other games using older engines will run at a higher res and thus have better image quality :) but you believe what you want, radar.

I think it's pretty obvious that higher res doesn't automatically mean better IQ. I mean, what kind of resolution do you think your TV is using?
 
Sandwich said:
radar1200gs said:
It's fine to have an opinion, but the fact is NV3x does run FarCry (and any other DX9-ish game you care to mention) and does so while remaining compliant with DX9 specs.

That's with 24-bit to 16-bit shader replacements, though. Technically, the FXes are "compliant" because Microsoft said so, easing the specs a little for NVIDIA.

I seem to recall DX9 originally called for 8x1 or nx1 pipelines, not 4x2 or 2x2, and I'm pretty sure the aim was to have 24-bit or 32-bit shaders as a useful feature, not a checkbox feature.
If Microsoft had wanted to, they could have flexed their muscles a little and denied NVIDIA the right to label the FX DX9 compliant.

So yes, the FX is just barely DX9 compliant, and this IS holding DX9 back.

Just for your information, DX9's native precision isn't FP24 - data is handed to R300 as FP32 if the dev hasn't explicitly specified another format, and the R300 then internally truncates FP32 to FP24 (in my book that's full-time _PP with no user choice).

And no, FX chips do not replace FP24 (or any other format) shaders with FP16 unless they are preceded by a _PP hint.

And guess what, the _PP (partial precision) hint is a valid, compliant part of DX9.
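
To make the hint being argued about concrete: in DX9 HLSL, declaring values as "half" (or compiling with the D3DXSHADER_PARTIALPRECISION flag) is typically what puts the _pp modifier on the compiled ps_2_0 instructions; FP24 hardware can simply ignore the hint, while NV3x is free to run those operations at FP16. A minimal sketch assuming the legacy D3DX9 utility library; the shader and the function name are made up for illustration.

Code:
// Sketch: where the DX9 _pp (partial precision) hint comes from.
#include <cstring>
#include <d3dx9.h>

// "half" variables typically compile to ps_2_0 instructions carrying the
// _pp modifier; "float" requests full precision (FP24 minimum under DX9).
static const char* kPixelShader =
    "float4 main(float2 uv : TEXCOORD0) : COLOR0  \n"
    "{                                            \n"
    "    half4  tint = half4(0.5, 0.5, 0.5, 1.0); \n"  // partial precision is fine for a simple tint
    "    float4 pos  = float4(uv, 0.0, 1.0);      \n"  // full precision where accuracy matters
    "    return pos * tint;                       \n"
    "}                                            \n";

bool CompileExample()
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    // Passing D3DXSHADER_PARTIALPRECISION instead of 0 would force _pp onto
    // every instruction - roughly what a driver-side replacement does when
    // the developer never asked for it.
    HRESULT hr = D3DXCompileShader(kPixelShader, (UINT)strlen(kPixelShader),
                                   NULL, NULL, "main", "ps_2_0",
                                   0 /*flags*/, &code, &errors, NULL);
    if (code)   code->Release();
    if (errors) errors->Release();
    return SUCCEEDED(hr);
}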
 
Bjorn said:
jvd said:
I disagree. Devs do care. All the effects in the world don't matter if a person is stuck at 640x480 when other games using older engines will run at a higher res and thus have better image quality :) but you believe what you want, radar.

I think it's pretty obvious that higher res doesn't automatically mean better IQ. I mean, what kind of resolution do you think your TV is using?

The number of pixels in a horizontal line would be infinite. :p
 
Bjorn said:
jvd said:
I disagree. Devs do care. All the effects in the world don't matter if a person is stuck at 640x480 when other games using older engines will run at a higher res and thus have better image quality :) but you believe what you want, radar.

I think it's pretty obvious that higher res doesn't automatically mean better IQ. I mean, what kind of resolution do you think your TV is using?

1080i, I believe.

Look at it this way. Doom 3 is capable of running on a GeForce 2 at 640x480 at 15fps or so. How many people do you know who would have bought this game, back when the GeForce 2 was top of the line, if they were going to have problems running it at 640x480?

Or how about when the GeForce 3 first came out? How many people would have been happy playing it at 800x600?

The fact is, not many. If FarCry forced FX cards to run at full precision with all PS 2.0 shaders (let alone 2.0a), and suddenly their $400 video cards were stuck at 800x600 at 30fps with no aniso and no FSAA, how many people would have bought the game? How many would have bitched and called it a crappy engine?

Fact is that many people do want higher res, as a higher res = better image quality.
 
radar1200gs said:
Sandwich said:
radar1200gs said:
It's fine to have an opinion, but the fact is NV3x does run FarCry (and any other DX9-ish game you care to mention) and does so while remaining compliant with DX9 specs.

That's with 24-bit to 16-bit shader replacements, though. Technically, the FXes are "compliant" because Microsoft said so, easing the specs a little for NVIDIA.

I seem to recall DX9 originally called for 8x1 or nx1 pipelines, not 4x2 or 2x2, and I'm pretty sure the aim was to have 24-bit or 32-bit shaders as a useful feature, not a checkbox feature.
If Microsoft had wanted to, they could have flexed their muscles a little and denied NVIDIA the right to label the FX DX9 compliant.

So yes, the FX is just barely DX9 compliant, and this IS holding DX9 back.

Just for your information, DX9's native precision isn't FP24 - data is handed to R300 as FP32 if the dev hasn't explicitly specified another format, and the R300 then internally truncates FP32 to FP24 (in my book that's full-time _PP with no user choice).

And no, FX chips do not replace FP24 (or any other format) shaders with FP16 unless they are preceded by a _PP hint.

And guess what, the _PP (partial precision) hint is a valid, compliant part of DX9.

The problem with this is that the _PP hint is forced (via app detection) in the drivers by NVIDIA, even when it isn't used by the game developers, in desperation to get the FX running at decent speeds.

With ATI you get 24-bit as a valid DX9 precision, while with the FX you get 16-bit, valid or not. Guess what everyone prefers.
 
Just for your information, DX9's native precision isn't FP24 - data is handed to R300 as FP32 if the dev hasn't explicitly specified another format, and the R300 then internally truncates FP32 to FP24 (in my book that's full-time _PP with no user choice).

Full precision under DX9 is FP24, not FP32. It is not ATI's fault that NVIDIA exceeded the standard and proceeded to build a part that would have difficulty rendering at that precision.

No, that is not full-time _PP, as the spec calls FP24 full precision. If ATI forced you to use FP16, then it would be full-time _PP.


And no, FX chips do not replace FP24 (or any other format) shaders with FP16 unless they are preceded by a _PP hint.
That depends on what the drivers tell it to do.

The FX should not replace any calls with FP16 unless preceded by a _PP hint. But can you back it up with proof that NVIDIA's shader replacements aren't forcing it to FP16?

And guess what, the _PP (partial precision) hint is a valid, compliant part of DX9
Yes, now it is. Score one for NVIDIA holding the industry back yet again.
 
radar1200gs said:
Just for your information, DX9's native precision isn't FP24 - data is handed to R300 as FP32 if the dev hasn't explicitly specified another format, and the R300 then internally truncates FP32 to FP24 (in my book that's full-time _PP with no user choice).

That is incorrect; in DX the developers must specify the size/format of the data exactly, especially for textures. The R300 does natively support FP32 in one of the shader units (the vertex shaders, I believe).
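
To illustrate the point about developers specifying formats explicitly: in D3D9 the texture or render-target format is an explicit D3DFORMAT argument, so the floating-point width of the data handed to the chip is a developer decision, not a driver default. A minimal sketch assuming an already-created IDirect3DDevice9; the function name and dimensions are made up for illustration.

Code:
// Sketch: D3D9 texture formats are chosen explicitly by the developer.
#include <d3d9.h>

HRESULT CreateFloatTargets(IDirect3DDevice9* device,
                           IDirect3DTexture9** fp16Target,
                           IDirect3DTexture9** fp32Target)
{
    // 16 bits per channel floating point...
    HRESULT hr = device->CreateTexture(256, 256, 1, D3DUSAGE_RENDERTARGET,
                                       D3DFMT_A16B16G16R16F, D3DPOOL_DEFAULT,
                                       fp16Target, NULL);
    if (FAILED(hr)) return hr;

    // ...versus 32 bits per channel; nothing here is left up to the driver.
    return device->CreateTexture(256, 256, 1, D3DUSAGE_RENDERTARGET,
                                 D3DFMT_A32B32G32R32F, D3DPOOL_DEFAULT,
                                 fp32Target, NULL);
}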
 
I'm not talking about the vertex units, I'm talking about the pixel shaders, and they either up-convert lesser values to FP24 or truncate FP32 to FP24 before operating on data.

Search the forums. One of the ATi guys explained exactly how it works a while back.
 
radar1200gs said:
I'm not talking about the vertex units, I'm talking about the pixel shaders, and they either up-convert lesser values to FP24 or truncate FP32 to FP24 before operating on data.

Search the forums. One of the ATi guys explained exactly how it works a while back.
What does it matter how it's done? FP24 is full precision. It doesn't matter if they do it internally at 128-bit and then downsample to FP24, as long as it's FP24, which is full precision in DX9.
 
There is no conversion. It is truncation. Certain bits are hardwired to a specific value, meaning you can't go above it. The registers (IIRC) are all FP32 format. The R300 just doesn't route the truncated bits through the system.
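
For context on why the thread keeps circling FP16 vs FP24 vs FP32: the formats differ mainly in mantissa width (FP16 is s10e5, FP32 is s23e8, and R300's FP24 is usually described as s16e7), and the mantissa width sets the relative precision. A small standalone sketch of that arithmetic; the bit layouts are quoted from memory, not from this thread.

Code:
// Rough relative precision of the shader formats under discussion.
#include <cmath>
#include <cstdio>

int main()
{
    const struct { const char* name; int mantissaBits; } formats[] = {
        { "FP16", 10 },  // s10e5
        { "FP24", 16 },  // s16e7, as usually described for R300
        { "FP32", 23 },  // s23e8
    };
    for (const auto& f : formats) {
        // Smallest representable relative step is about 2^-mantissaBits.
        std::printf("%s: roughly 1 part in %.0f\n",
                    f.name, std::pow(2.0, f.mantissaBits));
    }
    return 0;
}
// Prints roughly: FP16 = 1 part in 1024, FP24 = 1 part in 65536,
// FP32 = 1 part in 8388608 - which is why long shader chains show
// banding sooner at FP16.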
 
3DMark03 is irrelevant
For something so irrelevant, NVIDIA sure did spend a lot of time boosting their scores in it.

The simple fact is NVIDIA has done it and may still be doing it. When the partner is in the TWIMTBP program (FarCry), they do it using the game engine; when the developer is not part of it, they do it behind their backs with shader replacements (Futuremark).
 
Well, congratulations jvd, you have successfully derailed this thread into being about NVIDIA, as you've tried to do to every thread about ATI filtering...
If you want to talk about DX9 compliance, start a new thread. If you want to talk about NVIDIA AA, there's already a thread for that too.
 
radar1200gs said:
BRiT said:
Sandwich said:
radar1200gs said:
You can of course back your claims up with proof?

We have the shader replacements in 3DMark.

And Doom 3...

3DMark03 is irrelevant.

Doom 3 is not a DX9 title.

Why the requirement on being a DX9 title when you asked for proof about NVIDIA's shader replacements? Why is 3DMark03 irrelevant?
 