Why all the artefacts? (for example in Far Cry)

I think it is because of UltraShadow being enabled by default in later drivers. If someone tested any pre-NV35 card (NV30, NV31 or NV34) and still saw the artifacts, that would prove me wrong. Is it so hard?

I don't have a GF FX personally.

please someone help.
 
My friend showed me that review and used it as proof of a 5700 being faster than the 9600XT. I then had to point out that the 9600XT was run at a higher res than the 5700.

I really don't like HardOCP and how they review things.
 
vb said:
I think it is because of UltraShadow being enabled by default in later drivers. If someone tested any pre-NV35 card (NV30, NV31 or NV34) and still saw the artifacts, that would prove me wrong. Is it so hard?

I think [H] has found the problem in NV31 too? I'm not sure though, just a faint memory.
 
UltraShadow is OGL only, you crazed monkeys.

Oh, and you can't just go, "Poof! UltraShadow enabled!" A developer has to specifically code for UltraShadow support.
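
For what it's worth, "coding for it" looks roughly like this. As far as I know, the app-visible piece of UltraShadow is the EXT_depth_bounds_test OpenGL extension, and the app has to detect it, fetch the entry point and enable the test itself; nothing is on by default. A minimal sketch assuming a Windows/wgl setup (init_ultrashadow is a made-up helper name):

Code:
/* Sketch: an app has to explicitly detect and enable the depth bounds test
   (the OpenGL extension behind UltraShadow); nothing happens by default. */
#include <windows.h>
#include <string.h>
#include <GL/gl.h>
#include <GL/glext.h>   /* PFNGLDEPTHBOUNDSEXTPROC, GL_DEPTH_BOUNDS_TEST_EXT */

static PFNGLDEPTHBOUNDSEXTPROC my_glDepthBoundsEXT = NULL;

int init_ultrashadow(void)   /* illustrative helper name */
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    if (!exts || !strstr(exts, "GL_EXT_depth_bounds_test"))
        return 0;                           /* driver/chip doesn't expose it */

    my_glDepthBoundsEXT =
        (PFNGLDEPTHBOUNDSEXTPROC)wglGetProcAddress("glDepthBoundsEXT");
    if (!my_glDepthBoundsEXT)
        return 0;

    /* Even now nothing gets culled until the app enables the test and
       supplies useful bounds per draw. */
    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
    my_glDepthBoundsEXT(0.0, 1.0);          /* full range = no culling yet */
    return 1;
}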
 
The Baron said:
UltraShadow is OGL only, you crazed monkeys.

Oh, and you can't just go, "Poof! UltraShadow enabled!" A developer has to specifically code for UltraShadow support.

UltraShadow is a chip feature that is only exposed in OpenGL. That means it can only be called (or not) from OpenGL. The driver doesn't give a flying f**k whether the app is DirectX or OpenGL. If the driver ships in a state where it doesn't render lights/shadows beyond some distance from the light source (or whatever the condition is), it won't matter that the app didn't ask for it.

Actually it does, because the app can't turn it OFF.

That is my point.
 
so you're saying that because the card has SUPPORT for the UltraShadow extension (yes, Shirley, to use UltraShadow, you SPECIFICALLY have to CALL the EXTENSION, which is OGL only), there are rendering artifacts?

did I wake up in Bizarro World? are you on drugs? the driver screws up lighting (which happens all the time--if you really want to complain about lighting problems, look at the X2 Rolling Demo), and you're saying it's because... what... NVIDIA is preventing lights/shadows from being rendered if they're beyond a certain distance? do you realize how insane that sounds?
 
The Baron said:
so you're saying that because the card has SUPPORT for the UltraShadow extension ... there are rendering artifacts?

Nope.

I say it is reducing workload.

The Baron said:
yes, Shirley, to use UltraShadow, you SPECIFICALLY have to CALL the EXTENSION, which is OGL only?

Who says that anybody is trying to use UltraShadow except Nvidia?

Are you telling me that they have to call it?

It is their chip AFAIK.

The Baron said:
did I wake up in Bizarro World? are you on drugs? the driver screws up lighting (which happens all the time--if you really want to complain about lighting problems, look at the X2 Rolling Demo), and you're saying it's because... what... NVIDIA is preventing lights/shadows from being rendered if they're beyond a certain distance? do you realize how insane that sounds?

No, they wouldn't do that, would they?

What next? Shader replacement? Camera-on-rails optimisations?

I can't imagine why it crossed my mind.

After all, it is just a bug. It appeared in a couple of games and then it spread like a virus. Not at all like Brilinear.

I think Nvidia are being hacked by malicious individuals who are screwing up their "golden" drivers. How does this sound? Less Bizarro World?
 
vb said:
Who says that anybody is trying to use UltraShadow except Nvidia?

nVidia cannot call UltraShadow of their own volition. It has to be explicitly called by the application, and even then only in OpenGL. For it to work any other way is simply impossible; it couldn't be forced.
 
Hanners said:
nVidia cannot call UltraShadow of their own volition. It has to be explicitly called by the application, and even then only in OpenGL. For it to work any other way is simply impossible; it couldn't be forced.

In theory, if it is called it will set a nice little flag that says: "this light source only lights or casts shadows from X to Y".

What if the driver sets X and Y arbitrarily for most light sources?
 
Here. Read. Learn.

It applies to stencil shadows only. Considering HEY, we don't have anything that uses stencil shadows yet (definitely not Far Cry or X2)... Houston, we have a problem. Even if they WERE using some sort of ridiculous app detection and pixel discarding, do you see an improvement in frame rates versus whatever hypothetical driver set does not have that bug/feature?

But anyway, vb, congratulations, you win my Ridiculous F@nboy Award for March.
 
The Baron said:
It applies to stencil shadows only. Considering HEY, we don't have anything that uses stencil shadows yet (definitely not Far Cry or X2)... Houston, we have a problem. Even if they WERE using some sort of ridiculous app detection and pixel discarding, do you see an improvement in frame rates versus whatever hypothetical driver set does not have that bug/feature?

Far Cry Website said:
Lighting and Shadows: Combines pre-calculated, real-time shadows, stencil shadows and lightmaps to produce a dynamic environment. Includes high-resolution, correct perspective, and volumetric smooth-shadow implementations for dramatic and realistic indoor shadowing. Also supports advanced particles technology and any kind of volumetric lighting effects on particles.

The Baron said:
But anyway, vb, congratulations, you win my Ridiculous F@nboy Award for March.

Thank you. Considering it all started with "If someone tested any pre-NV35 card (NV30, NV31 or NV34) and still saw the artifacts, that would prove me wrong. Is it so hard?" before it turned into drugs, I feel it is undeserved, but it is such a joy to go to a place where you can talk about things, relax, and, once in a while, get some recognition. I have to admit you make B3d such a nice place to be.
 
There is no way in hell the NV driver is "turning on" UltraShadow. The optimization it provides is only applicable to fill-limited drawing of shadow volumes, and there is no (efficient) way at all for the driver to know what bounds to use without higher-level information from the app. Besides, the rendering errors look nothing like stencil shadows with holes in them, so the theory is bogus anyway.
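
To spell that out with a rough sketch (a made-up Light type and app-side helpers, with the extension entry point fetched the usual way via wglGetProcAddress): the per-light zmin/zmax come from the app projecting its own light bounds into depth, which is exactly the higher-level information a driver doesn't have.

Code:
#include <GL/gl.h>
#include <GL/glext.h>

typedef struct Light Light;   /* the app's own light type - illustrative */

/* App-side math: project the light's bounding volume into window-space depth.
   Only the app knows the light's position and radius. */
void compute_light_depth_bounds(const Light *l, GLclampd *zmin, GLclampd *zmax);
void draw_shadow_volume(const Light *l);   /* stencil inc/dec passes - illustrative */

extern PFNGLDEPTHBOUNDSEXTPROC my_glDepthBoundsEXT;   /* fetched via wglGetProcAddress */

void draw_stencil_shadow_for_light(const Light *light)
{
    GLclampd zmin, zmax;
    compute_light_depth_bounds(light, &zmin, &zmax);

    glEnable(GL_DEPTH_BOUNDS_TEST_EXT);
    my_glDepthBoundsEXT(zmin, zmax);   /* fragments outside [zmin, zmax] are skipped */

    /* the fill-limited part: drawing shadow volumes into the stencil buffer */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    glDepthMask(GL_FALSE);
    draw_shadow_volume(light);
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
    glDepthMask(GL_TRUE);

    glDisable(GL_DEPTH_BOUNDS_TEST_EXT);
}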

Some of the difference in those shots is due to differences in fogging, which is a well-known problem. Essentially NV expects the vertex program to output a fog weight in [0,1] and ATI wants something else, world-space distance I think. Or it's the other way around, I forget.
 
jvd said:
My friend showed me that review and used it as proof of a 5700 being faster than the 9600XT. I then had to point out that the 9600XT was run at a higher res than the 5700.

I really don't like HardOCP and how they review things.

Perhaps instead you should blame your friend for not reading the review properly, rather than [H]ard|Ocp?
 
Quitch said:
jvd said:
My friend showed me that review and used it as proof of a 5700 being faster than the 9600XT. I then had to point out that the 9600XT was run at a higher res than the 5700.

I really don't like HardOCP and how they review things.

Perhaps instead you should blame your friend for not reading the review properly, rather than [H]ard|Ocp?

The fact that they have 3 different cards running at different AA and aniso settings and then compare which card you should buy based on that is just stupid.

If the 5700 sucks at a certain res and AA and aniso setting, then it should be marked that way and you should be shown that performance.
 
jvd said:
The fact that they have 3 different cards running at different AA and aniso settings and then compare which card you should buy based on that is just stupid.

If the 5700 sucks at a certain res and AA and aniso setting, then it should be marked that way and you should be shown that performance.


It's not as strange as you would think. Actually, when you think about it, they have the BEST way of comparing... Let me explain:

The current generation of videocards cannot easily be compared. Suppose you run both the Radeon and the GeForce FX at 2xAA. Would that be fair? Not at all! It's well known that the Radeon's AA is much better than the FX's. Considering the AA results, it's probably more fair to run the Radeon at 2xAA and the FX at 4xAA...

Equal card settings do not mean that you're comparing apples to apples!!
And that holds not only for AA, but also for AF, etc. Even without AA and AF you can see IQ differences between cards.

A GOOD reviewer should remember that a video card can exchange speed for image quality and vice versa. Actually, that's what most of these 'driver optimizations' do! Lose some image quality to gain speed.
You also do it yourself when choosing the optimal settings for a game.

Because there is a relation between image quality and speed, a benchmark should either:
1) set a certain IQ level, and compare the speed
2) set a certain speed, and compare the image quality.

Most reviewers think that they're doing (1) because they set the same resolution and AA and AF settings. As I explained above, they are wrong. That means they run at different IQ and then compare speed. That's meaningless.

Furthermore, you have the silly thing that in those reviews you see framerates of >100, where it really doesn't mean a thing that one is faster than the other (Quake?). And who cares if card X is 50% faster than card Y when both are doing <10 fps? You can't play the game anyway. Even though those numbers could be fair, they are meaningless!

What HardOCP is starting to do is benchmark according to (2): set a certain speed, and see what image quality you get. As I explained above, there's nothing wrong with this approach.

Actually, it has quite a number of advantages:
* It's usually quite possible to set equal speeds, because of the large number of graphics settings. So they DO compare apples to apples.
* You're not dependent on differences in AA and AF algorithms. It doesn't matter HOW they do it, only the results count.
* Even 'driver optimizations' are treated more fairly. If they exchange IQ for speed, they don't win anything in this kind of benchmark. If they really improve speed without damaging IQ, they do win.
* It's closest to the actual gameplay situation. It's the same as what someone at home would do: achieving a good balance between speed and IQ.


Of course, you DO have to be aware of this different kind of reviewing. You have to READ the review, and not just quickly browse through it.
But that's just a matter of getting used to it.

You'll probably think it's also more dependent on the reviewer, because he has to assess image quality.
But think about this: a PROPER review based on the first method should ALSO assess image quality, select the settings that produce equal image quality, and only then compare the framerates the cards achieve. (Unfortunately, they're not doing that...)

So, there's really not much difference there. And actually, I think it's more reliable to set equal speed than equal IQ... speed is just a number anyway. Let the judging of IQ be left to the reader by posting screenshots...
 