X800 texture shimmering: Far Cry video

To me the main question is this: if trylinear can be turned off one way or another, are reviewers of retail X800 XT and 6800 Ultra cards going to turn off both trylinear and brilinear, are some sites going to leave both on, or, depending on the colour of the site ;) , will they select whatever looks best to them?

Dave @B3D , would you like to comment ?

The problem with this thread is that it is all getting very subjective, much like the temporal AA thread was. Benchmarking with no optimisations at least tries to keep things objective and removes a set of permutations.
 
radar1200gs said:
The NV3x series is DX9 compliant enough to run Ruby with no shader changes, which was ATi's demonstration of their "new big thing".
"Compliant enough", now that's amusing. So "compliant" that the NV3x still can't run all the ShaderMark tests more than a year after being released. So "compliant" that many games don't even use a PS 2.0 path on NV3x parts.

Funny definition of "compliance".

-FUDie
 
PatrickL said:
Bjorn, buy an X800 Pro and play with it, then you will see how little sense your post makes in a real gaming situation :LOL:

I don't doubt that, actually. But some sites claim that even brilinear is not noticeable when playing games (ExtremeTech), so why should we automatically enable full trilinear on the NV40 then? And the fact is, both the 6800 and X800 have the raw power to always do full trilinear, so an option wouldn't exactly hurt anyone. At least not the consumer.
 
FUDie said:
radar1200gs said:
The NV3x series is DX9 compliant enough to run Ruby with no shader changes, which was ATi's demonstration of their "new big thing".
"Compliant enough", now that's amusing. So "compliant" that the NV3x still can't run all the ShaderMark tests more than a year after being released. So "compliant" that many games don't even use a PS 2.0 path on NV3x parts.
Hmmm, even the 6800 doesn't run all the SM tests, and that does seem to be due to some render target formats that are not supported by NVIDIA.

BTW, it's not ShaderMark that determines whether something is DX compliant, it's M$, and performance has nothing to do with compliance, unfortunately.
 
Evildeus said:
FUDie said:
radar1200gs said:
The NV3x series is DX9 compliant enough to run Ruby with no shader changes, which was ATi's demonstration of their "new big thing".
"Compliant enough", now that's amusing. So "compliant" that the NV3x still can't run all the ShaderMark tests more than a year after being released. So "compliant" that many games don't even use a PS 2.0 path on NV3x parts.
Hmmm, even the 6800 doesn't run all the SM tests, and that does seem to be due to some render target formats that are not supported by NVIDIA.

BTW, it's not ShaderMark that determines whether something is DX compliant, it's M$, and performance has nothing to do with compliance, unfortunately.
Well, duh, but that also applies to which cards can run Ruby.

-FUDie
 
jvd said:
I get an error: crysystem.dll loading failed.

Something about the default font from the XML file?
Well, that doesn't surprise me. As I've mentioned before, I've got the German version, and I'm not surprised that a save game from the German version doesn't work on the US version.

Sorry...
 
FUDie said:
radar1200gs said:
The NV3x series is DX9 compliant enough to run Ruby with no shader changes, which was ATi's demonstration of their "new big thing".
"Compliant enough", now that's amusing. So "compliant" that the NV3x still can't run all the ShaderMark tests more than a year after being released. So "compliant" that many games don't even use a PS 2.0 path on NV3x parts.

Funny definition of "compliance".

-FUDie

Ever stopped to think that it may be ShaderMark at fault, not the hardware? It was written on ATi hardware, and people ass-u-me'd that ATi's way is the only correct way.
 
Bjorn said:
PatrickL said:
Bjorn, buy an X800 Pro and play with it, then you will see how little sense your post makes in a real gaming situation :LOL:

I don't doubt that, actually. But some sites claim that even brilinear is not noticeable when playing games (ExtremeTech), so why should we automatically enable full trilinear on the NV40 then? And the fact is, both the 6800 and X800 have the raw power to always do full trilinear, so an option wouldn't exactly hurt anyone. At least not the consumer.

The little problem with that is that some sites say they don't see brilinear (one?) while all the others see it, albeit far less on the 6800, while some others (two?) claim to see ATI's optimisation, though they need special tools to show it. Seems an obvious difference to me ;)

Grestorm's videos prove next to nothing to me, as I can't reproduce the effect on my PC. It could be a driver bug, a bug on his PC, or anything else. And if it is a bug, well, I think NVIDIA cards have some in Far Cry too.
 
Have there actually been any examples outside of Far Cry? Halo was mentioned earlier, but it seems that it *hasn't* been demonstrated in that game. If it's only Far Cry (and in very limited circumstances), it might not even be trylinear but the engine itself, as has already been suggested and the theory demonstrated in another thread.

I don't see where all these conclusions are coming from. There's very little actual evidence of anything, and some people just seem very willing to jump one way or the other. Never have so many argued so much about so little ;)

Frankly, I think people need to sit down and do some real investigation. Can this effect be demonstrated outside of the circumstances of the video this thread contains? Is it visible at other locations? Has the theory about the way the engine works been put to the test?

I'd have thought after last year we'd see less knee jerk reactions and more investigation.
 
Evildeus said:
FUDie said:
Well, duh, but that also applies to which cards can run Ruby.

-FUDie
What do you mean? :?
Speaking of cards that can run Ruby...

All the performance in the world does you no good at all if your lack of features makes you obsolete, as ATi so hilariously proved with R300 and Ruby.

It doesn't help that the obsoleting features were the very ones fans of the company derided as unnecessary, non-compliant, a waste of silicon, etc. in NV3x...

Most enjoyable.
 
Well, if the same thing is used in different applications and isn't supported by NVIDIA's cards, I don't see how it is supposed to work :? So what's the point? Moreover, if WHQL certification is given, I suppose M$ doesn't require this.
 
PatrickL said:
The little problem with that is that some sites say they don't see brilinear (one?) while all the others see it, albeit far less on the 6800, while some others (two?) claim to see ATI's optimisation, though they need special tools to show it. Seems an obvious difference to me ;)

How many tests have we seen with brilinear (6800) and actual gameplay?
 
radar1200gs said:
Evildeus said:
FUDie said:
Well, duh, but that also applies to which cards can run Ruby.

-FUDie
What do you mean? :?
Speaking of cards that can run Ruby...

All the performance in the world does you no good at all if your lack of features makes you obsolete, as ATi so hilariously proved with R300 and Ruby.

It doesn't help that the obsoleting features were the very ones fans of the company derided as unnecessary, non-compliant, a waste of silicon, etc. in NV3x...

Most enjoyable.

Eck, my R3x0 can't run Ruby. Your NV3x can't run Far Cry with SM 2.0 on.

I wonder, as a consumer, which one is more important to have running.

What you posted in quotes shows exactly how biased you are.

(BTW, Ruby runs just fine with a wrapper on my R3x0, just like on your NV3x.)


And the "reduced performance in AF on ATi cards" would hardly mean lower performance than on the NV cards, just lower in comparison. And the "other reason" for enabling it would of course be to get a more "apples to apples" comparison of raw performance, and it would get rid of any supposed corner cases where trylinear wouldn't deliver optimal quality. And of course it would get the community to shut up about this. I'm also guessing that we'll see brilinear vs. trylinear in coming reviews anyway (just look at the ExtremeTech article, where they stated that the differences between all methods were minimal and not noticeable when playing the game), so I don't think it would hurt at this point in time. Make it a checkbox, like it is in NVIDIA's drivers.

You just said exactly why they will never do it.

As you said, the only reason to do it is to make NVIDIA look better. Why would ATi do this?

Especially if there is no image quality decrease?

I have only seen two examples, both by users on this board (one or both signed up just to post them), and the effect has yet to be reproduced. Surely, given that the Inq and other sites quote and take stuff from these forums, if these videos were repeatable they would already have an article up showing the bad, bad images from the R420, don't you agree?
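For readers who haven't followed the filtering debate, here is a toy Python sketch of the difference between full trilinear and a "brilinear"-style reduction. This is an illustration only, not either vendor's actual hardware algorithm, and the 0.25/0.75 band edges are made-up values:

```python
def trilinear_weight(lod_frac):
    # Full trilinear: blend between the two nearest mip levels
    # across the entire fractional LOD range [0, 1].
    return lod_frac

def brilinear_weight(lod_frac, lo=0.25, hi=0.75):
    # "Brilinear"-style: pure bilinear (weight pinned to 0 or 1)
    # near each mip level, blending only inside a narrow band
    # between mips. The band edges (0.25/0.75) are invented here
    # for illustration; real drivers pick their own thresholds.
    return min(max((lod_frac - lo) / (hi - lo), 0.0), 1.0)

print(trilinear_weight(0.1))   # 0.1
print(brilinear_weight(0.1))   # 0.0  (blend skipped near the mip level)
print(brilinear_weight(0.5))   # 0.5  (full blend mid-transition)
```

Skipping the blend near each mip level saves texture bandwidth, which is why it benchmarks faster, but it can leave visible mip transition bands that full trilinear hides.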
 
Oh yes, it can run Far Cry with SM2.0 on; just set the device ID to R300.

Sure, Ruby may work fine with a wrapper, but that wrapper modifies the DX9 shaders.

Now, NV3x can run Ruby's shaders totally unmodified, and the same goes for Far Cry when the ATi device ID is set.
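The device-ID trick works because an engine can pick its render path from the reported PCI vendor/device ID rather than from capability queries alone. A hypothetical sketch of that kind of logic (the vendor IDs are the real PCI IDs, but the function and path strings are invented for illustration; this is not CryEngine's actual code):

```python
# Real PCI vendor IDs; the path-selection logic below is hypothetical.
VENDOR_NVIDIA = 0x10DE
VENDOR_ATI = 0x1002

def pick_pixel_shader_path(vendor_id):
    # An engine keying on the reported vendor might route NV3x-era
    # hardware to a lower-precision fallback for performance...
    if vendor_id == VENDOR_NVIDIA:
        return "PS 1.1 fallback"
    # ...so overriding the reported ID to look like an ATi R300
    # forces the full PS 2.0 path, regardless of how fast
    # the card actually runs it.
    return "PS 2.0"

print(pick_pixel_shader_path(VENDOR_NVIDIA))  # PS 1.1 fallback
print(pick_pixel_shader_path(VENDOR_ATI))     # PS 2.0
```

This is also why the override changes image quality and performance at the same time: the card is suddenly running shaders the engine never intended to give it.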
 
radar1200gs said:
Oh yes, it can run Far Cry with SM2.0 on; just set the device ID to R300.

And how playable is it, radar?

Can you play it on a mid-range GPU like the 5600 or 5700? How does the performance compare to that of the Radeon 9600?
 
jvd said:
radar1200gs said:
Oh yes, it can run Far Cry with SM2.0 on; just set the device ID to R300.

And how playable is it, radar?

Can you play it on a mid-range GPU like the 5600 or 5700? How does the performance compare to that of the Radeon 9600?

Both my 5700 Personal Cinema and my 5900 Ultra can play Far Cry with the R300 device ID;

I just have to lower the resolution, making IQ sacrifices to run it at full precision and effects. The game is "not" unplayable, unless you consider 640x480 and 800x600 resolutions unplayable. Just because the card isn't as fast as the competition with DirectX 9.0 effects doesn't mean the card is completely unplayable with them.
 
ChrisRay said:
jvd said:
radar1200gs said:
Oh yes, it can run Far Cry with SM2.0 on; just set the device ID to R300.

And how playable is it, radar?

Can you play it on a mid-range GPU like the 5600 or 5700? How does the performance compare to that of the Radeon 9600?

Both my 5700 Personal Cinema and my 5900 Ultra can play Far Cry with the R300 device ID;

I just have to lower the resolution, making IQ sacrifices to run it at full precision and effects. The game is "not" unplayable, unless you consider 640x480 and 800x600 resolutions unplayable. Just because the card isn't as fast as the competition with DirectX 9.0 effects doesn't mean the card is completely unplayable with them.

Well, considering my sister is playing at 1024x768 with 2x FSAA and 8x aniso, I'd consider 640x480 unacceptable for video cards that cost more than her 9600 Pro ($125 or less).

I mean, really, the 5900 Ultra? A pretty much high-end card running at 800x600?
 
jvd said:
ChrisRay said:
jvd said:
radar1200gs said:
Oh yes, it can run Far Cry with SM2.0 on; just set the device ID to R300.

And how playable is it, radar?

Can you play it on a mid-range GPU like the 5600 or 5700? How does the performance compare to that of the Radeon 9600?

Both my 5700 Personal Cinema and my 5900 Ultra can play Far Cry with the R300 device ID;

I just have to lower the resolution, making IQ sacrifices to run it at full precision and effects. The game is "not" unplayable, unless you consider 640x480 and 800x600 resolutions unplayable. Just because the card isn't as fast as the competition with DirectX 9.0 effects doesn't mean the card is completely unplayable with them.

Well, considering my sister is playing at 1024x768 with 2x FSAA and 8x aniso, I'd consider 640x480 unacceptable for video cards that cost more than her 9600 Pro ($125 or less).

I mean, really, the 5900 Ultra? A pretty much high-end card running at 800x600?

Yeah, when you're forcing the card into 32-bit floating point, which just about halves performance, I'd think 800x600 is reasonable.
 
Yeah, when you're forcing the card into 32-bit floating point, which just about halves performance, I'd think 800x600 is reasonable.

Sorry, but with a high-end card, halving performance to run at the same settings as other cards in its price range is not reasonable.


Has anyone forced the 6800 to run on the R3x0 path? I wonder if that has its performance halved too.
 