What's up with FarCry and the 6800?

ChronoReverse said:
At least it seems nvidia is smartening up and allowing true trilinear to be used and "brilinear" to be user selectable.

Here's hoping it will apply to all games and synthetic benchmarks, so the option doesn't "suddenly" stop working in game X or application Y.
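For anyone unclear on what the trilinear/"brilinear" distinction actually is, here's a minimal sketch (the blend-band model and all names are illustrative, not NVIDIA's actual driver logic): true trilinear blends the two nearest mip levels across the whole transition, while "brilinear" narrows the blend to a small band and snaps to the nearest mip elsewhere, saving bandwidth at the cost of visible mip transitions.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Color { float r, g, b; };

Color lerp(const Color& a, const Color& b, float t) {
    return { a.r + (b.r - a.r) * t,
             a.g + (b.g - a.g) * t,
             a.b + (b.b - a.b) * t };
}

// Stub: a real sampler would do a bilinear fetch from mip 'level'.
Color sampleMip(int level, float, float) {
    float g = 1.0f / (1 + level);   // fake: each mip a bit darker
    return { g, g, g };
}

// blendBand = 1.0 gives true trilinear; something like 0.3 approximates
// a "brilinear" mode that only blends near the mip crossover point.
Color filter(float lod, float u, float v, float blendBand) {
    int   base = static_cast<int>(std::floor(lod));
    float frac = lod - base;
    // Re-map the LOD fraction so blending only happens inside the band
    // centred on the 0.5 crossover; outside it, snap to the nearest mip.
    float t = std::clamp((frac - 0.5f * (1.0f - blendBand)) / blendBand,
                         0.0f, 1.0f);
    return lerp(sampleMip(base, u, v), sampleMip(base + 1, u, v), t);
}

int main() {
    Color trueTri = filter(1.3f, 0, 0, 1.0f);   // blends mips 1 and 2
    Color brilin  = filter(1.3f, 0, 0, 0.3f);   // snaps to mip 1
    std::printf("trilinear %.3f  brilinear %.3f\n", trueTri.r, brilin.r);
}
```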
 
Kind of disappointing to see such a monster of a card only enable one AA setting higher than the current champ; certainly not the jump from GF4 to R300 that I'd hoped for...

Perhaps the 400MHz clock is holding it back somewhat... now if the R420 XT comes out at 600MHz as per that slide, that would be impressive, but somehow I can't really see that happening.
 
The GeForce 6800 finally has RGMS FSAA, just like the Radeon. You can easily see that by looking at this screenshot. I have to say that I don't see any difference in quality between the Radeon and the GeForce 6800, so maybe the GeForce 6800 also has gamma-corrected FSAA. Or gamma-corrected FSAA isn't as big an improvement as we thought, and RGMS is far more important.

ATI has sparse sampled MSAA.
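To illustrate why the grid layout matters so much at the same sample count (the offsets below are schematic, not any vendor's exact positions): on a near-vertical edge, coverage is decided almost entirely by each sample's x offset, so the number of *distinct* x offsets sets how many intensity steps the AA can produce. An ordered grid wastes half its samples; a rotated/sparse grid uses all four.

```cpp
#include <cstdio>

struct Offset { float x, y; };   // sample position inside a pixel, [0,1)^2

const Offset ordered4[4] = { {0.25f,0.25f}, {0.75f,0.25f},
                             {0.25f,0.75f}, {0.75f,0.75f} };
const Offset rotated4[4] = { {0.375f,0.125f}, {0.875f,0.375f},
                             {0.125f,0.625f}, {0.625f,0.875f} };

// Fraction of samples on the "inside" of a vertical edge at x = edgeX.
float coverage(const Offset s[4], float edgeX) {
    int in = 0;
    for (int i = 0; i < 4; ++i)
        if (s[i].x < edgeX) ++in;
    return in / 4.0f;
}

int main() {
    // Sweep an edge across the pixel: the ordered grid only ever produces
    // 3 coverage levels (0, 1/2, 1); the rotated grid produces all 5.
    for (float e = 0.0f; e <= 1.0f; e += 0.125f)
        std::printf("edge %.3f  OG %.2f  RG %.2f\n",
                    e, coverage(ordered4, e), coverage(rotated4, e));
}
```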
 
Some nice gains in some games (Splinter Cell, for one), but the Far Cry benches are disappointing; I hope this isn't a sign of more subpar shader performance.
 
http://mbnet.fi/elixir/NV40/10817474486qLMOmeutS_3_2_l.jpg

I still prefer ATI's AA. On the left, where the leg of the platform sweeps downwards, it's almost like there's no AA in the NV shots.

I've also noticed that in every image the 6800 is darker, and where there's a specific light effect, like a reflection against a wall, it's much darker than on both the 5950 and 9800XT.

I hope the [H] results are due to driver limitations, because so far it seems high-fps games get much higher, while low-fps games just get a few extra frames.
For FarCry to have really smooth gameplay, 85fps should be the lower bound on the average, unless there's something like a dual framebuffer on the card.

still, early days...
 
ninelven said:
My beef is that different people like to use different settings / make different tradeoffs. The review is fine for what it is, but I still don't have any idea as to how much faster the NV40 is, what its strong and weak points are, or how it might perform in the settings I like to use :? .
Exactly. I've always enjoyed HardOCP's image quality comparisons as most reviews gloss over it, but this method of trying to find "playable" settings is extremely ill-suited for such a new product. We want to see where the bottlenecks lie.
 
FarCry is CPU limited, I think, like HL2 and UT2k4 botmatch.

I think ATI's advantage in 4x mode probably isn't so much the sample positions as the gamma correction (GC).

Eric Demers said:
If we look at MSAA on the 9700p, we did tons of image quality simulations of AA algorithms. We varied sample positions, sample numbers, etc. This made us realize that we needed to gamma correct the samples. Otherwise the intensity of the AA line would vary too much.

It's not impossible that GC can be implemented in a driver as I outlined in another thread.
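For reference, here's a minimal sketch of what "gamma correcting the samples" means in this context (an illustration of the principle, not ATI's actual hardware path): framebuffer values are gamma-encoded, so averaging them directly under-weights the brighter sample; converting to linear light, averaging, then re-encoding keeps edge intensity perceptually even, which is exactly the varying-intensity problem sireric describes.

```cpp
#include <cmath>
#include <cstdio>

const float kGamma = 2.2f;   // assumed display gamma

float toLinear(float c) { return std::pow(c, kGamma); }
float toGamma(float c)  { return std::pow(c, 1.0f / kGamma); }

// Resolve N subsamples by naive averaging of gamma-encoded values.
float resolveNaive(const float* s, int n) {
    float sum = 0;
    for (int i = 0; i < n; ++i) sum += s[i];
    return sum / n;
}

// Resolve in linear light, then re-encode for the display.
float resolveGammaCorrect(const float* s, int n) {
    float sum = 0;
    for (int i = 0; i < n; ++i) sum += toLinear(s[i]);
    return toGamma(sum / n);
}

int main() {
    // A half-covered edge pixel: two white samples, two black ones.
    float samples[4] = { 1.0f, 1.0f, 0.0f, 0.0f };
    std::printf("naive: %.3f  gamma-correct: %.3f\n",
                resolveNaive(samples, 4),
                resolveGammaCorrect(samples, 4));
    // Naive gives 0.5, which looks too dark on a 2.2-gamma display;
    // gamma-correct gives ~0.73, which displays as true half intensity.
}
```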
 
DC, as OpenGL Guy pointed out, just because it's not impossible doesn't mean it's feasible on the architecture either.

I wouldn't get my hopes up about seeing GC AA on the NV40.
 
Tom's hardware seems to indicate that FarCry is extremely CPU limited outdoors, while relatively GPU limited while indoors. Food for thought?
 
surfhurleydude said:
Tom's hardware seems to indicate that FarCry is extremely CPU limited outdoors, while relatively GPU limited while indoors. Food for thought?

Might be a case for SM3.0 and instancing (lots of trees there).
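For context on what instancing would buy here, a rough sketch of the Direct3D 9 instancing path (buffer layouts, strides, and the helper's name are illustrative assumptions; a real renderer also needs a matching vertex declaration): the tree mesh is submitted once, with per-tree data in a second stream, instead of one CPU-side draw call per tree.

```cpp
#include <d3d9.h>

// Draw 'treeCount' copies of one tree mesh in a single call, assuming an
// already-initialized device and filled vertex/index buffers.
void drawTreesInstanced(IDirect3DDevice9*       dev,
                        IDirect3DVertexBuffer9* treeMesh,     // shared geometry
                        IDirect3DIndexBuffer9*  treeIndices,
                        IDirect3DVertexBuffer9* perTreeData,  // position etc.
                        UINT vertexCount, UINT triCount, UINT treeCount)
{
    // Stream 0: the mesh, repeated once per instance.
    dev->SetStreamSourceFreq(0, D3DSTREAMSOURCE_INDEXEDDATA | treeCount);
    dev->SetStreamSource(0, treeMesh, 0, sizeof(float) * 8);    // pos+normal+uv

    // Stream 1: one element per instance, advanced once per tree.
    dev->SetStreamSourceFreq(1, D3DSTREAMSOURCE_INSTANCEDATA | 1u);
    dev->SetStreamSource(1, perTreeData, 0, sizeof(float) * 4); // xyz + scale

    dev->SetIndices(treeIndices);
    dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, 0,
                              vertexCount, 0, triCount);

    // Restore default (non-instanced) stream behaviour.
    dev->SetStreamSourceFreq(0, 1);
    dev->SetStreamSourceFreq(1, 1);
}
```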
 
ChrisRay said:
ChronoReverse said:
At least it seems nvidia is smartening up and allowing true trilinear to be used and "brilinear" to be user selectable.


Wonder if "FX" users will get this.
They don't. I've tried it; there's no trilinear optimization option. I'm working on a registry hack or something to that effect.
 