GFFX Anti-Aliasing with FSAA Tester

Chalnoth said:
One thing you have to keep in mind is that Fablemark probably won't use the two-sided stencil acceleration available in the Radeon 9700 and GeForce FX.

Correct, since it's a DX8 app and double-sided stencil is not available. But FableMark is fully fill-rate limited (stencil fill rate, most likely) and not vertex/geometry-throughput limited, hence double-sided stencil will make no difference at all to the final score.

K-
 
I've modified Brent's "applet" to fix some JavaScript errors (it didn't preload), and also to make it easier to see what's really going on via a zoom view. You can check it out here.

I've left the page format and wording as is (redid the HTML though), to give proper respect to the original author :)
 
Why Horizontal Supersampling instead of Vertical Supersampling?

2 Words: Texture Cache

To be a little more specific: when doing horizontal supersampling vs. vertical on a scanline renderer such as the NV30, the bandwidth requirements for texture reads will be reduced, because the texels will already be in the texture cache.

For small triangles there may not be a significant difference between vertical and horizontal, because the texels are quite likely to still be in the texture cache. For a larger triangle, however, it's quite possible that the texels will already have been discarded before rendering of the triangle's next scanline begins.

Of course, I could be completely wrong, since they might be (and probably are) rendering multiple scanlines at a time. Still, there is always the chance that supersampling vertically will impact the texture cache more than horizontally.
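The scanline argument above can be sketched with a toy LRU texel cache. This is an illustrative model only: the cache size, point sampling, and the 1:1 texel-to-pixel mapping are my assumptions, not actual NV30 details.

```python
from collections import OrderedDict

def render_misses(width, height, ss_x, ss_y, cache_size):
    """Count texel-cache misses when rasterizing a width x height pixel
    region in scanline order with ss_x by ss_y supersampling, assuming a
    simple LRU texel cache and one point-sampled texel per subsample."""
    cache = OrderedDict()          # texel address -> None, kept in LRU order
    misses = 0
    for sy in range(height * ss_y):            # subsample rows, top to bottom
        for sx in range(width * ss_x):         # subsamples across the row
            texel = (sx // ss_x, sy // ss_y)   # subsamples of a pixel share a texel
            if texel in cache:
                cache.move_to_end(texel)       # hit: refresh LRU position
            else:
                misses += 1
                cache[texel] = None
                if len(cache) > cache_size:
                    cache.popitem(last=False)  # evict least recently used
    return misses

# 256-pixel-wide span, 32-texel cache: far smaller than one texel row.
w, h, c = 256, 4, 32
print(render_misses(w, h, 2, 1, c))  # horizontal 2x: second subsample hits -> 1024
print(render_misses(w, h, 1, 2, c))  # vertical 2x: row evicted before repeat -> 2048
```

With a cache smaller than one texel row, the horizontal case misses only once per texel (the neighboring subsample hits immediately), while the vertical case misses on every subsample because the whole row has been evicted before its repeat pass; give the cache a full row (e.g. `cache_size=512` here) and the two come out equal.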
 
It's rather clear that these "new" modes totally suck, and were slapped together as an afterthought...

Boy... if antialiasing is important to somebody, it looks like you should steer clear of the FX. That page at www.ratchetspage.com really shows the massive superiority of ATI over NVIDIA. I mean, in comparison, the NVIDIA implementations are just about worthless. Those modes above 4 samples are pointless, other than as something marketing can use.
 
I still don't get why NVIDIA replaced the "old" xS modes (2x2 OGSS + 2x RGMS and 1.5x2 OGSS + 2x RGMS) with these new OG modes. Especially the new 8xS mode is a lot worse than the old one (which looks a bit better than ATI's 4xAA).
Could it be because of OpenGL? Do these new modes work under OpenGL (because the others don't)?
 
They've said 4XS and 6XS aren't compatible with OpenGL's requirements for combining filtering modes.
 
Typedef Enum said:
Boy...If Antialiasing is important to somebody, it looks like you should steer clear of the FX.

Sad but true. As I said before, I was really hoping AA performance/quality would be the FX's trump card.

When watching the NV30 launch, and being told the NV30 was the first project that combined the efforts of Nv and 3dfx, I had high hopes for AA on the FX. After all, Mr T-Buffer (aka Gary Tarolli) is a 3D architect at NV.

Wtf did 3dfx engineers do on this project? :?
 
WOW! ATI smacks the low-down on AA here. Performance numbers with equivalent IQ would be hard to do, since ATI's 4x is better than anything NVIDIA is showing.

Are there any alpha tests that can be done? There has to be something so NVIDIA doesn't go home crying :cry:.
 
Fuz said:
Wtf did 3dfx engineers do on this project? :?

As I posted in another thread, all I can see that 3DFX had any input over was the lateness of it. There is no 3DFX tech in this card whatsoever. It is nothing more than a continuation of standard NVIDIA tech with DX9 support... what a waste!
 
Testiculus Giganticus said:
What would you regard as being 3dfx tech?

Can you tell me what 3DFX tech you see in the GFFX? Quality FSAA? High bandwidth? T-buffer or anything like it? Anything from Rampage and/or Sage, Fear or Gigapixel? Please let me know, anyone........
 
The 3DFX excuse needs to die. 3DFX never used MSAA, nor made the choice for .13 :rolleyes:

NVIDIA is making the choices for their technology, and looking at the screenshots shows the FSAA is basically the same modes implemented in the GeForce 3 and 4 and exposed with RivaTuner. Is 3DFX responsible for that too :?:

No need to make excuses by accusing a company long gone; they don't deserve to be constantly insulted.
 
Doomtrooper said:
The 3DFX excuse needs to die. 3DFX never used MSAA, nor made the choice for .13 :rolleyes:

NVIDIA is making the choices for their technology, and looking at the screenshots shows the FSAA is basically the same modes implemented in the GeForce 3 and 4 and exposed with RivaTuner. Is 3DFX responsible for that too :?:

No need to make excuses by accusing a company long gone; they don't deserve to be constantly insulted.

Doom, I have to think the end product would have been worlds better, at least in FSAA, if 3DFX tech had been used...

So I don't view the inclusion of 3DFX tech as an excuse; however, maybe the lack of 3DFX tech can be used as an excuse...
 
Crusher said:
They've said 4XS and 6XS aren't compatible with OpenGL's requirements for combining filtering modes.

IIRC NVIDIA said that at least 8xS will work with OpenGL applications. So maybe that's the reason for choosing the worse sampling pattern. I doubt the better 8xS mode will work with OpenGL, because its pattern is very similar to 4xS's (2x2 OGSS instead of 1x2 OGSS; the rest is the same). I'm just speculating here, though...
 
Brent said:
Can't do any more with the GFFX right now. I had to give the GFFX back to Kyle for a few days so he can prepare for the TechTV bit on Tuesday...

I will be getting it back, however, so I can finish the articles I am working on... when he is done with it.

So for right now, no GFFX in my possession.

If he doesn't blow it up!!! You guys are famous for doing that with new equipment.. 8)
 
You didn't seem to get my point. People think that 3D card making is a Lego game or something of the sort. You think that when NVIDIA acquired 3dfx they got some new pieces that could be taken and fitted accordingly. That is not the case. 3dfx's input was in the way some things, not transparent to the end user, were/are made. Period.
 
CeiserSöze said:
IIRC nvidia said that at least 8xS will work with OpenGL applications.
Can we clear up a common misconception, please? NVIDIA has never stated anything about an 8xS mode.

NVIDIA has documented 6xS and 8X modes for GeForce FX. The "8xS" name came from users after seeing the new modes in RivaTuner for GeForce 3/4, etc. But NVIDIA has not talked about an 8xS mode once, per se.

They have mentioned that 8X will be a mixed mode, but it's obviously not a mixed mode in the same fashion that 4xS/6xS is, since it's supposed to work with OpenGL. With a mixed mode you have two pixel pipelines running the samples per pixel; however, I assume that with 8X each of those pipes must take the texture sample in the same position for it to be able to operate under OpenGL with multisampling.
 
DaveBaumann said:
CeiserSöze said:
IIRC nvidia said that at least 8xS will work with OpenGL applications.

Can we clear up a common misconception please. NVIDIA has never stated anything about an 8xS mode.

And how would you explain this ;)
[screenshot: driver2.gif]

But to get to the bottom of this:
There are currently two 6x modes and two 8x modes available in the drivers. Two are officially supported by NVIDIA and two aren't.

The modes supported by NVIDIA are:
- 4x OGMS combined with 1.5x1 OGSS (let's call this one 6x to keep it simple)
- 4x OGMS combined with 2x2 OGSS (= "8x")

The modes not supported by NVIDIA but present in the drivers, unlockable with certain tweak tools and working on NV2x cards, are:
- 2x RGMS combined with 1.5x2 OGSS (let's call this one 6xS)
- 2x RGMS combined with 2x2 OGSS (= 8xS)

The latter two are obviously the better modes (quality-wise), so I'm just searching for a reason why NVIDIA went with the other two. Is it because those may work with OpenGL while 6xS and 8xS won't? Is it a driver problem? Or even a hardware bug (extremely unlikely, IMHO)?
I'm just searching for answers here... :)
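For reference, the raw sample count implied by each combination is just the multisample count times the supersample grid. A quick sketch (mode names and factors are taken from the list above; marketing names don't always match the arithmetic, and the OGMS/RGMS distinction affects quality, not the count):

```python
def samples_per_pixel(ms_samples, ss_x, ss_y):
    """Total samples per pixel = multisamples x (SS grid width x height)."""
    return int(ms_samples * ss_x * ss_y)

# Combinations as listed in the post above.
modes = {
    '6x  (4x OGMS + 1.5x1 OGSS)': samples_per_pixel(4, 1.5, 1),
    '"8x" (4x OGMS + 2x2 OGSS)':  samples_per_pixel(4, 2, 2),
    '6xS (2x RGMS + 1.5x2 OGSS)': samples_per_pixel(2, 1.5, 2),
    '8xS (2x RGMS + 2x2 OGSS)':   samples_per_pixel(2, 2, 2),
}
for name, n in modes.items():
    print(name, '->', n, 'samples/pixel')
```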
 