FP16 and market support

If it does not look like 4x FSAA but instead looks like 2x, how can you call it 4x? That's like stating a Ford Escort is a Cadillac......

It has been rumored that Nvidia actually detects screenshots, and the driver will render the screenshot at higher quality....

Why are nVidia's drivers encrypted? Answer: they do that so you cannot get around the cheating mechanisms that allow them to alter IQ.

Why didn't ATi encrypt their drivers?

What about the way they treated Omega? He modified their drivers so that you could choose speed or IQ, and they threatened him with legal action..... Funny that ATi supports him and encourages him to continue, knowing he helps a certain segment of the population.

I fail to see how the actions Nvidia has taken over the last few years have benefited the gaming industry...

Oh, and you never did answer my question: WHY SHOULD ATI OWNERS SUFFER BECAUSE OF NVIDIA'S MISTAKES?

Althornin said:
YeuEmMaiMai said:
Like I said, claiming 4x FSAA that looks like 2x. Every single reviewer states that NV3x's 4x FSAA looks like ATi's 2x setting....

As for the Dawn demo, what exactly are you trying to say? It has been proven that ATi renders it faster, and it looks better due to FP24 vs. mostly FP16.

What Nvidia is doing to the community is not good. Why should I, as an ATi owner, be forced to suffer because of a competitor's lousy DX9 performance and lower precision?
Yes, we all know that nVidia's AA looks inferior (over 2 samples, and it's OG, and it's not gamma corrected).
This does NOT mean that nVidia isn't doing 4xAA.
Stop ranting and raving!
 
The designation of an anti-aliasing method is usually determined by the number of samples used for the value of each anti-aliased pixel. As such, NVidia's 4x certainly is 4x -- just a relatively poor looking 4x.
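The sample-count definition above can be sketched in a few lines (illustrative Python only; the sample colors and the resolve are made up for demonstration, not how any real hardware works):

```python
# Sketch: "4x" AA means each final pixel is resolved from 4 sub-pixel
# samples, regardless of where in the pixel those samples happen to sit.

def resolve_pixel(samples):
    """Average a list of (r, g, b) sub-pixel samples into one pixel."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

# An edge pixel half-covered by white geometry over a black background:
samples = [(1.0, 1.0, 1.0), (1.0, 1.0, 1.0),
           (0.0, 0.0, 0.0), (0.0, 0.0, 0.0)]
print(resolve_pixel(samples))  # (0.5, 0.5, 0.5)
```

By this definition the *count* alone earns the "4x" label; where the samples are placed only decides how good the result looks.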
 
OK, so tell me: where exactly are they cutting corners? Because it definitely does NOT look like 4x.......

Ostsol said:
The designation of an anti-aliasing method is usually determined by the number of samples used for the value of each anti-aliased pixel. As such, NVidia's 4x certainly is 4x -- just a relatively poor looking 4x.
 
YeuEmMaiMai said:
Showcasing their supposed DX9 demo Dawn while using 99.9% FP16 (this was before the crapola hit the fan). I find it extremely funny that ATi's Radeon 9500 Pro or greater cards can run the demo faster than NV3x can, and ATi has to use a WRAPPER.
FP16 is part of the DirectX 9 specs so I don't see any problem with it being a valid DirectX 9 demo.

If the demo is mostly GPU limited (which I suspect) then a wrapper won't make a big difference. Probably less than 1%. But that ATi's hardware performs better is just well done by them.

YeuEmMaiMai said:
Claiming things like 4x FSAA that is actually more like 2x......
No, they support 4x OG FSAA, and that is 4x FSAA, no matter what you claim. ATi's 4x RG FSAA looks better at near-horizontal/vertical edges, while nVidia's 4x OG FSAA looks better at diagonal edges. So both ATi's and nVidia's FSAA implementations have their pros and cons.
 
1. FP16 is only part of the spec when the developer opts for partial precision, not when Nvidia decides to replace shaders that call for FP24 (well, FP32 for Nvidia) with hand-coded FP16 ones.... Do you see the difference?

If the demo were GPU limited, it is only GPU limited on Nvidia's hardware, since we already determined that it runs faster on ATi's. Personally, I think it is CPU limited, as my 1.8 GHz AMD 2200+ and 9500 Pro get an average of 35 fps at 1600*1200*32.

The gripe I have is that FP16 (contrary to what Nvidia claims) clearly is not sufficient; if it were, MS would have adopted it and we would not be having this discussion.

Let's take a look at Nvidia's history concerning IQ, shall we?

NV1: failure.

Riva 128: inferior IQ to every other card out there (load up Jedi Knight on it and notice it cannot render transparent textures); inferior IQ compared to the S3 ViRGE, ATi Rage 3D, Rendition Vérité, 3dfx, Matrox, etc., but it was fairly fast, so it sold.
2D was crap at anything above 1024*768.

TNT/TNT2: once again inferior IQ to every other card out there, including the Rage IIc, Savage3D/4, Rendition Vérité 2x00, Banshee, and Matrox; very fast, so it sold.

Ditto for the GF1, 2, and 3. Finally, with the GF4, they got their 2D act together, but they still have trouble with aniso and FSAA quality.

Just when it appears that the GFFX may erase Nvidia's past IQ problems, Nvidia once again shows that they will do anything to get the speed crown, including screwing over IQ.


sonix666 said:
FP16 is part of the DirectX 9 specs so I don't see any problem with it being a valid DirectX 9 demo.

If the demo is mostly GPU limited (which I suspect) then a wrapper won't make a big difference. Probably less than 1%. But that ATi's hardware performs better is just well done by them.
 
Doomtrooper said:
FP16 is part of the DirectX 9

Not all the time; only *hints* when allowed. That is why Dawn is OpenGL: their proprietary extensions allow partial precision.

That doesn't compute. DX9 allows partial precision modifiers to be placed on any instruction which would be passed through to NVidia's driver. NVidia could have coded Dawn in DX9 using _PP modifiers.

And of course, because NVidia developers prefer OpenGL coding over DX, and DX9 hadn't even shipped at the time Dawn was written.

Perhaps if you had said that NVidia's extensions allow access to FX12, predicates, pack/unpack, and a lot of other nice NV30 instructions which are not exposed in DX9 (ps_2_x didn't exist yet either), you might have something. Still, I think demo writing has more to do with programmer preference. Why doesn't Humus switch over to writing all his demos in DX9? Because he is used to OpenGL and prefers it.
 
YeuEmMaiMai said:
OK, so tell me: where exactly are they cutting corners? Because it definitely does NOT look like 4x.......

Ostsol said:
The designation of an anti-aliasing method is usually determined by the number of samples used for the value of each anti-aliased pixel. As such, NVidia's 4x certainly is 4x -- just a relatively poor looking 4x.


Oversimplified: since ordered grid patterns are cheaper in terms of HW implementation, that's virtually the only place where corners have been cut, in relative terms.

Ordered grid sampling pattern:

subpixel19.png


Sparse grid sampling pattern:

subpixel20.png


(Images borrowed from 3DCenter's anti-aliasing article; it's in German, otherwise I'd link to it and save time.)

Now if you look at the red dots in both pics on each axis (x, y), you'll see that ordered grid gives a 2*2 grid, while 4x sparse gives a 4*4 grid.

2x sparse (or mostly called RGMS)= 2*2
4x OGMS = 2*2
4x sparse = 4*4
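Counting distinct per-axis coordinates makes that 2*2 vs. 4*4 difference concrete. A quick sketch (the sample positions below are illustrative stand-ins, not the actual hardware patterns):

```python
# Edge-equivalent resolution (EER) sketch: count how many distinct x and
# y coordinates a sampling pattern covers within one pixel. Near-vertical
# and near-horizontal edges benefit from more distinct positions per axis.

def eer(pattern):
    xs = {x for x, _ in pattern}
    ys = {y for _, y in pattern}
    return len(xs), len(ys)

# 4x ordered grid: samples share rows and columns -> 2*2
ordered_4x = [(0.25, 0.25), (0.75, 0.25), (0.25, 0.75), (0.75, 0.75)]

# 4x sparse grid: every sample on its own row and column -> 4*4
sparse_4x = [(0.125, 0.625), (0.375, 0.125), (0.625, 0.875), (0.875, 0.375)]

print(eer(ordered_4x))  # (2, 2)
print(eer(sparse_4x))   # (4, 4)
```

Both patterns take exactly four samples; the sparse layout just spends them so that no two share a row or column.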

But that's concentrated mostly on the edge equivalent resolution...

ATi's 4x RG FSAA looks better at near-horizontal/vertical edges, while nVidia's 4x OG FSAA looks better at diagonal edges.

True, yet the differences on diagonal edges are not only minuscule (ignoring gamma correction), there are also fewer of them in a scene. In terms of EER, an ordered grid pattern would require 16 samples (4*4 OG) to match it on horizontal/vertical edges. Simple question: can you claim with absolute safety that, in terms of EER (a Matrox term, by the way, AFAIK), Parhelia's FAA is actually vastly superior overall (wherever the algorithm doesn't break down) to ATi's 4x sparse grid pattern?

Of course the more samples the better, but the sampling pattern is not completely irrelevant either. Still, an ordered grid implementation is a design decision; 4x samples are 4x samples, and there are really no buts about it.
 
DemoCoder said:
That doesn't compute. DX9 allows partial precision modifiers to be placed on any instruction which would be passed through to NVidia's driver. NVidia could have coded Dawn in DX9 using _PP modifiers.

The entire demo can't be in FP16 and be DX9 compliant; some shaders would have to be FP32... at least from what I understand. Minimum DX9 precision is FP24, with FP16 hints... as you know, of course. If a title/game/demo can be written in FP16 all the time... then how does the precision spec come into play?

Nvidia likes OpenGL for one reason: they can write demos that work only on their hardware. ;)
 
YeuEmMaiMai said:
OK, so tell me: where exactly are they cutting corners? Because it definitely does NOT look like 4x.......

Ostsol said:
The designation of an anti-aliasing method is usually determined by the number of samples used for the value of each anti-aliased pixel. As such, NVidia's 4x certainly is 4x -- just a relatively poor looking 4x.
Wow, your rabid fannishness causes you to 100% miss the point.
You are as bad as Chalnoth is (almost, because at least you don't pretend to be unbiased).
Does it take 4 samples?
YES.
Is it 4x FSAA? YES.
END OF STORY.

This is why reviews need to do image quality comparisons.
 
Doomtrooper said:
If a title/game/demo can be written in FP16 all the time...then how does precision spec come into play.
Isn't that obvious? If you code _PP on all your shaders, you're allowing them all to be implemented using at least FP16.

Just like putting only 16-bit art into your textures: you're purposefully deciding (as the developer) that you don't need the extra precision.
 
Doomtrooper said:
The entire Demo can't be in FP 16 to be DX9 compliant
This sounds ridiculous.

Apps/software do not have to be, need not be, and should not be termed by anyone "DX9 compliant".
 
Doomtrooper said:
If a title/game/demo can be written in FP16 all the time...then how does precision spec come into play.

Because it specifies a level of support that must be present *IF* the developer requests it. If I have a particular calculation that requires more than FP16, then I will leave off the _PP modifier, and DX9 will guarantee that it is *at least* FP24. All the _PP modifier says is "I am requesting this calculation be done at AT LEAST FP16"; leaving off the _PP modifier says "I am requesting this calculation be done at AT LEAST FP24". It's a contract between the developer and the driver.

DX9 allows framebuffers to be created with 128-bits of precision. Does that mean every game must be 128-bit? Do all textures need to be stored at 32-bits of precision?
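The cost of that contract can be shown numerically. A minimal sketch, using Python's `struct` `'e'` (IEEE half) format as a stand-in for a GPU's FP16 registers; FP24 has no host-side equivalent, so Python's native double stands in for "full precision" here:

```python
import struct

def to_fp16(x):
    """Round a Python float to the nearest IEEE half-precision (FP16) value."""
    return struct.unpack('<e', struct.pack('<e', x))[0]

# FP16 has a 10-bit mantissa: near 1.0 the spacing between representable
# values is 2**-10 (~0.001), so a smaller increment is rounded away.
print(to_fp16(1.0 + 1e-4))         # 1.0 -- the increment vanished
print(to_fp16(1.0 + 1e-4) == 1.0)  # True

# At higher precision the same increment survives:
print(1.0 + 1e-4 == 1.0)           # False
```

This is the practical difference the _PP modifier encodes: a calculation whose intermediate deltas fall below FP16's local spacing silently loses them, while the FP24-or-better guarantee preserves them.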
 
Reverend said:
Apps/software do not have to be, need not be, and should not be termed by anyone "DX9 compliant".

Game titles have had DX7/DX8/DX9 requirements for some time. If a game title were written with HDR lighting, I'm sure the developer would require the consumer to have a DX9 compliant card.
 
DemoCoder said:
Because it specifies a level of support that must be present *IF* the developer requests it. If I have a particular calculation that requires more than FP16, then I will leave off the _PP modifier, and DX9 will guarantee that it is *at least* FP24. All the _PP modifier says is "I am requesting this calculation be done at AT LEAST FP16"; leaving off the _PP modifier says "I am requesting this calculation be done at AT LEAST FP24". It's a contract between the developer and the driver.

Well, you are saying what I want to say, DemoCoder, just expanding on my thought: if a game developer requires full precision for HDR lighting, for example, or for render targets.
 
They'd better be, if they are using the DX SDK to develop the game.... I feel pity for those people who buy a $500 Nvidia DX9 video card and are not getting it, since the cards are only good for FP16 performance-wise. This single fact makes the card not worth the $500 they are asking for their top-of-the-line model...... It's like paying for a Honda Accord but instead getting a Ford Escort.

Reverend said:
Doomtrooper said:
The entire Demo can't be in FP 16 to be DX9 compliant
This sounds ridiculous.

Apps/software do not have to be, need not be, and should not be termed by anyone "DX9 compliant".
 
An app/sw requiring a DX9 compliant card to run does not make it "DX9 compliant" software. The day software is forced to be "compliant" with an API's specifications is the day ISVs will cough blood.

I'm just being nit-picky with the use of words and terms of course but I thought it was amusing to read... please ignore me and continue with this amusing thread.
 
Reverend said:
I'm just being nit-picky with the use of words and terms of course but I thought it was amusing to read... please ignore me and continue with this amusing thread.

You're a twisted man if this thread amuses you... ;)
 