I have a Radeon

Fodder said:
Ah, the tenth thread but the same old arguments. :LOL:

Here's some spice for the mix.

It appears that DST provides superior IQ when the shadows are rendered correctly, while when stepping/pixelation occurs the inferior grainy/fuzzy shadow edges help mask this, hence providing the illusion of better IQ. Therefore, it is unfair that non-NVIDIA/DST cards aren't required to do the requisite work to achieve the same rendering result as NVIDIA/DST cards, meaning NVIDIA are in fact at more of a disadvantage with DST enabled.

Discuss.

Well, NVIDIA can turn off DST (which is not part of DX9) in 3DMark 2005, do less work as you claim, and still end up with a lower score.

But from what I've read, DST is a mixed bag (hmm, what other things have I heard this about?): sometimes it provides better image quality and sometimes worse image quality.
 
jvd said:
Well, NVIDIA can turn off DST (which is not part of DX9) in 3DMark 2005, do less work as you claim, and still end up with a lower score.
Please don't tell me you've managed to post in every one of these threads yet still don't understand how DST works?
 
Fodder said:
jvd said:
Well, NVIDIA can turn off DST (which is not part of DX9) in 3DMark 2005, do less work as you claim, and still end up with a lower score.
Please don't tell me you've managed to post in every one of these threads yet still don't understand how DST works?

No, I do know how it works.

In 3DMark 2005 you can disable DST, can you not? (I know the answer is yes.)

Doing so makes the GeForce 6 series lose 5-15% (based on Reverend's numbers).

You claim that DST provides the illusion of better IQ; you then go on to say that non-DST cards aren't made to increase their image quality to equal that image quality. You then go on to say that NVIDIA is at even more of a disadvantage with DST enabled.

Which is why I said they can turn it off, do the same amount of work as other vendors are doing, and get a lower score (5-15%).

However, it's debatable that DST is providing better image quality; from reviews around the net, in some places it's better quality and in some places it's worse quality.

The main question is, if it gives different IQ output, why was it included in the benchmark? The only reason I can come up with is that without it the GeForce 6 series cards would be at a disadvantage, or a larger one than they are at now (in terms of score differences).
 
jvd said:
You claim that DST provides the illusion of better IQ
No I didn't.
jvd said:
However, it's debatable that DST is providing better image quality; from reviews around the net, in some places it's better quality and in some places it's worse quality.
I think you'll find it is as I stated: generally DST only appears worse when the image is already ruined by shadow-map artifacting.
jvd said:
The main question is, if it gives different IQ output, why was it included in the benchmark?
Flip that one on its head. Why does the non-DST path give a different IQ output? Given that DST is in the benchmark, why are non-DST cards allowed to render an easier, inferior image? SM2.0 cards are working harder to match the SM3.0 result, so why shouldn't non-DST cards work harder to match the DST result?


I'm just trying to demonstrate an alternative to your own blinkered opinion here. :)
 
jvd wrote:
You claim that DST provides the illusion of better IQ

No I didn't.

That is how I read this:

It appears that DST provides superior IQ when the shadows are rendered correctly, while when stepping/pixelation occurs the inferior grainy/fuzzy shadow edges help mask this, hence providing the illusion of better IQ

Flip that one on its head. Why does the non-DST path give a different IQ output?
Because they aren't the same paths. Just like 3Dc vs DXT5 if it was included; the point is, it isn't.

Given that DST is in the benchmark, why are non-DST cards allowed to render an easier, inferior image?

Well, by NVIDIA's own benchmarks it's not the easier path, and by reviews on the web neither is inferior. DST has its advantages in some areas while it looks worse in other areas. Which is why I don't get your point here, as you were just above trying to claim that you never said DST gives better image quality. So wtf are you talking about, or do you have no clue yourself?

SM2.0 cards are working harder to match the SM3.0 result
How do you know that? Can you show me proof?


why shouldn't non-DST cards work harder to match the DST result

That is the whole point. DST should never have been included in the benchmark, since it doesn't give the same image quality as the DX9 implementation. Since this is a DX9-only path, only the other methods that are included in DX9 should have been used.

The only reason DST was used is that it gives NVIDIA a speed increase.




I'm just trying to demonstrate an alternative your own blinkered opinion here.

No you're not; you're channeling John Kerry and flip-flopping throughout the thread.

In this very post you try to claim that you never said DST gives better image quality, and then later in the same post you claim it does and that non-DST cards should have to work harder to equal it.

Make up your mind.


The whole point of the argument is that DST is not part of DX9; it does not give the same output as the DX9 way of doing things, so it shouldn't have been included. And since it was, 3Dc (which also isn't part of DX9 and doesn't give the same output as the DX9 way of doing things) should have been included too. Since it wasn't, that shows bias towards NVIDIA, in that a feature that currently only benefits them was added to the benchmark, thus stopping an apples-to-apples comparison.
 
ChrisRay said:
Scali, your problem isnt your opinions, Its the way you go about discussion, You are condescending towards other users, and you treat them terribly,

To be fair, he isn't the only one here who does that. I can think of someone else who claims to be impartial, yet somehow manages to act less than maturely while promoting ATI's side of the story.

And no, it's not digi ;)

Here's a suggestion for all concerned - just express the facts and stop taking any disagreements as a personal attack.

[edit] this is not meant as an attack on scali or anyone else - just an observation on what seems to be getting personal between some people.
 
jvd said:
That is how I read this:
It appears that DST provides superior IQ when the shadows are rendered correctly, while when stepping/pixelation occurs the inferior grainy/fuzzy shadow edges help mask this, hence providing the illusion of better IQ
Read it carefully. I said the grainy/fuzzy shadow edges of the non-DST cards give the illusion of better image quality by blurring out the shadow map artifacts.
jvd said:
Well by nvidia's own benchmarks its not the easier path
Because NVIDIA use DST. For a non-DST card to match a DST card's output, it has to do more work.
jvd said:
you were just above trying to claim that you never said dst gives better image quality.
No I wasn't, pay attention.
jvd said:
SM2.0 cards are working harder to match the SM3.0 result
How do you know that ? Can you show me proof ?
Too easy. The Geforce 6 loses performance when asked to run the SM2.0 codepath.
jvd said:
DST should never have been included into the benchmark . Since it doesn't give the same image quality as the dx 9 implementation .
Now I've been led to believe that non-DST cards could give the same IQ as DST cards, but Futuremark chose not to require it, as it would negatively impact their scores. Would you like to dispute that?
jvd said:
The only reason why dst was used was because it gives nvidia a speed increase .
Replace "NVIDIA" with "cards that support DST" and you might well be onto something.
jvd said:
No you're not; you're channeling John Kerry and flip-flopping throughout the thread.

In this very post you try to claim that you never said DST gives better image quality, and then later in the same post you claim it does and that non-DST cards should have to work harder to equal it.

Make up your mind.
Or, you could try making a serious attempt to read and comprehend what I've posted, and see that I never claimed such a thing. :?
jvd said:
The whole point of the argument is that DST is not part of DX9; it does not give the same output as the DX9 way of doing things.
Indeed, it gives a better output, one that non-DST cards could apparently also achieve at the expense of some 3DMarks.
jvd said:
showing bias towards NVIDIA
And for you, that's what it all comes down to no doubt.
 
jvd said:
The whole point of the argument is that DST is not part of DX9; it does not give the same output as the DX9 way of doing things, so it shouldn't have been included. And since it was, 3Dc (which also isn't part of DX9 and doesn't give the same output as the DX9 way of doing things) should have been included too. Since it wasn't, that shows bias towards NVIDIA, in that a feature that currently only benefits them was added to the benchmark, thus stopping an apples-to-apples comparison.

ummmm, "DST" is a texture format. Not something that you flip on or off.

"dx 9 way of doing things" ??
 
Read it carefully. I said the grainy/fuzzy shadow edges of the non-DST cards give the illusion of better image quality by blurring out the shadow map artifacts.
You should make it read clearly then, as that is not the way it reads at all.

Because NVIDIA use DST. For a non-DST card to match a DST card's output, it has to do more work.
But NVIDIA doesn't have to use DST. In fact, when using what you call a lesser output (which above you claim is sometimes worse, but whatever) it is up to 15% slower.

No I wasn't, pay attention
Wtf, you just did it again up above; you claim that a non-DST card should have to do more work to match it.

Too easy. The Geforce 6 loses performance when asked to run the SM2.0 codepath

And so why are some cards allowed to do partial precision? Shouldn't they be forced to do full precision to match the work done by other cards?

It's a simple answer: no, they shouldn't, since partial precision, SM2.0, SM2.0a, SM2.0b, and SM3.0 are all part of the DX9 spec, while DST IS NOT PART OF the DX specs.

Replace "NVIDIA" with "cards that support DST" and you might well be onto something.
Show me a non-NVIDIA card that gains speed with DST.

Or, you could try making a serious attempt to read and comprehend what I've posted, and see that I never claimed such a thing.
I have; you've done it again in this very post.

Indeed, it gives a better output, one that non-DST cards could apparently also achieve at the expense of some 3DMarks.
Here you go channeling Kerry again.

You go on to say in the first quote of this post that it is not giving better IQ in some cases, and once again you are saying it gives better output. Will you make up your mind?

Also, for a DX9 benchmark, why would they force cards that are running within DX9 specs to match something that is not part of the DX9 specs? That is no longer a DX9 benchmark.

Also, since 3Dc gives better image quality than DXT5, why not add 3Dc in and make the other cards do more work to match the quality?

And for you, that's what it all comes down to no doubt.

Heh, if you say so, big guy.



I have been voicing my problems with 3DMark for a very long time, even when ATI was winning.


First they introduced pixel shader 1.0 and VS 1.0 tests when the GeForce 3 came out, while it was the only card for a very long time to support them, but they didn't add in a PS 1.4 test when the Radeon 8500 came out (a test that affected scores).

Why the difference? Why did one card get an artificial score boost on the test while the other did not? It's not like PS 1.4 wasn't used by other cards; in fact even some DX9 cards like the FX were forced to run it all the time.

3DMark 2003: there was huge cheating going on by NVIDIA once they started to lose the tests, and Futuremark rolled over and died on the issue.

3DMark 2005: they started to include non-DX, IHV-specific tech in the tests on by default, but wouldn't add other IHV-specific tech into the tests at all, let alone on by default.

The last 3DMark that was useful was '99, and hell, it was so long ago that there may have been problems with it too.
 
For some reason this thread reminds me how, some time ago, there were arguments about whether it's OK to use ATI's instancing... funny how things change.

I really don't see the reason why some people think that FM favors NVIDIA. Come on, which card has the top result in 3DMark05 right now?
 
Scali said:
The card I am currently using in my own PC at home is a Radeon 9600Pro 128 mb.
I am still quite happy with it, and am not considering an upgrade yet.

Over the years I've recommended a lot of Radeons to friends and family. The first would probably have been a Radeon 8500, to my brother, who's used it to great enjoyment for many years. I have also used it myself from time to time to test some code, and I was impressed with the speed and quality of the card.

I have actually recommended a friend of mine to buy a Radeon 9600Pro earlier today.

In short: I have nothing against ATi (or any other IHV for that matter, except for VIA/AMD who made my life hell), so would all the people who think I'm an nvidiot kindly get a life and stop bothering me?!
Thank you.

I always knew you were red!
 
jvd said:
Yes, but it's not out yet and we have no time frame for a release.

So you expect FM to wait until the videocard is released before they can release their product?

So 6 games in 5 years is tech taking off? But 5 games in 1 year (3Dc) is not taking off? How do you figure?

It's quite simple, DST was ahead of its time. It was introduced before shadowmapping was a feasible shadowing method. In fact, it was introduced before any kind of self-shadowing method was feasible. If you notice, all those 6 games were released in a short timeframe.

No, you're wrong on a lot of points.

But you are unable to name any.

Perhaps you should go back and read your posts, including the one where you suggested that ATI is trying to trick devs with 3Dc.

Richard Huddy's response to that, speaks for itself. It was an unqualified statement.

I don't see why it wouldn't. In Half-Life 2 the current info is that 3Dc improves performance slightly and increases IQ.

I explained in great detail why it wouldn't.
Just because it works in HalfLife2 doesn't mean it works everywhere.

Futuremark claims that 3DMark 2005 is a 100% DX9 benchmark, which means it should only test DX9 features. Once it adds in 3Dc or DST it's no longer a 100% DX9 benchmark. It may support 100% of DX9, but it doesn't benchmark 100% of DX9.

This is your interpretation. The way I see it, it is still 100% DX9, it just has some extra features as well (which you can disable).

As I said in another thread: if 3DMark 2005 used OpenGL to run and it claimed to be a 100% DX9 benchmark, would you still believe it?

Since there would be 0% DX9 code in that case, no. Is there 0% DX9 code in 3DMark05?
In fact, I don't know if you understand how DST works, but it doesn't require any additional non-API calls. Even the DX9 caps viewer simply reports all of DST's features.

So it is only being used by one IHV. If the second has no products out, then there is no reason to discuss this; sometime in the future they could release the product, but maybe they cancel it, then what? I can say the same thing about 3Dc: sometime in the future another company will release a card using it, but for now it doesn't matter, because it's not out yet.

But when the card does get released, you will have to adjust your statements. Since I don't have any doubt that the card will get released, I don't see why you should even bother to discuss it at this point. The card will get there eventually.

Why one is included and not the other doesn't make sense .

I said that many times before. 3DMark05 chose not to use many features, both within the spec and without.
The same goes for many games. Why does Doom3 use various NV-extensions but not 3Dc?
Perhaps because there were good reasons to use NV-extensions, but no good reasons to use 3Dc?
It's not about using every extension just because it makes some people on forums happy.

I find it telling you still won't answer this.

I find it complete nonsense actually. Did it ever occur to you that in games like, say FarCry, you get quite similar results, even though DST is used there? Perhaps 3DMark05 is just actually showing gaming performance instead of playing favourites?

I also want to hear your logic: in the past 4 or 5 years since NVIDIA supported DST it was only used in 6 games, but suddenly it's the wave of the future?

I said that many times before. You need a lot of fillrate and video memory to use shadowmapping. DST wasn't used because shadowmapping itself wasn't used. Now shadowmapping is used, and DST is used as well. Simple, no?
 
Reverend said:
Scali, what is the real point in starting this thread? Is this necessary?

The real point was to get people to see that I am actually still a fan of ATi's products.
There are only two things that might make people think otherwise, and that is that I find the way that ATi promotes 3Dc in the press a bit questionable. Then again, Richard Huddy has admitted the same in the email to DaveBaumann.
And I find NVIDIA's DST a good technology and like many game developers I find it very logical that it is used even though it's not entirely within DX9 specs.

At this point, regarding Fodder's and jvd's debate, I would like to point out that you can also use DST along with the 4-pointsample method that the non-DST path applies. Then you get identical quality, and NVIDIA would still be at a slight advantage, because DST gets double z-fillrate. It would also mean that on the other IHVs' hardware there'd be no image quality differences, because you would pointsample on all cards.
Conversely, you can also modify the 4-pointsample method to get the same quality as DST with bilinear filter, but this would make non-DST hardware very slow.

I assume Futuremark chose their approach because games like FarCry use the exact same approach.
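The two filtering approaches described above can be sketched numerically. This is a toy model for illustration only: the function names, tap values, and the lit-when-stored-depth-is-greater convention are all invented for the example, not taken from any real shader or driver.

```python
# Toy model of the two shadow-map filter modes discussed in this thread.
# All names and values are illustrative, not from any actual GPU.

def pcf_pointsample(depths, ref, weights=(0.25, 0.25, 0.25, 0.25)):
    """Shader-side PCF over 4 taps: shadow-test each stored depth against
    the receiver depth 'ref', then blend the 0/1 results with 'weights'.
    Equal weights = the plain 4-pointsample average; bilinear weights
    reproduce what the DST hardware path gives for free."""
    return sum(w * (1.0 if d >= ref else 0.0) for w, d in zip(weights, depths))

def dst_bilinear(depths, ref, fx, fy):
    """DST-style hardware PCF: each tap is compared first, then the four
    binary results are bilinearly blended using sub-texel fractions."""
    weights = ((1 - fx) * (1 - fy), fx * (1 - fy), (1 - fx) * fy, fx * fy)
    return pcf_pointsample(depths, ref, weights)

taps = (0.25, 0.75, 0.25, 0.75)  # two occluding and two non-occluding depths
ref = 0.5                        # receiver depth

# Plain 4-tap average: only five possible shades (0, .25, .5, .75, 1),
# hence the grainy shadow edges discussed above.
print(pcf_pointsample(taps, ref))           # 0.5

# Bilinear-weighted compares: a smooth gradient as fx sweeps from 0 to 1.
print(dst_bilinear(taps, ref, 0.25, 0.5))   # 0.25
```

Note that with fx = fy = 0.5 the bilinear weights collapse to the equal-weight case, which is why a weighted 4-tap shader could match DST output exactly, just at extra cost on non-DST hardware.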
 
Fodder said:
Now I've been led to believe that non-DST cards could give the same IQ as DST cards, but Futuremark chose not to require it, as it would negatively impact their scores. Would you like to dispute that?

Then you would be asking boards that don't support the DST path to emulate a feature that is outside of DirectX; the non-DST path is the way to do shadows according to the specifications of DirectX.
 
DaveBaumann said:
Then you would be asking boards that don't support the DST path to emulate a feature that is outside of DirectX; the non-DST path is the way to do shadows according to the specifications of DirectX.

I disagree. The way NVIDIA implements filtering is a valid way of filtering shadowmaps.
There's nothing in the specifications of DirectX that says you can't implement this method of shadowing, or that you have to use the method that Futuremark chose to use.
At best, it is *A* way to do shadows in DirectX, certainly not *THE* way.
 
HEY! THIS IS THE GENERAL FORUM, IT'S ONLY MEANT FOR ME AND MY GAY JOKES, IF YOU WANT TO BITCH ABOUT ATI AND NVIDIA AND WHATEVER ELSE, YOU HAVE 829 OTHER FORUMS ON THIS VERY SITE!!!!!!!!

/runs
 
The way NVIDIA implements filtering is a valid way of filtering shadowmaps.

It may be valid, it's just not defined by the DX documentation. As far as we've discussed with one developer, the filtering that's applied to a D24S8 in these circumstances is very different from other hardware, and is not documented within DirectX.
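Why leaving this undefined matters can be shown with a small toy model (invented names and values, not any specific hardware's behaviour): given the same four taps at a shadow edge, "filter the raw depths then compare once" and "compare each depth then filter the results" legitimately produce different pixels.

```python
# Toy model: ambiguity in filtering a sampled depth texture.
# "Filter then compare" treats the depth surface like an ordinary texture;
# "compare then filter" is per-tap PCF.

def filter_then_compare(depths, ref):
    # Average the raw depths first, then do one shadow test:
    # the edge stays a hard 0/1 step.
    avg = sum(depths) / len(depths)
    return 1.0 if avg >= ref else 0.0

def compare_then_filter(depths, ref):
    # Shadow-test each tap, then average the 0/1 results:
    # the edge becomes a fractional gradient.
    return sum(1.0 if d >= ref else 0.0 for d in depths) / len(depths)

taps = (0.25, 0.75, 0.25, 0.75)  # a shadow edge: half the taps occlude
print(filter_then_compare(taps, 0.5))   # 1.0
print(compare_then_filter(taps, 0.5))   # 0.5
```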
 