OMG HARDOCP REVIEW OF UT2003 AND FILTERING

andypski said:
[Sarcasm]

Turns round to rest of ATI driver guys...

"It's official - it's now open season on image quality guys - [H] says so."

Let's start hacking it up now, and leave no pixel unpolluted.


[/Sarcasm]
[Sarcasm continued]
"LOD-bias?"
"Check."
"Texture compression?"
"Check."
"Clipping planes?"
"Check."
"Trilinear disable?"
"Check."
"Shader replacement?"
"Check."
"Ship it."
[/Sarcasm]
 
I was amused by this article. Although he does point out that nVidia is taking filtering shortcuts that look silly (at least to me), like on the 8500 (though I don't know if you can call them shortcuts?), he just pussyfoots around the fact that nVidia takes control of IQ away from the user... or steals it, rather. That, and cheating... aren't those bad?

nelg said:
A few days ago [H] had a "serious" chat with Nvidia's top brass. Today they publish an article that vindicates Nvidia's behavior. What did they talk about then?
When they would go sailing on their new boat and get lots of brown stuff on each other's noses. ;)

I just found the overall feel of the "research" to be spotty when compared to other work that [H] has published.

I guess I should be glad they are informing their readers. :rolleyes:
 
Kalbaz said:
3) Tseng-Labs release the ET4000000000 and usurp both power-player positions in a single masterstroke of engineering genius, and [H] is shut down because they didn't know how octlinear filtering worked and no one visited them anymore

(sorry got carried away)

hehe... ATI bought Tseng-Labs (tech and engineers) many years ago... :)
IIRC, that was, among other things, the foundation of the 2D core of ATI chips through the Rage days...
 
Slides said:
Perhaps I'm not understanding the issue, but at least from the screenshots, the IQ seems comparable in the game. Can someone explain what is wrong with what Nvidia is doing in UT2003?

The problem is that IQ is subjective, so while the reviewer may honestly be unable to tell a marked difference between bi- and trilinear in-game, UT2003's colored mip-map option clearly shows that one board is performing one form of texture filtering and the other a different quality of filtering. Whether or not it's even noticeable in-game is completely irrelevant if you're going to compare benchmark scores between these two products, because those colored mip-maps undeniably show that you're not conducting your benchmarks with apples-to-apples settings. It is truly that simple.
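
To make the colored mip-map point concrete, here's a minimal sketch (my own illustration in Python, not UT2003's actual code; the specific mip colors are made up) of why the test works: with each mip level painted a distinct flat color, bilinear filtering snaps between colors at mip boundaries, while trilinear blends them into a smooth ramp.

```python
# A minimal sketch, not UT2003's actual code: each mip level gets a distinct
# flat color (these particular colors are hypothetical). Sweeping the level
# of detail (LOD) shows why the debug view exposes the filtering mode.

MIP_COLORS = [
    (255, 0, 0),    # mip 0: red
    (0, 255, 0),    # mip 1: green
    (0, 0, 255),    # mip 2: blue
    (255, 255, 0),  # mip 3: yellow
]

def bilinear(lod):
    """Bilinear samples only the nearest mip level -> hard color bands."""
    return MIP_COLORS[min(int(lod + 0.5), len(MIP_COLORS) - 1)]

def trilinear(lod):
    """Trilinear blends the two surrounding mip levels -> smooth gradients."""
    lo = min(int(lod), len(MIP_COLORS) - 1)
    hi = min(lo + 1, len(MIP_COLORS) - 1)
    f = lod - lo
    return tuple(round((1 - f) * a + f * b)
                 for a, b in zip(MIP_COLORS[lo], MIP_COLORS[hi]))

# LOD rises with distance from the camera; bands vs. ramps are obvious here
for i in range(13):
    lod = i * 0.25
    print(f"lod={lod:4.2f}  bilinear={bilinear(lod)}  trilinear={trilinear(lod)}")
```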

And in spite of Kyle's denials, his articles are most definitely damaging the 9800 Pro's standing as a competitor to the 5900 Ultra. I can only hope ATI continues to take the high ground and weather such sophomoric and unprofessional articles.
 
hehe... ATI bought Tseng-Labs (tech and engineers) many years ago...

really? didn't know that... so maybe they are renaming the ET4000000000 to the R420? :LOL:

Whether or not it's even noticeable in-game is completely irrelevant if you're going to compare benchmark scores between these two products, because those colored mip-maps undeniably show that you're not conducting your benchmarks with apples-to-apples settings. It is truly that simple.

I really wish it were that simple for [H], but it seems their eyesight is failing them, or perhaps they're colour blind and the mipmaps aren't apparent to them? :)

What raises my ire is the fact that he admits, to a degree, that there is a difference in quality, but he doesn't care because it's not noticeable whilst you are playing the game at full speed.
 
Kalbaz said:
What raises my ire is the fact that he admits, to a degree, that there is a difference in quality, but he doesn't care because it's not noticeable whilst you are playing the game at full speed.
I think you hit the nail on the head there; that's the whole crux of my problem with [H]'s reporting on this.

They acknowledge the problem then say, "Meh, so what?". :(
 
digitalwanderer said:
Kalbaz said:
What raises my ire is the fact that he admits, to a degree, that there is a difference in quality, but he doesn't care because it's not noticeable whilst you are playing the game at full speed.
I think you hit the nail on the head there; that's the whole crux of my problem with [H]'s reporting on this.

They acknowledge the problem then say, "Meh, so what?". :(
Didn't mention it in my 5200 review because I wasn't comparing it to anything. Plus, on slow budget cards, it's *not* a bad thing. Should it still be user-configurable? Yeah, of course. But probably 99% of people who have 5200s would use the UT2003 cheat/optimization/thingy.
 
What I find kind of bizarre is that [H] still don't seem to understand why ATi had a gripe with their benchmarking in the first place, despite the marked performance differences we've seen in UT2003 with application detection (and thus the forced bilinear filtering) disabled.

It would have been wonderful to see [H] being brave enough to give Antidetector a shot to show the difference, but I guess they are too scared of external influences to do anything like that. :(
 
nelg said:
A few days ago [H] had a "serious" chat with Nvidia's top brass. Today they publish an article that vindicates Nvidia's behavior. What did they talk about then?

How many 0's they should add to the check?
 
Hanners said:
What I find kind of bizarre is that [H] still don't seem to understand why ATi had a gripe with their benchmarking in the first place, despite the marked performance differences we've seen in UT2003 with application detection (and thus the forced bilinear filtering) disabled.

It would have been wonderful to see [H] being brave enough to give Antidetector a shot to show the difference, but I guess they are too scared of external influences to do anything like that. :(
Yup. They could have legitimately addressed the issue, but instead decided to stick to the old line of "Well, WE can't see any difference when gaming on it!". :rolleyes:

And as we all know by now: if we can't see it, it ain't a cheat... :(
 
I think I have a better understanding of the issue now, even though I have no idea what those wonderfully bright and colourful screenshots mean. Nor do I frankly care.

The issue here is Nvidia taking control away from the user and putting it in its own hands through the drivers. And that is something that really ticks me off as a consumer. If I pay good money for a video card, I want all basic options to be configurable, without such underhanded tactics. The fact that I probably couldn't tell the difference between trilinear and bilinear texturing is irrelevant, as the option of choosing either should lie with me, not the drivers. This is truly a slippery slope of a situation. Next thing you know, nVidia drivers will start warning us about games that have not been "optimized" for nVidia hardware, with a warning dialog box each time you start such a game. (Yeah, I know I'm exaggerating a little, but you get the point.)
 
Slides said:
The issue here is Nvidia taking control away from the user and putting it in its own hands through the drivers.

Exactly. And the real point that HardOCP completely fails to mention is that they do this selectively in the case of UT2003, just because it is a commonly used benchmark. They do not do it, for example, with Unreal II, which uses the same engine. This completely blows away the ridiculous argument that Nvidia is concerned with the best playing experience for the consumer.
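
For illustration only, a toy Python sketch of the kind of executable-name detection being described (the table, names, and logic here are my assumptions, not anything from NVIDIA's actual driver). The point is the selectivity: the substitution keys off the benchmark title, not the engine.

```python
# Hypothetical sketch of app-detection-based filtering substitution.
# Nothing here is from the real driver; it just illustrates the behavior
# the thread describes: UT2003 (a benchmark staple) is downgraded,
# Unreal II (same engine, rarely benchmarked) is left alone.
import ntpath  # handles Windows-style paths on any platform

FORCE_BILINEAR_FOR = {"ut2003.exe"}  # note the absence of unreal2.exe

def select_filtering(requested, exe_path):
    exe = ntpath.basename(exe_path).lower()
    if requested == "trilinear" and exe in FORCE_BILINEAR_FOR:
        return "bilinear"  # the user's setting is silently overridden
    return requested

print(select_filtering("trilinear", r"C:\UT2003\System\ut2003.exe"))    # bilinear
print(select_filtering("trilinear", r"C:\Unreal2\System\unreal2.exe"))  # trilinear
```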
 
Hanners said:
What I find kind of bizarre is that [H] still don't seem to understand why ATi had a gripe with their benchmarking in the first place, despite the marked performance differences we've seen in UT2003 with application detection (and thus the forced bilinear filtering) disabled.

It would have been wonderful to see [H] being brave enough to give Antidetector a shot to show the difference, but I guess they are too scared of external influences to do anything like that. :(

I don't think fear has anything to do with it--more like a love affair with ignorance that [H] seems determined to carry on, no matter what. He calls his detractors "anal-technical" simply because they converse on general topics he cannot grasp. I am literally stunned that in 2003 we have "major" hardware sites saying that the difference between tri- and bilinear filtering isn't important because in certain screenshots you "can't see the difference" (while they studiously ignore the screenshots in which a difference can be seen--such as the UT2K3 color-coded filtering tests designed specifically to show the differences between filtering types like bi- and trilinear). Amazing and selective hypocrisy--not to mention the fact that in some cases the mip-map boundaries are visible while playing the game--a fact which [H] declares to be unimportant.

He basically says it's OK to compare trilinear filtering on an ATi product with bilinear filtering on a GFFX product because, in his opinion, the IQ differences aren't noticeable while playing the game. So, OK then: to avoid the hypocrisy, the proper thing would be to compare the ATi performance with the GFFX performance while both products are running bilinear, and to simply concede that the GFFX has been deliberately crippled by nVidia so as not to provide trilinear filtering of detail textures in any case at all.
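
As a rough back-of-the-envelope sketch of why the substitution pays off in benchmarks (my own numbers, assuming a standard 2x2 bilinear footprint and ignoring texture caching): trilinear blends two mip levels per sample where bilinear reads one, roughly doubling the raw texture work.

```python
# Idealized texel-fetch counts per filtered sample (ignores texture caches,
# so treat this as an upper bound on the bandwidth difference).
def fetches_per_sample(mode):
    taps_per_level = 4                        # 2x2 bilinear footprint
    levels = 2 if mode == "trilinear" else 1  # trilinear blends two mip levels
    return taps_per_level * levels

for mode in ("bilinear", "trilinear"):
    print(f"{mode}: {fetches_per_sample(mode)} texel fetches per sample")
# bilinear: 4, trilinear: 8 -- which is why quietly forcing bilinear
# inflates scores in fill-rate-limited benchmarks like UT2003 flybys.
```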

But no, it seems that if [H] cannot declare a winner in a deliberately stacked contest [ATi's trilinear versus the GFFX's bilinear], it isn't interested in declaring a winner at all. Like it or not, folks, the record at [H] post-NV30 has been one of consistent apology for nVidia, regardless of how obviously and flagrantly it must skew its reporting.

Kyle is simply a nincompoop of the first order if he thinks nVidia's elimination of detail-texture trilinear support in its drivers is a non-issue. Lots of people care about issues such as these.

I said it in an earlier post: this is unfortunately boiling down to a classic struggle between the haves and the have-nots, the generally educated vs. the unwashed ignorant. Basically, the boys at [H] have simply shut off the old brain cells and decided to substitute whatever nVidia says and does for rational thought.

Who but nVidia might ever say or suggest "Oh, come now. Trilinear filtering isn't really important, is it?" And who but [H] it seems might ever officially respond in the affirmative to such an idiotic question? The hole just gets deeper and deeper--and only those digging it seem oblivious to it. Remarkable.
 
WaltC said:
Who but nVidia might ever say or suggest "Oh, come now. Trilinear filtering isn't really important, is it?" And who but [H] it seems might ever officially respond in the affirmative to such an idiotic question? The hole just gets deeper and deeper--and only those digging it seem oblivious to it. Remarkable.
And utterly fascinating to watch, truly. The thread at [H] about their latest "findings" is a serious case study. :)

A serious case study of what I'm not sure of, but it just seems overly classic to me somehow already. ;)
 
WaltC said:
Who but nVidia might ever say or suggest "Oh, come now. Trilinear filtering isn't really important, is it?" And who but [H] it seems might ever officially respond in the affirmative to such an idiotic question? The hole just gets deeper and deeper--and only those digging it seem oblivious to it. Remarkable.

I've said it before - people have forgotten their history. Nvidia has *always* rubbished any technology when they have not had the lead with that tech. Then Nvidia eventually bring out products that support that tech, and all of a sudden it is the best thing since sliced bread.

The reason *why* Nvidia use patsy websites instead of issuing official statements themselves is to distance themselves from all statements on their behalf, so Nvidia can selectively disown them at will. Nvidia don't have to live up to any promises because Nvidia didn't officially make them.

When it is convenient, Nvidia will cut Kyle free, and he will look like a patsy for supporting a stance that Nvidia will (a) no longer support and (b) deny they ever supported. Nvidia will say how they *always* valued IQ, how they *never* reduced quality, and how they certainly don't support the idea that "it's not a cheat if you can't see it" or that you can't tell the difference between bilinear and trilinear.
 
Bouncing Zabaglione Bros. said:
...
The reason *why* Nvidia use patsy websites instead of issuing official statements themselves is to distance themselves from all statements on their behalf, so Nvidia can selectively disown them at will. Nvidia don't have to live up to any promises because Nvidia didn't officially make them.

When it is convenient, Nvidia will cut Kyle free, and he will look like a patsy for supporting a stance that Nvidia will (a) no longer support and (b) deny they ever supported. Nvidia will say how they *always* valued IQ, how they *never* reduced quality, and how they certainly don't support the idea that "it's not a cheat if you can't see it" or that you can't tell the difference between bilinear and trilinear.

I think, as well, that the reason they use individuals and websites in this fashion is to get them to state untruths, or to perpetuate and expand errors, so that when the truth outs, nVidia can plausibly deny ever having said such things itself. They can say, "This is what [H] said. We never said anything like that ourselves." I've seen this happen a number of times in the last four years. (Edit: Belatedly, I see this is exactly what you said in the first para quoted above. Sorry...;))

Yes, it's unfortunate when a site either doesn't care or is actively complicit in being used by nVidia to make statements the company itself does not wish attributed to it. But for four years I've watched various such sap sites come and go--with the emphasis on the "go" part.

Witness the fact that even while [H] is busy swearing up & down to all who will believe it that nVidia just simply doesn't care about 3D benchmarks, the company publishes "performance increase" statements on new driver sets which deal with nothing but increases in benchmark performance. And even while [H] is stomping around vainly protesting nVidia's innocence nVidia not only continues special-case optimizing for 3DMk03 but goes to the further trouble of encrypting its drivers so that the anti-detection scripts written by people to disable such benchmarking frauds will no longer function. Heh...[H] even now is being used and made to look foolish.

I can understand a certain desire to be towards the center of attention, but not as an object of derision. Does Kyle really want to place [H] on the dunking seat? He must, as there is no shortage of balls to hurl at the target lever and the temptation to dunk [H] is overwhelming...;)
 
I just wanna give Doomtrooper a pat on the back. He posted his ass off in that thread on the hard forums today. Kyle was playing the semantics game, unfortunately, and never ended up saying much. But damn man...that was a lot of posts :D
 
Forbidden Donut said:
I just wanna give Doomtrooper a pat on the back. He posted his ass off in that thread on the hard forums today. Kyle was playing the semantics game, unfortunately, and never ended up saying much. But damn man...that was a lot of posts :D

Kyle seems to be concerned that the images Doomtrooper posted were using too much bandwidth, even though Doom had them hosted on another site. :rolleyes:

Is it me, or is there something fishy when even images on his own site are removed when they might counter his own stance? Look at the post below and check out the link. The image is not available. Hmm.


just saw the review and i could see a big difference in the AF quality in this shot: http://hardocp.com/image.html?imag... It looks as if the 9800 Pro is using better AF. As for bi or tri filtering, I don't know how to classify it, but the ATI card has better quality throughout the mip maps.
 
I would have a lot more respect for [H] if they included a caveat in their review of the BFG 5900 Ultra. They should qualify the UT2003 scores with something like this: "Using the 44.03 drivers, the 5900 does not use full trilinear filtering while playing UT2003. It is our opinion that this does not have a detrimental effect on image quality, though it remains to be seen whether this affects the performance and image quality of other games." Frgmaster and Brent both seem to be going out of their way to mention that this article reflects testing of UT2003 with the 44.03s on the 5900U only. This suggests to me that they may have suspicions that the issue is wider-reaching, but are trying to spin it so as to lessen the impact on Nvidia. As it stands now, [H] has turned a sow's ear into a silk purse.
 