OMG HARDOCP REVIEW OF UT2003 AND FILTERING

Status
Not open for further replies.
Blackwind said:
DaveBaumann said:
Blackwind said:
What you describe as a fluff piece I describe as informative and to the point. A review of IQ.

In your opinion was the conclusion correct?

Yes, I do. I do not believe [H]ard's benchmarking was damaging to ATI. Seeing as this was the entire reason for performing the task, I believe it deserves mention.

When a reviewer is trying to measure fps for a comparison, wouldn't the entire act of comparing "near-trilinear" against full-trilinear create an unwanted variable in the benchmark?

The whole purpose of a benchmark, like any comparison of a scientific nature, is to eliminate as many confounds as possible in order to isolate only the desired variable. In video cards, things like the image quality settings, system specs, app settings, and reviewer should be held constant in order to reduce unwanted confounds and produce directly comparable fps numbers. As it is, the difference in fps between the 5900 Ultra and 9800 Pro in [H]'s review may be due to the difference in filtering. If both were doing equal work, the 9800 Pro may come out far ahead. Unfortunately, we do not know if this is the case, and it was not mentioned at the time of the review's publication. That is what's damaging from ATI's perspective.

The fact that [H] went back to examine IQ is great, no matter what their conclusions may be. The problem still remains that they are not addressing all of the issues. If nVidia's near-trilinear looks almost as good in games, that's great. It still doesn't tell me if the cards are equally powerful when using technically comparable settings (something that's even more important for someone who's thinking about spending $500 on a card soon i.e. me).

I think the strength of benchmarks as a method of comparison has been lost somewhat and I'm starting to agree with Rev that they shouldn't be used in shootout reviews given the way different IHVs are implementing their features.
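The filtering difference behind this whole argument can be sketched in a few lines. This is a toy model, not NVIDIA's actual driver logic, and the band width of 0.25 is an illustrative guess: full trilinear blends the two nearest mip levels for every fractional LOD value, while a "near-trilinear" (tri/bi) mix only blends inside a narrow band around each mip transition and does cheaper bilinear work everywhere else.

```python
def full_trilinear_blend(lod: float) -> float:
    """Blend factor between mip floor(lod) and the next mip level.

    Full trilinear: a smooth 0..1 ramp, so nearly every sample
    requires fetching and mixing texels from two mip levels.
    Assumes lod >= 0.
    """
    return lod - int(lod)


def near_trilinear_blend(lod: float, band: float = 0.25) -> float:
    """Toy "tri/bi mix": blend only within `band` around the midpoint
    of each mip interval; elsewhere the blend factor is pinned to
    0 or 1, i.e. plain bilinear with one mip fetch instead of two.
    The 0.25 band width is a hypothetical value for illustration.
    """
    frac = lod - int(lod)
    t = (frac - 0.5) / band + 0.5
    return max(0.0, min(1.0, t))


if __name__ == "__main__":
    # Share of sample positions that pay the two-mip blend cost.
    samples = [i / 100 for i in range(100)]
    full_two_fetch = sum(0.0 < full_trilinear_blend(s) < 1.0 for s in samples)
    mix_two_fetch = sum(0.0 < near_trilinear_blend(s) < 1.0 for s in samples)
    print(f"full trilinear blends on {full_two_fetch}% of samples")
    print(f"tri/bi mix blends on {mix_two_fetch}% of samples")
```

Under this toy model the mix pays the two-mip cost on only a fraction of samples, which is where both the fps advantage and the benchmark confound come from.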
 
Unfortunately, Kyle has decided to attack Dave and Rev at B3D for what reason, I truly don't know. All they have done is share the information they've discovered concerning texturing and UT2K3 with everyone here on the forums. It's ironic to me that they've done this on multiple forums; for one, it appears Brent's own source for his most recent article was Dave B., and IIRC B3D also broke the IQ texturing story on NV's previous driver in these forums.

If you recall the Quake/Quack texturing controversy a few years back, Kyle had no problems using the quackifier utility provided to him by NVIDIA for his article. I would know, as I received the same utility myself. Tom Pabst wrote a detailed story of what went down, which I highly suggest you check out:
http://www4.tomshardware.com/blurb/20020825/index.html

Kenn and I decided not to use quackifier however (for obvious reasons), and instead we made our own program which did the exact same thing. We posted the source code here:
http://firingsquad.gamers.com/hardware/radeonquack/page2.asp

Obviously, the ethics of using a utility provided by NVIDIA to attack ATI and its latest product are EXTREMELY suspect on Kyle's part. Yet somehow he now feels he can attack the crew at B3D? That's just totally wrong on his part and I had to say something.

Technically, you could go a step further and wonder if he can be trusted now (in light of the fact that his results completely contradict the findings of B3D and AMDMB) but I won't do that. I trust that what he and Brent experienced is indicative of UT2K3, but he has no right whatsoever to attack B3D like he is in his forums. As the old biblical saying goes "let he who is without sin throw the first stone".
 
WaltC said:
Well, based on their article defending the substitution of partial trilinear filtering in place of full trilinear, without the knowledge and consent of either the end user or the application, I really can't agree. Rather, the article itself seems to indicate they don't understand the point being made--that full trilinear is not available for UT2K3 with the present Dets.

I would respond and suggest you did not read the article or Kyle's opinion regarding that very topic. I have. Both Brent and Kyle made it very clear that they would like the Dets to give users the ability to get full Tri as they choose, regardless of Nvidia's or Epic's good intentions. I 200% agree. I want to decide what I do or do not use on my video card. Simple.



Really, if you don't care that you don't get full trilinear in the game with the Detonators that's certainly OK with me...;) I don't use the Detonators so it's not a concern for me, personally (as I get full trilinear when I want it.)

It's certainly your prerogative not to care about such things..

Clearing it up with a precise response: I do care. I want the very control I have with my ATI cards on my Nvidia cards.

I can see they've been "up front" in justifying their strange position--but seem to be very "down back" about spelling out why it isn't a good idea to tell people they are getting full trilinear and then giving them a partial trilinear in its place. That's my complaint.

I find nothing strange about it.


No. It's not their "level" of expressing their opinions that bothers me. It's their opinion....;) In perusing this thread I guess you can see there is no shortage of people who, like me, disagree with that opinion.

I would suggest that the other people are reading [H].


Hmmmm.....how is it "informative and to the point" to pretend that a performance-level trilinear is "almost as good" as a full trilinear, and so we should all be just as happy as chicks in a roost upon discovering the Detonators do not support full trilinear in UT2K3, even when both the end-user and the application expect the Dets to provide it?

The point here has nothing whatever to do with performance-mode trilinear, but rather has to do with the fact that full trilinear is unavailable. Very simple. Nobody's complaining about whatever performance trilinear modes nVidia wants to incorporate--they are complaining about the fact that nVidia has *substituted* performance for full, and not only failed to inform anyone, they actually stated the opposite to reviewers.

As you can see--nVidia's performance trilinear hardly enters the picture at all. Rather, it's the absence of full trilinear capability for the game that is the heart of the problem.

They were not "pretending" any such thing. If you had read it and their views within the forums you would know, "happy as chicks in a roost" none of us are. It is my understanding that yes, full trilinear is available. It simply is not available in the present Dets in UT2003. The FX series performs full tri in other games just fine. Epic apparently is aware of the "fix" and that is their prerogative. We can as users express our displeasure.

Fair enough--just as it's your prerogative not to care about whether you get full trilinear when you think you are, it's also your prerogative not to have to wade through "techie junk" that will enlighten you and broaden your horizons. Ignorance is a commodity that some hug like a security blanket--as they say, ignorance can sometimes be bliss. I wouldn't dream of asking you to part with it (although I would certainly attempt to convince you of why you might wish to.) But in your case I can see that such an attempt is neither wanted nor welcomed.

I'll have to correct you again, seeing that you are prone to assumptions rather than simply asking. I do care. I like tech talk. I do not believe ignorance to be bliss.

You say, "as a group" they appear to have an agenda. OK, fine. What agenda? If you can't define this "agenda" you hypothesize, then you have no basis for inferring it, do you?

I make no inference. I stated my opinion. I believe it to be clear.

Fine--then why would Kyle state that Dave B. was "not a part of the [H] community"? Obviously, Kyle believes in something he thinks of as the "[H] community"--as taken from his own statements. I would advise that you temper your beliefs about "what he means" by way of quoting his own remarks, as that's the only way you can be sure of what he means, IMO..

I would respond by suggesting tempering your own by reading ALL of the topic and comments rather than taking a keyhole view. As to why Kyle would state what he did about not being a part of the [H] community? My response would be, why did Dave never post prior? Neither answer really matters to either party. That is something for Dave and Kyle to hash out should they choose to. Simple.

Yes, and [H] also goes to "great lengths" in its forums to ban individuals, lock threads, delete threads, delete posts within threads, etc. I have never seen such at B3d. Many of the banned [H] individuals have done nothing except profess an opinion, or attempt to engage in a discussion, on topics the "moderators" do not wish to see discussed. Call it what you will, that is an attempt at censorship and might even be characterized as an attempt at "thought control." Kyle is in no position to call anyone the "B3d police" as from what I've seen the "[H] police" are infinitely worse. Perhaps though, Blackwind, you'll be the next to have your account banned at [H], and at that point you might have an epiphany...;)

Just as I responded to Rev on [H], readers here would actually like to see some sterner enforcement of polite guidelines here. What you state as "Many of the banned [H] individuals have done nothing except profess an opinion, or attempt to engage in a discussion" I would state with all confidence applies to very few. In fact, they were even warned by other non-[H]ard moderators that their actions were not being received well. Very few have come with that intention. Anthony's posts today on [H] were polite, to the point, and IMO shining examples of great posts by someone from B3D.
 
Oblivious said:
When a reviewer is trying to measure fps for a comparison, wouldn't the entire act of comparing "near-trilinear" against full-trilinear create an unwanted variable in the benchmark?

Their intention was not "measuring fps" IMO, it just so happens this is part of the suite of tools used. I find the situation no more fair or unfair than the facts...

ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.
 
kkevin666 said:
Here's exactly how Kyle responded to a polite / direct question on his trilinear filtering knowledge

For crying out loud, this stuff is just too much. No freaking way anyone can post stuff like that.
 
Blackwind said:
ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.

Trilinear is something pretty basic - they've both had it right before and they can both get it right now. The idea that "they are different so there's no comparison" isn't an excuse in this case. If you are going to take that view you may as well have not bothered with the comparisons in the first place.
 
Blackwind said:
Oblivious said:
ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.

Ah, so because apples to apples is impossible, we should just give up and let things be as far away from apples to apples as nVidia wants?

As a reviewer, you are presenting (in the type of review we are discussing) TWO products up for comparison. You cannot run benchmarks on ALL games, so you run benchmarks on a few games, thinking that this shows relative card performance.

The problem with what [H] has done is to use a much farther from apples to apples comparison than need be - AND to not even mention this issue in said review.
You don't think that this shows an unfair advantage to nVidia?
Of COURSE things aren't apples to apples, but we expect the differences to be MENTIONED and commented upon! Not ignored! And the very fact that the GFFX does its tri/bi mix trick in only UT2K3 makes it just about the WORST benchmark for a REVIEW - a review is supposed to give a general idea or impression of a card's abilities (in this case, relative to another card). However, it is doing nothing of the sort.
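The confound being argued over above can be made concrete with a toy calculation. All of the numbers here are hypothetical assumptions, not measurements: if the tri/bi mix saves a chunk of texturing work in just this one game, an apparent benchmark win can turn into a loss once you estimate what full trilinear would have cost.

```python
# All numbers are hypothetical, for illustration only.
card_a_fps_mixed = 60.0   # card A measured with the tri/bi mix (assumed)
card_b_fps_full = 57.0    # card B measured with full trilinear (assumed)
filtering_saving = 0.15   # assumed fps gain from the reduced filtering

# Rough estimate of card A's fps had it done the same full-trilinear work:
card_a_fps_full_est = card_a_fps_mixed * (1.0 - filtering_saving)

print(f"A (estimated full tri): {card_a_fps_full_est:.1f} fps")
print(f"B (measured full tri):  {card_b_fps_full:.1f} fps")
# With these assumed numbers the apparent 3 fps win becomes a loss,
# which is exactly the variable a benchmark is supposed to isolate.
```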
 
Brandon said:
Unfortunately, Kyle has decided to attack Dave and Rev at B3D for what reason, I truly don't know. All they have done is share the information they've discovered concerning texturing and UT2K3 with everyone here on the forums. It's ironic to me that they've done this on multiple forums; for one, it appears Brent's own source for his most recent article was Dave B., and IIRC B3D also broke the IQ texturing story on NV's previous driver in these forums.

If you recall the Quake/Quack texturing controversy a few years back, Kyle had no problems using the quackifier utility provided to him by NVIDIA for his article. I would know, as I received the same utility myself. Tom Pabst wrote a detailed story of what went down, which I highly suggest you check out:
http://www4.tomshardware.com/blurb/20020825/index.html

Kenn and I decided not to use quackifier however (for obvious reasons), and instead we made our own program which did the exact same thing. We posted the source code here:
http://firingsquad.gamers.com/hardware/radeonquack/page2.asp

Obviously, the ethics of using a utility provided by NVIDIA to attack ATI and its latest product are EXTREMELY suspect on Kyle's part. Yet somehow he now feels he can attack the crew at B3D? That's just totally wrong on his part and I had to say something.

Technically, you could go a step further and wonder if he can be trusted now (in light of the fact that his results completely contradict the findings of B3D and AMDMB) but I won't do that. I trust that what he and Brent experienced is indicative of UT2K3, but he has no right whatsoever to attack B3D like he is in his forums. As the old biblical saying goes "let he who is without sin throw the first stone".

Well said. Now when is FiringSquad going to start benchmarking with the newfound settings, and how about some amendments to previous articles... or better yet, an article on the issue itself?

Consumers have a right to know.
 
Blackwind said:
Oblivious said:
When a reviewer is trying to measure fps for a comparison, wouldn't the entire act of comparing "near-trilinear" against full-trilinear create an unwanted variable in the benchmark?

Their intention was not "measuring fps" IMO, it just so happens this is part of the suite of tools used. I find the situation no more fair or unfair than the facts...

ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.

I think we're talking about two different things here. I believe you're talking about the follow-up article where they compared IQ. I'm talking about the original review where they focused on benchmarks. I assumed that by using the word "benchmark" earlier, you were referring to the review. Just to be clear, IQ comparison does not equal benchmark since they measure two different things. I think they should be kept separate in order to obtain results that are less confused. Of course nVidia does not provide that option anymore and [H] did not mention that in the original review.
 
DaveBaumann said:
Blackwind said:
ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.

Trilinear is something pretty basic - they've both had it right before and they can both get it right now. The idea that "they are different so there's no comparison" isn't an excuse in this case. If you are going to take that view you may as well have not bothered with the comparisons in the first place.

I agree, they have both had it right before and they can get it correct once again. But to imply that everyone at [H] is excusing it is no different than their approach to guilt-by-association banning when it comes to persons from B3D. I'd suggest reviewing as you see fit here, and allowing them to review as they see fit. You both take different approaches to the same topic. Neither is right or wrong.
 
Oblivious said:
Blackwind said:
Oblivious said:
When a reviewer is trying to measure fps for a comparison, wouldn't the entire act of comparing "near-trilinear" against full-trilinear create an unwanted variable in the benchmark?

Their intention was not "measuring fps" IMO, it just so happens this is part of the suite of tools used. I find the situation no more fair or unfair than the facts...

ATI does not render or use the same hardware or process of rendering that Nvidia uses and vice versa. This makes an apples to apples comparison near impossible. Regardless of what site does the review.

I think we're talking about two different things here. I believe you're talking about the follow-up article where they compared IQ. I'm talking about the original review where they focused on benchmarks. I assumed that by using the word "benchmark" earlier, you were referring to the review. Just to be clear, IQ comparison does not equal benchmark since they measure two different things. I think they should be kept separate in order to obtain results that are less confused. Of course nVidia does not provide that option anymore and [H] did not mention that in the original review.

Agree with you. I was referring to the second, IQ piece. You actually have a good idea there and I hope Brent reads it here. A pure technical review, with all the goods that highly technical people want, with numbers and all that jazz (I like). A second review strictly from an IQ standpoint. Redundant? Maybe, but it would cater to a few who need catering. All in all I think their reviews have been great and have only gotten better with Brent on board. While many here may feel it biased or other, I'd simply have to say I do not see it, and the proof is in their conclusions.
 
Blackwind:

You do realize that if this were another board, you'd probably be banned by now instead of being allowed to express your opinions? 8)
 
John Reynolds said:
Blackwind:

You do realize that if this were another board, you'd probably be banned by now instead of being allowed to express your opinions? 8)

HEHE, I realize that many opinions here are extremely clouded and not based on fact. :D I have over 533 posts on [H]ard and still going. As I'm sure Brent can attest, I've questioned their direction before and have asked and inquired. I would say the largest difference between me and some is I don't see a shooter on the grassy knoll every time I turn around.

By the way, Kyle responded to Rev's post. May not be what you want to read but it's a response. :p
 
Blackwind said:
But to imply that everyone at [H] is excusing it is no different than their approach to guilt-by-association banning when it comes to persons from B3D.

I didn't imply anything of the sort. However, if they were aware of it some time ago then at the very least they have concealed information from their readers, which does misrepresent the results, since readers are looking at those results expecting one standard of quality, but in fact they represent a lower standard of quality.

Anyway, you didn't answer - do you feel that the screenshots shown in this thread and our original one do not represent any kind of quality reduction over trilinear filtering?
 
Blackwind said:
They were not "pretending" any such thing. If you had read it and their views within the forums you would know, "happy as chicks in a roost" none of us are. It is my understanding that yes, full trilinear is available. It simply is not available in the present Dets in UT2003. The FX series performs full tri in other games just fine. Epic apparently is aware of the "fix" and that is their prerogative. We can as users express our displeasure.

This is the point that's always been of interest to me. The issue is necessarily constrained to UT2K3, and that's what's wrong with it. The fact that full trilinear is available on everything else is the major point. The conclusion that nVidia did this merely to skew benchmark scores relative to UT2K3 seems beyond dispute at this point. I mean if you can suggest a parallel rationale for doing this in UT2K3 while telling reviewers and developers that you're doing the opposite, I'm all ears.

Approaching the issue from the standpoint of "how close" the resulting IQ actually is to full trilinear support is missing the point entirely. But this is what [H] did. Again, no one objects to nVidia implementing a performance-mode trilinear compromise. What's objected to is the substitution in this one game of that performance mode for the real deal. The way in which it was done by nVidia leaves no doubt, IMO, as to why it was done: nVidia wanted to ensure that benchmark scoring in UT2K3 using full trilinear support would skew the scores in favor of its products, because it knew ahead of time it could not compete in UT2K3 using full trilinear. That's why nVidia never said a word about this to anyone, stated it was doing the opposite, and didn't bother to make this a driver option: the intent was that this mode be used in a direct comparison with its competition's products running full trilinear in UT2K3. Talking about this ahead of time would have made that an impossibility from the start.

So, that's why I see playing up this performance-trilinear mode as some sort of "wonderful" setting that allows the card to get higher framerates in UT2K3 while producing IQ "almost as good" as the real deal is completely dishonest. nVidia has never offered this as an option--instead they've improperly, and knowingly, used it as a substitute for full trilinear filtering support in UT2K3. And therein lies the crime. All that [H] has done, in insisting it's OK to follow nVidia's lead and compare one product running full trilinear with another one that's not, is to aid and abet nVidia in this deception. The only point [H] has proved is that it has ears only for the way in which nVidia seeks to define the 3D industry. IMO, of course.
 
DaveBaumann said:
Blackwind said:
But to imply that everyone at [H] is excusing it is no different than their approach to guilt-by-association banning when it comes to persons from B3D.

I didn't imply anything of the sort. However, if they were aware of it some time ago then at the very least they have concealed information from their readers, which does misrepresent the results, since readers are looking at those results expecting one standard of quality, but in fact they represent a lower standard of quality.

Anyway, you didn't answer - do you feel that the screenshots shown in this thread and our original one do not represent any kind of quality reduction over trilinear filtering?

I would suggest you have implied exactly that. I actually did answer your question. You have elaborated your question further, so I will respond again, with elaboration. There is quality reduction over trilinear filtering in UT2003 with Nvidia's present efforts. Does this boil down into the fiasco it's been made into? No. With the events of the past few months, would we as consumers be more inclined to trust Nvidia if we had the ability in a new Det set to enable full trilinear? Yup. Do we want it? YUP.

Althornin said:
Blackwind said:
Ah, so because apples to apples is impossible, we should just give up and let things be as far away from apples to apples as nVidia wants?

As a reviewer, you are presenting (in the type of review we are discussing) TWO products up for comparison. You cannot run benchmarks on ALL games, so you run benchmarks on a few games, thinking that this shows relative card performance.

The problem with what [H] has done is to use a much farther from apples to apples comparison than need be - AND to not even mention this issue in said review.
You don't think that this shows an unfair advantage to nVidia?
Of COURSE things aren't apples to apples, but we expect the differences to be MENTIONED and commented upon! Not ignored! And the very fact that the GFFX does its tri/bi mix trick in only UT2K3 makes it just about the WORST benchmark for a REVIEW - a review is supposed to give a general idea or impression of a card's abilities (in this case, relative to another card). However, it is doing nothing of the sort.

I never stated it was impossible. I stated it was near impossible, very difficult, a hard task to accomplish. I have to ask, where on earth do you see an "unfair advantage to nVidia" in the conclusion of [H]'s continued recommendation of a 9800 over a 5900? I call that a very clouded opinion.
 
Recommending a Radeon 9800 over a 5900 is the consensus from almost every online reviewer, if you include custom timedemos. Then factor in PS 2.0 speed, the lack of Multiple Render Targets on the FX, and much better AA modes, and it is the right decision... even pricing is better.

Just because [H] recommends a 9800 doesn't give them the right to cover up an obvious driver hack that changes the end user's settings in the control panel upon detection of UT2003.exe.
Especially when they admit they were aware of it months ago, but happily posted AF graphs on their webpage.
 
Just as I responded to Rev on [H], readers here would actually like to see some sterner enforcement of polite guidelines here.


Forgive my candour, but that's pure malarkey. It is more than a bit presumptuous of you to be speaking for other readers. I am a new member myself and it was apparent to me for months before joining that B3D forums operate on the premise of a more intelligent understanding between participants and moderators, such that deleting posts and banning posters is very rarely required. I appreciate the atmosphere of civilized discussion where people are at liberty to calmly express opinions that can differ strongly without being concerned about being summarily axed from the site by a despot. Ironic that you seem to consider Kyle's need for frequent forum intervention as a measure of his success, rather than as his failure to command the respect of his "community".
 
Doomtrooper said:
Well said. Now when is FiringSquad going to start benchmarking with the newfound settings, and how about some amendments to previous articles... or better yet, an article on the issue itself?

Consumers have a right to know.

Yes, we will probably cover it in the same fashion the custom demos were done i.e. in a product review. In this case, the ASUS V9950 Ultra. This is a single slot GeForce FX 5900U card that shipped with the 44.71 Detonator driver.

If it is necessary to amend previous articles, I will do that just as we did with 3DMark 03.
 
Very glad to hear that, Brandon. Maybe we can get some honest reviews from other sites vs. the couple I have left in my favorites... including this one.
 