TWIMTBR

In a personal sense, I can see how someone would be biased if one IHV or AIB gave you review samples and the other didn't. But if you are getting boards from both sides that's largely balanced out. Also, smaller sites do tend to send the boards back, so it's more a loan than a gift in those cases anyway.

Bias doesn't have to be a bad thing. I think there are a very large number of people on this board who are biased in favor of ATI's implementation of DX9 on the R3xx series chips as opposed to Nvidia's implementation on the NV3x series. However this is a perfectly reasonable bias because ATI's hardware maps much more closely to the spec than Nvidia's. Please note, that statement did not say that ATI-based cards are better than Nvidia-based ones, since that depends on other factors not just DX9 compliance. A 64MB Radeon 9500 non-Pro may have a better DX9 implementation than a 256MB FX5950 Ultra, but most of us would rather have that GeForce than that Radeon.

It's been about 20 years since I took an ethics class, but from what I remember it's really based on three things: honesty, fairness, and integrity. Beyond3D does a great job of demonstrating all of these qualities.

In a perfect world it would be nice if all sites could go out and buy all the products they reviewed. It would avoid any perception of inappropriate bias, and ensure they weren't getting cherry-picked review samples. However, this isn't a perfect world.

Also, as has been stated before, it can be questioned whether they are really true gifts. Normally a gift is something extra, not something that is required. You can't review a video card without access to it, so it's a necessity to have one.

Anyway, there's my two cents.
 
Rugor said:
Bias doesn't have to be a bad thing. . .

This is part of the problem; not everyone uses the word "bias" in the same fashion even. Some people would say that a bias based on experience is fine, or a preference for a certain aspect of one card over another (say you are really, really hot for quality AA and give that a significantly larger role in your decision making, for instance, than some others would do) is fine. I think this is what you are pointing at above. Technically speaking, that is not true bias on its own. At the extreme it could be (i.e. everything but the AA sucks, say, and you still prefer it over the competition). Real bias, by definition, is a lack of objectivity; an *unfair* preference for one over another. Real bias is relying entirely on past experience to say, "Well company A has always had better drivers than company B --I'm sure it is still true" and not actually checking the current status of company A and B drivers.

But the word doesn't always get used that way, and it is often very difficult to tell which version someone means.
 
thegrommit said:
ByteMe said:
If YOU want to remain unquestionably unbiased you can NOT accept any gifts.

So, I take it you'll be chipping into the beyond3d new hardware fund? Or do you have another suggestion for how they should "acquire" said hardware?


The ideal business/ethics is to have paying subscribers (money is made ONLY by this). This is now rare. You can justify all you want; the fact is that taking "gifts" does introduce an amount of bias. It is up to you (reviewers/readers) how much bias is acceptable.

Just at first impression; would you trust a review from someone that bought it retail with their OWN money... or one from some site that a card was given to? The guy that spent his own money has nothing to lose. The guy that got the free card on the other hand...

With all that said, it is not that I believe the reviewers here at B3D have done a bad job. I believe it would just be better (more credible?) if they bought it themselves.
 
ByteMe, a site cannot in itself be biased. Taking the actions you mentioned would only give an impression of being unbiased. It would still be possible to corrupt an individual reviewer at a site modeled like the one you described. The most we can hope for is a site that is open to dialogue about its testing. This allows any results that are questionable to be scrutinized and/or verified. That is what makes B3D great. There are so many knowledgeable members here that the crap (none of which emanates from B3D) gets cut through pretty quickly.
 
Bias is detected through what you READ, not what happens behind the scenes--since we'd hardly even know it ANYWAY, not to mention we have no idea how it compares to the rest of the industry and what is accepted.

Ever notice what people latch onto when they accuse Tom or Anand or Kyle of bias? They grab comments, they grab reviews and analyze the methods, they test concluding statements against the tests that they showed, they follow commentary on their forums, they test reviews against what shows up on a myriad of other sites... Do we know just how their sites operate? How they compare to each other? To B3D?

Detecting bias in tech reviews, in fact, is MUCH EASIER than anywhere else because they are immersed in numbers and FACTS, rather than being opinion-laden such as with movie/music reviews and short blurbs on one subject or another.

It's rather crucial to realize THAT should be the telling point, since we never KNOW what's going on anyway, and even things like contest-sponsoring are forms of "gifts" which you seem to decry as so damning. Getting invited to cover events, going to trade shows, having better and faster Q&A with IHV's... There's FAR too much out there that "could cause bias" that in the end it's just better to use the tried and true method rather than trying to micro-manage bias levels--WYSIWYG.

Frankly, we can't even be positive that reviewers aren't using astrology to determine performance numbers or posting whatever they like. It is CRUCIAL to judge them through the content they deliver and its context in the broader landscape.
 
Hey cthellis42, if it became known that the Rev received a hundred grand from nvidia for a day of "consulting", would you believe anything he had to say about them or their products? Or would you verify it from somewhere else?

I don't even care what kind of reputation he has now. There is NO-WAY he would remain unbiased. We would all be guilty of this.


Now the question is.... what if it was only 10 grand? One grand? Two hundred dollars? See?

It is impossible to accept "gifts" without looking biased.

Just because "everyone is doing it" does NOT make it right. If I was a reviewer it would NEVER be acceptable to take gifts ... and I don't care if it was a can of soda.

It kills me that so many of you are willing to compromise your own ethics. But I forget... everyone is a victim... we can't remain competitive unless we proceed like others. What happened to the people that dared to be GREAT!
 
ByteMe said:
Just at first impression; would you trust a review from someone that bought it retail with their OWN money... or one from some site that a card was given to? The guy that spent his own money has nothing to lose. The guy that got the free card on the other hand...

That doesn't necessarily follow - A lot of the time people want to defend their purchase when they have bought an expensive product, which can get in the way of their true feelings towards it. Whichever way you look at it you can be biased which is why, as others have mentioned, you are better to judge a review on its content above all else.

For the record, I bought both of the cards I use for testing with my own hard-earned cash, but that doesn't stop people using the 'B' word against me...
 
I never said you could not be biased for any number of reasons. Just that accepting a "gift" was one way to guarantee something of a bias.

Here is a common situation that I believe happens often;

Website A has a "business model" of accepting "gifts" for products to be reviewed.

Nvidia has a standard practice of giving away a few cards to be reviewed.

Nvidia has a brand spanking new kickass card coming out called the Geforce Xtreme.

Website A needs to get one of these cards to pull in the traffic to make money on the ad hits. Website A knows Nvidia only plans on giving away 5 cards for the initial reviews. Unfortunately, website A is not nearly as big as some of the other sites, so they call up Nvidia: "Hey, we would like to get one of those Geforce Xtremes to review."

Nvidia- "Well, I only have one left and I was thinking about sending it to website B"

Website A- "Remember when we did the last review? You didn't like some of the numbers/tests we published but we did go back at your request and edit the article to add some mention about your XYZ features."

nvidia- "Yes I remember that.... ok.... if you include the benches we wanted on Game 123 and whatever other benchmarks you want, I suppose we could send you a card."

Website A- "Sure we could include that game benchmark, it will only be one of ten so there wouldn't be any bias."

Nvidia- "Of course we wouldn't want you to be biased. Just do your reviews like you have done in the past with that one game included.... By the way could you go into just a bit more detail on those features we talked about before?"

Website A HAS GOT to get this card and it is almost in the bag- "Sure no problem."

Did you see what just happened? I would say one game in ten is a 10% bias. And then add the fact that now features XYZ will get a stronger mention... this just adds to the bias. With as close in performance/features as many cards are what would you say a 10% bias is worth to nvidia on a website?

You want to avoid this? Do NOT accept the cards. There are a billion variations on the above example that all lead to the same bias.

/passes soapbox to YOU
 
ByteMe,
you seem to have a problem that goes well beyond your fear of bias.

Your assumption that bias == loss of credibility is, simply put, stupid.
Of course there will be a bias, regardless of how the subject in question has been obtained (see Festinger's theory of dissonance).
But that doesn't mean the published data are invalid.
While the reviewer's "conclusion" is of course subject to bias through his personal opinion, the data speaks for itself; this is of course the real purpose of benchmarking. Properly performed and with full disclosure of testing methods, you'd have to presume the numbers were falsely reported to scream "bias, bias!" Do you really think that is the case? Here at B3D?
I shouldn't have to repeat what nelg said above; go back and read it for yourself. Clear and concise.

If your mother tells you that you were a sweet kid, don't believe her - she's biased. You were most likely a troll.
 
rubank said:
ByteMe,
you seem to have a problem that goes well beyond your fear of bias.

Your assumption that bias == loss of credibility is, simply put, stupid.
Of course there will be a bias, regardless of how the subject in question has been obtained (see Festinger's theory of dissonance).
But that doesn't mean the published data are invalid.
While the reviewer's "conclusion" is of course subject to bias through his personal opinion, the data speaks for itself; this is of course the real purpose of benchmarking. Properly performed and with full disclosure of testing methods, you'd have to presume the numbers were falsely reported to scream "bias, bias!" Do you really think that is the case? Here at B3D?
I shouldn't have to repeat what nelg said above; go back and read it for yourself. Clear and concise.

If your mother tells you that you were a sweet kid, don't believe her - she's biased. You were most likely a troll.


Reading that post of yours is like being viciously assaulted by a perfumed parakeet in a goddamned Parisian bordello.

You are incorrect. A bias does lead to a loss of credibility. What world do you live in? The published data can be perfectly correct and STILL be biased. What if a reviewer only shows benches where brand A always wins?

It's either silence or Armageddon, chucklehead.
 
What if a reviewer only shows benches where brand A always wins?

That doesn't really matter - what actually matters is that the benchmarks selected actually bear some relevance, either in terms of what people play or the technologies they utilise.
 
just remember that:
- people see what they want to see. (Only your card beating the others in some test matters more than anything else, even though your card would be master in only one test and get crushed in 20 others.)
- the truth is in the eye of the beholder. (It is all up to how you want to see things.)

I doubt anyone can say that he/she doesn't have any opinions that could affect the results even slightly. (And if someone says something like that, he/she has already been proven wrong, because he/she already had an opinion about whether he/she has any opinions.)


I have already benchmarked quite a few cards just for fun. For older games' compatibility and texel cache efficiency I use a radically modded NFS4 with fraps. (With big textures on a track, it is a single-texturing nightmare for any card: each _QUAD_ has its own texture, and in user-made tracks that usually means a 256x256 texture.) For basic OpenGL testing I have my own tools. (Which have been developed quite a bit further since I demoed them at the coding forum.)
 
ByteMe, you're essentially talking about a lot of "if" scenarios.

I don't particularly care about other websites and what folks think of them. But you said that "even the holy B3D has some bias". I'd like you to back that up. Read all our reviews/articles/etc. Pick up anything we wrote that you regard as us having "bias". Be specific.

All the while knowing we get all these video cards/hardware/whathaveyous for "free".

If you can do this, hence proving your understanding/definition of what "bias" is or how it can happen, then you've made your point. If you can't, then all that you've offered is nothing but baseless speculation or just a whole lot of beer-talk.

Remember, I don't care about other sites, just ours.
 
Ha-ha-ha-ha-ha! You think sites are really that organized BM? Heck, everything I've ever got to review has been thru a combination of luck & years of hinting/pleading.

I don't think I could be biased towards a company to give it a favorable review just because they gave me something, as will be demonstrated whenever my boss gets around to posting up my ACG4 review. (And will be quickly called into question again when my AIW 9600 Pro review goes up... :rolleyes: )

I wouldn't sell my rep for a piece of equipment worth a couple hundred dollars, it's worth a lot more to me. ($50,000 us, if anyone from nVidia is reading...PM me, we'll talk. 8) )
 
digitalwanderer said:
I wouldn't sell my rep for a piece of equipment worth a couple hundred dollars, it's worth a lot more to me. ($50,000 us, if anyone from nVidia is reading...PM me, we'll talk. 8) )
...or an NV40 engineering sample right now :)
 
The Baron said:
digitalwanderer said:
I wouldn't sell my rep for a piece of equipment worth a couple hundred dollars, it's worth a lot more to me. ($50,000 us, if anyone from nVidia is reading...PM me, we'll talk. 8) )
...or an NV40 engineering sample right now :)
Nah, I wouldn't for that. I'm too pissed at nVidia right now and they'd never give me one without an NDA I'd be refusing to sign.
 
ByteMe said:
Hey cthellis42, if it became known that the Rev received a hundred grand from nvidia for a day of "consulting". Would you believe anything he had to say about them or their products? Or would you verify it from somewhere else?
I do the "verify it from somewhere else" constantly. Smart people ALREADY take into account a broad swath of sites rather than just one. You shouldn't use a single benchmark either, so why use a single review? Not every model card is built the same; heck, not every card made by the SAME vendor is the same. Certainly not every test suite is the same.

Most of the time I don't look to it as "verifying" unless I've already expected bias on a site (or over-simplicity of testing producing non-informative results), but instead understanding overall trends and being able to compare and contrast on a much greater scale than one site can offer. (No matter how in-depth and great said reviews are.)

If I knew a site was "on the take"--such as it were--to the degree you are insinuating, my first impulse WOULD be to question their methodology and commentary, but ultimately it creates an atmosphere to LOOK for bias, not automatically causing it. If I can't detect the reviews being any different than they have been for years, then what did that "automatic bias" amount to? Certainly if we knew a site were receiving direct sponsorship to immense levels, the whole COMMUNITY would be hyper-analyzing the words and reviews coming out of that site, so any chinks in the armor would be torn wide open.
ByteMe said:
I don't even care what kind of reputation he has now. There is NO-WAY he would remain unbiased. We would all be guilty of this.
Of course the chances of something to this degree are immensely small.
ByteMe said:
Now the question is.... what if it was only 10 grand? one grand? 200 hundred dollars? see?
...and what if that's tiny in comparison to the broad scheme of things? Or operational procedure for the industry? Or evenly spread out among all sides?
ByteMe said:
It is impossible to accept "gifts" without looking biased.
You're already ignoring other typical "gifts"--you're just making an exception for this particular case and blowing it out of proportion.
ByteMe said:
Just because "everyone is doing it" does NOT make it right. If I was a reviewer it would NEVER be acceptable to take gifts ... and I don't care if it was a can of soda.
A reviewer also can't be expected to front the cost for everything personally. Sites getting slowly rolled out can't afford it either, but have to build up SOMEhow. I can already get a number of things freely while not being a reviewer in any way, shape, or form--does this automatically mean that if I'm talking about said product to friends/others I am pimping it? Am I incapable of taking it at face value when describing it?

To that effect, I'm sure in some matters things are so commonplace that experienced reviewers just shrug it off the same way they do marketing in general. Are you suggesting anyone who hears marketing from one venue is automatically biased as well?
ByteMe said:
It kills me that so many of you are willing to compromise your own ethics. But I forget... everyone is a victum... we can't remain competitive unless we proceed like others. What happened to the people that dared to be GREAT!
It kills me that so many are willing to over-simplify a situation and apply "black/white" perspectives to things that are decidedly NOT. It kills me when people decide to run rampant with their opinions in one direction but shrug it off wherever parallels can be drawn but they "don't feel it applies." What happened to people that dared to be internally consistent?
 
ByteMe said:
Reading that post of yours is like being viciously assaulted by a perfumed parakeet in a goddamned Parisian bordello.

I don't have your experience with Parisian bordellos. Did you grow up in one?

ByteMe said:
You are incorrect. A bias does lead to a loss of credibility. What world do you live in?

I live in a world where everyone is biased by definition. Living in communities for some 20,000 years has let humans develop tools to deal with it. Read nelg's post for some insight into what these tools might be and how they work. It's sometimes referred to as "ability to communicate".

ByteMe said:
The published data can be perfectly correct and STILL be biased. What if a reviewer only shows benches where brand A always wins?

Bias in selection of benchmarks doesn't mean the numbers are biased.
And again, we have ways to correct that. It's sometimes referred to as "common sense".
 
rubank said:
ByteMe said:
The published data can be perfectly correct and STILL be biased. What if a reviewer only shows benches where brand A always wins?
Bias in selection of benchmarks doesn't mean the numbers are biased.
And again, we have ways to correct that. It's sometimes referred to as "common sense".
Oh no, I'll perfectly well admit that reviewers can cherry-pick tests to show a card they prefer in a better light than it deserves. Thing is, THAT in itself is easily detectable, and part of the actual process to discover bias. (Of course this can also represent poor knowledge or just plain laziness. ;) ) If a reviewer leans on one particular engine more than others... if they don't test things at quality settings or high resolutions...

It may not be easy to assign "bias" automatically, but it IS easy to find and comment on poor reviews in general, for anyone who pays enough attention to WANT good reviews. And poor reviews are poor reviews, no matter what "reason" you want to assign to them. The source of problems can be symptomatic, but as long as you recognize the problem itself and know how to frame it, you're still fine. (You can look at other reviews and commentary and label people how you want later on.)
 
Whether or not your answer was directed at me, I totally agree.
That's just plain old common sense ;)
 