NVIDIA -- How to regain your trust

Joe-

Because in some cases, it is

Then assume it is.

This is about "pulling it out of the box and it working" right?

This is about having functioning drivers. If I was using, say, an Asus K7M mobo and it couldn't function properly in AGP 2X because the motherboard has known line noise issues, I would be pretty foolish to blame it on a vid board maker wouldn't I?

So, why don't you ignore ATI PR when they say they support Aniso?

Their claim was backed by review sites and forum posters, but I'll expand on that in a bit.

If nVidia was "forthcoming" about their performance, there wouldn't be any "shader" cheats or "clipping plane" cheats. They are designed to FOOL ME into thinking their performance is something it's not.

And that means nVidia doesn't earn your trust. I am saying that ATi doesn't earn mine.

It's just wonderful how you attribute all the bugs that others run into as "not nVidia problems", "random", etc.

List the game, list the driver. It is that simple. I see posts from people who have issues that others can't reproduce; to me that indicates the problem lies elsewhere. I'll give you an example: install the Catalyst 3.2 drivers and start any HL powered game running OpenGL. Play for any amount of time and hit Esc. Alternatively, fire up Sacrifice and simply look around using any ATi Cat drivers from the launch of the R300 core boards up until the Cat 3.2s (at least; I got rid of my board so haven't tested the latest drivers). Another example: the original No One Lives Forever, Catalyst version 3.1 or 3.2, extreme levels of input latency playing the game no matter what framerate or setting.

List something comparable for nVidia, name the game name the driver and I'll check it out. I'm not talking about random issues that only I have come across, I'm talking about issues that impact all the people using the boards/drivers.

Just stick with the "nVidia logo" games, and you'll be fine.

I paid my money for an R300 core board; have you purchased an FX? I'm not talking in a hypothetical BS sense here, I put my money out and then had numerous complaints with the board. I didn't need to listen to second and third hand information about supposed issues, I dealt with them first hand.

And right now, the indication is, unless there are specialized paths / hacks for the NV3x, shader performance is poor compared to R3x0. And if the game gets those hacks, the quality won't be as good.

And in doing as much it is assumed that the games that use shaders will be limited by their performance above all else.

I would word that differently. If I was working at nVidia, I would trust that B3D would try its best to find the real truth, so I would trust that they might not come up with the "best light" kind of review.

What 'real truth' exactly?

So, Ben, which 3D graphics site do YOU PERSONALLY trust most for truthful analysis?

None, and that includes B3D. To cover every single aspect of a graphics card you would need several thousand pages worth of review at the very least. Because this is not reasonable, sites are forced to cut things down to cover the things they think are most important. To me, those are drivers first and foremost, followed by IQ and performance.

Besides the driver issues, I don't trust their analysis on things such as IQ. The latter happens to be due to their emphasis on AA over AF (which has been a constant here, btw). To me, proper texture filtering is a lot more important than having perfect edge AA. I would take a full, proper, 64x AF implementation over 16x MSAA of any type. This does not agree with the preference that B3D has.

On the driver front, this site focuses on 3D tech and stays away from gaming related issues. As of right now, the only reason I look at 3D vid card reviews is to observe potential benefits to my gaming.

For performance, they look to the 'generic' performance of the board, testing its theoretical specs, which is the right way for a tech based review. Take SplinterCell being used as a bench. They level off the playing field to utilize the bench without really getting into nV's superior visuals in the game. In a tech based sense, the only reason nV has this edge is that the developers exploited features not supported directly by DirectX due to the game's XBox roots. If I was convinced that SC would be the last XBox port the PC would see, I wouldn't consider this too much, but it isn't. "The way it's meant to be played" is actually a viable claim for this particular game: only running NV hardware can you see the game as it was intended. Focusing on the tech end of the spectrum, this is a non issue as it isn't a truly legit utilization of DX, and it isn't something that is rampant in the PC space. As a gamer, when one of the best titles released in the last six months looks best on a particular piece of hardware due to additional implementations, that is something I do care about. The same would be the case with ATi products. If there were a title that offered visual enhancements only available on ATi boards, I would want to know about it. Is B3D going to cover this? No.

For the non tech sites, most of them don't have a clue what they are doing and horribly screw up everything. I put more faith into B3D's reviews than I do any other site's, but I do not trust that what they find an excellent product will be so much as tolerable for myself.

Static acceleration the standard? Where did that come from? Static T&L is used, but to what extent over CPU T&L?

Could you list the games that have come out in the last year that don't use static hard T&L?

And given that nVidia was first out with both static T&L and vertex shading....how is this "against nVidia?"

It was used as an equalizer for the Voodoo5.

What is this "old stance" you are talking about? Again, their stance on "static T&L"? As if that's an nVidia specific feature? As if nVidia also wasn't the first part out that had DX8 vertex shaders?

Using it to prop up another part is where the issue comes in. They also mistakenly assumed that developers were going to skip static T&L completely and jump from software to pure VS. Nothing was ever given as to why they assumed this would happen except that they didn't like the technology. Is that what you are talking about in terms of 'honesty'?

In the years I've been reading here B3D has an excellent record of seeing where hardware is going and a fairly horrible record of seeing where games are going. As such in the past they have weighted the importance of each feature based on where they thought the gaming market was headed, even if they were far off the mark. At least now they are using DirectX as a guideline, even if that in itself fails to truly indicate where games are headed.

You can try and spin it into a conspiracy theory if you want; the history stands that B3D has come out with support for features that have failed to be utilized by developers (T-Buffer, the only real use for it was FSAA despite the nigh-PR article that B3D posted) while denouncing those that are still in use (static T&L). nVidia takes one direction and this site stands staunchly behind the industry going in another direction, where there is a rift at nearly every chance.

It isn't about honesty. Look at nV's pixel pipe configuration. Yes, I know their PR is full of shit, but forget that for a moment. With Carmack's direction with Doom3 and the prediction that that is where the industry is headed, why so much focus on the PS side of things when the game utilizes hardly any advanced shaders, while ignoring the stencil fill requirements such a direction would require? If B3D were to focus on the fact that nV has nigh 'free' stencil fill available to it, would it be dishonest? It would be a different focus, one that is friendly to nVidia. If they did do this, and I was working for ATi, I would be upset that they weren't focusing on PS performance more and relatively ignoring stencil fill issues. It isn't about them calling it as they see it, it is how they see it looking at where the market is headed.

Again, I ask you, what 3D site do YOU trust most for an honest assessment of the hardware, vs. just spitting out marketing crap?

None of them do I trust completely nor will I. If there was a site like B3D that focused more on gaming and attempted to paint all new features as equally important, then I would probably be the most inclined to trust it. Listening to sites like B3D is the reason why I have gripes with ATi right now. Do I think they were 'dishonest'? No. Doesn't change that the impression I got from reading their reviews was not what I needed to know about the product.
 
BenSkywalker said:
Weren't you here back then? I know you weren't working for the site, but I thought you were still around. I also knew that no matter how I stated that part, it would be taken poorly.

Not really. I was around, but not to any sort of degree I am now.

I don’t want to get involved in a point by point rebuttal of things that occurred before my tenure, and having to search through the articles. The point I’m making is that article wise, which is what I consider to be the site’s statement, I don’t really remember some of the spin you are putting on it – yes, there were numerous articles explaining the 22bit filter, but did they also state that 32bit wasn’t necessary? Not that I remember. I can’t think of once where we’ve stated in an article “48bit and 64bit aren't enough”, nor can I remember us stating that Cube maps and Dot3 weren’t useful.

T&L was a clear standpoint, right or wrong; however, Kristof’s subsequent comments in the forum over the idea of VS probably should have been articulated in an article IMO. But the fact you mention this also kinda negates your assertion that the standpoint then was always for what games use now, doesn’t it?

In your later post you mention the T-Buffer – this was one of the few sites to actually explain it in depth, and this is a technology site and that was a viable piece of technology. The uptake of it may have been very different had 3dfx not been in the state they were then and now.

As for fairness, get this: in the run-up to NV30’s release we have dedicated no less than 4 articles (Zephyr’s tech comparison, the launch details and features overview, and two interviews) even before we came to the preview (as well as countless news posts and spoon fed PR “Sneaky Peeks”). So far, we’ve done none for ATI’s technology, other than the reviews themselves. On the face of it that sounds as though we owe a few more column inches to ATI’s technology, doesn’t it? Especially given the disparity between expectations and actual delivery between their respective parts over the past 9 months.

BenSkywalker said:
Look at the forums and the comments from the people here (including those who pen for the site). PS 2.0 is an obsession on these boards at the moment. With the AF and noise issue resolved, all effort seems to have moved to criticizing the weak point that the NV3X has left and, of course, anything that could be considered positive is swept under the rug. You may like to think that the forums aren't going to impact an IHV's view of a site, but with the number of other sites that quote discussions here, where they see comments that the proper members of B3D make, it will have an impact.

Ben, you cannot paint the comments of the forums as the statement of Beyond3D. The Forums are still relatively insular; it’s the articles and reviews that the unwashed masses read and take note of, and I have the stats to prove that.

So what if the people in the forum are talking about PS2.0 performance – why shouldn’t they? And quite a lot of that is coming from the fact that people are developing tests and applications that measure PS2.0 performance, much as they did with PS1.x, so clearly there is a desire among developers to understand and test the performance and, well, why shouldn’t people want to talk about it?

My personal involvement in the forums has decreased over the past few months as well, basically because I just can't keep up with the number of topics and posts these days.

BenSkywalker said:
Take SplinterCell being used as a bench. They level off the playing field to utilize the bench without really getting into nV's superior visuals in the game...
...The same would be the case with ATi products. If there is a title that offered visual enhancements only available on ATi boards then I would want to know about it. Is B3D going to cover this? No.

The fact of the matter is that I’m currently running a 5200 review and comparing it against an MX. Clearly there are several elements of difference between an MX and the 5200 – the pixel shaders and the shadowing being two. I’ve run the tests in both as close to the MX quality as you can get with the 5200, and I’ve also run it in the full buffer shadow mode, and taken screenshots of all modes to highlight the difference in both rendering performance and relative quality.

When there is a title that has different rendering elements from one board to the next we will talk about it and mention the lot. If we ever get round to the high end shootout comparison and use SC then we will clearly mark all the different rendering elements and what they can do for all boards.

BenSkywalker said:
You can try and spin it into a conspiracy theory if you want, the history stands that B3D has come out with support for features that have failed to be utilized by developers (T-Buffer, the only real use for it was FSAA despite the nigh-PR article that B3D posted) while denouncing those that are still in use (static T&L).

Beyond3D’s content covers lots of technology, as it should, and has really only taken a stand on one thing. As I pointed out earlier, content wise, we’ve given a disproportionate amount of coverage to a product that sold in the few thousands, if that.

BenSkywalker said:
Yes, I know their PR is full of shit, but forget that for a moment. With Carmack's direction with Doom3 and the prediction that that is where the industry is headed, why so much focus on the PS side of things when the game utilizes hardly any advanced shaders while ignoring the stencil fill requirements such a direction would require?

Actually, who is predicting that Doom3 is the direction that everything is heading? Personally I would say that the industry is heading down a much more shader oriented route, which Doom3 basically isn’t.

RussSchultz said:
The apparent flip flop between forward looking features that are indispensable, and ones that aren't useful because they're too slow for now. He brings up a few examples, though poorly worded and densely formatted, making them difficult to discern.
-Beyond3d was against T&L when it was V5 vs. GF256. The article was very slanted; I also remember some of the information being just plain wrong/misleading, but very vigorously defended. A bit of 'drinking the 3dfx Kool-Aid', as it were. (downplaying advantage of NVIDIA.)
-But, on the other hand, here we are advocating PS/VS2.0 when it is ATI vs. NV. Definitely not drinking the NVIDIA Kool-Aid. (emphasising disadvantage of NVIDIA.)
-I don't remember this exactly, though he's suggesting that DOT3, etc were poo-poo'd or downplayed because the V5 didn't have them, not many games used it, and the rampage super duper texture computer was about to come out. Again, drinking the 3dfx Kool-Aid (downplaying advantage of NVIDIA.)
-But PS1.4 (which enabled dependent texture lookups in 1 pass) was very important. (I'm not sure this was a typical Beyond3d stance, and I'm not sure I've got the dependent texture thing right)
-32 bits color was not important because it was too slow, etc. (Downplaying advantage)
-48 bit or 64 bit color isn't enough. We need, coincidentally, 96 bit color, but we don't need 128 bit color. (Emphasising disadvantage)

Again, Russ, other than the T&L article, I do not see where as a site we state many of the things mentioned here – where are they?
 
RussSchultz said:
Ill weigh in and state that I think Ben does have at least a few valid points.

Of course you do...because there is also an "undeniable pattern" to your own posts. I was wondering when you would enter the discussion on "Ben's" side. ;) And yet, despite this undeniable pattern of posts, you continually assert your "non-biased" approach to ATI and nVidia, right?

There is only ONE way to reconcile those two facts.

That is, your honest assessment of the situation happens to fall in line with one IHV's point of view / architecture.

So what I'm trying to say is, are you accusing B3D of having unfairly biased articles, or articles where their honest take of the situation just doesn't particularly favor nVidia?

The apparent flip flop between forward looking features that are indispensable, and ones that aren't useful because they're too slow for now.

I still don't see any flip-flop of issues here. IIRC, B3D got criticized for not having enough PS 2.0 investigation with the nVidia previews.
The only nVidia products that have been criticized for PS 2.0 performance not being "good enough to be useful" (though not yet by B3D, because we haven't seen their review on it) are the FX 5200 products.

NV3x shader performance is being criticized from a different perspective: not being as fast as R300, and not being as fast as we were led to believe from nVidia PR. This is WHOLLY different than the T&L situation.

-Beyond3d was against T&L when it was V5 vs. GF256. The article was very slanted;

The article was objective.

I also remember some of the information being just plain wrong/misleading, but very vigorously defended. A bit of 'drinking the 3dfx Kool-Aid', as it were. (downplaying advantage of NVIDIA.)

I saw it as just the opposite. EVERYONE ELSE was "drinking the nVidia Kool-Aid", and repeating BS about T&L X-Mas '99 and such.

To this day, what has "static T&L" done for gaming? Almost 4 years later...

-But, on the other hand, here we are advocating PS/VS2.0 when it is ATI vs. NV. Definitely not drinking the NVIDIA Kool-Aid. (emphasising disadvantage of NVIDIA.)

Again....completely different. First of all, ATI and NV both have PS 2.0 support. The argument isn't over "will it be fast enough to be useful", it's simply "who is fastest, regardless." Are you saying that argument shouldn't exist?

-I don't remember this exactly, though he's suggesting that DOT3, etc were poo-poo'd or downplayed because the V5 didn't have them, not many games used it, and the rampage super duper texture computer was about to come out. Again, drinking the 3dfx Kool-Aid (downplaying advantage of NVIDIA.)

I don't recall that at all. :rolleyes:

-But PS1.4 (which enabled dependent texture lookups in 1 pass) was very important. (I'm not sure this was a typical Beyond3d stance, and I'm not sure I've got the dependent texture thing right)

Again, links would be nice. :rolleyes:

-32 bits color was not important because it was too slow, etc. (Downplaying advantage)

Source? :rolleyes:

-48 bit or 64 bit color isn't enough. We need, coincidentally, 96 bit color, but we don't need 128 bit color. (Emphasising disadvantage)

Source? :rolleyes:

What I see here is a lot of inferring things that B3D did not say, or at the very least stating them in a way that's not representative of the actual view.

What if I say "DirectX and OpenGL 'need' 96 bit color, not 32 bit"? (Which is simply an observation of fact.) If the "facts" emphasize an nVidia disadvantage, what is B3D supposed to do about it?
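For context on where the "96 bit" figure comes from: DX9's minimum full-precision pixel shader format is a 24-bit float per RGBA channel, and 4 x 24 = 96 bits, while NV3x's fast path is 16-bit floats. A rough sketch of what each mantissa width buys per channel (the s10e5/s16e7/s23e8 layouts are the conventional descriptions, used here purely for illustration):

```python
# Smallest step at 1.0 (one ulp) for each shader float format's mantissa
# width. FP16 is the NV3x fast path, FP24 is the DX9 minimum for full
# precision (4 channels x 24 bits = 96-bit color), FP32 is IEEE single.
formats = {"FP16 (s10e5)": 10, "FP24 (s16e7)": 16, "FP32 (s23e8)": 23}

for name, mantissa_bits in formats.items():
    ulp = 2.0 ** -mantissa_bits
    print(f"{name}: ~{ulp:.1e} per channel")
```

The gap between 2^-10 and 2^-16 is what shows up as banding once several shader operations accumulate error, which is why the per-channel width matters more than the headline bit count.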

Not that I agree with everything he says, but there is a case that Beyond3d seems to adopt stances that are "against NVIDIA", either lambasting them for not having a feature or for being slower, or downplaying the feature advantage they have as unimportant for now. I can see how the history of such would be unhelpful to garnering the trust of a company.

I can see it as being unhelpful to gaining the trust of a company's PR, sure. If I didn't think the site would parrot my PR, I would be reluctant to put them on the list, too.

There is an undeniable pattern. You can come up with reasonable arguments for each of the stances (except the T&L one, in my opinion),

IMO, they have more than a reasonable stance for everything.

but you can't deny there is a pattern--and it seems to be, on the surface, inconsistent.

Pattern? Perhaps. Inconsistent? No.

Of course, they also probably don't want to talk to you because you're in like Flynn with Futuremark, who is their major pain in the behind right now.

Right, because Futuremark isn't trustworthy? Right, "major pain" defined as "not showing NV in the best light via removing cheat detection." Another reason to not trust nVidia.

I don't suggest you kiss ass to become a tier 1, but perhaps a bit of navel contemplation over the past might be useful and enlightening.

Enlightening indeed. Moral of the story: don't cave to PR. Do what you think is right, and let the chips fall where they may.
 
BenSkywalker said:
This is about having functioning drivers. If I was using, say, an Asus K7M mobo and it couldn't function properly in AGP 2X because the motherboard has known line noise issues, I would be pretty foolish to blame it on a vid board maker wouldn't I?

Pick a side and stay with it. Are we talking experience out of the box, or not?

Would you buy a new motherboard in this case, or get a different video card?

And that means nVidia doesn't earn your trust. I am saying that ATi doesn't earn mine.

Fine.

List the game, list the driver. It is that simple.

Don't all the new dets fubar AA on DX7 and earlier titles?

I paid my money for a R300 core board, have you purchased a FX?

No, because my 8500 is still OK for my rig. No driver problems with it either.

I'm not talking in a hypothetical BS sense here, I put my money out and then had numerous complaints with the board. I didn't need to listen to second and third hand information about supposed issues, I dealt with them first hand.

Great, then, as you did, return the card. I would have returned my 8500 if they hadn't gotten smoothvision working when they did.

And in doing as much it is assumed that the games that use shaders will be limited by their performance above all else.

No assumptions are made. I don't assume any type of absolute performance. My assumption is: if special coding isn't done for FX, it CAN make a big difference. No need to worry about "special coding" for R3xx.

So, Ben, which 3D graphics site do YOU PERSONALLY trust most for truthful analysis?

None, and that includes B3D. To cover every single aspect of a graphics card you would need several thousand pages worth of review at the very least. Because this is not reasonable, sites are forced to cut things down to cover the things they think are most important. To me, those are drivers first and foremost, followed by IQ and performance.

What SINGLE SOURCE would you trust most. I understand your need to check several sources. If you had to choose ONE, which would it be?

Besides the driver issues, I don't trust their analysis on things such as IQ. The latter happens to be due to their emphasis on AA over AF (which has been a constant here, btw). To me, proper texture filtering is a lot more important than having perfect edge AA. I would take a full, proper, 64x AF implementation over 16x MSAA of any type. This does not agree with the preference that B3D has.

I just don't see where this "B3D preference" is dreamed up. Does B3D, or does it NOT, give you all the information about AA and Aniso so that, no matter what your preference is, you can make a choice?

On the driver front, this site focuses on 3D tech and stays away from gaming related issues. As of right now, the only reason I look at 3D vid card reviews is to observe potential benefits to my gaming.

Like every other site, their definition of "driver goodness" is typically "did it run my specific set of benchmarks without errors?"

....The same would be the case with ATi products. If there is a title that offered visual enhancements only available on ATi boards then I would want to know about it. Is B3D going to cover this? No.

See Dave's response.

For the non tech sites, most of them don't have a clue what they are doing and horribly screw up everything. I put more faith into B3D's reviews than I do any other site's, but I do not trust that what they find an excellent product will be so much as tolerable for myself.

That's fine. The question is, if you put more faith in B3D's reviews, why don't you demand the same from nVidia? That THEY put more faith in their reviews.

Could you list the games that have come out in the last year that don't use static hard T&L?

Could you list the games that use Static Hardware T&L, where software T&L doesn't provide nearly the same results?

It was used as an equalizer for the Voodoo5.

It was used as an equalizer to T&L x-mas '99, if anything.

Using it to prop up another part is where the issue comes in. They also mistakenly assumed that developers were going to skip static T&L completely and jump from software to pure VS. Nothing was ever given as to why they assumed this would happen except that they didn't like the technology. Is that what you are talking about in terms of 'honesty'?

Yup. And I still to this day don't see any talk about DX7 hardware T&L for anything other than "professional 3D apps". It's all about vertex shaders for games.
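To make the software-vs-hardware T&L point concrete: fixed-function ("static") T&L is a per-vertex matrix transform plus a fixed lighting equation, math a CPU can run just as correctly as a dedicated unit; the hardware only buys throughput. A minimal illustrative sketch (the values and function names are mine, not from any real driver or game):

```python
# What fixed-function T&L computes per vertex: a 4x4 model-view-projection
# transform, then a simple clamped N.L diffuse term. The same arithmetic
# runs fine on a CPU; dedicated T&L hardware just does it faster.

def transform_vertex(mvp, v):
    """Multiply a row-major 4x4 matrix by a 4-component vertex."""
    return [sum(mvp[r][c] * v[c] for c in range(4)) for r in range(4)]

def diffuse(normal, light_dir):
    """Fixed-function style N.L diffuse term, clamped to [0, 1]."""
    return min(1.0, max(0.0, sum(n * l for n, l in zip(normal, light_dir))))

identity = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]
print(transform_vertex(identity, [1.0, 2.0, 3.0, 1.0]))
print(diffuse([0.0, 1.0, 0.0], [0.0, 1.0, 0.0]))
```

Vertex shaders replace exactly this fixed math with an arbitrary per-vertex program, which is why the jump from static T&L to VS is a change in flexibility rather than in what the fixed path could already express.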

Where does the Kyro fall behind GeForce, for example, in terms of rendering performance / quality?

You can try and spin it into a conspiracy theory if you want, the history stands that B3D has come out with support for features...

See Dave's response. Where do you see B3D advocating "support" for features, rather than supplying technical explanations for them? Specific links please.
 
Yeah, missed that one - where is this FSAA imbalance? In the first look reviews we dedicate up to as much as 3 pages of performance and IQ analysis of both filtering and AA. We're one of the first to routinely adopt the aniso filter pattern app as well, which clearly highlights some of the architectural differences between the main two vendors at the moment.
 
Perhaps it was Dave Barron's posts that overly influenced my recollection. He, as a voice for Beyond3d, was definitely steeped in 3dfx Kool-Aid.

So what I'm trying to say is, are you accusing B3D of having unfairly biased articles, or articles where their honest take of the situation just doesn't particularly favor nVidia?

No, I'm stating there is a pattern of statements from "beyond3d". It just so happens the pattern is generally minimizing one particular vendor and to me sometimes appears capricious in its reasoning.

But my opinion is wrong because I'm a die hard nvidiot. :rolleyes:. Do we always have to tread that water, Joe? Ignoring the fact that I've had a Rendition, Matrox, S3, 3dfx, 3dfx, NVIDIA, NVIDIA, ATI card in my box, maybe you just think this because the only time you and I get in these stupid discussions is when we disagree--and since you're always in the anti-nvidia camp, every time we disagree I'm not (and therefore in the 'nvidia camp')? This completely ignores the times our opinions are concordant or non-intersecting.
 
RussSchultz said:
Perhaps it was Dave Barron's posts that overly influenced my recollection. He, as a voice for Beyond3d, was definitely steeped in 3dfx Kool-Aid.

I think it's obvious that it was Ben's posts that overly influenced your recollection.

No, I'm stating there is a pattern of statements from "beyond3d". It just so happens the pattern is generally minimizing one particular vendor and to me sometimes appears capricious in its reasoning.

Cop-out response. Define "just-so-happens."

1) Pure luck
2) B3D bias against nVidia
3) The facts themselves generally minimize one particular vendor.

Again, this is assuming there is a history of "minimizing" nVidia on this site, which I disagree with in the first place.

But my opinion is wrong because I'm a die hard nvidiot. :rolleyes:.

:rolleyes:

I did not say that. Good to see you're still putting words in my mouth.

I said that what you are saying "happens" at B3D can certainly be applied to your own behaviour.

1) Your posts minimize nVidia "faults" and maximize ATI ones.
2) You claim to not be biased
3) Your explanation for this, to be consistent with your explanation of B3D: it "just so happens" that "your statements" minimize ATI, and maximize nVidia.

In other words: you accept that your own behavior is OK (I presume), but then you get irritated with others at the whiff of being portrayed as an nVidiot?

If this upsets you, "perhaps a bit of navel contemplation over the past might be useful and enlightening", eh?

Ignoring the fact that I've had a Rendition, matrox, S3, 3dfx, 3dfx, NVIDIA, NVIDIA, ATI card in my box....

Just like ignoring the fact that B3D reviews ATI, nVidia, and other vendor products and technologies?

maybe you just think this because the only time you and I get in these stupid discussions is when we disagree--and since you're always in the anti-nvidia camp, every time we disagree I'm not?

Apply this line of reasoning to B3D, Russ.
 
Sigh. I'm not saying Beyond3d IS biased, or that I necessarily think that they are. I'm saying there is at least a somewhat reasonable case for somebody to come to that conclusion.

But, by definition, if that isn't the conclusion that you come to, it cannot be reasonable. That is the one constant I've gotten from all of our discussions. There is no room for dissenting opinion in your mindset--everybody who disagrees with you is simply wrong and you're more than willing to tell them how they're wrong ad nauseam.

And this makes it very tedious to share opinions on this board, for fear of being dragged into another multipage back and forth as you dissect each statement to invalidate an opinion, or drag in other discussions or poke at old disagreements.

Ever notice you always get the last word? It generally isn't because you've convinced anybody to accept your superior viewpoint.

Gah. I've gone completely OT now. Must be the temperature outside.
 
RussSchultz said:
Sigh. I'm not saying Beyond3d IS biased, or that I necessarily think that they are. I'm saying there is at least a somewhat reasonable case for somebody to come to that conclusion.

Sure, and to keep off topic ;), that's what many people say about you with respect to nVidia. That was one of my points in the last post.

Now, I personally think that the case for B3D being biased against nVidia is a stretch. I still haven't seen the links to the articles / statements in question that show the evidence that is "said" to be there. I see nVidia biased against B3D, because B3D doesn't toe the company line, and nVidia is "afraid" of this.

But, by definition, if that isn't the conclusion that you come to, it cannot be reasonable. That is the one constant I've gotten from all of our discussions. There is no room for dissenting opinion in your mindset--everybody who disagrees with you is simply wrong and you're more than willing to tell them how they're wrong ad nauseam.

No, you're wrong. :D

Seriously, you're telling me that if "that isn't the conclusion that I come to, then it can't be reasonable", and you're telling me I have no room for dissenting opinion?

And this makes it very tedious to share opinions on this board, for fear of being dragged into another multipage back and forth as you dissect each statement to invalidate an opinion, or drag in other discussions or poke at old disagreements.

You can validate your opinion, if you would answer the questions that are thrown back at you...like providing the exact links to all those examples "where B3D is being minimal to nVidia".

Instead, we get "This is how I remember it", or worse, "This is how Ben remembered it, so I guess that's good enough."

Ever notice you always get the last word?

No, I don't. However, in the "back and forth" articles, I usually keep pressing to have the questions answered, and if my "opponent" continues to dodge them, I will repeat them either until they are answered, or until "I get the last word in" and he just gives up, making it obvious that he has no answer.

It generally isn't because you've convinced anybody to accept your superior viewpoint.

No, it's to reject an opposing viewpoint that is based on a lack of evidence.

Gah. I've gone completely OT now. Must be the temperature outside.

It's the rain over here...
 
You know,

I find the way this thread went (discussion of B3D "trustworthiness", as opposed to nVidia "trustworthiness") pretty worrisome.

Same tactics that nVidia used. "Don't question the trustworthiness of our drivers....question that of 3DMark instead!"

It seems I was under the false(?) assumption that most of the regulars here including Ben and Russ "trusted" B3D more than any other single source for getting the scoop on hardware.

But since I've linked nVidia trust to B3D (nVidia not giving B3D tier 1 status), the argument has shifted. And in the 3DMark case, nVidia seems to have succeeded for the most part in casting doubt on 3DMark, and shifting attention away from themselves.

I'd rather not have this happen here.

So perhaps Ben and Russ can answer the original question?

How can nVidia gain your trust?

If nVidia already has it (they don't have to do anything), then just say so.

I summed up my answer with the following: it would help greatly if they would make B3D a tier 1 media / review outlet.
 
How can they gain my trust?

By putting out quality hardware and not getting caught with their hand in the cookie jar. Presumably by not eating cookies. They'd get more trustworthy over time--by continuing to put out quality hardware and by continuing to keep their hand out of the cookie jar.

I also never took the tack of "don't question NVIDIA, question 3dmark". I don't know who did. I do, however, have misgivings that 3dmark is not catching all of the cheating.

How would 3d mark gain my trust?
By continuing to ferret out cheaters, and by moving to a benchmark scheme that is (in my opinion) less easily targeted. Even then, with a benchmark so widely regarded, I'll always be suspicious that they're not catching all of the cheating.

And I certainly was not trying to divert attention from the question by impugning this website's integrity. I do, however, stand by my previous statements. A trip down memory lane (the T-Buffer article and the T&L article, as two examples) shows at least some reason for NVIDIA to not be complete pals with Beyond3D. If the site's conclusions have tended to go against their technology decisions, and tended toward their competitors', it only stands to reason they'll look elsewhere for a primary outlet.
 
Dave-

First off, I am very impressed and very pleased to hear that you are going to include the differences in rendering options between the FX and MX boards, that is extremely good news to hear IMO :)

For the rest of your issues, it mainly seems that you separate the forums, and the posts by the people that write for the site, from the main site. Take the AA v AF issue, which do you consider more important? I understand that the hits the forum gets are not comparable to the main site's, which is always the case. I'm saying that the people @nV have certainly read the forums and the comments by the people who do write for the site. As far as 48 and 64bit not being enough, look to the comments by those who do write for the site talking about the shader demos that have been written to try and show as much. It took months for people to figure out a good way to display this, yet now it is used and discussed extensively.

Actually, who is predicting that Doom3 is the direction that everything is heading? Personally I would say that the industry is heading down a much more shader oriented route, which Doom3 basically isn’t.

You're betting/thinking on shaders being the direction, which is what I was saying previously. This particular thought on your part is more friendly towards ATi's direction with their part than nV's.

Joe-

Pick a side and stay with it. Are we talking experience out of the box, or not?

Would you buy a new motherboard in this case, or get a different video card?

My side isn't changing, I'm not going to blame a mobo for a vid card problem. In that case (mobo/vid card don't get along) I have already purchased a new mobo in the past, multiple times actually. I'll build a rig around a video card without hesitation. Hell, right now I have a motherboard that even matches the color of the PCB used for the R9500Pro, which means I'm stuck buying Gainward when I buy a DX9 board now (I likely would anyway, their 2D is significantly better than the rest).

Don't all the new dets fubar AA on DX7 and earlier titles?

The only DX7 title I've been playing much of is Sacrifice lately, and it does work there but I will check other older titles and get back to you.

I just don't see where this "B3D preference" is dreamed up. Does B3D or does it NOT give you all the information about AA and Aniso so that, no matter what your preference is, you can make a choice?

No. The R300 core board introduces a considerable amount of texture aliasing when utilizing AF.

Could you list the games that use Static Hardware T&L, where software T&L doesn't provide nearly the same results?

Sacrifice- geometric LOD flips out with soft T&L
Giants- Performance tanks when enabling shadows using soft T&L
Mafia- Sizeable performance hit
MDK2- Inferior lighting and a sizeable performance hit, or a major performance hit

Anyone know how to force-disable hard T&L? I'll gladly check more games out. Of the games I own that have the option, every one shows noticeable differences.

Where does the Kyro fall behind GeForce, for example, in terms of rendering performance / quality?

One good example was Giants enabling shadows. For the GF series of boards the performance hit was quite minor, a percentage point or so. For the Kyro2 there was a rather massive hit in framerate, in line with enabling 2x AA IIRC. I was running a Kyro2 for a decent amount of time and ran a couple thousand benches with it.

See Dave's response. Where do you see B3D advocating "support" for features, rather than supplying technical explanations for them? Specific links please.

What features they chose to focus on is what is at issue. Check out the T-Buffer article, the 22-bit post-filter article, the various AA articles, etc.

Again I will state, I trust B3D more than any other review site. What I don't trust, and what has been proven numerous times, is that what they find to be an excellent product will be suitable for me. I don't trust ATi, or nVidia for that matter, either. Out of all the IHVs, I have had the best luck in terms of products matching my expectations with nVidia over the years, so I do tend to trust them quite a bit more than ATi when it comes to making a purchase. That said, if BitBoys were to release a gfx board tomorrow that was on par with the others and showed the slightest edge in what matters to me, I wouldn't have too much of an issue picking one up (driver issues would be my only hesitation, only because they are not proven yet).
 
Ben said:
On the driver front, this site focuses on 3D tech and stays away from gaming related issues. As of right now, the only reason I look at 3D vid card reviews is to observe potential benefits to my gaming.
That may appear to be the direction of B3D to you at the moment but I'd have to say that I haven't been doing a lot of reviews since Dave took over and Dave's interest and priority is slightly different than mine. That should now change, so I hope that my reviews will provide a nice balance to the site's entire review content.

As for the "Beyond3D is anti-NVIDIA" -- B3D is not anti-NVIDIA... that assumes that we are against NVIDIA as a company as a whole, which is most definitely not true... it's just that B3D is anti-NVIDIA-PR. We are very leery of PR statements from any IHV (and that should be understandable)... and it so happens that during those days (and even now) we have received far more NVIDIA PR announcements/statements than from other IHVs. I don't believe any of our NVIDIA product reviews thus far have had any undeserved criticisms.
 
As for the "Beyond3D is anti-NVIDIA" -- B3D is not anti-NVIDIA... that assumes that we are against NVIDIA as a company as a whole, which is most definitely not true... it's just that B3D is anti-NVIDIA-PR. We are very leery of PR statements from any IHV (and that should be understandable)... and it so happens that during those days (and even now) we have received far more NVIDIA PR announcements/statements than from other IHVs. I don't believe any of our NVIDIA product reviews thus far have had any undeserved criticisms.

EXACTLY!
My opinion as well.
 
The amount of converse logical fallacy is getting a bit much, don't you think?

BenSkywalker said:
DaveBaumann said:
Actually, who is predicting that Doom3 is the direction that everything is heading? Personally I would say that the industry is heading down a much more shader oriented route, which Doom3 basically isn’t.
You're betting/thinking on shaders being the direction, which is what I was saying previously. This particular thought on your part is more friendly towards ATi's direction with their part than nV's.

Ben,
Thinking that shaders are the way the industry is heading isn't equivalent to being more friendly to one IHV's hardware, it is an observation based on many other factors demonstrated by the industry itself, and has a justification completely aside from "friendliness" to ATI. :oops:

What you are doing is taking the objectively determinable observation that "one IHV's approach is more successful with shaders" and constructing a relationship of "IHV friendliness" for Beyond3D, ignoring completely the relevance of any of the indications in the rest of the 3D industry's focus on shaders.

What does nVidia's CineFX push? What new features did the NV3x add to the NV25?
What are a large proportion of 3D technology papers and talks discussing and analyzing?
What is Direct 3D and OpenGL evolution focused upon?
What did nVidia leverage as a selling point, even before used to express unique functionality in games?

Do you recognize that shaders are the emphasis for all of these? If so, why is Wavey's comment wrong at all, or even "IHV friendly"? You're doing the equivalent of stating that the "industry is more friendly towards ATi's direction with their part than nV's" by the way you're criticizing Wavey and Beyond3D.

That's 100% pure blame reassignment, and you do it so consistently it's as if you're hardwired in your inability to simply assign blame to nVidia for being "less friendly towards shaders and the direction of the 3D industry". All of the expectations you've expressed for B3D are based on the fallacy of nVidia's viewpoint being the frame of reference, when it should be what the 3D industry is actually indicating. They are not synonymous, nor is B3D a nVidia website: it is a 3D industry website.

Objectivity is impossible when trying to satisfy your dictate for not being "more friendly towards ATI", simply because ATI can objectively be established as more successful with shaders. :-? Your statements insist that Beyond3D can't recognize this without being "friendly" with them. Since you don't trust any company, could you explain the objective criteria for ATI friendliness you are using, and how they aren't based on ATI simply being successful with shaders?

AFAICS, shaders being important is not an ATI "friendly" stance. The "friendliness" in the existing situation is from ATI: to the 3D industry and, hopefully, to their bottom line, by excelling at shaders at a time when widespread adoption and utilization of them is emerging.

It seems likely that the NV40 will, in the near future, excel at the possibilities of VS/PS 3.0 over VS/PS 2.0, whatever those turn out to be, and will possibly be more "friendly" to the industry than the competition until ATI supports it as well. At some point in the past, the GF3 was the first chip to have an exposed pixel shader model, and nVidia was definitely more "friendly" towards this until the 8500 was released.

But that is past and future, and there are factors independent of IHV friendliness and desires pertinent to the evaluation of those situations that your labelling wouldn't consider. There are also independent factors right now, between those two times, that are pertinent to the evaluation of graphics cards, and the only remotely objective approach available is to focus on fairly representing them, regardless of how it makes IHVs look...something you seem to be complaining about. There is no objective solution that can satisfy your expectations until nVidia more successfully competes with their products, there is only ignoring factors that might be inconvenient.

Speaking again of my distrust for nVidia: in demonstrated practice, they consider misrepresentation of facts as being equally valid as achieving something in actuality. I consider the approach to 3dmark 03 330 as the clearest illustration of this in recent memory.
I don't trust that this will suddenly change given their behavior in the past, and I won't trust them any more when and if objective observations are more in line with their desire to be perceived as "number one" in some way. What I'll be trusting, when and if that occurs, is those objective evaluations that fairly and thoroughly determine that the objectivity and nVidia's statements are actually convergent. What I view as B3D's focus is achieving that, and what you seem to be proposing is removing that significance for the sake of not being labelled by you as "friendly" to an IHV you dislike more than another.
 
No. The R300 core board introduces a considerable amount of texture aliasing when utilizing AF.

It does?
Maybe it's just me, but I found the R300 to have less texture aliasing with AF than my NV20, with both cards at 8xAF + trilinear. 16xAF just tends to make the image cleaner.
 
BenSkywalker said:
As far as 48 and 64bit not being enough, look to the comments by those who do write for the site talking about the shader demos that have been written to try and show as much. It took months for people to figure out a good way to display this, yet now it is used and discussed extensively.

The ongoing hoopla on the forums over Nvidia's drivers and shader precision has nothing to do with whether FX12 or FP16 "are enough". There aren't any games yet which utilize the extra precision of PS 2.0 (or OpenGL equivalent), so it would be premature to argue this question in depth. Luckily, no one is doing that in much volume, at least not on the B3D forums.

Rather, the discussion is about compliance. Whatever amount of precision "is enough" (and of course it's a meaningless question, since the answer is completely different depending on the shader in question), the fact is that PS 2.0 and ARB_fragment_program both specify FP24 as the minimum default precision. Several of Nvidia's recent drivers have been established to default to FP16 when running PS 2.0 programs (without the partial precision hint). This is in violation of the spec. Period. The issue isn't whether those shaders, or any other shaders, look "good enough" in FP16, FX12, or whatever; the issue is compliance. The issue isn't whether it takes a specially designed shader to even tell which datatype the drivers are using (and of course, the reason to use a synthetic shader to determine the datatype isn't because the datatype has no bearing on image quality in a more typical shader, but because with a shader designed to test datatype precision the cause of any anomalies is unambiguous.) The issue is compliance.

Some Nvidia drivers (including the WHQL'd 44.03s) are not DX9 compliant as a default. That's an issue worth discussing on a tech oriented forum like B3D's technology and hardware forum. No, it doesn't impact you much if your chief interest is how suitable various cards are for playing DX7/DX8 games. But neither does it indicate an anti-Nvidia bias to be discussing it. (Which isn't to say that many posters don't arguably display an anti-Nvidia bias while discussing it.)
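To put some numbers on the precision gap at issue (this is just an illustration of the formats, not of any particular driver's behavior): FP16 carries 10 explicit mantissa bits, while the FP24 minimum that PS 2.0 specifies carries 16, so the worst-case relative rounding error differs by a factor of 2^6. A rough Python sketch of mantissa quantization, ignoring exponent-range limits:

```python
import math

def quantize(x: float, mantissa_bits: int) -> float:
    """Round x to the nearest value representable with the given number of
    explicit mantissa bits (plus the implicit leading 1), ignoring the
    exponent range of the target format."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)                # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** (mantissa_bits + 1)  # +1 for the implicit leading bit
    return math.ldexp(round(m * scale) / scale, e)

x = 1.0 / 3.0
fp16 = quantize(x, 10)  # FP16 (s10e5): 10 explicit mantissa bits
fp24 = quantize(x, 16)  # FP24 (s16e7): 16 explicit mantissa bits

# Worst-case relative error bounds: 2**-11 for FP16, 2**-17 for FP24.
print(abs(fp16 - x) / x, abs(fp24 - x) / x)
```

Neither result is "wrong enough" to matter for a simple value like this, which is exactly why a shader designed to amplify the difference is needed to detect the datatype; the compliance question is independent of whether the error is visible.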
 
That may appear to be the direction of B3D to you at the moment but I'd have to say that I haven't been doing a lot of reviews since Dave took over and Dave's interest and priority is slightly different than mine.

I would agree with that based on your history in terms of reviews. You have always focused more on gaming-related aspects than the rest of the B3D crew (at any point), and because of that I tend to read your reviews with more interest than the rest. For the record, I recall your liking of the V5's FSAA; however, you took issue with its far too conservative LOD bias, which the rest of the B3D crew at first attempted to deny (your article leading to the LOD bias adjustment tool in 3dfx's drivers). On the other side, you went fairly in depth in looking at MDK2 and how it benefited from hard T&L in terms of performance and visuals, and displayed this to the end user. Unfortunately for the basis of this particular topic, that stuff happened at The Pulpit and not here ;)

I remember when Barron and K had been bashing static hard T&L for quite some time, and you posted the Giants screenshots and talked about framerates on the forums, showing the edge it offered running that title (both in IQ and FPS), back when you started participating here. I've seen you play up the strong points on both ends of the spectrum, Rev (in terms of the different strengths of each competitive IHV), but that hasn't been the norm here.

That should now change, so I hope that my reviews will provide a nice balance to the site's entire review content.

If you still use the same focus you have in the past, then it will. Perhaps Dave does a good job of covering it up, but he does not come across as a gamer at all (neither does Marco). Keeping with the site's focus on the technology end of 3D, it isn't that big of a deal, but it makes it hard to truly square how something comes across in one of their reviews with how the board will end up playing games.

As for the "Beyond3D is anti-NVIDIA" -- B3D is not anti-NVIDIA... that assumes that we are against NVIDIA as a company as a whole, which is most definitely not true... it's just that B3D is anti-NVIDIA-PR.

Instead of simply being anti PR in general? :)

I don't believe any of our NVIDIA product reviews thus far have had any undeserved criticisms.

Are you saying that the nVidia reviews have only had valid criticism leveled at them....?
 