NVIDIA -- How to regain your trust

Joe-

And I'm still trying to find out from what "perspective" you can see nVidia being more "truthful / forthcoming" than ATI.

I already mentioned that perhaps a large part of my problem is with my perspective. When I purchase a computer hardware item I expect it to have fully functioning drivers without any serious issues, period. If I see a product on the shelf, then drivers that meet the above criteria should be a given. There are some exceptions to this: I don't expect anyone's drivers except nV's to work with any really intensive 3D utilization (viz, CAD, etc.), but outside of that I expect it to work as it should, and as it is implied to when I purchase it. When I ran 3dfx, this is how it worked. I had a Kyro2, and while it had some minor issues it still worked for the most part and did so without any major problems. When I run nVidia, this is how it works. In the last five years I have had two boards that failed to live up to fairly basic driver standards: an ATi All In Wonder Pro 8MB (RagePro) and a Radeon 9500 Pro.

The basis of your issues with nVidia relates to trusting them and what it would take for you to do so. Every nV board I have purchased has met or exceeded my expectations; same with 3dfx, and same with PowerVR.

What is "adaptive shader code?" Creating output that differs from the one the developer intended, by changing the shader algotighm behind his back, is not "adaptive". Its deceptive, and not forthcoming.

Like AA, Aniso filtering has different implementations. If you say or imply that one's aniso implementation is "wrong" because it is of lower quality, then you'd have to say that nVidia's AA is "wrong."

The NV2X's AF is 'right', the NV3X isn't, although it's better than the ATi alternatives. "Adaptive shader code" is altering what the shader should be doing to output inferior results, much like ATi's AF. The big difference I see between them is that I use AF in every single game I play, and I have, let me count.... 0 games that support PS 2.0 ;)

Or I could say that nVidia's aniso was wrong, because it was so "slow".

You can't have your cake and eat it too.

Speed is irrelevant to doing something properly or hacking it. I've already stated in a previous post in this thread "they also degraded the AF quality moving from NV2X to NV3X".

Why does that matter?

I don't live in a theoretical vacuum. What impacts me is what I care about. nVidia's and ATi's 16bit color looks like ass compared to 3dfx's; do you think anyone should care in the least? If you are talking about trusting PR, I don't trust any; as I have already stated, anyone that does is moronic.

Where was such an implication made? Talk about reading into things. The only implication I see being made is from you: your implication that ATI has more bugs than nVidia. (Because ATI has a bug list, and "all the bugs that you know of in nVidia's drivers can be worked around.") Talk about a stretch...

List off nV's bugs that would impact me that can't be worked around easily. I've asked this of numerous people who try and state that ATi's drivers are competitive with nV's and haven't seen a good response yet. I tried to find workarounds to those issues that I had with ATi's drivers; I spent a month trying to work around them before I gave up.

FUDie-

As OpenGL guy mentioned, the GeForce 2 wasn't one of the boards listed. I said, "MSAA", which the GeForce 2 doesn't support, but the GeForce 3/4/FX allegedly do.

They support it, and it works as advertised and as it always has. They don't filter only the edges they deem worthy of being filtered, which is what would need to be happening for it to be comparable to ATi's or nV's NV3X AF.
 
BenSkywalker said:
When I purchase a computer hardware item I expect it to have fully functioning drivers without any serious issues, period.

For me it's not "period". It's also living up to performance expectations based on specs and reviews. And if the p/reviews and specs are so ambiguous that I don't even know what I SHOULD expect, then there's no way in hell I'm going to buy it.

The basis of your issues with nVidia relates to trusting them and what it would take for you to do so.

Um, that's the basis of this discussion.

Every nV board I have purchased has met or exceeded my expectations; same with 3dfx, and same with PowerVR.

I had a somewhat opposite experience. The only NV card that I owned that worked "as expected" was a Riva 128. And that's only because I also had a 3dfx Voodoo Graphics to run GLQuake while nVidia got their GL driver implemented.

My TNT had to be returned because it had issues with my LX motherboard. Just wouldn't work at all.

My 3dfx and Radeon 8500 boards worked with no probs. (Though I almost returned the 8500 because Smoothvision wasn't in the initial drivers. It was implemented within a month and I was satisfied with it.)

The NV2X's AF is 'right', the NV3X isn't,

I disagree. We've had this "right or wrong Aniso" discussion on this board before though. NV2x's aniso is higher quality than R2/3xx. Not more "right".

"Adaptive shader code" is altering what the shader should be doing to output inferior results, much like ATi's AF.

No. The application developer doesn't "code" the algorithms for anisotropic filtering. He flips a bit and says "hardware, apply aniso". This is unlike what he does for shaders, where he programs the algorithms himself.
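
To make that distinction concrete, here's a rough sketch in Direct3D 9 terms. Nothing here is from a real game; it just assumes you already have a working IDirect3DDevice9, and the shader is a made-up trivial one.

```cpp
#include <d3d9.h>

// Requesting anisotropic filtering: the developer only asks for it and picks
// a maximum degree. HOW the filtering is actually done (which angles get the
// full degree, how many samples are taken, etc.) is up to the hardware/driver.
void request_aniso(IDirect3DDevice9* device)
{
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_ANISOTROPIC);
    device->SetSamplerState(0, D3DSAMP_MAXANISOTROPY, 8);
}

// A pixel shader, by contrast, is an algorithm the developer writes out
// instruction by instruction, and the driver is expected to run it as
// written, not substitute its own version of the math.
const char* g_example_ps_2_0 =
    "ps_2_0\n"
    "dcl t0.xy\n"
    "dcl_2d s0\n"
    "texld r0, t0, s0\n"   // sample the texture
    "mul r0, r0, c0\n"     // developer-authored math: modulate by a constant
    "mov oC0, r0\n";       // write the final color
```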

Speed is irrelevant to doing something properly or hacking it. I've already stated in a previous post in this thread "they also degraded the AF quality moving from NV2X to NV3X".

Speed is not irrelevant when choosing how to implement a feature which does not have a specific definition for implementation, like AA or Aniso.

I don't live in a theoretical vacuum. What impacts me is what I care about. nVidia's and ATi's 16bit color looks like ass compared to 3dfx's; do you think anyone should care in the least?

Of course they should. Especially if performance in 32 bit mode on nVidia's and ATI's cards is not up to par with 3dfx's 16 bit.

If you are talking about trusting PR, I don't trust any; as I have already stated, anyone that does is moronic.

There's a distinction between blindly trusting PR, and PR being so elusive / dishonest that you can't verify their claims.

See all the current nVidia discussions surrounding pipelines, shader precision, driver cheats, etc.

List off nV's bugs that would impact me that can't be worked around easily. I've asked this of numerous people who try and state that ATi's drivers are competitive with nV's and haven't seen a good response yet.

Geezus, Ben, head on over to NVNews and browse the forums for all sorts of problems. Ditto for Rage 3D.

You'll find 3 types of people on each forum:
1) I can't get this thing to work AT ALL!
2) I have issues X-Y and Z, but everything else is OK
3) I have no problems at all.

Heck, try to get Humus's Mandelbrot demo to run with 32-bit precision on an NV30/31/34. Good luck trying to "work around them."
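
(For anyone wondering why precision matters there: a Mandelbrot shader feeds each result back into the next iteration, so rounding error compounds quickly. Here's a little CPU-side sketch, not Humus's actual demo code, using float vs. double as a stand-in for low vs. high shader precision; the sample window is just an arbitrary patch near the set boundary.)

```cpp
#include <cstdio>

// Count how many iterations it takes z -> z^2 + c to escape at a given
// floating point precision. Near the set boundary the count is extremely
// sensitive to rounding error, which is why low precision falls apart fast.
template <typename T>
int mandel_iters(T cr, T ci, int max_iters)
{
    T zr = 0, zi = 0;
    for (int i = 0; i < max_iters; ++i) {
        T zr2 = zr * zr - zi * zi + cr;
        zi = 2 * zr * zi + ci;
        zr = zr2;
        if (zr * zr + zi * zi > T(4))
            return i;  // escaped
    }
    return max_iters;
}

int main()
{
    // Sample a small window in the "seahorse valley" region and count how
    // often float and double disagree on the escape count.
    int disagree = 0, total = 0;
    for (int y = 0; y < 64; ++y) {
        for (int x = 0; x < 64; ++x) {
            double cr = -0.750 + x * (0.01 / 64);
            double ci =  0.100 + y * (0.01 / 64);
            int low  = mandel_iters<float>((float)cr, (float)ci, 512);
            int high = mandel_iters<double>(cr, ci, 512);
            if (low != high) ++disagree;
            ++total;
        }
    }
    std::printf("%d of %d samples disagree between float and double\n",
                disagree, total);
    return 0;
}
```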
 
I had far more problems with my GF3 than I've had with my 9500P/9700P/9800P.... That's not to say that I haven't had problems.... But they were less than with the GF3, and just a few more than with the GF4 at the time I originally got into the R300..... Right now, I would say that the GF4 & R300/350 are about the same - or at least very close, enough to effectively take drivers out of the loop - with regards to the GF4. GFFX is another can of worms ATM. I will admit that replacing drivers is a bit more involved with ATI.... but once you know how, it's very painless.....
 
I haven't had problems with any Trident/S3/NV/3dfx or Ati cards (I have owned a lot). I just want IQ. Since that is the case I will stick with my R300 for a long time to come.

Game/rendering related problems on the other hand are different from driver related ones. IE: MSAA in Final Fantasy 7 creates background borders.
Even SSAA doesn't fix it. Disabling AA does. Though who disables AA? ;)

I'm sure that a driver team can sometimes work around problems, even though the problem itself can't be FIXED by the driver team.
 
Joe-

For me it's not "period".

So you do not care if you have drivers that work with all your games; that clears up that misunderstanding. I guess I'm in the minority with unwavering demands on that end.

It's also living up to performance expectations based on specs and reviews. And if the p/reviews and specs are so ambiguous that I don't even know what I SHOULD expect, then there's no way in hell I'm going to buy it.

You have seen and criticized nV's non-Cg PS 2.0 shader performance quite a bit; what exactly would the surprise be?

My TNT had to be returned because it had issues with my LX motherboard. Just wouldn't work at all.

I have always blamed issues with vid cards and mobos on the motherboards. This included Intel's decision to drop support for the AGP 1.0 spec completely, leaving V5 owners out in the cold.

I disagree. We've had this "right or wrong Aniso" discussion on this board before though. NV2x's aniso is higher quality than R2/3xx. Not more "right".

When you select 8x AF do you always get 8x AF on every texture that needs it with the R200/R300/NV30? No. Do you with the NV2X? Yes.

No. The application developer doesn't "code" the algorithms for anisotropic filtering. He flips a bit and says "hardware, apply aniso". This is unlike what he does for shaders, where he programs the algorithms himself.

And when he flips that bit to apply anisotropic filtering, he should assume that the board will apply a random level of filtering based on whether it thinks it is an important angle or not? I understand the difference you are pointing out; the big difference for me is that one matters to me right now in every game I play, and the other is a big issue for synthetic demos.

Speed is not irrelevant when choosing how to implement a feature which does not have a specific definition for implementation, like AA or Aniso.

Then Quincunx is OK? I sure as hell never use it. ;)

Of course they should. Especially if performance in 32 bit mode on nVidia's and ATI's cards is not up to par with 3dfx's 16 bit.

I'm talking about the R3x0 and NV3x level parts and comparing them to 3dfx's 16bit. Their 16bit quality still blows; do you care? Should anyone?

There's a distinction between blindly trusting PR, and PR being so elusive / dishonest that you can't verify their claims.

PR = Ignore. It's really quite simple :)

Geezus, Ben, head on over to NVNews and browse the forums for all sorts of problems. Ditto for Rage 3D.

You'll find 3 types of people on each forum:
1) I can't get this thing to work AT ALL!
2) I have issues X-Y and Z, but everything else is OK
3) I have no problems at all.

Checked through the first five pages of the nVNews nV forum and found one person who has an issue with a TNT2 under Linux not working properly, and another thread with someone who has an MSI K7N2 mobo and a Ti4200 who has issues under OpenGL, along with another poster with a like mobo who has the same problem with an R9700Pro. Wait, missed one. A guy running a couple of NFS games and SplinterCell has some flickering textures; he says the board isn't OCd, but it sounds like OCing artifacts. I'd go to Rage3D for info on problems with nV hardware with the same level of trust I would have going to Intel for problems with AMD hardware. I do read AT daily however, and I am still asking the same question. What problems are there supposed to be? I have a couple hundred games for my PC here, and I have been known to purchase games just to check on supposed issues before. Just tell me what the game is that is supposed to have problems so I can check it. Since their problems are supposed to be comparable to ATi's, just list the game/s.

Heck, try to get Humus's Mandelbrot demo to run with 32-bit precision on an NV30/31/34. Good luck trying to "work around them."

A demo? I guess if I were interested in the demo scene that would be a viable concern. Maybe that's my issue, focusing on games and not synthetics/demos?

Martrox-

I had far more problems with my GF3 than I've had with my 9500P/9700P/9800P

What problems? Which games with which drivers? I'm honestly asking here.

KILER-

Game/rendering related problems on the other hand are different from driver related ones. IE: MSAA in Final Fantasy 7 creates background borders.

I don't expect IHVs to fix game bugs, just have their drivers working. FF7 was a steaming POS in terms of the quality of its code if you weren't running a 3dfx board (despite being a D3D game upon release).
 
BenSkywalker said:
As OpenGL guy mentioned, the GeForce 2 wasn't one of the boards listed. I said, "MSAA", which the GeForce 2 doesn't support, but the GeForce 3/4/FX allegedly do.
They support it, and it works as advertised and as it always has. They don't filter only the edges they deem worthy of being filtered, which is what would need to be happening for it to be comparable to ATi's or nV's NV3X AF.
Except that MSAA on the GeForce 3/4/FX is far lower quality than ATI's implementation. First, the NVIDIA boards lack gamma correction. Second, the NVIDIA boards use an ordered grid for 4x AA. The edge quality produced by NVIDIA's MSAA technique is far inferior to ATI's. Thus, by your own logic, that casts doubt on whether the NVIDIA boards really support MSAA.
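
(A toy illustration of why the sample grid matters, for anyone who wants to see the arithmetic. These aren't the exact sample positions either chip uses, just a generic ordered 2x2 grid versus a made-up four-height sparse grid, counting how many distinct coverage levels each can produce on a near-horizontal edge.)

```cpp
#include <cstdio>
#include <set>

// Count the distinct coverage levels a 4-sample pattern can produce as a
// horizontal edge sweeps through one pixel. More levels means a smoother
// gradient on near-horizontal (or, by symmetry, near-vertical) edges.
struct Sample { double x, y; };

int coverage_levels(const Sample* s, int n)
{
    std::set<int> levels;
    for (int step = 0; step <= 1000; ++step) {
        double edge = step / 1000.0;  // edge position within the pixel
        int covered = 0;
        for (int i = 0; i < n; ++i)
            if (s[i].y < edge) ++covered;
        levels.insert(covered);
    }
    return (int)levels.size();
}

int main()
{
    // Ordered grid 4x: a regular 2x2 lattice, so only two distinct heights.
    const Sample ordered[4] = { {0.25, 0.25}, {0.75, 0.25},
                                {0.25, 0.75}, {0.75, 0.75} };
    // Sparse/rotated grid 4x: four distinct heights (offsets are illustrative).
    const Sample sparse[4]  = { {0.375, 0.125}, {0.875, 0.375},
                                {0.125, 0.625}, {0.625, 0.875} };

    std::printf("ordered grid: %d coverage levels\n", coverage_levels(ordered, 4));
    std::printf("sparse grid : %d coverage levels\n", coverage_levels(sparse, 4));
    return 0;
}
```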

BTW, the "alternative" AF modes on the NV3X chips is of far worse quality than the R300-based chips.

-FUDie
 
BenSkywalker said:
So you do not care if you have drivers that work with all your games; that clears up that misunderstanding. I guess I'm in the minority with unwavering demands on that end.

Um, no. I said that having drivers and hardware that work is not the ONLY thing I care about.

So, you do not care if the card meets your performance expectations? Do you even have any performance expectations?

You have seen and criticized nV's non-Cg PS 2.0 shader performance quite a bit; what exactly would the surprise be?

I have NO IDEA at this point what the true performance is with 32 bit shaders on NV30/31/34 vs. NV35. Each driver revision seems to do things differently, and newer driver revisions show performance "boosts" on older PS tests, with quality degradations common, while newer *cough* unoptimized *cough* tests don't seem to show such boosts.

Everybody on this message board is running tests and re-tests with a variety of hardware and drivers and shading apps to try and figure out wtf is going on.

No, I have no idea what to expect.

My best guess: if a shading game or test becomes "popular" enough, nVidia can create "optimized" drivers for NV3x hardware that bring DX9 pixel shading performance close to (but not quite up to) R300 performance, and likely with image quality degradation.

If the app or test is not popular, I'll likely be stuck with abysmal PS 2.0 performance.

I have always blamed issues with vid cards and mobos on the motherboards.

Doesn't matter. I owned that LX MOBO for a year or so, plugged in the TNT, and it didn't work. I expect new products to work with already existing ones.

When you select 8x AF do you always get 8x AF on every texture that needs it with the R200/R300/NV30? No. Do you with the NV2X? Yes.

Using that logic, I guess Parhelia's Fragment AA is not real AA then? Of course it is, even though it doesn't apply AA to all "edges that it should."

Heck, is MSAA real AA? It doesn't AA textures, which any "full scene AA" should.

And when he flips that bit to apply anisotropic filtering, he should assume that the board will apply a random level of filtering based on whether it thinks it is an important angle or not?

He should assume that "anisotropic filtering" is applied, by whatever means the IHV has implemented it. Filtering is just a way to reduce artifacts. It's not something used to "create" the scene / art.

I understand the difference you are pointing out; the big difference for me is that one matters to me right now in every game I play, and the other is a big issue for synthetic demos.

I again don't understand the tangent you are going off on. We're talking about "truthfulness" and being forward, correct?

Has ATI ever not been forward about its Aniso implementation?

Then Quincunx is OK? I sure as hell never use it. ;)

The CHOICE to have Quincunx mode is certainly OK. Advertising it as a big PR feature as "4X quality with 2X performance" is not OK.

I'm talking about the R3x0 and NV3x level parts and comparing them to 3dfx's 16bit. Their 16bit quality still blows; do you care? Should anyone?

What relevance does that have at all to our discussion?

PR = Ignore. It's really quite simple :)

Sorry, Ben, but when the product doesn't speak for itself (like the current NV3x fiasco... we're all still trying to figure out WTF they are), your only choice is to also use PR statements, interviews, etc. as a guide.

The overall point is: I couldn't trust the NV3x architecture further than I could throw it. If nVidia wants me to trust them (which is what this thread is about), nVidia has to come clean about its architecture / performance expectations.

Checked through the first five pages of the nVNews nV forum and found one person who has an issue with a TNT2 under Linux not working properly, and another thread with someone who has an MSI K7N2 mobo and a Ti4200 who has issues under OpenGL, along with another poster with a like mobo who has the same problem with an R9700Pro. Wait, missed one....

??

http://www.nvnews.net/vbulletin/forumdisplay.php?s=&forumid=26

I didn't even read the threads, just read the headlines for god's sake. "Blue Screens", FSAA not working, Digital Vibrance issues....

I really don't get what you're trying to say, Ben. Are you really trying to imply that nVidia's drivers don't have as many issues as Catalyst drivers?

A demo? I guess if I were interested in the demo scene that would be a viable concern. Maybe that's my issue, focusing on games and not synthetics/demos?

Your issue is that these synthetic demos give us a clue as to what to expect with upcoming games.

It's all about expectations, Ben.

Presumably you buy a card only for how it plays the games in your current library? No thought whatsoever about how it will handle games you haven't bought yet?
 
You know, perhaps I'm approaching this the wrong way with you, Ben.

There is one single key difference between ATI and nVidia and their level of "trust", that basically speaks volumes.

Beyond3D is a "tier 1" media/review site for ATI.
Beyond3D is not a "tier 1" media/review site for nVidia.

To me, that says, relatively speaking, ATI trusts B3D more than nVidia does.

That, in and of itself, makes ATI more trustworthy to me than nVidia.

So, to answer the subject heading question: Do you know what would go a long way toward nVidia regaining my trust? nVidia putting Beyond3D on the tier 1 list.
 
DaveBaumann said:
Somehow I suspect we've dropped off the tier entirely now... :?
Well, if Nvidia doesn't like B3D, then Nvidiots across the web won't like B3D either. It's going to be one of those cases of "B3D is biased and doesn't even matter".... I've already seen it happen in the [H] forums, not that it matters what they think anyway.
 
BenSkywalker said:
I don't expect IHVs to fix game bugs, just have their drivers working.
This is too funny. Do you know how many bugs we've fixed in our drivers that were actually game bugs? Bugs that you'd never know about if you always used the same vendor's cards during development, but you'd notice immediately if you used another's. You may find it hard to believe, but some of the features of your favorite vendor's cards are bugged (i.e. don't follow specs).

Sometimes we can't fix game bugs in our driver, but end users expect us to anyway.
 
OK guys, nVidia PR != actual product performance

[image: ATi satire]
 
Joe-

Um, no. I said that having drivers and hardware that work is not the ONLY thing I care about.

I didn't say that either. It is an expectation I have that is very firm.

So, you do not care if the card meets your performance expectations? Do you even have any performance expectations?

Absolutely, in games that I play. Those are secondary to the games working without crashing or having any major image corruption.

I have NO IDEA at this point what the true performance is with 32 bit shaders on NV30/31/34 vs. NV35.

Why is everyone talking about how slow it is?

Doesn't matter. I owned that LX MOBO for a year or so, plugged in the TNT, and it didn't work. I expect new products to work with already existing ones.

So if a mobo manufacturer makes an out-of-spec part that doesn't provide the proper voltage and/or wattage to a slot, it is the video card manufacturer's fault?

Using that logic, I guess Parhelia's Fragment AA is not real AA then? Of course it is, even though it doesn't apply AA to all "edges that it should."

Heck, is MSAA real AA? It doesn't AA textures, which any "full scene AA" should.

I don't consider MSAA FSAA, nor Matrox's FAA. They are AA, but not FSAA.

I again don't understand the tangent you are going off on. We're talking about "truthfulness" and being forward, correct?

Has ATI ever not been forward about its Aniso implementation?

They claimed they supported AF.

The CHOICE to have Quincunx mode is certainly OK. Advertising it as a big PR feature as "4X quality with 2X performance" is not OK.

PR = Ignore :)

What relevance does that have at all to our discussion?

It doesn't impact me, so I don't see how it is relevant. Right now I have no games that support PS 2.0, so it doesn't impact me now. From the benches I have seen of games that use PS 2.0 level shaders, the NV3X boards perform quite well, although I highly doubt they were shader limited anyway.

The overall point is: I couldn't trust the NV3x architecture further than I could throw it. If nVidia wants me to trust them (which is what this thread is about), nVidia has to come clean about its architecture / performance expectations.

And ATi has in your view. ATi has one level of precision to 'come clean' about; it is a lot simpler. Do you think nVidia should have simply followed along with what ATi was doing for simplicity's sake?

I didn't even read the threads, just read the headlines for god's sake. "Blue Screens", FSAA not working, Digital Vibrance issues....

One of the blue screen threads was a bug with The Matrix game and the sound card. The other was one that seemed to be fairly random. The FSAA problem- I've run all the drivers they listed that are supposed to be impacted and haven't run into the issue.

I really don't get what you're trying to say, Ben. Are you really trying to imply that nVidia's drivers don't have as many issues as Catalyst drivers?

Without a doubt.

Presumably you buy a card only for how it plays the games in your current library? No thought whatsoever about how it will handle games you haven't bought yet?

Absolutely not. If a board can't even handle the games I have now properly, however, then I have no faith in how it will handle upcoming games.

Your issue is that these synthetic demos give us a clue as to what to expect with upcoming games.

Shaders that are optimized for one board, a sort order for back-to-front rendering, and the like, and you really think that is the case? Demos are OK for seeing certain features work before they will be used, but I don't think they are a good way to gauge how the features will function in upcoming games. This includes the IHVs' own demos. As a general example, the CubeMap blob that nV used to show off the technique for the original GeForce ran considerably worse than the games that followed which used it.

There is one single key difference between ATI and nVidia and their level of "trust", that basically speaks volumes.

Beyond3D is a "tier 1" media/review site for ATI.
Beyond3D is not a "tier 1" media/review site for nVidia.

To me, that says, relatively speaking, ATI trusts B3D more than nVidia does.

If I was working at ATi I would 'trust' B3D a lot more than if I was working at nVidia. For that matter, if I was working at any other IHV I would 'trust' B3D a lot more than I would if I were working at nVidia. '16bit color is all you need' to '48bit and 64bit aren't enough', despite the former being at a time when there were games that fully supported 32bit, while for the latter there are none. The anti-static-T&L campaign, where they tried to denounce the technology as something that wouldn't be used by developers in a widespread fashion because VS were so much better (here we are three years later and static acceleration is still the standard), features like Dot3 and CubeMaps being irrelevant because no games used them, etc., and then a full switch to PS 2.0 performance as the standard that next gen cards should be measured by. After a long enough period of time, if you work at an IHV and a site is always building up your weakest attribute and claiming your strengths aren't useful, what would you think? If B3D was taking their old stance against features not used in games and saying that PS 2.0 performance was meaningless because no games used them right now, I think you would find nVidia a lot more friendly. I'm not saying that it is intentional, and B3D doesn't do shootout reviews, so their particular fondness for a given architecture doesn't impact their reviews in a significant fashion, but how would you look at it working at an IHV? You want your board to get positive reviews, so you send it to a site that is friendly to your style of architecture.
 
A lot of that was before my time, so I'm not going to bother commenting (it sounds incorrect, though), however:

features like Dot3 and CubeMaps being irrelevant because no games used them, etc., and then a full switch to PS 2.0 performance as the standard that next gen cards should be measured by. After a long enough period of time, if you work at an IHV and a site is always building up your weakest attribute and claiming your strengths aren't useful, what would you think?

This seems like total bullshit to me - we continue to test a range of shaders from PS1.1 to PS2.0, and in fact we've not actually, as yet, used any of the 'pure' PS2.0 tests that have been floating around in the forums or shadermark and the like. In fact the conclusion of our last NVIDIA preview read:

DX8 Pixel Shader Performance - The clockspeed, and possibly the number of shader execution units, means that the pure DX8 class Pixel Shader execution rate is very good in comparison to the low end NV25, which should be a boon to titles that feature Pixel Shaders. The caveat here is that games are not going to exclusively utilise Pixel Shaders in a scene and will still need other elements such as texturing - the Nature test results showed that this pulls the 5600 Ultra back in line with the Ti4200. Also, one of the main selling points of the NV3x series is the DX9 capabilities heralded by the 'CineFX' architecture, however it's still a little too early to truly gauge the performance of NV31 here.

I fail to see where this full switch to PS2.0 thing comes from?
 
On the trust issue - both companies should

1) mutually agree what are optimisations vs cheats
2) come fully clean as to what games have cheats vs optimisations applied
3) identify per game how much the cheats lift performance and / or impair image quality

4) On issuing any driver designed to highlight their performance and thereby boost sales - formally declare to institutional investors, shareholders and their external auditors there are no cheats in their software.
 
A lot of that was before my time, so I'm not going to bother commenting (it sounds incorrect, though), however:

Weren't you here back then? I know you weren't working for the site, but I thought you were still around. I also knew that no matter how I stated that part, it would be taken poorly.

This seems like total bullshit to me - we continue to test a range of shaders from PS1.1 to PS2.0, and in fact we've not actually, as yet, used any of the 'pure' PS2.0 tests that have been floating around in the forums or shadermark and the like. In fact the conclusion of our last NVIDIA preview read:

Look at the forums and the comments from the people here (including those who pen for the site). PS 2.0 is an obsession on these boards at the moment. With the AF and noise issue resolved, all effort seems to have moved to criticizing the weak point that the NV3X has left, and of course anything that could be considered positive is swept under the rug. You may like to think that the forums aren't going to impact an IHV's view of a site, but with the number of other sites that quote discussions here, where they see comments that the members proper of B3D make, it will have an impact.

Edit- That last part came out sounding a bit hostile rereading it. I am trying to state that, due to the caliber of these particular forums, they are considered a part of the site by the general tech community at large, more so than the other forums. Because of this, no matter how down the line you try to keep your reviews, your comments on the boards will be read and factored in. That is what I was trying to state in my last part.
 
BenSkywalker said:
I didn't say that either. It is an expectation I have that is very firm.

You said (paraphrasing) "work out of the box... PERIOD."

Absolutely, in games that I play. Those are secondary to the games working without crashing or having any major image corruption.

Good, then we agree on that.

Why is everyone talking about how slow it is?

Because in some cases, it is, with certain drivers, and with certain apps, and if the sky is blue on Wednesday.

Why are there polls on this site about whether NV35 is "fixed" or has driver hacks?

So if a mobo manufacturer makes an out-of-spec part that doesn't provide the proper voltage and/or wattage to a slot, it is the video card manufacturer's fault?

You completely miss the point.

This is about "pulling it out of the box and it working" right?

I pulled the TNT out of the box, plugged it in, and it didn't work. Every other AGP card I pulled out did work.

I don't consider MSAA FSAA, nor Matrox's FAA. They are AA, but not FSAA.

So when a developer switches the AA bit on, they shouldn't expect polygon intersections to be AA'd?

[ATI] claimed they supported AF.

And they do.

The CHOICE to have Quincunx mode is certainly OK. Advertising it as a big PR feature as "4X quality with 2X performance" is not OK.

PR = Ignore :)

So, why don't you ignore ATI PR when they say they support Aniso?

ATi has one level of precision to 'come clean' about; it is a lot simpler. Do you think nVidia should have simply followed along with what ATi was doing for simplicity's sake?

No, but they should be UPFRONT about the performance of each precision level, and they certainly shouldn't be changing precision on the fly in certain tests, nor should they be writing new shaders which don't exactly reproduce the intended output. Get it?
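
To put a face on "changing precision on the fly": DX9 shader assembly already gives the developer a way to ask for reduced precision himself, via the _pp (partial precision) hint. A quick illustrative sketch, not taken from any benchmark or game:

```cpp
// Two versions of the same trivial ps_2_0 shader. The first uses the default
// (full) precision; the second marks its instructions with the _pp partial
// precision hint. Writing the second version is the developer's call to make.
// The complaint above is about a driver silently treating the first as if it
// were the second.
const char* ps_full_precision =
    "ps_2_0\n"
    "dcl t0.xy\n"
    "dcl_2d s0\n"
    "texld r0, t0, s0\n"
    "mad r0, r0, c0, c1\n"
    "mov oC0, r0\n";

const char* ps_partial_precision =
    "ps_2_0\n"
    "dcl t0.xy\n"
    "dcl_2d s0\n"
    "texld_pp r0, t0, s0\n"
    "mad_pp r0, r0, c0, c1\n"
    "mov oC0, r0\n";
```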

If nVidia was "forthcoming" about their performance, there wouldn't be any "shader" cheats or "clipping plane" cheats. They are designed to FOOL ME into thinking their performance is something it's not.

One of the blue screen threads was a bug with The Matrix game and the sound card. The other was one that seemed to be fairly random. The FSAA problem- I've run all the drivers they listed that are supposed to be impacted and haven't run into the issue.

It's just wonderful how you write off all the bugs that others run into as "not nVidia problems", "random", etc.

I really don't get what you're trying to say, Ben. Are you really trying to imply that nVidia's drivers don't have as many issues as Catalyst drivers?

Without a doubt.

Then this is a lost cause.

Presumably you buy a card only for how it plays the games in your current library? No thought whatsoever about how it will handle games you haven't bought yet?

Absolutely not. If a board can't even handle the games I have now properly, however, then I have no faith in how it will handle upcoming games.

Just stick with the "nVidia logo" games, and you'll be fine.

Demos are OK for seeing certain features work before they will be used, but I don't think they are a good way to gauge how the features will function in upcoming games.

They are an indication, Ben. And right now, the indication is, unless there are specialized paths / hacks for the NV3x, shader performance is poor compared to R3x0. And if the game gets those hacks, the quality won't be as good.

If I was working at ATi I would 'trust' B3D a lot more than if I was working at nVidia.

I would word that differently. If I was working at nVidia, I would trust that B3D would try its best to find the real truth, so I would trust that they might not come up with the "best light" kind of review.

So, Ben, which 3D graphics site do YOU PERSONALLY trust most for truthful analysis?

'16bit color is all you need' to '48bit and 64bit aren't enough', despite the former being at a time when there were games that fully supported 32bit, while for the latter there are none.

Link to the articles that claim that?

And I thought it was NVIDIA who claimed, despite supporting fp16, that fp24 "wasn't enough?"

The anti-static-T&L campaign, where they tried to denounce the technology as something that wouldn't be used by developers in a widespread fashion because VS were so much better (here we are three years later and static acceleration is still the standard),

Static acceleration the standard? Where did that come from? Static T&L is used, but to what extent over CPU T&L? And given that nVidia was first out with both static T&L and vertex shading....how is this "against nVidia?"

features like Dot3 and CubeMaps being irrelevant because no games used them, etc., and then a full switch to PS 2.0 performance as the standard that next gen cards should be measured by.

See Dave's response...bullshit. Do you read the same site that I do?

After a long enough period of time, if you work at an IHV and a site is always building up your weakest attribute and claiming your strengths aren't useful, what would you think?

That perhaps you believe in your own marketing, and not what an honest assessment of the technology might be.

If B3D was taking their old stance against features not used in games and saying that PS 2.0 performance was meaningless because no games used them...

What is this "old stance" you are talking about? Again, their stance on "static T&L"? As if that's an nVidia specific feature? Asif nVidia also wasn't the first part out that had DX8 vertex shaders?

..but how would you look at it working at an IHV?

Like I said above. I'd look at B3D as a site that isn't one to just regurgitate marketing PR, and may actually offer a level of analysis of the features that an IHV may or may not want to hear.

So if I'm an IHV that has a product that's more fluff than stuff, I'd be very worried indeed. Which is exactly why I'm less trustful of IHVs that do not "like" B3D.

You want your board to get positive reviews, so you send it to a site that is friendly to your style of architecture.

BINGO. But if you want the TRUTH, you send them to a site that is as impartial to "architecture" and marketing spin as possible.

I come to B3D for the truth. Not "positive" reviews. And any IHV that is "afraid" to give support to B3D is doing so because they fear the truth might actually come out, rather than the "positive spin."

Again, I ask you, what 3D site do YOU trust most for an honest assessment of the hardware, vs. just spitting out marketing crap?
 
I'll weigh in and state that I think Ben does have at least a few valid points.

The apparent flip-flop between forward-looking features that are indispensable, and ones that aren't useful because they're too slow for now. He brings up a few examples, though poorly worded and densely formatted, making them difficult to discern.
-Beyond3d was against T&L when it was V5 vs. GF256. The article was very slanted; I also remember some of the information being just plain wrong/misleading, but very vigorously defended. A bit of 'drinking the 3dfx Kool-Aid', as it were. (downplaying advantage of NVIDIA.)
-But, on the other hand, here we are advocating PS/VS2.0 when it is ATI vs. NV. Definitely not drinking the NVIDIA Kool-Aid. (emphasising disadvantage of NVIDIA.)
-I don't remember this exactly, though he's suggesting that DOT3, etc. were pooh-poohed or downplayed because the V5 didn't have them, not many games used it, and the Rampage super duper texture computer was about to come out. Again, drinking the 3dfx Kool-Aid (downplaying advantage of NVIDIA.)
-But PS1.4 (which enabled dependent texture lookups in 1 pass) was very important. (I'm not sure this was a typical Beyond3d stance, and I'm not sure I've got the dependent texture thing right)
-32-bit color was not important because it was too slow, etc. (Downplaying advantage)
-48-bit or 64-bit color isn't enough. We need, coincidentally, 96-bit color, but we don't need 128-bit color. (Emphasising disadvantage)


Not that I agree with everything he says, but there is a case to be made that Beyond3d seems to adopt stances that are "against NVIDIA", either lambasting them for not having a feature or for being slower, or downplaying the feature advantage they have as unimportant for now. I can see how the history of such would be unhelpful in garnering the trust of a company.

There is an undeniable pattern. You can come up with reasonable arguments for each of the stances (except the T&L one, in my opinion), but you can't deny there is a pattern, and on the surface it seems inconsistent.

Of course, they also probably don't want to talk to you because you're in like Flynn with Futuremark, who is their major pain in the behind right now.

I don't suggest you kiss ass to become a tier 1 site, but perhaps a bit of navel contemplation over the past might be useful and enlightening.
 