NVIDIA -- How to regain your trust

Joe-

On the other hand, I have MAJOR trust issues when it comes to Nvidia FX series shader performance.

What trust issue exactly? What they are outputting compared to what is called for?

So, what happens if, due to all the "lies, damn lies", you have no idea what you should expect to get for your money? That's what's at issue here.

And how does this differ from the norm? ATi's claim that the R200 core boards supported AF strikes me as a far greater use of misinformation than what nV is doing with their shaders. There the IQ gap was staggering; no need for close-ups. I understand perfectly what you are saying, I simply find things like drivers that work with all games to be of much greater importance. Who is misleading consumers more? I guess calling driver issues with games "misleading" implies that you work under the assumption that the drivers for your board will work without issue, which is something I do, so perhaps I'm in error on that point, but I find it far more important than PS 2.0 levels of precision.

How I define "trust" in a company is similar to your definition. However, it's not only getting what you expect that's important... it's having a high degree of confidence in what you expect to get before you get it.

I agree with that. Which company is supposedly truly forthcoming?
 
In an age where companies seek greater means to protect intellectual property and being litigious seems to be the norm, I seriously doubt you will find anyone 'truly forthcoming'. However I do not see a campaign of misinformation and conspiracy theories as a step in the right direction. Maybe a swift kick in the wallet will straighten Nvidia out and maybe if their drivers rendered the scenes properly we could see the 2nd shooter on the grassy knoll.
 
BRiT said:
I hope that clarifies my view. If not let me know what points I'm not presenting well enough.

It does, and in a lot of ways that attitude makes sense. It's not the one I choose to take, although maybe it would be if I were interested more as a potential consumer than as someone with an interest in the technology; that is, it might be the more accurate attitude to take when one's money is at stake.

To be honest, I suspected I was somewhat caricaturing your argument in the first place, and now that you have clarified it, I can say that was definitely the case. But even if your views don't embody it, I was trying to make the point that the graphics world is full of fair-weather fanboys, to the extent that Nvidia might actually be better off having navigated their recent off period with lies, cheats, bluster and vapor than if they had taken a more ethical and dignified approach to their probably-temporary less-competitive position.

Sorry if I implied that you were an immoral evil person along the way. ;)
 
BenSkywalker said:
I agree with that. Which company is supposedly truly forthcoming?

Before I address the rest of your post...please tell me you are being sarcastic?

Do you think there is no difference between ATI and nVidia in terms of which company is more forthcoming:

1) Depiction of Hardware Capability (why is everyone comfortable with R3x0 performance, while there are new threads every day with respect to pipelines, precision and different driver sets on nVidia hardware?)

2) Response to FutureMark's documentation of driver detection optimizations?

To be clear, I did not claim that ANY company is "truly, 100% forthcoming", if that's what you think I said. There is, however, a world of difference between ATI and the R300 core, and nVidia and the NV3x core.
 
1. nVidia were to concentrate on making GREAT HARDWARE instead of GREAT DRIVER HACKS.

2. nVidia were to quit trying to make their competitors look bad by releasing things like the Kyro II PDF or the Quack.exe toolkit. They are definitely making themselves look bad in the GFX business.

You see, when I switched over from S3 cards to ATi cards, I did not care about the speed; I cared about what I saw on the screen, and I definitely cared about how the 2D looked at higher resolutions. That ruled out any chance of getting an nVidia product. Kinda sad that a Savage 3D/4/2K has better IQ than any GeForce 256 or GF2 card on the market...

Who cares if you can render Quake III at 300 FPS if the IQ SUCKS?
 
Benskywalker,

It would simply take too much time to really address all the spin and misinformation you posted in your last few posts. Suffice it to say I am hard pressed to find one clear, honest statement in anything you put in this thread.

I won't even try to address the issue of Nvidia's trustworthiness, because frankly they have gotten away with murder for a long time now. It is very clear to me that they have a history of hacking IQ for speed, are just as guilty if not more so of dropping frames, have used underhanded tactics against competitors (the Kyro doc, Quack), and have outright lied and paid people off to cover up their dirty work.

For someone to try to say that the R300 did not do AF, in the face of Quincunx, hacked mipmap levels and other issues, simply demonstrates the nature of the problem we have today. These are not the kind of issues that are even being discussed here. Every single manufacturer has little differences in their approach to things and what they offer. None of them are an issue of cheating, nor are they what this thread is about. Yet there is an army of people at the ready to muddy the water so much that anything resembling the real issue is lost.

It is my personal opinion that Nvidia is one of the most predatory companies around. Predatory to a fault. This is at the core of some of the very questionable things they have done throughout their history, including what they are currently trying to do to Futuremark. The only real cure for this is a complete gutting of their senior leadership, from the CEO down. They need new blood, just as ATi recently needed a good shakeup and a new infusion of leadership and technology.
 
Before I address the rest of your post...please tell me you are being sarcastic?

Why do you think that?

Do you think there is no difference between ATI and nVidia in terms of which company is more forthcoming:

I think there are differences, but they depend more on how forthcoming each company is about which particular issues.

1) Depiction of Hardware Capability (why is everyone comfortable with R3x0 performance, while there are new threads every day with respect to pipelines, precision and different driver sets on nVidia hardware?)

And this is an area that ATi is more honest about.

2) Response to FutureMark's documentation of driver detection optimizations?

Another area where ATi has an edge.

There is, however, a world of difference between ATI and the R300 core, and nVidia and the NV3x core.

What about ATi's issues? Do you think it is honest to say that the R200 core boards support AF?

Why can't ATi admit to the issues with improper filtering on their R300 based boards? Why is it that thousands of users suffer from rolling lines unless they use the DVI>VGA adapter? This is on built-by-ATi boards, btw.

Why is ATi optimizing to increase their scores in benches at all when they still have numerous issues with in game bugs?

Which issues affect end users more, in terms of companies being truly forthcoming? Based on the information people are looking for on this board, for the most part that would be ATi, but what about the typical end user?

Hellbinder-

It is very clear to me that they have a history of hacking IQ for speed

Compared to who?
 
BenSkywalker said:
What about ATi's issues? Do you think it is honest to say that the R200 core boards support AF?
Doesn't enabling AF on the R200 improve image quality? Sure, it's different from nvidia's implementation, but it still works well most of the time.
Why can't ATi admit to the issues with improper filtering on their R300 based boards?
Improper filtering where? I don't see any such problems.
Why is it that thousands of users suffer from rolling lines unless they use the DVI>VGA adaptor? This is for built by ATi boards btw.
Thousands? Where'd you get that number?
Why is ATi optimizing to increase their scores in benches at all when they still have numerous issues with in game bugs?
What game bugs are you referring to? nvidia (and every other IHV) has bugs too, yet they continue to optimize their drivers. Why do you think that is?
 
BenSkywalker said:
What about ATi's issues? Do you think it is honest to say that the R200 core boards support AF?
Do you think it's honest to say that the GeForce 3/4/FX support MSAA? 2x lacks gamma correction, 4x lacks gamma correction and is on an ordered grid. Both of these modes look far worse than comparable modes on the R300-based boards. NVIDIA has made no attempt to improve these MSAA modes in 3 or 4 generations of chips, yet ATI has improved the AF quality of their chips.

-FUDie
 
Doesn't enabling AF on the R200 improve image quality?

No. It degrades rather seriously, I would say. Far too much aliasing is introduced with the R200 core boards.

Improper filtering where? I don't see any such problems.

I didn't clarify that enough, although the context you are likely thinking of is another valid one (although the FX isn't up to par either, at least it's fairly constant no matter what the angle). Power/signal line noise filtering is what I was talking about.

Thousands? Where'd you get that number?

Unless every single person in the world who has the problem posts at AT or R3D, I would say it is easily in the thousands.

What game bugs are you referring to?

With the optimizations present in the Cat 3.2s and the list of bug fixes in the Cat 3.4s, there appears to be a rather sizeable number.

nvidia (and every other IHV) has bugs too, yet they continue to optimize their drivers. Why do you think that is?

I've seen a couple of those, all fixed either through the game cfg files or using RivaTuner. If they were something I saw, I would certainly mention it (as I did when I ran into the SS 'no clip' style artifacts).

Do you think it's honest to say that the GeForce 3/4/FX support MSAA? 2x lacks gamma correction, 4x lacks gamma correction and is on an ordered grid.

Fire up a flight sim with AF cranked on any R200 core board and do a roll. This is a completely different order of magnitude. For the particular examples you are bringing up, take some screenshots on the CounterStrike Italy map and compare the R300 running 6x AA to a GF2 running 4x.
 
Ben, I'm not sure where you are trying to deflect this argument to.

Different vendors have different implementations of AA and AF. Any debate about how effective any individual implementation is, is mostly irrelevant.

It's how forthcoming and honest the companies are about their "implementations" that's relevant. I'd say both companies have been pretty similar with respect to being "forthcoming" about their implementations. Neither of them disclose the details of how they do things.

Quincunx, IMO, is a much more "deceptive" PR campaign (4X quality at 2X performance!), than ATI's aniso.

And ATI actually documenting their fixed and known remaining bugs with driver releases is a bad thing? This is another example of exactly how they are more forthcoming than nVidia.
 
Ben, like many "supporters of nVidia" <trying to be nice ;) >, isn't interested in being objective... he would rather complain about how "bad" ATI is than admit what nVidia is/has done...
 
1) Sack some of the 'middle' level management guys that the company picked up in a hurry during the rapid growth period. A company needs them during those times, but afterwards they tend to change things around in a way that mainly supports them staying in a now-redundant job.

2) Sack most if not all of their PR team - especially the chaps responsible (sorry Rev.!)

3) Make the new PR team issue a decent (it doesn't even have to be totally true, just professional) statement on the 3Dmark03 fiasco.

4) Get the driver team back on course.

5) Make a kick ass NV40/NV41

Dunno, maybe the management should just lay off the pot and arrogant stance and remember where they came from. Sometimes it's really simple, nVidia. ;)
 
BenSkywalker said:
Doesn't enabling AF on the R200 improve image quality?

No. It degrades rather seriously, I would say. Far too much aliasing is introduced with the R200 core boards.
It never bothered me.
Improper filtering where? I don't see any such problems.

I didn't clarify that enough, although the context you are likely thinking of is another valid one (although the FX isn't up to par either, at least it's fairly constant no matter what the angle). Power/signal line noise filtering is what I was talking about.
I haven't experienced this so can't comment.
Thousands? Where'd you get that number?

Unless every single person in the world who has the problem posts at AT or R3D, I would say it is easily in the thousands.
Or maybe a good percentage of the people with the problem are already posting on Rage3D; you don't know either way.
What game bugs are you referring to?
With the optimizations present in the Cat 3.2s and the list of bug fixes in the Cat 3.4s, there appears to be a rather sizeable number.
So we shouldn't fix bugs? We shouldn't tell you what bugs we've fixed? What?
nvidia (and every other IHV) has bugs too, yet they continue to optimize their drivers. Why do you think that is?
I've seen a couple of those, all fixed either through the game cfg files or using RivaTuner. If they were something I saw, I would certainly mention it (as I did when I ran into the SS 'no clip' style artifacts).
I see. You found a workaround for the bug so that makes it less of a bug. Many of the bugs people have found with ATI products have workarounds too, so you better take those off your list as well.
Do you think it's honest to say that the GeForce 3/4/FX support MSAA? 2x lacks gamma correction, 4x lacks gamma correction and is on an ordered grid.
Fire up a flight sim with AF cranked on any R200 core board and do a roll. This is a completely different order of magnitude. For the particular examples you are bringing up, take some screenshots on the CounterStrike Italy map and compare the R300 running 6x AA to a GF2 running 4x.
Did I even mention the GeForce 2? No (I even added bold to my quote above so you can see). Why? Because it lacks MSAA. The GeForce 2 lacks AF (except for the mostly worthless "2x aniso") too. Why bring it up? Is there some reason why you didn't answer my question?
 
man oh man,


Anyone want to crank up their GF1 to GF3 cards to, oh, let's say 1600*1200?


Can you even read the text without getting a headache?

ANISO on the R300 looks just as GOOD as nVidia's approach, without the "I am going to kill performance" part.

FSAA: ATi's 6X rotated multisample LOOKS WAY BETTER than nVidia's 8X supersampled AA and performs almost 300% faster.
 
Joe-

Ben, I'm not sure where you are trying to deflect this argument to.

I am not trying to deflect or argue, I'm pointing out that how you judge a particular IHV's 'honesty' depends greatly on perspective.

Different vendors have different implementations of AA and AF. Any debate about how effective any individual implementation is, is mostly irrelevant.

Using adaptive AF is OK, but using adaptive shader code isn't, is that what you are saying?

Quincunx, IMO, is a much more "deceptive" PR campaign (4X quality at 2X performance!), than ATI's aniso.

If nV didn't have anything better than Quincunx, I'd agree.

And ATI actually documenting their fixed and known remaining bugs with driver releases is a bad thing?

There was an attempt to imply that ATi didn't have many game bugs in their drivers that were optimized for 3DMark2K3, while they had a lengthy list of bugs that were present. I said nothing in any way to indicate that them listing bugs was a bad thing, that is something you would have to stretch hard to read.

Martrox-

Ben, like many "supporters of nVidia" <trying to be nice >, isn't interested in being objective... he would rather complain about how "bad" ATI is than admit what nVidia is/has done...

nVidia cheated in 3DMark2K3, and they also degraded the AF quality moving from NV2X to NV3X. Their NV30 generates way too much heat, and their early drivers for it had serious issues with AF quality and performance. By placing too much faith in TSMC they screwed up the launch of all the NV3X boards that have shipped to date, and they also underestimated the competition in the mid-range market, forcing them to revise one of their mainstream volume parts (the 5600). Then there was the whole TNT debacle: it was supposed to be as fast as a V2 SLI setup, but they weren't able to get the clock speed anywhere near what they aimed for, and they ended up falling well short of their boasts until the TNT2 shipped.

OGL-

Or maybe a good percentage of the people with the problem are already posting on Rage3D; you don't know either way.

Between AT and R3D there are hundreds of users who have the issue.

I see. You found a workaround for the bug so that makes it less of a bug.

In terms of how it impacts me, which is what I've been saying all along, absolutely. I spent a month looking for workarounds to some of the bugs in ATi's drivers and couldn't find them for all of them. Those that I was able to find workarounds for, I don't continue to list.

Did I even mention the GeForce 2?

That was a response to FUDie, not you.

YEMM-

Anyone want to crank up their GF1 to GF3 cards to, oh, let's say 1600*1200?

Can you even read the text without getting a headache?

My Gainward GF2Pro450 had easily superior 2D quality compared to my built-by-ATi Radeon 9500Pro; my Herc GF1 DDR even edged it out (though it was far from great, it wasn't as bad as the R300 core). Without the DVI>VGA adapter, which was needed to get rid of the rolling lines, the R9500Pro was almost as crisp and clear as the Gainward in 2D, but not quite. Unfortunately, due to the rolling lines the adapter was needed, which reduced 2D quality to the poorest I've had since my Diamond Viper V550 (and that includes a V3 and a Herc Kyro2). Don't get me wrong, there are plenty of nVidia based boards that have horrendous 2D quality. The BFG GF4 Ti board I have now is easily inferior to the Gainward it replaced on that front, and also worse than the ATi board without the adapter (but not with it).

FSAA: ATi's 6X rotated multisample LOOKS WAY BETTER than nVidia's 8X supersampled AA and performs almost 300% faster.

The problem is there is no supersampling option for older games, which nVidia has. For current titles I'd agree with you; the difference is comparable to how much better the GF4 Ti's AF is than the R300's (although obviously the speed of the R300 is staggeringly faster).
 
BenSkywalker said:
I am not trying to deflect or argue, I'm pointing out that how you judge a particular IHV's 'honesty' depends greatly on perspective.

And I'm still trying to find out from what "perspective" you can see nVidia being more "truthful / forthcoming" than ATI. I've listed several cases in which ATI was more forthcoming (which you agreed with), and other cases seem about equal.

Using adaptive AF is OK, but using adaptive shader code isn't, is that what you are saying?

What is "adaptive shader code?" Creating output that differs from the one the developer intended, by changing the shader algotighm behind his back, is not "adaptive". Its deceptive, and not forthcoming.

Like AA, aniso filtering has different implementations. If you say or imply that one vendor's aniso implementation is "wrong" because it is of lower quality, then you'd have to say that nVidia's AA is "wrong."

Or I could say that nVidia's aniso was wrong, because it was so "slow".

You can't have your cake and eat it too.

Quincunx, IMO, is a much more "deceptive" PR campaign (4X quality at 2X performance!), than ATI's aniso.

If nV didn't have anything better than Quincunx, I'd agree.

Why does that matter?

There was an attempt to imply that ATi didn't have many game bugs in their drivers that were optimized for 3DMark2K3, while they had a lengthy list of bugs that were present. I said nothing in any way to indicate that them listing bugs was a bad thing, that is something you would have to stretch hard to read.

:?:

Where was such an implication made? Talk about reading into things. The only implication I see being made is from you: your implication that ATI has more bugs than nVidia (because ATI has a bug list, and "all the bugs that you know of in nVidia's drivers can be worked around"). Talk about a stretch...
 
BenSkywalker said:
Do you think it's honest to say that the GeForce 3/4/FX support MSAA? 2x lacks gamma correction, 4x lacks gamma correction and is on an ordered grid.
Fire up a flight sim with AF cranked on any R200 core board and do a roll. This is a completely different order of magnitude. For the particular examples you are bringing up, take some screenshots on the CounterStrike Italy map and compare the R300 running 6x AA to a GF2 running 4x.
As OpenGL guy mentioned, the GeForce 2 wasn't one of the boards listed. I said, "MSAA", which the GeForce 2 doesn't support, but the GeForce 3/4/FX allegedly do.

-FUDie
 