nVidia's cheating: Dumbing it down.

digitalwanderer said:
Ok, here's a good question: How do you explain the whole nVidia situation to a non-technically minded person, or in this case to a very technically minded 6 year old?

Surely you've been a Dad long enough to know that whenever you get asked a difficult-to-explain question, the answer is always 'Ask your Mom'? ;)
 
Nick said:
andypski said:
Either way ATI is not 'getting away with' anything here, so I have no idea what point you are trying to make.
They advertise 128-bit color precision everywhere...

Ah, now that's a more appropriate case for you to make in terms of possible confusion than some blanket statement about programmers asking for something that is beyond the specification. I can agree with you that there may have been some confusion in this case.
 
Nick said:
zeckensack said:
If it is like you say, that all this stuff doesn't matter, can you explain why we're seeing a different market distribution? Please?

It's quite simple, I'd explain it, but I'd like to hear it from you :p
SiS can't put the same numbers on the boxes...
They can put all the fluff they want on their retail boxes. What's missing from the equation are benchmarks. Benchmarks do matter to differentiate products. And benchmarks are cheated on.

Kinda contradicts this:
Nick said:
But as far as I can recall, this thread was about John Doe who generally doesn't spend more than 100 € on a graphics card. For him the difference between the cards is not that important and the cheating doesn't influence him much.

And what about the people that never see a retail box, like the cases you mentioned yourself? Whether they are content with going exclusively 2D or not, sooner or later they'll notice they got a "Radeon" or "Geforce" or "Xabre" or whatever. Why do you think OEMs put MX420s into boxes instead of Xabre80s? Simple, "Geforce" gives Johns that warm and fuzzy feeling that it'll be fine, they want to read that word somewhere. Image. Brand awareness *shudders again*

Those trends are subject to market shifts in the high end, as evidenced by a lot of Radeon 9000 equipped low end machines popping up late last year. SiS/XGI will never make it into OEM boxes without a high end heavy hitter.

These are only my own humble observations, but I hope they sound reasonable.
 
Nick said:
Well, let's take my twin brother as an example. He's not into graphics programming, but he does study computer science. When I bought my Radeon 9000 I was a bit disappointed. A month later much better deals were available, but my brother bought exactly the same card, even against my advice and after reading a few reviews. The only reason he chose that card was that it was better than his TNT2 and he had seen on my computer what it was capable of. In other words, he thought it looked "okay" and didn't want to take the risk. So he spent more money than the better alternatives would have cost. Guess what, he doesn't care and hasn't regretted it for a second. It runs the things he wants to do fine and as expected. I'm very positive a lot of people who are not interested in maximum performance think the same way.
Well, being able to try a product before buying it can certainly be a powerful factor affecting purchase decisions. Obviously if you can see a card running and are happy with the performance, you may be compelled to buy that one instead of a different one that has good reviews but that you can't try. If you ask a knowledgeable friend, chances are they either already own the card they recommend, or they own a card that's out of the naive user's budget. I wouldn't say your example, where you own a mainstream card but recommend a different one, is the most common scenario. And reviews of the Radeon 9000 were generally positive, so that would only have reinforced your brother's decision. Remember the 9000 still outperforms the 5200 in many games, and the minimal DX9 support of the 5200 shouldn't really be a purchase factor unless you're a graphics programmer.
So for this price class, which is the biggest, it's pointless trying to convince anyone that one or the other manufacturer is worse.
Correct. But there are still a lot of people in the market for this class of product that WILL read reviews and get advice from friends, and they CAN be swayed to choose one manufacturer over another.

And for someone who does know that a DirectX 9 card is highly recommended for future games, ATI looks like a rip-off! So it's all very subjective.
Huh? How does a DX8 card look more like a rip-off than a so-called DX9 card that is far too slow to run DX9 apps? At least with the ATI card, you have no illusions about what you are getting. The FX 5200 is much more likely to fall below your expectations.
Well, many people ask my brother what graphics card to buy. And my brother has also assembled a lot of flawlessly working systems for friends and family. They don't care whether he puts an Nvidia or ATI card in it, or why they pay 50 € more than what they can get in a supermarket. They ask for quality and they get it. And I'm sure they feel warm and fuzzy inside about the graphics card because their upgrade demands are never about the graphics card.
Sure. But remember, your brother did still have to choose a graphics card, and his choice directly affected the choice of card for his friends and family. So even if your brother heard from a friend who talked to a friend who read in a magazine somewhere that the GeForce is a bad card because it cheats in benchmarks, then that one magazine's opinion has influenced the decisions of a whole pile of people who don't necessarily know anything about graphics cards.
No matter how much you want it, you can't educate everyone about graphics theory and the chip and card manufacturers. They won't listen. As long as the stuff works and stays competitive, advertisement and marketing strategy have a huge influence on sales for this category.
Of course they do. But marketing and advertising do not necessarily equal trickery. Marketing is about convincing people that they should buy your product, and typically involves conveying the benefits of the product. Advertising is simply about conveying marketing messages to the widest possible audience.

Just an example: a Radeon 9200 is worse than a 9100, which is worse than a 9000, which is worse than an 8500 (there are exceptions of course). These cards are nowadays in the same price category but nearly everyone will buy the 9200 because the number is higher. So, for this example, ATI is again a complete rip-off. Don't get me wrong, I don't mean to bash ATI here. I just want to show that ATI isn't holy either. And Nvidia did an equally dirty trick with Geforce 4 MX vs. GF3 and GF2.

No trickery involved you say? Nobody asked for newer product lines that cost the same but perform worse, but they sell like candy and I don't hear too many people complaining...
I agree that the naming schemes from ATI and Nvidia have become very confusing. In your example, your determinations of which products are "worse" appear to be subjective. The 9000 has fewer texture units and vertex engines than the 8500, but they are more efficient. In some cases (specifically DX8 shader-heavy apps) it's faster, in some cases it's slower. The 9100 is identical to the 8500, so it can't be worse than either chip. The 9200 is a 9000 with AGP 8x, so it can't be worse than a 9000. The 9000 & 9200 also support Fullstream technology, but lack hardware TRUFORM support. Hopefully it's obvious why there is no simple way to name these products. In any case, given that all of them have very similar price, features and performance, I don't see how any one of these products could really be considered a rip-off relative to the others. On the other hand, a GeForce 4 MX that has less performance than a GeForce 2 and fewer features than a GeForce 3 is another story...
If it didn't gain them anything I don't think they would be doing it over and over again. Currently a lot of reviews are just focusing on Nvidia's tricks, now that they have gone one step too far to make their FX products look better than they are. Meanwhile ATI is getting away with every trick of their own. If the FX products were significantly better than the Radeon 9700 range, nobody would be bitching about the trilinear approximations. They would even question why ATI isn't using it to increase performance...
What tricks are ATI getting away with? So far it's been identified they were doing something with the shaders in 3DMark, and they promptly removed that from their driver (resulting in a lower score). They also don't apply AF to all textures if it's selected in the control panel, but they give you a way to disable this optimization if you don't like it. As far as I know, that's it. Nvidia is still cheating in 3DMark03 with clip planes and shader replacements, still cheating in UT2003 with forced bilinear filtering, still cheating in Shadermark and CodeCreatures and lord knows where else. And the result is that they appear to outperform ATI in benchmarks, but further tests have shown that they are slower in most actual games. It's going to be up to reviewers and customers to decide if they get away with it or not.
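To make the bilinear/trilinear distinction that keeps coming up here concrete, below is a minimal Python sketch (my own illustration, not anyone's actual driver code) of how full trilinear blending differs from a "brilinear" shortcut. The `window` parameter is a made-up knob: real hardware picks its own transition width.

```python
import math

def trilinear_weight(lod: float) -> float:
    """Full trilinear: fraction of the coarser mip level blended in,
    varying smoothly across the whole LOD range between two levels."""
    return lod - math.floor(lod)

def brilinear_weight(lod: float, window: float = 0.25) -> float:
    """A 'brilinear' approximation (illustrative parameters): only
    blend inside a narrow window around each mip transition and use
    plain bilinear (weight 0 or 1) everywhere else.  Cheaper, since
    most samples then read a single mip level, but it reintroduces
    visible banding at the transitions."""
    f = lod - math.floor(lod)
    lo, hi = 0.5 - window, 0.5 + window
    if f <= lo:
        return 0.0
    if f >= hi:
        return 1.0
    return (f - lo) / (hi - lo)
```

At LOD 1.5 both return 0.5, but at LOD 1.1 or 1.9 the brilinear version snaps to a single mip level while true trilinear is still blending, which is why the two are hard to tell apart in a static screenshot yet can band visibly in motion.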
What makes you so sure you have 256 MB when it's mentioned on the box? Unless you are the memory manufacturer or the driver developer you have no sure way of knowing what you have.
You can't be sure. That's why you trust friends and reviewers to tell you if you're being lied to. There may be no tangible performance difference between a 256MB and 128MB card, but if a company tried to pass off one as the other, you'd certainly have grounds for a lawsuit.
And as long as you don't own all those cards yourself there is no sure way of knowing which benchmarks are fully correct. And not everybody is willing to read the details of a hundred reviews just to find out who is using the right testing methods. Lately I've been seeing a lot of review conclusions about ATI being the best buy, but when I look at the detailed graphs I sometimes see Nvidia scoring a lot better at higher anti-aliasing. If that's what I'm after, the review's conclusion wasn't very helpful. Also, personally I don't care if ATI scores 300 FPS and Nvidia 'only' 200 FPS at low resolutions or without anti-aliasing.

It's all about expectations. I've seen a lot of people whine about 5 FPS, for Nvidia as well as ATI cards.
I'm not disagreeing with any of that. But if all of your favorite reviewers start saying that something was wrong with Nvidia's benchmarks, there's a good chance you're going to believe them. The geeks like us might analyze their methods, but most other people will just trust them.
Unfortunately reviewers make mistakes too. A lot. Some of them even claim Nvidia is not doing trilinear filtering at all. But these people don't understand what trilinear filtering is and can't tell the difference between an approximation and bilinear. Or take the 24-bit FP vs. 32-bit FP that ATI gets away with. I know it's not against the DirectX 9 specification, but what if programmers ask for it? And what they don't talk about is the banding on some of ATI's mipmap transitions and the worse anisotropic filtering for diagonal directions. And what they mostly show is a floor with a texture that is not running diagonally. And a static image can have a slightly different mipmap bias so the image looks sharper and the mipmap transitions sit further away. Not to mention you need a well-calibrated monitor gamma to see things correctly...
Of course reviewers make mistakes, that's how Nvidia was allowed to get away with all of their cheats in the first place! No one discovered them until after they reviewed the original product. The ATI limitations you mention have all been known for some time and discussed at length. ATI doesn't "get away with" 24-bit FP because this is what is stated in the DX9 spec. The angle-dependence of AF has been around since the 8500 and was significantly improved in the R3xx series. The mipmap/LOD bias and gamma issues apply to tests on any card. The difference with Nvidia's cheats is that they affect only common benchmark applications, in many cases they reduce quality below the level of earlier generations of products, and they can't be disabled!
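On the 24-bit vs. 32-bit FP point above, a rough Python sketch (my own illustration; real GPU rounding details differ, and exponent range, denormals and overflow are ignored here) shows what dropping mantissa bits does to precision. FP24 as used by the R300 stores 16 mantissa bits, IEEE-style FP32 stores 23:

```python
import math

def quantize(x: float, mantissa_bits: int) -> float:
    """Round x to a reduced mantissa width (sketch only: no exponent
    clamping, no denormals, no overflow handling)."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # m in [0.5, 1), x = m * 2**e
    scale = 1 << (mantissa_bits + 1)  # implicit leading bit + stored bits
    return math.ldexp(round(m * scale) / scale, e)

# 1/3 is not exactly representable at either width, but the FP24-style
# result is noticeably coarser than the FP32-style one.
fp24_val = quantize(1 / 3, 16)
fp32_val = quantize(1 / 3, 23)
```

The error of the 16-bit-mantissa result is on the order of 10^-6 versus roughly 10^-8 for 23 bits, which matters for long shader computations but is rarely visible in a single pass of colour math.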
Yes it's quite pointless. Technology will drive itself. After Intel's defeat by AMD in the 1 GHz race, its failure with the 1133 MHz chip and the disappointing Willamette, it regained the crown again. But it's pointless to talk about AMD's glory nowadays. Things are a bit different on the graphics card market, but you could very well forget all about Nvidia's cheating in a year if their next generation of cards is successful. All I want to say is, don't shut your eyes to Nvidia because they had one disappointing product range.
I don't think anyone is proposing that no one should consider purchasing any future Nvidia product because of the problems with their current line-up. But if no one bothered discussing or complaining about their problems, what motivation would they have to correct them as soon as possible? Do you think it was coincidence that Intel really cranked up the speed of their product releases after they started facing serious pressure from AMD and their customers?
Ok, I'll repeat it once more to avoid misunderstandings: I don't favor Nvidia (or ATI). I might have defended them a little in this post, but my main point is that they still produce excellent products, especially for the low-end market, so trying to convince average people of their 'evilness' can make you look like you're sponsored by ATI.

But after all, time will tell...
My point is that none of Nvidia's products, even their low-end products, are actually as good as people were originally led to believe. You don't have to be sponsored by ATI to feel that way. If you're a sports writer and a Yankees fan who thinks the Mets are evil, does that imply that you are being sponsored by the Yankees?
 
GraphixViolence said:
And for someone who does know that a DirectX 9 card is highly recommended for future games, ATI looks like a rip-off! So it's all very subjective.
Huh? How does a DX8 card look more like a rip-off than a so-called DX9 card that is far too slow to run DX9 apps? At least with the ATI card, you have no illusions about what you are getting. The FX 5200 is much more likely to fall below your expectations.
I think what he was trying to say is that it "looks like" a rip-off regardless of the merits of the two cards.

No matter how much you want it, you can't educate everyone about graphics theory and the chip and card manufacturers. They won't listen. As long as the stuff works and stays competitive, advertisement and marketing strategy have a huge influence on sales for this category.
Of course they do. But marketing and advertising do not necessarily equal trickery. Marketing is about convincing people that they should buy your product, and typically involves conveying the benefits of the product. Advertising is simply about conveying marketing messages to the widest possible audience.

I would like to "market" and "advertise" a bridge in Brooklyn... :LOL:

Just an example: a Radeon 9200 is worse than a 9100, which is worse than a 9000, which is worse than an 8500 (there are exceptions of course). These cards are nowadays in the same price category but nearly everyone will buy the 9200 because the number is higher. So, for this example, ATI is again a complete rip-off. Don't get me wrong, I don't mean to bash ATI here. I just want to show that ATI isn't holy either. And Nvidia did an equally dirty trick with Geforce 4 MX vs. GF3 and GF2.

No trickery involved you say? Nobody asked for newer product lines that cost the same but perform worse, but they sell like candy and I don't hear too many people complaining...
I agree that the naming schemes from ATI and Nvidia have become very confusing. [ATI-propaganda snipped]

So, when ATI does it, it's just "confusing"?

If it didn't gain them anything I don't think they would be doing it over and over again. Currently a lot of reviews are just focusing on Nvidia's tricks, now that they have gone one step too far to make their FX products look better than they are. Meanwhile ATI is getting away with every trick of their own. If the FX products were significantly better than the Radeon 9700 range, nobody would be bitching about the trilinear approximations. They would even question why ATI isn't using it to increase performance...
What tricks are ATI getting away with? So far it's been identified they were doing something with the shaders in 3DMark, and they promptly removed that from their driver (resulting in a lower score). They also don't apply AF to all textures if it's selected in the control panel, but they give you a way to disable this optimization if you don't like it. As far as I know, that's it. Nvidia is still cheating in 3DMark03 with clip planes and shader replacements, still cheating in UT2003 with forced bilinear filtering, still cheating in Shadermark and CodeCreatures and lord knows where else.

Why does Nvidia merit a "lord knows what else"? ATI brought us 'Quack', they only filter one texture, they cheated in 3dMark03, and "lord knows what else". The point is that both of these are businesses that will do whatever is necessary to get the upper hand (hopefully) within the bounds of the law (then again the CEO of the Sacred Most Holy Church of ATI is currently being investigated for what seems like obvious insider trading - but I guess that's not "cheating" in your book).

You, and others on this forum remind me of people who are rabid fans of a team, as if their personal self-esteem and self-worth is tied to how well their team does. Sure, it's fun to root for a team, and argue the relative merits of two competing teams, but after you've shouted yourself hoarse, and downed more than your fair share of beers, you go home and get back to your life. But then again for some of us that's all there is?

Bottom Line: Don't be naive about corporations. Their goal is to make money, tons of it, not to be morally superior.

If you're a sports writer and a Yankees fan who thinks the Mets are evil, does that imply that you are being sponsored by the Yankees?

See above.
 
above3d said:
Bottom Line: Don't be naive about corporations. Their goal is to make money, tons of it, not to be morally superior.

Which is exactly why we should watch what they're doing and try to ensure unacceptable behaviour isn't rewarded.

Entropy

PS. Being the co-founder of a small corporation, let me just add that you can actually have as a businessmodel to supply something that is useful at a cost that is mutually beneficial. Not all corporations turn into the communist charicature of private business.
 
Why does Nvidia merit a "lord knows what else"? ATI brought us 'Quack', they only filter one texture, they cheated in 3dMark03, and "lord knows what else".

Well, why not? They've been seen to be doing a hell of a lot more so far (70+ detections in Antidetect), and it's fairly evident that it's been happening for quite some time (the 3DMark increases when the 9700 was announced are a classic example). And, hey, ATI only implemented the filtering after NVIDIA were doling out 5800s with non-trilinear filtering enabled. As for 3DMark03, at least they managed to "cheat" without reducing any IQ and then subsequently owned up to it and removed them - much more than NVIDIA have done so far. So far we've seen NVIDIA spoon-feed the press on their new "optimisation guidelines" only to completely flout them when it suits them to do so.

You, and others on this forum remind me of people who are rabid fans of a team, as if their personal self-esteem and self-worth is tied to how well their team does. Sure, it's fun to root for a team, and argue the relative merits of two competing teams, but after you've shouted yourself hoarse, and downed more than your fair share of beers, you go home and get back to your life. But then again for some of us that's all there is?

And you remind me of the embittered supporter whose team is losing and who is forced to dish out these worthless platitudes. Otherwise, if they want to spend their time discussing in this fashion on forums, what care is it of yours?
 
Above3d:

You claim to have "snipped ATI propaganda" when all you really did was eliminate part of the post that proved Nick wrong!

Just an example: a Radeon 9200 is worse than an 9100, which is worse than an 9000, which is worse than an 8500
Considering a 9100 is the SAME as an 8500, this statement is already absurd without delving deeper..... nice of you to label the facts as "ATI propaganda". We know which team you root for :rolleyes:
 
They can put all the fluff they want on their retail boxes. What's missing from the equation are benchmarks.

Why diss XGI before they launch their first product?
You may be in for a surprise next month :)

SiS/XGI will never make it into OEM boxes without a high end heavy hitter.

Really?

http://www.xbitlabs.com/news/video/display/20030804021242.html

XGI, a subsidiary of Silicon Integrated Systems, last week signed distribution agreements with three Asian distributors of chips intended for use in personal computers.

They will from now on sell a considerable part of XGI's graphics processors to graphics card manufacturers and system integrators across the whole of Asia. This does not mean that XGI will cease to support its existing add-in-board partners, but it means that the mentioned ASIC distributors will seek more clients for the Hsinchu, Taiwan-based graphics company.

NVIDIA and ATI Technologies also use the same structure for selling graphics processors. A substantial part of GPU products goes to large IC distributors, while another part is supplied to big manufacturers of mainboards and graphics cards. Most of the smaller clients of ATI and NVIDIA buy graphics processors from such IC-stocking representatives.

NVIDIA’s IC distributors are Edom Technology and Atlantic Semiconductor. ATI Technologies’ major distributors include Althon, D & H, Ingram Micro and Tech Data.
 