Nick said:
Well, let's take my twin brother as an example. He's not into graphics programming, but he does study computer science. When I bought my Radeon 9000 I was a bit disappointed; a month later much better deals were available. But my brother bought exactly the same card, even against my advice and after reading a few reviews. The only reason he chose that card was that it was better than his TNT 2 and he had seen on my computer what it was capable of. In other words, he thought it looked "okay" and didn't want to take a risk. So he spent more money than the better alternatives would have cost. Guess what: he doesn't care and hasn't regretted it for a second. It runs the things he wants to do fine and as expected. I'm quite sure a lot of people who are not interested in maximum performance think the same way.
Well, being able to try a product before buying it can certainly be a powerful factor affecting purchase decisions. Obviously if you can see a card running and are happy with the performance, you may be compelled to buy that one instead of a different one that has good reviews but that you can't try. If you ask a knowledgeable friend, chances are they either already own the card they recommend, or they own a card that's out of the naive user's budget. I wouldn't say your example, where you own a mainstream card but recommend a different one, is the most common scenario. And reviews of the Radeon 9000 were generally positive, so that would only have reinforced your brother's decision. Remember the 9000 still outperforms the 5200 in many games, and the minimal DX9 support of the 5200 shouldn't really be a purchase factor unless you're a graphics programmer.
So for this price class, which is the biggest, it's pointless trying to convince anyone that one or the other manufacturer is worse.
Correct. But there are still a lot of people in the market for this class of product that WILL read reviews and get advice from friends, and they CAN be swayed to choose one manufacturer over another.
And for someone who does know that a DirectX 9 card is highly recommended for future games, ATI looks like a rip-off! So it's all very subjective.
Huh? How does a DX8 card look more like a rip-off than a so-called DX9 card that is far too slow to run DX9 apps? At least with the ATI card, you have no illusions about what you are getting. The FX 5200 is much more likely to fall below your expectations.
Well, many people ask my brother what graphics card to buy. And my brother has also assembled a lot of flawlessly working systems for friends and family. They don't care whether he puts an Nvidia or ATI card in it, or why they pay 50 € more than what they can get in a supermarket. They ask for quality and they get it. And I'm sure they feel warm and fuzzy inside about the graphics card because their upgrade demands are never about the graphics card.
Sure. But remember, your brother did still have to choose a graphics card, and his choice directly affected the choice of card for his friends and family. So even if your brother heard from a friend who talked to a friend who read in a magazine somewhere that the GeForce is a bad card because it cheats in benchmarks, then that one magazine's opinion has influenced the decisions of a whole pile of people who don't necessarily know anything about graphics cards.
No matter how much you want it, you can't educate everyone about graphics theory and the chip and card manufacturers. They won't listen. As long as the stuff works and stays competitive, advertising and marketing strategy have a huge influence on sales in this category.
Of course they do. But marketing and advertising do not necessarily equal trickery. Marketing is about convincing people that they should buy your product, and typically involves conveying the benefits of the product. Advertising is simply about conveying marketing messages to the widest possible audience.
Just an example: a Radeon 9200 is worse than a 9100, which is worse than a 9000, which is worse than an 8500 (there are exceptions, of course). These cards are nowadays in the same price category, but nearly everyone will buy the 9200 because the number is higher. So, for this example, ATI is again a complete rip-off. Don't get me wrong, I don't mean to bash ATI here; I just want to show that ATI isn't holy either. And Nvidia pulled an equally dirty trick with the GeForce 4 MX vs. the GF3 and GF2.
No trickery involved you say? Nobody asked for newer product lines that cost the same but perform worse, but they sell like candy and I don't hear too many people complaining...
I agree that the naming schemes from ATI and Nvidia have become very confusing. In your example, your determinations of which products are "worse" appear to be subjective. The 9000 has fewer texture units and vertex engines than the 8500, but they are more efficient. In some cases (specifically DX8 shader-heavy apps) it's faster, in some cases it's slower. The 9100 is identical to the 8500, so it can't be worse than either chip. The 9200 is a 9000 with AGP 8x, so it can't be worse than a 9000. The 9000 & 9200 also support Fullstream technology, but lack hardware TRUFORM support. Hopefully it's obvious why there is no simple way to name these products. In any case, given that all of them have very similar price, features and performance, I don't see how any one of these products could really be considered a rip-off relative to the others. On the other hand, a GeForce 4 MX that has less performance than a GeForce 2 and fewer features than a GeForce 3 is another story...
If it didn't gain them anything, I don't think they would be doing it over and over again. Currently a lot of reviews are focusing on Nvidia's tricks, now that they've gone one step too far in trying to make their FX products look better than they are. Meanwhile, ATI is getting away with every trick of their own. If the FX products were significantly better than the Radeon 9700 range, nobody would be bitching about the trilinear approximations. They would even question why ATI isn't using them to increase performance...
What tricks are ATI getting away with? So far, the only thing identified is that they were doing something with the shaders in 3DMark, and they promptly removed it from their driver (resulting in a lower score). They also don't apply AF to all textures when it's selected in the control panel, but they give you a way to disable this optimization if you don't like it. As far as I know, that's it. Nvidia is still cheating in 3DMark03 with clip planes and shader replacements, still cheating in UT2003 with forced bilinear filtering, still cheating in Shadermark and CodeCreatures and lord knows where else. And the result is that they appear to outperform ATI in benchmarks, but further tests have shown that they are slower in most actual games. It's going to be up to reviewers and customers to decide if they get away with it or not.
What makes you so sure you have 256 MB when it's mentioned on the box? Unless you are the memory manufacturer or the driver developer you have no sure way of knowing what you have.
You can't be sure. That's why you trust friends and reviewers to tell you if you're being lied to. There may be no tangible performance difference between a 256MB and 128MB card, but if a company tried to pass off one as the other, you'd certainly have grounds for a lawsuit.
And as long as you don't own all those cards yourself, there is no sure way of knowing which benchmarks are fully correct. And not everybody is willing to read the details of a hundred reviews just to find out who is using the right testing methods. Lately I've been seeing a lot of review conclusions about ATI being the best buy, but when I look at the detailed graphs I sometimes see Nvidia scoring a lot better at higher anti-aliasing levels. If that's what I'm looking for, the review's conclusion wasn't very helpful. Also, personally I don't care if ATI scores 300 FPS and Nvidia 'only' 200 FPS at low resolutions or without anti-aliasing.
It's all about expectations. I've seen a lot of people whine about 5 FPS, for Nvidia as well as ATI cards.
I'm not disagreeing with any of that. But if all of your favorite reviewers start saying that something was wrong with Nvidia's benchmarks, there's a good chance you're going to believe them. The geeks like us might analyze their methods, but most other people will just trust them.
Unfortunately, reviewers make mistakes too. A lot. Some of them even claim Nvidia is not doing trilinear filtering at all, but these people don't understand what trilinear filtering is and can't tell the difference between an approximation and plain bilinear. Or take the 24-bit FP vs. 32-bit FP issue ATI gets away with: I know 32-bit isn't a DirectX 9 requirement, but what if programmers ask for it? And what they don't talk about is the banding on some of ATI's mipmap transitions, and the worse anisotropic filtering along diagonal directions. What they mostly show is a floor with a texture that isn't running diagonally. And a static image can have a slightly different mipmap bias, so the image looks sharper and the mipmap transitions are pushed further away. Not to mention you need well-calibrated monitor gamma to see things correctly...
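The distinction above between true trilinear filtering and an approximation of it is easy to show in code. Here is a minimal 1-D sketch, assuming a simple image pyramid where each level is half the size of the one below; the blend-window width and all names are illustrative, not any vendor's actual implementation:

```python
# 1-D sketch of bilinear vs. trilinear vs. a "brilinear"-style approximation.
# Each mip level is a list of floats; level n+1 is half the size of level n.

def linear_sample(level, x):
    """Linearly filtered sample from a 1-D mip level."""
    x = max(0.0, min(x, len(level) - 1.000001))
    i = int(x)
    f = x - i
    return level[i] * (1.0 - f) + level[i + 1] * f

def bilinear(pyramid, x, lod):
    # Sample only the nearest mip level: cheap, but produces hard banding
    # wherever the rounded LOD changes.
    return linear_sample(pyramid[round(lod)], x)

def trilinear(pyramid, x, lod):
    # Blend between the two adjacent mip levels using the fractional LOD.
    lo = int(lod)
    f = lod - lo
    a = linear_sample(pyramid[lo], x)
    b = linear_sample(pyramid[lo + 1], x / 2.0)  # next level is half-size
    return a * (1.0 - f) + b * f

def brilinear(pyramid, x, lod, window=0.25):
    # Approximation: only blend inside a narrow window around the level
    # boundary; elsewhere snap to a single level (i.e. bilinear). Cheaper,
    # and NOT the same as dropping trilinear entirely.
    lo = int(lod)
    f = lod - lo
    if f < 0.5 - window:
        f = 0.0
    elif f > 0.5 + window:
        f = 1.0
    else:
        f = (f - (0.5 - window)) / (2.0 * window)  # remap window to 0..1
    a = linear_sample(pyramid[lo], x)
    b = linear_sample(pyramid[lo + 1], x / 2.0)
    return a * (1.0 - f) + b * f

# Example: a tiny 3-level pyramid built by averaging pairs of texels.
pyramid = [[0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0],
           [0.5, 2.5, 4.5, 6.5],
           [1.5, 5.5]]
print(trilinear(pyramid, 2.0, 0.1))   # 2.05: slight blend toward level 1
print(brilinear(pyramid, 2.0, 0.1))   # 2.0:  snapped to level 0 only
```

With a small fractional LOD, the approximation returns exactly the single-level result while true trilinear already blends slightly; that subtle difference, rather than an absence of trilinear, is what a careful reviewer would need to measure.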
Of course reviewers make mistakes, that's how Nvidia was allowed to get away with all of their cheats in the first place! No one discovered them until after they reviewed the original product. The ATI limitations you mention have all been known for some time and discussed at length. ATI doesn't "get away with" 24-bit FP because this is what is stated in the DX9 spec. The angle-dependence of AF has been around since the 8500 and was significantly improved in the R3xx series. The mipmap/LOD bias and gamma issues apply to tests on any card. The difference with Nvidia's cheats is that they affect only common benchmark applications, in many cases they reduce quality below the level of earlier generations of products, and they can't be disabled!
Yes, it's quite pointless. Technology will drive itself. After Intel's defeat by AMD in the 1 GHz race, its failure with the 1133 MHz chip and the disappointing Willamette, it regained the crown again. But it's pointless to talk about AMD's glory nowadays. Things are a bit different in the graphics card market, but you could very well forget all about Nvidia's cheating in a year if their next generation of cards is successful. All I want to say is: don't shut your eyes to Nvidia just because they had one disappointing product range.
I don't think anyone is proposing that no one should consider purchasing any future Nvidia product because of the problems with their current line-up. But if no one bothered discussing or complaining about their problems, what motivation would they have to correct them as soon as possible? Do you think it was coincidence that Intel really cranked up the speed of their product releases after they started facing serious pressure from AMD and their customers?
Ok, I'll repeat it once more to avoid misunderstandings: I don't favor Nvidia (or ATI). I might have defended them a little in this post, but my main point is that they still produce excellent products, especially for the low-end market, so trying to convince average people of their 'evilness' can make you look like you're sponsored by ATI.
But after all, time will tell...
My point is that none of Nvidia's products, even their low-end products, are actually as good as people were originally led to believe. You don't have to be sponsored by ATI to feel that way. If you're a sports writer and a Yankees fan who thinks the Mets are evil, does that imply that you are being sponsored by the Yankees?