nVidia's cheating: Dumbing it down.

Nick said:
You're having a hard time because the difference between Nvidia and ATI is marginal.

The problem is: Only geeks like you care about 5% performance.

It boggles the mind how someone could post to a site like B3d and yet be oblivious to the many articles and threads here which show conclusively that, at similar levels of IQ, the performance difference is far, far greater than 5%. Perhaps the problem is that, since you are not a geek (assumed from your comment above), you simply don't understand the facts demonstrated copiously on that topic?

Only geeks like you care if an optimization that is not perceivable can't be turned off in the settings.

OK, I guess the issue of whether you're a geek is settled...;) So, it's not surprising that you can't perceive things which are so obvious to the "geeks" you are conversing with. Until you become a geek I perceive you will not be able to perceive many things...;)

Only geeks like you care if an office computer has an Nvidia or ATI card. Only geeks like you pay the double for the cheapest DirectX 9 card.

Yes, of course--but then this makes both the ATi corporation and the nVidia corporation "geeks" too, doesn't it, since they both insist on shipping their products in distinctively different packages and pointing out the differences between them to all the non-geeks they can get to listen to them? As well, only non-geeks get suckered into buying so-called "DX9" cards without even knowing what DX9 is in the first place.

The similarity with Intel vs. AMD is striking. But when you dumb it down, they are both excellent companies with excellent products. And somebody without a great deal of technical knowledge doesn't care at all. "Build me a computer for 2000 €" they say, and they wouldn't notice if you sold them your old one and built a new one for yourself.

You are correct: this is exactly the approach that non-geeks take. It also is precisely why non-geeks are so easily suckered.

No hard feelings, but you're wasting time. My time! ;)

Don't despair--when you at last learn enough about the basics of how things work to reach the status of "geek," then you will find your time is far more constructively apportioned when exploring such issues.

geek = derogatory slang used by non-geeks to denote the utter confusion and the frustration they often feel when reading things they are unable to understand. Calling those who understand the topics discussed "geeks" makes the non-geek feel more secure in his ignorance.

(Sorry, but your post was too tempting a morsel to ignore...;))
 
WaltC said:
geek = derogatory slang used by non-geeks to denote the utter confusion and the frustration they often feel when reading things they are unable to understand. Calling those who understand the topics discussed "geeks" makes the non-geek feel more secure in his ignorance.

Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is, most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain
 
AzBat said:
WaltC said:
geek = derogatory slang used by non-geeks to denote the utter confusion and the frustration they often feel when reading things they are unable to understand. Calling those who understand the topics discussed "geeks" makes the non-geek feel more secure in his ignorance.

Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is, most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain

Heh...:) Good quote, indeed...;)

Yea, the "geek" stigma has always seemed a bit baffling to me...I can understand the "nerd" stigma a bit better...*chuckle*...because the word "nerd" just has that nasal kind of twang to it...heh (trying to keep a straight face.) Oh, never mind...can't do it.....GUFFAW! Hah-ha-hahhah...;) Man ...;)
 
WaltC said:
AzBat said:
WaltC said:
geek = derogatory slang used by non-geeks to denote the utter confusion and the frustration they often feel when reading things they are unable to understand. Calling those who understand the topics discussed "geeks" makes the non-geek feel more secure in his ignorance.

Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is, most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain

Heh...:) Good quote, indeed...;)

Yea, the "geek" stigma has always seemed a bit baffling to me...I can understand the "nerd" stigma a bit better...*chuckle*...because the word "nerd" just has that nasal kind of twang to it...heh (trying to keep a straight face.) Oh, never mind...can't do it.....GUFFAW! Hah-ha-hahhah...;) Man ...;)

The Dig gives AzBat & WaltC a big hug!

I love you guys! :LOL:
 
digitalwanderer said:
WaltC said:
AzBat said:
WaltC said:
geek = derogatory slang used by non-geeks to denote the utter confusion and the frustration they often feel when reading things they are unable to understand. Calling those who understand the topics discussed "geeks" makes the non-geek feel more secure in his ignorance.

Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is, most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain

Heh...:) Good quote, indeed...;)

Yea, the "geek" stigma has always seemed a bit baffling to me...I can understand the "nerd" stigma a bit better...*chuckle*...because the word "nerd" just has that nasal kind of twang to it...heh (trying to keep a straight face.) Oh, never mind...can't do it.....GUFFAW! Hah-ha-hahhah...;) Man ...;)

The Dig gives AzBat & WaltC a big hug!

I love you guys! :LOL:

LOL, you guys are nuts. :)

BTW, it sounds better this way...

"I love you man!"

"Johnny you're not getting my beer." -- Bud Light Commerical

:D Oh man, beer does sound good right now. ;)

Tommy McClain
 
WaltC said:
It boggles the mind how someone could post to a site like B3d and yet be oblivious to the many articles and threads here which show conclusively that, at similar levels of IQ, the performance difference is far, far greater than 5%. Perhaps the problem is that, since you are not a geek (assumed from your comment above), you simply don't understand the facts demonstrated copiously on that topic?
Duh! Obviously I wasn't speaking for myself. I was speaking from the viewpoint of the "non-technically minded person". And trust me, all of my family, friends and university pals, even those who like gaming or have some knowledge about hardware, don't fucking care about 5% or even 50%. They want to be able to plug it in and it has to work, period. No matter how many optimizations Nvidia makes, they don't care as long as they think it looks OK. And it does, to them. They'd also rather have a driver that works automatically than a billion settings they hardly understand. All they need to know is the difference between performance and quality mode. So for them a Geforce FX 5200 is a lot better than any low-end Radeon, because it simply runs everything available today and in the near future, at modest resolution and framerate.
OK, I guess the issue of whether you're a geek is settled...;) So, it's not surprising that you can't perceive things which are so obvious to the "geeks" you are conversing with. Until you become a geek I perceive you will not be able to perceive many things...;)
Actually I am a geek. Look at my signature. I'm probably even more of a geek than the average Beyond3D member. :LOL:
Yes, of course--but then this makes both the ATi corporation and the nVidia corporation "geeks" too, doesn't it, since they both insist on shipping their products in distinctively different packages and pointing out the differences between them to all the non-geeks they can get to listen to them? As well, only non-geeks get suckered into buying so-called "DX9" cards without even knowing what DX9 is in the first place.
Indeed, the average customer who seeks to increase his e-penis only looks at the numbers. That's why 256 MB sells very well even for low-end cards. And both Nvidia and ATI trick people this way. But in the end they don't care what the real value of it is.
You are correct: this is exactly the approach that non-geeks take. It also is precisely why non-geeks are so easily suckered.
And you know what, they don't care! Tell them all you want, make stories like comparing Nvidia to the cheating baseball player, they'll believe whatever you say. But send them to a computer shop and they'll buy it anyway. If their computer starts up without an error message, they are the happiest people in the world.

In the high-end market different rules apply because most people there buy consciously. But that's a marginal percentage of the whole market. John Doe doesn't lie awake pondering what card to buy because whatever he chooses he will be happy with it. So you're wasting your time trying to convince him to make another choice.
Don't despair--when you at last learn enough about the basics of how things work to reach the status of "geek," then you will find your time is far more constructively apportioned when exploring such issues.
Well, I think I've already learned a lot more than the basics. Not coincidentally, both Nvidia and ATI have contacted me about a job offer. ATI didn't want me because I'm still studying, but Nvidia offered me an internship and I passed their tests. That doesn't make me biased towards Nvidia, though. I might even have evolved past the point of pointlessly discussing the marginal differences between all cards, once you factor in all aspects.

And I would trade my Radeon 9000 for a Geforce FX 5200 any time just for the DirectX 9 programming support. :p Have a nice day!
 
Nick said:
They want to be able to plug it in and it has to work, period. No matter how many optimizations Nvidia makes, they don't care as long as they think it looks OK. And it does, to them. They'd also rather have a driver that works automatically than a billion settings they hardly understand. All they need to know is the difference between performance and quality mode. So for them a Geforce FX 5200 is a lot better than any low-end Radeon, because it simply runs everything available today and in the near future, at modest resolution and framerate.
How is someone who doesn't know anything about graphics cards supposed to know if something looks "okay"? Either you have $x in your pocket to spend on an upgrade and you go to the store to see what that gets you, or you've done some research by asking friends, reading magazines, and surfing the net to determine what card is the best deal for your needs.

In the former case, you'll go to the store and see an FX 5200 and a Radeon 9200 at about the same price, and it'll probably be a toss-up which one you buy. Performance-wise, the two products are pretty close, both have fairly mature and solid drivers, and since you have no idea what "DirectX" means, you'll either be happy with both cards or unhappy with both cards. In the end it'll probably come down to price and what the guy in the store recommends... how each product makes games look will never even be a factor.

In the latter case, you're going to want someone to tell you which card is better and why. If a reviewer says DX9 is important, you'll believe him. If a reviewer says he can see one card has superior AA quality, you'll believe him. In other words, the reviewer determines what is important. Once again, whether or not a given product makes games look "okay" to you is never an issue... you just need someone "in the know" to tell you that you're getting the best product, so you feel all warm and fuzzy and happy. That is why issues that only a "geek" could appreciate end up being important to everyone.

Indeed, the average customer who seeks to increase his e-penis only looks at the numbers. That's why 256 MB sells very well even for low-end cards. And both Nvidia and ATI trick people this way. But in the end they don't care what the real value of it is.
There is no trickery involved here. Both companies are giving people what they want. Everyone is looking for good value, and they know that more memory at a lower price means you're getting more of something for less, so they're happy to buy it. I have no doubts that if both companies had the means to educate more people about how things like AA and DX9 are more important than memory, they would gladly do so. But until that can happen, when people naively ask for boards with more memory, ATI & Nvidia would be stupid not to deliver it.

On the other hand, what if one company told you their product had 256MB of memory, even though it only had 128MB? Wouldn't you be pissed off if you found out? How about if you bought the product because you heard it got the highest benchmark scores, and then found out it was cheating on the tests? Or how about if you bought the product because you heard it supported DX9, then found out that it was too slow to run DX9 games and had to default back to DX8? Even if you didn't really know much about how benchmarks worked or what DX9 meant, wouldn't you still be ticked off to find these things out? Regardless of how much we know, we never enjoy feeling like a sucker.

And you know what, they don't care! Tell them all you want, make stories like comparing Nvidia to the cheating baseball player, they'll believe whatever you say. But send them to a computer shop and they'll buy it anyway. If their computer starts up without an error message, they are the happiest people in the world.
Wrong... see above. If someone reads a review that says "don't buy this card because it cheats... buy this other card instead", people will listen, because they assume the reviewer knows what he's talking about. And if they find out AFTER they bought the card that it doesn't actually deliver what it was supposed to, they're not going to be happy, even if it seems to be working fine in their system. They won't be happy because now they know they could have bought a better card, and were suckered into buying an inferior product. That feeling sucks.

I might even have evolved past the point of pointlessly discussing the marginal differences between all cards, once you factor in all aspects.
These so-called "pointless" discussions are the only way that products improve over time. If the only app you ever run on your PC is Internet Explorer, then as far as you're concerned, every new graphics card released in the past 3-4 years is pointless. Fortunately, there are enough people who care about these things to keep driving technology forward.
 
Nick said:
Duh! Obviously I wasn't speaking for myself. I was speaking from the viewpoint of the "non-technically minded person".

Why do that, though? If you aren't speaking for yourself, why presume to speak for other people--especially other people you lump together in groups which you call "geek" and "other-than-geek," etc.? When you do that you run the risk of having people think that you are just spouting your own opinions and simply hiding them behind a pretense of "speaking for other people," right? It really is a pretentious claim to speak for even one other individual, much less entire populations of them...;)

And trust me, all of my family, friends and university pals, even those who like gaming or have some knowledge about hardware, don't fucking care about 5% or even 50%. They want to be able to plug it in and it has to work, period. No matter how many optimizations Nvidia makes, they don't care as long as they think it looks OK. And it does, to them. They'd also rather have a driver that works automatically than a billion settings they hardly understand. All they need to know is the difference between performance and quality mode.

I'm quite sure you think this is what they think--that's obvious. But you'll discover that people who spend $400 + for a 3d card care very much about these things. You'll save yourself a lot of grief if you simply let them decide what "they need to know"--but I have to admit you sound like a young man who wouldn't hesitate to tell them what it is you think they "need to know," right?...;)

BTW, the very existence of the B3d forums itself is a persuasive argument against your opinions as these forums regularly host lots of folks who think they "need to know" a lot more than the manufacturer is willing to tell them, and they span all age groups, incomes, educational and job experience levels. There are dozens if not hundreds of forums all across the Internet in which thousands of people daily participate in order to broaden the base of "what they need to know." In short, "what they need to know" is never fixed in stone. It changes as the industry changes--sometimes slowly, sometimes dynamically.

So for them a Geforce FX 5200 is a lot better than any low-end Radeon because it simply runs everything available today and in the near future, at modest resolution and framerate.

Even if that was true, by your own definition of "these people"--it's not going to matter to them--they'll just punch out the cheapest card, right?

Actually I am a geek. Look at my signature. I'm probably even more of a geek than the average Beyond3D member. :LOL:

OK, so if I have this right you classify yourself as a "geek" and by the definition of "geek" you enumerated in your original post you "care about marginal differences" in 3d cards, you care about "5% performance increases" and you care about who makes the 3d card you buy and you will cheerfully "pay double" for the cheapest DX9 card. I mean, that's the way you defined "geek" in your last post...;)

Why didn't you say so? Heh...;)

Indeed, the average customer who seeks to increase his e-penis only looks at the numbers. That's why 256 MB sells very well even for low-end cards. And both Nvidia and ATI trick people this way. But in the end they don't care what the real value of it is.

My, what a "geekish" description that is...;)

Basically, most average customers care about getting the most value for their dollar--pretty simple, actually. Thank you for revealing to the world that ATi and nVidia actually are selling the same products in different boxes in a grand conspiracy to fool their stockholders and the public alike into believing that they are two separate companies selling different products into the marketplace. Now all you've got to do is to sell your geekish message to the rest of the world. The only problem I see is that I doubt geeks and non-geeks alike will pay much attention to your "It's all the same, suckers!" pronouncements. The products are not the same--if you really were a "geek" you'd know that, regardless of which product you might prefer.

And you know what, they don't care! Tell them all you want, make stories like comparing Nvidia to the cheating baseball player, they'll believe whatever you say. But send them to a computer shop and they'll buy it anyway. If their computer starts up without an error message, they are the happiest people in the world.

There you go again...speaking for "them"...;) You might want to think about the fact that the reason your friends ignore your advice is simply because you've failed to persuade them. But more than that--who cares? Talking about the imagined conduct and motivations of "them" doesn't diminish the importance of such topics in the minds of geeks who care about those topics. Not by one little bit. So it's kind of silly to worry about "them" when the only people we can control and direct are ourselves. If "they" walk off a cliff are you going with them? Etc.

In the high-end market different rules apply because most people there buy consciously. But that's a marginal percentage of the whole market.

It may be "marginal" in terms of the number of units sold in comparison to the low end; however, it is anything but "marginal" in terms of the profits companies make per unit selling into the mid-high end. Remember, they can make as much on a single 5900U/9800P sale as they can on five or more 5200/9000 sales. In the last statement I saw by an ATi representative, he related that the profits were split nearly evenly between the low-end desktop (much higher volume) and the middle/high ends (lower volume but much higher profit per unit). So to the respective 3d companies, selling into the mid-high desktop range is anything but a "marginal" situation. Think about why nVidia even attempted to bring the nv30U to the mass market--if sales of such products were indeed marginal to these companies, nVidia would never have bothered. But bother they did, to the tune of hundreds of millions of dollars.

John Doe doesn't lie awake pondering what card to buy because whatever he chooses he will be happy with it. So you're wasting your time trying to convince him to make another choice.

Who said anything about "John Doe"...? I think it's very simple: the people who care about such things will care, and they are often listened to by their friends and associates (even if yours don't listen to you.) The people who don't care, don't care--and are just as likely to buy an ATi card as they are a nVidia card--because they don't care. The assumption that people who don't care would be more inclined to buy one product over another strikes me as a contradiction in terms. Remember that "John Doe" doesn't even know what DX9 is...So why talk about DX9 as though it's relevant to John Doe? The fact is that people who don't care are no more inclined to buy one card than another. So talking about what "they" are likely to buy is a wash.

Well, I think I've already learned a lot more than the basics. Not coincidentally, both Nvidia and ATI have contacted me about a job offer. ATI didn't want me because I'm still studying, but Nvidia offered me an internship and I passed their tests. That doesn't make me biased towards Nvidia, though. I might even have evolved past the point of pointlessly discussing the marginal differences between all cards, once you factor in all aspects.

That's great--glad for you! Now, when you get a job after school you'll begin the years-long process of accumulating experience, and during that time you'll discover just how things really work. I mean, just the fact that you consistently keep saying "marginal" differences illustrates to me that you have a lot to learn about technology in general. Some technologies are similar, some are radically different. It would be a serious mistake to begin making such generalizations at such an early point in your career. You might want to check into PR work with some of the tech companies, though...;) nVidia right now could probably use someone with your particular set of prejudices...(I'm not trying to be mean or uncouth--just honest with you. You are reaching general conclusions way too early for your own good.)

And I would trade my Radeon 9000 for a Geforce FX 5200 any time just for the DirectX 9 programming support. :p Have a nice day!

Ah! I'd hoped we'd at last get down to the "they" you were actually talking about. Of course the "they" is simply you...;) See how easy...? Next time you might want to consider saying, "I prefer the 5200 and can see a difference that is other-than-marginal" and start the conversation from there...it'd be a lot simpler and a lot more productive coming at it that way.
 
AzBat said:
Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is, most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain
I'll see your Cecil and raise you a Wheeler. :)

"As our island of knowledge grows, so does the shore of our ignorance." ~ John Archibald Wheeler
 
GraphixViolence said:
How is someone who doesn't know anything about graphics cards supposed to know if something looks "okay"? Either you have $x in your pocket to spend on an upgrade and you go to the store to see what that gets you, or you've done some research by asking friends, reading magazines, and surfing the net to determine what card is the best deal for your needs.
Well, let's take my twin brother as an example. He's not into graphics programming, but he does study computer science. When I bought my Radeon 9000 I was a bit disappointed; a month later a lot of better deals were available. But my brother bought exactly the same card, even against my advice and after reading a few reviews. The only reason he chose that card was because it was better than his TNT 2 and he had seen on my computer what it was capable of. In other words, he thought it looked "okay" and didn't want to take the risk. So he spent more money than the better alternatives would have cost. Guess what: he doesn't care and hasn't regretted it for a second. It runs the things he wants to do fine and as expected. I'm positive a lot of people who are not interested in maximum performance think the same way.
In the former case, you'll go to the store and see a FX 5200 and a Radeon 9200 at about the same price, and it'll probably be a toss up which one you buy. Performance-wise, the two products are pretty close, both have fairly mature and solid drivers, and since you have no idea what "DirectX" means, you'll either be happy with both cards or unhappy with both cards. In the end it'll probably come down to price and what the guy in the store recommends... how each product makes games look will never even be a factor.
So for this price class, which is the biggest, it's pointless trying to convince anyone that one or the other manufacturer is worse. And for someone who does know that a DirectX 9 card is highly recommended for future games, ATI looks like a rip-off! So it's all very subjective.
In the latter case, you're going to want someone to tell you which card is better and why. If a reviewer says DX9 is important, you'll believe him. If a reviewer says he can see one card has superior AA quality, you'll believe him. In other words, the reviewer determines what is important. Once again, whether or not a given product makes games look "okay" to you is never an issue... you just need someone "in the know" to tell you that you're getting the best product, so you feel all warm and fuzzy and happy. That is why issues that only a "geek" could appreciate end up being important to everyone.
Well, many people ask my brother what graphics card to buy. And my brother has also assembled a lot of flawlessly working systems for friends and family. They don't care whether he puts an Nvidia or ATI card in it, or why they pay 50 € more than they would for something from a supermarket. They ask for quality and they get it. And I'm sure they feel warm and fuzzy inside about the graphics card, because their upgrade demands are never about the graphics card.
There is no trickery involved here. Both companies are giving people what they want. Everyone is looking for good value, and they know that more memory at a lower price means you're getting more of something for less, so they're happy to buy it. I have no doubts that if both companies had the means to educate more people about how things like AA and DX9 are more important than memory, they would gladly do so. But until that can happen, when people naively ask for boards with more memory, ATI & Nvidia would be stupid not to deliver it.
No matter how much you want it, you can't educate everyone about graphics theory and the chip and card manufacturers. They won't listen. As long as the stuff works and stays competitive, advertisement and marketing strategy have a huge influence on sales in this category.

Just an example: a Radeon 9200 is worse than a 9100, which is worse than a 9000, which is worse than an 8500 (there are exceptions, of course). These cards are nowadays in the same price category, but nearly everyone will buy the 9200 because the number is higher. So, for this example, ATI is again a complete rip-off. Don't get me wrong, I don't mean to bash ATI here. I just want to show that ATI isn't holy either. And Nvidia did an equally dirty trick with the Geforce 4 MX vs. the GF3 and GF2.

No trickery involved, you say? Nobody asked for newer product lines that cost the same but perform worse, yet they sell like candy and I don't hear too many people complaining...

If it didn't gain them anything, I don't think they would be doing it over and over again. Currently a lot of reviews are focusing on Nvidia's tricks, now that they've gone one step too far to make their FX products look better than they are. Meanwhile ATI is getting away with every trick of their own. If the FX products were significantly better than the Radeon 9700 range, nobody would be bitching about the trilinear approximations. They would even question why ATI isn't using them to increase performance...
On the other hand, what if one company told you their product had 256MB of memory, even though it only had 128MB? Wouldn't you be pissed off if you found out? How about if you bought the product because you heard it got the highest benchmark scores, and then found out it was cheating on the tests? Or how about if you bought the product because you heard it supported DX9, then found out that it was too slow to run DX9 games and had to default back to DX8? Even if you didn't really know much about how benchmarks worked or what DX9 meant, wouldn't you still be ticked off to find these things out? Regardless of how much we know, we never enjoy feeling like a sucker.
What makes you so sure you have 256 MB when it's mentioned on the box? Unless you are the memory manufacturer or the driver developer you have no sure way of knowing what you have. OK, OK, I won't doubt it much, but you can get screwed in many ways. And as long as you don't own all those cards yourself there is no sure way of knowing which benchmarks are fully correct. And not everybody is willing to read the details of a hundred reviews just to find out who is using the right testing methods. Lately I've been seeing a lot of review conclusions about ATI being the best buy, but when I look at the detailed graphs I sometimes see Nvidia scoring a lot better at higher anti-aliasing. If that's what I'm looking for, the review's conclusion wasn't very helpful. Also, I personally don't care if ATI scores 300 FPS and Nvidia 'only' 200 FPS at low resolutions or without anti-aliasing.

It's all about expectations. I've seen a lot of people whine about 5 FPS, for Nvidia as well as ATI cards.
Wrong... see above. If someone reads a review that says "don't buy this card because it cheats... buy this other card instead", people will listen, because they assume the reviewer knows what he's talking about. And if they find out AFTER they bought the card that it doesn't actually deliver what it was supposed to, you're not going to be happy, even if it seems to be working fine in your system. You won't be happy because now you know you could have bought a better card, and were suckered into buying an inferior product. That feeling sucks.
Unfortunately, reviewers make mistakes too. A lot. Some of them even claim Nvidia is not doing trilinear filtering at all. But these people don't understand what trilinear filtering is and can't tell the difference between an approximation and bilinear. Or the 24-bit FP vs. 32-bit FP that ATI gets away with. I know it's not a DirectX 9 specification issue, but what if programmers ask for it? And what they don't talk about is the banding on some of ATI's mipmap transitions and the worse anisotropic filtering in diagonal directions. What they mostly show is a floor with a texture that isn't running diagonally. And a static image can have a slightly different mipmap bias, so the image looks sharper and the mipmap transitions sit further away. Not to mention you need a well-calibrated monitor gamma to see things correctly...
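For anyone who wants to see what the fuss is about, here's a rough sketch in Python. This is nobody's actual driver code: `sample_mip` stands in for fetching a bilinearly filtered sample from one mipmap level, and the blend-band width of the approximation is an invented number, just for illustration:

```python
import math

def bilinear(sample_mip, lod):
    # Plain bilinear: sample only the nearest mipmap level.
    return sample_mip(round(lod))

def trilinear(sample_mip, lod):
    # Full trilinear: blend the two nearest levels by the fractional
    # LOD, so mipmap transitions are perfectly smooth.
    lo = math.floor(lod)
    frac = lod - lo
    return (1 - frac) * sample_mip(lo) + frac * sample_mip(lo + 1)

def brilinear(sample_mip, lod, band=0.25):
    # The approximation: blend only inside a narrow band around each
    # level transition; everywhere else it degenerates to bilinear.
    lo = math.floor(lod)
    frac = lod - lo
    if frac < 0.5 - band / 2:
        return sample_mip(lo)
    if frac > 0.5 + band / 2:
        return sample_mip(lo + 1)
    t = (frac - (0.5 - band / 2)) / band  # remap the band to 0..1
    return (1 - t) * sample_mip(lo) + t * sample_mip(lo + 1)
```

Treat mip level n as returning brightness n (pass `float` as `sample_mip`) and the difference is obvious: full trilinear at LOD 1.25 blends to 1.25, while the approximation snaps to level 1. Subtle on most textures, but not at all the same thing as bilinear, which never blends between levels anywhere.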

Again, I'm not saying Nvidia is perfect either. Not at all. My next card may very well be an ATI card again because they currently offer better mid-range cards for low prices.
These so-called "pointless" discussions are the only way that products improve over time. If the only app you ever run on your PC is Internet Explorer, then as far as you're concerned, every new graphics card released in the past 3-4 years is pointless. Fortunately, there are enough people who care about these things to keep driving technology forward.
Yes, it's quite pointless. Technology will drive itself. After Intel's defeat by AMD in the 1 GHz race, its failure with the 1133 MHz chip and the disappointing Willamette, it regained the crown again. But it's pointless to talk about AMD's glory nowadays. Things are a bit different in the graphics card market, but you could very well forget all about Nvidia's cheating in a year if their next generation of cards is successful. All I want to say is: don't close your eyes to Nvidia just because they had one disappointing product range.

Ok, I'll repeat it once more to avoid misunderstandings: I don't favor Nvidia (or ATI). I might have defended them a little in this post, but my main point is that they still produce excellent products, especially for the low-end market, so trying to convince average people of their 'evilness' can make you look like you're sponsored by ATI.

But after all, time will tell...
 
Pete said:
AzBat said:
Here's my favorite quote for people not in the know...

"The first step towards knowledge is to know that we are ignorant." - Richard Cecil

Bad thing is most people are too lazy to become knowledgeable. Plus, they won't because they don't want to be considered a geek. ;)

Tommy McClain
I'll see your Cecil and raise you a Wheeler. :)

"As our island of knowledge grows, so does the shore of our ignorance." ~ John Archibald Wheeler

LOL. That's a good one too, thanks! I'll add it to my list of ignorance quotes. Here's a list of the ones I have so far...

"It is impossible to defeat an ignorant man in argument." - William G. McAdoo

"There is one way to handle the ignorant and malicious critic. Ignore him." - Author Unknown

"Don't mind criticism. If it is untrue, disregard it; if unfair, keep from irritation; if it is ignorant, smile; if it is justified it is not criticism, learn from it." - Author Unknown

"Your ignorance cramps my conversation." - Author Unknown

:D

Tommy McClain
 
WaltC said:
Why do that, though? If you aren't speaking for yourself, why presume to speak for other people--especially other people you lump together in groups which you call "geek" and "other-than-geek," etc.? When you do that you run the risk of having people think that you are just spouting your own opinions and simply hiding them behind a pretense of "speaking for other people," right? It really is a pretentious claim to speak for even one other individual, much less entire populations of them...;)
The problem is that this is a forum full of people who know quite a lot about graphics hardware, so the wrong conclusions could be drawn. I simply attempted to show how the people around me think about these sorts of things.

I'm quite sure you think this is what they think--that's obvious. But you'll discover that people who spend $400 + for a 3d card care very much about these things. You'll save yourself a lot of grief if you simply let them decide what "they need to know"--but I have to admit you sound like a young man who wouldn't hesitate to tell them what it is you think they "need to know," right?...;)
Of course people who invest 400+ € in a single piece of hardware will think about it twice. And from that point of view I think an ATI card could be an excellent choice. But as far as I can recall, this thread was about John Doe, who generally doesn't spend more than 100 € on a graphics card. For him the difference between the cards is not that important and the cheating doesn't influence him much.
BTW, the very existence of the B3d forums itself is a persuasive argument against your opinions as these forums regularly host lots of folks who think they "need to know" a lot more than the manufacturer is willing to tell them, and they span all age groups, incomes, educational and job experience levels. There are dozens if not hundreds of forums all across the Internet in which thousands of people daily participate in order to broaden the base of "what they need to know." In short, "what they need to know" is never fixed in stone. It changes as the industry changes--sometimes slowly, sometimes dynamically.
There are still millions of people without broadband and/or unlimited internet. And even those who have it often don't participate in fora, mostly because they don't know about them, don't care, or don't have the time. I'm from Belgium, where we have a population of only 10 million. You're very right about hundreds of fora with thousands of people each, but even if they were all from tiny Belgium, the chance that my neighbor participates in one of them is small.
Even if that was true, by your own definition of "these people"--it's not going to matter to them--they'll just punch out the cheapest card, right?
Indeed a lot of them just care about the processor and such. Current graphics cards run just about every Quake 3 generation game fine and they are enjoyed by a lot of people. And I just had a quick look at the price lists of local stores and in all of them the cheapest three or four cards were Nvidia.
OK, so if I have this right you classify yourself as a "geek" and by the definition of "geek" you enumerated in your original post you "care about marginal differences" in 3d cards, you care about "5% performance increases" and you care about who makes the 3d card you buy and you will cheerfully "pay double" for the cheapest DX9 card. I mean, that's the way you defined "geek" in your last post...;)
Indeed, personally I do care about marginal differences. I don't care much about 5% performance, but 25% starts to get interesting. And I don't care about the card manufacturer as long as it's a quality product. And I'll pay double if it's worth it by my own set of criteria. What I implied was that geeks tend to think this way, not that thinking this way makes you a geek, although the chances are higher.
Basically, most average customers care about getting the most value for their dollar--pretty simple, actually. Thank you for revealing to the world that ATi and nVidia actually are selling the same products in different boxes in a grand conspiracy to fool their stockholders and the public alike into believing that they are two separate companies selling different products into the marketplace. Now all you've got to do is to sell your geekish message to the rest of the world. The only problem I see is that I doubt geeks and non-geeks alike will pay much attention to your "It's all the same, suckers!" pronouncements. The products are not the same--if you really were a "geek" you'd know that, regardless of which product you might prefer.
In a perfect world, everybody would be thinking twice about every euro they spend. Unfortunately, or maybe luckily for marketing people, things don't go that way. Like my economics professor said, many choices are not based on logic, but on trust, rumour, your neighbor's e-penis size, color, smell and all other senses. The main factor here is a lack of time. Like I already said, John Doe with his 100 € card isn't going to spend days reading reviews, but will have a look at the front and back of the box and simply hope the salesman isn't trying to sell him the more expensive card for his own profit (they always are, anyway). And even the people with a lot of money don't always follow logic. They might just buy the biggest, loudest and most expensive card just to show they can.

Just to make a comparison: would you spend days thinking about which camera to buy? Would you look for more than one review? If you're an enthusiast movie editor you might, but just to shoot your vacation adventures you might be perfectly happy with what the seller told you is the right choice. Whether it's a Kodak or a Canon probably doesn't interest you much. But what if one of them cheats on some of the specifications? Chances are you never notice and live happily ever after...
Who said anything about "John Doe"...? I think it's very simple: the people who care about such things will care, and they are often listened to by their friends and associates (even if yours don't listen to you.)
That's why there is no point spending time in this thread! People who care will read the reviews and 'investigate' themselves. This thread is about convincing John Doe, but he doesn't care so it's pointless.
That's great--glad for you! Now, when you get a job after school you'll begin the year's-long process of accumulating experience, and during that time you'll discover just how things really work. I mean, just the fact that you consistently keep saying "marginal" differences illustrates to me you have a lot to learn about technology in general. Some technologies are similar, some are radically different. It would be a serious mistake to begin making such generalizations at such an early point in your career. You might want to check into PR work with some of the tech companies, though...;) nVidia right now could probably use someone with your particular set of prejudices...(I'm not trying to be mean or uncouth--just honest with you. You are reaching general conclusions way too early for your own good.)
I do know about radical differences in technology. Isn't software rendering radical enough? The point is, only the results matter. And they look very much the same for Nvidia and ATI unless you start about the marginal differences...

Goodnight. :)
 
Nick said:
Or the 24-bit FP vs. 32-bit FP ATI gets away with. I know it's not a DirectX 9 specification but what if programmers ask for it?

What do you mean 'What if programmers ask for it'?

If you are saying 'What if they ask for it within the bounds of the API' then that shouldn't be an issue, because they can't. There is no way to specifically ask for 32-bit FP for fragment shader calculations within the bounds of either PS2.0 (which defines 24-bit as the minimum acceptable precision) or the OpenGL spec (which, as far as I recall, defines a maximum error for individual calculations that lies within the bounds of 24-bit FP).
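The practical gap between the two formats being argued about comes down to mantissa width (roughly, FP24 on the hardware of that era stored a 16-bit mantissa, FP32 a 23-bit one). Here is a rough model of my own, not the exact bit layout of either format, that quantizes a value to a given mantissa width to show how the representable precision differs:

```python
import math

def quantize_mantissa(x, mant_bits):
    """Round x to the nearest value representable with the given number
    of mantissa bits -- a crude model of reduced-precision shader math,
    ignoring exponent range and denormals."""
    if x == 0.0:
        return 0.0
    m, e = math.frexp(x)              # x = m * 2**e, with 0.5 <= |m| < 1
    scale = 2.0 ** mant_bits
    return math.ldexp(round(m * scale) / scale, e)

x = math.pi
fp24 = quantize_mantissa(x, 16)       # ~FP24: 16 stored mantissa bits
fp32 = quantize_mantissa(x, 23)       # ~FP32: 23 stored mantissa bits
print(abs(x - fp24) / x)              # relative error bounded by 2**-17
print(abs(x - fp32) / x)              # relative error bounded by 2**-24
```

The relative error of the 16-bit-mantissa version is roughly two orders of magnitude larger, which is invisible for a single color write but can accumulate over long dependent calculation chains; that is the substance of the FP24-vs-FP32 disagreement in this thread.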

If you're saying 'What if they ask for it in future hardware' then that is hardly something that can be used as an argument against current parts. ATI are always listening and taking developer feedback on issues such as these.

Either way ATI is not 'getting away with' anything here, so I have no idea what point you are trying to make.
 
Nick said:
I'm quite sure you think this is what they think--that's obvious. But you'll discover that people who spend $400 + for a 3d card care very much about these things. You'll save yourself a lot of grief if you simply let them decide what "they need to know"--but I have to admit you sound like a young man who wouldn't hesitate to tell them what it is you think they "need to know," right?...;)
Of course people who invest 400+ € for a single piece of hardware will think about it twice. And from this point of view I think an ATI card could be a very excellent choice. But as far as I can recall, this thread was about John Doe who generally doesn't spend more than 100 € on a graphics card. For him the difference between the cards is not that important and the cheating doesn't influence him much.
If this all doesn't matter to John we could reasonably expect ATI, NVIDIA and SiS to have equal market share with a margin of error of 1~3%, caused by the high end buyers and "those in the know". These three companies have appropriate products in the "cheap crap" segment.

If it is like you say, that all this stuff doesn't matter, can you explain why we're seeing a different market distribution? Please?

It's quite simple, I'd explain it, but I'd like to hear it from you :p
 
Evildeus said:
Brand names, price and long term agreements have an impact on this segment...
Yep. Shame on you for answering a question for Nick ;)

Benchmarks work towards the "brand" *shudder*, the image of a company and its products. Geforce FX5900U wins a lot of benchmarks => better image, more sales for Geforce4MX. It's all "Geforce", who cares about the tiny letters and numbers after that :D
 
andypski said:
Either way ATI is not 'getting away with' anything here, so I have no idea what point you are trying to make.
They advertise 128-bit color precision everywhere...
 
zeckensack said:
If it is like you say, that all this stuff doesn't matter, can you explain why we're seeing a different market distribution? Please?

It's quite simple, I'd explain it, but I'd like to hear it from you :p
SiS can't put the same numbers on the boxes...
 