Tables Turn Again (iXBT/Digit-Life Gainward 5900 GS review)

DaveBaumann said:
If you think it's really out there then you are deluding yourself - the magazines have not picked up on it to any extent (perhaps a few vague references). Even reviewers who know these things are happening haven't spoken about it. I think I've seen at least two reviews using UT2003 from reviewers who know they are dialling down IQ in UT but haven't mentioned it at all.

I think the point is that B3d is certainly doing its job (whether anyone else is or isn't), and I wouldn't sell the influence of B3d short. Quite honestly, I think the reviewers who haven't spoken up simply aren't comfortable with taking a position because they don't understand the issues well enough to feel confident in expounding on them--that, or they are getting some remuneration under the table. One of the two, or both.

Most hardware reviewers only feel comfortable quoting the manufacturer's specs as printed on the box, running a few canned benchmarks, and jotting down the numbers they spit out. This is what has constituted a hardware "review" for years, unfortunately. Sad but true--most magazines and web sites are purely commercial entities peddling fluff. It's been that way for a long time. Sites like B3d which go beyond the "zoowie, wowie, cool!" presentation and make a serious effort at informed criticism and evaluation have always been rare.

Here, take a look at Brent's latest review:

http://www.hardocp.com/article.html?art=NDk2LDM=

Right off the bat we see the BFG 5900 Ultra surpassing the 9800 Pro in Antalus. What can be seen here is that the 5900 Ultra excels in our quality settings almost across the board.

....

In Unreal Tournament 2003 the BFG 5900 Ultra dominated Antalus and Face3 maps. Performance in Antalus was acceptable all the way through to 1600x1200 with 4XAA and 8XAF in this flyby test

Now, take a look at the mip levels of Antalus:

Image

Can you see a shred of Trilinear Filtering on the floor textures??? Nope. The floor is covered with detail textures, which NVIDIA are applying their "High Performance" mode to regardless of what setting you select, so it's no wonder it "dominates". The only reason we are seeing variances in UT2003 as to who has the upper hand is because the maps make different use of detail textures - those that have fewer are fairer to the 9800 (even though the normal textures are still not full Trilinear), and those with more detail textures make it easier on the 5900.
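For anyone wondering why the mip boundaries are visible at all in a coloured-mipmap shot, here's a minimal sketch of the difference (just an illustration of the general idea, not actual driver code): trilinear blends the two nearest mip levels across the transition, while a bilinear-style "High Performance" mode samples only one level, so the change shows up as a hard edge.

```python
# Illustration only - not driver code.

def bilinear_mip(lod):
    """Bilinear: sample just the single nearest mip level."""
    return round(lod)

def trilinear_weights(lod):
    """Trilinear: blend the two adjacent mip levels by the fractional LOD."""
    lower = int(lod)
    frac = lod - lower
    # result = (1 - frac) * mip[lower] + frac * mip[lower + 1]
    return lower, lower + 1, frac

for lod in (1.0, 1.25, 1.5, 1.75, 2.0):
    lo, hi, f = trilinear_weights(lod)
    print(f"lod={lod:.2f}: bilinear -> mip {bilinear_mip(lod)}, "
          f"trilinear -> {1 - f:.2f}*mip{lo} + {f:.2f}*mip{hi}")
```

With the mip levels tinted, those blended weights are what produce the smooth gradient between colours; all-or-nothing weights give the sharp bands you can see on the floor textures above.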


What puzzles me most about aspects of this coverage from AnandTech and [H] is how they continuously flip-flop back and forth. At times we'll see probing, informed points of view and then you turn around and read unapologetic fluff coming from the same people on many of the same topics a couple of months later, with all of the previous issues forgotten. It'd be nice if these sites would develop some kind of consistent approach. It's too bad, but I think some of them simply don't know when they're being duped and that ignorance gets passed right along to their readership.

Some examples of that readership: in another forum someone made a point about "Ultra Shadows(TM)" and how that was a great feature slated for support in one or two upcoming games. I asked him what "Ultra Shadows(TM)" was and wasn't surprised to discover he had no idea. Not even a rough idea--he was only conversant with the marketing term. But it was one of the factors which motivated him to buy a $500 nv35.

Still another person, after viewing screenshots of UT2K3 showing that trilinear filtering wasn't being used with detail textures, decried the fact that people were making a fuss about "screenshots viewed at odd angles" and "a couple of pixels 10 miles back in the frame." Even though he had viewed screenshots similar to yours here, he had no conception of what he was looking at, despite it being explained to him very well in the context of the thread. He mischaracterized the situation because he simply didn't understand it.

There are always going to be the haves and the have-nots, the informed and the ignorant, and the latter group will always heavily outnumber the former. If that was not so, we wouldn't have to put up with marketing as we do, IMO...;) Likewise with web sites--those that "get it" are far less numerous than those which don't. Keep up the good work, Dave--I think if anything you may underestimate the clarity you are bringing to this and many other such topics.
 
Bouncing Zabaglione Bros. said:
The fact we are talking about it shows that it's no longer the nasty little secret that Nvidia was keeping six months ago. The fallout from Nvidia cheating on 3DMark was all over the web. The lack of comment on some of the big websites just goes to show what a big stick Nvidia still holds over these people, and how they are operating to their own agendas.
It's getting to the point where the credibility of magazines and websites is suffering, because they appear to be deliberately avoiding the issue, and because they are either too stupid or too corrupt to address it honestly for their readers.

No one is arguing that bits and pieces of information are out there. However, there is a lot of misinformation out there too, as I described earlier. And to say that the majority of people buying video cards or even visiting hardware sites know and understand the lengths and depths of what NVIDIA is doing is rather naive.

nV News forums aren't that big, but there are a surprising number of people who still don't know that NVIDIA is degrading IQ for speed. Either that, or they have their heads stuck so far up their asses that they won't admit it. Probably a little of both. However, the issues have been kept in the spotlight there, so ignorance of the facts isn't much of an excuse, yet I see such ignorance every day. Whether these people have an underlying agenda or actually are ignorant, I have no idea. I am not talking about those people who say "who cares, I like the performance tradeoff with lowered IQ," but those who say "there is no IQ change/degradation." I can only imagine what forums like [H], Anand, and Tom's are like. Every so often I see threads at Rage3D started by people who don't know the nature of ATI's optimizations, asking questions that have been previously answered, and making assumptions that logically could never have been made if they knew the facts.

There is no way that I can accept that most people know about the cheating. I would say that most people who visit and participate at B3D know what is going on...but B3D is a small community, and hardly representative of the internet as a whole. B3D is more like the minority of the minority of the minority ;)

Sooner or later there will be a backlash (remember how Anand bitterly reviewed the NV30?) and then there is always the possibility that ATI may get into the same kind of "dirty war" to counter the Nvidia PR spin.

And Anand was back on the NVIDIA team bus with his NV35 review :LOL:

Nvidia is a big player, but so are the likes of IBM, Intel, Microsoft, Creative, etc. - but look at where their reputations are now. Look at what the *generally accepted* views are of these companies from large portions of the informed buying public - and Nvidia is heading exactly down this path.

Are these really good examples? People still buy and use Microsoft products, despite all the bad press. Pretty much the same thing with Creative. Their leading edge products all get the thumbs up from reviews, and many people swear by their products, although less now that we have good integrated solutions. I for one never heard about Creative's evil reputation until I started visiting hardware forums. Surely Creative's reputation has been bad for years, and again, I doubt that most people know of its faults. Even with years of problems under its belt, I have yet to read a review that mentions Creative's bad rep or marketing practices. I am afraid that this NVIDIA thing will just get swept under the rug.

Even if NVIDIA reforms and stops cheating, who's to say this won't happen again? How bad will it be then? Will it be more accepted or less accepted if it becomes more widespread?
 
StealthHawk said:
Are these really good examples? People still buy and use Microsoft products, despite all the bad press. Pretty much the same thing with Creative. Their leading edge products all get the thumbs up from reviews, and many people swear by their products, although less now that we have good integrated solutions. I for one never heard about Creative's evil reputation until I started visiting hardware forums. Surely Creative's reputation has been bad for years, and again, I doubt that most people know of its faults. Even with years of problems under its belt, I have yet to read a review that mentions Creative's bad rep or marketing practices. I am afraid that this NVIDIA thing will just get swept under the rug.

The point I am trying to make is that *things change*. They may not change quickly, they may have to go through several stages, but things are different. Look at the few people who abandoned Microsoft on the desktop for Linux five years ago, and the growth that has happened since then. Even the councils of several European cities have recently dumped the ubiquitous Microsoft. Look at how many people were using AMD chips ten years ago or something other than Creative soundcards (very few). Now look at how far AMD has come with Athlons, and what they are about to do to the market with the advent of desktop 64-bit chips. Change happens gradually, but inexorably. Alternatives arrive and gain a foothold where before there were no alternatives at all.

Let's go back 12 months and think what was happening in the graphics market. ATI were beginning to do the pre-publicity for their R300 launch. Everyone thought that it was going to be another average card with slightly lackluster performance and the NV30 would arrive a month or two later and squash it unceremoniously.

Now look at today. NV30 never made it to shops, and it was hot, loud, and poorly performing. NV35 is underwhelming, still with poor performance and even worse IQ. Nvidia has gained a reputation amongst a significant portion of their target audience for lying and cheating that it *never had before*, dragging down the hardware sites and showing up their lack of technical skill and journalistic integrity.

ATI has been reborn, with a fast set of cards, excellent IQ, and solid driver releases every month. Who'd a thunk it, eh?

Sure, the people who know the facts about the above are still in the minority, but that minority of people who have jumped ship to previously unthought-of alternatives like ATI has grown an awful lot in the last year.

There will always be people who are ignorant of the facts - it's in the nature of a large number of people to think like sheep, which is why Nvidia spend so much time and money lying to them via their marketing, but things *do* progress.
If ATI gets more OEM wins like Medion, and people get used to ATI IQ, then when they see a Nvidia card, they'll complain about how it looks washed out and blurry, how it can't run effects as well. They'll complain how slowly the Nvidia card plays games compared to how well it scored in the benchmarks. Their more knowledgeable friends will recommend ATI over Nvidia *because* of the cheating, just as I have done many times over the last few months.

Ignorance still abounds, but compared to 12 months ago, the truth is gaining ground in places it never had before, amongst people who simply didn't know the facts before. Yes slowly, but also inexorably.
 
ATI's IQ is far from being a panacea - we shouldn't praise them so much that they stop improving things. It's clear that their basic filtering algorithm is less precise than NVIDIA's, and when NV is running in their "real" quality modes their basic texture filtering is superior. As I've said before, the concept of being able to alter the degree of mipmap blending is actually a pretty good thing in terms of speed increase and IQ trade-off, since you'll never be totally presented with the full mipmap banding (although the transitions are still there to be seen in the more aggressive modes); however, it's just a shame that that actually gets lost under the fact that the process is being abused and users are not being given the option of full filtering in some cases.

AF is also an area that we know ATI needs to work on in comparison to NVIDIA (if/when the full options are actually being given).
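To illustrate the trade-off (a rough sketch of the general idea only, not NVIDIA's or ATI's actual implementation): narrowing the band over which the two mip levels are blended means most pixels effectively pay for a single mip lookup, while the transition is still softened rather than left as a hard bilinear edge.

```python
def blend_fraction(lod, blend_width=1.0):
    """Weight given to the next-smaller mip level for a given LOD.

    blend_width=1.0 is full trilinear (blend across the whole interval);
    smaller values confine the blend to a window around the transition;
    blend_width=0.0 degenerates to plain bilinear.
    Illustration only - not the actual driver algorithm.
    """
    frac = lod - int(lod)
    lo_edge = 0.5 - blend_width / 2.0
    hi_edge = 0.5 + blend_width / 2.0
    if frac <= lo_edge:
        return 0.0        # lower mip only - single fetch
    if frac >= hi_edge:
        return 1.0        # upper mip only - single fetch
    return (frac - lo_edge) / blend_width

for frac in (0.1, 0.3, 0.45, 0.5, 0.55, 0.7, 0.9):
    print(f"lod frac {frac:.2f}: full {blend_fraction(frac, 1.0):.2f}, "
          f"reduced {blend_fraction(frac, 0.25):.2f}")
```

The more aggressive the setting (the narrower the window), the more of the surface is effectively bilinear - which is exactly where the visible-but-softened transitions in the more aggressive modes come from.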
 
DaveBaumann said:
ATI's IQ is far from being a panacea - we shouldn't praise them so much that they stop improving things.

True, but what I like about the R3x0 series is that it is very well balanced. IQ, speed, precision, memory bandwidth etc are all very well matched to get the most use for gamers in the real world.

Nvidia having slightly better filtering that is too slow to use is rather like a powerful car that does not have the handling to let you use that power when you drive. It's the same philosophy that Nvidia have had for years - introduce new features, claim an advantage, but don't tell anyone these features are too slow to use for a couple more generations. Nvidia did this with AA, large textures, 32 bit colour, etc.

"Better" filtering algorithms that are too slow to be useful when actually playing games are not "better" in my opinion.
 
Bouncing Zabaglione Bros. said:
The point I am trying to make is that *things change*. They may not change quickly, they may have to go through several stages, but things are different. Look at the few people who abandoned Microsoft on the desktop for Linux five years ago, and the growth that has happened since then. Even the councils of several European cities have recently dumped the ubiquitous Microsoft. Look at how many people were using AMD chips ten years ago or something other than Creative soundcards (very few). Now look at how far AMD has come with Athlons, and what they are about to do to the market with the advent of desktop 64-bit chips. Change happens gradually, but inexorably. Alternatives arrive and gain a foothold where before there were no alternatives at all.

I think you're confusing some of the issues here. All those "migrations" happened for reasons that have nothing to do with bad rep, but have everything to do with cost.

Linux is still in the minority for consumers, and for the foreseeable future it will stay that way. Governments are switching to Linux for cost reasons and for stability reasons. Personally, winXP is rock stable for me, unlike win9x. Cost is a major motivating factor for Linux.

Soundcards. Creative had a stranglehold on the market because their products were the de facto standard. Once DirectX came around that was no longer true. There was no need for "Soundblaster compatibility." In other words, Creative had an unusually high market share. Competitors then started coming out with superior products which also happened to be cheaper. Then, cheap integrated sound was introduced. Why would most people want a discrete sound card when they get equal sound quality with integrated? Basically a no-brainer to migrate from expensive sound cards to the cheap integrated codec included on the motherboard.

AMD has gained a lot of respect and market share because it actually can compete with its Athlon lineup. This has nothing to do with Intel, but everything to do with AMD. They provided equivalent Pentium performance at a lower cost. Cost is the main reason why people use the similarly performing AMD CPUs. They don't use them because they hate Intel....well maybe some do.

The underlying point here is that no fortress stands forever. Most consumers are not so blind that they will purchase from one company when the competition can offer something that is equivalent at a lower price. It's not so much a "what did the market leader do wrong" question as a "what did the little guy do right" question. NVIDIA has not lost market share yet. That is the bottom line. And NVIDIA's cheating does not have wide exposure, nor do I think it ever will. Would I love to be wrong? Yes. I can't stand the deceit; it's utterly ridiculous that NVIDIA would go so far to hold on to the crown.

But I don't think most people care. Probably anywhere from 75-95% of people will admit what NVIDIA is doing is cheating. But I bet the majority of those people would gladly take the IQ-performance tradeoffs that NVIDIA is offering too, if they gave an option in the drivers.
 
I still find it amazing that people say they don't mind the image loss because it gives a performance gain. I mean, how stupid are these people? If they REALLY thought that, then why did they ever select the Quality option?
 
Quitch said:
I still find it amazing that people say they don't mind the image loss because it gives a performance gain. I mean, how stupid are these people? If they REALLY thought that, then why did they ever select the Quality option?

That's the thing with IQ - it's a difficult sell. If you've only ever seen poor IQ, you don't know what you are missing. If you've gotten used to good IQ, it's impossible to go backwards, and flaws are much more obvious to the eye.

How many of us could go back to 16 bit colour, low res textures and low polycounts? But without the experience of the advances of the last couple of years, we wouldn't know any better.
 
Bouncing Zabaglione Bros. said:
Quitch said:
I still find it amazing that people say they don't mind the image loss because it gives a performance gain. I mean, how stupid are these people? If they REALLY thought that, then why did they ever select the Quality option?

That's the thing with IQ - it's a difficult sell. If you've only ever seen poor IQ, you don't know what you are missing. If you've gotten used to good IQ, it's impossible to go backwards, and flaws are much more obvious to the eye.

How many of us could go back to 16 bit colour, low res textures and low polycounts? But without the experience of the advances of the last couple of years, we wouldn't know any better.


So true. PS1 and (to some extent) PS2 come to mind.
 
Hi there,
Quitch said:
I still find it amazing that people say they don't mind the image loss because it gives a performance gain. I mean, how stupid are these people? If they REALLY thought that, then why did they ever select the Quality option?
I don't find it too amazing, really. After all, the people that "inform" themselves on web sites such as anandtech and tomshardware get "performance! performance! yay!" spoon-fed. Also, that's the first thing you'll notice about a game: Does it run more or less smoothly, or is it choppy?

I did a test with an in-house mailing group when the whole 3DMark03-NV-"cheat" thingy was big. Most people on that list are IT professionals, and most of those consider themselves to be big gaming and hardware geeks. Not one of them even knew about the incident, and all, without fail, replied with something on the lines of "so what?" and "who the hell cares?"

As has been mentioned before, performance is more readily sold than quality. A pity, but that's apparently how the market works.

93,
-Sascha.rb
 
DaveBaumann said:
ATI's IQ is far from being a panacea - we shouldn't praise them so much that they stop improving things. It's clear that their basic filtering algorithm is less precise than NVIDIA's, and when NV is running in their "real" quality modes their basic texture filtering is superior. As I've said before, the concept of being able to alter the degree of mipmap blending is actually a pretty good thing in terms of speed increase and IQ trade-off, since you'll never be totally presented with the full mipmap banding (although the transitions are still there to be seen in the more aggressive modes); however, it's just a shame that that actually gets lost under the fact that the process is being abused and users are not being given the option of full filtering in some cases.

AF is also an area that we know ATI needs to work on in comparison to NVIDIA (if/when the full options are actually being given).

Heh... I'm hoping the praise will encourage ATi to steer clear of nVidia's current tactics in regards to similar matters...;)

You've nailed the problem pretty well. It's the fact that nVidia is either hiding or removing certain IQ settings from not only user control, but application control as well, which creates the foundation for the opinion that nVidia is consciously sacrificing IQ to garner benchmark/timedemo frame rates.

Beyond this particular issue loom larger issues, though, of benchmark manipulation, custom vendor pathing, DX9, etc. In total, it presents a picture of the industry moving in a definite direction while nVidia is at best reluctantly brought along kicking and screaming, or at worst trying to go in other directions entirely.

It doesn't help my opinions very much to read quotes by nVidia employees fantasizing they'll "be bigger than Intel in five years", either. Can't help thinking the hallucinogens are liberally enjoyed in and around nVidia these days. By comparison IMO it seems ATi has its feet on the ground whereas nVidia has its head in the clouds. ATi is concentrating on beating nVidia today. They'll worry about Intel tomorrow (if ever *chuckle*)... Employees of the company are accessible in public forums and exude a good, old-fashioned sense of serving their customers. About the only thing we get out of nVidia is PR spin dropped at "appropriate" web sites in "interviews" which are often most notable for the way in which the questions asked are never answered. nVidia could definitely learn something there about PR. The companies are just so different in their philosophies and outlooks that it's difficult to see the current ATi making the same mistakes nVidia has made for the last year.

To sum up, I'd feel very comfortable hopping on a plane and flying up north and walking right into the ATi building and saying hello for a visit and a chat. I wouldn't want to visit nVidia headquarters without an armed security detachment, a couple of lawyers, and an acronym translator (and possibly a portable lie detector)....;) j/k
 
That damned nVidia interview story keeps coming to mind

I don't remember where I read it and don't know if it's just an urban legend or not, but I keep thinking of the story of the job interview question at nVidia.

Supposedly one of the questions they ask is "Would you rather win, or would you rather do your very best" and if you answered "your very best" you wouldn't get the job.

It's probably a myth, but it just seems to fit. :(
 
Re: That damned nVidia interview story keeps coming to mind

digitalwanderer said:
I don't remember where I read it and don't know if it's just an urban legend or not, but I keep thinking of the story of the job interview question at nVidia.

Supposedly one of the questions they ask is "Would you rather win, or would you rather do your very best" and if you answered "your very best" you wouldn't get the job.

It's probably a myth, but it just seems to fit. :(

*chuckle* Don't get me started!.... :D Too late--OK, here's what I heard:

NVIDIA Employee Interview 1-c-003, question 14b(1)a:

"You are walking along behind your friend George one day on your lunch break when you notice a flutter of green fall from your associate's pocket onto the pavement. You stop momentarily and scoop up what you see is a $100 bill that you correctly ascertain must have fallen from George's pocket. Do you:

1) Tap George on the shoulder and hand him the $100 bill

2) Cleverly pocket the $100 bill and say nothing to George

3) Cleverly place the $100 bill in your wallet and pull out a $5 bill which
you then give to George

4) Cleverly take the money to the police station that afternoon
after work, innocently explaining that you found it and want to know if
you should keep it (expecting them to assent)

5) Give the money to your supervisor as a donation to the nVidia
Bachelor & Spinster support fund"

Supposedly, answers 2 & 5 were marked "positive determinant"; answers 3 & 4 were "neutrals"; and answer 1 was graded "negative disposition."

I was puzzled about the "neutral" categorization and my informant explained that in the eyes of the company these weren't good answers because one involved giving something back to George and the other involved the police.

:devilish:
 
The writer has completely misunderstood the issue with application detection. He goes on to say that because he is using his own timedemo, he is immune to game engine-specific cheats. The image quality is not the same between the pictures he links! When you zoom the pictures, you can clearly see that the mipmap transitions are different, which is exactly what is going on with UT 2003 and the Detonator drivers: the drivers not using full trilinear filtering even when the application asks for it.

What the heck man? I don't know about you, but I'm not going to be playing the game zoomed in a couple hundred %... I mean damn, you sound like someone fussing about what color the Pentium 4 heatspreader is... It's not like you're EVER going to see it... Looking at those 2 screenshots with the naked eye reveals NO difference... And that's how I'm going to be playing the game, with the naked eye. This definitely doesn't fall into the same category as Ati AA vs. nVidia AA, which I will admittedly concede ATi has the crown for... And in my honest opinion, I think ATi needs to do something akin to that in order to boost its lead even more over nVidia. This business is competitive, and with cheats/optimizations or whatever you want to call them, like these UT2003 ones, ATi needs to fight fire with fire... And in UT2003's case, I really don't think that's a bad thing.
 
surfhurleydude said:
What the heck man? I don't know about you, but I'm not going to be playing the game zoomed in a couple hundred %... I mean damn, you sound like someone fussing about what color the Pentium 4 heatspreader is... It's not like you're EVER going to see it... Looking at those 2 screenshots with the naked eye reveals NO difference... And that's how I'm going to be playing the game, with the naked eye. This definitely doesn't fall into the same category as Ati AA vs. nVidia AA, which I will admittedly concede ATi has the crown for... And in my honest opinion, I think ATi needs to do something akin to that in order to boost its lead even more over nVidia. This business is competitive, and with cheats/optimizations or whatever you want to call them, like these UT2003 ones, ATi needs to fight fire with fire... And in UT2003's case, I really don't think that's a bad thing.

Um...and the loser is....the consumer!!!!*

*SD, that's US!
 
surfhurleydude said:
Perhaps if everyone that bought cards had eyes as good as The Terminator, but seeing as how we're only human, I don't see how the consumer loses.

I'm sorry, SD, but I'm 53 years old with not the best eyesight in the world, and I can see the differences...... I don't think it has anything to do with eyesight, just mindset!
 
surfhurleydude said:
What the heck man? I don't know about you, but I'm not going to be playing the game zoomed in a couple hundred %... I mean damn, you sound like someone fussing about what color the Pentium 4 heatspreader is... It's not like you're EVER going to see it... Looking at those 2 screenshots with the naked eye reveals NO difference... And that's how I'm going to be playing the game, with the naked eye. This definitely doesn't fall into the same category as Ati AA vs. nVidia AA, which I will admittedly concede ATi has the crown for... And in my honest opinion, I think ATi needs to do something akin to that in order to boost its lead even more over nVidia. This business is competitive, and with cheats/optimizations or whatever you want to call them, like these UT2003 ones, ATi needs to fight fire with fire... And in UT2003's case, I really don't think that's a bad thing.

They hardly picked a good screenshot to show off mipmap boundaries; I'm sure the other maps with more artificial textures and large flat surfaces would show it up quite nicely.

I did a test with an in-house mailing group when the whole 3DMark03-NV-"cheat" thingy was big. Most people on that list are IT professionals, and most of those consider themselves to be big gaming and hardware geeks. Not one of them even knew about the incident, and all, without fail, replied with something on the lines of "so what?" and "who the hell cares?"

Figures, the average IT staff couldn't find their own arse with both hands and a map. :mrgreen: (no offence to any IT staff here, I'm sure you're not average :devilish: )
 
surfhurleydude said:
Perhaps if everyone that bought cards had eyes as good as The Terminator, but seeing as how we're only human, I don't see how the consumer loses.

The difference between bilinear and trilinear is much more obvious in motion than in a static screenshot.

As for "eyes as good as The Terminator", let's see a mere human tell the difference (without a framecounter doing it for him) between, say, 80fps and 100fps. And yet, the way reviews are written, you'd think it's the hugest difference in the world.

That's because the only tool most reviewers know how to use is a framecounter. When all you have is a hammer...
 