This is taken from overclockers.com. Sorry for the long post, but it's well worth a read.
http://www.overclockers.com/tips061/
"Radeon 9700: Boredom Over Whoredom"
Ed Stroligo - 7/18/02
The master commands, and the trained dogs bark. This is getting old.
If you claim to be an independent reviewer, you are supposed to examine and judge a product. Not sell it. If you spend more time doing the second, you are not reviewing for your audience. You are whoring for the product producer, even if you don't get a dime for it.
That's what I call whoring. Now that we understand each other . . . .
A Tale of Two 'Hos
The "previews" and "reviews" are basically an effort to give "the boss" (in this case ATI) what they want, in this case, putting out for the company.
I can't claim any great moral superiority, though, because I'm a 'ho, too. The difference is I'm "your" 'ho. I look at these products from the perspective of where you're coming from and what you want, tempered by my own experience and knowledge and thinking.
Now to do that properly, I have to know where you're coming from and what you want, and that's the real reason for the surveys we do.
And that makes all the difference in the world for this particular product for most of you.
Who The Hell Runs Games At 1600X1200?
If the few indicators we have can be believed (more on this below), the Radeon 9700 consistently blows away the Ti4600 at 1600X1200.
Not you, based on the survey information we've seen. 1024X768 seems to be the sweet spot for gamers, with a slow migration towards 1280X1024.
Most of you have monitors that are 17 to 19 inches. 1600X1200 is too tiny at those sizes. If it isn't too tiny for you (especially at 17 inches), stop shooting people in Quake and start sharpshooting people in the Army.
While 21-inch monitors have dropped a lot in price over the past few years, they still cost a good deal more than most people are willing to spend. They're also bulky and heavy, not the best selling points when the buying audience often finds space to be in short supply.
So I don't see any rush to 1600X1200 anytime soon. That will have to await big, cheap LCD displays, and that won't happen until 2005 or so.
The performance pattern of the Radeon 9700 seems to be as follows:
It's not much better at 1024X768 than the Ti4600.
At 1280X1024, gaps widen, to varying degrees.
It usually does pretty well at 1600X1200, but again, to varying degrees.
It is likely that if you like antialiasing and the like, this will do pretty well also, but let's see a little more proof on that first.
However, these conclusions rely on the independence and objectivity of the measurements, and, as we shall see, this is questionable.
A Few Technical Suspicions
The first thing I want to know about a video card is the speed of its memory. That's going to give me a pretty good idea how far this can be pushed. Nobody seemed to want to (or could) figure this one out, but fortunately somebody provided a picture of the chip.
The memory chip used is a Samsung K4D26323RA-GC2A. If you look at the datasheet, you'll find out that it's about a 2.8ns chip that's supposed to have a maximum speed of 700MHz. In short, pretty much the same as the Ti4600.
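The arithmetic behind reading a speed off an ns rating is simple enough to sketch (a minimal illustration; the helper name is mine, not from the article or the datasheet):

```python
# Rough conversion from a DRAM cycle-time rating to a marketed speed.
# A 2.8 ns part cycles at roughly 1 / 2.8e-9 s ~ 357 MHz; with DDR
# signaling (two transfers per clock) that is an effective ~714 MHz,
# which vendors round down to a "700 MHz" rating.
def effective_ddr_mhz(cycle_time_ns: float) -> float:
    base_mhz = 1000.0 / cycle_time_ns  # actual clock in MHz
    return 2 * base_mhz                # DDR: two data transfers per cycle

print(round(effective_ddr_mhz(2.8)))  # ~714, sold as "700 MHz"
```

Which is why a 2.8ns chip here implies roughly the same memory ceiling as the Ti4600's.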
The second thing I want to know is how fast the GPU runs on the top-of-the-line card, pretty much for the same reason I want to know the same about CPUs. It gives me some idea how far lower-end versions can go.
These rules work pretty well for nVidia cards, but they may not apply too well here. Given the theoretical doubling of memory bandwidth, faster memory may not make much difference.
Per GPU speed, this is the top-of-the-line card, and given its .15 micron construction, one has to wonder just how much headroom the 9700 has. I suspect not much, especially since ATI has not nailed down the GPU speed of production cards. Since ATI has been known for . . . uhhh . . . downsizing this little specification, this is something you need to watch.
I suspect this GPU isn't going to overclock much, and overclocking memory isn't going to do the average person much good. Indeed, the few scores available at the moment may well be overclocked scores compared to the production model. If you consider 3DMark2001 to be one of the major American sports, you may be greatly disappointed. I could well be wrong, but the prudent should wait until somebody proves me wrong.
Breaches of Faith
Read the second paragraph here. Burying the press in info on very short notice is a common technique used when you want the press to deliver your message rather than let them discover the skeletons in the closet.
This event took stage-managing to a new level. For most of the "reviewers," they only got to test under highly controlled conditions. God only knows what settings were used and what little tweaks or "video card helpers" may have been used.
Has "Quack 3" been forgotten so quickly? After that, I wouldn't believe anything from those folks without checking it thoroughly, much less with company representatives looking over my shoulder.
You might be able to explain that away if all got that treatment, but some are more equal than others, as ATI demonstrated in the case of one place.
"ATI allowed us to the (sic) test the R300 on our own testbeds and we eagerly jumped at the opportunity."
Then we find out "eagerly kneeling" would have been much more appropriate phrasing.
We'll be charitable and mostly ignore the breathless "Like A Virgin" excited hype permeating the article.
We're not charitable enough to ignore the all-too-frequent "ignore my benchmarks" comments.
Finally, it would be flat-out negligent not to point out who wears the pants in this relationship.
"Because ATI has yet to finalize drivers and clock speeds, we were only allowed to publish percent (sic) improvements over a GeForce4 Ti 4600."
This is complete nonsense. Half-baked drivers and clock speeds can be a legitimate reason not to publish any benchmarks at all, or to caveat the performance. They are never a legitimate reason to change how data is presented.
There's an infinitely more likely reason that ATI ordered this change of presentation. If most of these benchmarks were presented in the normal FPS format, it would be much more obvious to less than anal-retentive readers that the clobberings were occurring at very high resolutions the average person doesn't use, and thus would be less impressive.
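The point is easy to see with numbers. These frame rates are entirely made up for illustration (nothing here comes from the article or any actual benchmark): the same card can post a modest percentage gain at the resolution most people actually use and a dramatic one at a resolution few do, and a percent-only chart hides that distinction.

```python
# Hypothetical FPS figures, invented purely to illustrate the argument:
# percent-improvement charts flatter a card whose wins come at
# resolutions the average person doesn't run.
benchmarks = {
    "1024x768":  {"ti4600": 150.0, "r9700": 160.0},  # hypothetical
    "1600x1200": {"ti4600": 40.0,  "r9700": 70.0},   # hypothetical
}

for res, fps in benchmarks.items():
    gain = 100.0 * (fps["r9700"] - fps["ti4600"]) / fps["ti4600"]
    print(f"{res}: +{gain:.0f}% ({fps['ti4600']:.0f} -> {fps['r9700']:.0f} FPS)")
```

In raw FPS the reader sees at a glance where the wins are; in percentages alone, a +75% at an unused resolution looks like a clobbering across the board.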
But that's not the big problem.
"Allowed?"
My, my, what a good little boy!
I guess I was wrong. This isn't allegorical whoredom, this is allegorical pedophilia.
Since when do manufacturers order reviewers around? Since when do they tell them how they may present information? Since when do self-respecting reviewers say "Yes, sir" and swallow it? And, most importantly, since when do YOU swallow it?
This is not just a matter of one particular reviewer or website. This is a slippery slope.
Because this particular one said "Yes, sir," the same will be expected from others, always with the increasingly bare bone of "we'll let you do more than the other guys" while "more" becomes less and less.
In short, the choice will be . . . .
"Ho or No"
It's a shame. The whole idea of computer hardware sites was to provide impartial news and reviews untainted by commercial interests. They are rapidly becoming worse than the computer magazines ever were.
And so cheap! You give them less, and they do more for it! Just let them play with one for a couple hours, and they'll do all kinds of tricks for you.
You can call this a lot of things, but "independent" is not one of them. There's a master here, and it's not you, the audience.
So long as you swallow this and patronize those who shovel this . . . , it's going to get worse and worse, and what you'll get will be more and more worthless, and less and less distinguishable from the company's website PR.
I'm not even going to suggest any sort of protest because 1) I don't think enough people would do it and 2) I don't think most places or the company would listen even if they did.
People will just vote with their feet like they're doing more and more, and will find alternatives that will emerge to meet the needs the websites no longer fulfill.
I'm really beginning to think they're right. I'm tired of disgust being a regular part of my morning routine. I don't even want to look at things like this anymore.
It's a shame. It's really a shame.
Conclusions
I said a few things about the card earlier on, but frankly, I trust neither the company nor the reviews enough to draw any positive conclusions from this.
After a company cooked its Radeon 8500 benchmarks like Enron and Worldcom cooked their balance sheets, would you expect to have even more faith and trust in that company than you did before? Amazingly, ATI apparently thinks you're stupid enough to do just that.
If those "reviewing" this product act like 'hos, even flaunt it, then give you a hard sell after giving you little or no reason for it, do you start trusting them more, like they were the Virgin Mary or something? I sure don't, and neither should you.
I had planned to buy this card, but frankly, I'm really turned off by this review version of 1984. I really don't want to reward such a place for such behavior by giving them my money. Anybody that makes nVidia look comparatively good frightens me.
But if I grit my teeth and buy it, you know what I should do? I should be the Anti-Ho. I should just find anything and everything wrong with this card, and just report that.
If the absolute opposite is OK, why not?
WTF????
Fuz