Oh my god!!!

Fuz

This is taken from overclockers.com. Sorry for the long post, but it's well worth a read.

http://www.overclockers.com/tips061/

"Radeon 9700: Boredom Over Whoredom"
Ed Stroligo - 7/18/02

The master commands, and the trained dogs bark. This is getting old.

If you claim to be an independent reviewer, you are supposed to examine and judge a product. Not sell it. If you spend more time doing the second, you are not reviewing for your audience. You are whoring for the product producer, even if you don't get a dime for it.

That's what I call whoring. Now that we understand each other . . . .

A Tale of Two 'Hos

The "previews" and "reviews" are basically an effort to give "the boss" (in this case ATI) what it wants: putting out for the company.

I can't claim any great moral superiority, though, because I'm a 'ho, too. The difference is I'm "your" 'ho. I look at these products from the perspective of where you're coming from and what you want, tempered by my own experience and knowledge and thinking.

Now to do that properly, I have to know where you're coming from and what you want, and that's the real reason for the surveys we do.

And that makes all the difference in the world for this particular product for most of you.

Who The Hell Runs Games At 1600X1200?

If the few indicators we have can be believed (more on this below), the Radeon 9700 consistently blows away the Ti4600 at 1600X1200.

But who actually plays there? Not you, based on the survey information we've seen. 1024X768 seems to be the sweet spot for gamers, with a slow migration towards 1280X1024.

Most of you have monitors that are 17-to-19 inches. 1600X1200 is too tiny for those sizes. If it isn't for you (especially with 17 inches), stop shooting people in Quake, and start sharpshooting people in the Army.

While 21-inch monitors have dropped a lot in price over the past few years, they still cost a good deal more than most people are willing to spend. They're also bulky and heavy, not the best selling points when the buying audience often finds space to be in short supply.

So I don't see any rush to 1600X1200 anytime soon. That will have to await big, cheap LCD displays, and that won't happen until 2005 or so.

The performance pattern of the Radeon 9700 seems to be as follows:

It's not much better at 1024X768 than the Ti4600.

At 1280X1024, gaps widen, to varying degrees.

It usually does pretty well at 1600X1200, but again, to varying degrees.

It is likely that if you like antialiasing and the like, this will do pretty well also, but let's see a little more proof on that first.

However, these conclusions rely on the independence and objectivity of the measurements, and, as we shall see, this is questionable.

A Few Technical Suspicions

The first thing I want to know about a video card is the speed of its memory. That's going to give me a pretty good idea how far this can be pushed. Nobody seemed to want to (or could) figure this one out, but fortunately somebody provided a picture of the chip.

The memory chip used is a Samsung K4D26323RA-GC2A. If you look up that part number, you'll find that it's a 2.8ns chip rated for a maximum speed of 700MHz. In short, pretty much the same as the Ti4600's.
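As a sanity check on that rating, here is the usual back-of-the-envelope conversion from a DRAM timing grade to the DDR "effective" speed that spec sheets quote. A minimal sketch in Python; the 2.8ns figure is taken from the chip marking above, nothing else is assumed:

```python
# Convert a DRAM access time (timing grade, in ns) to its real clock,
# then double it for the DDR "effective" rate quoted on spec sheets.
access_time_ns = 2.8                      # Samsung K4D26323RA-GC2A timing grade
clock_mhz = 1000.0 / access_time_ns       # ~357 MHz actual clock
ddr_effective_mhz = 2 * clock_mhz         # ~714 MHz effective, i.e. the "700MHz" rating

print(f"~{clock_mhz:.0f} MHz clock, ~{ddr_effective_mhz:.0f} MHz DDR effective")
```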

The second thing I want to know is how fast the GPU runs on the top-of-the-line card, pretty much for the same reason I want to know the same about CPUs. It gives me some idea how far the lower-end versions can go.

These rules work pretty well for nVidia cards, but they may not apply too well here. Given the theoretical doubling of memory bandwidth, faster memory may not make much difference.
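To put a rough number on that "theoretical doubling," here is a peak-bandwidth comparison. A sketch only: the 256-bit and 128-bit bus widths and the effective memory clocks below come from the public specifications of the two cards, not from the article, so treat them as assumptions.

```python
def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times effective transfer rate."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

ti4600 = peak_bandwidth_gb_s(128, 650)  # GeForce4 Ti 4600: 128-bit bus, 650MHz effective DDR
r9700 = peak_bandwidth_gb_s(256, 620)   # Radeon 9700: 256-bit bus, ~620MHz effective (assumed)

print(f"Ti4600: {ti4600:.1f} GB/s, Radeon 9700: {r9700:.1f} GB/s")  # ~10.4 vs ~19.8 GB/s
```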

As for GPU speed, this is the top-of-the-line card, and given its .15 micron construction, one has to wonder just how much headroom the 9700 has. I suspect not much, especially since ATI has not nailed down the GPU speed of production cards. Since ATI has been known for . . . uhhh . . . downsizing this little specification, this is something you need to watch.

I suspect this GPU isn't going to overclock much, and overclocking memory isn't going to do the average person much good. Indeed, the few scores available at the moment may well be overclocked scores compared to the production model. If you consider 3DMark2001 to be one of the major American sports, you may be greatly disappointed. I could well be wrong, but the prudent should wait until somebody proves me wrong.

Breaches of Faith

Read the second paragraph here. Burying the press in info on very short notice is a common technique used when you want the press to deliver your message rather than let them discover the skeletons in the closet.

This event took stage-managing to a new level. Most of the "reviewers" only got to test under highly controlled conditions. God only knows what settings were used and what little tweaks or "video card helpers" may have been used.

Has "Quack 3" been forgotten so quickly? After that, I wouldn't believe anything from those folks without checking it thoroughly, much less with company representatives looking over my shoulder.

You might be able to explain that away if all got that treatment, but some are more equal than others, as ATI demonstrated in the case of one place.

"ATI allowed us to the (sic) test the R300 on our own testbeds and we eagerly jumped at the opportunity."

Then we find out "eagerly kneeling" would have been much more appropriate phrasing.

We'll be charitable and mostly ignore the breathless "Like A Virgin" excited hype permeating the article.

We're not charitable enough to ignore the all-too-frequent "ignore my benchmarks" comments.

Finally, it would be flat-out negligent not to point out who wears the pants in this relationship.

"Because ATI has yet to finalize drivers and clock speeds, we were only allowed to publish percent (sic) improvements over a GeForce4 Ti 4600."

This is complete nonsense. Less-than-cooked drivers and clock speeds can be a legitimate reason to not have any benchmarks at all, or to caveat the performance. It is never a legitimate reason to change how data is presented.

There's an infinitely more likely reason that ATI ordered this change of presentation. If most of these benchmarks were presented in the normal FPS format, it would be much more obvious to less than anal-retentive readers that the clobberings were occurring at very high resolutions the average person doesn't use, and thus would be less impressive.
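To make that concrete, here is a toy illustration. The frame rates below are hypothetical numbers chosen only to show how a percentages-only table hides the absolute context; they are not measurements from any review.

```python
# Hypothetical frame rates: the same two cards, two resolutions. A percentages-only
# table makes the 1600x1200 lead look headline-worthy without revealing that the
# 1024x768 numbers were already far past playable on both cards.
results = {
    "1024x768":  {"Ti4600": 150.0, "R9700": 165.0},
    "1600x1200": {"Ti4600": 25.0, "R9700": 40.0},
}

for res, fps in results.items():
    gain = 100.0 * (fps["R9700"] / fps["Ti4600"] - 1)
    print(f"{res}: +{gain:.0f}% ({fps['Ti4600']:.0f} -> {fps['R9700']:.0f} fps)")
```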

But that's not the big problem.

"Allowed?"

My, my, what a good little boy!

I guess I was wrong. This isn't allegorical whoredom, this is allegorical pedophilia.

Since when do manufacturers order reviewers around? Since when do they tell them how they may present information? Since when do self-respecting reviewers say "Yes, sir" and swallow it? And, most importantly, since when do YOU swallow it?

This is not just a matter of one particular reviewer or website. This is a slippery slope.

Because this particular one said "Yes, sir," the same will be expected from others, always with the increasingly bare bone of "we'll let you do more than the other guys" while "more" becomes less and less.

In short, the choice will be . . . .

"Ho or No"

It's a shame. The whole idea of computer hardware sites was to provide impartial news and reviews untainted by commercial interests. They are rapidly becoming worse than the computer magazines ever were.

And so cheap! You give them less, and they do more for it! Just let them play with one for a couple hours, and they'll do all kinds of tricks for you.

You can call this a lot of things, but "independent" is not one of them. There's a master here, and it's not you, the audience.

So long as you swallow this and patronize those who shovel this . . . , it's going to get worse and worse, and what you'll get will be more and more worthless, and less and less distinguishable from the company's website PR.

I'm not even going to suggest any sort of protest because 1) I don't think enough people would do it and 2) I don't think most places or the company would listen even if they did.

People will just vote with their feet like they're doing more and more, and will find alternatives that will emerge to meet the needs the websites no longer fulfill.

I'm really beginning to think they're right. I'm tired of disgust being a regular part of my morning routine. I don't even want to look at things like this anymore.

It's a shame. It's really a shame.

Conclusions

I said a few things about the card earlier on, but frankly, I trust neither the company nor the reviews enough to draw any positive conclusions from this.

After a company cooked its Radeon 8500 benchmarks like Enron and Worldcom cooked their balance sheets, would you expect to have even more faith and trust in that company than you did before? Amazingly, ATI apparently thinks you're stupid enough to do just that.

If those "reviewing" this product act like 'hos, even flaunt it, then give you a hard sell after giving you little or no reason for it, do you start trusting them more, like they were the Virgin Mary or something? I sure don't, and neither should you.

I had planned to buy this card, but frankly, I'm really turned off by this review version of 1984. I really don't want to reward such a place for such behavior by giving them my money. Anybody that makes nVidia look comparatively good frightens me.

But if I grit my teeth and buy it, you know what I should do? I should be the Anti-Ho. I should just find anything and everything wrong with this card, and just report that.

If the absolute opposite is OK, why not?

WTF????

Fuz
 
Wow...

I think of myself as a Matrox fan, simply because they are a local company and I've had most of their products since I've been "into" computers.

But I have to admit I was disappointed when I saw the Parhelia performance for the first time. It looks good, plays good, but I have this feeling I've been let down. I guess I don't "work" enough on my computer to enjoy the triple-head display.

This brings me back to ATI. Like Dave said, I can't remember a product launch where I've been as impressed. For the time being, this card is a great follow-up to the GeForce 4 / Radeon 8500 "generation".

I think that performance in today's games has reached its limit, and I find the reviewing mentality of most of the "computer hardware sites" "moronic".

Coming back to the R300, I can't find anything bad about it, since in comparison to the current products it is clearly superior (of course the NV30 may be superior, and probably will be, in some ways), but future products are months away. I find it hard to believe Chalnoth's comment about ATI stalling the 3D market; I'm trying to figure out where he's coming from and what he does for a living. Reading his messages from a distance, he seems to know his stuff (more so than me, it seems), in theory. I'm not here to bash him but to understand his opinions.

Now I can't understand why anyone would bash ATI; quite simply, anyone who does is an "Nvidiot", or one of those people who claim that the only good company was 3dfx.

But take my opinion with a grain of salt; I'm the guy who thinks nurbs is the funniest word since Seattle.

with love
muted
 
Who The Hell Runs Games At 1600X1200?

I never understand why some people think they speak for everyone. I don't run games at 1600x1200, but it's not because 1024x768 is good enough. It's because my current video card isn't fast enough.

ATI is not totally in the right here, but neither is this guy. I figure ATI rushed things so they could make the announcement before Siggraph, which starts Tuesday for the marketing folks. So I can understand the short notice on the specs. What I don't understand is the decision to allow only normalized benchmarks. The only explanation I can think of is that after the overhype of Parhelia, ATI wanted to let people know the claimed performance is for real. Still, why not just allow benchmark numbers to be posted with a disclaimer about beta drivers?
 
This post made me finally register. His site must be in dire need of hits. That's the only reason I can think of for writing such an unfounded, biased article. Maybe we can set up a charity fund to help him out. :rolleyes:
Funny too, the article made me think about Chalnoth a lot. ;)
 
Wow, I think we are all dumber just for reading that. I know I am. Seriously, I don't see how this is different from the nForce2 previews, or even the GeForce4 and Radeon 8500 previews. This just once again shows me how many people are in nvidia's pockets and why I will never again go to overclockers.com.
 
3dcgi said:
Who The Hell Runs Games At 1600X1200?

I never understand why some people think they speak for everyone. I don't run games at 1600x1200, but it's not because 1024x768 is good enough. It's because my current video card isn't fast enough.

ATI is not totally in the right here, but neither is this guy. I figure ATI rushed things so they could make the announcement before Siggraph, which starts Tuesday for the marketing folks. So I can understand the short notice on the specs. What I don't understand is the decision to allow only normalized benchmarks. The only explanation I can think of is that after the overhype of Parhelia, ATI wanted to let people know the claimed performance is for real. Still, why not just allow benchmark numbers to be posted with a disclaimer about beta drivers?

I think it's because once the benchmark numbers are out there, they're practically set in stone. It doesn't matter if it was "beta drivers" or anything. ATi sort of got burned by the whole beta-drivers thing with their last two parts and probably was not keen on having it happen again. Not to mention they haven't even settled on a final clock speed yet (I'm hoping they push it up to 330 or 335 MHz though, the more the better!).
 
Man, what a wank. Sounds like a case of sour grapes to me. The card obviously kicks major arse, and all this guy can do is whinge about it.

It's cool to hate what everyone else likes I guess. Pffft.

As for 1600x1200 - if my card could run that at speed, I'd be there in a second. It looks fantastic on either my 17" or 19".
 
It's a bit OT, but who plays at 1280x1024??? Or has their Windows set at that res?

640/480 = 1.33
800/600 = 1.33
1024/768 = 1.33
1600/1200 = 1.33

Now, if you run at 1280x1024, it seems the aspect ratio is out of whack, right? Because 1280/1024 = 1.25.

So you should be running it at 1280x960, correct? Or is there something fundamentally wrong with my logic?
 
Fuz said:
It's a bit OT, but who plays at 1280x1024??? Or has their Windows set at that res?

640/480 = 1.33
800/600 = 1.33
1024/768 = 1.33
1600/1200 = 1.33

Now, if you run at 1280x1024, it seems the aspect ratio is out of whack, right? Because 1280/1024 = 1.25.

So you should be running it at 1280x960, correct? Or is there something fundamentally wrong with my logic?

Yes, 1280x960 is the proper aspect ratio... however most games/Windows support 1280x1024 for some reason, and the difference is not noticeable.

In fact, if I want to run at 1280x960 I have to use PowerStrip or another workaround, because it's not enabled by default.
 
mech said:
I find I notice quite a difference with 1280x1024... 1280x960 looks much nicer :)
To tell you the truth, I've never even tried 1280x960 :oops:

I just don't notice any irregularities at 1280x1024
 
12x10 was used b/c it maximized the memory of old video cards (it fit perfectly or closely into their tiny amounts, back when every bit counted). Since it's 5:4, it'll make circles or other objects designed in a 4:3 ratio look squished. But it won't affect how 3D objects look--higher res means higher resolution, not the ability to fit more on the screen.

So 3D games shouldn't look deformed, but simply higher res; your 2D desktop will look slightly squished, though (circles will look like ellipses, squares like rectangles).

12x9 is ratio-correct for CRTs, but most video card makers don't include it by default, for some odd reason. 12x10 is ratio-correct for 17-19" LCDs, though, as they're physically 5:4.
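A quick numerical check of both points above (pixel squish on a 4:3 tube, plus the memory-fit argument); a sketch only, and the 8-bit desktop depth is just an example depth, not something anyone in the thread specified:

```python
# For each mode: how non-square the pixels become on a 4:3 CRT (1.0 = square),
# and how much framebuffer an old card needed for it at 8 bits per pixel.
for w, h in [(1024, 768), (1280, 960), (1280, 1024), (1600, 1200)]:
    pixel_aspect = (4 / 3) / (w / h)   # >1.0 means pixels get stretched wider than tall
    fb_mib = (w * h) / 2**20           # bytes at 8bpp, expressed in MiB
    print(f"{w}x{h}: pixel aspect {pixel_aspect:.3f}, {fb_mib:.2f} MiB at 8bpp")
```

1280x1024 comes out to exactly 1.25 MiB, which is why it slotted so neatly into small power-of-two framebuffers, and its pixel aspect of roughly 1.07 is the ~7% squish that turns circles into ellipses on a 4:3 display.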

The rant was almost totally off-base, but I don't think it was necessary to quote it here in full.
 
Not seeing the wood for the trees?
The point is that what were once supposed to be independent websites now get the choke-chain from manufacturers who decide what content will make it into their articles and how it will be represented. This is not to your good. It means you will only receive relatively unbiased reviews when products are in general circulation and journalists can afford them. This could mean two months in this case.
1600x1200 is not his point, his point is ATi spotting exactly where they are relative to the competition and ordering that benchmarks be presented in a certain way for their advantage.
 
Above said:
Not seeing the wood for the trees?
The point is that what were once supposed to be independent websites now get the choke-chain from manufacturers who decide what content will make it into their articles and how it will be represented. This is not to your good. It means you will only receive relatively unbiased reviews when products are in general circulation and journalists can afford them. This could mean two months in this case.
1600x1200 is not his point, his point is ATi spotting exactly where they are relative to the competition and ordering that benchmarks be presented in a certain way for their advantage.
No. This is the point of a preview, which the author doesn't seem to understand. When companies start interfering with the way journalists do reviews, then you have a problem; however, that's not the case here.

If that was his only point, he could've left out 9/10ths of that article, which does nothing but needlessly bash ATi and other reviewers.

God, all those whore gaming sites that say Doom3 will be good really piss me off :rolleyes:

Is ATi stopping you from writing a preview? No, last I checked the white papers are publicly available. Is ATi in any way controlling the content of your preview (other than limiting the amount of information available, as per any preview, in any industry)? No, I'm pretty sure you can write about whatever you want, good or bad, without worrying about ATi complaining. I don't see what the big deal about the percentages is... they offered a few reviewers a chance to try out the technology first hand, so they could gauge the performance for themselves, and didn't want to mislead people by giving them numbers that could change, so they offered them only relative performance instead.

The tests also were not nearly as controlled as the author seems to think. I'm sure ATi really sat down and decided that it would be a good idea to show an unplayable game to HardOCP, because they decided it would only show the 9700 in a positive light.

Network Associates once had in their EULA that you had to get written permission from them in order to write a review of any of their products. THAT is something to worry about, not whether or not a company, in any industry, controls information regarding an unreleased product.
 
I've been to quite a few previews where they've said, "You can take benchmarks, but please don't publish them", which I find fair enough, as usually the previews are done with fairly alpha drivers and possibly prerelease hardware. I thought it quite interesting that ATi went as far as they did with the benchmarks that could be published, but then that's a measure of the confidence they have in their hardware.

I have to say it's a little odd that the site chose ATi's preview to take a swipe at; it's not as though the 'Paper Launch' hasn't been around for a while. We've witnessed four prior to this one this year, and not one was ATi's.
 
It seems logical to me to bench at 1600x1200. I can't run at that res; I can just manage 1152x864 if I push my monitor to 80Hz. But if a card is capable of 4xFSAA at 1600x1200, then it's going to manage 6xFSAA at 1024x768 :)
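The rough arithmetic behind that hunch, counting colour samples per frame; a sketch that assumes AA cost simply scales with sample count, ignoring memory footprint and bandwidth-compression effects:

```python
def samples_per_frame(width: int, height: int, aa_samples: int) -> int:
    """Total colour samples the card must shade and resolve for one frame."""
    return width * height * aa_samples

high_res = samples_per_frame(1600, 1200, 4)  # 4xFSAA at 1600x1200
low_res = samples_per_frame(1024, 768, 6)    # 6xFSAA at 1024x768

print(f"{high_res:,} vs {low_res:,} samples per frame")  # 7,680,000 vs 4,718,592
```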

Most of you have monitors that are 17-to-19 inches. 1600X1200 is too tiny for those sizes. If it isn't for you (especially with 17 inches), stop shooting people in Quake, and start sharpshooting people in the Army.

Eh? Higher res doesn't make 3D smaller. :LOL: He sounds like one of those people who run CS at 800x600 because they think everyone appears bigger and so is easier to headshot :rolleyes:

It looks as if he spent quite a bit of time on that article, oh dear... :-?
 
Above said:
I think some have the misconception that ATi have allowed no benchmarks. As if ATi could direct content to little effect in such a case.
No, ATi have allowed benchmarks when they dictate the material. See the Anandtech review.

What on earth are you talking about?
 