Yet another OC-type editorial...

This one's from the editor of BarrysWorld.

Though let's be realistic... for £329, what do you want? Well, it is a DirectX 9 part, but at the moment the lack of a DX9 runtime, drivers or games means that all we're seeing is DX8.1 performance... comparing it to the likes of the Ti4600, very much a DX8.1 card, is like comparing apples to pears... At present what we're seeing is the R9700 running as a glorified Radeon 8500; it could all change with DX9.

The R9700 has eight pipelines, the Ti4600 four, so yes, at similar clock speeds in extreme conditions (ATI actually quote a maximum of 2x over the Ti4600 at 1280x1024 with 4x AA and 16x AF) the ATI will shift double what the Ti4600 can, but you really do need to stress things; at lower resolutions it's all still CPU limited and the Radeon sits on par with a Ti4200!
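
For what it's worth, the pipeline arithmetic checks out on paper. A minimal sketch of the fill-rate sums, using the commonly quoted core clocks (R9700 ~325MHz, Ti4600 300MHz; those clocks are my assumption, not figures from the editorial) and ignoring memory bandwidth, which usually limits first:

#include <cstdio>

// Back-of-envelope single-texture fill rate: pixel pipelines x core clock.
// Clock figures are the commonly quoted ones, not official specs.
int main()
{
    std::printf("R9700:  %d Mpixels/s\n", 8 * 325);  // eight pipelines
    std::printf("Ti4600: %d Mpixels/s\n", 4 * 300);  // four pipelines
    return 0;
}

Roughly 2600 vs 1200 Mpixels/s on paper, which is where the 'double' figure comes from; at normal settings, CPU and bandwidth limits eat most of that gap.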


By the time ATI had announced the GPU, NVIDIA's partners had already prepared new pricing which will bring the Ti4600 at least £100 below ATI's new baby by the time it finally hits the shelves in September... any realistic comparison between a GPU that appeared in February and one that lands in July, based on different APIs and a quantum leap apart in price, seems pretty silly.

That review from Anand, by the way, is rather interesting. Although there are obviously no NDAs in place, ATI struck an agreement with anyone using the board not to release reviews until WHQL-certified drivers were made available, which isn't quite the case as of yet... We're still throwing our board through UT2003 and the latest builds of drivers... although delivering good numbers, we currently have what can only be described politely as 'issues', with some glitching appearing with more advanced effects.

I think ATI learned a lot from the 8500, both technically and from a PR perspective; they're much more self-conscious, and although they've announced a product long before they should (and even longer before users will get to see it), the approach to coverage, especially on the driver front, is encouraging. DX9 isn't due until late October according to Microsoft; the R9700's announcement this early looks like a cheap shot to dethrone a DX8.1 card from the start of this year with a GPU which won't be appealing price-wise until maybe early 2003...

A few serious points here...
ATI always like to do things differently, and while I don't criticise diversification, this time I'm slightly concerned that end users who buy this product before October are going to feel slightly cheated. The Radeon 9700 contains full DX9 support (as in it supports all the minimum features, and a few more, to get this cert), but there are a few key features which ATI neglected...

For one, the Radeon 9700 doesn't support looping or procedural calls in the vertex shader, nor does it allow pixel shader programs to exceed 1024 instructions. While this means it can display DirectX 9 content, it also means a lot of the shaders running in next-gen games will have to run several times, which will mean a massive performance hit.
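
For context, these are exactly the sort of limits DirectX 9 exposes through the device caps. A minimal sketch of how a developer could check them at runtime, assuming the DX9 SDK headers (d3d9.h) and the documented D3DCAPS9 fields:

#include <d3d9.h>
#include <cstdio>

// Sketch: query a DX9 device's shader limits, to check claims like
// "no vertex shader flow control" or "capped pixel shader length".
void PrintShaderCaps(IDirect3DDevice9* device)
{
    D3DCAPS9 caps;
    device->GetDeviceCaps(&caps);

    std::printf("VS version: %u.%u\n",
                (unsigned)D3DSHADER_VERSION_MAJOR(caps.VertexShaderVersion),
                (unsigned)D3DSHADER_VERSION_MINOR(caps.VertexShaderVersion));

    // A part with no static flow control (loops/subroutines) reports 0 here.
    std::printf("VS static flow control depth: %d\n",
                (int)caps.VS20Caps.StaticFlowControlDepth);

    // Maximum number of pixel shader instruction slots the part exposes.
    std::printf("PS instruction slots: %d\n",
                (int)caps.PS20Caps.NumInstructionSlots);
}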

DirectX 9 is the first API to demand over 48 bits of accuracy in its calculations. All the DirectX 9 parts will carry 128 bits of floating-point accuracy, which is a major step forward, but the R9700 doesn't support _any_ discrete intermediary modes... therefore people like John Carmack, who want 64-bit FPCs, will see no benefit running on these cards. Once games using 64-bit arrive on the market, cards which support this will see a 60% calculation advantage over the likes of the R9700, because they will be able to do two 64-bit fetches in the same time that the R9700 does a single 64-bit fetch (because it's occurring within 128 bits).
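
To make the precision point concrete: 128-bit here means four 32-bit float channels per pixel, and 64-bit means four 16-bit channels. A minimal sketch of what dropping to FP16 mantissa precision costs (truncation only; it ignores FP16's smaller exponent range, denormals and NaN):

#include <cstdio>
#include <cstdint>
#include <cstring>

// Quantise a 32-bit float to FP16 (s10e5) mantissa precision, to show
// what a 64-bit (4 x FP16) pixel gives up against a 128-bit (4 x FP32) one.
float QuantiseToHalfPrecision(float f)
{
    uint32_t bits;
    std::memcpy(&bits, &f, sizeof(bits));
    bits &= ~0x1FFFu;   // FP16 keeps 10 mantissa bits vs FP32's 23
    std::memcpy(&f, &bits, sizeof(f));
    return f;
}

int main()
{
    float v = 0.123456789f;
    std::printf("FP32: %.9f   FP16 precision: %.9f\n",
                v, QuantiseToHalfPrecision(v));
    // Prints: FP32: 0.123456791   FP16 precision: 0.123413086
    return 0;
}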

The card is excessively juicy; drawing well in excess of the 40W maximum that the AGP 2.0 spec permits, ATI have had to fit a power connector on the board for a hookup to the power supply. God knows what would happen if they ever decided to attempt SLI... a UPS maybe?

Worst of all, ATI have also decided to go off on their own little path with regards to actual shader development, the basis on which the very games that it will be tested against are written.

About 12-18 months ago Microsoft started work on a shader language for DirectX 9 called HLSL (imaginatively meaning 'high-level shader language'). As you will probably know from reading online, NVIDIA have teamed up with Microsoft and taken HLSL a step further by creating a compiler and toolset which allows HLSL/Cg, or whatever you want to call it (they are in essence the same language), to be compiled to OpenGL or Direct3D... or even compiled at runtime to either API. After speaking to ATI this week, it seems that someone wasn't aware that the Microsoft and NVIDIA languages were the same... they perceived them to be totally different (which isn't the case) and have therefore been working on their own language called 'RenderMonkey' or something. RenderMonkey is totally unlike the Microsoft codebase and is more akin to the cinematic language RenderMan. This sounds good at first, until you remember that languages like RenderMan were designed with no consideration as to time; RenderMan does a lot of stuff in the background which isn't feasible even on a pokey GPU like the R9700.
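
For reference, the HLSL/Cg workflow described there boils down to compiling one high-level source against a chosen target profile. A minimal sketch against the D3DX interface from the DX9 SDK (the shader itself is a made-up example; the same source fed to Cg with an OpenGL profile is the interoperability argument in a nutshell):

#include <d3dx9.h>
#include <cstring>

// A trivial HLSL vertex shader, compiled to the vs_2_0 profile via D3DX.
const char* kSource =
    "float4x4 worldViewProj;                        \n"
    "float4 main(float4 pos : POSITION) : POSITION  \n"
    "{                                              \n"
    "    return mul(pos, worldViewProj);            \n"
    "}                                              \n";

HRESULT CompileExample()
{
    LPD3DXBUFFER shader = NULL, errors = NULL;
    HRESULT hr = D3DXCompileShader(kSource, strlen(kSource),
                                   NULL, NULL,        // no macros, no includes
                                   "main", "vs_2_0",  // entry point, profile
                                   0, &shader, &errors, NULL);
    if (shader) shader->Release();
    if (errors) errors->Release();
    return hr;
}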

In short, ATI have stepped away from Microsoft, MSDN, Dev Rel, NVIDIA, Matrox and the major shows like SIGGRAPH in an effort to strike an equal brand... while all of the major development houses have been working with Microsoft and the rest to get the new DX9 shader language out to the masses, ATI have just done something which means not only are they not as interoperable, but they're less appealing to develop for.


There's plenty more for and against the GPU at the moment; hopefully WHQL will hit soon and RenderMonkey will get canned...
Worst case, they chuck another £300K down the pan.

I have to admit this kind of thing is really ticking me off... This is chock-full of speculation and misinformation about the R300...
 
Though let's be realistic... for £329, what do you want? Well, it is a DirectX 9 part, but at the moment the lack of a DX9 runtime, drivers or games means that all we're seeing is DX8.1 performance... comparing it to the likes of the Ti4600, very much a DX8.1 card, is like comparing apples to pears... At present what we're seeing is the R9700 running as a glorified Radeon 8500; it could all change with DX9.

Glorified 8500???? He starts right off showing he is nothing more than a common mindless Nvidia fanatic. Looking at history, cards released ahead of an upcoming DirectX release usually perform WORSE in the current version of DirectX. Usually you hear something to the effect of... "you're not seeing its full potential yet"...

The R9700 has eight pipelines, the Ti4600 four, so yes, at similar clock speeds in extreme conditions (ATI actually quote a maximum of 2x over the Ti4600 at 1280x1024 with 4x AA and 16x AF) the ATI will shift double what the Ti4600 can, but you really do need to stress things; at lower resolutions it's all still CPU limited and the Radeon sits on par with a Ti4200!

So, why is this even a damn point???? Even the 8500 scores the same...

By the time ATI had announced the GPU, NVIDIA's partners had already prepared new pricing which will bring the Ti4600 at least £100 below ATI's new baby by the time it finally hits the shelves in September... any realistic comparison between a GPU that appeared in February and one that lands in July, based on different APIs and a quantum leap apart in price, seems pretty silly.

This entire statement is flat-out crap. ATi says the R300 will ship in August. Did this ass complain when everyone benched the 8500 against the Ti 4600?? Why is there ALWAYS a double standard with Nvidia types?

That review from Anand, by the way, is rather interesting. Although there are obviously no NDAs in place, ATI struck an agreement with anyone using the board not to release reviews until WHQL-certified drivers were made available, which isn't quite the case as of yet... We're still throwing our board through UT2003 and the latest builds of drivers... although delivering good numbers, we currently have what can only be described politely as 'issues', with some glitching appearing with more advanced effects.

This sounds very, very questionable in my opinion... No one else has seen any issues... yet he claims to have them in the SAME games as the other reviewers?? And even bringing up issues existing in drivers that are, by his own words, 2 months from release... please... However, I don't think there is even a shred of truth to this comment.

I think ATI learned a lot from the 8500, both technically and from a PR perspective; they're much more self-conscious, and although they've announced a product long before they should (and even longer before users will get to see it), the approach to coverage, especially on the driver front, is encouraging. DX9 isn't due until late October according to Microsoft; the R9700's announcement this early looks like a cheap shot to dethrone a DX8.1 card from the start of this year with a GPU which won't be appealing price-wise until maybe early 2003...

Again... the product is shipping in August... this is long before they should??? Now it's a "cheap shot" to dethrone the GF4... please... Would such asinine statements be made if this were an Nvidia card? Plus he is intentionally not telling the truth anyway. The GF4 Ti is still almost $400.

A few serious points here...
ATI always like to do things differently, and while I don't criticise diversification, this time I'm slightly concerned that end users who buy this product before October are going to feel slightly cheated. The Radeon 9700 contains full DX9 support (as in it supports all the minimum features, and a few more, to get this cert), but there are a few key features which ATI neglected...

For one, the Radeon 9700 doesn't support looping or procedural calls in the vertex shader, nor does it allow pixel shader programs to exceed 1024 instructions. While this means it can display DirectX 9 content, it also means a lot of the shaders running in next-gen games will have to run several times, which will mean a massive performance hit.

If ATi is meeting the DX9 spec, and Nvidia has gone beyond it... who is the one going off on their own??? Not ATi. It gets so, so frustrating constantly having to put up with Nvidia types' constant use of inverted logic. First, I think he is flat-out WRONG about the vertex shaders. Second, how is the 1024-instruction limit any different from PS 1.3 vs PS 1.4? This crap pisses me off to no end. It is misleading and dishonest. I can't stand hypocrisy. Besides, HONEST developers will be writing to the DX9 SPEC, not Nvidia's personal little PS world.

DirectX 9 is the first API to demand over 48 bits of accuracy in its calculations. All the DirectX 9 parts will carry 128 bits of floating-point accuracy, which is a major step forward, but the R9700 doesn't support _any_ discrete intermediary modes... therefore people like John Carmack, who want 64-bit FPCs, will see no benefit running on these cards. Once games using 64-bit arrive on the market, cards which support this will see a 60% calculation advantage over the likes of the R9700, because they will be able to do two 64-bit fetches in the same time that the R9700 does a single 64-bit fetch (because it's occurring within 128 bits).

Again... this is just absurd. This totally smacks of a secret Nvidia PDF... John Carmack HIMSELF said the 9700 had the PERFECT FEATURE SET FOR DOOM!!!! The 9700 also has twin stencil registers PER PIPELINE...

The card is excessively juicy; drawing well in excess of the 40W maximum that the AGP 2.0 spec permits, ATI have had to fit a power connector on the board for a hookup to the power supply. God knows what would happen if they ever decided to attempt SLI... a UPS maybe?

It's designed for AGP 8x..... What the hell kind of an immature comment is this in the first place?

Worst of all, ATI have also decided to go off on their own little path with regards to actual shader development, the basis on which the very games that it will be tested against are written.

This is straight-up bullshit. Plain and simple.

About 12-18 months ago Microsoft started work on a shader language for DirectX 9 called HLSL (imaginatively meaning 'high-level shader language'). As you will probably know from reading online, NVIDIA have teamed up with Microsoft and taken HLSL a step further by creating a compiler and toolset which allows HLSL/Cg, or whatever you want to call it (they are in essence the same language), to be compiled to OpenGL or Direct3D... or even compiled at runtime to either API. After speaking to ATI this week, it seems that someone wasn't aware that the Microsoft and NVIDIA languages were the same... they perceived them to be totally different (which isn't the case) and have therefore been working on their own language called 'RenderMonkey' or something. RenderMonkey is totally unlike the Microsoft codebase and is more akin to the cinematic language RenderMan. This sounds good at first, until you remember that languages like RenderMan were designed with no consideration as to time; RenderMan does a lot of stuff in the background which isn't feasible even on a pokey GPU like the R9700.

This is again pure bullshit... And what the hell is that comment at the end about a 'pokey GPU'????

In short, ATI have stepped away from Microsoft, MSDN, Dev Rel, NVIDIA, Matrox and the major shows like SIGGRAPH in an effort to strike an equal brand... while all of the major development houses have been working with Microsoft and the rest to get the new DX9 shader language out to the masses, ATI have just done something which means not only are they not as interoperable, but they're less appealing to develop for.


There's plenty more for and against the GPU at the moment; hopefully WHQL will hit soon and RenderMonkey will get canned...
Worst case, they chuck another £300K down the pan.

Please... these last comments are just as absurd as the ones above...

What this demonstrates yet again is the overwhelming mindless Nvidia worship going on out there on the internet. It is really, really sad...
 
Can we stop posting this crap here? I couldn't care less about some idiot's lame website criticizing the R300. If Anandtech posted this, it might be valid, but Barry's World? Come on. Let the morons with blogs write their diatribes against the R300. They're irrelevant.
 
mmmkay...

Man, where do some people get off?

That has to be one of the most hypocritical, self-contradictory "editorials" I've ever seen.

I don't like to use the term, because I think it propagates a negative attitude towards nVidia's fine products and those who use them without stupidity, but 'nVidiot' is the only word for this guy.

Somebody needs to tell this guy that the folks at nVidia are doing just fine on their own and they certainly don't need the likes of him defending them.
 
DemoCoder said:
Can we stop posting this crap here? I couldn't care less about some idiot's lame website criticizing the R300. If Anandtech posted this, it might be valid, but Barry's World? Come on. Let the morons with blogs write their diatribes against the R300. They're irrelevant.

That's true; who the hell is Barry anyway?

But, and this is a big but... what happens when this garbage (not exactly the same thing, but you get the idea) is printed in major magazines, the magazines the average computer users buy??? That, IMHO, will have a lot more impact than Anand's or Tom's sites.

It's been happening for a while now, and it does no good for the "underdog". AMD had to fight through this crap for a long time; let's hope ATI can do the same, for competition's sake.
 
Guys, BarrysWorld is the biggest game site in Europe... It's like HomeLanFed on steroids... You can even rent game servers monthly from them...

They are HUGE in Europe.
 
He's wrong anyway:

For one, the Radeon 9700 doesn't support looping or procedural calls in the vertex shader, nor does it allow pixel shader programs to exceed 1024 instructions. While this means it can display DirectX 9 content, it also means a lot of the shaders running in next-gen games will have to run several times, which will mean a massive performance hit.

Straight from the ATi docs:

Version 2.0 vertex shaders add new flow control commands, including loops, jumps and subroutines. As standard features in higher-level languages like C, these commands make it easier to code vertex shaders and enhance processing efficiency.
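
That static flow control looks like this at the vs_2_0 assembly level. A minimal sketch (an illustrative fragment assembled via D3DX, not taken from any real title; i0 holds the loop count and aL is the loop counter):

#include <d3dx9.h>
#include <cstring>

// vs_2_0 assembly using the static loop the ATi docs describe above.
const char* kVS20Loop =
    "vs_2_0                  \n"
    "dcl_position v0         \n"
    "mov r0, v0              \n"
    "loop aL, i0             \n"   // VS 2.0 static flow control
    "  add r0, r0, c[aL]     \n"   // aL indexes into the constant file
    "endloop                 \n"
    "mov oPos, r0            \n";

HRESULT AssembleExample()
{
    LPD3DXBUFFER code = NULL, errors = NULL;
    HRESULT hr = D3DXAssembleShader(kVS20Loop, strlen(kVS20Loop),
                                    NULL, NULL, 0, &code, &errors);
    if (code) code->Release();
    if (errors) errors->Release();
    return hr;
}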

I've met the guy that runs Barry's World a couple of times and I can say this type of 'editorial' doesn't surprise me at all...
 
How about the comment that it can only do 128-bit floating point, and nothing less? I seem to recall comments from ATi reps that directly contradict that. *shrug* It is pretty well established that most internet "journalism" operates without even the responsibility and ethics present in print and broadcast journalism. I don't think it is a matter of underhanded kickbacks, as some have implied; I think it is a matter of the natural response of some people to having their egos stroked by any vendor, and some psychological association of their... hmm... "manhood" with a particular brand and the "power" they associate with it. :rolleyes: I'm not saying anything new, but it is interesting how misinformation and distortion flourish when that brand/"manhood", whichever brand it is at the time, is perceived to be challenged.

Now, I believe that any human being is capable of behavior that we stereotypically associate with a particular gender, but on the principle that one gender is more likely to exhibit particular behavior, due to the environment produced by social expectation as well as biology/hormones, I can't help thinking that the state of computer technology reviews would be much improved if there were a lot more women doing the editorials and reviews. Sort of makes me embarrassed to be a guy, sometimes. :-?
 
How about the comment that it can only do 128-bit floating point, and nothing less? I seem to recall comments from ATi reps that directly contradict that.

I don't understand his argument there anyway. If the pipeline's native precision is 128 bits (and it doesn't need pipeline combining, multiple internal passes etc. to achieve that), then internally it's going to be just as fast at 128-bit FP as it is at 32-bit integer. Once it gets to the DAC it's still going to be 32-bit (or even 16-bit, but I doubt it) integer.
 
Obviously in the UK and Europe editorials like that from BW hold a lot of sway with the online gaming community, and let's face it, it's that hardcore community who will purchase this card (or not).

I agree with DemoCoder in a way, though: do we have to have every wrong statement about the R300 posted here?

Maybe to learn where these guys are wrong, so a constructive e-mail response to the editorial can be made, eh? Over to you, Hellbinder; let us know BW's response?
 
Somehow, I have great difficulty imagining HellBinder writing an email with constructive criticism regarding the R300...

And I agree, I don't see any reason why every biased R300 article needs to be posted here. And yes, I have to at least try to understand HellBinder's point of view, being a fanboy of sorts myself (though for a different company), but isn't it a bit too much to try to make everyone love the R300?? Heck, as long as there are graphics companies, there will be fanboys and biased articles.
 
DaveBaumann said:
How about the comment that it can only do 128-bit floating point, and nothing less? I seem to recall comments from ATi reps that directly contradict that.

I don't understand his argument there anyway. If the pipeline's native precision is 128 bits (and it doesn't need pipeline combining, multiple internal passes etc. to achieve that), then internally it's going to be just as fast at 128-bit FP as it is at 32-bit integer. Once it gets to the DAC it's still going to be 32-bit (or even 16-bit, but I doubt it) integer.

Well, the nVidia presentation seems to indicate that you can use less precision for greater performance... perhaps it would only matter in multipass situations and other times when you'd be reading 128-bit data through the memory controller (depending on the caching system, I'd think), such as displacement mapping.
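
The bandwidth half of that argument is simple arithmetic. A quick sketch with illustrative figures (1280x1024 at 60fps; assumptions for the sake of the sums, not measurements of any card):

#include <cstdio>

// Back-of-envelope bandwidth for reading an intermediate buffer back
// through the memory controller in a multipass setup.
int main()
{
    const long pixels = 1280L * 1024L;
    const int  fps = 60;

    // 4 channels x 32-bit float = 16 bytes/pixel; 4 x 16-bit = 8 bytes.
    const double fp32MBps = pixels * 16.0 * fps / 1.0e6;
    const double fp16MBps = pixels * 8.0  * fps / 1.0e6;

    std::printf("128-bit buffer: %.0f MB/s per pass\n", fp32MBps);  // ~1258
    std::printf(" 64-bit buffer: %.0f MB/s per pass\n", fp16MBps);  // ~629
    return 0;
}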
 
I think Sweden is in Europe, and I've never heard of any BarrysWorld.

Even if it were a better-known site, I'd still say you should keep it out of here.

I think I'm finally going to get me a sig.
 
Basic said:
I think Sweden is in Europe, and I've never heard of any BarrysWorld.

Even if it were a better-known site, I'd still say you should keep it out of here.

I think I'm finally going to get me a sig.
Same over here in France ;)
 
Hmm, that's BarrysWorld for you. :) They aren't quite as huge as has been said, and they certainly don't hold much sway over European/UK buyers (not that I'm aware of, anyway).

I agree with the above posts that you really don't need to mention every nVidiotic article blasting the R300 (that one was only a forum post :)). It's not too surprising that these are appearing (especially somewhere like BarrysWorld :D) and there's bound to be a few more yet. Ignore them :).

Personally, I've never seen any point in brand loyalty; I'm a student on a limited budget, and brand loyalty is not very economical.
 