Hellbinder
This one's from the editor of BarrysWorld.
I have to admit this kind of thing is really ticking me off... This is chock-full of speculation and misinformation about the R300...
Though let's be realistic... for £329 what do you want? Well, it is a DirectX 9 part, but at the moment the lack of a DX9 runtime, drivers or games means that all we're seeing is DX8.1 performance... comparing it to the likes of the Ti4600, very much a DX8.1 card, is like comparing apples to pears... At present what we're seeing is the R9700 running as a glorified Radeon 8500; it could all change with DX9.
The R9700 has eight pipelines, the Ti4600 four, so yes, at similar clock speeds in extreme conditions (ATI actually quote a maximum of 2x over the Ti4600 at 1280x1024 with 4x AA and 16x AF) the ATI will shift double what the Ti4600 can, but you really do need to stress things; at lower resolutions it's all still CPU-limited and the Radeon sits on par with a Ti4200!
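(Back-of-the-envelope for that doubling claim... the clock figures here are my own assumptions, roughly a 325MHz core for the 9700 Pro and 300MHz for the Ti4600, not numbers quoted anywhere above:

\[
\text{peak fillrate} = \text{pipelines} \times \text{core clock}: \quad 8 \times 325\,\text{MHz} \approx 2.6\ \text{Gpixel/s} \quad \text{vs} \quad 4 \times 300\,\text{MHz} \approx 1.2\ \text{Gpixel/s},
\]

a little over 2x on paper, which is exactly why it only shows up once fillrate rather than the CPU is the bottleneck.)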
By the time ATI announced the GPU, NVIDIA's partners had already prepared new pricing which will bring the Ti4600 at least £100 below ATI's new baby by the time it finally hits the shelves in September... any realistic comparison between a GPU that appeared in February and one that lands in July, based on different APIs and a quantum leap apart in price, seems pretty silly.
That review from Anand, by the way, is rather interesting; although there are obviously no NDAs in place, ATI struck an agreement with anyone using the board not to release reviews until WHQL-certified drivers were made available, which isn't quite the case yet... We're still throwing our board through UT2003 and the latest builds of the drivers... although they're delivering good numbers, we currently have what can only be described politely as 'issues', with some glitching appearing with more advanced effects.
I think ATI learned a lot from the 8500, both technically and from a PR perspective; they're much more self-conscious, and although they've announced a product long before they should have (and even longer before users will get to see it), the approach to coverage, especially on the driver front, is encouraging. DX9 isn't due until late October according to Microsoft; the R9700's announcement this early looks like a cheap shot to dethrone a DX8.1 card from the start of this year with a GPU which won't be appealing price-wise until maybe early 2003...
A few serious points here...
ATI always like to do things differently, and while I don't criticise diversification, this time I'm slightly concerned that end users who buy this product before October are going to feel slightly cheated. The Radeon 9700 contains full DX9 support (as in it supports all the minimum features, and a few more, to get that cert) but there are a few key features which ATI neglected...
For one, the Radeon 9700 doesn't support looping or procedural calls in the vertex shader, nor does it allow pixel shader instructions to exceed 1024 operands. While this means it can display DirectX 9 content, it also means a lot of the shaders running in next-gen games will have to run several times, which will mean a massive performance hit.
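(To put an illustrative number on that 'run several times' point... the figures here are made up by me, not anything from ATI or the developers: if a shader needs N instructions and the hardware caps a single pass at L, the driver has to split it into

\[
\lceil N / L \rceil \ \text{passes}, \qquad \text{e.g. } N = 3000,\ L = 1024 \ \Rightarrow\ 3\ \text{passes},
\]

with every extra pass redrawing the geometry and shuttling intermediate results through a buffer, so you're looking at roughly three times the work plus the extra memory traffic.)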
DirectX 9 is the first API to demand over 48 bits of accuracy in its calculations. All the DirectX 9 parts will carry 128 bits of floating-point accuracy, which is a major step forwards, but the R9700 doesn't support _any_ discrete intermediary modes... therefore people like John Carmack who want 64-bit FPCs will see no benefit running on these cards. Once games using 64-bit arrive on the market, cards which support this will see a 60% calculation advantage over the likes of the R9700, because they will be able to do two 64-bit fetches in the same time that the R9700 does a single 64-bit fetch (because it's occurring within 128 bits).
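(Reading that as simple fetch arithmetic, and this is my own back-of-the-envelope rather than figures from the text above: over a 128-bit path, a card with a discrete 64-bit mode packs

\[
128 / 64 = 2
\]

values into each fetch, where a card that only runs at full 128-bit width moves one 64-bit value per fetch... 2x the raw throughput on 64-bit data, with the 60% presumably being what survives the other overheads.)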
The card is excessively juicy: drawing well in excess of the 40W maximum that the AGP 2.0 spec permits, ATI have had to fit a power connector on the board for a hookup to the power supply. God knows what would happen if they ever decided to attempt SLI... a UPS, maybe?
Worst of all, ATI have also decided to go off on their own little path with regards to actual shader development, the basis on which the very games that it will be tested against are written.
About 12-18 months ago Microsoft started work on a shader language for DirectX 9 called HLSL (imaginatively meaning 'High Level Shader Language'). As you will probably know from reading online, NVIDIA have teamed up with Microsoft and taken HLSL a step further by creating a compiler and toolset which allows HLSL/Cg or whatever you want to call it (they are in essence the same language) to be compiled to OpenGL or Direct3D... or even compiled at runtime to either API. After speaking to ATI this week, it seems that someone wasn't aware that the Microsoft and NVIDIA languages were the same... they perceived them to be totally different (which isn't the case) and have therefore been working on their own language called 'RenderMonkey' or something. RenderMonkey is totally unlike the Microsoft codebase and is more akin to the cinematic language RenderMan. This sounds good at first until you remember that languages like RenderMan were designed with no consideration for time; RenderMan does a lot of stuff in the background which isn't feasible even on a pokey GPU like the R9700.
In short, ATI have stepped away from Microsoft, MSDN, Dev Rel, NVIDIA, Matrox and the likes of the major shows like SIGGRAPH in an effort to strike out as an equal brand... while all of the major development houses have been working with Microsoft and the rest to get the new DX9 shader language out to the masses, ATI have just done something which means not only are they not as interoperable, but they're less appealing to develop for.
There's plenty more for and against the GPU at the moment; hopefully WHQL will hit soon and RenderMonkey will get canned...
Worst case, they chuck another £300K down the pan.