R520 = Disappointment

Well...

It has to be said by someone, so I'll say it.

What the *** was ATI thinking?

No, this is not because I was expecting the R520 to walk all over the GTX. But I sure as heck didn't expect it to LOSE in all but maybe 3 games out there. When your new card that everyone has waited for loses in DoD: Source, whose type of rendering is supposed to be your strong suit, you have a problem. The only game it *supposedly* dominates in is FEAR, and that is likely simply because of the 512MB frame buffer. In fact, FiringSquad shows the R520 losing in FEAR in every single one of their tests.

The R520 is the re-advent of the 8500: *supposedly* superior features and a near-complete belly flop in performance.

I was at least expecting the card to maintain a 10-15% lead. Something... surely ATI would not release a card that is months late and still loses the majority of the benchmarks, most by 10 FPS or more... Apparently not.

Hell, NVIDIA didn't even need the new "Det 80s".

I don't know what the heck has happened over there at ATI, but they have completely lost any mindshare edge they may have gained with the R300.

This is going to be some interesting marketing spin the next couple weeks.

(By the way, apparently you simply can no longer trust *anything* ATI may tell you about their products' performance or features before you actually see it for yourself. That includes private or public statements.)
 
??

Most of what I saw/read indicates that the XT is as fast as or faster than the GTX overall. Not much, but faster overall. On top of that, it has a superior feature set.

The main question with the entire line-up is going to be availability (for certain SKUs) and street pricing.

Seems to me that ATI was "thinking" just fine. They just didn't execute as well as they could/should have, with that "soft ground" issue tripping them up.

I think that people in general are just waaaay too over-optimistic these days with new GPU launches (from all sides).
 
I think you are overreacting; I quite like it.

You need to relax and go get a Starbucks, dude. :)

[RANT] When oh when is ATI going to fix their OpenGL implementation? [/RANT]

O-T:
Regular participants of our forums may well have been aware of some reasonably obscure numbering schemes for many months that were used to describe parts of the performance characteristics, such as 16-1-1-1 for R520, 4-1-3-2 for RV530 and 4-1-1-1 for RV515, but until now the exact meaning of these weren't known. With the specifications for each of the chips we can now derive that the first number equates to the number of ROP's, the second the number of texture units per ROP, the third the number of "shader pipes" per ROP, and finally the Z/Stencil multiplier per ROP - with these figures in mind, we'll let you consider the ramifications for parts in the R5xx series that are still to come...

I thought R520 had come already; do you not mean RV5xx? ;)
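
To make that scheme a bit more concrete, here's a quick back-of-the-envelope sketch (Python; this is just my own reading of the quote above, not anything official) that expands the per-ROP figures into per-chip totals:

[code]
# Rough sketch: expanding the "A-B-C-D" scheme quoted above.
# Reading (per the quote): A = ROPs, B = texture units per ROP,
# C = "shader pipes" per ROP, D = Z/stencil multiplier per ROP.
CHIPS = {
    "R520":  "16-1-1-1",
    "RV530": "4-1-3-2",
    "RV515": "4-1-1-1",
}

for name, scheme in CHIPS.items():
    rops, tex_per_rop, pipes_per_rop, z_mult = map(int, scheme.split("-"))
    print(f"{name}: {rops} ROPs, "
          f"{rops * tex_per_rop} texture units, "
          f"{rops * pipes_per_rop} shader pipes, "
          f"{rops * z_mult} Z/stencil per clock")
[/code]

That multiplication is how you get 12 "shader pipes" for RV530 (4 x 3), and why a hypothetical 16-1-3-1 part works out to 48, as comes up later in the thread.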
 
What about the XL, and the X1600/X1300 line? They seem pretty unimpressive performance-wise.

I really, really want to see somebody compare the X1800XT and X800XT at similar clocks.
 
It's the XL that is the major disappointment. How can they release it with such crappy clocks? It's at GTX pricing yet can barely compete with the GT. They either need to drop the price or raise the clocks.
 
Benchmarks are misleading, IMO. Quality-wise ATI is superior; of course the numbers don't reflect that.

I've read 3-4 reviews, then I stopped reading. It's a waste of my time. If reviewers can't get their act together, why even make reviews?
 
It's just late. Need to see OC benchies.

But is the marginal increase of the X1800 worth what, $150 more? Maybe in some newer games we'll see more of a difference, as someone else suggested.


Hopefully R580 isn't pushed back. Need something better on the table. But the features introduced are nice.
 
?

I have not read all the reviews, but I am looking at ExtremeTech's charts and it seems like it is more tit for tat, with the X1800 XT doing well with AA/AF. Just looking at SOME of the settings where the X1800 XT seems to beat out the 7800 GTX:

Far Cry
ATI 12x10 w/AA/AF 106fps
NV 12x10 w/AA/AF 89fps
ATI 16x12 w/AA/AF 81fps
NV 16x12 w/AA/AF 65fps

Splinter Cell
ATI 12x10 w/AF 61fps
NV 12x10 w/AF 47fps
ATI 16x12 w/AF 45fps
NV 16x12 w/AF 35fps

Serious Sam
ATI 16x12 59fps
NV 16x12 56fps
ATI 16x12 w/AA/AF 48fps
NV 16x12 w/AA/AF 41fps

CoD2
ATI 12x10 w/AA/AF 55fps
NV 12x10 w/AA/AF 37fps
ATI 16x12 47fps
NV 16x12 42fps

Half-Life 2
ATI 12x10 w/AA/AF 98fps
NV 12x10 w/AA/AF 88fps
ATI 16x12 w/AA/AF 83fps
NV 16x12 w/AA/AF 69fps

That is more than 3 games. The only game I found on their site where the 7800 GTX really separated itself was, surprise, Doom 3. Based on the tests ExtremeTech did comparing the X1800 XT and 7800 GTX, they deduced:

@ 1280x1024 => X1800 is 3% faster than 7800GTX on average
@ 1280x1024 4xAA 8xAF => X1800 is 15% faster than 7800GTX on average
@ 1600x1200 => X1800 is 3% faster than 7800GTX on average
@ 1600x1200 4xAA 8xAF => X1800 is 16% faster than 7800GTX on average

Obviously this is not conclusive, (no single review is) but it seems that the R520 is significantly faster in some new games (CoD2, SC:CT) with certain settings.
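
For anyone who wants to sanity-check those gaps, here's a rough sketch (Python; the fps pairs are just the 1600x1200 figures quoted above, AA/AF where listed and AF only for Splinter Cell, and a plain mean over hand-picked wins will naturally come out higher than ExtremeTech's full-suite averages):

[code]
# Rough sketch: per-game % advantage of the X1800 XT over the 7800 GTX,
# using the 1600x1200 figures quoted above (only titles the XT won, so the
# average here is higher than the ~16% full-suite figure with AA/AF).
results = {              # (ATI fps, NV fps)
    "Far Cry":       (81, 65),
    "Splinter Cell": (45, 35),
    "Serious Sam":   (48, 41),
    "Half-Life 2":   (83, 69),
}

gaps = []
for game, (ati, nv) in results.items():
    pct = (ati / nv - 1) * 100
    gaps.append(pct)
    print(f"{game}: X1800 XT ahead by {pct:.0f}%")

print(f"Average across these titles: {sum(gaps) / len(gaps):.0f}%")
[/code]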

To me the question is: What settings do you play with?

Maybe I have not read enough reviews yet (a good possibility), but it does not appear to me that the X1800 is "flopping" as you say.

Competitive performance with good features (Avivo, SM3.0, FP16 blending + MSAA, etc). Were people expecting a miracle?
 
The Baron said:
He most certainly does not.

16-1-3-1

:)

16*3=48, *rubs hands with glee.
*thinks about 8 month wait :(
Might be a bit big for 90nm I think.
Could someone please elaborate on the structure of the ALUs contained in the RV530?
 
Yeah, I have not read all of them, but for the most part the X1800 XT is just a bit faster with better IQ, which is where we all expected it to be for the current gen...
 
Hellbinder has a long history of irrational exuberance about ATI's cards before release, then crushing disappointment after they're out.

R520 is a great chip and these are solid boards. It competes on performance and is a clear and notable win on features and image quality. Yeah, it should have been out 6 months ago, but nothing can change that now.

Hellbinder, what were you expecting?
 
Analyzing the R520's feature set, it has two major advantages and two minor advantages:
- Better AF (with a performance hit)
- HDR AA support.
- 6x MSAA.
- 3Dc.
The G70, on the other hand, has one major advantage and one minor advantage:
- Less hacky Dual-GPU support (Crossfire2 doesn't seem bad, but compared to SLI2... heh)
- Vertex Texturing (that'd be a major advantage if it had better performance and/or supported filtering)

Performance-wise, the R520 architecture has a couple of advantages:
- Made with higher clock speeds in mind. Call the high clocks a "way to keep the crown" all you want, but design-wise, things like the memory bus system clearly were built with clock speed in mind, so that's simply not true.
- Considering that, the VS performance is downright amazing compared to the G70's.
- The Pixel Shader has some very nice efficiency, especially so regarding things like branching.

But the G70 still has higher raw performance when it comes to sheer ALU power (ShaderMark is a perfect example of my point here), and in those cases the G70 is already very near peak efficiency (so the R520 really can't do much more per quad per clock). And amusingly enough, the G70's performance/watt is higher, too. It sounds like the performance/watt of the G71/G72 is going to eclipse ATI's RV515/RV530 as well; this could be the turning point for NVIDIA to take over the mobile GPU market, which has traditionally been more of ATI's playground.

At the same time, I'm interested in what's going to happen with the Turion and the C51M. Jen-Hsun said they hoped it would gain them some market share in that quite particular segment, but I'm not sold on that statement just yet (mostly because of the Turion, really). I'm going OT, though ;)

Overall, I'd have liked reviewers to try harder to find apples-to-apples settings. There's not much they can do regarding AA (although there is some, once Crossfire2 is available...), but regarding AF, I have yet to see any site run a bunch of real comparisons to find the most nearly identical settings, instead of just testing everything at 16xAF.

And where are the "X850 XT vs underclocked X1800 XT" comparisons, goddamnit?! :smile:


Uttar
P.S.: I'm quite convinced NVIDIA didn't consider this enough of a flop-launch not to retaliate, cf. "Next NV High-End" thread.
 
It appears to me that some people watch the battle between the two IHV titans as a gladiatorial fight.
" Someone has to die, dammit ! I didn't pay for a draw ! "

I say the X1800 family is just where it should be. The 7800 GTX is a great product; still, IMO, the X1800 XT is just as good, if not better. Of course much can be said about availability and pricing, and how it loses in title X...
It is an innovative product. It was a pleasure reading about the architecture.
Having a X800 XL, there is no doubt for me that my next card will be a X1800 XL. :D (not a 7800 GT)
 
Most of the reviews I've been seeing are of the X1800 XT against a 7800 GTX with the original stock speeds, but recent boards have been pushing stock clock speeds on the 7800 GTX's to 450/1250 and even higher. I'm curious to see what a 7800 GTX at that speed versus the X1800 XT would look like, and I'd also like to see what a 512MB 7800 GTX card would do at the higher resolutions with AA/AF enabled since many reviewers commented that the 256MB of the GTX was probably holding it back in some benchmarks.
 
As far as it goes, the R520 delivers exactly what it ought to considering when it was released. That is, slightly better performance with better IQ.

If it had been released when it was supposed to, it would've blown the G70 out of the water, but as it stands it merely meets expectations and nothing more.
 
tEd said:
Benchmarks are misleading, IMO. Quality-wise ATI is superior; of course the numbers don't reflect that.

I've read 3-4 reviews, then I stopped reading. It's a waste of my time. If reviewers can't get their act together, why even make reviews?

Yep. There's still been a steady trickle of stories about how NVIDIA still seems to run benchmarks faster than in-game play. What a surprise, especially since NVIDIA shipped another unsupported beta driver for websites to use against ATI's new release. Wonder if those will ever get an official release, or will they be another "buggy" Det set that just seems to get high scores in benchmarks?
 
colinisation said:
I thought R520 had come already; do you not mean RV5xx? ;)

I think Dave might be alluding to R580, just a thought. ;)

As for R520...I haven't made up my mind yet. I'm waiting for the rest of Dave's article.
 
Uttar said:
- Less hacky Dual-GPU support (Crossfire2 doesn't seem bad, but compared to SLI2... heh)

P.S.: I'm quite convinced NVIDIA didn't consider this enough of a flop-launch not to retaliate, cf. "Next NV High-End" thread.

For this generation, I bet Crossfire will not be hacky. The resolution issue will be gone.
Two dual-link DVI outputs is a minor advantage you missed.
 