R520 = Dissapointment

Mariner said:
The only reason street prices should be of interest at this point in time is if you intend to buy during the next week or so. The only price of any product that counts is the one which you actually pay, not the price it was in the past.

Well, when do you think all those people waiting on the release of the X1800 line are going to be making their purchases? By the time the low-end X1800 parts reach price parity with the competing 6xxx products, the low-end 7xxx parts will probably be out anyway.
 
wireframe said:
This is the first review I read with power usage numbers. If these numbers are accurate it does not bode well for a mobile R520 part. That thing eats powerlines. Usually I am not one to worry too much about power consumption, as long as the cooling solution doesn't drive you nuts, but it doesn't look good for ATi in the power management area compared to Nvidia. Comparing the values when idle is almost shocking.


Yeah, I've been wondering how well a chip that lives on clocks will work in a notebook. That said, the load figures are similar, and I would suggest that it's the load figures, not idle, that are key. I am assuming that the idle consumption is easier to address with a mobile chip (lower the clock massively, perhaps turn bits of it off, etc.).
 
caboosemoose said:
Yeah, I've been wondering how well a chip that lives on clocks will work in a notebook. That said, the load figures are similar, and I would suggest that it's the load figures, not idle, that are key. I am assuming that the idle consumption is easier to address with a mobile chip (lower the clock massively, perhaps turn bits of it off, etc.).
Notebook versions aren't planned at the moment (I'm wondering if they are looking to R580 for that now). However, "Idle" implies Windows 2D, where NVIDIA clocks down but ATI doesn't on the desktop parts yet (Terry mentioned in his presentation that they will be soon, IIRC).
 
wireframe said:
I think it is quite straightforward: ATi observed a problem in clocking the R520 cores. They did some respins and metal-layer changes to try to fix the problem before they discovered that one of their layout tools was defective. This helped but did not fully resolve the problem. The improved, but not "perfect", cores will be used for the XLs, while the latest respin, with the core problem resolved, will be used for the XTs.

This is not good, and I am not sure how ATi will be punished for this information leaking out. The XL or GT (from Nvidia) class products are often popularized by their ability to be overclocked to XT or GTX/Ultra levels. Even if not everyone succeeds in doing this with their board, the hope is there and it sweetens the deal. It looks like no such chance exists with the XLs that ATI will bring to market at launch. Presumably later batches will not have this problem, as they will be based on the "perfect" core revision. This may hurt sales as people wait instead of buying, giving Nvidia more time to plan their response.
I imagine they kept these parts on hand just so they would have product to launch with the previews. I would expect that within a couple of months, as the new respin comes fully online, the XL cards will be made from the less-than-perfect chips of that batch, thus allowing good overclocks.

So if you are an avid overclocker who wants to remain within a budget, I would wait about 2 months and then get a newer-rev card :)
 
trinibwoy said:
Well, when do you think all those people waiting on the release of the X1800 line are going to be making their purchases? By the time the low-end X1800 parts reach price parity with the competing 6xxx products, the low-end 7xxx parts will probably be out anyway.

They will make their purchases when they decide they want to!

If they want to buy a card now, they can choose the product whose current price/performance suits them best. At present in the graphics card market, this means that some NV chips are at an advantage in comparison to ATI chips - this is the benefit of NV's successful execution as opposed to ATI's delays.

I was railing against the pointless criticism of a new product's MSRP based on a comparison to the street price of one which has been out for ages. The only price that matters is the one you pay, not the MSRP. Choose with your wallet.
 
Mariner said:
I was railing against the pointless criticism of a new product's MSRP based on a comparison to the street price of one which has been out for ages. The only price that matters is the one you pay, not the MSRP. Choose with your wallet.

I don't think people are disagreeing with you. But you seem to ignore the possibility that street price = MSRP for several weeks/months after launch on these new parts.
 
991060 said:
This is probably the most remarkable achievement of the R5xx line:

http://techreport.com/reviews/2005q4/radeon-x1000/shadermark-shadow.gif
More importantly, that's where Carmack is heading with id's next-gen engine, and it's also what Epic is doing in Unreal Engine 3. They're talking about soft shadows via shadow maps, sped up with dynamic branching.
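
To make the branching point concrete, here's a rough CPU-side sketch (Python with a toy binary shadow map; hypothetical code, not id's or Epic's actual shader) of why dynamic branching pays off for shadow-map soft shadows: most pixels land fully lit or fully in shadow, and an early-out branch skips the expensive PCF loop for all of them.

Code:
# Toy PCF with a branch-based early-out. Assumes a binary shadow map
# where 1.0 = lit and 0.0 = occluded (made-up data, CPU-side only).

def shadow_tap(shadow_map, x, y):
    # Clamp to the map edges so the kernel never reads out of bounds.
    y = min(max(y, 0), len(shadow_map) - 1)
    x = min(max(x, 0), len(shadow_map[0]) - 1)
    return shadow_map[y][x]

def soft_shadow(shadow_map, x, y, r=2):
    # Cheap test: the four corner taps of the filter footprint.
    corners = sum(shadow_tap(shadow_map, x + dx, y + dy)
                  for dx in (-r, r) for dy in (-r, r))
    # Dynamic branch: if all four corners agree, the pixel is in full
    # light or full shadow and the big filter kernel can be skipped.
    if corners == 0:
        return 0.0
    if corners == 4:
        return 1.0
    # Penumbra: fall through to the full (2r+1)^2-tap PCF average.
    taps = [shadow_tap(shadow_map, x + dx, y + dy)
            for dx in range(-r, r + 1) for dy in range(-r, r + 1)]
    return sum(taps) / len(taps)

# e.g. soft_shadow(depth_rows, 5, 5) on a 2-D list of 0.0/1.0 values

On hardware where dynamic branching is slow, the early-out costs more than it saves, which is exactly why R520 getting branching right matters for this workload.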

It looks like this stencil shadow strength of NVidia's (esp. w/ reverse z-test), which everyone is mistaking for ATI's general OpenGL weakness, is pretty fleeting in the grand scheme of things.

If I have any disappointment (HB, sorry for being a prick, but at least spellcheck the thread title) in R520, it's in the number of transistors they used for SM 3.0. Granted, decent branching isn't cheap, but they crammed 48 shader processors and 32 total texturing units (16 filtered) into R500. I know that core doesn't include ROP units and all the fancy z/colour compression and Hi-Z, but those can't be that big given the size of R420.

For me that raises the question: how fast is R520's vertex texturing? I heard NV40/G70 sucks in that respect. We know ATI got dynamic branching right, so maybe they put big FIFOs in the vertex shader to speed up VT, and that's partly why it's so big.

I think my reaction is exactly as I predicted in that poll:
"That great, but when is R600 coming out?"

EDIT: Whoops, just read B3D's review. ATI doesn't have vertex texturing. I hope R600 comes out before 2007...
 
chavvdarrr said:
-Drivers. These are supposed to be already well-optimised (working silicon 1 year ago, etc.). Yet NV is the one that added a good boost in the dual-core scenario. I don't think that's a plus.
I find this quite funny. In what way do you think that multi-threaded drivers are only there for nVidia? May I remind you that ATi Tray Tools has a tweak in its advanced tweaks window for multi-threading. Does that not imply that ATi is at least busy with multi-threaded drivers? May I also remind you that ATi releases 12 official drivers each year, instead of the few from nVidia, and does not leak beta drivers at all. So, we'll see the ATi multi-threading feature when it is working, tested and released in an official build, instead of some leaked beta multi-threaded driver from nVidia.

And now back to something else. The only thing I am disappointed in is actually the reviews around. None of them even compared image quality. And the one at Anandtech shouldn't even be allowed to wear the nametag "review". When, oh when, will we finally get a review where benchmarks are done with both cards tuned to produce the same image quality? That is the only thing I find interesting.
 
trinibwoy said:
But you seem to ignore the possibility that street price = MSRP for several weeks/months after launch on these new parts.

That's not always the case... just because it happened that way last year for ATI does not mean it will be the same this time. Look how fast the 7800 prices fell from MSRP once they were out...
 
I am disappointed and will continue to be so until ATI gets the cards out in quantity. Last gen they never did at the high end, and the lack of competition kept prices stupidly high.

I want cut throat competition so I can afford one :p

Seriously though, unless the high end is widely available it is crap. We need downward pressure from the top to drive everything lower; without it, the mid-range cards creep up in price as well.
 
jb said:
That's not always the case... just because it happened that way last year for ATI does not mean it will be the same this time. Look how fast the 7800 prices fell from MSRP once they were out...

Very true. And it's all riding on availability.
 
tEd said:
Benchmarks are misleading IMO. Quality-wise ATI is superior; of course the numbers don't reflect that.

I've read 3-4 reviews, then I stopped reading. It's a waste of my time. If reviewers can't get their act together, why even make reviews?


I'm in the same boat as you. I am seeing many, many threads like this one because ATI doesn't have the lead in a stupid bar graph. Reviewers should have made it a strong point, beating into everyone's head that ATI is doing better than/just as good as/almost as good as the competing Nvidia card, is far more feature-heavy, and now has a substantial IQ lead.

HardOCP touched on HDR working with all the AA modes for a whole 3 sentences or something. Other places, most of them, completely ignored what looks better and just went for what's faster, and that was the whole point of the review. (I wonder why the Beyond3D review targets features first, taking up more space than most places' entire reviews/previews.)

I'm also getting a tad tired of the card receiving such a beating from people on forums when the GTX has well-matured drivers. Who knows what will change in the next 3 months. The changes from R420 to R520 are HUGE compared to NV40 to G70.

All in all I'm quite impressed. I'll take higher IQ/features over horsepower any day of the week. I am very confident the Cat team will fix any shortcomings in games where they can lead but aren't at the moment. It's obvious the R520 is very powerful. Now we just need to see some in stores.
 
I really don't see what the fuss is all about. The R520 series has gone from monster performance to true flop over the past few months, and all I've always been saying is "you can't go wrong with either product".

There is no need for doom and gloom. The R520 is a very competitive product, ATI assures us that there won't be an availability problem once it launches (of course, it's best to be cautious with such claims, but I give them the benefit of the doubt), and I believe that the "war" is far from over yet. Reviews in the upcoming weeks will highlight the strengths of the new series of cards, so I just recommend relaxing and waiting for some more "complete" reviews (although I admire the reviewers who managed to release their pieces on such short notice).

P.S. In retrospect, nobody "got" my signature all these months :(
 
trinibwoy said:
I don't think people are disagreeing with you. But you seem to ignore the possibility that street price = MSRP for several weeks/months after launch on these new parts.

There is no possibility that street prices will stay at MSRP for any significant amount of time; there would simply be no demand for X1300 Pro cards at 6600GT prices, X1600 XT cards at 6800 prices, or X1800 XL cards at 7800GTX prices.

And even if there were a possibility that street prices would stay at MSRP, it would still not be fair to compare street prices with MSRP - you have no way of knowing that that is what is going to happen.
 
From a pure technological standpoint, I'm not sure how anyone who isn't crazy could be disappointed.
HDR+AA, better FSAA across the board, way better AF, super efficient, etc.
X1800 XT = new leader.
X1800 XL = mixed views... atm it's a bit slower but has better quality.
 
Dave Baumann said:
Notebook versions aren't planned at the moment (I'm wondering if they are looking to R580 for that now). However, "Idle" implies Windows 2D, where NVIDIA clocks down but ATI doesn't on the desktop parts yet (Terry mentioned in his presentation that they will be soon, IIRC).

Well, there's M52 on "the map". I am assuming that is based on R520, not R580. I also understand January is pencilled in for a tour of the new part.
 
I sent this to Brent at [H] weeks ago and now to Ryan at PCP, but to get AA working with ATI boards for EQ2 you simply create an EQ2.ini file in the game's directory with this line: r_aa_blit 1.
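
For anyone following along at home, the whole file really is just that one line (a quick sketch; the directory is assumed to be wherever EQ2 is installed, e.g. next to the game's executable):

Code:
r_aa_blit 1

Save it as EQ2.ini, restart the game, and the AA modes become selectable.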
 
I'd say that it's incredibly unlikely that street prices of the X1300 Pro will stay anywhere near the MSRP for long. As all the reviews note, performance-wise it's a bit slower than the 6600GT (albeit with a few extra features), so the market will drive prices down very quickly.

This assumes ample supply of course, and everything we've heard so far indicates that this low-end part should be readily available in quantity very soon.

The market decides the price of any product - the release of the X800GT/GTO is entirely down to the fact that poor sales of the 'full' X800/X850 have left ATI with lots of chip inventory. If sales of the X1300 aren't as good as hoped, prices will drop more rapidly.
 
John Reynolds said:
I sent this to Brent at [H] weeks ago and now to Ryan at PCP, but to get AA working with ATI boards for EQ2 you simply create an EQ2.ini file in the game's directory with this line: r_aa_blit 1.
But John! Their "average gamer" would never know of such a thing or be capable of doing it! :oops:

(Sorry, just bitterness... I'll try and stop.)
 
John Reynolds said:
I sent this to Brent at [H] weeks ago and now to Ryan at PCP, but to get AA working with ATI boards for EQ2 you simply create an EQ2.ini file in the game's directory with this line: r_aa_blit 1.
He did try it and said performance was shitty.
We could not get AA to work by default in this game with an ATI card. To make AA work, we found that we had to create an .INI file called EQ2.INI. Inside this INI file, we had to add the line r_aa_blit 1. With this line added, we could then enable all AA modes. However, performance just wasn’t there to allow any of them with or without Adaptive AA in EQ2 at 1600x1200 in balanced quality mode. Even the HQ AF quality setting decreased performance to unplayable levels. Only 1600x1200 with no AA and 16XAF was playable. We also tried the game’s high quality mode, but it too was too slow. Either this game is very CPU limited or the X1800 XL video card just isn’t any faster than an X850 XT in EQ2.
 