Anand has the details about R520, RV530, RV515

The only thing I've heard for today (see the link upstream and the last line of this article) is the Ibiza shindig is going on starting today thru Sunday. So by now, probably Wavey and Rys and maybe a few others who usually hang out here have been initiated into the inner mysteries of R5xx and have a pretty good idea of their practical effect performance-wise.

I wonder what Faud does at these things since he doesn't sign NDA's and thus can't get in? He hangs out at the bar all day and waylays attendees on their way out the door?
 
geo said:
I wonder what Faud does at these things since he doesn't sign NDA's and thus can't get in? He hangs out at the bar all day and waylays attendees on their way out the door?

That's probably what he does... and think about it... it sounds like a heck of a lot more fun than being stuck in a cube farm for a living (like me) :)
 
http://www.hardwareanalysis.com/content/article/1817/

... ATI will launch with immediate availability a number of graphics cards based on their new architecture. The top-of-the-line Radeon X1800XT will be available as of the 5th of November shipping at a 625MHz core and 1500MHz memory frequency. Offered with 256MB and 512MB of GDDR3 memory it will be priced at $499 and $549 respectively. The Radeon X1800XL will be available on October 5th and offers a 500MHz core and 1GHz memory frequency and comes with 256MB of GDDR3 memory with a price tag of $449. All of the Radeon X1800 graphics cards utilize the full 16-pipelines, they’re just different in terms of memory size and clock speed or both in the case of the Radeon X1800 XL....

... The mid-range Radeon X1600XT ships with either 128MB or 256MB of memory, clocks in at a 590MHz core and 1.38GHz memory frequency and can be bought starting November 30th for respectively $199 and $249. All of the Radeon X1600 graphics cards will have to make do with just 12-pipelines. The low-end Radeon X1300 will be offered in three different versions, the $149 Radeon X1300 Pro with 256MB of memory and 600MHz core and 800MHz memory clocks. The $99 and $129 Radeon X1300 with 450MHz core and 500MHz memory clocks and 128MB or 256MB of memory respectively. And closing the ranks is the Radeon X1300 HyperMemory at $79 with 32MB HyperMemory and 450MHz core and 1GHz memory clocks....

... The Radeon X1300 Pro and X1300 will be available on October 5th. Unfortunately no dates have been given for the Radeon X1800 CrossFire Edition or the Radeon X1600 CrossFire Edition. What is clear, though, is that the Radeon X1300 will not feature a CrossFire Edition card, it’ll utilize the PCIe bus for communication. Both the Radeon X1800 and X1600 CrossFire Edition will support resolutions up to 2048x1536 pixels with a 70Hz refresh, negating some of the limitations of CrossFire on the Xx00 platform....

... That concludes our first look at the R520 architecture and the X1000 series cards that will be available at launch. As for performance we’ll have to wait for the actual launch. If ATI is to be believed the Radeon X1800XT consistently outperforms the GeForce 7800GTX in all games, sometimes by as much as 80%...
 
nagus said:
... That concludes our first look at the R520 architecture and the X1000 series cards that will be available at launch. As for performance we’ll have to wait for the actual launch. If ATI is to be believed the Radeon X1800XT consistently outperforms the GeForce 7800GTX in all games, sometimes by as much as 80%...

I can think of no general scenario where 80% will happen, given the specs. Perhaps in some very, very specific synthetic tests, but aside from that no way.

Of course, whatever happens, it's pretty funny for Sassen to be writing this after he spent the better part of a week trashing the R520 and ATi, with his "benchmarks" and other comments. How times change.
 
PaulS said:
I can think of no general scenario where 80% will happen, given the specs. Perhaps in some very, very specific synthetic tests, but aside from that no way.

I'm thinking some kind of dynamic branching scenario, or possibly something HDR-ish (but at lower precision, which may or may not make a visible IQ difference)
 
geo said:
:oops:

I wonder just how special "greatest left-handed shortstop from the Dominican Republic" those circumstances are?

Yeah, I don't like such percentages. It could be 180 vs 100 fps, or 10 vs 18 fps. But if the R520 averages 30-50% faster, my GTX is history.
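To put numbers on that (hypothetical figures, purely to illustrate why the same percentage can matter very differently):

```python
# Hypothetical numbers: an "80% faster" headline means very different
# things depending on the baseline frame rate.
def frame_time_ms(fps):
    """Milliseconds spent rendering each frame at a given frame rate."""
    return 1000.0 / fps

for base_fps in (100, 10):
    fast_fps = base_fps * 1.8  # the claimed 80% advantage
    saved = frame_time_ms(base_fps) - frame_time_ms(fast_fps)
    print(f"{base_fps} -> {fast_fps:.0f} fps: {saved:.1f} ms saved per frame")
```

The same 80% gain saves about 4.4 ms per frame at 100 fps but roughly ten times that at 10 fps, which is why the raw fps pair matters more than the percentage.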
 
No XT till November? :cry: I hope that isn't true, it's the card I wanted and the thought of waiting another month for it doesn't please me.
 
hardwareanalysis.com said:


... The mid-range Radeon X1600XT ships with either 128MB or 256MB of memory, clocks in at a 590MHz core and 1.38GHz memory frequency and can be bought starting November 30th for respectively $199 and $249. All of the Radeon X1600 graphics cards will have to make do with just 12-pipelines. The low-end Radeon X1300 will be offered in three different versions, the $149 Radeon X1300 Pro with 256MB of memory and 600MHz core and 800MHz memory clocks.

Typo? Doesn't make ANY sense.
 
Did anyone read the latest DoD article on Anandtech? There was this little tidbit that may be where the 80% comes from:
One of the easiest ways to implement HDR from scratch is to use a floating point format with all art assets designed around HDR. Unfortunately, current hardware isn't able to handle full floating point data as fast as other methods, and no hardware (that is currently out) can allow MSAA to run on a floating point render target.

:D
 
PaulS said:
I can think of no general scenario where 80% will happen, given the specs. Perhaps in some very, very specific synthetic tests, but aside from that no way.

In Call of Duty 2 with AF+AA, 512MB of video memory gives a major boost. I would not be terribly surprised if an R520 with 512MB outperformed a GTX with 256MB by a very large margin in that game.

(It seems there will be a 512MB GTX very shortly, so the advantage in COD2 is most likely gone before the R520 is available.)
 
trinibwoy said:
But if the R520 averages 30-50% faster my GTX is history.

If you really mean "averages" in the sense of over a broad range of titles, I'm guessing your GTX is safe. But then who knows what feature goodies they might tempt you with? On the other hand, if it does that range in BF2 and CoD2, some people might begin to twitch while you're still pointing at smaller gains in older titles as unimpressive. :LOL:
 
Headstone said:
Did anyone read the latest article of DoD on Anandtech? There was this little tidbit that may be where the 80% comes from;


:D

Well no, that can't be it, because the G70 would have to be capable of HDR+AA in order for the comparison to be made. And it can't refer to DoD, since Valve's HDR is compatible with AA on both Nvidia's and ATi's cards.
 
geo said:
If you really mean "averages" in the sense of over a broad range of titles, I'm guessing your GTX is safe. But then who knows what feature goodies they might tempt you with? On the other hand, if it does that range in BF2 and CoD2, some people might begin to twitch while you're still pointing at smaller gains in older titles as unimpressive. :LOL:

I mean averages in the games I currently play - WoW, BF2, F.E.A.R., etc. What led you to the assumption that I would care about the performance of the G70/R520 in older titles? I'm not one of these bionic people who can detect the difference between 90 and 120 fps :)
 
trinibwoy said:
Well no that can't be it, cause the G70 would have to be capable of HDR+AA in order for the comparison to be made. Now this can't refer to DoD since Valve's HDR is compatible with AA on both Nvidia's and ATi's cards.
But maybe it refers to AA on HDR in general, not specifically floating-point HDR.
 
Sunrise said:
Typo? Doesn't make ANY sense.

Why not? A 590MHz RV530 will be way faster than an RV515 at 600MHz. The RV515 is also a simpler design with fewer transistors, so it makes sense that it can clock faster than the RV530.

Edit: should have read the rest of the thread - now I'm just repeating what others have already written.

Edit2: If the X1300 Pro is clocked at 600/400MHz, it could mean that the MSI X1300 scores posted are from a Pro card and not an LE (450MHz core) card as many (including myself) thought - that changes the conclusions about the efficiency increase completely.
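A quick sketch of that clock-normalization point (the score value is made up; only the 450MHz LE vs 600MHz Pro clocks come from the thread):

```python
# Made-up benchmark score; the point is only the clock ratio.
posted_score = 2400

# Score per MHz of core clock, under the naive assumption that
# performance scales linearly with core clock.
for name, core_mhz in (("X1300 LE", 450), ("X1300 Pro", 600)):
    print(f"{name}: {posted_score / core_mhz:.2f} points per MHz")
```

The same raw score looks about 33% less "efficient" per clock if it turns out to have come from a 600MHz Pro instead of a 450MHz LE, which is why the card identification matters for any per-clock comparison.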
 