RXXX Series Roadmap from AnandTech

Dave said that RV515 and R520 made one pair, and RV530 and R580 were the other pair. In effect.

Jawed
 
Unknown Soldier said:
Ye . .I was thinking the same thing now.

[tongue-in-cheek] If you calculate the difference between 580 and 520 you get 60. We know now that the RV5-series are 1/4 size versions (essentially) of the big chips. 1/4 of 60 = 15. And 15 + 515 is.... [/tongue-in-cheek]

That's about the best you'll get out of me today :LOL:
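For anyone who doesn't want to do the sum in their head, the tongue-in-cheek arithmetic above works out like this (plain Python, just spelling out the joke):

```python
# Rys's naming arithmetic, step by step
diff = 580 - 520      # gap between the two big-chip model numbers
quarter = diff // 4   # the RV5-series chips are (essentially) 1/4 of the big chips
print(515 + quarter)  # 515 + 15 = 530, i.e. RV530
```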
 
Rys said:
[tongue-in-cheek] If you calculate the difference between 580 and 520 you get 60. We know now that the RV5-series are 1/4 size versions (essentially) of the big chips. 1/4 of 60 = 15. And 15 + 515 is.... [/tongue-in-cheek]

That's about the best you'll get out of me today :LOL:

I'd say "Louis Farrakhan, is that you?", but I've seen your pic, and you're as whitebread as it gets. :)
 
Jawed said:
It seems that R520 was just the first cut at a shader array architecture for the PC space (that should have been in the market since May...) and to keep it simple ATI designed it with just 16 pipes, making for a relatively small core - relying upon the increased utilisation brought about by the new scheduler.

Hmm. Another long-term issue in the "snowball rolls downhill problem" department. . .one assumes that six months or more (June) of real-world R520 experience by thousands of users would have resulted in a few tweaks to R580 here and there. They've mostly lost that advantage now.
 
Uttar said:
Well, if the R520XL is 500/500, I doubt that needs another respin. So they could release the R520XL earlier than the R520XT, with the XT only being produced on the respin. The idea there would probably be that the R520XL is a reply to 7800GT(X), and the R520XT to the 7800 Ultra, if there will be such a part.

Hmm, I'd been assuming they were stockpiling fast ones, but your theory (that it's actually a separate spin) makes more sense of their somewhat odd release schedule, particularly if the R520XL doesn't have at least the kind of performance advantage over the 7800GT that the X800 XT PE had over the 6800 Ultra. To me, the name already says it won't.

As for overclocking, while that would be true for the first ones, one assumes if you wait a little bit that they'll switch the XL to the new spin rather than keep producing two ASICs. . . which will create a bit of bitterness amongst early adopters.
 
DaveB - I hope that in the near future you can do another interview with ATI's Dave Orton, around the time that Xbox 360/Xenos and R520 launch. I am curious to see what Orton has to say about ATI's roadmap, compared to what he said over a year ago in the last interview :)
 
I'd like to see a year-on follow up to this line of discussion:

At the end of the day though, is there really the desire to continue with that – is the drive there to keep pushing that type of model?

There’s always the debate of who steps down first. I think what’s going to happen is we’re going to hit a power limit, so through other innovations and technologies we have to manage efficiencies. And then I’ve heard some, tongue in cheek, talk that NVIDIA isn’t counting die-per-wafer, but wafers-per-die, and whenever this is the case you’ve certainly crossed a threshold!

***

As you say, there’s always trade-offs. There’s the trade-off of performance and die size. The answer is yes we could – the die size would have been fairly large, to the point where we weren’t sure how produceable it would be in 130nm, and we didn’t think that 90nm was really going to be there for ’04 production. Now, NVIDIA has put something in 130nm whose die size is 10-15% bigger, and there’s still some understanding we have to get on their architecture.

And if he'd be willing to say anything about the wild range of rumors re R520. . . :)
 
:LOL: only saw this now

[00:06am] Rys: Link. Don't ever say I'm not good to you
[00:25am] Matthew @ ATI: dude, how did you get a picture of my girlfriends?

Nice one Rys :D

I'm hoping the R520 hits at least the low 10000s in 3DMark05.

The R580 I hope hits 11500.
 
Dave Baumann said:
They need to design new boards for it, and with the way things are they are probably working on a fairly compressed schedule. R5xx was probably too late in the design phase to make changes for any different silicon to support two boards, so they probably still need the composite chip. However, they are not saddled with the board issues, so there could be changes there - personally I think they should dump the master concept and, like the Alienware solution, put the composite chip on a separate board that could be sold for $50 or so, and then have internal connectors on the R5xx boards that connect to the composite board. Doing it this way would mean that vendors only have to carry one extra, cheap SKU (the composite board) rather than multiple "Master" boards, and gives the end user greater flexibility in maximising their particular setup.

I've suggested that to them and have yet to get a satisfactory explanation for not doing it that way.

100% agreed. How can ATi not see that they are majorly dropping the ball and introducing a much inferior product relative to Nvidia? Can't they see that they're allowing themselves to be owned by Nvidia's marketing, since Nvidia's SLI solution doesn't require Master/Slave product lines?

I'm all for the extra board, like Alienware's solution. Even better if they could build the add-in board to do surround gaming, as in Matrox's multi-screen gaming. Hell, they could turn this negative into an actual positive and have some genuine marketing spin.

But hey, what am I thinking. Even with all the geniuses at ATi, I doubt anyone will be brazen enough to do this. Of course we are going to see ATi ship Master/Slave solutions.

They will bring out Crossfire editions of their boards that aren't tried and tested.
They will bring out Master editions of their cards which will carry a premium.
They will bring out early drivers which will undoubtedly carry bugs in enabling Crossfire.

Well done, ATi: this could have been a delayed product, but one with many bells and whistles and ultimately a better product. Instead, it's looking more and more like just a delayed product, met with a "so what" from the enthusiasts and, more importantly, the general public.
 
Unknown Soldier said:
:LOL: only saw this now

[00:06am] Rys: Link. Don't ever say I'm not good to you
[00:25am] Matthew @ ATI: dude, how did you get a picture of my girlfriends?

Nice one Rys :D

I'm hoping the R520 hits at least the low 10000s in 3DMark05.

The R580 I hope hits 11500.
Who cares what 3DMark score it gets.
I just hope it has a shader engine at least as powerful as the 7800GTX's, and that by the time it's out they've "fixed" OGL performance.
 
radeonic2 said:
... and by the time it's out they've "fixed" OGL performance.
I believe ATI's came out and said they're not going to focus on OpenGL performance as long as it's a smaller portion of the market.
 
Chalnoth said:
I believe ATI's came out and said they're not going to focus on OpenGL performance as long as it's a smaller portion of the market.
Link?
That's rather short-sighted imo.
So they just want to be known for good D3D drivers and damn OGL?
 
radeonic2 said:
Link?
That's rather short-sighted imo.
So they just want to be known for good D3D drivers and damn OGL?
Sorry, it's been a long time since I've seen it, so just take it as hearsay.
 
Chalnoth said:
Sorry, it's been a long time since I've seen it, so just take it as hearsay.
Alrighty then.
I just think it would be stupid, with Doom 3-engined games coming out as well as other OGL games.
 
Yeah, Prey, a D3-engined game, is a GITG title. Or at least it was on a trailer I've got somewhere on my hard disk... I think it was being shown on ATI stands at various shows this spring, too, including E3.

Jawed
 
radeonic2 said:
Alrighty then.
I just think it would be stupid, with Doom 3-engined games coming out as well as other OGL games.
Well, bear in mind that one can optimize for a single game or engine without worrying too much about games in general. We'll see.
 
So what's everyone's performance expectation for the R520XL? I'm guessing that in general (non-HDR) it will fall between a stock GT and a stock GTX, at a GT-like price - that they're aiming to do to NV with it what NV did to them with the 6800GT last time.
 