Jawed said:
Dave said that RV515 and R520 made one pair, and RV530 and R580 were the other pair. In effect.
Unknown Soldier said:
Yeah... I was thinking the same thing now.
Rys said:
[tongue-in-cheek] If you calculate the difference between 580 and 520 you get 60. We know now that the RV5-series are (essentially) 1/4-size versions of the big chips. 1/4 of 60 = 15. And 515 + 15 is.... [/tongue-in-cheek]
That's about the best you'll get out of me today.
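For what it's worth, the numerology checks out; a throwaway sketch of Rys's arithmetic, assuming nothing beyond the codenames he names:

[code]
# Tongue-in-cheek check of Rys's codename arithmetic.
big_chips = {"R520": 520, "R580": 580}

# The RV5xx parts are (essentially) quarter-size versions of the big chips.
step = (big_chips["R580"] - big_chips["R520"]) // 4  # (580 - 520) / 4 = 15

rv515 = 515
print(f"RV{rv515} + {step} = RV{rv515 + step}")  # -> RV530
[/code]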
Jawed said:
It seems that R520 was just the first cut at a shader-array architecture for the PC space (one that should have been on the market since May...), and to keep it simple ATI designed it with just 16 pipes, making for a relatively small core and relying on the increased utilisation brought about by the new scheduler.
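A toy model may help with the utilisation point. The cycle counts below are made up for illustration and are not R520's real figures, but they show why keeping more batches in flight lets a scheduler hide texture-fetch latency and keep a small number of pipes busy:

[code]
# Toy model of why a threaded scheduler raises pipe utilisation.
# Numbers are illustrative, not R520's actual figures.
ALU_CYCLES = 4        # cycles of useful math per batch, per pass
FETCH_LATENCY = 96    # cycles a batch waits on a texture fetch

def utilisation(batches_in_flight: int) -> float:
    """Fraction of cycles a pipe does useful work when it can
    switch to another ready batch while one waits on memory."""
    busy = batches_in_flight * ALU_CYCLES
    # The pipe idles only if the other batches' work can't cover the fetch.
    idle = max(FETCH_LATENCY - (batches_in_flight - 1) * ALU_CYCLES, 0)
    return busy / (busy + idle)

for k in (1, 4, 16, 25):
    print(f"{k:3d} batches in flight -> {utilisation(k):.0%} utilisation")
[/code]

With one batch the pipe sits idle for almost the whole fetch; with enough batches in flight the stall disappears entirely, which is the utilisation win Jawed is pointing at.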
Uttar said:
Well, if the R520XL is 500/500, I doubt it needs another respin, so they could release the R520XL earlier than the R520XT, with the XT only being produced on the respin. The idea would probably be that the R520XL is a reply to the 7800 GT(X), and the R520XT to the 7800 Ultra, if there is ever such a part.
At the end of the day though, is there really the desire to continue with that – is the drive there to keep pushing that type of model?
There's always the debate of who steps down first. I think what's going to happen is we're going to hit a power limit, so through other innovations and technologies we'll have to manage efficiencies. And I've heard some tongue-in-cheek talk that NVIDIA isn't counting dies-per-wafer but wafers-per-die, and when that's the case you've certainly crossed a threshold!
***
As you say, there are always trade-offs. There's the trade-off of performance and die size. The answer is yes, we could; the die size would have been fairly large, to the point where we weren't sure how producible it would be at 130nm, and we didn't think that 90nm was really going to be there for '04 production. Now, NVIDIA has put something in 130nm whose die size is 10-15% bigger, and there's still some understanding we have to get on their architecture.
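The wafers-per-die quip is just the limit of the usual first-order die-per-wafer estimate, and it also puts a number on what a 10-15% bigger die costs. A rough sketch; the wafer size and die areas below are illustrative, not any specific chip's:

[code]
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """First-order estimate of gross dies on a round wafer; the
    second term roughly corrects for partial dies at the edge."""
    d, a = wafer_diameter_mm, die_area_mm2
    return int(math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a))

# A 10-15% die-size increase costs roughly this many candidate dies:
for area in (280, 310, 320):  # mm^2, purely illustrative
    print(f"{area} mm^2 on a 300 mm wafer -> ~{dies_per_wafer(300, area)} gross dies")
[/code]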
Dave Baumann said:
They need to design new boards for it, and the way things are they are probably working on a fairly compressed schedule. R5xx was probably too late in the design phase to make changes for any different silicon to support two boards, so they probably still need the composite chip. However, they are not saddled with the board issues, so there could be changes there.

Personally, I think they should dump the master concept and, like the Alienware solution, put the composite chip on a separate board that could be sold for $50 or so, and then have internal connectors on the R5xx boards that connect to the composite board. Doing it this way would mean that vendors only have to carry one extra, cheap SKU (the composite board) rather than multiple "Master" boards, and it gives the end user greater flexibility in maximising their particular setup.
I've suggested that to them and have yet to hear a satisfactory explanation for not doing it that way.
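The SKU arithmetic behind Dave's suggestion is easy to see; a quick sketch, with a made-up card line-up standing in for whatever the real one turns out to be:

[code]
# Rough SKU count for the two board approaches Dave describes.
# The card line-up here is invented purely for illustration.
cards = ["R520 XT", "R520 XL", "R520 Pro"]

# Master-board model: every card needs a matching "Master" variant.
master_skus = cards + [c + " Master" for c in cards]

# Separate composite board: one cheap extra SKU, any two cards pair up.
composite_skus = cards + ["Composite board (~$50)"]

print(len(master_skus), "SKUs with masters:", master_skus)
print(len(composite_skus), "SKUs with a composite board:", composite_skus)
[/code]

The gap only widens as the line-up grows: masters add one SKU per card model, the composite board adds one SKU total.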
Unknown Soldier said:
Only saw this now.

[00:06am] Rys: Link. Don't ever say I'm not good to you
[00:25am] Matthew @ ATI: dude, how did you get a picture of my girlfriends?

Nice one Rys.

I'm hoping the R520 hits at least the low 10,000s in 3DMark05. The R580 I hope hits 11500.

Who cares what 3DMark it gets.
radeonic2 said:
... and by the time it's out they've "fixed" OGL performance.

I believe ATI came out and said they're not going to focus on OpenGL performance as long as it's a smaller portion of the market.
Chalnoth said:
I believe ATI came out and said they're not going to focus on OpenGL performance as long as it's a smaller portion of the market.

Link?
radeonic2 said:
Link? That's rather short-sighted, imo. So they just want to be known as having good D3D drivers and damn OGL?

Sorry, it's been a long time since I've seen it, so just take it as hearsay.
Chalnoth said:
Sorry, it's been a long time since I've seen it, so just take it as hearsay.

Alrighty then.
radeonic2 said:
Alrighty then. I just think it would be stupid, with Doom 3-engined games coming out as well as other OGL games.

Well, bear in mind that one can optimize for a single game or engine without worrying too much about games in general. We'll see.
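To illustrate what optimising for a single engine means in driver terms, here is a caricature: tuned settings keyed off the detected application rather than a faster OpenGL path in general. The executable names and flags are hypothetical, not ATI's actual profile mechanism:

[code]
# Caricature of per-application driver profiles: the driver picks
# tuned settings for a known executable instead of improving the
# OpenGL path across the board. Names and flags are hypothetical.
PROFILES = {
    "doom3.exe":  {"shader_replacements": True, "gl_fast_path": True},
    "quake4.exe": {"shader_replacements": True, "gl_fast_path": True},
}

DEFAULTS = {"shader_replacements": False, "gl_fast_path": False}

def settings_for(app: str) -> dict:
    """Per-game profile if the app is recognised, generic path otherwise."""
    return {**DEFAULTS, **PROFILES.get(app.lower(), {})}

print(settings_for("Doom3.exe"))        # tuned: the "fixed" OGL performance
print(settings_for("homeworld2.exe"))   # everything else: generic path
[/code]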
Chalnoth said:
Well, bear in mind that one can optimize for a single game or engine without worrying too much about games in general. We'll see.

Indeed we shall.