RV560/570 Gemini roadmap

Anything. What you might find, though, is that if the UMC version (I pick that because they appear to generally pitch themselves at the lower end of the performance scale) ends up lower cost and lower performance, then it will also appear more in 128-bit configurations.
 
I'm just asking because I don't see, for the time being, a cost-effective reason for more than 128-bit buses in the mainstream segment.
 
These are likely to be on the cusp of the "mainstream" and (the more recently created) "performance-mainstream" segments. I'll wager they'll end up around the $200-$300 price bracket, like the X800 XL.
 
Ailuros said:
I'm just asking because I don't see, for the time being, a cost-effective reason for more than 128-bit buses in the mainstream segment.
Implying that you know the cost of a 256- vs a 128-bit interface.
Please share.
 
Can't get rid of RV530 with all those OEM design wins they've racked up, especially in the notebook space. I don't see any reason they'll need to change the name when it migrates to 80nm, unless they want to boost the clocks a bit.
 
kemosabe said:
Can't get rid of RV530 with all those OEM design wins they've racked up, especially in the notebook space. I don't see any reason they'll need to change the name when it migrates to 80nm, unless they want to boost the clocks a bit.
I can't see why they'd still need that part. Performance isn't that much higher than RV515's (compare X1600 Pro vs. X1300 Pro), so for the low end RV515 should be fine (or a successor to it?). And it doesn't compete too well with anything unless it's sold at the same price point as the X1300 Pro... So an RV560 (if it's something like 8-1-2-2 or whatever, which should have only a moderately larger die size / higher manufacturing cost) should replace it very easily with much better performance.
 
I'm certainly not disputing that it's not a stellar performer - that has been well established. But the fact is that notebook OEMs usually have long design cycles so ATI will keep pumping them out. Of course with the advent of MXM/Axiom and other modular designs, there should be more incentive for OEMs to upgrade their GPU specs faster so RV560 could be a good replacement. I think RV570 will be aimed primarily at the retail market where ATI has flopped repeatedly in the mid-range segment.
 
Entropy said:
Implying that you know the cost of a 256- vs a 128-bit interface.
Please share.

Dave's answer about his own pricing estimate makes it quite obvious what I was aiming at.
 
kemosabe said:
I'm certainly not disputing that it's not a stellar performer - that has been well established.

Context is all, isn't it? I don't think it was a bad performer for four months... in its weight class (i.e. die size). That the competition made good and sure that it was fighting (price-wise) out of its weight class for that time (until last week) is at least as much to its credit as anything.
 
geo said:
Context is all, isn't it? I don't think it was a bad performer for four months... in its weight class (i.e. die size). That the competition made good and sure that it was fighting (price-wise) out of its weight class for that time (until last week) is at least as much to its credit as anything.

I don't know, geo... the preponderant reaction to RV530 from the very start was that of a let-down, performance-wise, at the ~$200 price point, notwithstanding the fact that it generally stayed ahead of the 6600GT. In the end, all the talk about the "architecturally interesting" 4-1-3-2 concept just didn't translate into the kind of mid-range punch that was anticipated. ATI hasn't hit a home run in this segment since the 9500 Pro, and even that was a circumstantial affair. RV570 should be a great part, but if it arrives on the eve of the G80 generation it'll be as exciting as a bunt single. :neutral:
 
kemosabe said:
RV570 should be a great part, but if it arrives on the eve of the G80 generation it'll be as exciting as a bunt single. :neutral:
NVIDIA have let it be known that they're aiming for the 7300/7600 to last as long as the 6200/6600 did, i.e. 18 months, with a minimum lifespan of 12 months, so I wouldn't count on lower-end G80 parts until 2007.
 
Fodder said:
NVIDIA have let it be known that they're aiming for the 7300/7600 to last as long as the 6200/6600 did, i.e. 18 months, with a minimum lifespan of 12 months, so I wouldn't count on lower-end G80 parts until 2007.
There will be plenty of room between the 7600 GT and high-end G80 for a value high-performance part, a successor to the 7900 GT.
 
Ailuros said:
Dave's answer about his own pricing estimate makes it quite obvious what I was aiming at.
Or, to put it another way: you don't know the cost differential between a 256-bit and a 128-bit bus.
That's OK, I don't know either. That's why I asked.
The question I was trying to answer for myself was whether it wouldn't make a lot of sense, from a product positioning perspective, to support GDDR3 and GDDR4 rather than both 128-bit and 256-bit interfaces, making it supremely easy to provide two different SKUs from one chip. GPU chip, packaging and, as far as I can remember, board design could all remain the same, making product supply balancing painless.

Performance would probably suffer a bit vs. a 256-bit interface.
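To put rough numbers on that, here's a quick back-of-envelope sketch in Python; both clock figures are illustrative assumptions of mine, not specs from anyone's roadmap:

```python
# Peak memory bandwidth: (bus width in bytes) x (effective data rate).
# Both clock figures below are hypothetical, chosen only to illustrate.
def bandwidth_gb_s(bus_width_bits: int, effective_mhz: float) -> float:
    return (bus_width_bits / 8) * effective_mhz / 1000  # GB/s

print(bandwidth_gb_s(128, 2200))  # 128-bit + assumed 2.2 GHz GDDR4 -> 35.2 GB/s
print(bandwidth_gb_s(256, 1400))  # 256-bit + assumed 1.4 GHz GDDR3 -> 44.8 GB/s
```

Even with the GDDR4 clocked well above the GDDR3, the narrower bus still trails, hence the expected hit.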

So as I was thinking about this, the economic side of the equation required some sort of handle on the incremental cost of GDDR4 vs. the incremental cost of a 256-bit bus for chip, packaging and PCB. I lacked useful data. My estimate was that the cost differential was probably small and in favour of GDDR4, and definitely so over time (GDDR4 isn't really more expensive to make than GDDR3; supply is the only issue). For all the talking that has been going on here for years, I have never seen bus-width cost quantified, but it seems to me that someone in the industry who is familiar with chip packaging and PCB cost should be able to make a good estimate. (My own could be wildly off base and is effectively useless.) I thought you might have spoken to such an insider.
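For what it's worth, here is the shape of the comparison I had in mind, sketched in Python; every dollar figure is a placeholder I invented (precisely the data I said I lack), so it shows the structure of the trade-off rather than its answer:

```python
# Toy incremental-cost model per SKU. All dollar figures are invented
# placeholders; the point is the structure, not the numbers.
def sku_cost(bus_overhead: float, n_mem_chips: int, price_per_chip: float) -> float:
    # bus_overhead: extra chip pads, package balls and PCB layers/routing
    # for a wider interface; zero if the 128-bit design is simply reused.
    return bus_overhead + n_mem_chips * price_per_chip

# Option A: 256-bit bus with commodity GDDR3 (assume 8 chips either way).
cost_a = sku_cost(bus_overhead=12.0, n_mem_chips=8, price_per_chip=5.0)
# Option B: 128-bit bus reused as-is, with GDDR4 at a supply premium.
cost_b = sku_cost(bus_overhead=0.0, n_mem_chips=8, price_per_chip=6.5)
print(cost_a, cost_b)  # 52.0 52.0 -- a wash with these made-up inputs
```

With these made-up inputs the two options come out a wash; if the GDDR4 premium erodes over time while the packaging/PCB overhead does not, the balance tips toward GDDR4, which was my guess.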
 
EasyRaider said:
There will be plenty of room between the 7600 GT and high-end G80 for a value high-performance part, a successor to the 7900 GT.
True, but remember that this time round that slot was filled by the last-gen 6800GT/GS until the new midrange was ready, 9 months later. I suspect that bracket will continue to be filled by the 7900GT or equivalent until at least early next year.
 
Fodder said:
True, but remember that this time round that slot was filled by the last-gen 6800GT/GS until the new midrange was ready, 9 months later. I suspect that bracket will continue to be filled by the 7900GT or equivalent until at least early next year.
So you expect NVidia to leave the midrange DX10 market untapped until next year? I think that would be a mistake. The 6800 did not have a large feature-set disadvantage; the 7900 will.
 