AMD: R9xx Speculation

Computex really is a funny place. :D I've seen rumors change from one thing to another in the last few days. Yesterday was also funny with the rumors about the EVGA mainboard team (leaving to Sapphire etc.). And yes, I can concur that there are some very persistent rumors floating around here which say that SI does not exist. At least, not yet. So I decided to visit the neighbours (the AMD booth is next to the MSI booth) and talk to a guy, but he basically gave me a "wtf are you talking about" look, then changed it to a stern one while saying "I'm not in the position to talk about future products". But there seems to be a lot of smokescreening going on around here. So anything could be true. ;)
 

Reading between the lines: "I saw an SI card."
Is that correct? :D
 
So is ATI doing anything to press the advantage, or just sitting around giving each other high-fives while Nvidia works to catch up?
 
From the sounds of some of the rumors, I would say they are working pretty hard. What it is and when we will see it is still unknown, but most of the info points towards later this year for at least something.
 
Heck, if they can't get more wafer allocation for 40 nm and demand for the 5xxx series remains higher than supply, this may be one of those rare situations where they are faced with either retiring a line while it still has greater demand than supply, or delaying the introduction of a new chip until demand starts to go down.

It's an odd situation to be in. I can't think of any time in the past where a graphics company has had to make that choice.

From an economic standpoint it doesn't make sense to release a new product if you still cannot meet demand for the product it is meant to replace. There is nothing to gain, especially if you assume that the new chip is larger and thus yields fewer chips per wafer than the product it is replacing. Increasing demand for your product even more isn't going to benefit you financially, and if you have a fixed supply of wafers, then with a larger chip you'd actually lose money despite generating more demand.

Yet at the same time you have to be careful that you don't let the competition get a leg up on you while in this situation.
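The arithmetic behind that argument can be sketched with a toy model. All the numbers below are invented purely for illustration (they are not actual die sizes, yields, or prices):

```python
# Toy model: with a fixed wafer allocation, a larger replacement die yields
# fewer chips, so revenue falls at the same price unless the new chip's
# selling price rises enough to compensate. All figures are hypothetical.

def dies_per_wafer(wafer_area_mm2, die_area_mm2, yield_rate):
    """Crude estimate of usable dies per wafer (ignores edge losses)."""
    return int(wafer_area_mm2 / die_area_mm2 * yield_rate)

WAFER_AREA = 70_685   # ~300 mm wafer: pi * 150^2 mm^2
WAFERS = 1_000        # fixed allocation for the period

old_chips = dies_per_wafer(WAFER_AREA, 334, 0.6) * WAFERS  # current die
new_chips = dies_per_wafer(WAFER_AREA, 400, 0.6) * WAFERS  # larger successor

old_revenue = old_chips * 400   # assumed $400 ASP for the current product
new_revenue = new_chips * 400   # same price: fewer dies, less revenue

# Price the new chip would need just to match the old revenue:
break_even_price = old_revenue / new_chips
```

With these made-up numbers the larger die loses roughly 16% of the chip output, so the replacement would need to sell noticeably above $400 just to stand still.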

Regards,
SB
 

If the new, larger chip is significantly faster than the one it replaces, then you can sell it for significantly more. Therefore it may make economic sense.

That said, I hope that by October/November, supply problems will be essentially solved.
 

If your new product is that much better, you replace the demand for your old product with the demand for your new one. You even get to sell a new product to all the people who just bought your last product, and ideally continue to dominate the market and your competitors while they are struggling to catch up.

I suspect it's better to be unable to supply all the people who want your product than to supply lots of product that no one wants.
 
Only if:
a) you are not supply-constrained, otherwise even if you manage to keep a margin that offsets the cost, it will be at the expense of market share;
b) your margin is sufficiently larger to compensate for the higher cost of production;
c) you sell as many of these higher-priced SKUs as of the lower-cost alternative - no chance of that, so...
d) your margin has to be higher still; and
e) you hopefully have yields comparable to your previous product, or you need to hike prices again to compensate.
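Those conditions compound, which a quick sketch makes concrete (every number here is invented; the point is only how the required price stacks up):

```python
# Hypothetical illustration of the conditions above: a larger chip costs
# more per die, sells in lower volume, and may yield worse, so its price
# must rise on all three axes just to match the old product's total profit.

def total_profit(units, price, cost_per_die):
    return units * (price - cost_per_die)

# Baseline product: assumed 100k units at $400 with a $120 die cost.
old_profit = total_profit(100_000, 400, 120)

# Larger successor: fewer sellable units (supply-constrained), and a
# higher per-die cost from the bigger area and assumed worse yields.
new_units = 80_000
new_cost_per_die = 170

# Price the new chip needs for its profit to merely equal the old one's:
required_price = old_profit / new_units + new_cost_per_die
```

With these made-up figures the successor must sell for $520 just to break even against the $400 part, before accounting for any market-share effects.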

If I ran AMD, I would not spend resources introducing completely new products for 40nm at TSMC. That market battle is already under control, so the possible gain to offset cost and risk is quite limited, and the window for introducing new products at that node is closing. Being the first out of the gate with a set of well designed products for the next lithographic process has to be considered far and away more important, sunk costs notwithstanding. If pulling engineering resources from a potential 40nm update to work on the next node's products instead helps with getting those to market, the choice seems simple.
 

Yes, but you will end up losing money if...

1. Demand for the prior product means you'd sell out anyway, because you cannot supply enough chips/cards

And

2. The chips are bigger thus you get even fewer chips from each wafer.

So you'd end up with a situation where you've not only raised demand but you've also lowered your supply of chips.

You'll end up selling fewer cards/chips. In both cases - old card or new card - demand was high enough for you to sell every single chip you made, because you could never get enough supply to meet demand.

As someone said, you "could" just raise the price in order to lower demand, and you might end up making the same or close to the same as if you had never introduced another product, but that's also a risky proposition. Raise the price too much and demand might drop far more sharply than expected, and you end up losing more money than if you had just kept the price the same.
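That pricing risk can be sketched with a made-up linear demand curve (both the shape and the numbers are purely illustrative):

```python
# Hypothetical linear demand curve: higher price -> fewer buyers.
def demand(price):
    return max(120_000 - 200 * price, 0)

SUPPLY = 80_000  # units fixed by wafer allocation

def revenue(price):
    # You can only sell the smaller of what buyers want and what you built.
    return min(demand(price), SUPPLY) * price

# At $200 demand exactly matches supply. A moderate hike to $250 sheds
# some demand but raises revenue; overshooting to $500 collapses it.
r_base = revenue(200)       # sell all 80,000 units
r_moderate = revenue(250)   # demand falls to 70,000, revenue rises
r_overshoot = revenue(500)  # demand falls to 20,000, revenue craters
```

Under this assumed curve a modest price increase pays off, but an aggressive one earns less than never raising the price at all, which is exactly the downside risk described above.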

I have to say, if the rumors are true that Nvidia snatched up more 40 nm allocation than they needed, due to AMD being overly cautious and not securing enough, then Nvidia were devilishly clever in limiting AMD's potential impact on overall market share. TSMC's continued problems producing 40 nm wafers have obviously contributed to the situation.

It's similar to the situation Intel were in back in the old days, where they might have a new architecture ready but would wait until demand started dying down before launching it. Worked great, until AMD snuck up and turned their world upside down with the Athlon 64. Now, to limit the chances of that happening again, they generally release new architectures as soon as they are done designing and testing them.

Regards,
SB
 
But aren't TSMC doubling wafer production later this year anyway? So even if the new product is larger than the old one, it's quite likely the supply situation, whatever happens, will be much better than it was at the same time last year.
 
Well, we can all hope. :) Although I'm not sure a refresh will be enough of an improvement over a 5870 to warrant buying a new card. On the other hand, it's amazing that it might get replaced with no price drops - in fact it might actually cost more when it is EOL'd than when it was launched. :oops:

Regards,
SB
 

a truly WTF moment in technology history? :oops:
 

We're not quite there yet. Between now and October/November, supply could improve a lot. That said, it is amazing that since the HD 5870 launched almost 9 months ago, its price has actually increased.

I'm not aware of anything like that ever happening before.
 
Gah! I've been sitting back happily watching AMD finally cream some profit, since I'd been planning on skipping the 5xxx generation, but now my 4870's RAM is giving me trouble.
I'm faced with either a fairly cheap (NZ$250) 5770 to pretty much match the 4870 and hold out for the refresh/replacement generation, or not far short of twice the price ($450) for a 5850.
If I were to pay out for the 5850, that would trigger a strong 'might as well go to a 5870' ($600) factor, but I really wanted to skip a generation this time.
 

It's too bad you didn't post this a couple of weeks ago. I would have lent you a spare HD 5770 I had. :D

So could you struggle onwards and mercilessly cream the HD 4870 in your possession for a few more months, or is it a pressing and unfortunate calamity which requires your immediate purchase of a new card for your pleasure?
 
If I were back at home I could send him a spare 4870 I have, but I won't be back for another month.

Regards,
SB

This constant obsoleting of graphics cards reminds me of a thought I had: that some giant Voltron-esque robot made out of old Radeons would come and attack Dave for coming out with new GPUs all the time and making them obsolete before their time.

Anyway, shipping from you wouldn't be practical though, as he's in New Zealand, mine and Zed's territory. :)
 