Ah, a perfect analogy to help me convey what I'm trying to say here. I totally agree with what you're saying, and I want to point out that I'm not saying AMD/ATI should throw away all the work done on R800/R900/etc.
Using your analogy, let's add one more element and say that AMD is hurting financially and food for the cows is expensive. I'm suggesting that, for the sake of keeping the company financially alive, AMD takes 50% of their cows and milks the hell out of them before slaughtering them and putting that meat in the freezer.
Only you can't do that with R&D: take your "frozen meat" out of the freezer even a year later and it's no use; the market has passed you by, and your competitors have moved the goalposts as they've moved forward while you've stood still. And it's not just what you put in the freezer, it's all the projects in flight for the next few years. When you do decide to go back in, you hit that time lag again, taking a couple of years to get your new projects up to speed once you've rebuilt all your infrastructure and staffing.
The remaining 50% of the cows get exceptional treatment and attention to ensure they grow as large and juicy as possible. If AMD can start making good money again off these "excellent" quality cows, they might be able to afford more food and get back to the "original" 100% head of cows.
Leaving the cow analogy aside, R&D doesn't work like that. It has a critical mass, which is why there are so few companies competing in fields where that critical mass is very expensive, such as graphics. You can't grow a new crop of engineers from the half a department you kept, and problems with workload, team dynamics, company growth, and company culture all get worse when you try to ramp back up for new projects.
You can't be a top, leading technology company unless you're building top technology. That doesn't just mean top-of-the-range graphics chips; it also means making the best midrange and low-end chips, and making the best profit from them via yields and designs. If AMD were to give up making top technology, they wouldn't still be able to make great mid-range products, because that top-technology edge would be gone. They did that in the 8500 era, and it didn't move them forward until they went back to making top technology that could filter down to all their products.
Putting the above into more useful context, I'm suggesting the following could happen:
- Halt development on R800/R900 and beyond, and redirect those efforts towards applying the same concepts to more mainstream GPUs (note: R&D continues, though strictly for mainstream GPUs)
That technology may well be one and the same, so you gain nothing but lose market and mind share. The technology you put "on hold" is lost, as it's of no use in 12 months' time.
- Pull teams off any "flagship" ($399+) GPU development and re-assign them to aid mainstream and value GPU development (shorten development time, decrease issues, add more polish, etc.)
A common misconception. You can't speed up high-technology projects like this just by throwing more people at them. It's the standard "nine women making a baby in one month" problem. That's why you need multiple teams working on staggered projects designed to come to fruition at different times, because the development time is far longer than the product life cycle.
It took a significant amount of time for Intel to come back from the Pentium 4 road, for ATI to get back into the mainstream and high-end graphics market, and for Microsoft to turn around when they decided the Internet was more than just a fad. The reason is that you can't just throw lots of money at complex projects and have them done by tomorrow; it literally takes years. Look at how long Intel is going to take to get back into graphics with its Larrabee project, and if anyone can afford to pour massive amounts of money and resources into a project, it's Intel.
- If AMD can get back into the black (or at least come closer by 2H 2008 or 1H 2009), they can contemplate re-entering the "flagship" GPU business. Getting back in is always an option on the table, but it might not make financial sense for them if the above approach proves successful.
As I said several times, you can't just stop that part of the business for a couple of years. It's too expensive: you lose your critical staff and skills, you lose mindshare, you have to throw away all the work that was meant to ship during that period, your reputation suffers, your competitors take all your profits and OEM contracts, and it takes you a couple of years to get back up to speed again.
Remember, compared with a flagship GPU, a mainstream GPU is:
- Cheaper to produce (fewer transistors, smaller die, a PCB with fewer layers, etc.)
- Less complex (easier to design - though far from easy!)
- Lower power (less heat, so TDP targets are easier to reach)
- Lower margin but typically MUCH higher volume (rough arithmetic sketched below)
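To put some rough numbers on those cost and margin/volume points, here's a quick back-of-envelope sketch. Every figure in it is invented purely for illustration (none of it comes from AMD or Nvidia); it just uses the standard dies-per-wafer estimate and a simple Poisson yield model to show why a smaller die is so much cheaper per good chip, and how a lower-margin part can still out-earn a flagship on volume:

```python
import math

# Rough, hypothetical sketch of flagship vs. mainstream GPU economics.
# All numbers below are made up for illustration only.

def gross_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> float:
    """Standard back-of-envelope estimate of gross dies per wafer."""
    d = wafer_diameter_mm
    return (math.pi * (d / 2) ** 2) / die_area_mm2 - (math.pi * d) / math.sqrt(2 * die_area_mm2)

def good_dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float,
                        defect_density_per_cm2: float) -> float:
    """Apply a simple Poisson yield model: yield = exp(-defect_density * die_area)."""
    die_yield = math.exp(-defect_density_per_cm2 * die_area_mm2 / 100.0)
    return gross_dies_per_wafer(wafer_diameter_mm, die_area_mm2) * die_yield

# Hypothetical parts on 300 mm wafers with the same defect density.
flagship = good_dies_per_wafer(300, 400, 0.5)    # big ~400 mm^2 flagship die
mainstream = good_dies_per_wafer(300, 150, 0.5)  # smaller ~150 mm^2 mainstream die
print(f"Good flagship dies/wafer:   {flagship:.0f}")
print(f"Good mainstream dies/wafer: {mainstream:.0f}")

# Margin x volume: a thin-margin part can still bring in more total profit
# if it ships in much higher volume (again, invented numbers).
flagship_profit = 150 * 200_000      # $150 margin x 200k units
mainstream_profit = 30 * 3_000_000   # $30 margin x 3M units
print(f"Flagship total profit:   ${flagship_profit:,}")
print(f"Mainstream total profit: ${mainstream_profit:,}")
```

With those invented inputs the flagship works out to roughly 19 good dies per wafer against roughly 197 for the mainstream part, and the mainstream line ends up with more total profit despite a much thinner per-unit margin.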
A midrange product is also less desirable, earns your company less profit per unit, loses you mindshare in the market by not having a flagship product, and weakens your standing with OEMs because you can't supply their requirements at the high end. Do you think AMD wants to hear their OEMs say that their high-end systems will be Intel/Nvidia only, because AMD can't deliver a high-end platform without a high-end graphics solution?
You can't make those great mid-range products without having the technology to make great top range products too. You can't maintain a leading position in the market without leading technology in the form of leading products. If all you want to do is follow the leaders, then you're always going to be behind and falling back.