Launching at those prices cost Sony a lot of money, and MS to a lesser degree, so they're not going to do that again. You're also not accounting for the fact that smaller process nodes no longer drop price and power usage all that much. The current gen had to go from 90nm down to 40/45nm to finally become profitable; 40nm to 28nm won't have as big an effect. You've also completely ignored the costs of the case, power supply, controller, packaging, assembly, optical drive, etc. And no, optical drives will not be replaced by flash by next gen.
Yes, it cost them a lot, and yes, they will do it again, because it also cost Sony a lot with the PS2 and the PSone and that didn't stop them from doing it again with the PS3.
If you look at MS, the costs on the 360 were lower than on the original Xbox.
This gen did have to go from 90nm to 40/45nm for Sony to become profitable, but we don't know at what point MS became profitable; from what I can tell it would have been around the 65nm mark.
Remember, the Xbox 360 launched at $300/$400, and five years later we still have SKUs ranging from $200 all the way up to $400.
I also get the feeling that you don't know how chip costs work. MS buys a production run of X wafers for a fixed amount of money. Each wafer might yield 500 working chips or it might yield 10, but the cost per wafer stays the same. So if a wafer costs $5,000 and 500 chips work, each chip costs MS $10; if only 10 chips on that wafer work, MS pays $500 per chip.
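A quick back-of-the-envelope sketch of that point (the $5,000 wafer price and the 500/10 working-chip counts are just the illustrative numbers from above, not real foundry figures):

```python
def cost_per_good_chip(wafer_cost, good_chips_per_wafer):
    """Effective cost of one working chip: the wafer price is fixed,
    so only the number of good dies changes the per-chip cost."""
    return wafer_cost / good_chips_per_wafer

# Illustrative numbers from the post, not actual pricing.
print(cost_per_good_chip(5_000, 500))  # 10.0  -> $10 per chip at good yield
print(cost_per_good_chip(5_000, 10))   # 500.0 -> $500 per chip at terrible yield
```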
With a proper design and good yields, MS can put a lot of CPU/GPU power in the console at little cost. Even on the same process node, optimisation, layout changes, and just the general knowledge that comes from running the process for a while can drive yields up.
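One way to see why yields climb as a node matures: in the simple Poisson yield model, yield falls off exponentially with die area times defect density, and defect density is exactly what drops as the fab gains experience with the process. The die size and defect densities below are assumptions picked purely for illustration:

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Simple Poisson yield model: Y = exp(-A * D0).
    A = die area in cm^2, D0 = defect density in defects/cm^2."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

die_area = 2.0  # cm^2 (~200 mm^2 die, an assumed size for illustration)

# Assumed defect densities: early in the node's life vs. after it matures.
early, mature = 0.5, 0.15
print(f"early yield:  {poisson_yield(die_area, early):.0%}")   # ~37%
print(f"mature yield: {poisson_yield(die_area, mature):.0%}")  # ~74%
```

Same wafer price, same die, but roughly twice as many sellable chips per wafer, which is exactly the cost effect described above.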
A Cayman-level chip on 28nm would be relatively small and the price would drop greatly. Remember, you like to point out optical drives and other costs when looking at consoles; when looking at GPUs you only see the price after everyone has taken their cut of the pie. To find the true cost of a GPU you'd have to strip the RAM, cooling, and everything else out of it.
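To put a rough number on the "relatively small" part, here's a sketch using a common dies-per-wafer approximation plus an idealized area shrink (area scaling with the square of the feature-size ratio, which real shrinks never quite hit). The ~390 mm^2 figure for a Cayman-class die and the near-perfect shrink are assumptions for illustration only:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Common approximation for gross die candidates per wafer:
    usable wafer area minus an edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

cayman_40nm = 390                      # mm^2, assumed Cayman-class die size
shrink = (28 / 40) ** 2                # idealized area scaling, ~0.49
cayman_28nm = cayman_40nm * shrink     # ~191 mm^2 under that ideal assumption

print(dies_per_wafer(cayman_40nm))     # ~147 candidates on a 300mm wafer
print(dies_per_wafer(cayman_28nm))     # ~321 candidates, more than twice as many
```

More candidates per wafer, plus the better yield a smaller die tends to get, is where the "price would drop greatly" claim comes from.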
I didn't say Sony/MS would stay there. I said there is no incentive for Intel/AMD to produce advanced CPUs on the most advanced manufacturing process and sell them to console makers at a 5% markup when every chip they make is in demand from the PC market at a 100% or more markup.
Let me ask you a question. If you were AMD and you were hurting for cash, and MS said to you, "We want to take your Bulldozer CPU, have the rights to produce it at GlobalFoundries, and we'll give you $5 per chip sold," do you think that would be a bad choice for AMD?
Look at the Xbox 360: it's sold 50M units. If AMD got a $5 cut per unit, they would have pocketed a cool $250M for doing nothing but licensing the design to MS. If AMD can get both a CPU and a GPU into the next console it would be great for them; it's very little work for an annual paycheck over the course of a console gen, 5-10 years.
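The royalty math above in one place (the $5/unit fee and 50M install base are the hypothetical numbers from this post, and the 10-year span is just the top end of the generation length mentioned):

```python
def licensing_take(units_sold, fee_per_unit, years):
    """Total and per-year royalty income for a fixed per-unit licensing fee."""
    total = units_sold * fee_per_unit
    return total, total / years

total, per_year = licensing_take(50_000_000, 5, 10)
print(f"${total/1e6:.0f}M total, ~${per_year/1e6:.0f}M/year over a 10-year gen")
# -> $250M total, ~$25M/year
```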
It has nothing to do with AMD actually making the chip; they just have to design it. Why else do you think AMD is in two consoles and NVIDIA is in one? They all want in because it's easy money. A GPU design may last 1-2 years at retail, but that's it. The PS2 is still on sale, what, a decade after it was released; if AMD or NVIDIA had a chip inside that console they would still be getting a piece of that pie. AMD is still getting money from the Xbox 360, and it looks like it will be around for at least another 3 years. AMD got money from the GameCube and Wii, and both will be around for a while yet.