Everyone keeps talking about when integration starts making sense in different markets and when discrete components will die. I thought that as an opening post to this forum, I'd try to be slightly controversial and ask the exact opposite: when (and where) does integration stop making sense? And how do you figure out what's the best overall system architecture?
The cost gap between cutting-edge processes and their ancestors, as well as analogue/RF-only processes, grows with every node, and it is getting harder and harder to achieve good analogue/RF performance on the leading edge. This, plus simplicity in general, has prompted a few companies to favor System-in-Package (SiP) approaches, and I wonder whether that might not start making a lot more sense than SoC at 40nm and beyond.
With pure-logic integration, the economics are obviously a bit more favorable. There are still catches, though:
- Synchronising R&D design schedules, plus the extra delay/respin risk that comes with it
- Slightly lower yields for highly heterogeneous SoCs: if one block is broken, the whole die is scrap.
- Current industry cycles for different components differ for a variety of sometimes pretty good reasons (baseband certification, computer models being tied to their motherboard, which leads to partial notebook refreshes, etc.)
- Process variants may make integrating different kinds of logic less attractive. CPUs are a classic example, but it's not just that: it's conceivable that some integration strategies hurt idle power, for example.
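To make the yield/cost point above a bit more concrete, here is a toy back-of-the-envelope sketch using the simple Poisson yield model (yield = exp(-D0 * A)). All the numbers below — wafer costs, die areas, defect densities, packaging cost — are made up purely for illustration, not real foundry data:

```python
import math

def die_yield(defect_density, area_cm2):
    # Poisson yield model: Y = exp(-D0 * A).
    # Bigger dies (and leakier processes) mean lower yield.
    return math.exp(-defect_density * area_cm2)

def cost_per_good_die(wafer_cost, die_area_cm2, defect_density,
                      wafer_area_cm2=706.9):  # 300mm wafer, edge loss ignored
    dies_per_wafer = wafer_area_cm2 / die_area_cm2
    return wafer_cost / (dies_per_wafer * die_yield(defect_density, die_area_cm2))

# Monolithic SoC: everything on the expensive advanced node, one big die.
soc = cost_per_good_die(wafer_cost=6000, die_area_cm2=1.0, defect_density=0.5)

# SiP: logic die on the advanced node, analogue/RF die on a cheap mature node,
# plus an assumed flat packaging/assembly cost per unit.
logic = cost_per_good_die(wafer_cost=6000, die_area_cm2=0.7, defect_density=0.5)
analog = cost_per_good_die(wafer_cost=2000, die_area_cm2=0.3, defect_density=0.3)
sip = logic + analog + 0.50

print(f"SoC cost/good die: ${soc:.2f}")
print(f"SiP cost/good unit: ${sip:.2f}")
```

With these particular numbers the SiP total comes out lower than the monolithic SoC, mainly because the analogue area is moved off the expensive wafers and each smaller die yields better. Of course, real comparisons hinge on known-good-die test coverage, packaging yield, and defect clustering, none of which this sketch models.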
So, given all this: is SoC integration overhyped in some areas, or is it not? Is SiP even a viable alternative in some instances? Am I overestimating some of these problems, or are they legitimate concerns?