One of the most exciting macro trends in the hardware industry today is the shift from human-directed computing (PCs) and mobile platforms (smartphones) to AI-driven autonomous platforms, otherwise known as the 'Internet of Things'. This trend is not new; however, it is currently accelerating at a dizzying pace with the advent of embedded 'edge AI', which encompasses everything from automated industrial equipment and self-driving cars down to robotic appliances and tiny remote sensors. Since all these new platforms are powered by semiconductor chips (ICs) - processors, memory, analog, power, sensors, and actuators - and typically interface with the real world, they are by definition quite diverse in scale and function.
With ever more demanding price, performance, and power requirements in every platform, customized silicon becomes less of an option and more of a necessity. This presents a huge opportunity and challenge for the semiconductor industry - how do we develop all the various custom chips needed to address these diverse applications without blowing out R&D budgets and development schedules? Those who are familiar with the chip industry know the opportunity is huge (>$1 trillion/year), but so are the challenges. In our next post we will examine the history of the IC industry and how it arrived where it is today.