AI/ML is now ubiquitous in modern systems, from sensors to data centers. Sensors, the foundation of the AI/ML pipeline, gather diverse environmental data, and concepts such as on-chip AI within sensors are emerging for data preprocessing. Edge devices, such as mobile phones and cars, run AI/ML models that demand rapid processing, ensuring both swift responses and privacy. Edge computing, situated between edge devices and the cloud, handles tasks that require more computational power than edge devices can afford while still keeping latency low. Data centers, in turn, are well suited to managing vast datasets and training complex AI/ML models.
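The tiered placement logic described above can be sketched as a toy heuristic. This is purely illustrative: the class, function, and thresholds below are hypothetical and do not come from the presentation; a real scheduler would weigh far more factors.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    compute_gflops: float     # rough compute demand (hypothetical unit)
    latency_budget_ms: float  # how quickly a response is needed
    privacy_sensitive: bool   # must the data stay on the device?

def place(w: Workload) -> str:
    """Toy heuristic mapping a workload to a pipeline tier (illustrative only)."""
    # Edge devices: swift responses and privacy, but limited compute.
    if w.privacy_sensitive or (w.compute_gflops < 10 and w.latency_budget_ms < 50):
        return "edge device"
    # Edge computing: more computational heft than edge devices, still low latency.
    if w.compute_gflops < 1000 and w.latency_budget_ms < 200:
        return "edge computing"
    # Data centers: vast datasets and intricate model training.
    return "data center"

print(place(Workload(5, 20, True)))           # privacy-sensitive -> edge device
print(place(Workload(500, 100, False)))       # mid-size, low latency -> edge computing
print(place(Workload(1e6, 1000, False)))      # large training job -> data center
```

The point of the sketch is only that each tier trades compute capacity against latency and data locality, which is the division of labor the paragraph above describes.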
This video presentation will illustrate how, in this ecosystem, networks-on-chip (NoCs) play a pivotal role across three dimensions:
As AI/ML continues its expansive growth, NoCs will be central to the semiconductor resurgence, heralding a promising future for integrated systems.