In the era of artificial intelligence (AI), the energy consumption of computing has become a major barrier to the widespread adoption of edge-AI applications in areas such as personalized healthcare, virtual/augmented reality, and smart homes. To address this challenge, in-memory computing (IMC) has emerged as a promising technology that enables in-situ, data-centric computation with reduced data movement. However, practical IMC demands memory devices with high reliability, large storage capacity for model parameters, and fast reconfiguration to support novel AI models such as transformers.
In this talk, Prof. Ielmini provided an overview of novel edge-AI technologies addressing these IMC challenges. First, he discussed 3D memory solutions to increase the number of parameters that can be stored within AI accelerators, including both horizontal 3D crosspoint (3DXP) memory and 3D vertical resistive switching memory (3D-VRRAM). He also highlighted oxide-based electrochemical random-access memory (ECRAM) for accelerating AI training via fully parallel tensor-product operations in crossbar arrays. Finally, he outlined future challenges and opportunities for advancing AI acceleration.
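The two crossbar operations mentioned above can be sketched numerically: a matrix-vector multiply performed "in place" by Ohm's and Kirchhoff's laws, and a rank-1 (outer-product) weight update applied to all cells in parallel, as used for training with analog memories such as ECRAM. This is a minimal NumPy sketch for intuition only; the array size, conductance range, and learning-rate mapping are illustrative assumptions, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Conductance matrix G: each crossbar cell stores one weight as a device
# conductance (siemens). Size and value range are assumed for illustration.
G = rng.uniform(1e-6, 1e-4, size=(4, 3))

# --- In-memory matrix-vector multiply (read/inference step) ---
# Inputs are applied as row voltages; by Ohm's law each cell contributes
# a current V_i * G_ij, and by Kirchhoff's current law each column sums
# its currents, so the full product G^T V appears in a single read step.
V = np.array([0.1, 0.2, 0.0, 0.3])  # row voltages (volts)
I = V @ G                            # column currents (amperes)

# --- Fully parallel tensor-product update (training step) ---
# Driving rows with a signal proportional to x and columns with one
# proportional to delta changes every cell by ~ x_i * delta_j at once,
# i.e. the rank-1 outer product, without addressing cells one by one.
x = np.array([1.0, 0.0, 0.5, 0.2])   # upstream activations (assumed)
delta = np.array([0.1, -0.2, 0.05])  # error signal (assumed)
eta = 1e-6  # learning rate mapped to a conductance increment (assumed)
G += eta * np.outer(x, delta)
```

The key efficiency argument is that both steps scale as O(1) in time with respect to the array size, since all rows and columns operate simultaneously, which is what makes crossbar IMC attractive for both inference and training.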














