Breakthrough CRAM technology ditches von Neumann model, makes AI 1,000x more energy efficient
CRAM chips developed at the University of Minnesota could make AI 1,000x more energy-efficient by replacing the von Neumann model with spintronic devices.
Researchers from the University of Minnesota have created a new computational random-access memory (CRAM) prototype chip promising to reduce energy needs for AI applications by 1,000 times compared to current methods. In one simulation, this novel tech showed energy savings of up to 2,500 times, marking a significant shift in efficiency for AI computations.
Unlike the traditional von Neumann architecture, which separates processor and memory units, the CRAM technology performs computations directly within the memory using spintronic devices called magnetic tunnel junctions (MTJs). These devices use the spin of electrons, rather than electrical charge, to store data, offering a more energy-efficient alternative to traditional transistor-based chips.
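To see why moving computation into the memory array saves so much energy, consider a toy cost model. The sketch below is not from the researchers' work: the energy figures are hypothetical placeholders, chosen only to illustrate that in a von Neumann machine the dominant cost is shuttling operands across the memory bus, a cost that an in-memory design largely eliminates.

```python
# Toy model (illustrative only, not from the article): compares the energy
# budget of a von Neumann pipeline, where every operand is shuttled between
# memory and a separate processor, with an in-memory design where logic
# happens inside the array. All per-operation energies are hypothetical.

E_TRANSFER_PJ = 10.0   # assumed cost to move one operand over the memory bus
E_COMPUTE_PJ = 1.0     # assumed cost of one arithmetic operation

def von_neumann_energy(num_ops: int) -> float:
    """Each op fetches two operands, computes, and writes one result back."""
    moves_per_op = 3  # two reads + one write across the memory bus
    return num_ops * (moves_per_op * E_TRANSFER_PJ + E_COMPUTE_PJ)

def in_memory_energy(num_ops: int) -> float:
    """Operands never leave the array, so only the compute cost remains."""
    return num_ops * E_COMPUTE_PJ

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"Toy-model savings factor: {ratio:.0f}x")  # → 31x with these numbers
```

The exact ratio depends entirely on the assumed costs; the point is only that data movement, not arithmetic, dominates the energy bill in the von Neumann layout, which is why eliminating it can yield savings of orders of magnitude.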
Co-author Ulya Karpuzcu says CRAM is highly flexible: computation can be performed at any location in the memory array, which improves performance across diverse AI algorithms. Challenges around scalability, manufacturing, and integration with existing silicon remain, and the research team plans to work with semiconductor industry leaders to bring CRAM to commercial reality.