MRAM-based memory architecture could accelerate AI by a factor of 1,000

Researchers from the University of Minnesota Twin Cities have developed a new AI hardware accelerator, based on MRAM, that can speed up AI computation by a factor of 1,000.

The so-called Computational Random-Access Memory device, or CRAM, is a machine learning inference accelerator that is not only 1,000 times faster but also delivers energy savings of 1,700 to 2,500 times compared to traditional methods.

This project has been in R&D for over 20 years and is based on the University's patented research into Magnetic Tunnel Junction (MTJ) devices. CRAM performs computations directly within the memory cells, exploiting the array structure efficiently and eliminating the need for slow, energy-intensive data transfers between memory and a separate processor.
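To picture the compute-in-memory idea described above, here is a loose, hypothetical Python sketch (not the actual CRAM design; function names and structure are illustrative only) contrasting the conventional load-compute-store flow with computing partial results in place:

```python
# Illustrative sketch only: toy contrast between a conventional
# load-compute-store flow and a compute-in-memory (CRAM-style) flow.
# Real CRAM performs logic with magnetic tunnel junctions inside the
# memory array itself; this just models the difference in data movement.

def conventional_mac(weights, inputs):
    """Von Neumann style: every operand is shuttled from memory to a
    separate compute unit for each multiply-accumulate step."""
    acc = 0.0
    for w, x in zip(weights, inputs):
        # each iteration models a costly memory -> processor transfer
        acc += w * x
    return acc

def in_memory_mac(cells):
    """CRAM style (conceptually): each memory cell computes its own
    partial product in place; only the reduced result leaves the array."""
    partial_products = [w * x for (w, x) in cells]  # computed "at the cells"
    return sum(partial_products)

weights = [0.5, -1.0, 2.0]
inputs = [1.0, 3.0, 0.5]
cells = list(zip(weights, inputs))

# Both flows compute the same dot product; the (modeled) difference is
# how much data crosses the memory/processor boundary.
assert conventional_mac(weights, inputs) == in_memory_mac(cells)
```

The math is identical in both cases; the point of the sketch is that in the compute-in-memory style, the operands never leave the array, which is where the reported energy savings come from.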

Source: 
Posted: Jul 29, 2024 by Ron Mertens