'Crazy idea' memory device could slash AI energy consumption by up to 2,500 times | Live Science
By performing computations directly inside memory cells, computational random-access memory (CRAM) could dramatically reduce the power demands of AI workloads. Scientists claim it is a solution to AI's enormous energy consumption.
In a peer-reviewed study published July 25 in the journal npj Unconventional Computing, researchers demonstrated that CRAM could perform key AI tasks such as scalar addition and matrix multiplication in 434 nanoseconds while using just 0.47 microjoules of energy. CRAM is built from magnetic tunnel junctions (MTJs): small devices that use the spin of electrons to store data rather than relying on electrical charge, as traditional memory does.

"With an evolving group of students since 2003 and a true interdisciplinary faculty team built at the University of Minnesota — from physics, materials science and engineering, computer science and engineering, to modeling and benchmarking, and hardware creation — [we] now have demonstrated that this kind of technology is feasible and is ready to be incorporated into technology," Wang said in a statement.
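As a quick sanity check on the reported benchmark, the two figures from the study imply an average power draw that can be computed directly (a minimal sketch using only the numbers quoted above; the 2,500-times comparison depends on a baseline from the study that is not reproduced here):

```python
# Average power implied by the reported CRAM benchmark:
# 0.47 microjoules of energy delivered over 434 nanoseconds.
energy_j = 0.47e-6  # joules (0.47 microjoules, from the study)
time_s = 434e-9     # seconds (434 nanoseconds, from the study)

power_w = energy_j / time_s  # power = energy / time
print(f"Implied average power: {power_w:.2f} W")  # ~1.08 W
```

Roughly one watt during the operation, which is in line with the study's framing of CRAM as a low-energy alternative for AI arithmetic.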