New memory tech unveiled that reduces AI processing energy requirements by 1,000 times or more
New computational random-access memory (CRAM) technology gives RAM chips the power to process data, not just store it.
Artificial intelligence (AI) computing requires tremendous amounts of electricity, but targeted research might hold the key to greatly reducing that consumption. A group of engineering researchers at the University of Minnesota Twin Cities has demonstrated an AI efficiency-boosting memory technology and published a peer-reviewed paper outlining the work and findings. The team plans to work with leaders in the semiconductor industry, including those in Minnesota, to provide large-scale demonstrations and produce hardware that advances AI functionality while also making it more efficient.