Meta just beat Google and Apple in the race to put powerful AI on phones
Meta has launched compressed AI models that run directly on smartphones, making artificial intelligence faster and more private while using less memory than cloud-based alternatives.
Meta Platforms has created smaller versions of its Llama artificial intelligence models that can run on smartphones and tablets, opening new possibilities for AI beyond data centers. Meta's compressed models, built with the SpinQuant and QLoRA quantization techniques, show dramatic improvements in speed and efficiency over the standard versions when tested on Android phones. While Google and Apple take careful, controlled approaches to mobile AI, keeping it tightly integrated with their operating systems, Meta's strategy is markedly different.
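The compression techniques named above work by storing model weights at lower numerical precision. Meta's actual SpinQuant and QLoRA pipelines are considerably more sophisticated, but the core idea can be sketched with simple symmetric int8 weight quantization; the function names below are illustrative, not from Meta's code:

```python
import numpy as np

def quantize_int8(weights):
    # Symmetric per-tensor quantization: scale float weights
    # into the int8 range [-127, 127].
    scale = np.max(np.abs(weights)) / 127.0
    q = np.round(weights / scale).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover an approximation of the original float weights.
    return q.astype(np.float32) * scale

w = np.random.randn(64, 64).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

# int8 storage needs 4x less memory than float32, at the cost
# of a small, bounded rounding error per weight.
print("max abs error:", np.abs(w - w_hat).max())
```

Storing each weight in one byte instead of four is what lets a multi-billion-parameter model fit in a phone's memory; the rounding error per weight is at most half the scale factor.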
Read the full story on VentureBeat.