Meta makes its MobileLLM open to researchers, posting the full model weights
Developers and researchers interested in testing MobileLLM can now access the models on Hugging Face, fully integrated with the Transformers library.
With parameter counts ranging from 125 million to 1 billion, these models are designed to operate within the limited memory and energy capacities typical of mobile hardware. By emphasizing architecture over sheer size, Meta’s research suggests that well-designed compact models can deliver robust AI performance directly on devices. This could accelerate innovation in the field of small language models (SLMs), making high-quality AI accessible without reliance on extensive cloud infrastructure.
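Since the article notes the checkpoints are published on Hugging Face with full Transformers integration, a minimal sketch of how such a model would typically be loaded and run locally follows. The model id `facebook/MobileLLM-125M` and the generation settings are illustrative assumptions, not details confirmed by the article:

```python
def generate_locally(prompt: str,
                     model_id: str = "facebook/MobileLLM-125M",
                     max_new_tokens: int = 64) -> str:
    """Load a small causal LM from the Hugging Face Hub and generate text.

    Assumes the `transformers` and `torch` packages are installed;
    the default model id is an assumption for illustration only.
    """
    # Imported lazily so the sketch can be read without the packages installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    # Greedy decoding keeps the example deterministic and cheap enough
    # for the sub-billion-parameter sizes the article describes.
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

Because the models range from 125 million to 1 billion parameters, a call like `generate_locally("Hello", "facebook/MobileLLM-125M")` could plausibly run on commodity or mobile-class hardware without a cloud backend, which is the point the research emphasizes.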
Read the original story on VentureBeat.