
Apple releases eight small AI language models aimed at on-device use


OpenELM mirrors efforts by Microsoft to make useful small AI language models that run locally.

In the world of AI, what might be called "small language models" have been growing in popularity recently because they can run on a local device instead of requiring data center-grade computers in the cloud. Apple says its approach with OpenELM includes a "layer-wise scaling strategy" that reportedly allocates parameters more efficiently across each layer, not only saving computational resources but also improving the model's performance while training on fewer tokens. According to Apple's released white paper, this strategy has enabled OpenELM to achieve a 2.36 percent improvement in accuracy over Allen AI's OLMo 1B (another small language model) while requiring half as many pre-training tokens.
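To make the idea concrete, here is a minimal sketch of what a layer-wise scaling schedule could look like: rather than giving every transformer layer the same width, the attention-head count and feed-forward multiplier are interpolated from the first layer to the last, so parameters are allocated non-uniformly across depth. The function name and the specific ranges below are illustrative assumptions, not values from Apple's paper.

```python
# Hypothetical sketch of a layer-wise scaling schedule.
# The ranges (min_heads, max_heads, ffn multipliers) are made-up
# examples, not OpenELM's published hyperparameters.

def layerwise_scaling(num_layers, min_heads=4, max_heads=16,
                      min_ffn_mult=1.0, max_ffn_mult=4.0):
    """Return a per-layer (num_heads, ffn_multiplier) schedule
    that grows linearly from the first layer to the last."""
    schedule = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)  # 0.0 at first layer, 1.0 at last
        heads = round(min_heads + t * (max_heads - min_heads))
        ffn_mult = min_ffn_mult + t * (max_ffn_mult - min_ffn_mult)
        schedule.append((heads, ffn_mult))
    return schedule

if __name__ == "__main__":
    for layer, (heads, ffn) in enumerate(layerwise_scaling(8)):
        print(f"layer {layer}: {heads} heads, ffn x{ffn:.2f}")
```

Under a schedule like this, early layers stay narrow and cheap while later layers get more capacity, which is one plausible way a fixed parameter budget can be spent more efficiently than uniform allocation.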


