Apple details how it trained its new AI models: 4 interesting highlights (the local model was split into two blocks, the cloud-based model has a creative architecture, multilingual representation was increased by 275%, and web data was gathered by the Applebot crawler)


Apple has published a report detailing how its new AI models were trained, optimized, and evaluated. Here are a few interesting tidbits.

In the document, Apple explains that evaluations were conducted using prompts written by native speakers (rather than translations), and the model was tested both on accuracy and on how natural its responses sounded in local contexts. Apple applied multiple layers of filtering to remove low-quality, unsafe, or irrelevant content, including spammy pages, shallow or templated text, and broken formatting. While the company doesn't specify how much of the dataset this filtering removed, it notes that synthetic data played a large role in key training steps such as fine-tuning, reinforcement learning, and improving multilingual support.
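To make the "multiple layers of filtering" idea concrete, here is a minimal sketch of how a layered corpus filter might look. Apple's report doesn't disclose its actual heuristics; every threshold, pattern, and function name below is an illustrative assumption, not Apple's pipeline.

```python
import re

# Hypothetical multi-stage text filter: each stage targets one class of
# low-quality content the article mentions (spammy pages, shallow or
# templated text, broken formatting). All heuristics are illustrative.

SPAM_PATTERNS = re.compile(r"(buy now|click here|limited offer)", re.IGNORECASE)

def is_spammy(text: str) -> bool:
    """Flag pages dominated by promotional boilerplate."""
    return bool(SPAM_PATTERNS.search(text))

def is_shallow(text: str, min_words: int = 20) -> bool:
    """Flag very short text, or text with low lexical diversity
    (a common sign of templated filler)."""
    words = text.split()
    if len(words) < min_words:
        return True
    return len(set(words)) / len(words) < 0.3

def has_broken_formatting(text: str) -> bool:
    """Flag encoding debris or leftover HTML markup."""
    return "\ufffd" in text or bool(re.search(r"<[a-z]+>", text))

def passes_filters(text: str) -> bool:
    """A document survives only if every filter stage passes."""
    return not (is_spammy(text) or is_shallow(text) or has_broken_formatting(text))

docs = [
    "Buy now! Limited offer on replica watches. Click here.",
    "word " * 30,  # templated repetition -> low lexical diversity
    "A well-formed paragraph discussing model evaluation across "
    "several languages, written by a native speaker and reviewed "
    "for clarity, accuracy, and natural phrasing in local contexts.",
]
kept = [d for d in docs if passes_filters(d)]  # only the last doc survives
```

In a real pipeline, each stage would typically be a trained classifier rather than a regex, but the structure (independent layers, a document must clear all of them) is the same idea the report describes.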

