A look at Apple's technical approach to AI, including core model performance and more.
Apple Intelligence makes a lot of sense when you get out of the AI bubble. Plus, the cool technical details Apple shared about their language models "thinking different."
At the end of the day, most of these evaluations are extremely opaque, so I’m leaning on a lot of trust in the company's reputation and general context to make predictions on the actual quality of their models. As Apple expands the memory footprint of their devices to accommodate a larger AI narrative, they have easy performance to gain by simply relaxing quantization a bit, spending more bits per weight on the on-device models.
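To see why extra memory translates so directly into headroom for less aggressive quantization, here is a back-of-the-envelope sketch. The parameter count and bit-widths below are illustrative assumptions, not Apple's published configuration; the point is only how weight memory scales with bits per weight.

```python
# Rough weight-memory footprint for a quantized on-device model.
# The ~3B parameter count and the bit-widths are hypothetical examples,
# not Apple's actual numbers.

def weight_memory_gb(num_params: float, bits_per_weight: float) -> float:
    """Approximate memory for model weights alone (ignores KV cache and activations)."""
    return num_params * bits_per_weight / 8 / 1e9

params = 3e9  # hypothetical ~3B-parameter on-device model
for bits in (2, 4, 8, 16):
    print(f"{bits:>2}-bit weights: ~{weight_memory_gb(params, bits):.2f} GB")
```

For a model of this hypothetical size, moving from 2-bit to 4-bit weights roughly doubles the weight footprint (about 0.75 GB to 1.5 GB), which is exactly the kind of cost that more device RAM absorbs in exchange for better model quality.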