
Ai2’s new Molmo open source AI models beat GPT-4o, Claude on some benchmarks


The models are not only high-performing but also entirely open, allowing researchers and developers to access and build upon them.

Ai2 also noted in a post on X that Molmo uses “1000x less data” than its proprietary rivals, thanks to some clever new training techniques described in greater detail below and in a technical report published by the Paul Allen-founded and Ali Farhadi-led company.

Multimodal pre-training: During this stage, the models are trained to generate captions using newly collected, detailed image descriptions written by human annotators.

The models also excel at visual grounding tasks, with Molmo-72B achieving top performance on RealWorldQA, making them especially promising for applications in robotics and complex multimodal reasoning.
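At its core, the pre-training stage described above is a next-token prediction objective: the model conditions on image features and learns to reproduce the dense, annotator-written caption. The sketch below illustrates that idea in generic PyTorch; the model layout, sizes, and fake data are hypothetical stand-ins, not Molmo's actual architecture or training code.

```python
# Minimal sketch of a caption-generation pre-training step, assuming a generic
# "vision encoder + language decoder" layout. This is NOT Ai2's Molmo code:
# module names, sizes, and the fake data below are hypothetical illustrations.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, D_MODEL, N_IMG_TOKENS = 1000, 256, 16

class TinyCaptioner(nn.Module):
    def __init__(self):
        super().__init__()
        # Vision side: collapse an image into a fixed number of "visual tokens".
        self.vision = nn.Sequential(
            nn.Conv2d(3, D_MODEL, kernel_size=16, stride=16),  # patchify 224x224 -> 14x14
            nn.AdaptiveAvgPool2d(4),                            # pool to 4x4 = 16 visual tokens
        )
        # Language side: token embeddings plus a small transformer stack.
        self.embed = nn.Embedding(VOCAB, D_MODEL)
        layer = nn.TransformerEncoderLayer(D_MODEL, nhead=4, batch_first=True)
        self.decoder = nn.TransformerEncoder(layer, num_layers=2)
        self.lm_head = nn.Linear(D_MODEL, VOCAB)

    def forward(self, images, caption_ids):
        vis = self.vision(images).flatten(2).transpose(1, 2)   # (batch, 16, d)
        txt = self.embed(caption_ids)                           # (batch, T, d)
        seq = torch.cat([vis, txt], dim=1)                      # prepend visual tokens
        # Causal mask so each position only attends to earlier positions.
        mask = nn.Transformer.generate_square_subsequent_mask(seq.size(1))
        hidden = self.decoder(seq, mask=mask)
        return self.lm_head(hidden[:, vis.size(1):])            # logits over caption slots

# One training step on fake data standing in for (image, human-written caption) pairs.
model = TinyCaptioner()
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)
images = torch.randn(2, 3, 224, 224)
captions = torch.randint(0, VOCAB, (2, 32))          # dense, annotator-written captions

logits = model(images, captions[:, :-1])             # predict each next caption token
loss = F.cross_entropy(logits.reshape(-1, VOCAB), captions[:, 1:].reshape(-1))
loss.backward()
opt.step()
print(f"caption LM loss: {loss.item():.3f}")
```

The point of the sketch is the shape of the objective, not the specifics: image features are injected as a prefix and the loss is ordinary language-model cross-entropy on the caption tokens, which is why unusually detailed human captions can substitute for far larger volumes of noisy web data.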



Related news:

Ai2’s Molmo shows open source can meet, and beat, closed multimodal models

A18 and A18 Pro benchmarks released

Metal-benchmarks: Apple GPU microarchitecture