MIT spinoff Liquid debuts non-transformer AI models and they’re already state-of-the-art


The startup from MIT's CSAIL says its Liquid Foundation Models have smaller memory needs thanks to a post-transformer architecture.

The models are engineered to be competitive not only on raw performance benchmarks but also in operational efficiency, making them suitable for a wide range of use cases, from enterprise applications in financial services, biotechnology, and consumer electronics to deployment on edge devices.

The approach builds on the team’s earlier research into liquid neural networks (LNNs). Unlike traditional deep learning models, which require thousands of neurons to perform complex tasks, LNNs demonstrated that fewer neurons, combined with innovative mathematical formulations, could achieve comparable results.

Labonne noted that while things are “not perfect,” the feedback received during this phase will help the team refine their offerings in preparation for a full launch event on October 23, 2024, at MIT’s Kresge Auditorium in Cambridge, MA.
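For readers curious what “fewer neurons combined with innovative mathematical formulations” can look like, below is a minimal sketch of a liquid time-constant (LTC) style neuron update, the ODE-based idea behind liquid neural networks. The class, weight names, Euler step size, and tanh nonlinearity are illustrative assumptions, not Liquid AI’s actual implementation.

# Illustrative sketch only: a tiny liquid time-constant (LTC) style cell.
# All names, shapes, and the Euler step below are assumptions for
# illustration, not Liquid AI's code.
import numpy as np

class LTCCell:
    def __init__(self, n_inputs: int, n_neurons: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.5, (n_neurons, n_inputs))    # input weights
        self.W_rec = rng.normal(0, 0.5, (n_neurons, n_neurons))  # recurrent weights
        self.bias = np.zeros(n_neurons)
        self.tau = np.ones(n_neurons)   # base time constants
        self.A = np.ones(n_neurons)     # per-neuron equilibrium targets

    def step(self, x: np.ndarray, u: np.ndarray, dt: float = 0.1) -> np.ndarray:
        """One Euler step of dx/dt = -x/tau + f(x, u) * (A - x).
        The learned nonlinearity f modulates the effective time constant,
        which is what makes the dynamics 'liquid'."""
        f = np.tanh(self.W_rec @ x + self.W_in @ u + self.bias)
        dxdt = -x / self.tau + f * (self.A - x)
        return x + dt * dxdt

# Usage: drive an 8-neuron cell with a 3-dimensional input signal.
cell = LTCCell(n_inputs=3, n_neurons=8)
state = np.zeros(8)
for t in range(100):
    u = np.array([np.sin(0.1 * t), np.cos(0.1 * t), 1.0])
    state = cell.step(state, u)
print(state.shape)  # (8,)

Because each neuron’s time constant changes with its input, a handful of such units can model sequences that would otherwise call for much larger static networks, which is the property the LNN research highlighted.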

Read the original story on VentureBeat.
