Liquid AI’s new STAR model architecture outshines Transformer efficiency


The STAR framework leverages evolutionary algorithms and a numerical encoding system to balance quality and efficiency in AI models.

The STAR framework applies evolutionary algorithms to numerically encoded candidate architectures, searching for designs that balance quality and efficiency in deep learning models. According to Liquid AI’s research team, which includes Armin W. Thomas, Rom Parnichkun, Alexander Amini, Stefano Massaroli, and Michael Poli, this approach represents a shift from traditional architecture design methods. While Liquid AI has yet to disclose specific plans for commercial deployment or pricing, the research findings signal a significant advance in automated architecture design.
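To make the general idea concrete, here is a minimal sketch of evolutionary search over numerically encoded architectures, in the spirit of what the paragraph above describes. Everything specific in it is an assumption for illustration: the genome layout, the operator choices, and the quality and efficiency proxies are hypothetical stand-ins, not Liquid AI’s published STAR encoding or scoring.

```python
import random

# Hypothetical numeric encoding: each architecture is a list of integers, where
# each gene picks one block type. The choices below and both scoring functions
# are illustrative placeholders, not Liquid AI's actual STAR genome.
GENOME_LENGTH = 8          # number of blocks in a candidate architecture (assumed)
CHOICES_PER_GENE = 4       # e.g. 0=attention, 1=gated conv, 2=SSM, 3=MLP (assumed)

def random_genome():
    return [random.randrange(CHOICES_PER_GENE) for _ in range(GENOME_LENGTH)]

def quality_proxy(genome):
    # Placeholder for a model-quality estimate (e.g. predicted validation loss).
    return sum(1.0 for g in genome if g in (0, 2)) / GENOME_LENGTH

def efficiency_proxy(genome):
    # Placeholder for an inference-cost estimate (lower gene id = cheaper here).
    return 1.0 - sum(genome) / (GENOME_LENGTH * (CHOICES_PER_GENE - 1))

def fitness(genome, weight=0.5):
    # Scalarized trade-off between the quality and efficiency objectives.
    return weight * quality_proxy(genome) + (1 - weight) * efficiency_proxy(genome)

def mutate(genome, rate=0.2):
    # Randomly resample some genes to explore nearby architectures.
    return [random.randrange(CHOICES_PER_GENE) if random.random() < rate else g
            for g in genome]

def crossover(a, b):
    # Single-point crossover between two parent genomes.
    cut = random.randrange(1, GENOME_LENGTH)
    return a[:cut] + b[cut:]

def evolve(pop_size=32, generations=20):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]   # keep the fitter half
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best genome:", best, "fitness:", round(fitness(best), 3))
```

A real system would replace these toy proxies with actual measurements, such as validation quality and inference cost, and could use multi-objective selection rather than a single weighted score.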


Read more on:

Transformer

Liquid AI

Related news:

TokenFormer: Rethinking Transformer Scaling with Tokenized Model Parameters

Oasis: A Universe in a Transformer

XLSTM won't replace the Transformer. Two bitter lessons