Liquid Foundation Models: Our First Series of Generative AI Models
Announcing the first series of Liquid Foundation Models (LFMs) – a new generation of generative AI models that achieve state-of-the-art performance at every scale, while maintaining a smaller memory footprint and more efficient inference.
The LFM design framework unifies and subsumes a wide range of existing computational units in deep learning, providing a systematic approach to exploring the space of architectures. LFMs are optimized for:

- General and expert knowledge
- Mathematics and logical reasoning
- Efficient and effective long-context tasks

Their primary language is English, with secondary multilingual capabilities in Spanish, French, German, Chinese, Arabic, Japanese, and Korean.

Come join us at MIT Kresge, Cambridge, MA on October 23rd, 2024 to learn more about Liquid as we unveil more products and progress on LFMs and their applications in consumer electronics, finance, healthcare, biotechnology, and more!