MiniMax unveils its own open-source LLM with industry-leading 4M token context
The LLM, MiniMax-Text-01, is of particular note for enabling up to 4 million tokens in its context window, equivalent to a small library's worth of books. The company stated: "We believe MiniMax-01 is poised to support the anticipated surge in agent-related applications in the coming year, as agents increasingly require extended context handling capabilities and sustained memory."

Unlike earlier architectures, Lightning Attention employs a mix of linear attention and traditional softmax attention layers, achieving near-linear complexity for long inputs.
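The article describes this hybrid design only at a high level. The sketch below illustrates the general idea behind mixing kernelized (linear) attention with occasional standard softmax attention: linear attention replaces the n-by-n score matrix with a fixed-size key-value summary, so its cost grows roughly linearly in sequence length. The feature map, the interleaving ratio, and all function names here are illustrative assumptions, not MiniMax's actual Lightning Attention implementation.

```python
import numpy as np

def softmax_attention(q, k, v):
    # Standard attention: builds an (n, n) score matrix, so cost is O(n^2 * d).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v

def linear_attention(q, k, v):
    # Kernelized attention: phi(Q) @ (phi(K)^T V) costs O(n * d^2),
    # near-linear in sequence length n for a fixed head dimension d.
    phi = lambda x: np.maximum(x, 0) + 1e-6  # simple positive feature map (assumed)
    kv = phi(k).T @ v                        # (d, d) summary, independent of n
    z = phi(q) @ phi(k).sum(axis=0)          # per-query normalizer
    return (phi(q) @ kv) / z[:, None]

def hybrid_block(x, softmax_every=4, n_layers=8):
    # Mostly linear layers, with an occasional softmax layer interleaved.
    # The ratio here is illustrative, not MiniMax's published configuration.
    # Causal masking and projection weights are omitted for brevity.
    for i in range(n_layers):
        attn = softmax_attention if (i + 1) % softmax_every == 0 else linear_attention
        x = x + attn(x, x, x)  # residual connection
    return x

x = np.random.randn(1024, 64)  # (sequence length, head dimension)
print(hybrid_block(x).shape)   # (1024, 64)
```

The key property is that `kv` is a (d, d) matrix regardless of how many tokens precede a query, which is what makes very long contexts tractable; the occasional softmax layers restore the exact pairwise attention that linear approximations give up.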
Or read this on VentureBeat