Mixtral 8x22B
Continuing to push the frontier of AI and making it accessible to all.
Mixtral 8x22B:

- is fluent in English, French, Italian, German, and Spanish;
- has strong maths and coding capabilities;
- is natively capable of function calling; along with the constrained output mode implemented on la Plateforme, this enables application development and tech stack modernisation at scale;
- has a 64K-token context window that allows precise information recall from large documents;
- strongly outperforms LLaMA 2 70B on the HellaSwag, Arc Challenge and MMLU benchmarks in French, German, Spanish and Italian.

Figure 3: Comparison of Mistral open source models and LLaMA 2 70B on HellaSwag, Arc Challenge and MMLU in French, German, Spanish and Italian.
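To make the function-calling capability concrete, here is a minimal sketch of the kind of request payload a client might build for a Mistral-style chat-completions API. The model identifier, the `get_exchange_rate` tool, and its parameters are illustrative assumptions, not taken from this post:

```python
import json

# Hypothetical tool schema in the JSON format used by
# chat-completions-style function calling (an assumption here).
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_exchange_rate",  # hypothetical function name
            "description": "Return the exchange rate between two currencies",
            "parameters": {
                "type": "object",
                "properties": {
                    "base": {"type": "string", "description": "Base currency, e.g. EUR"},
                    "quote": {"type": "string", "description": "Quote currency, e.g. USD"},
                },
                "required": ["base", "quote"],
            },
        },
    }
]

# Request body a client would POST to the chat-completions endpoint.
payload = {
    "model": "open-mixtral-8x22b",  # assumed model identifier
    "messages": [{"role": "user", "content": "What is EUR/USD today?"}],
    "tools": tools,
    "tool_choice": "auto",
}

print(json.dumps(payload, indent=2))
```

The model would respond with a structured tool call (the function name plus JSON arguments) rather than free text, which the application can then execute and feed back.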