
Microsoft’s GRIN-MoE AI model takes on coding and math, beating competitors in key benchmarks


Microsoft's new AI model, GRIN-MoE, delivers groundbreaking scalability and performance on coding and math tasks, outperforming competitors such as GPT-3.5 and LLaMA 3. That combination makes it a powerful tool for enterprise applications.

GRIN scales without expert parallelism or token dropping, two common techniques for managing large mixture-of-experts models, which makes it a more accessible option for organizations that lack the infrastructure to support bigger models like OpenAI's GPT-4o or Meta's LLaMA 3.1. Its ability to "scale MoE training with neither expert parallelism nor token dropping" allows for more efficient resource usage in environments with constrained data center capacity. Because it scales efficiently while maintaining strong performance on coding and mathematical tasks, GRIN-MoE is positioned as a valuable tool for businesses looking to integrate AI without overwhelming their computational resources.
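To make the terminology concrete: in a mixture-of-experts (MoE) layer, a router sends each token to a small number of experts, and capacity-limited implementations "drop" tokens that overflow an expert's quota. The sketch below is a minimal, hypothetical illustration of top-2 routing with no capacity limit, so no token is ever dropped; it is not Microsoft's GRIN-MoE code, and all weights and dimensions are made up for the example.

```python
# Illustrative sketch only (hypothetical weights, not GRIN-MoE's actual code):
# top-2 mixture-of-experts routing with no expert capacity limit, so every
# token is always processed ("no token dropping").
import numpy as np

rng = np.random.default_rng(0)

num_tokens, hidden, num_experts, top_k = 8, 16, 4, 2

# Hypothetical router and per-expert weights.
router_w = rng.normal(size=(hidden, num_experts))
expert_w = rng.normal(size=(num_experts, hidden, hidden))

tokens = rng.normal(size=(num_tokens, hidden))

# Router logits -> softmax gate probabilities for each token.
logits = tokens @ router_w
probs = np.exp(logits - logits.max(axis=-1, keepdims=True))
probs /= probs.sum(axis=-1, keepdims=True)

# Each token picks its top-k experts; with no capacity cap, none overflow.
top_idx = np.argsort(-probs, axis=-1)[:, :top_k]

# Output is the gate-weighted sum of the chosen experts' transforms.
out = np.zeros_like(tokens)
for t in range(num_tokens):
    for e in top_idx[t]:
        out[t] += probs[t, e] * (tokens[t] @ expert_w[e])

print(out.shape)
```

In capacity-limited variants, each expert would accept only a fixed number of tokens per batch and the rest would be skipped; removing that cap (as in this sketch) trades stricter load balancing for the guarantee that every token contributes to the output.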


Or read this on Venture Beat


Related news:

- $100 billion AI infrastructure fund launched by Microsoft, BlackRock, UAE firm
- Microsoft launches a Windows app for iPhones, Macs, and Android devices
- The new Microsoft Flight Simulator will be a lot smaller