China's DeepSeek Coder Becomes First Open-Source Coding Model To Beat GPT-4 Turbo


Shubham Sharma reports via VentureBeat: Chinese AI startup DeepSeek, which previously made headlines with a ChatGPT competitor trained on 2 trillion English and Chinese tokens, has announced the release of DeepSeek Coder V2, an open-source mixture of experts (MoE) code language model. Built upon De...

Founded last year with a mission to "unravel the mystery of AGI with curiosity," DeepSeek has been a notable Chinese player in the AI race, joining the likes of Qwen, 01.AI and Baidu. The original DeepSeek Coder, with up to 33 billion parameters, performed decently on benchmarks, offering capabilities like project-level code completion and infilling, but it supported only 86 programming languages and a 16K context window. The new version expands language support to 338 and the context window to 128K. When tested on the MBPP+, HumanEval, and Aider benchmarks, which evaluate the code generation, editing, and problem-solving capabilities of LLMs, DeepSeek Coder V2 scored 76.2, 90.2, and 73.7, respectively -- sitting ahead of most closed and open-source models, including GPT-4 Turbo, Claude 3 Opus, Gemini 1.5 Pro, Codestral and Llama-3 70B.
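
The report doesn't include usage details, but since the weights are open source, a release like this would typically be runnable through Hugging Face Transformers. Below is a minimal sketch under that assumption; the Hub ID deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct, the chat-template prompt format, and the generation settings are illustrative assumptions, not details from the article.

# Minimal sketch: chat-style code generation with an assumed
# DeepSeek Coder V2 checkpoint via Hugging Face Transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID; the full MoE checkpoint is far larger, so the
# smaller "Lite" variant is used here for illustration.
MODEL_ID = "deepseek-ai/DeepSeek-Coder-V2-Lite-Instruct"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # halve memory versus float32
    device_map="auto",           # spread layers across available devices
    trust_remote_code=True,
)

prompt = "Write a Python function that checks whether a string is a palindrome."
messages = [{"role": "user", "content": prompt}]

# Format the conversation with the model's own chat template,
# then generate a deterministic (greedy) completion.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))

The bfloat16 weights and device_map="auto" keep memory use manageable on a single GPU for the smaller variant assumed here; serving the full mixture-of-experts model would require a multi-GPU setup.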
