DeepSeek-Coder-V2

DeepSeek

DeepSeek-Coder-V2 is an open-source Mixture-of-Experts (MoE) code language model. It was further pre-trained from an intermediate checkpoint of DeepSeek-V2 with an additional 6 trillion tokens. The original V1 model was trained from scratch on 2 trillion tokens, composed of 87% code and 13% natural language in both English and Chinese, and was pre-trained on a project-level code corpus with an extra fill-in-the-blank task; a rough sketch of that task at inference time follows below.
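
As a rough illustration of the fill-in-the-blank (fill-in-the-middle) objective mentioned above, the sketch below shows how a gap in code might be infilled with the base model via Hugging Face transformers. The checkpoint name and the FIM sentinel tokens (`<｜fim▁begin｜>`, `<｜fim▁hole｜>`, `<｜fim▁end｜>`) follow the DeepSeek-Coder convention but are assumptions, not details given in this entry.

```python
# Minimal sketch of a fill-in-the-middle (FIM) request.
# The checkpoint name and sentinel tokens are assumed, not taken from this entry.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "deepseek-ai/DeepSeek-Coder-V2-Lite-Base"  # assumed checkpoint name

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)

# The model is asked to fill the gap marked by the "hole" sentinel.
prompt = (
    "<｜fim▁begin｜>def quick_sort(arr):\n"
    "    if len(arr) <= 1:\n"
    "        return arr\n"
    "    pivot = arr[0]\n"
    "<｜fim▁hole｜>\n"
    "    return quick_sort(left) + [pivot] + quick_sort(right)\n"
    "<｜fim▁end｜>"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)

# Decode only the newly generated tokens, i.e. the infilled middle section.
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
print(tokenizer.decode(new_tokens, skip_special_tokens=True))
```

In this setup the model sees the code before and after the gap and generates only the missing middle, which is the behavior the fill-in-the-blank pre-training task is meant to teach.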

ANALYSIS STATUS

No analysis yet

This model exists in our database, but we publish political alignment results only after its full analysis run is complete.

A full run currently requires 114 completed bill analyses.