DeepSeek Releases V3.1 — an Open-Source Model with a 128k Context Window, Outperforming Peers in Coding

On August 20, 2025, the Chinese company DeepSeek AI released DeepSeek V3.1, an update to its flagship open-source model. The release has caused a stir in the developer community thanks to its specifications and performance. The model uses a Mixture-of-Experts (MoE) architecture with 685 billion total parameters and an expanded context window of up to 128,000 tokens, allowing it to process and analyze large bodies of text, such as entire codebases or documentation sets, in a single pass. According to published benchmarks, DeepSeek V3.1 delivers strong results on programming and logical reasoning tasks, outperforming many leading proprietary and open-source models, including Claude 3.5 Sonnet. The model's availability under a permissive license on platforms like Hugging Face is accelerating its adoption and strengthening DeepSeek's position as a key player in the open-source AI world.
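For readers who want to try the model, below is a minimal sketch of loading it from Hugging Face with the `transformers` library. The repo id `deepseek-ai/DeepSeek-V3.1` is an assumption based on DeepSeek's usual naming, and the full 685-billion-parameter checkpoint requires a multi-GPU server, so treat this as illustrative rather than a turnkey recipe.

```python
# Illustrative sketch: loading DeepSeek V3.1 from Hugging Face.
# Assumptions: the repo id below matches the published checkpoint,
# and the host machine has enough GPU memory to shard the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-V3.1"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",       # shard layers across available GPUs
    torch_dtype="auto",      # keep the dtype stored in the checkpoint
    trust_remote_code=True,  # DeepSeek checkpoints may ship custom code
)

# Simple generation request exercising the model's coding strength.
prompt = "Write a Python function that reverses a linked list."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice, most users will access a model of this size through a hosted inference endpoint rather than local weights; the snippet above simply shows the standard Hugging Face loading path.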
