Chinese technology firm DeepSeek this week unveiled an updated version of its foundational artificial intelligence model, DeepSeek-V3-0324, with significant improvements in code generation and mathematical reasoning.
The new model, which draws on the company's R1 reasoning system and has 685 billion parameters, has been developed as a general-purpose tool, useful for both conversational assistants and web development tasks.
MIT License
Unlike previous versions, the model is released under the MIT License, a permissive open-source license that allows use, modification, and redistribution of the software without commercial restrictions and is the most widely used license on the GitHub developer platform.
Among the most significant advances, the model scored 59.4 on the American Invitational Mathematics Examination (AIME), a prestigious US mathematics competition used as a reasoning benchmark, well above the 39.6 achieved by its predecessor.
In coding benchmarks such as LiveCodeBench, it improved by 10 points, reaching 49.2.
The model was able to generate more than 800 lines of error-free code and to produce more than 20 tokens per second; tokens are the fragments of text, such as words or parts of words, that language models process.
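As a rough illustration of what a token is and how a tokens-per-second figure is obtained, the sketch below uses the Hugging Face transformers library; the repository name and the example numbers are assumptions for illustration, not DeepSeek's own measurement code.

```python
# Illustrative sketch only: what a "token" is and how tokens-per-second is computed.
# Assumes the `transformers` package is installed and that the tokenizer can be
# downloaded from the "deepseek-ai/DeepSeek-V3-0324" repository on Hugging Face.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/DeepSeek-V3-0324")

text = "DeepSeek-V3-0324 improves code generation and mathematical reasoning."
token_ids = tokenizer.encode(text)
print(len(token_ids), "tokens:", tokenizer.convert_ids_to_tokens(token_ids))

# Throughput is simply tokens generated divided by elapsed wall-clock time.
generated_tokens = 1200   # example figure: tokens produced in one response
elapsed_seconds = 60.0    # example figure: time taken to produce them
print(f"throughput: {generated_tokens / elapsed_seconds:.1f} tokens/s")  # 20.0 tokens/s
```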
New architecture
The system is based on an architecture known as Mixture-of-Experts, which distributes tasks among different specialized modules to increase efficiency without significantly increasing computational costs.
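A minimal sketch of the Mixture-of-Experts idea follows; it is not DeepSeek's actual implementation (which adds shared experts, load balancing and other refinements), but it shows the core mechanism: a router scores each token and only the top-scoring expert networks are run for it, so most parameters stay idle on any given input.

```python
# Minimal Mixture-of-Experts layer in PyTorch (illustrative only).
import torch
import torch.nn as nn

class TinyMoE(nn.Module):
    def __init__(self, dim=64, num_experts=8, top_k=2):
        super().__init__()
        # The router produces one score per expert for every token.
        self.router = nn.Linear(dim, num_experts)
        # Each "expert" is a small feed-forward network.
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        ])
        self.top_k = top_k

    def forward(self, x):  # x: (num_tokens, dim)
        # Pick the top_k experts per token; only those experts are evaluated.
        weights, chosen = self.router(x).softmax(dim=-1).topk(self.top_k, dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(10, 64)
print(TinyMoE()(tokens).shape)  # torch.Size([10, 64]); only 2 of 8 experts run per token
```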
DeepSeek announced the release in a post on X:

DeepSeek-V3-0324 is out now!
Major boost in reasoning performance
Stronger front-end development skills
Smarter tool-use capabilities
For non-complex reasoning tasks, we recommend using V3 — just turn off “DeepThink”
API usage remains unchanged
Models are… pic.twitter.com/QVuPwCODne
— DeepSeek (@deepseek_ai) March 25, 2025
Users and engineers in the field have praised the model for its mathematical precision, its ability to generate large amounts of functional code, and its potential to compete with the sector's leading firms.
The model, available on Hugging Face, a global repository where developers and researchers share open-access AI models, and on DeepSeek’s official platforms, has already been integrated into cloud services from startups like Hyperbolic.
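For readers who want to try the model, it can also be called through DeepSeek's hosted API, which follows the OpenAI-compatible chat-completions format; the endpoint URL, model name, and environment variable below are assumptions that may differ from the platform's current settings.

```python
# Hedged example: calling the hosted model via DeepSeek's OpenAI-compatible API.
# Assumes the `openai` package is installed, a DEEPSEEK_API_KEY environment variable
# is set, and that the "deepseek-chat" model name currently maps to DeepSeek-V3-0324.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",   # assumed public endpoint
)

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[{"role": "user", "content": "Write a one-line Python hello world."}],
)
print(response.choices[0].message.content)
```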
“The coding capabilities are much stronger, and the new version could pave the way for the launch of the R2,” Li Bangzhu, founder of the specialized website AIcpb.com, was quoted as saying by Hong Kong’s South China Morning Post.
China and the United States
This progress comes amid intense technological competition between China and the United States.
Washington has imposed export controls on the advanced semiconductors essential for training AI models and has moved to ban Chinese apps such as TikTok.
Beijing, for its part, maintains its block on American services such as Google, Facebook, X, and Instagram, and requires that AI systems respect "core socialist values", prohibiting content that endangers national security or territorial unity.