DeepSeek AI · Released 2026-04
DeepSeek V4
Mixture-of-experts flagship released April 2026. Tops the open leaderboard for reasoning and coding; distilled 236B and 67B variants are available for self-hosters.
DeepSeek · Commercial use OK · general · reasoning · code
Params (max)
685B
Variants
685B / 236B / 67B
Context window
256K tokens
MMLU
89.4
HumanEval
92.1
GSM8K
95.2
Min VRAM (fp16, smallest variant)
80GB
Smallest Q4 GGUF
~40GB
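The VRAM rows above follow the usual rule of thumb: weight memory is parameter count times bits per parameter, plus headroom for KV cache and activations. A rough sizing sketch (the 20% overhead factor is an assumption, not a figure from this card):

```python
def est_vram_gb(params_billion: float, bits_per_param: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weights at the given precision, plus ~20%
    headroom for KV cache and activations (the 1.2 factor is an assumption)."""
    weights_gb = params_billion * bits_per_param / 8
    return weights_gb * overhead

# 67B variant at 4-bit quantization, roughly in line with the ~40GB Q4 GGUF figure:
print(round(est_vram_gb(67, 4), 1))  # → 40.2
```

Long-context use pushes the KV cache well past that 20% headroom, so treat these as floor values.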
Languages supported
30
Pros
- ✓ Frontier reasoning quality
- ✓ Permissive license
- ✓ Distilled small variants available
Cons
- × 685B variant needs an H100 cluster
- × MoE inference stacks still maturing
Highlights
- ● Open weights, commercial use allowed
- ● MoE — 22B active params per token
- ● Tops HumanEval among open models
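"22B active params per token" refers to sparse mixture-of-experts routing: a small gate scores every expert for each token, and only the top-k experts actually run, so compute scales with active rather than total parameters. A toy sketch of top-k routing (expert count and k are illustrative, not DeepSeek's actual configuration):

```python
import math

def top_k_route(gate_logits, k=2):
    """Pick the k highest-scoring experts and softmax-normalize their gate
    weights; only those experts' parameters are 'active' for this token."""
    top = sorted(range(len(gate_logits)), key=lambda i: gate_logits[i], reverse=True)[:k]
    exps = [math.exp(gate_logits[i]) for i in top]
    total = sum(exps)
    return [(i, e / total) for i, e in zip(top, exps)]

# 4 toy experts; token is routed to the two with the highest gate scores
routes = top_k_route([0.1, 2.0, -1.0, 0.5], k=2)
print(routes[0][0])  # → 1 (expert with logit 2.0 wins)
```

This is also why the "MoE inference stacks still maturing" con matters: serving frameworks must shuffle tokens between experts efficiently, which is harder to optimize than dense inference.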
Where to download
Hugging Face: deepseek-ai/DeepSeek-V4
Or via Ollama (ollama pull deepseek-v4) or LM Studio's in-app browser.
Homepage: https://deepseek.com
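Wherever you download from, it is worth checking the file hash against the one published on the model page before loading multi-gigabyte weights. A minimal verification sketch (the filename and expected digest in the comment are placeholders, not real values):

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so a multi-GB GGUF never has to
    fit in RAM; returns the hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the checksum published alongside the weights (placeholders):
# assert sha256_of("deepseek-v4-67b-q4.gguf") == "<published sha256>"
```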
Related reading
Best Open Source LLMs 2026: Honest Picks by Use Case
Which open-source LLM should you actually run in 2026? Honest picks by use case — frontier reasoning, coding, RAG, edge devices, multilingual.
DeepSeek V4 vs Llama 4: Which Open Frontier Model Should You Run?
DeepSeek V4 just topped the open leaderboard. Should you switch from Llama 4 405B? Side-by-side on benchmarks, license, hardware, and ecosystem.
Open Source LLMs vs Claude / GPT in 2026: When Does Open Win?
Open-source LLMs caught up to GPT-4 in 2024 and Claude Opus in 2026 — but should you actually switch? Cost, quality, latency, privacy compared.
Where to Download Open LLM Weights Safely in 2026
Hugging Face is the default but not the only option. Mirrors, torrents, official sources, and how to verify checksums.