Allen AI · Released 2025-11
OLMo 2
Allen AI's truly open model — weights, training code, training data, and checkpoints all released. The most reproducible model on this list.
Apache-2.0 · Commercial use OK · general
Params (max): 32B
Variants: 32B / 13B / 7B
Context window: 8K tokens
MMLU: 71.2
HumanEval: 60
GSM8K: 78.5
Min VRAM (fp16, smallest variant): 16GB
Smallest Q4 GGUF: ~4GB
Languages supported: 1
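The VRAM and GGUF size figures above follow directly from parameter count and bits per weight. A rough sketch of that arithmetic (assumptions: weight storage only, ignoring activation memory, KV cache, and runtime overhead; Q4 GGUF formats are taken as roughly 4.5 effective bits per weight including quantization metadata):

```python
def model_size_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate weight-storage size in GB: params × bits/8, no overhead."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# 7B variant in fp16 (16 bits/weight): ~14 GB of weights, which lines up
# with the 16GB "Min VRAM" figure once runtime overhead is added.
fp16_7b = model_size_gb(7, 16)

# 7B variant at Q4 (~4.5 bits/weight assumed): just under 4 GB,
# consistent with the "~4GB smallest Q4 GGUF" figure.
q4_7b = model_size_gb(7, 4.5)

print(f"7B fp16 ≈ {fp16_7b:.1f} GB, 7B Q4 ≈ {q4_7b:.1f} GB")
```

The same formula explains why the 32B variant is out of reach for most single consumer GPUs at fp16 (~64 GB of weights alone) but plausible at Q4 (~18 GB).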
Pros
- ✓ Full transparency
- ✓ Apache-2.0
- ✓ Backed by AI2
Cons
- × Lower benchmarks than closed-data peers
- × 8K context
Highlights
- ● 100% open: weights + code + data
- ● Apache-2.0
- ● Best for research reproducibility
Where to download
Hugging Face: allenai/OLMo-2-32B
Or via Ollama (ollama pull olmo-2) or LM Studio's in-app browser.
Homepage: https://allenai.org/olmo
Related reading
Open Source LLM Licenses Explained: Llama vs Apache vs Gemma vs MIT
Can you use Llama in a commercial product? What does the Gemma license actually restrict? A plain-English breakdown of every major open LLM license.
Fine-Tuning an Open Source LLM in 2026: LoRA vs QLoRA vs Full Fine-Tune
Should you LoRA, QLoRA, or full fine-tune your open LLM? Honest tradeoffs, GPU requirements, and a decision tree.