Allen AI · Released 2025-11

OLMo 2

Allen AI's fully open model: weights, training code, training data, and intermediate checkpoints are all released, making it the most reproducible model on this list.

Apache-2.0 · Commercial use OK · general
Params (max): 32B
Variants: 32B / 13B / 7B
Context window: 8K tokens
MMLU: 71.2
HumanEval: 60
GSM8K: 78.5
Min VRAM (fp16, smallest variant): 16GB
Smallest Q4 GGUF: ~4GB
Languages supported: 1
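The VRAM figures above follow from a simple rule of thumb: weight memory is roughly parameter count times bytes per parameter. A minimal sketch (the function name and the weights-dominate-memory assumption are illustrative, not from the card):

```python
# Back-of-envelope estimate of memory needed to hold model weights.
# Assumes weights dominate; KV cache and activations add real-world overhead.

def weight_memory_gb(num_params_billion: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB."""
    return num_params_billion * 1e9 * bytes_per_param / 1024**3

fp16_7b = weight_memory_gb(7, 2.0)  # fp16 stores 2 bytes per parameter
q4_7b = weight_memory_gb(7, 0.5)    # 4-bit quantization: ~0.5 bytes per parameter

print(f"7B fp16: ~{fp16_7b:.1f} GiB")  # ~13 GiB; the card's 16GB minimum leaves headroom
print(f"7B Q4:   ~{q4_7b:.1f} GiB")    # ~3.3 GiB; consistent with the ~4GB GGUF figure
```

The same arithmetic explains why the 32B variant needs roughly 64GB in fp16 but fits in about 16-20GB at Q4.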
Pros
  • Full transparency
  • Apache-2.0
  • Backed by AI2
Cons
  • Lower benchmarks than closed-data peers
  • 8K context

Highlights

  • 100% open: weights + code + data
  • Apache-2.0
  • Best for research reproducibility

Where to download

Hugging Face: allenai/OLMo-2-32B
Also available via Ollama (ollama pull olmo-2) or LM Studio's in-app browser.
