

Musk Open-Sources Grok 2.5, Taking the Fight to OpenAI
Aug 25, 2025
Elon Musk has made waves by open-sourcing Grok 2.5, a language model designed to rival GPT-4. The move marks a significant shift in AI accessibility: developers can now freely use and modify Grok under the Apache 2.0 license. The discussion dives into the model's design, including its colossal 314-billion-parameter version and its smaller variant, and explores Musk's motivations for challenging big tech companies, emphasizing his belief in openness over corporate secrecy. The implications for developers and for the competitive AI landscape are also highlighted.
Episode notes
Open-Sourcing At Scale
- xAI released Grok 2.5 under Apache 2.0, letting anyone use and modify the model commercially at no cost (the license does require preserving copyright and license notices).
- This makes Grok 2.5 one of the most permissively licensed large language models at this scale.
Two Model Sizes Explained
- Grok 2.5 comes in a 314B-parameter mixture-of-experts (MoE) version and a smaller 25B-parameter dense variant for lighter deployments.
- xAI says both were trained on publicly available and licensed data sources.
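A key point about MoE models is that the headline parameter count overstates per-token compute: every expert counts toward the total, but each token is routed through only a few of them. The sketch below illustrates that arithmetic with hypothetical numbers (the expert count, routing top-k, and parameter split are placeholders, not Grok 2.5's published configuration):

```python
# Illustrative arithmetic for a mixture-of-experts (MoE) model:
# total parameters count every expert, but each token routes
# through only the top-k experts, so the "active" parameter
# count per token is much smaller than the total.
# All numbers below are hypothetical, not Grok 2.5's actual config.

def active_params(total: float, expert_total: float,
                  n_experts: int, top_k: int) -> float:
    """Parameters touched per token: shared weights plus top-k experts."""
    shared = total - expert_total        # attention, embeddings, routers, etc.
    per_expert = expert_total / n_experts
    return shared + top_k * per_expert

# Hypothetical split: 314B total, 250B of it spread over 8 experts, route to 2.
print(active_params(314e9, 250e9, 8, 2) / 1e9)  # 126.5 (billions per token)
```

This is why a 314B-total MoE can serve tokens at a cost closer to a mid-sized dense model, while the 25B dense variant runs all of its weights on every token.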
Benchmark Parity Claim
- Grok 2.5 reportedly ranks close to GPT-4 on benchmarks like MMLU and HumanEval, according to xAI's internal evaluations.
- That narrows the gap between xAI's models and leading closed-source systems.
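For context on what an MMLU score measures: the benchmark is multiple-choice (A–D), and the reported number is simply the fraction of questions where the model's chosen letter matches the answer key. A minimal sketch of that scoring, using made-up predictions rather than real Grok 2.5 or GPT-4 outputs:

```python
# MMLU-style scoring sketch: accuracy is the fraction of
# multiple-choice questions answered with the correct letter.
# The predictions and key below are hypothetical stand-ins.

def mmlu_accuracy(predictions: list[str], answer_key: list[str]) -> float:
    assert len(predictions) == len(answer_key)
    correct = sum(p == a for p, a in zip(predictions, answer_key))
    return correct / len(answer_key)

preds = ["A", "C", "B", "D", "C"]  # hypothetical model choices
key   = ["A", "C", "D", "D", "B"]  # hypothetical answer key
print(mmlu_accuracy(preds, key))   # 0.6
```

Real evaluations add prompt templates and answer extraction on top of this, which is one reason internally reported scores can differ from independent runs.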