TechCrunch Startup News

Tiny startup Arcee AI built a 400B-parameter open-source LLM from scratch to best Meta’s Llama

Jan 29, 2026
A tiny startup, Arcee AI, built a 400B-parameter open-source large language model called Trinity. The episode details the model's rapid six-month training and a $20M push to rival the major tech companies' models. The conversation covers Trinity’s strengths in coding and reasoning, plans to add vision and speech, and a strategy to attract developers and U.S. enterprises.
INSIGHT

Tiny Team, Big Model

  • Arcee AI released Trinity, a 400B-parameter open-source foundation model claiming parity with top open models.
  • The model currently supports text only, but benchmarks show competitive performance on coding, math, and reasoning.
INSIGHT

Strategy: Text First, Multimodal Next

  • Trinity targets developers and academics first, focusing on being the best open-weight model to win adoption.
  • The team plans to add vision and speech later, but prioritized a strong text LLM to attract U.S. customers.
INSIGHT

Fast, Frugal Training

  • Arcee AI trained multiple models in six months for about $20M using 2,048 NVIDIA Blackwell B300 GPUs.
  • That rapid, cost-efficient training surprised industry observers given the small 30-person team.