Arcee AI goes all-in on open models built in the U.S.

Interconnects

Post-training for MoEs and integrating SFT

Lucas explains adding fine-tuning to TorchTide, context parallelism, and the SFT/RL workflow for MoEs.

Segment starts at 20:40.
