Alter Everything

196: AI Model Strategy in Translation

Oct 22, 2025
Olga Beregovaya, Vice President of AI at Smartling, shares her extensive experience in language technology, diving into the evolution from rule-based to transformer models. She highlights the advantages of purpose-built AI for translation over general models. Olga also discusses Smartling's multi-model translation stack and the operational challenges of managing various models. The conversation also covers how to tackle bias in translation and the benefits of curated data, particularly in risk-sensitive sectors like life sciences.
AI Snips
INSIGHT

Transformers Were The Major Breakthrough

  • Transformer models were the biggest leap in language technology after statistical methods replaced rule-based systems.
  • Olga Beregovaya highlights that attention-based transformers unlocked capabilities previously impossible with older approaches.
ADVICE

Prefer Purpose-Built Models For Translation

  • Use purpose-built models for translation tasks instead of relying solely on generalized foundational models.
  • Fine-tune and curate task-specific corpora to dramatically improve translation quality and predictability (a fine-tuning sketch follows this list).
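
One concrete way to act on this advice is to fine-tune an off-the-shelf translation model on a small, curated in-domain corpus. The sketch below is only illustrative: the Hugging Face base model, corpus file path, and hyperparameters are assumptions, not details from the episode or from Smartling's pipeline.

```python
# Minimal sketch: fine-tuning a purpose-built translation model on a curated,
# task-specific corpus. Model name, paths, and hyperparameters are illustrative.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "Helsinki-NLP/opus-mt-en-de"  # assumed base translation model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Curated in-domain corpus: one JSON object per line,
# e.g. {"source": "Adverse event reporting form", "target": "Meldebogen ..."}
dataset = load_dataset("json", data_files={"train": "curated_corpus.jsonl"})

def preprocess(batch):
    # Tokenize source and target sides; truncation keeps segments model-sized.
    model_inputs = tokenizer(batch["source"], max_length=256, truncation=True)
    labels = tokenizer(text_target=batch["target"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset["train"].map(
    preprocess, batched=True, remove_columns=["source", "target"]
)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="opus-mt-en-de-lifesci",  # assumed output path
        per_device_train_batch_size=16,
        num_train_epochs=3,
        learning_rate=2e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

The point of the sketch is the data, not the trainer: a small, carefully curated bilingual corpus in the target domain is what moves quality and predictability, especially in risk-sensitive sectors like life sciences.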
ANECDOTE

Smartling's Multi-Model, 40-Prompt Pipeline

  • Smartling runs a model portfolio across Vertex AI, WatsonX, Bedrock, Azure, and OpenAI while remaining model-agnostic.
  • Their R&D team maintains roughly 40 prompts and an ecosystem that can swap models as newer versions prove better for specific tasks (a model-agnostic routing sketch follows this list).
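
To make the anecdote concrete, here is a minimal sketch of what a model-agnostic layer can look like: every provider sits behind one interface, and a routing table maps a content type to whichever model currently performs best, so swapping a model is a configuration change rather than a code change. The provider names, task keys, and placeholder client are hypothetical; this is not Smartling's implementation.

```python
# Minimal sketch of a model-agnostic translation layer with a swappable routing table.
from dataclasses import dataclass
from typing import Protocol


class TranslationModel(Protocol):
    def translate(self, text: str, source_lang: str, target_lang: str) -> str: ...


@dataclass
class EchoModel:
    """Placeholder standing in for a real provider client (Vertex AI, Bedrock, etc.)."""
    name: str

    def translate(self, text: str, source_lang: str, target_lang: str) -> str:
        # A real implementation would call the provider's SDK here.
        return f"[{self.name} {source_lang}->{target_lang}] {text}"


# Routing table: content type -> model. Swapping a model for a task is a one-line change.
REGISTRY: dict[str, TranslationModel] = {
    "marketing": EchoModel("provider-a"),
    "life_sciences": EchoModel("provider-b"),  # risk-sensitive content routed separately
    "default": EchoModel("provider-c"),
}


def translate(text: str, source_lang: str, target_lang: str, task: str = "default") -> str:
    model = REGISTRY.get(task, REGISTRY["default"])
    return model.translate(text, source_lang, target_lang)


if __name__ == "__main__":
    print(translate("Adverse event reporting form", "en", "de", task="life_sciences"))
```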