
The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Long Context Language Models and their Biological Applications with Eric Nguyen - #690

Jun 25, 2024
Eric Nguyen, a PhD student at Stanford, dives deep into his research on long-context foundation models, specifically Hyena and its applications in biology. He explains why traditional transformers struggle to process lengthy sequences and how convolutional models offer a solution. Nguyen introduces Hyena DNA, a model designed to capture long-range dependencies in DNA, and discusses Evo, a large hybrid model that pairs attention layers with Hyena-style convolutions for DNA generation and design. The conversation also covers applications in CRISPR gene editing and the broader implications of using AI in biological research.
45:41


Podcast summary created with Snipd AI

Quick takeaways

  • Hyena DNA aims to capture long-range dependencies in DNA sequences with a genomic foundation model pre-trained at context lengths of up to 1 million tokens.
  • The Evo model integrates attention layers with Hyena's convolutional framework, improving performance in DNA sequence generation and design.

Deep dives

Research into Long Sequence Foundation Models in Biology

Long sequence foundation models like Hyena and Hyena DNA are explored for their application in biology, focusing on DNA sequences. These models aim to address the challenge of processing longer context lengths in language models and other sequence models, providing a solution for biological applications.
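The core scaling argument is that attention compares every token pair, which is quadratic in sequence length, while Hyena-style models use long convolutions that can be evaluated with the FFT in roughly n log n time. The sketch below illustrates that idea in plain NumPy; it is not the authors' code, just a minimal FFT-based long convolution checked against a direct convolution.

```python
import numpy as np

def long_convolution(x, h):
    """Convolve input x with a filter h of the same length via the FFT.

    Dense attention over n tokens computes n^2 pairwise scores; an
    FFT-based convolution over the same sequence costs O(n log n),
    which is what makes million-token contexts tractable.
    """
    n = len(x)
    size = 2 * n  # zero-pad so the circular FFT convolution becomes linear
    X = np.fft.rfft(x, size)
    H = np.fft.rfft(h, size)
    # Multiply in the frequency domain, transform back, keep the causal part.
    return np.fft.irfft(X * H, size)[:n]

# Toy check: the FFT result matches a direct O(n^2) convolution.
x = np.random.randn(8)
h = np.random.randn(8)
direct = np.convolve(x, h)[:8]
assert np.allclose(long_convolution(x, h), direct)
```

At a length of 1 million tokens, the pairwise-attention count would be on the order of 10^12 scores, while a single FFT pass over the same sequence remains cheap, which is the practical motivation for the convolutional approach discussed in the episode.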
