
129 - Transformers and Hierarchical Structure, with Shunyu Yao

NLP Highlights


Introduction

The hosts are members of the AllenNLP team at the Allen Institute for AI. They talk about interesting work in natural language processing. Host: I was really excited to see that Shunyu is first author on this paper, called "Self-Attention Networks Can Process Bounded Hierarchical Languages." Shunyu Yao: Thanks for having me here. I've also read some of the other work, and I'm also excited to talk about that if we have time.

