

ML 005: Transfer Learning for NLP with Daniel Svoboda
Sep 15, 2020
54:13
One of the hottest fields in machine learning right now is natural language processing. Whether it’s getting sentiment from tweets, summarizing your documents, detecting sarcasm, or predicting stock trends from the news, NLP is definitely the wave of the future. Special guest Daniel Svoboda talks about transfer learning and the latest developments, such as BERT, that promise to revolutionize NLP even further.
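As a quick companion to the episode topic, here is a minimal sketch of transfer learning for NLP: fine-tuning a pretrained BERT model for sentiment classification with the Hugging Face transformers library. The model name, labels, and example texts are illustrative assumptions, not content from the episode.

```python
# Minimal transfer-learning sketch, assuming `transformers` and `torch` are installed.
# The model name, labels, and texts below are illustrative placeholders.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "bert-base-uncased"  # pretrained BERT to be fine-tuned on labeled data
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

texts = ["Great quarter for the company!", "This product is a disappointment."]
labels = torch.tensor([1, 0])  # 1 = positive sentiment, 0 = negative sentiment

# Tokenize the batch and take one fine-tuning step on the classification head + BERT.
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
```

In practice you would loop this step over a labeled training set; the point of transfer learning is that the pretrained BERT weights already encode general language knowledge, so only a small amount of task-specific fine-tuning is needed.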
Advertising Inquiries: https://redcircle.com/brands
Privacy & Opt-Out: https://redcircle.com/privacy
Become a supporter of this podcast: https://www.spreaker.com/podcast/adventures-in-machine-learning--6102041/support.
Panel

- Charles Max Wood
- Gant Laborde
- Daniel Svoboda
- towardsdatascience.com/bert-explained-state-of-the-art-language-model-for-nlp
- ai.googleblog.com/2018/11/open-sourcing-bert-state-of-art-pre.html
- ai.googleblog.com/2017/08/transformer-novel-neural-network.html
- www.nltk.org
- spacy.io
- https://www.kaggle.com
- Traffic Secrets: The Underground Playbook for Filling Your Websites and Funnels with Your Dream Customers
- Range: Why Generalists Triumph in a Specialized World
- Star Trek: Picard