Read the full transcript here.
What is machine learning?
What are neural networks?
How can humans interpret the meaning or functionality of the various layers of a neural network?
What is a transformer, and how does it build on the idea of a neural network? Does a transformer have a conceptual advantage over neural nets, or is a transformer basically the equivalent of neural nets plus a lot of compute power?
Why have we started hearing so much about neural nets in just the last few years even though they've existed conceptually for many decades?
What kind of ML model is GPT-3?
What learning sub-tasks are encapsulated in the process of learning how to autocomplete text? (A toy illustration follows this list.)
What is "few-shot" learning? (See the prompt sketch after this list.)
What is the difference between GPT-2 and GPT-3?
How big of a deal is GPT-3?
Right now, GPT-3's responses are not guaranteed to contain true statements; is there a way to train future GPT or similar models to say only true things (or to indicate levels of confidence in the truthfulness of their statements)?
Should people whose jobs revolve around writing or summarizing text be worried about being replaced by GPT-3?
What are the relevant copyright issues related to text-generation models?
A website's "robots.txt" file or a "noindex" directive in its pages' meta tags tells web crawlers which content they can and cannot access (see the robots.txt sketch after this list); could a similar solution exist for writers, programmers, and others who want to limit or prevent their text from being used as training data for models like GPT-3?
What are some of the scarier features of text-generation models?
What does the creation of models like GPT-3 tell us (if anything) about how and when we might create artificial general intelligence?
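For the autocomplete question above, a toy model makes the training objective concrete. This is only a sketch: GPT-3 uses a transformer trained on a vast corpus, but the objective, predicting what comes next from what came before, has the same shape as this miniature word-count version.

```python
from collections import Counter, defaultdict

# A toy illustration of the "autocomplete" objective GPT-3 is trained on:
# predict the next word given the words so far. Here, bigram counts over
# one sentence stand in for a transformer trained on billions of documents.
corpus = "the cat sat on the mat and the cat slept".split()

next_word_counts = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    next_word_counts[current_word][next_word] += 1

def autocomplete(word: str) -> str:
    """Return the most frequent continuation seen in training."""
    return next_word_counts[word].most_common(1)[0][0]

print(autocomplete("the"))  # -> "cat" (seen twice, vs. "mat" once)
```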
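For the few-shot question, here is a minimal sketch of what a few-shot prompt looks like. The English-to-French framing mirrors the example used in the GPT-3 paper; the key point is that no weights are updated, and the "learning" happens entirely inside the prompt.

```python
# A minimal sketch of "few-shot" learning with an autocomplete model like
# GPT-3. The model is not retrained: it sees a handful of worked examples
# in its input and infers the task from the pattern.
few_shot_prompt = """Translate English to French.

English: cheese
French: fromage

English: sea otter
French: loutre de mer

English: sky
French:"""

# A capable autocomplete model, asked to continue this text, will infer
# the task from the examples and likely produce "ciel" as the completion.
print(few_shot_prompt)
```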
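For the robots.txt question, this sketch shows the existing opt-out convention the episode compares against, using Python's standard-library parser. The "DatasetBot" crawler name is made up for illustration; note that robots.txt compliance is voluntary on the crawler's part, which is one reason an equivalent opt-out for training data is not straightforward.

```python
# A sketch of the existing web-crawler opt-out convention: robots.txt
# tells well-behaved crawlers what they may fetch.
from urllib import robotparser

# Hypothetical robots.txt for a site that blocks a (made-up) training
# crawler called "DatasetBot" while allowing all other crawlers.
sample_robots_txt = """\
User-agent: DatasetBot
Disallow: /

User-agent: *
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(sample_robots_txt.splitlines())

print(rp.can_fetch("DatasetBot", "https://example.com/essays/post1.html"))  # False
print(rp.can_fetch("SearchBot", "https://example.com/essays/post1.html"))   # True
```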
Learn more about GPT-3 here. And learn more about Jeremy Nixon and listen to his episode here.