The TWIML AI Podcast (formerly This Week in Machine Learning & Artificial Intelligence)

Sam Charrington
Jun 4, 2018 • 45min

Data Innovation & AI at Capital One with Adam Wenchel - TWiML Talk #147

In this episode I’m joined by Adam Wenchel, Vice President of AI and Data Innovation at Capital One, to discuss how machine learning and AI are being integrated into the bank’s day-to-day practices, and how those advances benefit the customer. In our conversation, we look into a few of the many applications of AI at the bank, including fraud detection, money laundering detection, customer service, and automating back office processes. Adam describes some of the challenges of applying ML in financial services and how Capital One maintains consistent portfolio management practices across the organization. We also discuss how the bank has organized to scale their machine learning efforts, and the steps they’ve taken to overcome the talent shortage in the space. The notes for this show can be found at twimlai.com/talk/147.
May 31, 2018 • 46min

Deep Gradient Compression for Distributed Training with Song Han - TWiML Talk #146

On today’s show I chat with Song Han, assistant professor in MIT’s EECS department, about his research on Deep Gradient Compression. In our conversation, we explore the challenge of distributed training for deep neural networks and the idea of compressing the gradient exchange so it can be done more efficiently. Song details the evolution of distributed training systems based on this idea, and provides a few examples of centralized and decentralized distributed training architectures, such as Uber’s Horovod, as well as the approaches native to PyTorch and TensorFlow. Song also addresses potential issues that arise when considering distributed training, such as loss of accuracy and generalizability, and much more. The notes for this show can be found at twimlai.com/talk/146.
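
The core idea Song describes, exchanging only the largest gradient entries and accumulating the rest locally, can be illustrated with a toy sketch. This is a minimal illustration of gradient sparsification with local residual accumulation, not the full Deep Gradient Compression algorithm (which also adds momentum correction, gradient clipping, and warm-up training); the function name and the plain top-k rule are my own illustrative choices.

```python
import numpy as np

def sparsify_gradient(grad, residual, sparsity=0.99):
    """Keep only the largest-magnitude gradient entries for exchange;
    accumulate everything else locally for future steps."""
    acc = grad + residual                        # fold in leftover gradient from prior steps
    k = max(1, int(acc.size * (1 - sparsity)))   # number of entries to transmit
    idx = np.argpartition(np.abs(acc), -k)[-k:]  # indices of the top-k magnitudes
    sparse = np.zeros_like(acc)
    sparse[idx] = acc[idx]                       # the values actually exchanged
    new_residual = acc - sparse                  # the rest stays local
    return sparse, new_residual

# One simulated step: a 1000-element gradient compressed to ~1% density
rng = np.random.default_rng(0)
g = rng.normal(size=1000)
sparse, resid = sparsify_gradient(g, np.zeros_like(g))
print(np.count_nonzero(sparse))  # 10 nonzero entries instead of 1000
```

Note that no information is discarded: the sparse update plus the residual always reconstructs the full accumulated gradient, which is what lets the method preserve accuracy over many steps.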
May 28, 2018 • 35min

Masked Autoregressive Flow for Density Estimation with George Papamakarios - TWiML Talk #145

In this episode, University of Edinburgh PhD student George Papamakarios and I discuss his paper “Masked Autoregressive Flow for Density Estimation.” George walks us through the idea of Masked Autoregressive Flow, which uses neural networks to produce estimates of probability densities from a set of input examples. We discuss some of the related work that laid the groundwork for his research, including Inverse Autoregressive Flow, Real NVP and Masked Autoencoders. We also look at the properties of probability density networks and discuss some of the challenges associated with this effort. The notes for this show can be found at twimlai.com/talk/145.
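
The change-of-variables idea at the heart of the paper can be sketched in a few lines. In Masked Autoregressive Flow, each variable is an affine function of a standard-normal noise variable, x_i = mu_i + exp(alpha_i) * u_i, and the density of x follows from inverting the transform and adding the log-determinant of the Jacobian. In the real method, mu and alpha come from a masked neural network (MADE) conditioned on the preceding variables; here they are passed in directly, so this is an illustrative single-layer sketch, not the paper's implementation.

```python
import numpy as np

def maf_log_density(x, mus, alphas):
    """Log-density of x under one affine autoregressive flow layer
    with a standard-normal base distribution."""
    u = (x - mus) * np.exp(-alphas)  # invert the transform to recover base noise
    # Standard-normal log-density of u, plus the log-det Jacobian (-sum alphas)
    log_base = -0.5 * (u**2 + np.log(2 * np.pi)).sum()
    return log_base - alphas.sum()

x = np.array([0.5, -1.0])
# With mu = 0 and alpha = 0 the flow is the identity, so this reduces
# to the log-density of x under a standard bivariate normal
print(maf_log_density(x, np.zeros(2), np.zeros(2)))
```

Stacking several such layers, each with its own learned mu and alpha networks, is what gives MAF its flexibility as a density estimator.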
May 25, 2018 • 22min

Training Data for Computer Vision at Figure Eight with Qazaleh Mirsharif - TWiML Talk #144

For today’s show, the last in our TrainAI series, I'm joined by Qazaleh Mirsharif, a machine learning scientist working on computer vision at Figure Eight. Qazaleh and I caught up at the TrainAI conference to discuss a couple of the projects she’s worked on in that field, namely her research into the classification of retinal images and her work on parking sign detection from Google Street View images. The former, which attempted to diagnose diseases like diabetic retinopathy using retinal scan images, is similar to the work I spoke with Ryan Poplin about on TWiML Talk #122. In my conversation with Qazaleh we focus on how she built her datasets for each of these projects and some of the key lessons she’s learned along the way. The notes for this show can be found at twimlai.com/talk/144. For series details, visit twimlai.com/trainai2018.
May 24, 2018 • 38min

Agile Data Science with Sarah Aerni - TWiML Talk #143

Today we continue our TrainAI series with Sarah Aerni, Director of Data Science at Salesforce Einstein. Sarah and I sat down at the TrainAI conference to discuss her talk “Notes from the Field: The Platform, People, and Processes of Agile Data Science.” Sarah and I dig into the concept of agile data science, exploring what it means to her and how she’s seen it done at Salesforce and other places she’s worked. We also dig into the notion of machine learning platforms, which is a keen area of interest for me. We discuss some of the common elements we’ve seen in ML platforms, and when it makes sense for an organization to start building one. The notes for this show can be found at twimlai.com/talk/143. For more details on the TrainAI series, visit twimlai.com/trainai2018.
May 23, 2018 • 34min

Tensor Operations for Machine Learning with Anima Anandkumar - TWiML Talk #142

In this episode of our TrainAI series, I sit down with Anima Anandkumar, Bren Professor at Caltech and Principal Scientist with Amazon Web Services. Anima joined me to discuss the research coming out of her “Tensorlab” at Caltech. In our conversation, we review the application of tensor operations to machine learning and discuss how an example problem, document categorization, might be approached using three-dimensional tensors to discover topics and the relationships between them. We touch on multidimensionality, expectation maximization, and the Amazon products SageMaker and Comprehend. Anima also goes into how to tensorize neural networks and apply our understanding of tensor algebra to perform better architecture searches. The notes for this show can be found at twimlai.com/talk/142. For series info, visit twimlai.com/trainai2018.
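
To give a concrete feel for the document-categorization example, here is a toy sketch of the object tensor methods operate on: a third-order word co-occurrence tensor, where entry T[i, j, k] records how often words i, j, and k appear together as a triple in a document. Decomposing such a tensor (for example via CP decomposition or tensor power iteration) is what recovers the latent topics; that step is omitted here, and the function and variable names are my own illustrative choices rather than anything from the episode.

```python
import numpy as np
from itertools import permutations

def triple_cooccurrence_tensor(docs, vocab_size):
    """Empirical symmetrized third-order word co-occurrence tensor,
    averaged over documents (each doc is a list of word indices)."""
    T = np.zeros((vocab_size,) * 3)
    for doc in docs:
        for a in range(len(doc)):
            for b in range(a + 1, len(doc)):
                for c in range(b + 1, len(doc)):
                    # symmetrize: count the triple in every index order
                    for i, j, k in permutations((doc[a], doc[b], doc[c])):
                        T[i, j, k] += 1
    return T / max(1, len(docs))

# Two tiny "documents" over a 3-word vocabulary
docs = [[0, 1, 2], [0, 0, 1]]
T = triple_cooccurrence_tensor(docs, 3)
print(T[0, 1, 2])  # the triple (0, 1, 2) occurs in one of the two documents
```

The symmetry of T across index permutations is what makes the decomposition into rank-one topic components well posed.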
May 22, 2018 • 37min

Deep Learning for Live-Cell Imaging with David Van Valen - TWiML Talk #141

In today’s show, I sit down with David Van Valen, assistant professor of Bioengineering & Biology at Caltech. David joined me after his talk at the Figure Eight TrainAI conference to chat about his research using image recognition and segmentation techniques in biological settings. In particular, we discuss his use of deep learning to automate the analysis of individual cells in live-cell imaging experiments. We had a really interesting discussion around the various practicalities he’s learned about training deep neural networks for image analysis, and he shares some great insights into which techniques from the deep learning literature have worked for him and which haven’t. If you’re a fan of our Nerd Alert shows, you’ll really like this one. Enjoy! The notes for this show can be found at twimlai.com/talk/141. For more information on this series, visit twimlai.com/trainai2018.
May 21, 2018 • 33min

Checking in with the Master w/ Garry Kasparov - TWiML Talk #140

In this episode I’m joined by legendary chess champion, author, and fellow at the Oxford Martin School, Garry Kasparov. Garry and I sat down after his keynote at the Figure Eight TrainAI conference in San Francisco last week. Garry and I discuss his bouts with the chess-playing computer Deep Blue, which became the first computer system to defeat a reigning world champion in their 1997 rematch, and how that experience has helped shape his thinking on artificially intelligent systems. We explore his perspective on the evolution of AI, the ways in which chess and Deep Blue differ from Go and AlphaGo, and the significance of DeepMind’s AlphaGo Zero. We also talk through his views on the relationship between humans and machines, and how he expects it to change over time. The notes for this show can be found at twimlai.com/talk/140. For more information on this series, visit twimlai.com/trainai2018.
May 17, 2018 • 33min

Exploring AI-Generated Music with Taryn Southern - TWiML Talk #139

In this episode I’m joined by Taryn Southern, a singer, digital storyteller and YouTuber whose upcoming album I AM AI will be produced completely with AI-based tools. Taryn and I explore all aspects of what it means to create music with modern AI-based tools, and the different processes she’s used to create her singles Break Free, Voices in My Head, and more. She also provides a rundown of the many tools she’s used in this space, including Google Magenta, Watson Beat, Amper, LANDR and more. This was a super fun interview that I think you’ll get a kick out of. The notes for this show can be found at twimlai.com/talk/139.
May 14, 2018 • 44min

Practical Deep Learning with Rachel Thomas - TWiML Talk #138

In this episode, I’m joined by Rachel Thomas, founder and researcher at Fast AI. If you’re not familiar with Fast AI, the company offers a series of courses including Practical Deep Learning for Coders, Cutting Edge Deep Learning for Coders and Rachel’s Computational Linear Algebra course. The courses are designed to make deep learning more accessible to those without the extensive math backgrounds some other courses assume. Rachel and I cover a lot of ground in this conversation, starting with the philosophy and goals behind the Fast AI courses. We also cover Fast AI’s recent decision to switch their courses from TensorFlow to PyTorch, the reasons for this, and the lessons they’ve learned in the process. We discuss the role of the Fast AI deep learning library as well, and how it was recently used to help their team beat top results on a popular industry benchmark of training time and training cost by a factor of more than ten. The notes for this show can be found at twimlai.com/talk/138.
