AI Today Podcast

AI & Data Today
May 5, 2023 • 11min

AI Today Podcast: AI Glossary Series – CPU, GPU, TPU, and Federated Learning

In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms CPU, GPU, TPU, and Federated Learning, and explain how these terms relate to AI and why it's important to know about them. Want to dive deeper into an understanding of artificial intelligence, machine learning, or big data concepts? Continue reading AI Today Podcast: AI Glossary Series – CPU, GPU, TPU, and Federated Learning at Cognilytica.
May 3, 2023 • 11min

AI Today Podcast: AI Glossary Series – Tokenization and Vectorization

In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Tokenization and Vectorization, and explain how these terms relate to AI and why it's important to know about them. Want to dive deeper into an understanding of artificial intelligence, machine learning, or big data concepts? Want to learn how to apply AI and data using hands-on approaches and the latest technologies? Continue reading AI Today Podcast: AI Glossary Series – Tokenization and Vectorization at Cognilytica.
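
For a concrete picture of the two terms this episode covers, here is a minimal, hypothetical Python sketch (my own illustration with a made-up toy corpus, not material from the episode): whitespace tokenization followed by a simple bag-of-words vectorization.

```python
# Illustrative sketch: tokenization (split text into tokens) and
# vectorization (turn tokens into fixed-length numeric vectors).

corpus = [
    "ai today podcast explains ai terms",
    "tokenization splits text into tokens",
]

# Tokenization: break each document into individual tokens (here, by whitespace).
tokenized = [doc.split() for doc in corpus]

# Build a vocabulary mapping each unique token to a vector index.
vocab = {tok: i for i, tok in enumerate(sorted({t for doc in tokenized for t in doc}))}

# Vectorization: turn each token list into a count vector over the vocabulary.
def vectorize(tokens):
    vec = [0] * len(vocab)
    for tok in tokens:
        vec[vocab[tok]] += 1
    return vec

vectors = [vectorize(doc) for doc in tokenized]
print(vocab)
print(vectors)
```
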
Apr 28, 2023 • 12min

AI Today Podcast: AI Glossary Series – Training Data, Epoch, Batch, Learning Curve

In order for machine learning systems to work, they need to be trained on data. But as the old saying goes, "garbage in, garbage out": you need a dataset of prepared, cleaned data so you can incrementally train a machine learning model to perform a particular task. Continue reading AI Today Podcast: AI Glossary Series – Training Data, Epoch, Batch, Learning Curve at Cognilytica.
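
As a rough illustration of how these terms fit together, the sketch below (an assumed example with synthetic data, not the hosts' code) trains a tiny linear model with NumPy: each epoch is one full pass over the training data, each batch is a small slice of it, and the per-epoch loss values form a learning curve.

```python
# Hypothetical sketch: epochs, batches, and a learning curve in a training loop.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))                          # training data (features)
y = 3.0 * X[:, 0] + rng.normal(scale=0.1, size=100)    # labels (roughly y = 3x)

w, b = 0.0, 0.0
lr, batch_size, epochs = 0.1, 20, 15
learning_curve = []

for epoch in range(epochs):                    # one epoch = one full pass over the data
    order = rng.permutation(len(X))
    for start in range(0, len(X), batch_size): # one batch = a small slice of the data
        idx = order[start:start + batch_size]
        xb, yb = X[idx, 0], y[idx]
        err = (w * xb + b) - yb
        w -= lr * np.mean(err * xb)            # incremental update from this batch
        b -= lr * np.mean(err)
    epoch_loss = np.mean(((w * X[:, 0] + b) - y) ** 2)
    learning_curve.append(epoch_loss)          # plotting this gives the learning curve

print(f"w={w:.2f}, b={b:.2f}, final loss={learning_curve[-1]:.4f}")
```
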
Apr 26, 2023 • 11min

AI Today Podcast: AI Glossary Series – Backpropagation, Learning Rate, and Optimizer

Backpropagation was one of the innovations by Geoff Hinton that made deep learning networks a practical reality. But have you heard the term before, and do you know what it means at a high level? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Backpropagation, Learning Rate, and Optimizer, and explain how these terms relate to AI and why it's important to know about them. Continue reading AI Today Podcast: AI Glossary Series – Backpropagation, Learning Rate, and Optimizer at Cognilytica.
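
For a high-level feel for how the three terms interact, here is a hedged, self-contained sketch (my illustration with made-up numbers, not from the episode): backpropagation uses the chain rule to compute gradients for a single sigmoid neuron, and a plain gradient-descent optimizer applies them, scaled by the learning rate.

```python
# Illustrative sketch: backpropagation, learning rate, and a simple optimizer step.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x, target = np.array([0.5, -1.0]), 1.0
w, b = np.array([0.1, 0.2]), 0.0
learning_rate = 0.5

for step in range(100):
    # Forward pass
    z = w @ x + b
    pred = sigmoid(z)
    loss = 0.5 * (pred - target) ** 2

    # Backward pass (backpropagation): chain rule from the loss back to w and b
    dloss_dpred = pred - target
    dpred_dz = pred * (1 - pred)
    grad_w = dloss_dpred * dpred_dz * x
    grad_b = dloss_dpred * dpred_dz * 1.0

    # Optimizer step: plain gradient descent, scaled by the learning rate
    w -= learning_rate * grad_w
    b -= learning_rate * grad_b

print(f"final prediction: {sigmoid(w @ x + b):.3f} (target {target})")
```
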
Apr 21, 2023 • 13min

AI Today Podcast: AI Glossary Series – Loss Function, Cost Function and Gradient Descent

In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Loss Function, Cost Function and Gradient Descent, and explain how these terms relate to AI and why it's important to know about them.

Show Notes:
FREE Intro to CPMAI mini course
CPMAI Training and Certification
AI Glossary
Glossary Series: Artificial Intelligence
Glossary Series: Artificial General Intelligence (AGI), Strong AI, Weak AI, Narrow AI
Glossary Series: Heuristic & Brute-force Search
AI Glossary Series – Machine Learning, Algorithm, Model
Glossary Series: (Artificial) Neural Networks, Node (Neuron), Layer
Glossary Series: Bias, Weight, Activation Function, Convergence, ReLU
Glossary Series: Perceptron
Glossary Series: Hidden Layer, Deep Learning

Continue reading AI Today Podcast: AI Glossary Series – Loss Function, Cost Function and Gradient Descent at Cognilytica.
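
As a quick, assumed example of how the three terms relate (an illustration with invented data, not Cognilytica's code): the loss function scores a single example, the cost function averages the loss over the dataset, and gradient descent repeatedly nudges the parameter against the cost gradient.

```python
# Illustrative sketch: loss function, cost function, and gradient descent on one weight.
data = [(1.0, 2.0), (2.0, 4.1), (3.0, 5.9)]    # (x, y) pairs, roughly y = 2x

def loss(w, x, y):
    """Loss function: squared error for a single example."""
    return (w * x - y) ** 2

def cost(w):
    """Cost function: average loss over the whole dataset."""
    return sum(loss(w, x, y) for x, y in data) / len(data)

def cost_gradient(w):
    """Derivative of the cost with respect to w."""
    return sum(2 * (w * x - y) * x for x, y in data) / len(data)

w, learning_rate = 0.0, 0.05
for step in range(200):
    w -= learning_rate * cost_gradient(w)       # gradient descent update

print(f"w = {w:.3f}, cost = {cost(w):.4f}")
```
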
Apr 19, 2023 • 12min

AI Today Podcast: AI Glossary Series – Hidden Layer and Deep Learning

Deep Learning is powering this current wave of AI interest. But do you really know what Deep Learning is? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Hidden Layer and Deep Learning, and explain how these terms relate to AI and why it's important to know about them. Continue reading AI Today Podcast: AI Glossary Series – Hidden Layer and Deep Learning at Cognilytica.
Apr 14, 2023 • 13min

AI Today Podcast: AI Glossary Series – Perceptron

The Perceptron was the first artificial neuron. The theory of the perceptron was first published in 1943 by McCulloch & Pitts, and then developed in 1958 by Rosenblatt. So yes, this was developed in the early days of AI. In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the term Perceptron and explain how the term relates to AI and why it's important to know about it. Continue reading AI Today Podcast: AI Glossary Series – Perceptron at Cognilytica.
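
To make the term concrete, here is a minimal, hypothetical Python perceptron (an illustration, not material from the episode): a weighted sum with a step activation, trained with the classic perceptron update rule to learn the AND function.

```python
# Illustrative sketch: a perceptron with a step activation and the perceptron learning rule.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # the AND function

w = [0.0, 0.0]
b = 0.0
lr = 0.1

def predict(x):
    # Weighted sum of inputs followed by a step (threshold) activation
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for epoch in range(20):
    for x, target in data:
        error = target - predict(x)            # 0 if correct, +/-1 if wrong
        w[0] += lr * error * x[0]              # perceptron update rule
        w[1] += lr * error * x[1]
        b += lr * error

print([(x, predict(x)) for x, _ in data])      # perceptron now reproduces AND
```
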
Apr 12, 2023 • 13min

AI Today Podcast: AI Glossary Series – Bias, Weight, Activation Function, Convergence, and ReLU

In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Bias, Weight, Activation Function, Convergence, and ReLU and explain how they relate to AI and why it's important to know about them.

Show Notes:
FREE Intro to CPMAI mini course
CPMAI Training and Certification
AI Glossary
AI Glossary Series – Machine Learning, Algorithm, Model
Glossary Series: Machine Learning Approaches: Supervised Learning, Unsupervised Learning, Reinforcement Learning
Glossary Series: Dimension, Curse of Dimensionality, Dimensionality Reduction
Glossary Series: Feature, Feature Engineering
Glossary Series: (Artificial) Neural Networks, Node (Neuron), Layer

Continue reading AI Today Podcast: AI Glossary Series – Bias, Weight, Activation Function, Convergence, and ReLU at Cognilytica.
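
For a concrete sense of several of these terms, the short sketch below (an assumed illustration with made-up values, not from the show) computes one artificial neuron's output: the weights scale the inputs, the bias shifts the weighted sum, and ReLU is the activation function applied to the result.

```python
# Illustrative sketch: weights, bias, and the ReLU activation function in one neuron.
import numpy as np

def relu(z):
    # ReLU: pass positive values through, clamp negatives to zero
    return np.maximum(0.0, z)

inputs = np.array([0.8, -0.3, 0.5])
weights = np.array([0.4, 0.7, -0.2])   # learned weights scale each input
bias = 0.1                             # bias shifts the weighted sum

z = weights @ inputs + bias            # weighted sum plus bias
activation = relu(z)

print(f"pre-activation z = {z:.3f}, ReLU output = {activation:.3f}")
```
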
Apr 7, 2023 • 14min

AI Today Podcast: AI Glossary Series – (Artificial) Neural Networks, Node (Neuron), Layer

If we can replicate neurons and how they are connected, can we replicate the behavior of our brains? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms (Artificial) Neural Networks, Node, and Layer, and explain how they relate to AI and why it's important to know about them. Continue reading AI Today Podcast: AI Glossary Series – (Artificial) Neural Networks, Node (Neuron), Layer at Cognilytica.
Apr 5, 2023 • 11min

AI Today Podcast: AI Glossary Series – Feature Reduction, Principal Component Analysis (PCA), and t-SNE

For a number of reasons, it can be important to reduce the number of variables or identified features in input training data to make training machine learning models faster and more accurate. But what are the techniques for doing this? In this episode of the AI Today podcast, hosts Kathleen Walch and Ron Schmelzer define the terms Feature Reduction, Principal Component Analysis (PCA), and t-SNE, and explain how they relate to AI and why it's important to know about them. Continue reading AI Today Podcast: AI Glossary Series – Feature Reduction, Principal Component Analysis (PCA), and t-SNE at Cognilytica.
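
As a brief, hedged example of feature reduction in practice (assuming scikit-learn is available and using synthetic data, not anything from the episode), PCA projects 50 input features down to the 10 directions of highest variance; sklearn.manifold.TSNE exposes a similar fit_transform interface, typically used for 2-D visualization.

```python
# Illustrative sketch: dimensionality reduction with PCA before model training.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                      # 200 samples, 50 features
X[:, 1] = X[:, 0] + 0.01 * rng.normal(size=200)     # make one feature nearly redundant

pca = PCA(n_components=10)                          # keep 10 highest-variance directions
X_reduced = pca.fit_transform(X)

print(X_reduced.shape)                              # (200, 10)
print(pca.explained_variance_ratio_.sum().round(3)) # fraction of variance retained
```
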
