Gradient Dissent: Conversations on AI

Providing Greater Access to LLMs with Brandon Duderstadt, Co-Founder and CEO of Nomic AI

Jul 27, 2023
Chapters
1. Introduction (00:00 • 2min)
2. How GPT4All Integrates With Nomic's Model Hosting Initiative (01:57 • 2min)
3. The Most Popular Models on GPT4All (03:31 • 2min)
4. The Proliferation of Language Models (05:12 • 3min)
5. The Future of Open Source Models (08:13 • 3min)
6. The Future of Quantization in Machine Learning (11:10 • 2min)
7. The Advantages of Quantization in Binary Neural Networks (13:35 • 3min)
8. How to Fine-Tune Cloud-Based Models (16:29 • 2min)
9. GPT4All: A Serial General Suggestion (18:43 • 2min)
10. GPT4All: The Open Source Language Model Ecosystem (20:22 • 2min)
11. Atlas: Exploratory Data Analysis on Massive Unstructured Data Sets (21:59 • 2min)
12. How Atlas Curated the First Data Sets for GPT4All (24:20 • 2min)
13. How to Use Atlas to Find Short Responses (25:57 • 2min)
14. How to Train a Code Model and Language Model (27:44 • 2min)
15. How to Evaluate an Embedding (30:01 • 2min)
16. The Challenges of Model Evaluation (32:10 • 2min)
17. Comparing Foundation Models With Data Kernels (34:12 • 2min)
18. The Importance of Judgment in AI Writing (36:29 • 2min)
19. The Future of AI (38:36 • 3min)
20. The Future of Open Source Base Models (41:08 • 2min)
21. The Role of Prompting and Chaining in Stability (42:59 • 2min)
22. How to Make an LLM Application That Works (45:25 • 2min)
23. Quantifiable Metrics for Quality Improvement (47:10 • 2min)
24. The Objective Function for Nomic (49:31 • 2min)
25. Advice for a Younger Person (51:29 • 2min)
26. How to Keep the State of the Company in Your Brain at Once (53:57 • 2min)
27. The Underrated Topic in Machine Learning (55:39 • 3min)
28. The Cost of Automating Yourself Out of a Job (58:22 • 3min)