Gradient Dissent: Conversations on AI

Emad Mostaque — Stable Diffusion, Stability AI, and What’s Next

Nov 15, 2022
Chapters
1. Introduction (00:00 • 2min)
2. How Did You Get Started With AI? (01:52 • 2min)
3. The XPRIZE for Learning (03:34 • 2min)
4. Is Stable Diffusion Linked to Stability AI? (05:44 • 2min)
5. The Open Source Side of Things (07:27 • 3min)
6. How Do You Fund OpenBioML? (10:27 • 2min)
7. Is Open Source a Good Idea? (12:27 • 4min)
8. Creating a Long Term Sustainable Business (16:41 • 3min)
9. What Are You Doing for the Bollywood Application Today? (19:54 • 3min)
10. Open Source Models - What's Next? (22:28 • 2min)
11. Scaling Models in the Cloud (24:30 • 2min)
12. Image Models (26:15 • 3min)
13. Is Scalability a Bottleneck for Open Source Research? (28:59 • 2min)
14. BioML Data Sets (30:46 • 3min)
15. Is It a Good Idea to Invest in Time Series? (33:32 • 2min)
16. Is There a Convergence Around Transformers? (35:39 • 3min)
17. The Killer Application for Generating Media (38:14 • 3min)
18. OpenAI (40:57 • 3min)
19. Open Source Deep Fake Detection (44:16 • 2min)
20. Is Copyright a Good Thing? (46:40 • 2min)
21. Open Source APIs (48:36 • 4min)
22. A Community Fork Is a Good Idea, Right? (52:18 • 3min)
23. Is There a Future in Education? (55:03 • 5min)
24. Translation AI (59:41 • 4min)
25. Is There a Connection Between Machine Learning and Autism? (01:03:24 • 3min)
26. What's the Topic in Machine Learning That You Think Is Underrated? (01:06:29 • 2min)
27. DreamStudio Lite and DreamStudio Pro - What's the Hardest Part? (01:08:13 • 2min)