Jeremy Howard — The Simple but Profound Insight Behind Diffusion

Gradient Dissent: Conversations on AI

CHAPTER

Is Stable Diffusion a Good Idea?

I feel like we're sitting here in November 2022, and I think they've done an amazing job of bringing awareness to generative models. The thing is, accessing these models through a web-based API is extremely limiting. When you've actually got the weights, you can really play with both the engineering and the artistic side of doing things that no one's done before. So it's certainly a very valuable step on the whole for society to have this stuff as open as possible. And to be clear, it was all trained at universities. Most of the stuff we're using now for Stable Diffusion was trained in Germany, at German academic institutions, using donated hardware.

