Comparing Haskell to R with Will Kurt

Software Unscripted

CHAPTER

The Importance of Gradient Derivatives in Programming

The idea of defining the flow of information through your system and what you expect the system to do is how Stable Diffusion is trained. That's how GPT is built, right? And that's a very different way of thinking about programming. It's cool that you can do it for everything, but at the same time, if you've ever had an idea that changed your view of the world, neural networks don't train that way. They assume that you can always move a little closer to the right answer, and that by taking those small steps you'll eventually get there.
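
A minimal sketch of that "move a little closer each step" idea in Haskell (the episode's topic language). The objective function, step size, and names below are illustrative assumptions, not something discussed in the episode.

```haskell
-- Gradient descent as "always take a small step toward the answer":
-- repeatedly nudge x against the gradient of the function being minimized.
gradientDescent :: Double -> (Double -> Double) -> Double -> Int -> Double
gradientDescent _ _ x 0 = x
gradientDescent rate grad x n =
  gradientDescent rate grad (x - rate * grad x) (n - 1)

-- Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
-- Starting from 0, the iterates move a little closer to 3 each step.
main :: IO ()
main = print (gradientDescent 0.1 (\x -> 2 * (x - 3)) 0 100)
```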
