Comparing Haskell to R with Will Kurt

Software Unscripted

The Importance of Gradient Derivatives in Programming

The idea of defining the flow of information through your system and what you expect the system to do is how Stable Diffusion is trained. That's how GPT is built, right? And that's a very different way of thinking about programming. It's cool that you can do it for everything, but at the same time, if you've ever had an idea that changed your view of the world, neural networks don't train that way. They assume that you can always move a little closer to the right space, and that if you keep doing that you'll get home and arrive at your answer.

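As a minimal sketch of the assumption being described here, that repeatedly taking a small step "closer" eventually gets you to the answer, the snippet below runs a toy gradient-descent loop in Haskell. It is illustrative only and not from the episode; the objective f(x) = (x - 3)^2, learning rate, and step count are arbitrary choices.

```haskell
-- Toy gradient descent: minimize f(x) = (x - 3)^2, whose derivative is 2(x - 3).
-- The assumption: a small step against the gradient always moves you closer.

step :: Double -> Double -> Double
step lr x = x - lr * grad x
  where
    grad x' = 2 * (x' - 3)  -- derivative of (x - 3)^2

-- Take n steps of size lr starting from x0.
descend :: Double -> Int -> Double -> Double
descend lr n x0 = iterate (step lr) x0 !! n

main :: IO ()
main = print (descend 0.1 100 0.0)  -- converges toward 3.0
```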