
Collin Burns On Discovering Latent Knowledge In Language Models Without Supervision

The Inside View

CHAPTER

Is There a Difference Between Predicting and Prompting?

Pre-training data isn't literally just what humans say it is; it's also, for example, what would be said in, like, the news or something. You could imagine training a model to predict news articles conditioned on dates, and then it learns how to do that from real-world events. In general, prompting is kind of hacky, but I don't think it will be enough to actually get models to be truthful in general, one way or another.
