
#124a - MAN IN THE BLACKBOX: LaMDA and the Rise of "Sentient" AI (with Pale Rider)

Subliminal Jihad


Is There a Causal Confusion?

Causal confusion occurs when a model learns to predict an outcome from whatever correlates with it in the surroundings rather than from the true cause, which is particularly likely when not all parts of the environment are observed at once. Many of these tools work best alongside a human operator, but what people really want from them is full automation of a given task. When these kinds of mistakes are being made, it feels like a bit of a jump to draw the analogy to translation. It also gets at one of the really important decisions you face when building these systems: how exactly are you going to use them?
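The failure mode described above can be sketched in a few lines. This is a minimal, hypothetical illustration (not from the episode): an imitation-style setup where a spurious signal (a "brake light") co-occurs perfectly with the expert's action in training data, so a naive policy that latches onto it looks perfect in training but collapses once the correlation breaks.

```python
import random

random.seed(0)

# Hypothetical setup: the true cause of braking is an obstacle ahead;
# the brake-indicator light is a spurious correlate that happens to
# co-occur with the action in the training data.
def make_episode(light_matches_action):
    obstacle = random.random() < 0.5           # true cause
    action = 1 if obstacle else 0              # expert brakes iff obstacle
    light = action if light_matches_action else random.randint(0, 1)
    return (obstacle, light), action

train = [make_episode(True) for _ in range(1000)]    # light == action (spurious)
test = [make_episode(False) for _ in range(1000)]    # correlation broken

# A deliberately naive "policy" that predicts from the light alone --
# the shortcut a causally confused model can learn.
def confused_policy(obs):
    _, light = obs
    return light

# A policy that uses the true cause.
def causal_policy(obs):
    obstacle, _ = obs
    return 1 if obstacle else 0

def accuracy(policy, data):
    return sum(policy(o) == a for o, a in data) / len(data)

print(accuracy(confused_policy, train))  # 1.0 -- perfect on training data
print(accuracy(confused_policy, test))   # roughly 0.5 once the correlation breaks
print(accuracy(causal_policy, test))     # 1.0 -- the true cause generalizes
```

Because the confused policy never observes the causal variable's role, no amount of training data with the spurious correlation intact would fix it; this is why partial observability makes the problem worse.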

