The Limit of Human Errors
So, he misspoke: five times ten to the minus seven is five in ten million, not one in a hundred thousand. So what Amalberti is saying here is, we design our systems to be safe, but there are always problems with those systems, either technical latent factors or human errors. Yeah. We focus a lot of our effort on trying to reduce people making mistakes, for their role in the loop. That's how we've been managing them for a hundred years. Yes. And I'll just give a quick preview of where this is heading, because Amalberti has a really clear reason why he doesn't think it's possible to squeeze too close. But his idea is
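The correction above is simple arithmetic, but it's easy to slip on orders of magnitude in conversation. A quick sanity check of the "five in ten million" phrasing (a sketch; the variable names are illustrative, not from the episode):

```python
# The rate discussed in the episode: 5 x 10^-7 per event.
rate = 5e-7

# Express it as "1 in N" and "5 in N" frequencies.
one_in = 1 / rate   # one in two million
five_in = 5 / rate  # five in ten million

print(f"1 in {one_in:,.0f}")
print(f"5 in {five_in:,.0f}")
```

So 5×10⁻⁷ is one in two million, equivalently five in ten million; "one in a hundred thousand" (10⁻⁵) would be twenty times more frequent.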