The Uncertainty of the Risk
I think it's entirely possible that getting to alignment is easier than some people are saying. The thing is, we don't know exactly how long it takes, right? So for every additional year we have to work on it, we increase the chances of getting to good alignment. The whole not-knowing thing really sucks. It's actually a dilemma where both ends are shitty, because if you accelerate, then at least in the Yudkowskian frame, you doom humanity, and extinction is what you get if you're lucky.