
"(My understanding of) What Everyone in Technical Alignment is Doing and Why" by Thomas Larsen & Eli Lifland

LessWrong (Curated & Popular)


Ought vs. OpenAI Alignment

Conjecture is aiming only for oracle LLMs, trained without any RL pressure giving them goals. Ought aims to automate and scale open-ended reasoning through Elicit, an AI research assistant. They believe advancing process-based systems rather than outcome-based ones will be beneficial in the long term. Eli's opinion: I'm generally excited about trying to automate alignment research, but relatively more positive on Conjecture's approach since it aims to use non-agentic systems.

