
Biological Anchors: A Trick That Might Or Might Not Work

Astral Codex Ten Podcast


The Scaling Laws of the Genome

When you apply the scaling laws to a 7.5×10^8-parameter genome and penalize it for long horizons, you get about 10^33 FLOPs. Ajeya draws on Hernandez and Brown's "Measuring the Algorithmic Efficiency of Neural Networks" blog post. They look at how many FLOPs it took to train various image recognition AIs to an equivalent level of performance between 2012 and 2019. Over those seven years, it decreased by a factor of forty-four. But technology constantly advances. Maybe we'll discover ways to train AIs faster, or run AIs more efficiently.
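A 44x efficiency gain over seven years implies a steady doubling time for algorithmic efficiency, which you can back out with a couple of lines of arithmetic. A minimal sketch, assuming exponential progress at a constant rate over the period (the figure and dates are from the Hernandez and Brown result cited above):

```python
import math

# Hernandez & Brown: training FLOPs needed for a fixed level of image
# recognition performance fell by a factor of 44 between 2012 and 2019.
efficiency_gain = 44
years = 2019 - 2012  # 7 years

# Implied doubling time, assuming steady exponential improvement:
# gain = 2^(years / doubling_time)  =>  doubling_time = years / log2(gain)
doubling_time_years = years / math.log2(efficiency_gain)
doubling_time_months = doubling_time_years * 12

print(f"Efficiency doubling time: about {doubling_time_months:.1f} months")
```

This works out to roughly 15–16 months per doubling, which is why a forecast that only extrapolates hardware FLOPs can substantially overstate how much compute future training runs will actually need.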

