When AI research is evolving at warp speed and takes significant capital and compute power, what is the role of academia? Dr. Percy Liang, Stanford computer science professor and director of the Stanford Center for Research on Foundation Models, talks about training costs, distributed infrastructure, model evaluation, alignment, and societal impact.
Sarah Guo and Elad Gil join Percy at his office to discuss the evolution of research in NLP, why AI developers should aim for superhuman levels of performance, the goals of the Center for Research on Foundation Models, and Together, a decentralized cloud for artificial intelligence.
No Priors is now on YouTube! Subscribe to the channel and like this episode.
Show Links:
Sign up for new podcasts every week. Email feedback to show@no-priors.com
Follow us on Twitter: @NoPriorsPod | @Saranormous | @EladGil | @PercyLiang
Show Notes:
[1:44] - How Percy got into machine learning research and started the Center for Research on Foundation Models at Stanford
[7:23] - The role of academia and its competitive advantages
[13:30] - Research on natural language processing and computational semantics
[27:20] - Smaller-scale architectures that are competitive with transformers
[35:08] - HELM (Holistic Evaluation of Language Models), a project aimed at evaluating language models holistically
[42:13] - Together, a decentralized cloud for artificial intelligence