#367 – Sam Altman: OpenAI CEO on GPT-4, ChatGPT, and the Future of AI

Lex Fridman Podcast

Billions of parameters in ChatGPT

OpenAI's approach is to find many small wins and multiply them together, which compounds into big leaps in model capability. Training involves many intricate steps, including data collection and cleaning, architecture design, training, and optimization. Parameter count is one measure of a network's scale: GPT-3 has 175 billion parameters, while the widely circulated claim that GPT-4 has 100 trillion came from a meme, and Altman notes that such discussions about size are often taken out of context. Comparisons with the human brain underscore how intricate these systems are; Altman describes producing the set of numbers that makes up such a network as among the most complex software objects humanity has yet produced.
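As a back-of-the-envelope illustration of where a figure like 175 billion comes from, the sketch below estimates a GPT-style transformer's parameter count from GPT-3's published hyperparameters (96 layers, model width 12,288, vocabulary of about 50,257 tokens, per Brown et al., 2020). The 12·d² per-layer approximation is a common rule of thumb, not OpenAI's internal accounting.

```python
def transformer_params(n_layers: int, d_model: int, vocab_size: int) -> int:
    """Rough parameter count for a GPT-style decoder-only transformer.

    Per layer: 4*d^2 for the attention Q/K/V/output projections
    plus 8*d^2 for a feed-forward block with a 4x hidden expansion.
    Biases, layer norms, and positional embeddings are small by
    comparison and ignored here.
    """
    per_layer = 12 * d_model ** 2
    embeddings = vocab_size * d_model  # token embedding matrix
    return n_layers * per_layer + embeddings

# GPT-3's published hyperparameters (Brown et al., 2020)
print(f"{transformer_params(96, 12288, 50257):,}")  # ~174.6 billion
```

For contrast, 100 trillion parameters would be roughly 570 times this total, which is one reason the meme figure drew skepticism.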
