
ChatGPT, LLMs, and AI — #29

Manifold

CHAPTER

The Main Architecture of LLMs Is Based on Transformers

There are very, very few purely academic research groups that can build state-of-the-art LLMs. The one example I'm aware of is at Tsinghua University in Beijing. They have produced a leading-edge language model which was actually slightly better on the performance metrics than GPT-3, or maybe GPT-3.5. Now, these big breakthroughs were not really possible without huge capex, because for training these large models you end up spending tens of millions of dollars in just raw compute.
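The "tens of millions of dollars in raw compute" figure can be sanity-checked with a rough sketch. The calculation below uses the common heuristic that training takes roughly 6 × N × D floating-point operations (N = parameter count, D = training tokens), with GPT-3-scale numbers; the GPU throughput, utilization, and hourly price are illustrative assumptions, not figures from the episode.

```python
# Back-of-envelope LLM training cost.
# Heuristic: total training compute ~ 6 * N * D FLOPs.

N = 175e9          # parameters (GPT-3 scale, publicly reported)
D = 300e9          # training tokens (GPT-3 scale, publicly reported)
flops = 6 * N * D  # ~3.15e23 FLOPs

# Assumed hardware figures (V100-era, illustrative only):
peak_flops = 125e12        # assumed peak mixed-precision FLOP/s per GPU
utilization = 0.2          # assumed realized fraction of peak
price_per_gpu_hour = 3.0   # assumed cloud price in USD

gpu_seconds = flops / (peak_flops * utilization)
gpu_hours = gpu_seconds / 3600
cost_usd = gpu_hours * price_per_gpu_hour

print(f"{flops:.2e} FLOPs, {gpu_hours:,.0f} GPU-hours, ~${cost_usd:,.0f}")
```

Under these assumptions the estimate lands around ten million dollars; different hardware, utilization, and pricing assumptions move it by an order of magnitude in either direction, which is consistent with the "tens of millions" claim above.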

