The number of parameters in a large language model has gone up by a factor of 10 a year for the past several years. I think GPT-4 will have 170 trillion parameters, so 10,000 times more than Microsoft's 17 billion. The thing that's hard is you often can't find the performance of these models plotted on any quantitative scale as a function of the number of parameters. So long story short, I think it's early days. I think there's a lot of headroom for improvement.
This is one of my favorite episodes of the year. It’s our third annual long, strange trip into the mind of a Silicon Valley legend.
Dave Kellogg is one of the best marketers, CEOs, tech provocateurs, and board whisperers around. An executive at iconic companies like SAP, MarkLogic, and Salesforce turned investor and board director, he is now an executive in residence at Balderton Capital.
In this episode, we discuss, well, just about everything that matters for the tech economy: startup growth metrics, generative AI, how to get funded in 2023, and, of course, our favorite jam band.
Listen and learn:
- What Dave got right… and not so right… in his 2022 predictions
- How startups can survive downturns
- How to fix the problems at Salesforce, Amazon, and Facebook
- What single theme will characterize 2023 in Silicon Valley
- What will happen to startups that raised massive rounds in 2021
- Why virtual companies won’t outperform companies built around hubs in tech centers
- What’s ahead for consumption-based pricing and PLG
- Why generative AI poses an existential threat to Google
References in this episode: