

Emergency Pod: OpenAI's new Functions API, 75% Price Drop, 4x Context Length (w/ Alex Volkov, Simon Willison, Riley Goodside, Joshua Lochner, Stefania Druga, Eric Elliott, Mayo Oshin et al)
Jun 14, 2023
In this engaging discussion, AI expert Alex Volkov, prompt injection specialist Simon Willison, and software engineer Riley Goodside dissect OpenAI's transformative Functions API. They dive into the significant 75% price drop and the 4x increase in context length, exploring the implications for developers. Eric Elliott shares prompting techniques to enhance accuracy, while the panel addresses security concerns related to prompt injection. The conversation is rich with insights on coding efficiency, the future of AI tools, and the evolution of user interactions with these technologies.
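For context, the Functions API lets you describe callable functions as JSON Schema and have the model return structured arguments instead of free-form text. Below is a minimal sketch, assuming the pre-1.0 `openai` Python package and an illustrative `get_weather` function (not something from the episode):

```python
# Minimal function-calling sketch (pre-1.0 `openai` package, June 2023 API shape).
# `get_weather` and its schema are illustrative, not from the episode.
import json
import openai

functions = [
    {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call a function
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns the function name plus JSON-encoded arguments.
    args = json.loads(message["function_call"]["arguments"])
    print(message["function_call"]["name"], args)  # e.g. get_weather {'city': 'Tokyo'}
```

The key design point discussed on the episode is that the model emits structured arguments your code can execute, rather than prose you have to parse.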
AI Snips
Embedding Price Drop
- Embedding prices have dropped significantly, by 90% in November/December and another 75% now.
- This raises questions about the initial pricing and the feasibility of client-side embedding (a rough comparison sketch follows below).
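The client-side point comes up because small open models can now produce usable embeddings locally. A minimal sketch of the two options, assuming the pre-1.0 `openai` package and `sentence-transformers` (neither is specified in the episode):

```python
# Hosted embeddings vs. a small local model; model names are illustrative.
import openai
from sentence_transformers import SentenceTransformer

texts = ["OpenAI dropped embedding prices again.", "Client-side embedding is now plausible."]

# Hosted: text-embedding-ada-002, billed per token.
api_response = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
api_vectors = [item["embedding"] for item in api_response["data"]]  # 1536-dim vectors

# Client-side: a small open model that runs locally (or in the browser via transformers.js).
local_model = SentenceTransformer("all-MiniLM-L6-v2")
local_vectors = local_model.encode(texts)  # 384-dim vectors

print(len(api_vectors[0]), local_vectors.shape)
```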
16k Context Window
- The new 16k context window for GPT-3.5-turbo is a significant development.
- It's much cheaper than GPT-4 and offers four times the context window of the previous 4k gpt-3.5-turbo (a minimal usage sketch follows below).
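Using the larger window is just a matter of pointing at the new model name. A quick sketch, again assuming the pre-1.0 `openai` package; the long document is a placeholder:

```python
# Same ChatCompletion call, but with the 16k-context model announced alongside
# the price drop. `document` is a stand-in for a long source text.
import openai

document = "..." * 5000  # placeholder for several thousand tokens of input

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",  # 16,384-token context window
    messages=[
        {"role": "system", "content": "Summarize the document in three bullet points."},
        {"role": "user", "content": document},
    ],
)
print(response["choices"][0]["message"]["content"])
```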
Long Context Limitations
- Be cautious about the effectiveness of the new 16k context window.
- Test how well the model attends to the entire context, as there may be limitations (a rough probe sketch follows below).
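One way to run such a test is a simple retrieval probe: bury a known fact at different depths in filler text and check whether the model can still recover it. This harness is illustrative rather than something described in the episode, and assumes the pre-1.0 `openai` package:

```python
# Rough "needle in a haystack" probe: place a known fact at varying depths in
# filler text and see whether the model still retrieves it.
import openai

NEEDLE = "The access code is 7421."
FILLER = "The quick brown fox jumps over the lazy dog. " * 300  # junk context

for position in (0.0, 0.5, 1.0):  # start, middle, end of the context
    cut = int(len(FILLER) * position)
    context = FILLER[:cut] + NEEDLE + " " + FILLER[cut:]
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo-16k",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": context + "\n\nWhat is the access code?"},
        ],
    )
    answer = response["choices"][0]["message"]["content"]
    print(f"needle at {position:.0%}: {'7421' in answer}")
```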