E125: Let's Help Engineering Teams Productionize AI
Apr 1, 2024
Andrew Hoh, Co-Founder of LastMile AI, discusses simplifying AI for developers, LastMile's "open periphery" approach to open source, Azure as the top cloud provider for AI support, and more. They explore challenges in ML engineering, the role of open source projects in AI innovation, and the importance of AIConfig for model configuration and experimentation.
LastMile AI simplifies AI development with AIConfig, which streamlines model selection and configuration.
LastMile AI balances open source projects and enterprise solutions to promote AI accessibility.
Deep dives
The Founding of LastMile AI and Its Journey
LastMile AI was co-founded by Andrew and his co-founder, both of whom previously worked at big tech companies like Facebook. Having built AI platforms there, they saw how scarce ML engineers were even as AI-powered products grew in importance. The surge in large language models prompted them to start LastMile AI, which aims to make AI application development easier with tools such as a notebook-style playground where developers can experiment with LLM applications.
An Innovative Approach to Model Configuration with AIConfig
One of LastMile AI's projects, AIConfig, rethinks how model configurations are handled by simplifying the model selection process. It addresses challenges like model experimentation and the need for efficient evaluation. By expressing prompts and model settings in a YAML-based config, it streamlines the AI development workflow, makes it easy to compare different models, and promotes collaborative development.
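To make that pattern concrete, here is a minimal sketch of a YAML-based prompt/model config with a small runner that fans a prompt out to several models for comparison. The field names and the run_prompt helper are illustrative assumptions, not the actual AIConfig schema or SDK.

```python
# Minimal sketch of a YAML-based prompt/model config, in the spirit of AIConfig.
# NOTE: field names and run_prompt() are illustrative assumptions, not the real
# AIConfig schema or SDK. Requires PyYAML (pip install pyyaml).
import yaml

CONFIG_YAML = """
name: product-summary-demo
prompts:
  - name: summarize
    input: "Summarize the release notes in two sentences: {{notes}}"
    models:                     # models to compare for this prompt
      - name: gpt-4
        settings: {temperature: 0.2}
      - name: llama-3-8b
        settings: {temperature: 0.2}
"""

def run_prompt(model_name: str, settings: dict, prompt: str) -> str:
    """Placeholder for a real model call (OpenAI, a local Llama server, etc.)."""
    return f"[{model_name} @ temp={settings.get('temperature')}] would answer: {prompt[:40]}..."

def compare_models(config_text: str, variables: dict) -> None:
    config = yaml.safe_load(config_text)
    for prompt in config["prompts"]:
        # Fill in template variables, then send the same prompt to each model.
        rendered = prompt["input"].replace("{{notes}}", variables["notes"])
        for model in prompt["models"]:
            output = run_prompt(model["name"], model.get("settings", {}), rendered)
            print(f"{prompt['name']} / {model['name']}: {output}")

if __name__ == "__main__":
    compare_models(CONFIG_YAML, {"notes": "Added YAML configs; fixed eval bug."})
```

Keeping model choices and settings in data rather than code is what makes swapping models and diffing configs in review cheap, which is the collaboration benefit highlighted above.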
The Crucial Balance Between Open Source and Enterprise Solutions
LastMile AI strategically balances its focus between open source projects like AIConfig and its enterprise solutions, emphasizing utility and avoiding feature bloat. With a vision of enabling end-to-end AI development for individuals and fostering collaboration, LastMile AI aims to make LLM technology accessible while exploring research such as hallucination detection models to strengthen production use.
Looking Forward: Insights and Hot Takes on LLM Technology
Andrew shares his perspective on the future of LLM technology, including the importance of modular collaboration structures and a caution against over-engineering systems such as complex context chunking (a simple baseline is sketched below). He also argues for the value of small language models and reflects on how AI application development is evolving, as LastMile AI navigates the current experimentation phase while aiming for broad, intuitive LLM use.
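As a point of reference for that caution, here is a deliberately simple chunking baseline: fixed-size character windows with a small overlap. The window and overlap parameters are illustrative assumptions, not anything prescribed in the episode.

```python
# A simple context-chunking baseline: fixed-size character windows with a small
# overlap. Parameters are illustrative, not from the episode.
def chunk_text(text: str, chunk_size: int = 1000, overlap: int = 100) -> list[str]:
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + chunk_size])
        start += chunk_size - overlap  # slide the window, keeping some context
    return chunks

if __name__ == "__main__":
    doc = "LastMile AI builds tooling for productionizing LLM applications. " * 50
    pieces = chunk_text(doc, chunk_size=200, overlap=20)
    print(f"{len(pieces)} chunks; first chunk starts: {pieces[0][:60]}...")
```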
Andrew Hoh is Co-Founder of LastMile AI, the AI developer platform for engineering teams to productionize LLM applications. They take an "open periphery" stance on open source with projects like AIConfig to help developers build AI applications.
LastMile AI has raised $10M from investors including Gradient, AME, Exceptional Capital, and Firsthand Alliance.
In this episode, we discuss LastMile's approach to simplifying AI for developers and why they decided to build an end-to-end solution, LastMile's open periphery approach to open source, where we are on the experimentation-to-commercialization curve with GenAI, why Azure is the top cloud provider when it comes to AI support & more!