In this podcast episode, the MLOps vs. LLMOps panel discusses the high-level differences between MLOps and LLMOps, the impact of MLOps on companies, the challenges of open-source tools and data safety in financial firms, the cost and rationalization of MLOps, options for large enterprises in ML model development, and the use of foundation models and vector databases.
Podcast summary created with Snipd AI
Quick takeaways
LLMOps allows companies to leverage large language models as a base and fine-tune them for specific use cases, offering faster deployment and iteration via prompt engineering compared to traditional ML models.
Vector stores and feature stores play a crucial role in LLMOps, enabling better context and observability for language models, while evaluation and debugging become more challenging as model complexity increases.
Deep dives
The Rise of LLMOps
LLMOps, short for Large Language Model Operations, is gaining momentum as a powerful practice for enterprises. It is seen as a way to leverage large language models (LLMs) from providers like OpenAI to boost productivity and improve outcomes. LLMOps offers the ability to start with a generalized LLM as a base and then fine-tune or specialize it for specific use cases. This approach allows companies to achieve faster deployment and iterate on prompt engineering rather than relying solely on traditional ML models. The LLMOps stack includes elements like vector stores and feature stores, which enable better context and observability for the models. Evaluation and debugging are crucial challenges in LLMOps, as the complexity and interdependencies of models increase. Enterprises are grappling with the decision of whether to rely on mature models like OpenAI's or build custom models to ensure data privacy. Cost rationalization is also a consideration, and startups are emerging to provide cost-effective LLMOps solutions. While LLMOps offers tremendous potential, it is still in its early stages, and its full impact on MLOps remains to be seen.
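The vector-store pattern mentioned above — retrieving relevant context to ground an LLM's answer — can be illustrated with a minimal sketch. This is not from the episode: the toy bag-of-words `embed` function, the sample documents, and the prompt template are all hypothetical placeholders; a real stack would use a learned embedding model and a dedicated vector database.

```python
import math

# Toy embedding: bag-of-words counts over a tiny fixed vocabulary.
# A production system would use a learned embedding model instead.
VOCAB = ["refund", "shipping", "invoice", "password", "login"]

def embed(text: str) -> list[float]:
    words = [w.strip(".,?!'") for w in text.lower().split()]
    return [float(words.count(w)) for w in VOCAB]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query by cosine similarity."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Our refund policy allows returns within 30 days.",
    "Reset your password from the login page.",
    "Shipping takes 3-5 business days.",
]
# The retrieved document becomes context in the prompt sent to the LLM.
context = retrieve("How do I get a refund?", docs)
prompt = f"Answer using this context:\n{context[0]}\n\nQuestion: How do I get a refund?"
```

The key design point is that the LLM itself is unchanged: better answers come from putting the right context into the prompt, which is why retrieval quality and observability over what was retrieved matter so much in LLMOps.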
Maintaining MLOps in the LLM Era
MLOps, short for Machine Learning Operations, is unlikely to disappear with the rise of LLMOps. Both concepts share core principles, such as the need for evaluation, auditing, and a reliable system of record. LLMOps introduces new considerations and challenges, including the evaluation of LLMs, which are more complex and demand context and observability solutions. Vector stores play a key role in connecting LLMs with data to enhance performance, and feature stores enable customization. While some specialized use cases may still require traditional ML models in MLOps, LLMOps offers a promising approach for more generalized tasks, such as sentiment analysis or classification. However, it is still early days for LLMOps, and the maturity and adoption of the technology will dictate its long-term impact on MLOps.
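The out-of-the-box classification ability described here is the core contrast with traditional ML: instead of collecting labels and training a model, you write a prompt. A rough sketch, where `call_llm` and `fake_llm` are hypothetical stand-ins for a real LLM endpoint:

```python
from typing import Callable

def classify_sentiment(text: str, call_llm: Callable[[str], str]) -> str:
    """Prompt-based sentiment classification: no training data or model
    fitting, just a prompt plus a completion from an LLM."""
    prompt = (
        "Classify the sentiment of the following review as exactly one of "
        "'positive', 'negative', or 'neutral'.\n\n"
        f"Review: {text}\nSentiment:"
    )
    # Normalize the completion; a production LLMOps pipeline would also
    # validate it against the allowed label set.
    return call_llm(prompt).strip().lower()

# Stub standing in for a real LLM API call, to show the call shape only.
def fake_llm(prompt: str) -> str:
    return " Positive "

label = classify_sentiment("Great product, fast delivery!", fake_llm)
```

Note that "training" here collapses into prompt engineering and output validation, which is exactly why evaluation and observability shift to the center of the workflow.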
The Potential and Strategies in LLMOps Adoption
LLMOps has the potential to transform the way organizations approach machine learning. It enables the use of highly capable LLMs, like OpenAI's, as a basis for different projects, leading to significant time savings and improved outcomes. Enterprises have the option to leverage existing MLOps infrastructure if they opt for specialized models that build upon the LLMs. Privacy concerns and the need for data security have led some companies to explore using vector databases to connect the LLMs with their own data. While LLMOps presents exciting opportunities, it is still seeing early adoption, and organizations are evaluating the cost, benefits, and risks associated with implementing it. Open-source tools, startups, and increased accessibility of LLMs are driving the LLMOps evolution, allowing enterprises to strategize and decide the best approach based on their specific use cases.
Navigating the Transition to LLMOps
The transition from MLOps to LLMOps requires careful consideration and evaluation of the benefits and challenges. Enterprises are faced with deciding whether to adopt LLMs as part of their ML workflows or stick with traditional ML models. While LLMs offer significant advantages like out-of-the-box accuracy for tasks like sentiment analysis and classification, organizations need to weigh the associated costs, such as data privacy concerns and infrastructure requirements. Vector stores and feature stores play a critical role in connecting LLMs with data and facilitating context retrieval and troubleshooting. Evaluating LLMs and maintaining observability become essential in LLMOps to ensure reliable and robust performance. Whether to rebrand MLOps as LLMOps remains a topic of debate, but for now, the focus should be on understanding the nuances and potential of LLMs in the LLMOps era.
MLOps Coffee Sessions #176 with MLOps vs. LLMOps Panel, Willem Pienaar, Chris Van Pelt, Aparna Dhinakaran, and Alex Ratner hosted by Richa Sachdev.
// Abstract
What do MLOps and LLMOps have in common? What has changed? Are these just new buzzwords or is there validity in calling this ops something new?
// Bio
Richa Sachdev
A passionate and impact-driven leader whose expertise spans leading teams, architecting ML and data-intensive applications, and driving enterprise data strategy.
Richa has worked for a Tier-A startup developing feature platforms and in financial companies, leading ML engineering teams to drive data-driven business decisions.
Richa enjoys reading technical blogs focused on system design and plays an active role in the MLOps Community.
Willem Pienaar
Willem is the creator of Feast, the open-source feature store, and a builder in the generative AI space. Previously, Willem was an engineering manager at Tecton, where he led teams in both their open-source and enterprise initiatives. Before that, Willem built the core ML systems and created the ML platform team at Gojek, the Indonesian decacorn.
Chris Van Pelt
Chris Van Pelt is a co-founder of Weights & Biases, a developer MLOps platform. In 2009, Chris founded Figure Eight/CrowdFlower. Over the past 12 years, Chris has dedicated his career to optimizing ML workflows and teaching ML practitioners, making machine learning more accessible to all. Chris has worked as a studio artist, computer scientist, and web engineer. He studied both art and computer science at Hope College.
Aparna Dhinakaran
Aparna Dhinakaran is the Co-Founder and Chief Product Officer at Arize AI, a pioneer and early leader in machine learning (ML) observability. A frequent speaker at top conferences and thought leader in the space, Dhinakaran was recently named to the Forbes 30 Under 30. Before Arize, Dhinakaran was an ML engineer and leader at Uber, Apple, and TubeMogul (acquired by Adobe). During her time at Uber, she built several core ML Infrastructure platforms, including Michelangelo. She has a bachelor’s from UC Berkeley's Electrical Engineering and Computer Science program, where she published research with Berkeley's AI Research group. She is on a leave of absence from the Computer Vision Ph.D. program at Cornell University.
Alex Ratner
Alex Ratner is the co-founder and CEO at Snorkel AI, and an Affiliate Assistant Professor of Computer Science at the University of Washington. Prior to Snorkel AI and UW, he completed his Ph.D. in CS advised by Christopher Ré at Stanford, where he started and led the Snorkel open-source project, and where his research focused on defining and advancing the concept of "data-centric AI" — the idea that labeling and developing data is the new center of the AI development workflow. His academic work focuses on data-centric AI and related topics in data management and statistical learning techniques, and applications to real-world problems in medicine, science, and more. Previously, he earned his A.B. in Physics from Harvard University.
// MLOps Jobs board
https://mlops.pallet.xyz/jobs
// MLOps Swag/Merch
https://mlops-community.myshopify.com/
// Related Links
--------------- ✌️Connect With Us ✌️ -------------
Join our slack community: https://go.mlops.community/slack
Follow us on Twitter: @mlopscommunity
Sign up for the next meetup: https://go.mlops.community/register
Catch all episodes, blogs, newsletters, and more: https://mlops.community/
Connect with Demetrios on LinkedIn: https://www.linkedin.com/in/dpbrinkm/
Connect with Richa on LinkedIn: https://www.linkedin.com/in/richasachdev/
Connect with Willem on LinkedIn: https://www.linkedin.com/in/willempienaar/
Connect with Chris on LinkedIn: https://www.linkedin.com/in/chrisvanpelt/
Connect with Aparna on LinkedIn: https://www.linkedin.com/in/aparnadhinakaran/
Connect with Alex on LinkedIn: https://www.linkedin.com/in/alexander-ratner-038ba239/