This chapter examines the costs and complexities of building AI-powered applications and makes the case for managing those costs across the entire AI lifecycle. It covers the role of observability and data handling in AI pipelines, along with strategies for keeping costs down while efficiently managing data sources and model training. By tackling these challenges early in development, teams can take the proactive measures needed for sustainable AI implementation.
LLMs are becoming more mature and accessible, and many teams are now integrating them into everyday business functions such as technical support bots, real-time online help, and other knowledge-base tasks. However, the high cost of maintaining AI teams and operating AI pipelines is becoming apparent.
Maxime Armstrong and Yuhan Luo are Software Engineers at Dagster, an open-source platform for orchestrating data and AI pipelines. They join the show with Sean Falconer to talk about running cost-effective AI pipelines.
Sean has been an academic, startup founder, and Googler, and has published work on topics ranging from information visualization to quantum computing. Currently, Sean is Head of Marketing and Developer Relations at Skyflow and host of Partially Redacted, a podcast about privacy and security engineering. You can connect with Sean on Twitter @seanfalconer.
The post AI Pipelines with Maxime Armstrong and Yuhan Luo appeared first on Software Engineering Daily.