Go was chosen as the programming language for the ML pipeline project due to its simplicity, reliability, and support for building distributed microservices.
The team implemented a wait strategy for service dependencies using the 'wait for it' Go package and contributed back to its repository.
Deep dives
Overview of the podcast episode
This podcast episode discusses a project developed by members of the Go community at Intel, focusing on the microservices behind their ML pipeline healthcare solution. The project involved building an ML pipeline for image processing and automated image comparisons in healthcare use cases, built on microservices and containerization. The hosts interviewed Samantha Coyle and Anita Elizabeth Simon, who shared insights on the challenges they faced and their decision to use Go as the programming language of choice. They also discussed the wait strategy for their services and the open-source package they incorporated into the project. Overall, the episode highlights the importance of open-source contributions and the benefits of using Go for developing scalable, distributed systems.
The Use of Go and Microservices in the Project
Go was chosen as the programming language for the project due to its simplicity, reliability, and support for building distributed microservices. The team found Go well-suited to their needs, particularly its concurrency model, its scalability, and its ability to take advantage of multiple cores. The project embraced a microservices architecture and used containerization for lightweight, scalable deployment. This approach made it easy to develop, maintain, and modify individual application services without impacting the overall solution architecture. The team also weighed the advantages of containerized microservices, such as improved scalability and ease of deployment across different environments.
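As a rough illustration of why Go fits this kind of service (this is illustrative code, not code from the project), a minimal HTTP microservice in Go serves each request on its own goroutine, spreads that work across available cores by default, and compiles to a static binary that packages neatly into a small container image:

```go
package main

import (
	"log"
	"net/http"
	"runtime"
)

func main() {
	// net/http serves each incoming request on its own goroutine, and the Go
	// runtime schedules those goroutines across all available CPU cores.
	log.Printf("serving with GOMAXPROCS=%d", runtime.GOMAXPROCS(0))

	// A simple health endpoint, the kind of thing each microservice in a
	// containerized pipeline typically exposes.
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
		w.Write([]byte("ok"))
	})

	log.Fatal(http.ListenAndServe(":8080", nil))
}
```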
The Wait Strategy and Open-Source Package
The podcast episode explores the implementation of a wait strategy for service dependencies and the use of an open-source package called 'wait for it' in the project. The wait strategy involved coordinating services running on different machines, ensuring each dependency was up and ready before a service proceeded. The team initially adopted a bash-script-based solution, inspired by vishnubob's wait-for-it.sh script, which is typically used in Docker layers. However, they struggled to make that script fit with their Go applications. Eventually, they found the 'wait for it' Go package, which provided similar functionality and integrated directly into their services. One limitation was that the package could not be consumed as-is, which meant modifying and copying some of its code. Nonetheless, they contributed their changes back to the package's repository, which received a positive response from the author.
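To make the idea concrete: the pattern behind such a wait strategy is to poll a dependency's TCP endpoint until it accepts a connection or a timeout expires. The sketch below is a hypothetical Go illustration of that technique; it is not the 'wait for it' package's actual API or the team's code.

```go
package main

import (
	"fmt"
	"log"
	"net"
	"time"
)

// waitForTCP polls addr until a TCP connection succeeds or timeout elapses.
// This mirrors the behavior of wait-for-it.sh-style scripts: block startup
// until a dependency (e.g. a database or another microservice) is reachable.
func waitForTCP(addr string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for {
		conn, err := net.DialTimeout("tcp", addr, 2*time.Second)
		if err == nil {
			conn.Close()
			return nil
		}
		if time.Now().After(deadline) {
			return fmt.Errorf("dependency %s not ready after %s: %w", addr, timeout, err)
		}
		time.Sleep(500 * time.Millisecond) // back off briefly before retrying
	}
}

func main() {
	// Hypothetical dependency address; in a real pipeline this would be
	// another microservice or data store the service relies on.
	if err := waitForTCP("database:5432", 30*time.Second); err != nil {
		log.Fatal(err)
	}
	log.Println("dependency is up, starting service")
}
```

A service's startup code can call a helper like this for each of its dependencies before it begins accepting work.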
Challenges and Considerations
The podcast episode discusses challenges faced during the project, such as the need to find replacements for deprecated open-source packages. The team highlighted the importance of evaluating package licenses and confirming active development and community engagement. The episode also touched on the learning curve Go presents for developers unfamiliar with the language. Overall, the team emphasized the significance of selecting the right programming language and architecture for a project's requirements, and the value of staying adaptable and prepared for changes and challenges in the open-source ecosystem.
Our guests helped create an ML pipeline for image processing and automated image comparisons, enabling healthcare use cases through a series of microservices that automatically detect, manage, and process images received from OEM equipment.
In this episode they will chat through the challenges and how they overcame them, focusing specifically on the wait strategy for their ML Pipeline Healthcare Solution microservices. We’ll also touch on how improvements were made to an open source Go package as part of this project.
Changelog++ members save 1 minute on this episode because they made the ads disappear. Join today!
Sponsors:
Fastly – Our bandwidth partner. Fastly powers fast, secure, and scalable digital experiences. Move beyond your content delivery network to their powerful edge cloud platform. Learn more at fastly.com
Fly.io – The home of Changelog.com — Deploy your apps and databases close to your users. In minutes you can run your Ruby, Go, Node, Deno, Python, or Elixir app (and databases!) all over the world. No ops required. Learn more at fly.io/changelog and check out the speedrun in their docs.
Typesense – Lightning fast, globally distributed Search-as-a-Service that runs in memory. You literally can’t get any faster!