Sofia Fosdick, a Senior Account Executive at Honeycomb.io with deep experience in observability, shares her expertise on managing observability costs. She delves into the challenges organizations face with rising costs and data that often goes underutilized. Sofia emphasizes aligning observability expenses with business value and advocates for transitioning from time-based to event-driven data strategies to optimize user experience. Her practical insights aim to help organizations avoid astronomical bills while managing their data effectively.
Dynamic sampling lets organizations prioritize retaining their most valuable data, optimizing observability costs while preserving the critical insights needed for decision-making.
Aligning observability efforts with business objectives enhances data quality and user experience, promoting a cultural shift towards customer satisfaction.
Deep dives
Understanding Dynamic Sampling
Dynamic sampling involves making informed decisions about which events or traces to retain based on specific attributes within the data. This approach allows organizations to prioritize keeping data that holds critical insight, such as errors or high-value transactions, while filtering out less useful information. Instead of applying an arbitrary capture rate, teams align their data collection strategy with their operational needs, ensuring they maintain the quality of their observability data. This careful curation not only optimizes resource use but also helps to manage and ultimately reduce the costs associated with observability.
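As a rough illustration, here is a minimal Python sketch of an attribute-based sampling decision. The field names (`status_code`, `checkout_value_usd`) and the 1-in-100 rate are illustrative assumptions, not Honeycomb's implementation; the key idea is that retained events carry their sample rate, so aggregates can be re-weighted at query time.

```python
import random

# A minimal sketch of a per-event sampling decision, not Honeycomb's
# implementation. Field names ("status_code", "checkout_value_usd")
# and the 1-in-100 rate are illustrative assumptions.
def sample_decision(event: dict) -> tuple[bool, int]:
    """Return (keep, sample_rate) for an incoming event."""
    # Always keep errors -- they are the events you debug from.
    if event.get("status_code", 200) >= 500:
        return True, 1
    # Always keep high-value transactions.
    if event.get("checkout_value_usd", 0) > 1000:
        return True, 1
    # Routine successes: keep 1 in 100, and record the rate so that
    # counts can be multiplied back up at query time.
    rate = 100
    return random.random() < 1 / rate, rate

keep, rate = sample_decision({"status_code": 200, "endpoint": "/home"})
if keep:
    print(f"retain event, weighted by sample_rate={rate}")
```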
Aligning Observability with Business Goals
Linking observability practices to business objectives enhances the relevance and utility of collected data. By focusing on critical user journeys that reflect customer experiences, organizations can streamline their telemetry collection to capture only meaningful signals. This alignment not only improves data quality but also facilitates quicker problem resolution and better insights into user impact. Ultimately, this strategic approach fosters a cultural shift where observability becomes a tool for delivering customer satisfaction rather than just a technical requirement.
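In practice, this alignment often starts with attaching business context to telemetry. Below is a sketch using the OpenTelemetry Python SDK; the attribute names (`app.user_journey`, `app.customer_tier`, `app.cart_value_usd`) are hypothetical examples of the kind of fields that let you query by user impact rather than by infrastructure detail.

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import ConsoleSpanExporter, SimpleSpanProcessor

# Minimal SDK setup: export spans to the console for demonstration.
trace.set_tracer_provider(TracerProvider())
trace.get_tracer_provider().add_span_processor(
    SimpleSpanProcessor(ConsoleSpanExporter())
)
tracer = trace.get_tracer("checkout-service")

# Annotate a span on a critical user journey with business context,
# so queries like "checkout latency for enterprise customers" become possible.
with tracer.start_as_current_span("checkout") as span:
    span.set_attribute("app.user_journey", "checkout")
    span.set_attribute("app.customer_tier", "enterprise")
    span.set_attribute("app.cart_value_usd", 249.00)
```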
Challenges of Excessive Data Collection
As organizations transition from monolithic applications to microservices, they often fall into the trap of collecting excessive telemetry data, driving up observability costs. This accumulation frequently stems from the misconception that more data will yield better insights, when in reality it can hinder effective problem-solving. To counter this, businesses need to assess which data is critical for monitoring performance and user experience rather than defaulting to broad, indiscriminate collection. By removing unnecessary data points, organizations can reduce complexity and costs while enhancing their ability to draw actionable insights from relevant information.
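One concrete way to apply this is sketched below in plain Python: drop telemetry from endpoints that never drive a debugging decision, and strip low-value fields before they are shipped. The endpoint and field names are illustrative assumptions, not a prescribed list.

```python
# A sketch of pruning telemetry at the source. The endpoint and field
# names below are hypothetical; the point is to decide deliberately
# what is worth paying to store.
NOISY_ENDPOINTS = {"/health", "/metrics", "/favicon.ico"}
LOW_VALUE_FIELDS = {"debug_dump", "full_request_headers"}

def prune(event: dict) -> dict | None:
    """Return a slimmed-down event, or None to drop it entirely."""
    if event.get("endpoint") in NOISY_ENDPOINTS:
        return None  # routine noise: not worth storing
    return {k: v for k, v in event.items() if k not in LOW_VALUE_FIELDS}

# Example: a health-check event is dropped; a real request keeps
# only the fields that support analysis.
print(prune({"endpoint": "/health", "status_code": 200}))
print(prune({"endpoint": "/checkout", "debug_dump": "...", "duration_ms": 87}))
```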
No one wants to get Coinbase’s $65 million observability bill in the future. Sure, observability comes with a necessary cost. But that cost cannot exceed its concrete and perceived value, on balance sheets and in the minds of leaders.
Sofia Fosdick shares practical insights on curbing high observability costs. She’s a senior account executive at Honeycomb.io and has held similar titles at Turbonomic, Dynatrace, and Grafana. As always, this is not a sponsored episode!
We tackled the cost issue by covering ideas like aligning cost with value, event-based systems, and dynamic sampling. You will not want to miss this conversation if your observability bill is starting to look dangerous.