
Data Mesh Radio
Interviews with data mesh practitioners, deep dives/how-tos, anti-patterns, panels, chats (not debates) with skeptics, "mesh musings", and so much more. Host Scott Hirleman (founder of the Data Mesh Learning Community) shares his learnings - and those of the broader data community - from over a year of deep diving into data mesh.
Each episode contains a BLUF - bottom line, up front - so you can quickly absorb a few key takeaways and also decide if an episode will be useful to you - nothing worse than listening for 20+ minutes before figuring out if a podcast episode is going to be interesting and/or incremental ;) Hoping to provide quality transcripts in the future - if you want to help, please reach out!
Data Mesh Radio is also looking for guests to share their experience with data mesh! Even if that experience is "I am confused, let's chat about some specific topic" - yes, that could be you! You can check out our guest and feedback FAQ - including how to submit your name to be a guest and how to submit feedback, anonymously if you want - here: https://docs.google.com/document/d/1dDdb1mEhmcYqx3xYAvPuM1FZMuGiCszyY9x8X250KuQ/edit?usp=sharing
Data Mesh Radio is committed to diversity and inclusion, including in our guests and guest hosts. If you are part of a minoritized group, please see this as an open invitation to be a guest - just hit the link above.
If you are looking for additional useful information on data mesh, we recommend the community resources from Data Mesh Learning. All are vendor independent. https://datameshlearning.com/community/
You should also follow Zhamak Dehghani (founder of the data mesh concept); she posts a lot of great things on LinkedIn and has a wonderful data mesh book through O'Reilly. Plus, she's just a nice person: https://www.linkedin.com/in/zhamak-dehghani/detail/recent-activity/shares/
Data Mesh Radio is provided as a free community resource by DataStax. If you need a database that is easy to scale - read: serverless - but also easy to develop for - many APIs including gRPC, REST, JSON, and GraphQL, all of which are OSS under the Stargate project - check out DataStax's AstraDB service :) Built on Apache Cassandra, AstraDB is very performant and, oh yeah, also multi-region/multi-cloud so you can focus on scaling your company, not your database. There's a free-forever tier for poking around/home projects and you can also use code DAAP500 for a $500 free credit (apply under payment options): https://www.datastax.com/products/datastax-astra?utm_source=DataMeshRadio
Latest episodes

May 19, 2023 • 11min
#223 What Does Your Data-Driven Org Look Like - Mesh Musings 48
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts (most interviews from #32 on) here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Important points to consider about being a data-driven organization:
- If your execs can't envision what would change if the organization were significantly better at data, you have some work on your hands: understand their challenges first and then also evangelize.
- Your domains should be leaning in and understand what being data-driven means for their domain specifically. If they aren't, you won't be able to really help them move forward. You can't drag a team to being data-driven.
- Your execs should be aligned on what being data-driven means for the organization, especially at the macro level - how does it all fit together instead of becoming highly competent data silos? How do we focus on incremental value delivery via concrete use cases?
- You should absolutely make sure your data strategy, and your vision of the data-driven organization, ties to the actual business strategy and supports crucial priorities.
- You need to make sure you have - or can build - a test-and-learn culture. If not, can you really be data-driven?
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him at community at datameshlearning.com or on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

May 15, 2023 • 1h 15min
#222 "Not All Governance is Created Equal": Data Governance as a Data Mesh Value Lever and Driver - Interview w/ Lynn Noel
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts (most interviews from #32 on) here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Lynn's LinkedIn: https://www.linkedin.com/in/lnoel/
DAMA New England: https://damanewengland.org/
Correlation/Causation XKCD: https://xkcd.com/552/
In this episode, Scott interviewed Lynn Noel (pronounced Knoll), Data Governance Lead at AIM Consulting. To be clear, she was only representing her own views on the episode.
Some key takeaways/thoughts from Lynn's point of view:
- There is a strong data governance value proposition in data mesh around making data broadly "safer, better, and easier to use and share." Lean into that value and communicate it widely.
- ?Controversial?: Many organizations doing data mesh are moving from a very centralized approach and have specific concerns and approaches they should take, while other organizations are overly decentralized and need different approaches to migrating. Treat these as two different journeys that land on a similar end state.
- Look to return governance to its original meaning: guiding and steering. In data mesh, governance shouldn't be about control.
- ?Controversial?: "Not all governance is created equal." There are non-negotiables: infrastructure security and data classification. Don't over-index on sharing data at the expense of crucial governance.
- Shift your governance left. Shift the responsibilities left - where appropriate - but also shift governance left in your timelines: it needs to be part of the build processes.
- If you are doing data mesh, you have to understand the existing organizational governance structure - whether that is called data governance specifically or not - and align with it. Legal, risk management, IT, etc. Make sure they are aware of your work and give them confidence you will keep things well governed.
- Every member of a project team on a consulting engagement is obligated to deliver value. That means you need to translate your work to business value. Governance isn't only risk mitigation; it should have incremental value.
- In data, we too often "confuse the pipes for the water" - meaning we focus too much on the plumbing instead of the resource people want: understandable, usable, trustable data.
- Consider telling business users you are providing them an ecosystem of capabilities instead of a platform. They can understand that they have their own "habitat" and can customize aspects to meet their own needs, but that it all needs to work well together with other habitats too.
- Federated computational governance is an overly scary term. Break it down for users: what it means for them and why it actually makes their lives easier/better.
- ?Controversial?: "If you automate something that users aren't expecting to have automated, they will be extremely surprised and disoriented by the platform." Make sure to communicate and ask people what they actually want automated. What you see as friction might be value-add work for them. Communicate!
- You can plan out a lot of things but they just won't survive "first contact" with the data model and/or the business model. Be prepared to iterate, don't plan too much, and be flexible; iteration is great - you are implementing what you've newly learned.
- If your organization is coming to data mesh from a highly decentralized approach, it's very likely there are pockets of data that were never put into the central data infrastructure. It will probably be hard to get people to trust sharing that data and giving up control of it, because the previous approach couldn't handle what they were doing.
- ?Controversial?: As part of data mesh or any transformation initiative, you need people who have ownership over - responsibility for - organizational change management: driving buy-in at the personal and organizational levels. Scott note: I kind of like this but it's also scary. Who has the responsibility of driving buy-in specifically?
- Use the phrasing capability instead of feature when talking about your data platform. What capabilities are you offering to enable your use cases? And your capabilities should be very use-case agnostic.
Lynn started with her role as the lead data governance person for a data mesh implementation at a client. That role has allowed her to make governance part of the value drivers for all aspects of the mesh. She said data governance in mesh is about making data "safer, better, and easier to use and share." The value proposition of doing governance well in data mesh starts to emerge when you consider that to really get the most out of data, you need teams to feel like they can rely on data, not just use it.
It's crucial in all aspects of data, but especially with data mesh, that governance is an enabler instead of a blocker according to Lynn. As a governance leader, you can't be putting up blockers, but it's crucial for the team to understand why aspects of governance are helpful and add value. You should be building alongside - especially the building aspect, being part of the work - to keep things moving while also making sure you aren't setting yourself up for trouble later, whether that is quality, regulatory, or otherwise.
Lynn said "not all governance is created equal" - it's important to establish where you can and cannot compromise. Infrastructure security and data classification are both non-negotiables. Both tie into external threats and internal access control, including regulatory requirements. Risk management and compliance just cannot be ignored, so make them part of the build process and ensure everyone understands why they matter. It's core to product thinking around data.
There are multiple ways to 'shift left' in governance in Lynn's view. The first is shifting governance left in the project timeline - it should be part of every aspect of building your data mesh from the start: platform, data products, etc. You don't layer governance on at the end. We should also be shifting some of the responsibilities left to the developers - while giving them the capabilities and knowledge to handle what they can and come to experts for what they can't. (A minimal sketch of what a shifted-left governance check can look like follows these notes.)
Lynn has had some success talking to business users about providing them not a platform but an "ecosystem". It lets them envision something a bit more tangible: they will build out their own habitat, but it plays into a larger ecosystem. They are comfortable in their own habitat, which lets them better deliver value to themselves and others. Of course, your ecosystem can't infringe on other ecosystems, so it is a good analogy for how domains all interact.
When looking at what to automate as part of your platform, Lynn recommends strong communication with users. You can surprise and disorient them if you automate the wrong things. E.g. there are certain aspects of discovery analysis that look a lot like standard tests, but one is incremental, value-adding work where you want to dive deeper into the data and one is routine daily work with no value-add from doing it manually. Automate the second but not the first :)
Lynn brought up that your organizational setup before moving to data mesh will have a very large impact on what you should focus on in your journey. If you are coming from an overly centralized approach, you will understand the benefits of integration but might have some challenges with letting go of central control. If you are coming from a very highly decentralized world, you might need to focus much more on alignment, because domains may have had freedom to do as they please with little thought for users in other domains.
In organizations doing data mesh coming from a very highly centralized approach, Lynn says be prepared for a lot of dark data. Because the officially blessed IT systems were overly rigid, many people were doing work outside of the systems, often in Excel or similar. Getting people to trust that you'll take good care of their data and that they can still maintain access will be hard. There are many fun governance challenges, including naming conventions, to tackle with data like this.
For Lynn and her team, it's been very important to focus specifically on answering users' "what's in it for me?" - me in the personal sense and me in the organizational and/or domain sense too. And having people whose role involves owning organizational change management is very helpful as part of a product team. There is literally a person or people responsible for getting alignment and driving buy-in.
Quick Tidbits:
- In data, we too often "confuse the pipes for the water" - missing the forest for the trees. We focus too much on the plumbing instead of the resource people want: understandable, usable, trustable data. Focus on the outcomes and business value.
- It's easy to scare people by using the 'federated computational governance' term. Work with teams to show them where you can automate the past pains of governance so everyone can work on what drives incremental value instead of manually applying policies.
- Regarding a data warehouse: "I never saw a plain vanilla implementation survive anybody's business data model." Your plan will not survive first contact with the business model as-is. Be prepared to iterate on your plans in data mesh. But now, iterating doesn't break everything like it did for the enterprise data warehouse.
- Similarly, "no taxonomy survives first contact with the data." You have to build so you can update and iterate - prevent brittleness.
- Don't use the phrase feature for your platform; use 'capability' instead. Make sure you are building the capabilities to support multiple use cases and start to prioritize capabilities based on what use cases they can unlock.
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him at community at datameshlearning.com or on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
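To make Lynn's shift-left point a bit more concrete, here is a minimal sketch of a pre-deployment governance gate that treats data classification and infrastructure security as non-negotiables. This is purely illustrative and not from the episode: the spec field names (classification, owner, encryption_at_rest) and the allowed classification values are assumptions chosen for the example, not any standard.

```python
# Hypothetical pre-deployment governance gate, run in CI before a data
# product is published. Field names and values are illustrative only.

ALLOWED_CLASSIFICATIONS = {"public", "internal", "confidential", "restricted"}

# Each check returns an error string, or None if the spec passes.
def check_classification(spec: dict):
    c = spec.get("classification")
    if c not in ALLOWED_CLASSIFICATIONS:
        return f"classification must be one of {sorted(ALLOWED_CLASSIFICATIONS)}, got {c!r}"

def check_owner(spec: dict):
    if not spec.get("owner"):
        return "every data product needs an accountable owner"

def check_encryption(spec: dict):
    # Infrastructure security is not optional for sensitive data.
    if spec.get("classification") in {"confidential", "restricted"} and not spec.get("encryption_at_rest"):
        return "confidential/restricted data products must enable encryption_at_rest"

def governance_gate(spec: dict) -> list[str]:
    """Return all violations; an empty list means the product may deploy."""
    checks = (check_classification, check_owner, check_encryption)
    return [err for err in (check(spec) for check in checks) if err]

if __name__ == "__main__":
    spec = {"name": "customer_orders", "owner": "orders-team",
            "classification": "confidential", "encryption_at_rest": True}
    violations = governance_gate(spec)
    print("OK to deploy" if not violations else violations)
```

Run as part of the build, a check like this makes governance part of the build process rather than a review layered on at the end - the "shift left in your timelines" Lynn describes.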

May 14, 2023 • 13min
Weekly Episode Summaries and Programming Notes – Week of May 14, 2023
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

May 12, 2023 • 1h 3min
#221 Panel: Building Your Data Mesh Roadmap - Led by Eric Broda w/ Elizabeth Calloway and Phill Radley
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Eric's LinkedIn: https://www.linkedin.com/in/ericbroda/
Eric's Medium: https://medium.com/@ericbroda
Phill's LinkedIn: https://www.linkedin.com/in/redshirts/
Liz's LinkedIn: https://www.linkedin.com/in/elizabeth-negrotti-calloway/
In this episode, guest host Eric Broda, an Executive Consultant in the financial services space (guest of episode #38), facilitated a discussion with Liz Calloway, a Data Governance expert in financial services (guest of episode #92), and Phill Radley, Principal Data & AI Strategy Consultant at Thoughtworks. As per usual, all guests were only reflecting their own views.
Scott note: I wanted to share my takeaways rather than trying to reflect the nuance of the panelists' views. This will be the standard for panels going forward.
Scott's Top Takeaways:
- It's important to understand that when you decentralize, different aspects can - and should - move at different paces. Your roadmap needs to account for that, but it should also take advantage of it. Domains can go at their own pace, including ones looking to quickly drive towards significant data value.
- Everyone's roadmap, by its inherent nature, will be unique to the organization's context. Trying to copy-paste from another organization will end badly. That said, there are some pretty core capabilities that all mesh roadmaps should have.
- A roadmap should point you towards your target endpoint. Of course, with data mesh there isn't really an exact endpoint - you don't stop evolving, much like with microservices - but the idea is the same: how do you want the organization to operate relative to data production and sharing? Set your North Star to guide you.
- Your roadmap really should have time built in for "socialization". If you don't, then there is no real extra collaboration or communication between teams and everyone executes on what they expect is needed - you optimize at the micro level, not the macro. It's crucial to human cognition to not only be 'doing' but also thinking and relating to other people. That's how you can drive sustainable change management.
- Be very clear on your value proposition for your data mesh journey, but don't set your value proposition - and your budget ask - around one long-term delivery of value. Look to incremental value delivery and find value propositions for incremental work. Basically, deliver value along the way; don't tie your budget and value to a massive delivery 3 years down the road.
- For data mesh, absolutely be prepared to make compromises on when you plan to deliver capabilities. If you aren't prepared to make compromises, you unequivocally are not ready for data mesh.
- Focus more on business capabilities than purely underlying technical capabilities in your platform roadmap. The tech is the most tangible part, but it's also not the thing that really ends up driving value or being overly important to most users.
- Governance might actually be the most crucial aspect of your roadmap. If you don't map out your planned capabilities around governance, you don't really know when you'll be able to tackle certain types of use cases. And it's also very easy to leave crucial governance issues - like policy as code, data product blueprints, and interoperability standards - until later, which will hurt you as you try to scale.
Other Important Takeaways (many touch on similar points from different angles):
- Try to delay your key architectural decisions until you know more. At some point you have to make a call, but don't try to lock in your architecture right away. However, this will REALLY frustrate many (most?) data engineers, because they often care more about the tech than the capabilities.
- Overall, focus your roadmap on what you are trying to do to support and advance the business strategy. Data people want to focus on the data work, but it's very easy to diverge from the business strategy. So make sure you are keeping your roadmap aligned to business goals.
- If you are going to create a roadmap, get clear on the definition of your roadmap. What are you actually trying to achieve with it? What is important to include and what isn't? Don't create a roadmap just to create a roadmap - at the start of your process, define the goal around creating the roadmap.
- A roadmap at the high level is a strong communication tool to drive alignment across many parts of the organization. It is a vision to share and work towards. You will have your top roadmap, but you might also have roadmaps for aspects of your data mesh, like your platform.
- The roadmap isn't rigid. The target vision 'destination' is, but focus a lot on what your next 'pit stop' is - your next value delivery - instead of trying to plan everything out years in advance.
- If you're spending too much time ensuring you are 'right', being 'right' might not actually be valuable - you spent too much of your "time, talent, and/or treasure", as Liz said, and the return on investment isn't there. Learn to fail fast and iterate to value with smaller bets.
- Even when it comes to a roadmap, prepare people for one of Zhamak's mantras: think big, start small, move fast.
- Your roadmap shouldn't be too ambitious, and you should roadmap out your organizational changes too. Far too many try to focus purely on the technical and platform aspects. Do not try to escape Conway's Law - look to reflect its impact in your roadmap.
- Don't focus too much on exactly where you are going without reflecting on how far you've come. Think about your successes - showcase/celebrate them and try to measure them, as that will drive further buy-in and momentum. You can highlight the work of the team and inspire others. Make this part of your formal roadmap.
- Think about when you will add your self-service capabilities. You have to make some compromises at first in a data mesh journey, especially around the platform. It can't do everything you want on day one or you will have spent far too much time building instead of doing. Be honest about the importance of capabilities and look to automate where you find the most friction.
- Governance on your roadmap will really set a tone for your journey. Is this about better data processing or about better business outcomes through AI and analytics?
- Does your definition of minimum viable product for a data product evolve as you mature in your data mesh journey? It's an interesting question to dig deeper into.
- A simple, compelling value proposition could be: 'the work we do to satisfy regulatory compliance should be reused for our analytics and AI. We should drive the cost and complexity down for both by focusing on reuse.'
- Your roadmap could even include developing a better decisioning process for future work - focusing on tying business use cases back to data work, even something like measuring the value of your data work. It won't be great from day one, but think about the business processes around data work you want to mature.
- In your platform, potentially err towards building the business capability first and then building out the self-service aspect to take care of it. You need to understand what you actually want to build, and an excellent (the best?) way to do that is to get real-world feedback on the actual pain points of putting something into production.
- It's okay to have either (or both) an org-wide, large-scale transformation roadmap or a much smaller roadmap focused on incremental value delivery. Both can be valid approaches depending on your organization's situation.
- Be honest in assessing where you are in your data maturity journey, your data governance journey. No one is judging you on that. It's better to be honest with yourself than to go for bravado. Again, no one externally is looking at you, and especially not at your roadmap :)
- There are multiple successful and sensible ways to start a data mesh journey. You shouldn't be overly focused on how others are getting going. Yes, learn from them, but you can - and should - tweak to your organization. Don't go for easiest; go for most likely to succeed in the short and long run. Yes, easier said than done.
- Getting your first mesh data product to production is a "momentous occasion", as Eric said. You had to get a lot of things in place to achieve it.
- There are a few parallel value delivery tracks in your data mesh journey. There is business value delivery - delivering on the use case. And there is mesh capability delivery - delivering on the ability to do data mesh. When you first start out, the business value delivery will outstrip the capability delivery, but the capability delivery is where you have your long-term value leverage: your ability to scale value delivery.
- It isn't easy and can sound cheesy, but make sure to build empathy into your roadmap - how do we exchange context with each other and learn what drives value for each other so we can do better work that delivers more business value? Good empathy will lead to stronger communication and collaboration when it's valuable/necessary.
- If you see something, say something. Over-communicate. Think about what others might want to know and let them know; don't only wait for someone to ask.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

May 8, 2023 • 1h 26min
#220 Building Your Early Mesh Data Platform and Data Product Capabilities - Interview w/ Manisha Jain
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts (most interviews from #32 on) here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Manisha's LinkedIn: https://www.linkedin.com/in/evermanisha/
'A streamlined developer experience in Data Mesh' articles by Manisha:
Part 1 - Platform: https://www.thoughtworks.com/insights/blog/data-strategy/dev-experience-data-mesh-platform
Part 2 - Product: https://www.thoughtworks.com/insights/blog/data-strategy/dev-experience-data-mesh-product
Article on Lean Value Tree: https://rolandbutler.medium.com/what-is-the-lean-value-tree-e90d06328f09
Blog post on mentioned data mesh workshops: https://martinfowler.com/articles/data-mesh-accelerate-workshop.html
In this episode, Scott interviewed Manisha Jain, Data Engineer at Thoughtworks.
Some key takeaways/thoughts from Manisha's point of view:
- Manisha's top 3 pieces of early mesh journey advice: A) put together a specification of what a data product is and make it clear - your definition will evolve/improve, but if people don't understand the building blocks, it's going to be hard to build value. B) start to create standardized input and output ports, because that is how data products - your units of value - actually exchange value. C) make it easy to discover your important SLOs to create trust; partner with consumers to serve what they need.
- When working with your first domain, it's crucially important to make sure they have strong data engineering talent - whether that is existing people or someone embedded into the team. That way, someone inside the team building the data product can better interact with the platform team to communicate basic needs at the infrastructure level. This isn't as necessary for later domains - the platform already exists.
- Both the platform and product teams need to really understand and align on responsibilities and necessary capabilities. That will help you streamline your developer experience, which is crucial to scaling data mesh.
- People very often confuse data and data product. Look to get crisp on what a data product means internally. Scott note: we still don't have a simple, on-paper explanation of a data product you can put in front of a team of non-data folks…
- It's crucial to understand how data mesh can align with how your organization thinks and works. That will make it far easier to drive buy-in. "…only when they're comfortable with that concept … will [it] make sense to go ahead and explore more."
- The platform team needs to focus on delivering capabilities to domains, not technologies. But they also need to think about mesh-level capabilities, e.g. supervision capabilities. Think about what capabilities are needed when; don't boil the ocean. Maybe certain use cases are too difficult to tackle right now. Scott note: that is the mark of a good org approach, when you can say "we aren't ready yet" and it's okay.
- It's important to remember that the data platform is there to make it easier to create, deploy, and manage/evolve data products. Your data products are the unit of value exchange in data mesh. Use that as a guide when deciding what the platform should offer.
- ?Controversial?: Boundaries around data products and data product teams are more important than most realize. We really do need teams to be able to act independently and not deal with the hassles of shared infrastructure. Scott note: I know people get worried about cost, but time to market of information matters too, as does the time the platform team spends untying infrastructure knots.
- Consider doing a series of workshops with a small group closely aligned to each domain to drive understanding and alignment. Don't try to do all the domains at once; each has a different set of needs and capabilities.
- The Lean Value Tree is an effective method to break down what you are trying to accomplish into actionable pieces of work with explicit assumptions. You state the bets you are making, the hypotheses you are testing, etc., so people are on the same page.
- When you start working with a domain in data mesh, really drive down to specifics. Don't just identify a use case and maybe the necessary data products. What skills and tooling are necessary to create and maintain those data products? What would a team that owns those data products look like? How are you going to put that all together?
- In data mesh, every new domain is different, so the onboarding plan for each incremental domain will have to be adjusted. That doesn't mean everything starts from scratch, but you need to assess gaps in each domain's capabilities to figure out how best to enable them.
- To understand what capabilities domains need, the platform team needs strong communication and a partnership with the domain teams: build what they need, what helps them get the job done, not the coolest platform :)
- The platform team should focus on addressing three questions in the initial build: a) how do users create value? b) how can we ensure users trust the data? and c) how do you make data products usable and discoverable?
- Your platform team probably won't recognize everything that will be a reusable component until they've brought multiple domains onto the platform. And that's okay.
- ?Controversial?: Most aspects of concepting and then building/deploying a data product are reusable; the data modeling and data transformations are the things that are unique to every data product.
- Interoperability standards won't happen magically, but you can do more custom mapping between data products very early in your journey to get use cases out. Start to look for places to create simple standards used to integrate data products.
- ?Controversial?: Offering automated modeling or sample data models is a double-edged sword. They can be helpful to get teams to something passable, but they are rarely going to create a good data product without more work anyway. Basically, at best it's an outline and shouldn't be published like it's the article.
- Look to build the thinnest slice that delivers good value. Don't get ahead of yourself. But don't try to get by ignoring one of the data mesh principles.
- As you build data products in a domain, you will learn more and more about that domain, which will lead to enhancing/evolving the data products you do have and potentially creating new ones as use cases emerge. It's important to stay curious and be prepared to share new insights via data.
- ?Controversial?: Data people have to really learn to speak the language of the business. Otherwise, we will continue to talk past each other instead of learning and iterating together towards value.
Manisha started the conversation with her thoughts on how to get going with data mesh - onboarding any domain, but especially your first domain. Work with a small team aligned with that domain to find how data mesh can align with how the organization works and thinks - this will be different for every organization. That alignment is crucial to getting people comfortable and driving buy-in. People have to be comfortable with how it will work and what their responsibilities are. As Manisha said, "…only when they're comfortable with that concept … will [it] make sense to go ahead and explore more."
According to Manisha, when you are bringing teams up to speed, it's really crucial to get on the same page about what you mean and what you expect from them. They often confuse data and data product, for example. The differences can be subtle but are important to understand. As Chris Haas also stated in his episode, they are using the Lean Value Tree method to break down target outcomes into explicit assumptions and more manageable aspects of work. What are the bets you want to make and what are the hypotheses you are testing?
Your initial workshop(s) with a domain can also be a lesson in how to deliver value using a data mesh approach and prioritization. Manisha talked about how, when working with a domain, you might identify multiple potential use cases. But you need to choose what is a priority to do now and why. This can surface the top one or two use cases and also show the domain how to prioritize as use cases continue to emerge in the future. The use case(s) they select then directly lead to discovering the data products needed to support the use case. Then you identify what skills and tooling are needed to actually build and then maintain the necessary data products. From there, you can start to back into what a team working on the necessary data products (and potentially platform) would look like. You can use that Lean Value Tree concept to really get specific, because far too often in data work, things are left too vague. Scott note: Get specific, get explicit, chase away vagueness - but of course leave LOTS of room for experimentation and iteration as you learn and build.
When asked more about workshop dynamics, Manisha shared how they try to keep them from being too heavy on the domain - get a few people, maybe 2-4, who really understand the domain and can represent the business aspects, not just the data and/or technical aspects. Each workshop has its own goal as an outcome, but it's important to first align data mesh to organizational goals, the business strategy. Then you can get into data mesh specifics. They call their workshops 1) accelerate, 2) discovery, and 3) inception.
Manisha shared some crucial dynamics when working with your first domain that do get easier as you bring on additional domains. In the first domain, it's crucial to really narrow in on understanding and definitions, including roles and responsibilities. Data product owner is a new role - what does it actually mean? And there's the initial platform work too. But as you bring on your third, fourth, fifth, etc. domain, there is internal learning to share with the new domains. There is more clarity around what a data product is - they can even see already-built data products - and around roles/responsibilities. But you will definitely need to do a gap analysis to figure out how to best enable each domain, as each domain is unique. So there is a balance: look to maximize reuse of platform, processes, organizational changes, etc., but don't force new domains to adhere to exactly how previous domains went through the journey.
For Manisha, it's very important for the platform team to think in terms of capabilities. Deliver capabilities, not technology, to the domains. Work closely with early data product teams and focus on what they are trying to do instead of how you want to solve the technical aspects. Focus specifically on what they are trying to achieve. Also, the platform team needs to consider what mesh-level capabilities are necessary when. Don't try to deliver a complete platform at the start - your platform is a product; for a minimum viable product, make sure you understand what minimum means and don't go overboard.
The platform team can focus on a few simple things to drive to a good initial outcome/partnership with domains in Manisha's view: 1) how does the work create business value? What do the domains need to do to actually drive value? 2) How will users trust data - what does trust mean and what's needed? 3) How do we make it possible for domains to create and manage a data product that is usable and discoverable? By focusing on the task at hand and then mapping to capabilities to support that task, you can prioritize and deliver something useful and valuable without boiling the ocean. You don't need to include every capability at the start - that is an anti-pattern. Get close to the use case and find friction. You will also learn to recognize reusable components of the platform, though some reusable components might not be evident at the start.
Manisha then went further into finding and identifying reusable components. The things that are most unique to each data product are the data modeling and data transformation, in her experience. Almost every other aspect of spec-ing out and building a data product is reusable, merely customized to the data product itself. Finding the necessary SLAs and SLOs by working with consumers - that is a reusable process. How your SLAs are actually measured and the definitions around those SLAs are reusable. The infrastructure and CI/CD are reusable. The overall data product blueprints are reusable. So look to make these reliable as your organization learns how to build data products, to make for easy reuse.
On data modeling and interoperability, Manisha shared that it's crucial to let domains evolve how they model their data as they learn. And interoperability, especially to support a use case, is of course important; but you will likely see the need for interoperability standards emerge when it's needed - basically, don't try to build all your standards ahead of time. That might be creating an enterprise data model with a different name :)
When asked specifically about sample data models and automated data modeling tooling, Manisha pointed to them being a double-edged sword. While they can be helpful, most (all?) data products need more custom data modeling to maximize their value. Essentially, the tools can get to a decent initial data model, but domains should look to improve them. If platform teams offer automated modeling tools, they should make sure there is a big caveat on their usage.
Manisha recommends you make sure your initial domain has strong enough data talent - whether existing or embedded - to communicate the basic needs to the platform team. Regular developers are often not going to be data fluent enough at the start to drive to exact data infrastructure needs like a data engineer could. But be careful not to over-index towards tech too. Every domain will need people skilled in creating value through data modeling, but you probably won't need people as advanced in data infrastructure later - the platform is already built by that point :D
It's important to differentiate what the platform should offer and what the data product developers should handle, according to Manisha. The platform, at least the aspects around data product creation, should be focused on making it quicker, easier, and more reliable to create, deploy, maintain, and evolve data products. It sounds simple, but it's actually easy to lose focus on that. Look for friction points in the creation and management lifecycle and automate what doesn't add incremental value. E.g. a data product developer shouldn't have to manually add data to the catalog, so look to automate that - and yes, not everything should be built upfront :) Scott note: she added some good flavor around data product boundaries but it's very hard to summarize.
Within the platform, Manisha believes it's very important to maintain team boundaries, because shared resources become a bottleneck and can pretty quickly become very hard to manage. This is why Zhamak has been so clear on the data product as an independently deployable unit of architecture. Manisha gave the example that even the namespace for data products in the data catalog should be reserved for a single team, so teams have a dedicated space to put all their data products.
Manisha gave some early mesh journey advice:
1) Back to the data product specification: create something that gives teams a very clear idea of what a data product is and encompasses. Scott note: still waiting for someone to open source their data product creation template…
2) If, as Zhamak says, data products are our unit of value exchange in data mesh, then making it easier to exchange value is crucial. Start to create standardized input and output ports so you can easily ingest and serve data. ETL shouldn't be a concept - it's ingest or serving only.
3) Really focus on making it easy to discover and then implement SLOs and SLAs. Being able to understand and trust data is crucial to being willing to rely on it. That trust comes from good communication around SLAs.
Manisha believes learning the language of the business is crucial for data people. You need to extract the actual business value drivers and build to those, so you have to be talking the same language - unfortunately for data people, the language that aligns to business value is usually the business language :) Look to ask more business user-focused questions instead of trying to get technical.
Quick Tidbits:
- "… the data product spec should [at a] minimum talk about the data set ports, domain, service level agreements, how do I share my data, what does data sharing look like…" - Make your data product specification easy to understand: what someone will create and what a consumer will receive. (A hedged sketch of such a spec follows these notes.)
- Again, focus on a streamlined developer experience that keys in on autonomy. That's the way.
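The quoted minimum spec lends itself to a concrete artifact. Below is a hedged sketch of what such a specification could look like, expressed as Python dataclasses. The exact fields and names (Port, SLO, direction, etc.) are assumptions chosen for illustration - not Manisha's spec, and not any published standard - but they cover the quoted minimum: ports, domain, SLAs/SLOs, and how data is shared.

```python
# Illustrative-only sketch of a minimal data product specification.
from dataclasses import dataclass, field

@dataclass
class Port:
    name: str
    direction: str   # "input" (ingest) or "output" (serve) - no ETL concept
    format: str      # e.g. "parquet", "avro", "rest-json"
    location: str    # where a consumer connects, e.g. a table or endpoint

@dataclass
class SLO:
    metric: str      # e.g. "freshness_minutes", "completeness_pct"
    target: float    # the promise consumers can rely on

@dataclass
class DataProductSpec:
    name: str
    domain: str          # the owning domain
    owner: str           # accountable data product owner
    description: str     # what a consumer will receive, in business terms
    input_ports: list[Port] = field(default_factory=list)
    output_ports: list[Port] = field(default_factory=list)
    slos: list[SLO] = field(default_factory=list)

# Example: a spec a consumer can read to understand what they get and how.
spec = DataProductSpec(
    name="customer-orders",
    domain="orders",
    owner="orders-data-product-owner",
    description="Confirmed customer orders, deduplicated, updated hourly.",
    output_ports=[Port("orders_hourly", "output", "parquet", "lake/orders/hourly")],
    slos=[SLO("freshness_minutes", 60.0), SLO("completeness_pct", 99.5)],
)
print(spec.output_ports[0].location)
```

Standardizing a structure like this is what makes the rest reusable: the ports and SLO definitions can be shared across data products while the modeling and transformations stay unique to each one.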

May 7, 2023 • 33min
Weekly Episode Summaries and Programming Notes – Week of May 7, 2023
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

May 5, 2023 • 21min
#219 Zhamak's Corner 22 - Increasing Resilience of Data Processes Through Software Best Practices
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Sponsored by NextData, Zhamak's company that is helping ease data product creation.
For more great content from Zhamak, check out her book on data mesh, a book she collaborated on, her LinkedIn, and her Twitter.
Key Takeaways:
- We need to be better about getting on the same page regarding some semantics in data mesh. Otherwise, it's hard to work together internally and across organizations to move the industry forward.
- There are so many things we've learned on the services side about breaking systems into smaller components and preventing tight coupling, but the data world has yet to apply those learnings. We're heading down some paths we don't need to go down if we apply those past learnings.
- As an example of the above, early data contract approaches are too tightly coupled around schema. We need to be a little less rigid there, but how is still to be determined.
- Postel's Law: "Be conservative in what you do, be liberal in what you accept from others." Learn it and think about how to apply it to data so we create more resiliency across our internal data ecosystems. Right now, there isn't much out there on the how. (See the sketch after these notes.)
- Resiliency at scale is possible on the operational plane, so why not the data plane? We need to be "very mindful and not naïve" around how we integrate in the data world so we don't repeat the mistakes we made on the services side.
Postel's Law: https://ardalis.com/postels-law-robustness-principle/
Semantic Diffusion article Zhamak mentioned: https://www.martinfowler.com/bliki/SemanticDiffusion.html
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Data Mesh Radio episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
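One way to ground Postel's Law in a data context - an assumed sketch, since the episode deliberately leaves the "how" open - is the tolerant reader pattern: a consumer validates only the fields it actually depends on and ignores everything else, so a producer adding columns doesn't break it. The field names and types below are hypothetical examples.

```python
# A "tolerant reader" sketch: be liberal in what you accept from producers.
# The consumer declares only the fields it depends on; extra or reordered
# columns in the producer's output do not break it. Illustrative only.

REQUIRED_FIELDS = {"order_id": str, "amount": float}

def read_order(record: dict) -> dict:
    """Extract and type-check only the fields this consumer relies on."""
    out = {}
    for name, expected_type in REQUIRED_FIELDS.items():
        if name not in record:
            raise ValueError(f"producer contract violated: missing {name!r}")
        value = record[name]
        if not isinstance(value, expected_type):
            raise TypeError(f"{name!r} should be {expected_type.__name__}")
        out[name] = value
    return out  # everything else the producer sends is deliberately ignored

# The producer added "currency" and "channel"; this consumer keeps working.
print(read_order({"order_id": "A-1", "amount": 12.5, "currency": "EUR", "channel": "web"}))
```

This is one possible reading of "less rigid than full-schema coupling": the contract covers only the fields a consumer depends on, not the producer's entire schema.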

May 1, 2023 • 1h 31min
#218 Building the Right Data Strategy: Why Are We Even Doing This - Interview w/ Beth Bauer
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/Please Rate and Review us on your podcast app of choice!If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see hereEpisode list and links to all available episode transcripts (most interviews from #32 on) hereProvided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.Beth's LinkedIn: https://www.linkedin.com/in/beth-bauer-102449/Beth's Website: https://posiroi.com/Harvard Business Review article on 'The 3 Elements of Trust': https://hbr.org/2019/02/the-3-elements-of-trustIn this episode, Scott interviewed Beth Bauer, Founder and CEO, PosiROI. FYI, there are lots of nuggets in this one for people creating a data strategy or trying to tie your data work to value creation.Beth's ADEPT^2 Framework (covered briefly near the last 10min of the episode): Analytics - Acuity - Data - Decisions - Engagement - Enablement - People - Processes - Technology - TrustSome key takeaways/thoughts from Beth's point of view, much of which she helped craft:To do data right, we need shared responsibility. There is the technical piece of course but the business aspect is just as important. "…we need to realize that nobody's anything without each other" across the units and enterprise. ?Controversial?: Really good data management can cause some challenges to power structures, especially "how it's always been done" power structures. Try to work with people to give them sight to how they are important in a changed organization.?Controversial?: Don't think data or digital _transformation_. A transformation is something that completes. This is a journey, an ever evolving journey of improving your data practices.Data fluency is crucial - not just giving people the ability to work with data but the trust, especially in themselves, to leverage and even rely on data. People need guiderails to know where they can safely create insights independently, and where they need to ask for more expert guidance.Trust is made of relationships, judgment, and consistency (from HBR article linked above). Over half (half!) of trust is driven by relationships. If you want people to trust your data, you have to form and build relationships. In the same vein, if data itself is the consistency + judgment, that doesn't even get people half way to trust.It's crucial to think about delivery timelines for data strategy work. There are things that will take much longer and deliver considerable value but you need to break them down into manageable pieces that have incremental value as you deliver. As needs shift, you can react because it isn't a locked long-term strategy. Scott note: that project with a payoff only starting 3yrs down the road, it's always 2.5yrs too late. Deliver value incrementally building to the desired outcome, which can shift.?Controversial?: Sometimes it's necessary to go from the high-level data strategy vision all the way down into the weeds and vice versa. 
Oftentimes in data, those weeds really do matter and you should look to connect the strategy to specifics when it's of value.It’s really important to recognize that gaining alternate lenses from outside your usual project team can help, regardless of your data function. There's no shame, and it shouldn't be for everything, but expertise outside your normal team is often of great value. Data is a team sport."A continuous process of gap analysis is absolutely critical." If you aren’t continually assessing your competencies, your data practice will deteriorate. Unfortunately, we are never done, new competency needs emerge and existing ones can easily atrophy.When people ask for data, it's important to simply ask "what are you going to do with it?" If they can't answer that - what would it change for them? - then you want to figure that out before doing significant work. Scott note: as Alla Hale mentioned in episode #122 "what would having this unlock for you?"To do data well, you need to create a high-level vision and strategy that is flexible and can evolve. And then you need to break down your target milestones and start working out what needs to be worked on when. It's okay to make long-term bets but you need incremental value delivery as well.?Controversial?: Collaboration in data can be a double-edged sword. It can add significant value but coordination can cause friction and delays. Look for ways to avoid coordination slow-downs so you can deliver value faster and more easily but still collaborate to drive towards mutual benefit/value. You have to trust that others will deliver, give them the room to do so.Data without context, without the proper metadata and owner shaping it, can often be more harmful than beneficial.?Controversial?: Data people have to get comfortable with giving up control and enabling teams to work autonomously to deliver value. It's important to prevent silos, yes, but you still need to let go of some control to create more value as an organization.Look to the Agile philosophy and find ways to achieve the goals of Agile. Orgs doing Agile often get caught up in the ceremony and in the small picture work. Don't focus on ticket closure achievement, work in a nimble way to build incrementally towards much bigger pictures.Data isn't the point. Data is a driver to do business better. Data for the sake of data is far too common of a trap - avoid it and focus on delivering better business results through data.To do data right, producers and consumers need to be able to talk transparently about goals, expectations, limitations, etc. That goes back to trust - both sides have to be able to have a genuine, high context conversation with strong trust.The business people don't need to know "how the sausage is made" relative to data processing but they do need to know what type of sausage it is, what goes into it, what's the flavor, is it a patty or a link, etc. Scott note: this is sharing the information, not merely the 1s and 0s of data. We need to focus on sharing information, not pure data.You need to build out a data sourcing strategy - both external sourcing and internal sourcing. Internal sourcing then further splits into preparing existing data for usage and creating new sources where you lack the data entirely.It can be easy to lose focus on how data work, especially execution on data work, supports the business strategy. 
But that's where we constantly see the statistics on data work not meeting expectations - when the work gets disconnected from the business value and business strategy.All parties understanding where information actually comes from, how it was generated or sourced, will drive far more trust in the data.?Controversial?: Much like it's crazy to set a data strategy that doesn't reflect the business strategy, it's similarly crazy to set a business strategy not backed by data. You might not be overly sophisticated in the data you initially provide to a business strategy team but if they aren't leveraging data, they are missing crucial context and understanding.Minimum viable products (MVPs) are very important to innovation and collaboration. They are "… your first iteration of … did I hear you properly? And are we beginning to create what we needed to together?" But too many try to rush them or overengineer them. Focus on both what minimum and what viable mean. Use this as pilot-testing.Beth started the conversation with a bit about her background and then got to swinging a bit at current practices :) She talked about the need to have high-level data strategy vision but also be able to understand things in the weeds. Oftentimes those details in the weeds are important to being able to execute. Just don't get lost in them! Setting a flexible, evolvable strategy and vision is crucial in Beth's view. You need to have a vision of where you want to go before you can figure out what actions you want to take. And then you need to look at milestone goals and break down what needs to be done when. Look to find incremental value delivery instead of putting all your value eggs in the basket that won't pay off until 2-3 years down the road. No one is willing to wait that long and needs will probably shift along the way. It's definitely okay to make long term bets that will require work to deliver value years down the road but focus on delivering value before that too. Priorities will change and so will your understanding of what will deliver the best value. So set out what needs to be done and when and do get going and be ready to reprioritize as you learn!Beth talked a little about the cost of collaboration. If that is loosely coupled collaboration but not necessarily coordination on most to every step, great. But there is always a friction cost to closely controlled collaboration. Try to avoid work that requires too many dependencies - get aligned and work together towards a common vision. We really need to have trust that other parties can and will deliver. If you need to be working that closely, do you really trust each other?Agile can be as much of a hindrance as it can be a help in Beth's view. If it's about taking the bigger picture and constantly breaking it down into achievable pieces, great. But if the focus is on completing work instead of driving to bigger picture value - if the work doesn't have an extremely tangible connection to value - you get lost in the cogs of the machine. You need a strategy that can handle how the cogs work but the cogs aren't the point of the machine - what are they actually driving?Trust is made up of 3 elements according to an HBR article (linked above): relationships, judgment, and consistency. Relationships are over 50% of that trust. Beth believes data is essentially the judgment and the consistency but to get people to use the data we have, the data we can provide, we need to build relationships. 
Trust is made up of 3 elements according to an HBR article (linked above): relationships, judgment, and consistency. Relationships are over 50% of that trust. Beth believes data is essentially the judgment and the consistency, but to get people to use the data we have - the data we can provide - we need to build relationships. You can't just show them the data; it doesn't even get them halfway to trust! Scott note: one comment I make is the difference between someone simply using the data and someone relying or depending on it. It seems like a small differentiation but it's not. One is using the brick as an accent to the building and one is building with the brick as a key element of structural support.
Beth pointed to how, while the business folks don't need to know exactly how "the sausage is made" relative to data, they do need to understand more than just 'here's some data for you.' It's about sharing the necessary context. If you want sausage analogies: what flavor, what are the ingredients, what is the shape as in patty versus link, etc. They don't care about the data processing techniques but they should care about - and can gain value from - how the data was transformed from a business perspective. Scott note: this is where I talk about sharing information versus data. Just the 1s and 0s of data have no value without context, as Beth said. So embed the context and focus on sharing it; otherwise it is just values in a more complicated spreadsheet :D
Creating data sourcing strategies - at the micro and macro level - is important in Beth's view. Don't overly rely on external data; that's costly. What data do you already have internally that you should leverage? What data could you be generating internally that you aren't? Dive into specifics and create a scalable way for lines of business to figure out good paths with sourcing data - internally and externally - going forward. Make sure to look at things from a cross-domain lens too and also think about privacy, regulatory requirements, etc.
For Beth, many organizations have trouble keeping the data work aligned to the business strategy. So there needs to be a specific focus on making the work matter, driving to business value. Yes, at the micro level but also on the whole - what business objectives and business outcomes is the data work supporting? Getting "down in the weeds" can also be very helpful; the details do matter as they make the work "fit-for-purpose".
On business strategy and data, Beth echoed the view many other guests have shared that creating a data strategy not aligned to the business is not a smart practice. But almost as egregious - and extremely commonplace - is not using data to help power your business strategy. Data is synthesized knowledge of what is going on in the world, often how the organization is interacting with the world. Why wouldn't you want to leverage that for shaping your strategy?!
In data, Beth has seen the benefit of the MVP (minimum viable product) methodology - MVPs are wonderful if used correctly. However, they are often not used correctly :) Innovation doesn't have a steady timeline - it's messy. MVP timelines are tough, so focus on getting to something viable instead of hitting a deadline - and communicate that to stakeholders. MVPs are about making sure you are on the same page and then iterating to better from there.
Beth talked about the need for continuously doing gap analysis on your data and business capabilities. The world is ever changing and new challenges and needs will constantly come up. Plus, current capabilities can atrophy. You might be hindered in projects because you need some special capability - especially think legal/regulatory compliance - and you should know that _before_ doing the work :)
Two key questions Beth uses when people ask for data/data work: 1) What are you going to do with this? And 2) How much value do you think this will generate? You don't need to get super specific but people need to at least have a good idea of what the work will unlock and the value of that. If it won't cause any action, why do the work? If the cost outweighs the value, that should be known so you can work to balance that equation by cutting costs and/or finding more value.
For Beth, to do data right, we need shared responsibility. There is the technical piece of course but the business aspect is just as important. "…we need to realize that nobody's anything without each other." We need to drive to address current gaps and we need short, medium, and long-term strategies that drive the necessary work in the short, medium, and long-term. Don't get overly focused on either the near term or the long term. But it's not just about doing the data work, especially the technical data work. What value and change does this work actually drive?
"And what I found is that largely, a lot of organizations, the challenge is, with really good data management comes really good transparency into how things work. And that really causes pushback on the power structures, and particularly in the 'how it's always been done' power structures. Because if it now points to a way that things can be done better, you start to get into things. Things are happening behind the scenes that have nothing to do with data, and everything to do with people's perceived value of themselves to the organization - without (them) thinking about how they can evolve to actually move from what they're doing today to doing it better." Scott note: it's crucial to help people see how they can move to doing more valuable things - their time of toil is behind them and we can unleash the value creation :)
Quick Tidbits:
In data, there is often a rush to get things done instead of getting to automation. It's often - but not always - the right call to slow down to do things right and set yourself up to do them faster/better/more scalably as you move forward.
Lineage, especially for how data was generated and transformed from external sources, is really crucial to increasing trust in that data. See the sketch after these tidbits for one way to think about it.
"Data for the sake of data is useless." Scott note: PREACH!
"The digital transformation, to me, is the biggest misnomer and misguidance that we've ever created." A transformation has an end. This is a journey.
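As a companion to the lineage tidbit above, here is a small, hypothetical sketch - not any real catalog's or tool's schema, and all names are illustrative assumptions - of recording where a dataset came from and who transformed it, so a consumer can judge trust without reading pipeline code:

```python
from dataclasses import dataclass

# Hypothetical sketch of a per-dataset lineage record: just enough provenance
# for a consumer to judge trust. Names are illustrative, not a real schema.
@dataclass
class LineageStep:
    input_source: str   # upstream dataset or external provider
    operation: str      # business-level description of what was done
    performed_by: str   # owning team/service accountable for this step

def describe_lineage(steps: list[LineageStep]) -> str:
    """Render a human-readable provenance trail, oldest step first."""
    return " -> ".join(
        f"{s.input_source} ({s.operation}; by {s.performed_by})" for s in steps
    )

# Example trail for an externally sourced, internally transformed dataset
trail = [
    LineageStep("vendor_fx_rates (external)", "validated and deduplicated", "market data team"),
    LineageStep("orders_raw", "joined with FX rates, amounts normalized to USD", "checkout domain"),
]
print(describe_lineage(trail))
```

Even something this simple answers the trust-building question "where did this number actually come from?" at a glance.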
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Apr 30, 2023 • 19min
Weekly Episode Summaries and Programming Notes – Week of April 30, 2023
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Apr 28, 2023 • 20min
#217 Pt 2 of Another One on Buy-In: Flipping the Script on Working with Your First Domain - Mesh Musings 47
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts (most interviews from #32 on) here
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Here are 8 more leverage points to consider when trying to work with your initial domain:
Make it EXTREMELY tied to their top priorities.
Find additional funding sources.
Find a true executive sponsor - think about being the one that brought a whole new approach that is driving far better data results…
Having their data be leveraged by other teams can mean more funding flows for their data projects and potentially their domain as a whole.
They control how things work for everyone in the data mesh implementation. They aren't just the trendsetter - the data mesh implementation leads will make it work for everyone - but they are the secret favorite child and initial shaper.
Similar to moving at the speed of business, MUCH smaller experiments.
Better data quality.
Potentially easier regulatory reporting.
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf