
Data Mesh Radio
Interviews with data mesh practitioners, deep dives/how-tos, anti-patterns, panels, chats (not debates) with skeptics, "mesh musings", and so much more. Host Scott Hirleman (founder of the Data Mesh Learning Community) shares his learnings - and those of the broader data community - from over a year of deep diving into data mesh.
Each episode contains a BLUF - bottom line, up front - so you can quickly absorb a few key takeaways and also decide if an episode will be useful to you - nothing worse than listening for 20+ minutes before figuring out if a podcast episode is going to be interesting and/or incremental ;) Hoping to provide quality transcripts in the future - if you want to help, please reach out!
Data Mesh Radio is also looking for guests to share their experience with data mesh! Even if that experience is 'I am confused, let's chat' about some specific topic. Yes, that could be you! You can check out our guest and feedback FAQ, including how to submit your name to be a guest and how to submit feedback - including anonymously if you want - here: https://docs.google.com/document/d/1dDdb1mEhmcYqx3xYAvPuM1FZMuGiCszyY9x8X250KuQ/edit?usp=sharing
Data Mesh Radio is committed to diversity and inclusion, including in our guests and guest hosts. If you are part of a minoritized group, please see this as an open invitation to be a guest - just hit the link above.
If you are looking for additional useful information on data mesh, we recommend the community resources from Data Mesh Learning. All are vendor independent. https://datameshlearning.com/community/
You should also follow Zhamak Dehghani (founder of the data mesh concept); she posts a lot of great things on LinkedIn and has a wonderful data mesh book through O'Reilly. Plus, she's just a nice person: https://www.linkedin.com/in/zhamak-dehghani/detail/recent-activity/shares/
Data Mesh Radio is provided as a free community resource by DataStax. If you need a database that is easy to scale - read: serverless - but also easy to develop for - many APIs including gRPC, REST, JSON, GraphQL, etc., all of which are OSS under the Stargate project - check out DataStax's AstraDB service :) Built on Apache Cassandra, AstraDB is very performant and, oh yeah, is also multi-region/multi-cloud so you can focus on scaling your company, not your database. There's a free forever tier for poking around/home projects, and you can also use code DAAP500 for a $500 free credit (apply under payment options): https://www.datastax.com/products/datastax-astra?utm_source=DataMeshRadio
Latest episodes

Aug 18, 2023 • 22min
#249 Zhamak's Corner 27 - Creating Truly Scalable Interconnectivity for Data
Takeaways:
- A data product, by the data mesh definition, cannot be a silo because it is meant to be consumed by the rest of the organization as part of the greater picture of the organization. Whether that holds true in practice remains to be seen :)
- APIs are the way we've learned in software to interconnect services, so we need to learn how to leverage the same approaches in data.
- Within a single organization, we may be able to get by without a great set of industry-wide interconnectivity standards - the company can fully control its own sharing. But we know that to truly unlock the value of our organizations, cross-organization data sharing - in a scalable and useful way - is necessary. So we need better ways to do that: standards and tooling/technology.
- It remains to be seen which will emerge first, the standards or the tooling/technology - a chicken-and-egg problem - but they really must be intertwined to work well.

Sponsored by NextData, Zhamak's company that is helping ease data product creation.
For more great content from Zhamak, check out her book on data mesh, a book she collaborated on, her LinkedIn, and her Twitter.
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Data Mesh Radio episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Aug 14, 2023 • 1h 10min
#248 Doing Data Quality Right by Building Trust - Interview w/ Ale Cabrera
Please Rate and Review us on your podcast app of choice!
Get involved with Data Mesh Understanding's free community roundtables and introductions: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Ale's LinkedIn: https://www.linkedin.com/in/alejandracabre/
In this episode, Scott interviewed Ale Cabrera, Senior Data Quality Product Manager at Clearbit. To be clear, she was only representing her own views on the episode.
Some key takeaways/thoughts from Ale's point of view:
- A key part of understanding what data work will be impactful is a simple phrase: "Is my understanding correct?" Repeating back what you took in and making sure you're on the same page will save a ton of time and headaches!
- Her advice to her past self: in data, far too often, we try to jump to solutioning instead of really taking the time to understand the problem. Start by understanding and assessing the problem first.
- It's very easy to make data say something it isn't actually reflecting. Quality isn't just about accuracy or similar metrics; sometimes there are intangible aspects of correctness that people sense but usually can't measure.
- In data work, many people miss two crucial aspects: the voice of the customer and the why. If you build the greatest thing ever but it isn't what the customer wants, it won't be used.
- Similarly, if you focus on the work and not the target outcome, your results are likely to be subpar.
- If you want to prove data work's return on investment, try to tie it to a key company metric and talk about how improving that metric will drive better business outcomes.
- When you want to prove out the value of data quality, attach quality issues to direct business challenges or goals. It's easy if you are a company selling data, but you have to understand why bad-quality data negatively impacts the company in order to gain the influence to improve your data quality.
- Far too often in data work, people try to exchange data - the 1s and 0s - and forget to exchange information. Get people together and make sure you are in alignment; get people actually talking so you understand each other and have a good path forward.
- The most crucial aspect of data quality is trust. If you can't drive trust, then all the 'quality' in the world doesn't mean anything because people won't use it.
- As a data person, your job is not to do data work. It is to unblock teams/projects/people and add value. Yes, that happens through data work, but the work isn't the point.
- The way to think about trust is a combination of "credibility, reliability, and intimacy by self-orientation."
- Measuring trust is hard, but a good way is through interviews with users.
- Nothing will ever be perfect, so consumers need to understand that. A mature and healthy conversation is to ask them to tell you if they see any data quality issues, because issues can always happen. If they won't consume data that might not be perfect, they can't consume any data.
- When you are quick to react to data quality issues, your consumers can get more value from the data because they can be sure sooner whether a change in the shape of the data is an issue or a genuine reflection of something changing.
- In data, one-size-fits-all typically ends up fitting no one. You don't want to over-customize a data product, but your consumers are likely going to want something that fits their needs rather than something generic they have to do the heavy lifting to use.
- How people consume the same data can have a large impact on trust too. While data people hate Excel, sometimes you have to deliver something consumers can use in Excel. Otherwise, they will struggle to trust the data enough to rely on it.
- Is what makes something great the output or the impact? When applying product thinking, something doesn't really matter unless it is used to generate value.
- A key aspect of product thinking is prioritization. Yes, it would be great if we could build to every ask, but focusing on what will most likely deliver value - and why - will keep you focused on generating value through your data work.

Ale started with a bit about her career and what led her to focus on data quality, including what led her to her current role. On advice to her past self: in one of her past roles, a lead engineer gave her enduring advice: think before you code 😎 Oftentimes in data, people jump to solutioning instead of really taking the time to understand the problem and weigh the potential solutions to choose one that will work better in the long run. She also said to respect the data when considering quality. You can get data to tell a story that doesn't reflect the reality of what is happening, so give it the respect of not shaping it to tell "statistical lies".
Another important aspect of data work for Ale is focusing on the voice of the customer and the why. What is the user telling you? What are the pain points? You can build the best solution, but if it isn't what the customer wants, they probably won't use it. Look at all the amazing data platforms that barely anyone uses.
Focusing on the why - why are people looking at this data or trying to address this challenge - will keep you from jumping to solutioning instead of meeting the customer where they are and helping them solve the business challenge, not just a challenge about data. As a data person, the point of your job is not to do data work. It is to unblock teams/projects/people and add value. Yes, that happens through data work, but the work isn't the point.
When trying to prove return on investment for data work, Ale believes it's important to tie it to a key business metric. Something like: 'Our data quality improvement from 98% accuracy to 99% accuracy will mean a decrease in churn of 5%, netting the company X amount of revenue.' In her case, she attaches the data quality work to reducing churn because it is a key metric that is intrinsically linked to data quality in a company selling data.
For Ale, an important aspect of establishing yourself as an internal expert on data quality is tying the data work to value - what actually matters for the company. Improving data quality for something that drives 1% of revenue so it grows 25% is not as much of an increase as improving something that is 10% of revenue so it grows 3% (0.25% of total revenue versus 0.3%). But the business might favor one over the other. What matters most to the business, and why?
Ale shared a story about why it's so crucial to get people actually in a space exchanging context and information instead of just exchanging the 1s and 0s of data. She was working on a project involving two teams, and neither side had ever met the other despite being months into the project. The data quality was terrible; it seemed like half the data was wrong or missing. So the other side asked what was happening when they were pulling data from both servers - her team only knew about one of the servers. So get in the room - virtual or physical - and actually communicate!
Trust is the most crucial aspect of data quality work for Ale. If you have the best quality data but people don't trust it enough to use it, it's still not of value. And trust is often built more through relationships than anything else (see Beth Bauer, episode #218). What is the purpose of the work if not to drive an outcome? If there isn't trust, can you really contribute significantly to an outcome? The way to think about trust is a combination of "credibility, reliability, and intimacy by self-orientation." She uses interviews as a good way to measure trust over time. That also encourages users to actually say something if they see an issue, which increases trust after an issue. Listening to people is just a fundamental building block of driving trust.
When asked whether you lose credibility by telling data consumers to immediately flag any issues, Ale was clear that in data, mistakes happen. Nothing is ever perfect. So it's crucial that we can have conversations and tell people to speak up if they see quality issues. Recognizing that there will be data issues doesn't mean you are sloppy with data, only that you are realistic. Being aware of and reacting quickly to issues also means consumers can be more sure that an anomaly or change in the data is actually something real and react to it more quickly. Recognizing you will make mistakes and working quickly to rectify them builds trust.
For Ale, there is a tendency in data to deliver overkill and/or something the other party has difficulty trusting. Providing a dashboard to someone who wants to see the actual data is just not that helpful. Meet people where they are while you upskill them, even if that "where" is Excel or Google Sheets.
Ale talked about switching to the product mindset in data. If you are building 'amazing things' but no one is using them, are they really all that great? Is what makes something great the output or the impact? A good way to get to impact is focusing on acceptance criteria - what would make the user happy, and what are they actually expecting?
In wrapping up, Ale shared a bit about how to become a good product manager and her thoughts on how crucial prioritization is to actually applying data product thinking. How specific you should get with a solution is a very tough question, but you can start by asking what the expected impact is versus a more generic solution.
Learn more about Data Mesh Understanding: https://datameshunderstanding.com/about
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
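Ale's point about reacting quickly to changes in the shape of data can be made concrete with a simple automated check. This is my illustrative sketch, not something from the episode - the tolerance value and the row-count example are invented assumptions - but it shows the idea: flag a value that drifts sharply from its recent history so a human can quickly decide whether it's a quality issue or a genuine change.

```python
# Illustrative drift check: flag a value that deviates from the recent
# average by more than an agreed tolerance, so data consumers hear about
# possible quality issues quickly. The 20% tolerance is an invented example.

def flag_drift(history: list[float], latest: float, tolerance: float = 0.2) -> bool:
    """Return True if `latest` differs from the mean of `history`
    by more than `tolerance` (as a fraction of that mean)."""
    if not history:
        return False  # nothing to compare against yet
    baseline = sum(history) / len(history)
    if baseline == 0:
        return latest != 0
    return abs(latest - baseline) / abs(baseline) > tolerance

# Daily row counts for a dataset: a sudden drop gets flagged for review.
recent_counts = [10_000.0, 10_200.0, 9_900.0, 10_100.0]
print(flag_drift(recent_counts, 5_000.0))   # large drop -> True
print(flag_drift(recent_counts, 10_050.0))  # within tolerance -> False
```

The point isn't the specific statistic - it's that a fast, automated signal plus a quick human follow-up is what lets consumers trust that anomalies they see are real.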

Aug 13, 2023 • 13min
Weekly Episode Summaries and Programming Notes – Week of August 13, 2023
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Aug 11, 2023 • 28min
#247 Data Mesh Implementation Success Metrics - The Data Platform - Mesh Musings 52
Key takeaways:
- As mentioned last time, at the start it's more important to start measuring something than it is to measure the right things. Do NOT let analysis paralysis hold you back. Start measuring early to figure out what actually matters - and that will change over time too.
- Similarly, your success metric measurement framework will probably suck at the start. Oh well - get to measuring.
- Use fitness functions.
- Data mesh really is a journey, and so is how you measure success. You will need to find small and simple ways to measure. Don't get bogged down. Your measurements will be rough and kind of depressing given the number of challenges to tackle at the start. Just understand this is about how well you are doing, not how complete you are - there is always more to do!
- Good metrics to consider for platform success: satisfaction, time to deploy new products, time to update existing ones, ease of use (which blends into all the others), searchability/discoverability, ease of interconnection, mean time to detect and mean time to recover from issues, governance, guardrails, and automation/helpful artifacts. Look to measure friction and how you reduce it, and if possible - I have no great ideas here - the extra value added to the business that the platform was the genesis of. Maybe that's suggesting data products?
- Lastly, reflect back on how far you've come - we often forget to do that!

Please Rate and Review us on your podcast app of choice!
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
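The "use fitness functions" takeaway above can be made concrete: a fitness function is just an automated check that a quality attribute of the platform stays within an agreed range, run continuously rather than debated once. A minimal sketch (my illustration, not from the episode - the metric names and thresholds are invented assumptions):

```python
# Minimal fitness-function sketch: each function scores one platform
# success metric and reports whether it meets its agreed target.
# All metric names and threshold values here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class FitnessResult:
    metric: str
    value: float
    target: float
    passed: bool

def check_time_to_deploy(median_hours: float, target_hours: float = 48.0) -> FitnessResult:
    """Median time to deploy a new data product should stay under the target."""
    return FitnessResult("time_to_deploy_hours", median_hours, target_hours,
                         median_hours <= target_hours)

def check_mttr(mean_minutes: float, target_minutes: float = 60.0) -> FitnessResult:
    """Mean time to recover from data issues should stay under the target."""
    return FitnessResult("mttr_minutes", mean_minutes, target_minutes,
                         mean_minutes <= target_minutes)

results = [check_time_to_deploy(36.0), check_mttr(90.0)]
for r in results:
    status = "OK" if r.passed else "NEEDS ATTENTION"
    print(f"{r.metric}: {r.value} (target {r.target}) -> {status}")
```

The value of this shape is that a failing check is a conversation starter about friction, not a grade - which matches the episode's point that early measurements will be rough and should still be taken.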

Aug 7, 2023 • 1h 16min
#246 Making Federated Data Governance Approachable and Effective - Interview w/ Kinda El Maarry, PhD
Please Rate and Review us on your podcast app of choice!
Get involved with Data Mesh Understanding's free community roundtables and introductions: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Kinda's LinkedIn: https://www.linkedin.com/in/kindamaarry/
In this episode, Scott interviewed Kinda El Maarry, PhD, Director of Data Governance at Prima. To be clear, she was only representing her own views on the episode.
Some key takeaways/thoughts from Kinda's point of view:
- Always be assessing data literacy gaps and work to address them. Gaps don't have to be a bad thing - they are a chance for your people to grow! But ignore them at your own peril :)
- A key aspect of your job as someone on the central governance team - or anyone managing a governance function - is to reduce friction, not add overhead. It can be hard to do that at the start of a journey, but it should be a north-star focus.
- The phrase 'data governance' still carries a stigma for many people from what often happens with a central-only approach to governance: processes and rules divorced from the actual business processes and the data, which creates bottlenecks and headaches.
- The federation of governance in data mesh helps address the bottlenecks, but it introduces a new layer of complexity, which can also cause some fear.
- ?Controversial?: If you are asking people to take on ownership of data, you must make sure their line of business/part of the org has the actual capabilities to do so. Otherwise, they will greatly struggle to own it and are likely to push back.
- Don't talk about governance in the weeds - the toolbox talk, the policies and standards. Focus on what you are trying to achieve with governance, and lean back on the toolbox only if you need to explain the how. Many don't care about the how.
- There is a sliding scale of how much is federated versus centralized based on domain capability and maturity level. You don't just push all responsibilities onto each domain at the start; it's a gradual process of finding the right balance.
- When you find a domain with skill gaps to overcome, set up a roadmap and have as positive a conversation as possible to start along that roadmap - it's not about what they lack but where they'll grow.
- !Controversial!: Where possible, get rid of the central data governance committee. Collaboration is more key than trying to keep every stakeholder across the organization informed and involved in most decisions.
- ?Controversial?: Data governance people should stop thinking about federation as losing control. It's about enabling the decisions to be made by those who can best make them. That way, the governance team gets to focus on the guardrails, the standards, the blueprints - the things that make stuff easier for everyone.
- No one is saying to hand over control of very sensitive decisions without reason. It's about making sure the day-to-day decisions can be made, so that when there is an important/sensitive decision, the central team can step in to assist.
- When trying to get teams to take ownership of their data, there isn't a silver bullet - every discussion and domain is different. But start with a clear understanding of what ownership means, and open the discussion with what you will do to make them capable and ease the burden of ownership. Just dumping responsibilities in their lap won't go well.
- Data culture is an often overlooked aspect of data governance. But it provides your scale - getting people to evangelize and share what's working, instead of all knowledge dissemination coming from a central team/training, is how you get scale.
- It's not exactly the data governance team's role to find the right reporting structure and career management aspects of a data mesh implementation, but the team can help by creating good spaces for people to share and communicate, building a sense of camaraderie.
- Centralized capabilities and coordination of certain aspects are crucial to doing data mesh right. You want to decentralize the decisions and the application of those decisions but centralize the enablement. A core, central data team is pretty important to making that happen.
- ?Controversial?: Interoperability between data products is important, but it often can get in the way of delivering data products. Don't focus on it as much as you probably think you should.

Kinda started off discussing the traditional view of centralized data governance and the stigma around the phrase.
Because the legacy approach was a team that didn't really understand many of the key business aspects around the data itself, the policies and guardrails were not designed to make creating data assets or projects easy. The decoupling of understanding from decisions led to bottlenecks. Data mesh is specifically designed, in part, to eliminate those bottlenecks.
However, as Kinda pointed out, the federated governance model of data mesh brings a whole new layer of complexity to data governance. So talk of federated governance can create anxiety that there will be even more bottlenecks. But really, for many things, it makes governance far easier. A crucial point of data mesh is to unlock the ability to react quickly at the domain level by giving domains the ability to make (most of) their own decisions - and to easily lean on expertise when they need to.
Kinda talked about good communication around governance: focus on the why - what you are doing to drive value via governance - instead of the toolbox angle of policies, standards, etc. If people understand the why, they are more likely to lean in to the extra work data mesh puts on their plates. So start by listening and talking pain points - theirs and other people's within the organization. If you show they are heard and that you're addressing their pain points to drive value, governance isn't so scary.
One aspect of data mesh that people seem to get a bit wrong, in Kinda's view (and Scott STRONGLY agrees), is the federation versus centralization balance. You need to understand each domain's capabilities and overall maturity level. It's okay to start with most of the governance centralized if a domain isn't ready yet. It will be a bottleneck, but it's a known bottleneck taken as a conscious choice.
Sometimes, in Kinda's experience, domains will want to push out a data product before they and/or the organization is ready. That's okay - just make sure they understand what they own and that they will be on their own. At least, it's okay as long as it isn't breaking regulatory compliance or anything similar.
Kinda believes you should look to do away with the central data governance committee where you can. You want stakeholders engaged, of course, but you can move faster and with better decisioning by collaborating, not committee-ing (blame Scott for that phrasing, not Kinda 😅). For example, instead of creating a global data quality standard herself and pushing it out, she created a hackathon around data quality measurement, and the best solution - or really a combination of solutions - won.
Governance people should not look at federation as a 'loss of control', according to Kinda. Instead, it frees the governance people up to focus on where they can add the most value: creating those blueprints, standards, guardrails, etc. It's helping drive data literacy and data culture. And it's acting as an advisor when necessary - you don't want each domain to implement GDPR compliance on its own, for instance. At the end of the day, the people on the ground in the lines of business know the data better than someone with only a centralized view really could, so make it possible for them to make the right decisions and apply them easily. A big misconception here is that federation means handing over all control, but that's just not the case - when there are sensitive decisions or things of a bigger scale, e.g. regulatory-related decisions, that's where the central team should focus on partnering the most.
There are a few parts to Kinda's approach to driving buy-in for a domain to take data ownership. First, you have to understand that each domain and person is different. There isn't a single script that just works. These are people you're dealing with; be prepared to customize the conversations and solutions. Second, have an understanding of what ownership actually means so you can communicate expectations clearly. It's hard to know exactly what's needed at the very start of a journey, but get as close as you can. And third, when actually having the conversation, don't start from the expectations and new demands on them; start from what you're doing to make it possible for them to actually take ownership of the data. That includes tooling, training, and realigning their KPIs/OKRs so the data work is actually a key part of their role and measured results.
For any data governance effort, Kinda believes you should focus on all three of technology, process, and people. The people aspect is the most overlooked, but your people are what provide scale by enabling the application of whatever governance work you are doing.
In general, driving data culture is crucial for data governance in Kinda's view. Data literacy and culture are part of data governance, of course, but more broadly, to find your internal key enablers and to inject data into the business processes, you can't just create mandates or platforms. You need to make creating spaces for people to share ideas and celebrate each other part of your culture.
When asked Zhamak's provocative question of "Will we actually need a CDO in an organization doing data mesh?", Kinda and Scott agree the answer is yes. At least at the start, there needs to be someone running that central coordination, helping make all the decisions around what to decentralize and when, and - a very important one - doing the exec-to-exec buy-in and communication. Strategic direction and maintaining momentum and engagement are crucial, thus a data leader is necessary.
Kinda believes in both a top-down and a bottom-up approach to doing data governance in data mesh. The top-down part is the executive buy-in and helping to change people's prioritization; the bottom-up part is focusing on the use case and making sure teams have the necessary guardrails and processes as you continue to build those out.
In wrapping up, Kinda shared a somewhat controversial viewpoint on interoperability. While it is important, focusing on it too much will likely slow you down. She believes you shouldn't let interoperability get in the way of your work, and you can't try to solve for all interoperability at the start.
Quick tidbits:
- If you are doing data governance right, it will remove friction instead of adding additional overhead. At the start of a journey, the scales might not tip as far toward removing friction, but always look to build that out - it's a key success metric.
- For every project or initiative, always, always, always start by assessing your data literacy gaps. If you don't, you are setting people up for failure.
- You can show success in your data governance program by measuring things like time to actionable insights. Metrics in general will play a big role in proving out success, so try to measure what is changing - and expect to be bad at measuring at first.
Learn more about Data Mesh Understanding: https://datameshunderstanding.com/about
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Aug 6, 2023 • 20min
Weekly Episode Summaries and Programming Notes – Week of August 6, 2023
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Aug 4, 2023 • 1h 18min
#245 Panel: Lessons From Doing Data Mesh Again: The Second Timers Club - Led by Samia Rahman w/ Khanh Chau and Sheetal Pratik
Please Rate and Review us on your podcast app of choice!
Get involved with Data Mesh Understanding's free community roundtables and introductions: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Samia's LinkedIn: https://www.linkedin.com/in/samia-r-b7b65216/
Khanh's LinkedIn: https://www.linkedin.com/in/khanhnchau/
Sheetal's Links:
Sheetal's LinkedIn: https://www.linkedin.com/in/sheetalpratik/
First Paper at ICEB Conference: https://easychair.org/publications/preprint/qZ3m
Interview in Harvard Business Review: https://hbr.org/resources/pdfs/comm/BeyondTechnology_CreatingBusinessValuewithDataMesh.pdf
Medium Blog on Saxo Bank Implementation: https://blog.datahubproject.io/enabling-data-discovery-in-a-data-mesh-the-saxo-journey-451b06969c8f
In this episode, guest host Samia Rahman, Director of Enterprise Data Strategy, Architecture, and Governance at life sciences company Seagen (guest of episode #67), facilitated a discussion with Sheetal Pratik, Director of Engineering leading the India Data Integration Platform at Adidas (guest of episode #24), and Khanh Chau, Director of Cloud Data Architecture at Grainger (guest of episode #44).
As per usual, all guests were only reflecting their own views.
The topic for this panel was a bit different: reflections and learnings from doing a second (or more) data mesh implementation from a practitioner standpoint - people with the experience to look back on multiple implementations, having seen multiple organizations implement data mesh up close, and give advice to their past selves and to those earlier in their journeys. Khanh Chau was the lead data architect for Northern Trust's data mesh implementation before taking up the same role at Grainger for their implementation. Sheetal was the head of data integration at Saxo Bank as part of their data mesh implementation before moving to Adidas, where she leads the India Data Integration Platforms and has also been the integration lead for moving an on-prem centralized integration platform to a federated cloud integration platform. And Samia worked deeply on two implementations at Thoughtworks - including one closely with Zhamak - before starting at Seagen.
Scott note: I wanted to share my takeaways rather than trying to reflect the nuance of the panelists' views individually.
Scott's Top Takeaways:
Prepare to take many concepts from abstract to concrete and explain them to many people. Repeatedly. And your definitions will change over time too. A big part of leading an implementation is communication: keeping people on the same page and informed.
Similarly, prepare for confusion. People will go to different sources for information - including a lot of vendor content - to learn about data mesh. So keeping people aligned on and understanding key aspects of data mesh is crucial. "What is data mesh?" can be a dangerous and difficult question when it shouldn't be.
It's important to learn from other organizations' implementations, but your journey - if you are going for success and not simply copying - will be unique in what matters most when.
It's easy to get overwhelmed trying to manage every aspect perfectly. So focus on learning and iterating along the way instead of getting it perfect upfront. Stepping back and looking at the 7 journeys across the panelists, they really don't look that similar.
When choosing which domain(s) to partner with, highest value or ROI can look like the most important metric, but really, finding someone who is willing to take a risk and will also champion your solution when it succeeds matters most. You do need to find use cases with reasonable ROI, but having a true partner is the most important part.
Data mesh isn't a solution to the business people - and it probably shouldn't be to data people either. It's a way to address an ever-growing challenge within the organization. The business people feel the pain, so talk to the pain. They don't care nearly as much about the how as you assume. Make them a tasty sausage; don't give them a sausage factory tour.
Don't underestimate the fear of change and loss of control. Data mesh is a new approach and many are sick of new data approaches 😅 but people's fear of getting left behind or not being in control is a natural human reaction. Make sure people understand how this makes things better not just for the organization but also for them.
Defining data mesh success metrics - at the micro level of data products and at the macro level of the entire implementation - is still hard. There are a lot of helpful measures you can use - and do look into fitness functions - but it will take some effort to find good success metrics. And those metrics will change over time.
As some past guests have also noted, a good way to drive buy-in from those actually doing the data work is to make it faster for them to produce data for themselves, not just for others.
If you can make getting access to data and creating something useful for the data-producing domain quicker, many will lean in to leveraging the platform quickly.
Other Important Takeaways (many touch on similar points from different angles):
Faster time to business outcomes is likely attractive to business partners for getting buy-in AND is a success metric you can actually somewhat reliably measure.
For success metrics for your implementation itself, look at time to deliver new products. Slow delivery might be because a domain isn't ready or leaning in, but repeated slow delivery is probably a problem with the platform in some way.
It's tempting to start from the infrastructure aspects first, but you have to focus on the value of doing something like data mesh. What is the value proposition for the organization? What is it for the specific person you are talking to?
The needs of your organization will look different from other organizations'. You probably can't know exactly what will end up being your big focus ahead of time. A lot of the learnings and realizations can only happen when you're in the midst of your journey.
Because every part of an organization matures differently, each domain has generally developed a way of working that fits their domain but makes it very hard to then standardize the ways of working or sharing data. You need to build central frameworks for collaborating while still allowing domains to work (mostly) in their existing ways. Otherwise, you are probably fighting a losing battle.
It's easy to lose sight of how important the business partners are to your data mesh success. They can provide a ton of leverage in driving buy-in, budget, etc. Or they can provide a ton of headwinds/resistance. Find your good partners and make sure you're aligned on making your overall data mesh approach a success.
Similarly, at its core, data mesh is about business transformation.
Yes, transformation driven by changing how we interact with data and do data work, but business transformation nonetheless.
To get funding/sponsorship for something beyond the first few data products, you have to show the value proposition, especially in a soft economic environment. You aren't going to do that by talking technology. It's important to show a good return soon, for sure, but also a broader vision.
Technological solutions don't equate to business value for most business people. Talk to them about what they care about. With every success, you can build momentum. But that momentum only builds if you're selling the successes - telling people and proving out the value. Data leaders need to be as much about showing off the results as achieving them, unfortunately.
For mesh buy-in, it can be effective to point out how decentralized efforts will probably happen anyway - e.g. shadow IT - but they become their own little IT fiefdoms if not at least coordinated in some way through a central mechanism. It doesn't need to be central ownership, but if you don't have a central framework and ruleset, you end up with silos, more money spent on many platforms, and worse quality data.
As Khanh said regarding governance, "So how do we bring this whole ecosystem of technologies [together] in such a way that is safe for people to do things faster and meet all the regulatory compliance?" Two key words in there are safe and faster: don't make governance about gates, make it about getting to data production more quickly but still safely.
While data mesh may not itself be cheaper than doing nothing, it often results in cost savings relatively quickly by removing duplication of work.
If everyone is at least broadcasting what they are working on, there is far less of the overlap that causes confusion and two teams doing essentially the same thing.
Trying to build the entire platform ahead of time is a really bad idea, but you also can't come to the business with an inferior, clunky, hard-to-use, immature solution. Make sure your organization has some platform capabilities so that building out early data products isn't a huge hassle.
Don't try to onboard every domain at the start of your journey. Start to seed the conversation with them and talk to the pain points, but you can't handle 30 domains coming onboard at once.
When driving buy-in, don't only focus on executives. You need people throughout the organization leaning in. If you need new skills to accomplish data mesh, you need people who will acquire those skills and level up their data fluency. Talk to the people doing the actual day-to-day work.
You potentially don't need a huge budget to start a data mesh journey. You can start lean - yes, you will have leaner results, but if you can prove faster time to business outcomes, you can generate more and more momentum.
Build momentum around your mindset shift. You aren't going to convince people overnight.
It's important for domains to take over ownership of data, but that also doesn't happen overnight. Your initial ownership will probably not be as strong as you'd like, and it probably won't involve the operational plane system experts/developers as much as you'd like. But that will evolve and get better. You have to get started with okay, not perfect. Crawl, walk, run.
You probably need to apply DevOps principles to your data products, where the data product developers also support the data products and handle the operations aspect of them. But it might be hard to get people really bought in on that when your platform is still pretty nascent.
You want to build feedback mechanisms around data products to drive domains to make them better.
There are many concepts around data product feedback being touted by vendors, but few people are happy with what they've built or are getting from vendors. The best feedback mechanism is probably a culture shift towards providing more/better feedback.
As Samia said, "It's not really on us as governors to govern and … force it on you but to empower you so that you are doing the right things for yourself and for your organization."
There's a ton of interest in how generative AI can play into data mesh, but it's still far too early for it to have a big impact.
Learn more about Data Mesh Understanding: https://datameshunderstanding.com/about
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Aug 1, 2023 • 3min
Weekly Data Mesh Roundtables - Join Us!
You can simply use the link here https://us06web.zoom.us/j/87380091930?pwd=NVhUS3ZDOS9hSFFBbU1sQm9HQUJHZz09 or find a link on LinkedIn events: https://www.linkedin.com/events/weeklydatameshopenroundtable-da7091373106551734272/comments/Link to past recordings playlist: https://www.youtube.com/playlist?list=PL9tZROTS_Pi7TjjhLR-sCtwQVeth6Jv7e

Jul 31, 2023 • 1h 5min
#244 Leading a Data Transformation the Empathetic (and Right) Way - Going Far Together - Interview w/ Benny Benford
Please Rate and Review us on your podcast app of choice!
Get involved with Data Mesh Understanding's free community roundtables and introductions: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Benny's LinkedIn: https://www.linkedin.com/in/bennybenford/
Benny's Substack: https://www.datent.com/p/elevating-data-to-a-profession-why
In this episode, Scott interviewed Benny Benford, former CDO at Jaguar Land Rover (JLR), who is currently building out a community around data transformation. To be clear, he was only representing his own views on the episode.
Some key takeaways/thoughts from Benny's point of view:
!Controversial!: As a data leader, if you want to be effective, you will have to focus much more on what role data work plays in the organization than on the data work itself. It's about partnering and communication, not SQL and LLMs. That can be a hard lesson, but it's crucial to being successful.
?Controversial?: It's easy to try - and common to want - to separate data culture and organizational culture, but it doesn't really work. "Data culture is … an aspect of your organizational culture that you're trying to build and create."
A guidepost for your transformation journey: "If you want to go fast, go alone; if you want to go far, go together."
The data team can be leaders in transforming the organization, but they must do it in collaboration, not just going out as pioneers 'to set an example' or something.
Absolutely partner with your transformation org if you have one.
When you align your data culture transformation to the overall organization's transformation messaging, you aren't fighting for attention to a different message. You are using the momentum of the overall org's transformation strategy. Don't fight the tide if you don't have to. Try to get that data transformation as part of the org's strategy and messaging.
Given the emerging attention to data in many orgs, it can be tempting to go off and try to set culture. But that's almost certainly not what the company is asking a data leader to do.
?Controversial?: "Agile and data culture are two sides of the same coin." They are very close, just approaching things from different angles. Scott note: I mostly agree. The big challenge is when people are overly rigid with their approaches to Agile.
If you try to force people to share data, it often stokes fears of 'can this data be used against me?' And that is a rational fear in an overly competitive org. You need assurance from senior management/the board of directors that trying to undermine other lines of business/domains will not be tolerated. If you want an open data sharing culture, you need to create clear consequences for internal political misuse.
Specifically at JLR, when they opened up a lot more access in their Tableau environment, it felt risky. But because of their well-defined and broad-reaching initiative, it was nothing but positives, leading to far more/better collaboration.
Being frightened of change is totally normal. Don't try to solve every aspect ahead of time. Oftentimes you have to move with uncertainty and have a good framework for adjusting as you learn more. 'Bravery isn't a lack of fear, it's moving ahead despite your fear.' Scott note: this should be at the top of every data mesh strategy document…
Very few data leaders have strong transformational backgrounds, and it can be hard to drive transformation without that.
Look to educate yourself on how to drive effective transformation.
Champion's forums for your data efforts are very effective. They can inspire others, but often they can also draw out existing challenges you might not have known about. Leverage those who will share their experiences to make things less scary and to improve processes for those who follow in their footsteps.
All organizational choices essentially have puts and takes. Don't try to hide the weaknesses or challenges of whatever route you are taking. Be honest.
When decentralizing data work and ownership, many people on the centralized team may worry about their role and their future with the organization. Provide clarity and information on happy paths for them inside the organization, including their career development.
Try to explain how the transition is happening "with you", not "to you". Easier said than done.
When thinking about data transformation, it can be helpful to think of it as splitting into projects and program. You need someone minding each aspect. One person trying to own both - the day-to-day execution and the long-term change in how your data org works - can be pretty difficult. It's a LOT.
?Controversial?: Transformation happens when a few people's main focus is leading that transformation. If it is a bit part of many people's roles, it's unlikely to amount to a cohesive change effort.
You don't need a huge training budget or permission to train everyone in the organization. As a data team, you can start a small training program and, as that sees success, build more and more. There is a lot of great free content out there - make sure to organize it for people - but you can get going now.
Get people used to the idea of imperfection in data - and then give them permission to do imperfect things. Don't be cavalier and sloppy, but it's okay for things to provide value without being 'perfect'.
EVERY organization is far more immature with data than they show publicly. Everyone feels like they are behind.
Meet your peers and embrace sharing reality instead of the Instagram view of your data transformation journey. It will lead to better information sharing and probably also better sleep at night.
It's crucial for data leaders to find allies across the organization. You can't transform a 5K+ person organization by yourself. Other teams can provide you leverage and support. Partner with those willing to partner.
Benny started off talking about data culture and organizational culture. He doesn't believe you can separate the two, but it is a common desire among data leaders to do so. Data culture is something you build and create as an aspect of your organizational culture. Otherwise, you will always be fighting against the tide of your organizational culture. It all needs to be integrated change/transformation; otherwise the organizational changes will be in conflict with your data culture changes. And at the end of the day, people are employees of the organization, not only the data team.
"If you want to go fast, go alone; if you want to go far, go together." For Benny, that emphasizes the long-term transformation efforts you need to drive toward a better data culture. You can go somewhere quick, but if everyone is going somewhere quick, there is no cohesion, no real concerted change to the organization and thus the culture. If you have a general transformation team, align with them. Align on the overall corporate goals and leverage them to drive your data culture transformation. Yes, slightly harder than just creating your own but far more effective and long-lasting :) Yes, you want to get your data transformation initiatives into the org strategy, but they don't have to be the main focus.
And it might take a bit of time to make it a core pillar of the strategy, but that's how you do effective change management: building to better over time.
Benny made the excellent point that while it can be tempting to leverage the attention data is getting in many orgs to try to drive major organizational culture change, it's almost certainly not what the org is asking a data leader to do. Just because there is some 'juice' there, it's not where a data leader should focus. Ask if your job is really leading the organizational change or if it is about delivering on data-related objectives.
While pressing for an open data sharing culture is great in spirit, Benny has seen it stoke a lot of fears - what happens if someone looks deeply into my line of business/domain to try to find mismanagement or incorrect decisions? And that could happen in a very competitive organization. That is why the cultural guidance needs to come from senior management and the board of directors around what would be considered unethical internal use. Assuring people there will be consequences for trying to undermine, instead of help, other lines of business can reduce a lot of fears. When JLR opened up their data sharing, it felt like a risky proposition, but because the guidelines were clear and it was about opening up much more information access, it went very smoothly and significantly improved collaboration.
Benny pushed back on the idea that we can solve every eventuality, every challenge in a data transformation ahead of time. Of course, we all know that's not possible. People will be afraid of change; that's normal. But a good transformation strategy incorporates that fear and addresses it by having good ways to uncover and address emerging challenges. Assure people that while you don't have every answer, you have their back.
It's also very common for data leaders to lack experience in transformation initiatives, according to Benny.
While data leaders are really good at data, that lack of transformation experience can be hard. Lean into 'this isn't about going it alone, we're in this together' :)
Benny is a big proponent - a champion, you might say - of champion's forums. Having a place where champions of your data initiatives can exchange information with each other and with your data leadership team is crucial, because you can find your existing challenge points much more easily. If you take in their feedback, they feel seen and heard and lean in even more. You also have great points of leverage to inspire others based on the success of your champions.
When moving from a centralized to a decentralized structure for the data team, Benny saw a lot of fear from those on the centralized team. Would they even have a role in the changed organization? What was their career path? You don't want to lose your data talent - it's so expensive and hard to replace - so be clear about the path forward for them. There is always uncertainty, but try to show them they are and will continue to be valued.
Another experience from JLR that worked well for Benny and team: when he was hired as a data leader, he realized he probably couldn't lead all of the transformation by focusing on both the day-to-day execution of data projects and the overall change program. So he brought in a second leader to lead the projects while he focused on the program aspect. Transformation doesn't happen in a vacuum; people need to be focused on it specifically. That's the program aspect. Just executing well won't transform your team and culture.
Similarly, Benny shared how they built a transformation flywheel around data. At first, it was training a few people as part of the central data team's time, training them to act on their own in their lines of business. They didn't even have budget for this training specifically in year one, two, or even three as it continued to grow to 100s of people.
They started seeding the organization with data-capable champions who pushed others to take the data training. After a few years, the training was so much in demand that it was too big for the data team to own, so they brought in external trainers. But you can start small and have a big impact.
When getting started with - or really at any point along the way of - a data transformation initiative, Benny recommends getting people used to and comfortable with the idea of imperfection in data. You can capture great value even from something that isn't perfect. And it will never be perfect. That's okay. Nothing in this world that is complicated ever gets to perfect!
Deep partnering with other parts of the organization might seem obvious, but in Benny's experience, it isn't all that commonly done by data leaders. The data team should have partnerships with HR, Finance, Sales, Marketing, etc. And if you have a transformation office, that should be your number one best friend internally. You need partners to move the organization forward. And that also means data leaders need skills that aren't just data skills.
Benny wrapped on that point - what is a data leader's role in a large organization? Their role is rarely to focus on the data work itself anymore, and that can be a bit of a gut punch for those who love data. But it's about building the bridges to the rest of the organization and helping them do their work better. That means you're still doing the data work, just with the lens of bringing it into your business partners' context. Still, it can be frustrating and hard to give that data work up. That's not unusual or unexpected; it's just part of what it means to take on that data leadership role.
Quick tidbit:
Data culture and Agile culture/Agile transformation end up being pretty similar in many respects.
Look to how organizations are successfully implementing Agile transformations to inform your data transformation strategies.
Learn more about Data Mesh Understanding: https://datameshunderstanding.com/about
Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf

Jul 30, 2023 • 34min
Weekly Episode Summaries and Programming Notes – Week of July 30, 2023
Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf