Data Mesh Radio cover image

Data Mesh Radio

Latest episodes

Jul 3, 2023 • 1h 11min

#236 Driving Buy-in For Decomposing the Monolith; and Then Actually Doing It - Interview w/ Brenda Contreras

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/

Please Rate and Review us on your podcast app of choice!

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here. Episode list and links to all available episode transcripts here.

Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.

Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.

Brenda's LinkedIn: https://www.linkedin.com/in/brenda-contreras-9649a47/

In this episode, Scott interviewed Brenda Contreras, VP of Engineering and Architecture at Self Financial.

Some key takeaways/thoughts from Brenda's point of view:

- "Iterate small and sell your solutions on a practical level."
- It's kind of funny how often people in tech try to skip the communication. If you really align on communication and understanding, your business partners are far more likely to empower you to drive business value for them through engineering and data work.
- ?Controversial?: As an engineering/data leader, don't dictate: set the vision and explain it to business partners, but let your technical team leverage patterns that work for them instead of only your favorite approach. Similarly, make sure your team understands which aspects of target outcomes drive value and why. They might take an approach you didn't expect, but if they aren't focused on the key aspects of the outcome, even amazing feats of engineering won't create value.
- Fail fast is very important to doing microservices right. How can we learn to adopt it in data and AI? "We need to be … able to experiment more, we need to be more flexible" to drive to business value quicker and easier.
- Before you start to decompose anything, it's crucial to understand what you already have. That can sound obvious, but if you start the work before understanding the 'before' picture, getting to a good 'after' picture is going to be very hard.
- You have to understand the landscape of your systems, but it's equally important to understand how those systems power the actual business. Not just services or products - the business. What matters and why? Talk with and carefully listen to your business partners to find out.
- It's crucial to align with your business counterparts on big picture visions, then on which aspect of the big picture you will address, and get clear on why and when. This can feel obvious but it's often overlooked. Over-communicate; share your vision often, far and wide.
- Your business line leaders don't care about the how of tech - most of the time at least. They care how you can support them in tackling their goals and challenges. Focus your communication about changes on that. They don't care how the sausage is made.
- It's absolutely valid, even in a data mesh or microservices journey, to have a "sustain bucket". Things don't need to change for the sake of change. 'Legacy' doesn't have to mean bad or a headache.
- When you start to talk internally about breaking down the monolith, sell people on the reality: managing smaller systems is easier. Having one group that clearly owns a defined set of systems makes it much easier to find the information you need. Microservices isn't easy, but it does make things much easier to manage at the micro level.
- Save deep tech or data talk for the people doing the tech or data work - when you talk to business partners, abstract it for them. They don't care about the particulars of your system for sending emails - they care that the emails are reliably sent and tied to the business outcome.
- When you first start moving towards microservices, set up your domains around your products, then start breaking your systems down into microservices from there. Think about data mesh the same way - don't start building data products without knowing the exact owner and why they own it.
- Don't try to toss out the old in favor of the new. Look to net new use cases on your new platform or approach. Just because something has been around for a while doesn't mean it isn't valuable as is. Look for where changes will make a big positive impact.
- Be very ambitious in your vision for where you could go, but take it at a reasonable pace - don't be in so much of a rush that you create more challenges than you address.
- You can sell the value of experimentation at the experiment level. It's hard to get an overarching experimentation budget, but constantly showing the value of fast experimentation will make it easier to get the time to do more experimentation in the future.

Brenda started with a little about her history, mostly on the app development and software engineering side before moving to Self, where she's added data/analytics to her architectural focus. She learned a lot working in the office of the CTO at Charles Schwab about how to decompose a monolith without becoming overly rigid around the philosophy of microservices - how do you still have shared services? How do you split into product domains? How do you make sure everything plays nicely together internally and for the customer?

That learning about how to share data well on the operational plane is serving Brenda well at Self. When she first arrived, creating an inventory of what already existed and how it worked together was crucial to start decomposing appropriately. What were the systems, yes, but especially how they powered the actual business. What really matters and why? What are the key strategic initiatives, and how can you positively impact them by improving systems? A 99% improvement in latency on something that drives no change to how people use systems is impressive but wasted work. It's okay for things to live in a "sustain bucket" as you focus on key business drivers. And you can only really find those key drivers by listening to your business partners.

When Brenda started to consider where to begin decomposing the operational systems at Self, she first looked to where they might "run out of runway". Which systems were likely to cause trouble driven by growth? Because you don't want to have to tell your business partners they have to stop growing 😅

A key aspect of decomposing a monolith is getting buy-in. Brenda had a lot of success showing people the benefits of smaller systems that are easier to manage. Deploys are easier. Maintaining a smaller database is easier. It's easier to figure out who owns what - if someone needs information, they know who to go to. And all of that extracted information, and your plan for change, should be communicated in the language of the business. They don't care if you are switching your backend to Cassandra; they care about what impact changes will have on the business and what changes for them.

Brenda recommends that when you start decomposing a monolith into microservices - on the application side - you should break into domains first. Without clear owners for sets of microservices, you start to split things off without a clear forward path. Once you have the domains identified, you can start to move the necessary capabilities into your microservices.

Differentiating baby and bathwater - what should get tossed out and what should be kept - is crucial in Brenda's mind when you are doing any kind of transformation.
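Brenda's triage - decompose what's about to run out of runway, keep the rest in the sustain bucket - can be pictured as a simple scoring pass over your system inventory. The following is a hypothetical Python sketch; the system names, scores, and threshold are invented for illustration and are not from the episode:

```python
# Hypothetical inventory entries: (system, growth_risk, business_impact) on 0-10 scales.
# Names and scores are illustrative only.
INVENTORY = [
    ("billing_monolith", 9, 9),  # likely to "run out of runway" as volume grows
    ("email_sender", 2, 6),      # works fine as is
    ("report_archive", 1, 2),    # legacy, low impact, leave alone
]

def triage(inventory, threshold=7):
    """Split systems into a 'decompose' list (high growth risk AND high
    business impact) and a 'sustain bucket' (fine as is - no change for
    change's sake)."""
    decompose, sustain = [], []
    for name, growth_risk, business_impact in inventory:
        if growth_risk >= threshold and business_impact >= threshold:
            decompose.append(name)
        else:
            sustain.append(name)
    return decompose, sustain

decompose, sustain = triage(INVENTORY)
print(decompose)  # ['billing_monolith']
print(sustain)    # ['email_sender', 'report_archive']
```

The point of the sketch is only the ordering of concerns: business impact and growth risk decide what gets decomposed, not how interesting the engineering work would be.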
There are existing solutions that you don't need to migrate immediately because they're fine as is. Again, don't make changes for the sake of changing something. Look to net new use cases as you start to decompose, and eventually those legacy use cases will get reimagined and replaced.

It's very easy to focus on the wrong things in Brenda's experience - instead, focus on the conversations with business partners and work backwards to what really matters. Then the work follows from that. E.g. in an acquisition, integrating systems and/or migrating the acquired company to your systems often becomes a major focus, instead of integrating only what will drive value and keeping the rest in that "sustain bucket".

Brenda's secret sauce for aligning with business partners comes back again to communication: understanding the big picture of the organization, how their part of the organization fits in, and what their priorities are. Essentially, it's taking the vision for where you want to go and breaking it into manageable projects to move forward on. Always be communicating priorities and timelines, and work with people to realign at both the small and large scale vision level. And, a biggie: _really_ listen to your business partners and C-level execs when they talk challenges. "We need to be … able to experiment more, we need to be more flexible," Brenda said relative to figuring out how to drive more business value around data. The fail fast approach in software is crucial to getting to business value quickly - how can we adopt that in data? Partly, it's again going back to communicating well with your business leaders through good storytelling. She said, "iterate small and sell … your solutions on a practical level."

In wrapping up, Brenda again emphasized the importance of good communication and collaboration with your business partners. It's so easy in data, and tech in general, to let that drop to focus elsewhere. But when you are all at least reading from the same book, even if there is some disagreement on strategy, people can at least make informed decisions. Don't try to skip the communication with your business partners!

Learn more about Data Mesh Understanding: https://datameshunderstanding.com/about

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
Jul 2, 2023 • 13min

Weekly Episode Summaries and Programming Notes – Week of July 2, 2023

Jun 30, 2023 • 15min

#235 Decom-mesh-ioning - Appropriately Decommissioning Existing Platforms in Your Data Mesh Journey - Mesh Musings 50

The summation points of this episode:

- Your legacy data platforms probably aren't going anywhere anytime soon. They can have long lives, but look to shut down unused capabilities.
- Your new projects, where applicable, should be data mesh data products. A large percentage won't be at the start, though. Figure out incentives for people to push their new data products to your mesh platform early if possible.
- You should not look to lift and shift data projects/assets/products unless it is truly easy for all users - producers and consumers. Lift and shift sounds great, but it doesn't work well in all but the rarest of cases.
- Be prepared for people to get concerned about having to migrate early - communicate strongly that you aren't forcing migrations of existing data work.
Jun 26, 2023 • 1h 3min

#234 Doing Data Work That Matters: Perspective From a Line of Business Head - Interview w/ Iryna Arzner

Iryna's LinkedIn: https://www.linkedin.com/in/irinakukleva/

Mobey Forum: https://mobeyforum.org/

In this episode, Scott interviewed Iryna Arzner, Head of Group Customer Growth, Retail Banking at Raiffeisen Bank International (RBI). To be clear, she was only representing her own views on the episode.

Scott note: I mostly use the phrase line of business or LOB instead of domain in this write-up, but they are mostly interchangeable.

Some key takeaways/thoughts from Iryna's point of view:

- As a line of business head, data has value, but only insofar as the LOB can use it. If it's not aligned to a use case or business need, data work can be more of a distraction than a benefit.
- It can be very interesting for a line of business owner to know how much their data is worth to other parts of the organization - that could drive funding for additional data work inside their LOB, or even more funding than that because the LOB is core to driving business value at the organizational level.
- "You cannot be successful in your data strategy if there are no business leaders that understand the value of the data and are very much determined to uncover this value." Scott note: couldn't put it better.
- A good way to get your business leaders more data fluent is to pair very closely with them. Sitting side-by-side on a project will up their fluency far better than any training course ever could.
- "How do we get these data insights that we actually need to fuel the business strategy?" It's crucial to understand the LOB business strategy and focus data work around that. Start from the business needs and work to the data, not the other way around. Scott note: PREACH!
- Many senior business execs have operated on gut and limited data for literally decades. You need to partner with them to show why data makes their decisioning better/quicker, not that the data will now make the calls. Data is a tool for them to be better, not their AI robot replacement :)
- You can't force a senior exec into leaning in on data. Start from their actual business goals and work backwards to what could be better, or what they can't do unless they have better data.
- It's easy to say your data work should have business value; actually get specific about how your data work supports the business strategy - both corporate and line of business. Scott note: if it doesn't have business value in some respect, why do it?!
- Far too often, especially in advanced areas like AI, data people/teams want to do really cutting edge projects. But if those aren't attached to business priorities - or even if they are, but look at the business challenge in a way that doesn't align with business leaders - they will likely be wasted effort.
- ?Controversial?: Please, please, please stop building platforms for the sake of platforms. LOB leaders don't care about what cool tech you use; they care about capabilities that drive business value.
- ?Controversial?: If your LOB execs aren't willing to share the strategies and use cases necessary to plan data work, you might have to resort to bringing them things you believe are aligned and hope they bite. Scott note: if you can, skip these execs at first - they'll come around if others are driving more value through data.
- Related but very different: talk to LOB leaders about whether they are interested in potential insights that aren't directly tied to current business strategy/challenges. Some (e.g. Iryna) like to be opportunistic and have the cognitive load for potentially interesting if tangential insights. The key is to ask.
- !Controversial!: The "time is past" where business leaders can get away with not understanding tech or data. Business leaders need to understand the data capabilities necessary to drive their use cases. They can't just wait for insights to be dropped in their laps.
- Data people need to help business leaders understand the value of a platform approach. Just because something can be addressed 'tomorrow' doesn't mean that's the right call. What is sustainable and scalable? It's crucial to communicate that.
- ?Controversial?: It's okay to focus your data work on the pioneers, the data-advanced domains. Yes, they might get even further ahead of the pack, but they will be easier to work with. Scott note: this is actually a pretty big debate in data mesh, as it can lead to building a platform only for advanced teams.
- Really closely partner between the business and data teams on use cases. If you do a requirements dump and the data team works divorced from the business context, it will rarely drive good value.
- Part of delivering a good product or service is giving your customer something relevant to them. In data, that seems to be missing from lots of work - what is actually relevant to someone's needs and wants?
- There is a tendency among many data people to defer to the data first. Instead, think about leveraging your subject matter experts more to help define what data you need or what reasonable hypotheses to test :)

Iryna started out with a bit about her background and then jumped right in. She said "it takes two to tango" relative to the business side and the data side collaborating. She understands the power and value good data work can drive for her line of business, so she and her team are often pairing with the data team to figure out how to drive value.

For Iryna, a data strategy cannot be successful unless at least some business leaders are leaning in - leaders who understand the value of data and are determined to actually capture it. Specifically in her area, Iryna gave an example: while in the past banking was often done via a relationship with a banker - so insights about customers could essentially be managed in the banker's head - as people move to mostly digital, there needs to be far more data insight to offer them the best services for their needs at the right time.

Starting from the business strategy and the use cases, and working backwards to what data is necessary to support those use cases, is crucial in Iryna's view. It's proving to be the case over and over in data mesh that starting from the use case is key to getting value out of data work, because you need to know what would actually drive business decisions and actions before driving into the fine details. But it's easy to get lost in the data work AND just as easy to get lost in the business strategy - you need to actually get specific about the data products necessary to support use cases.

Iryna discussed that most senior execs have been operating much more on gut than on data for years to decades - you aren't going to change that overnight, and you need to partner with them to make their lives easier via data if you want them to lean in. The best way to do that is to partner on their business challenges, looking for opportunities or challenges they can't address right now without better data. And don't start with the biggest challenges first; think small and quick to prove out value as you build to bigger and better.

A common if understandable failure point Iryna mentioned was data people's desire to do really challenging work - fairly often irrespective of whether it really matters from a business priority standpoint. Even if some AI work is aligned to a key business objective or challenge, if it isn't aligned to the way the business leader thinks about the business, it often won't be leveraged, because they would have to change their mental model of their business just to "get" it. So again, start from the conversation with the business leaders and work backwards to significantly increase the chances of your data work driving large-scale business value.

Iryna mentioned she's "allergic" to platforms built for the sake of building a platform - 'it's what everyone is doing, so we should too!' Or wanting to use fancy tech instead of focusing on capabilities. Those are red flags that the data team isn't aligned to business value or learning from past mistakes - past projects didn't fail because the platform wasn't cool enough; they failed because the platform didn't make it easy to do what was needed!

Many execs, in Iryna's view, won't want to engage much on data that isn't tied to their strategic initiatives/challenges. But there are still many like her who are happy to be opportunistic. Actually have the conversations and ask the leaders where they come down on that. Some want people laser focused on what they are trying to do, but you can build interesting and highly valuable partnerships with the flexible and opportunistic ones like Iryna.

"This is the past," is something Iryna said about business leaders having no clue about technology or data. Leaders need to understand at least the basics and upskill to better leverage the new data capabilities from tools being developed all the time. The data team will be needed to identify - "in a nice way" - business leader skill gaps and help close them. And it's important for business leaders to understand the benefit of platform and product thinking in data. Just because a challenge can be 'solved' tomorrow doesn't mean it will stay solved! Sustainability and scalability of solutions in data are crucial.

Iryna believes it's good to at least start your data initiatives working with the "front-runners and pioneers". That saves time and hassle - they are already bought in and capable. And seeing those front-runners drive lots of value with data will lead others to want to follow. Scott note: there is a lot of controversy around this topic - will you only build for teams that are already advanced? How far do you go with this strategy?

As to where lots of data projects go wrong, as past guests have also noted, Iryna pointed to the requirements dump pattern: the business stakeholders and data team meet, information is exchanged, the data team goes off and works on the use case, and it comes back a month or two later as something that isn't really what the business side wanted and doesn't deliver the value. Instead, as Ghada Richani also pointed out, the business and data sides need to pair closely throughout the process as the use case and needs evolve and as incremental value is created. The data team can actually get a big boost in happiness from seeing actual business value created from their work. Make the whole process mutually tied and mutually beneficial.

When thinking about products and good customer interactions, Iryna says it's important to bring your customer something that is actually relevant to them at that point in time. We should look to do the same in data. And that should honestly be easier in data, because your customer can - and should - literally tell you what they care about and want. You can also prototype together instead of having to deliver a final product - you need to communicate that, but it's more of a partnership than just a customer relationship. Start to work together on the vision of what success looks like.

Iryna gave a great recent example of when the data isn't necessarily going to give you the best information. They are looking at a specific financial product and how to sell it better based on certain life event triggers. So instead of making guesses and random hypotheses, why not ask the people on the ground what they look for, what their hypotheses are about why sales happen when they do, and what the best triggers are to use to sell the financial product? This is how data and business processes can build off each other - you don't have to treat your data as if it alone should tell you everything - extract context from subject matter experts!

While she might not be the most typical line of business head in a data mesh implementation - she's more altruistic than what we often hear - Iryna and her team are simply excited to understand what value their data has for the rest of the organization. That might even lead to insights that get shared back to her team for additional use cases. But proving out the value of her domain's data is also likely to lead to more funding for their data work, so it should have a positive impact all the same.

In wrapping up, Iryna urged listeners to tie their data work more closely to the business strategy - yes, at the corporate level, but also to line of business objectives. Every bit of your work should be tied somehow to driving business value, and you should have visibility into that. Platform work can have that too, but you should understand how that platform work actually drives value. And sit side-by-side with your business leaders in your data work - that will up their data fluency far more than any formal training program ever could.
Jun 25, 2023 • 19min

Weekly Episode Summaries and Programming Notes – Week of June 25, 2023

Jun 23, 2023 • 1h 6min

#233 Panel: A Head Data Architect's View of Data Mesh - Led by Khanh Chau w/ Balvinder Khurana, Yushin Son, and Carlos Saona

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/Please Rate and Review us on your podcast app of choice!If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see hereEpisode list and links to all available episode transcripts here.Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.#233 Panel: A Head Data Architect's View of Data Mesh - Led by Khanh Chau w/ Balvinder Khurana, Yushin Son, and Carlos SaonaKhanh's LinkedIn: https://www.linkedin.com/in/khanhnchau/Balvinder's LinkedIn: https://www.linkedin.com/in/balvinder-khurana/Yushin's LinkedIn: https://www.linkedin.com/in/yushin-son-30362b1/Carlos' LinkedIn: https://www.linkedin.com/in/carlos-saona-vazquez/In this episode, guest host Khanh Chau, Director of Cloud Data Architecture at Grainger (guest of episode #44) facilitated a discussion with Balvinder Khurana, Technical Principal and Global Data Community Lead at Thoughtworks (guest of episode #135), Carlos Saona, Chief Architect at eDreams ODIGEO (guest of episode #150), and Yushin Son, Chief Architect of Data Platform & Data Products Engineering at JPMorgan Chase. As per usual, all guests were only reflecting their own views.The topic for this panel was an architect's view of data mesh, especially from an architecture lead standpoint. There are many challenges architects face in data mesh, managing the micro level minutiae, down to the data product output and input port decisions but balance that with crucial high-level decisions. Balancing the near-term and long-term vision and roadmap/North Star. 
Scott note: I wanted to share my takeaways rather than trying to reflect the nuance of the panelists' views individually.

Scott's Top Takeaways:

- Balvinder said, "…data mesh expects a lot more out of architects, there would be a lot of people and process and operation management that you will have to understand." It's important to understand that architects aren't just building out the systems but the entire org capability to do decentralized data well. It's a lot of responsibility but also a lot of interesting new challenges.
- As keeps coming up in many episodes, doing decentralized data/data mesh doesn't mean everything is decentralized. That's what federated means in data mesh: building core building blocks that make the work far easier but not trying to chase into every corner on every use case - that's not scalable/repeatable. Creating ways for people to collaborate and interoperate easily. Doing the glue work so people can focus on the specific value add of the use case/data product.
- It's absolutely crucial to understand that data mesh is not a complete vision just yet. If you are expecting to pick it up and simply run with it like it's a playbook, you're in for a bad time. The tooling isn't exactly there yet to do this easily and even if it were, we are still in the early days of learning the patterns to do it well. It's like microservices in 2013 more than 2023.
- Every team will interpret data mesh differently - and many will interpret it in a way that lets them do what they want most :) Be prepared to step in to prevent people building everything themselves and also be prepared for teams that still expect a central data team to own everything. Communication and balance will be key.
- Multiple years into your journey, 'we're figuring it out' will still be a common refrain. Don't make flippant decisions, but you'll learn and improve your processes, so leave yourself maneuvering room and don't worry about getting things perfect. Optimize for learning and iterating to better, especially on the culture side.
- Be prepared to compromise. You probably won't be able to decentralize major core systems - e.g. SAP ERP - immediately or maybe ever. While a perfect setup might be what you want, driving to value sooner is where to focus. Be practical and prepared for 'better but not good yet' kinds of outcomes as you implement, especially early.
- It's easy, especially early in your journey, to get overly focused on one domain or use case. That can be at the architecture level, the ways of working level, all sorts of governance aspects, etc. You MUST keep a balance between the micro - e.g. down to the use case or data product level - and the macro of whether a decision can scale and serve the broader organization.
- Your organization is not a greenfield. Be prepared to not look exactly like any other org. You'll have lots of constraints. Frustrating, yes, but every single implementation is messy behind the scenes. Everyone is trying to figure it out and cutting some corners and/or making missteps. Give yourself a break as you learn and iterate towards value.

Other Important Takeaways (many touch on similar points from different aspects):

- We just plain don't really have data mesh best practices yet. They are starting to emerge in certain aspects, but if you aren't ready for some ambiguity and experimentation, you aren't ready to do data mesh.
- Every organization will find different balances between centralization and decentralization for certain aspects of a data mesh journey. It's okay if your balance looks quite a bit different from others'.
- Your balance between centralization and decentralization for many aspects of your implementation will shift over time. Constantly be measuring and assessing whether things are 'good enough' and be prepared to change. But that also gives you the freedom to experiment :)
- Pre data mesh, it was hard to conceive of doing decentralized data similar to how microservices works on the operational plane. It seemed it would cause chaos and data silos. But if done well, you can give teams significant autonomy and still have a larger internal data ecosystem where data from across the organization plays nicely and can be more easily used.
- A centralized platform offering means more complexities than just one team owning a massive data lake setup - e.g. managing backwards and forwards compatibility across many domains - but it also has lots of advantages. A biggie can be that with teams owning their own infra, there is no resource contention - especially important in financial services around month-end processing.
- The old paradigm of central data ownership and control prevents fast releases, so you don't really have any autonomous work by the producing domains. And you can't keep up with things changing - changes all go through a backlog, which means the time between something happening in the real world and the change to the table in the data lake can be weeks to months. That just doesn't work for the speed of change in the real world.
- It's crucial to understand that data mesh is about building and using new muscles, ones around decentralization. Much like training, you will probably strain or pull some muscles. That's normal and to be expected. No one gets it perfect the first time around.
- Building out the interoperability at the tooling level - not even the standards to interoperate the data but the tooling to make it possible for teams to combine data across data products or source data from upstream data products - is going to be harder than you expect. Not even policy/security governance, just the way data flows independently throughout the mesh rather than one giant pipeline per use case.
- Early in your journey, you probably need to find domains that are excited to give data mesh a try. Rather than dragging a domain along, look for a true partner.
- Some degree of failure is inevitable in a data mesh journey. But the cost of failure - and our ability to learn and iterate from 'failure' to 'not failure'/value - is so drastically different in data mesh. Fast failure, while possibly scary, is a crucial concept to embrace. Quickly learning and iterating is key.
- Once you've built up the buy-in and excitement for data mesh, your job might become hype management :D People are used to data projects being built as fast as possible but with not as much focus on sustainability - data-as-a-product - or interoperability. Make sure to keep initial expectations manageable and share why doing it right is better than doing it right now.
- When setting initial expectations, maintaining flexibility for breaking changes is very helpful. You don't want to make breaking changes carelessly, but as you learn how to do data mesh, it's key to be able to scrap something that you find won't work rather than support a non-scalable platform for years to come.
- Right now, you'll probably have to build a lot more tooling than you'd want, including some custom glue between services. Hopefully that gets far better, but right now there is still a lot of building to do if you want to do data mesh well.
- If you weren't part of the microservices revolution and evolution, really study what worked and what didn't. There are many lessons learned that you don't have to relearn :)
- Yushin said, "Architects are not wizards. So we make a lot of choices and some of those choices will be wrong. And we need to experiment. And we need to pivot quickly once we find out that something's not going to work." That flexibility is necessary. If you don't have any flexibility to experiment and change, your org is probably not ready for data mesh.
- It's still very difficult to abstract away the necessary capabilities for managing data from the tooling. It's very easy to get overly tied to specific tools. That is costly, it's difficult to find specialists for many tools, and it becomes a blocker to scaling. Look to build good abstractions where possible. Better tools are hopefully coming soon.
- Your room for experimentation is much broader, especially at the architecture/platform level, early in your journey. You probably don't want to go only with more 'legacy' tools; it's okay to place some bets on more 'cutting edge' offerings. There's balance of course, but you can be a bit adventurous if you manage the risk from that experimentation well.
- Really focus on enabling self-service for the domain teams - there are certain aspects of governance that should be enforced automatically by the platform so teams don't have to reinvent everything and then it all becomes a mess because everyone is handling things differently. And it's okay to delay tackling use cases because you aren't ready to meet all their needs just yet.
- It's going to take far longer than you'd like to sunset existing platforms and the data assets/data products using them. Don't press for lift and shift. Potentially look to the Strangler Fig Pattern as you replace existing use cases. And REALLY assure people you aren't going to decommission existing data platforms overly quickly.
- Look to prevent copies of data that don't have a very clear reason to exist. If you can always get the data, do you need to persist it? There are storage costs, yes, but a bigger cost might be the headaches around governance - including bad actor access risk - of keeping track of all those copies… As Carlos recommended, look to push the cost of that governance onto teams that want to keep copies and see if they still want to make copies of data :)
- Potentially look to leverage tools that use open formats so you don't have to do custom integration around tools. Scott personal note: this is where I think data mesh can actually push the biggest industry changes. Imagine no more crazy proprietary formats - tools just working together!
- You can drive data mesh participation buy-in from consumers by showing the general friction and pain points of your existing data setup. Almost every team has a lot of pain points. Can you really trust upstream data right now, or do you want a contract that provides you a lot of assurances?
- Cost will be a major factor in a lot of your architectural choices. Would that things were relatively equivalent, but pricing across vendors is often extremely different. Be prepared to use more tools than you might want.
- Different teams will have different capability levels regarding data. Some needs will often fall back to a centralized team, whether that is the platform team or another shared central team. In an ideal world, it would all be decentralized ownership, but sometimes the cost of creating a full data function inside a domain isn't worth it. That's okay because that team shouldn't be a bottleneck if it's only handling a few teams - but be careful 😅
- Right now, a real value differentiation of data mesh is the ability to scale with nimbleness and agility while managing data quality. But for most organizations, at least right now, their data mesh implementation isn't super cost optimized or performant. Cost optimizations are typically second order concerns as you start out but will become bigger and bigger topics.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
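The panel's point about "a contract that provides you a lot of assurances" can be sketched in code. This is a minimal, hypothetical example of validating a data product's output records against a declared schema; the schema, field names, and function are invented for illustration, not taken from any platform discussed in the episode:

```python
# Minimal sketch of a data contract check. A producer declares the schema its
# output port promises; consumers (or CI) can verify records against it.
# All names here are hypothetical.

EXPECTED_SCHEMA = {
    "order_id": str,
    "amount": float,
    "placed_at": str,  # ISO-8601 timestamp kept as a string for simplicity
}

def validate_record(record: dict) -> list[str]:
    """Return a list of contract violations for one record (empty = valid)."""
    errors = []
    for field_name, expected_type in EXPECTED_SCHEMA.items():
        if field_name not in record:
            errors.append(f"missing field: {field_name}")
        elif not isinstance(record[field_name], expected_type):
            errors.append(
                f"{field_name}: expected {expected_type.__name__}, "
                f"got {type(record[field_name]).__name__}"
            )
    return errors
```

In practice you would use a real contract/schema tool rather than hand-rolled checks, but the shape is the same: the assurance lives in an explicit, machine-checkable agreement rather than in trust alone.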
undefined
Jun 19, 2023 • 1h 12min

#232 It's About the Value, Not the Data - Effectively Partnering With the Business - Interview w/ Aaron Wilkerson

Transcript for this episode (link) provided by Starburst.

Aaron's LinkedIn: https://www.linkedin.com/in/aaron-wilkerson-81bb21a/

In this episode, Scott interviewed Aaron Wilkerson, Senior Manager of Data Strategy and Governance at Carhartt. To be clear, he was only representing his own views on the episode. Apologies for the lawn work sounds around the middle of the episode :)

Before we jump in, this episode contains a lot of really good framing on how data leaders can actually partner with business people to drive to what matters for them. How do you extract what matters to the organization and to each specific business partner? And then how do you tie the data work to that? So while this episode is not heavy on data mesh specifics, it's really valuable for considering the business partner's point of view and how to work with them to drive value for the organization.

Some key takeaways/thoughts from Aaron's point of view:

- ?Controversial?: Aaron (and Scott) laid out a challenge for data leaders: have a conversation this week with a stakeholder and never mention data. We get too wrapped up in the data instead of listening and understanding stakeholder business challenges.
- When thinking data strategy, you should first think business strategy.
- At the end of the day, it all comes down to how data can support the business in its objectives, not doing data work for the sake of data work. What are the key business goals and target outcomes?
- Business people very rarely care about the how of data, the sausage making. Don't try to communicate to them about the how; focus on the what and the why. Really drive towards what they are trying to accomplish and work backwards to what data work you can do to support them.
- When working with business partners, yes, you can try to teach them to be more data driven. But most of the conversation should still be in the language of the business. Data work itself doesn't drive value; how it supports business partner value creation is crucial. Make sure you understand what matters and then communicate how you will help them.
- Dig into what business people are measured on, their KPIs, to best figure out how to support them with data work. How others are measuring their success is a very important indicator.
- ?Controversial?: Data leaders really need to understand and take to heart that data just doesn't play that big a role in the day-to-day machinations for most business people. It can play a big role in supporting their work; it's just never going to be a top-of-mind focus for most.
- ?Controversial?: If you come to business people with the data first instead of starting from their business challenges and goals, you are essentially asking them to do the work of figuring out if you've brought them something of value. Start from asking them what they'd value and work towards creating that.
- Focus on the data work supporting your business partners where they're at - not where you wish they were - to move forward. Speak with them and _listen_ to their challenges. Then find ways to bring data in to support and improve their work-streams.
- When tying data work to the business strategy, while many people might have the same target outcomes, they all have different responsibilities. Really dig into which aspect the person you are working with needs to make happen so you can best support them. Scott note: really important nuance to consider.
- When talking to an exec, be concise. "… here's the challenge we're facing, here's the outcome we want to get to, and here's next steps. And then are you all aligned to us?"
- Try to focus your data work on taking work away from your stakeholders instead of adding more things to their plate.
- Just because data might drive incremental benefit, it might not be worth it. Think return on investment, not just return. Data people have to work with business people to ask the question of what would actually get them to act. A 10% improvement on one aspect of their work probably won't get buy-in. Find what would get them excited to move forward and focus on supporting those initiatives with data work.
- By starting from the problems people have, their pain points, your business partners will be bought in that any insights you find are likely valuable. Otherwise, if it is an incremental opportunity they hadn't considered, you need to read them in much more on the details of why the problem matters before even getting to the insight. That's again more work for them.
- You will probably have to build a track record of quick wins before business partners look to rely on you for bigger and bigger projects. And that's okay - you can get to know their general problem areas well through the quick wins.
- While data people often want to share everything they find, it can be a double-edged sword. Just letting people know about something can be okay, but if you identify a new challenge or opportunity that isn't closely aligned with their current focus areas, you are asking them to do more work to identify whether it should be a new focus. Think carefully about whether and how to bring those up.
- To be able to partner well with people down the road, first make sure you "stabilize, build the trust, and get everyone kind of on the same page around … what we need to do to fix [any] leaks." Lay that groundwork and start to build off a good foundation before trying to do large data projects.
- Learn to say no and to prioritize. Don't stretch yourself too thin or you won't deliver as well as you could, and you will likely not build the trust necessary to really drive the value you want.
- Communicating what you're doing - really marketing your work and success stories - is crucial in data leadership. If all you are focused on is doing the work, you won't build the momentum and have people coming to you to partner on work.
- Focus on learning the language of the business. Yes, it would be great if everyone were data fluent, but the main value drivers for most companies happen at the business level.
- RACI matrices are a very useful and important tool. Scott note: this comes up A LOT - get specific on responsibilities and ownership. Overcommunicate on both aspects.
- It's amazing how important making your business partners feel seen and heard is, not even 'fixing' their pain points. People are far more likely to work with people that take the time to understand them. It sounds simple - and it kind of is - but it's crucial to laying the groundwork for a good partnership.
- ?Controversial?: When you aren't in a conversation with a fellow data person, lean in to talking in the language of the business. Too many data leaders are still entrenched in the ways of doing data instead of supporting the business via data work.

Aaron started with a bit about his background and how he's been upping his game around better partnering with the business and focusing more on the strategy aspects - data and non-data strategy.
He was hitting a career ceiling from focusing so much on the data work itself and communicating by showing people the data. Since he started digging deeper into data strategy, everything points to the business strategy, so the last few years have been about getting smarter on aligning the data strategy to the business strategy and doing data work to support key business objectives. What are the target business outcomes and how can data / data work support those?

For Aaron, when he looks at companies' business strategies, it's pretty rare to even see the word data featured. Maybe it's to become data-driven, but it's certainly not 'build a data warehouse' or anything at the tactical level. It's drive revenue, reduce costs, etc. So a key approach is doing the logic gap work - what do they care about, and then think how data might support that. At the end of the day, they care that they get help on their key initiatives, not whether you built a data warehouse or leveraged some new paradigm or tool to drive that help.

To best partner with business people, Aaron recommends digging into their incentivization. Not just their goals but what are the measures of their success, their KPIs? How can you support them in doing a better job at what is most important to their role? Understanding what drives them will help you best understand how data work can support them, which in turn should support the business strategy. And it's crucial to really take to heart the fact that data just isn't that big of a part of the day-to-day work of most people inside the business. We want data to make that work more impactful, but trying to get everyone to be a citizen data scientist or whatever is just not going to work or drive good business value.

Aaron talked about two different ways of bringing data to business people - starting from a data-first approach or starting from the business pain points and working to the data. If you come with the data-first approach, where you've done some analysis and ask them if it is of value, you are asking them to do more work to evaluate whether it's useful to them. That's probably not going to win you many friends. That's the tail wagging the dog. The business people care about certain things; start from listening to them and then work to support them via data. You want the data to make their lives easier, not to add more work to evaluate what you've found and for them to figure out if it even matters to their strategic objectives and KPIs.

In a lot of the same vein but from a different angle, it's really important to meet your business partners where they are with data. Look at their work-streams and their goals/challenges and find ways for data work to better support what they are doing instead of trying to reinvent the wheel or push them in an entirely new direction. The more friction to change, the more likely even potentially very valuable insights won't drive value because they won't drive actual action.

Aaron made a really good point: while the organization may have target outcomes, each person working to drive to those outcomes has different responsibilities - and measures of success - and it's important to figure out what aspect they own and how you can help them specifically with data. And be concise! He said a good way to think about it is in 3 slides to get your point across: "… here's the challenge we're facing, here's the outcome we want to get to, and here's next steps. And then are you all aligned to us?" Try to focus on making sure you are reducing workload for people in most instances rather than, again, making them do more work to get to value. Find ways to make it easier for them to focus on more and more valuable work and automate the important but rote stuff.

Understanding business partners' goals is only one aspect of partnering well, according to Aaron. You have to understand the magnitude of an improvement that would get them to move. There is friction to change, so a 10% improvement is often not going to be enticing enough. Think about Aaron's example of a rival cable company offering you a $2/month discount to switch. And you also need to understand how you would communicate opportunities to your specific business partners to spur them to action. How do we market our data-driven insights and improvements?

While we would probably all like to make major impacts with data, Aaron has seen it's more likely you need to put together a track record of small quick wins to gain the trust of many business partners. Once you've done that, they will be more ready to partner with you on bigger projects. And while you get to those quick wins, you can more easily understand what drives them and where their pain points lie, so you are better prepared for a bigger project when they bring it to you.

Aaron talked about the difference between bringing up new challenges you've found and planting seeds for something that might be important down the road. Planting seeds - letting people know a few tidbits of information along the way - is great to lay the groundwork for future work together. But most people will _not_ thank you for bringing a new challenge to their attention if it's not a major unknown issue or opportunity. They are focused on what they are most aligned to; they have to do more work to assess whether the new opportunity or challenge you brought to them is worth their time. So consider carefully whether to bring something like that up and, if yes, how. Again, try not to give people more work unless it's of significant value to them.

To get to a place where you can work on high-impact, large data work, Aaron recommends making sure you build from a solid core. Look to stabilize what data work is already occurring, look for and address value leaks, and build trust. Then you will get to a better partner position. And look to stay away from constantly trying to generate ideas instead of listening to your business partners' ideas. They know their business better than you do and will almost certainly have better intuition about where to focus.

Aaron also believes it's really important to prioritize and learn to say no. Don't stretch yourself too thin. Again, focus on stabilizing and then start to work with people on the future state, but don't add too much to your plate. Saying no is an incredibly powerful tool.

Another powerful tool Aaron recommends is marketing your work internally. Many data people fall into the trap of trying to focus simply on doing good work and not sharing what they are doing widely enough. A lot of data people got into data to focus on doing interesting work instead of heavy communication work :) But it's important for data leaders to constantly be sharing what's going on and just communicating in general. To understand where to focus your work, you need to understand people's needs. And you need people aligned with what you are doing and why. Talk to them about the top challenges, what you are doing to address them, etc. Eventually, you can increase the data fluency level of your business partners to a degree where they can help you make strategic decisions too, but that only comes with constant communication.

On the topic of data fluency, Aaron believes it's probably better in most organizations for the data people to learn the business language and what drives the business rather than trying to teach a lot of deep data concepts to hundreds or thousands of business people. The business processes are what drive value - yes, data can significantly improve the business processes, but without understanding which ones to focus on and how to improve them, does the data work really have much value?

Aaron, as many past guests have also noted, really recommends leveraging RACI matrices - responsible, accountable, consulted, and informed. Get really specific on who is responsible for what - be explicit and over-communicate to make sure there aren't misunderstandings or gaps.

While it gets said a lot, making people feel seen and heard ends up having a large impact on partnering in Aaron's experience. "You're more likely to work with folks who you feel understand you versus people who were talking about something … you don't at all care about." If you learn people's pain points and talk to them, even if you can't address them right now, they are far more likely to lean into the conversation and future partnership.
undefined
Jun 18, 2023 • 36min

Weekly Episode Summaries and Programming Notes – Week of June 18, 2023

undefined
Jun 16, 2023 • 24min

#231 Zhamak's Corner 24 - Can We Change Mistakes to "Happy Little Accidents" in Data?

Sponsored by NextData, Zhamak's company that is helping ease data product creation.

For more great content from Zhamak, check out her book on data mesh, a book she collaborated on, her LinkedIn, and her Twitter.

Takeaways:

- We can do better in data than what we did learning decentralization in services/the operational side: "We have to level up. We can't repeat the past mistakes. Let's not be silly and fool ourselves just because we have a schema, now we have an amazing system."
- The services world has learned good ways of communicating between producers and consumers. We should look to learn more from them and look to adapt, then adopt, what works well.
- We need to change our approach to measuring and reflecting on past decisions - we might have made a decision based on not-great information; does that mean the decision was bad simply because it didn't work out? Probably not. But as Ari Gold said in Entourage, "There are no asterisks in life, only scoreboards…" Can we really get to a place where we allow those asterisks?
- Zhamak believes we can adopt many software development practices across data - that's pretty key to data mesh - but one area people seem to be skipping over is things like decision records: what were you thinking when you made a past decision, what did you know, and what were your hypotheses? It's easy to judge results, but it's better to judge the judgment :)
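The decision-record idea above maps directly to Architecture Decision Records (ADRs) from the software world. A minimal sketch of capturing "what did we know and what were our hypotheses" at decision time follows; the structure and example values are a generic illustration, not a format Zhamak prescribes:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DecisionRecord:
    """A lightweight decision record: judge the judgment, not just the result."""
    title: str
    decided_on: date
    context: str                                   # what we knew at the time
    hypotheses: list[str] = field(default_factory=list)
    options_considered: list[str] = field(default_factory=list)
    decision: str = ""
    revisit_when: str = ""                         # trigger for re-evaluating

# Hypothetical example entry
adr = DecisionRecord(
    title="Adopt per-domain storage accounts",
    decided_on=date(2023, 6, 1),
    context="Month-end resource contention on the shared data lake.",
    hypotheses=["Isolation removes contention", "Cost overhead is acceptable"],
    options_considered=["Shared lake with quotas", "Per-domain accounts"],
    decision="Per-domain accounts",
    revisit_when="If cross-domain query latency degrades noticeably",
)
```

The point isn't the tooling - a text file per decision works just as well - but that the record is written when the decision is made, so later reviews can evaluate the reasoning rather than only the scoreboard.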
undefined
16 snips
Jun 12, 2023 • 1h 16min

#230 Getting Real About Data Product Management in Data Mesh - Interview w/ Frannie Helforoush

Transcript for this episode (link) provided by Starburst.

Frannie's LinkedIn: https://www.linkedin.com/in/frannie-farnaz-h-a7a11014/
Post on the Product Trio concept by Teresa Torres: https://www.producttalk.org/2021/05/product-trio/

In this episode, Scott interviewed Frannie Helforoush, Technical Product Manager/Data Product Manager at RBC Global Asset Management. To be clear, she was only representing her own views on the episode.

Some key takeaways/thoughts from Frannie's point of view:

- There is a difference between the product mindset and creating/maintaining data products, but both are very important to exchanging value through data. We should be looking to apply the product mindset to all aspects of our data work, not just how it applies to data products specifically.
- To do data product management well, you should look to software product management practices and recontextualize those to data. Many map well but some don't.
It's not a copy paste, think through what should be applied to data differently.The data product manager needs to serve as the bridge between data producers and consumers, making sure consumer requirements are satisfied much like with a software product manager where they are the bridge between software engineering and software users.If product management is the intersection of business, tech, and user experience (UX), how should we think about that for data product management? Tech and business are easy but there isn't a user interface (UI) so think of UX in terms of data fluency, access, and documentation.Relatedly, documentation around data products is more important than traditional software products because there isn't really a tangible UI for data products.Data discovery platforms are very useful for data consumers because they are not only for discovering that data products exist but they are also great for discovering information about the data products - they provide a good initial understanding of data products before digging deeper.In data mesh, you shouldn't think of one single data platform to rule over everything. Users care about capabilities and their workflows/UX, not about exactly how things are stitched together on the back-end. Don't have 100 platforms but don't also create a monolithic beast you can't evolve.?Controversial?: Product discovery - the act of discovering what product(s) you should build - is crucial and often overlooked in the data product space. Many organizations are waiting for requests instead of data producers discovering what data products users might want.While we should contextualize product management functions and needs to data product management, oftentimes the terminology gets all jumbled up. 
Be very explicit in defining terms to other stakeholders to prevent as much confusion as possible - but there will still be confusion :)To help guide you in creating the right data product(s) to serve a use case, you really need to work closely with consumers - what are the day-to-day business problems of the use case? Then you can start to work backwards towards creating the right data products to serve the use case.Make sure to dig into who will actually use a data product and how will it drive value for them. There are far too many business processes that are wildly inefficient or even unnecessary simply because people aren't asking - or asking again - those questions. Don't let your data product be inefficient, be intentional in asking.Leverage the 'Product Trio' concept for data product management. While you won't have a design lead, find someone that can still help to optimize the data user experience (DUX) instead of the UI as you would with software. Again that UX is data fluency, access, and documentation.Use product management frameworks to deep dive into potential data users' actual challenges. Get specific so you can find points of leverage to improve the process. To actually drive value, you have to drill down.A 'fast fail' culture/approach is crucial to getting the best value from data. Producers and consumers can quickly test what works and iterate towards better instead of making data work all or nothing. Failure in data has to be something we are okay with.Potentially the hardest aspect of data product management is defining success and the metrics to measure your success. Don't feel bad if you are struggling here too :) It's okay to start with less than great metrics and iterate towards better.Enabling self-serve for consumers of data products should fall on the platform rather than on the data product owner. If the platform isn't robust enough, you shouldn't make every data product manager build out those capabilities to make up for it. 
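The point above about starting with imperfect but directionally useful success metrics could be sketched roughly like this. This is a hypothetical illustration, not something discussed concretely in the episode; the signals chosen (SLA hit rate, active users, satisfaction) are assumptions:

```python
# Hypothetical sketch of simple, imperfect "directional" success metrics
# for a data product. None of these alone equates to value, but together
# they give a starting point you can iterate on.
def data_product_health(sla_checks: list,
                        monthly_active_users: int,
                        satisfaction_scores: list) -> dict:
    """Aggregate a few easy-to-collect signals into one report."""
    # Fraction of periodic SLA checks (booleans) that passed
    sla_hit_rate = sum(sla_checks) / len(sla_checks) if sla_checks else 0.0
    # Average of consumer satisfaction survey scores (e.g. 1-5 scale)
    avg_satisfaction = (
        sum(satisfaction_scores) / len(satisfaction_scores)
        if satisfaction_scores else 0.0
    )
    return {
        "sla_hit_rate": round(sla_hit_rate, 2),
        "monthly_active_users": monthly_active_users,
        "avg_satisfaction": round(avg_satisfaction, 2),
    }

report = data_product_health([True, True, False, True], 37, [4.0, 3.5, 5.0])
print(report)
```

The design choice here mirrors the "directionally valuable" framing: start with whatever signals are cheap to collect, then replace or refine them as you learn what actually correlates with value for consumers.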
Frannie started off with a bit about her background and how that has helped her adopt the data as a product mindset, applying product thinking when creating and maintaining data products. A lot of it is about mapping traditional software product management to data product management: many things translate pretty well 1:1 but a lot of things _don't_, and knowing the difference is a crucial point of leverage. The data product isn't the point, it's merely a way to exchange information to support a use case. In software product management, the product manager "serves as a bridge between software engineering and software users". In data product management, the product manager should serve as the bridge between data producers and consumers to ensure consumer requirements are satisfied.

Martin Eriksson's idea that product management is the intersection of tech, business, and user experience is foundational to Frannie. However, what actually is a user experience (UX) relative to a data product? Much of the user interface (UI) is actually owned by the platform itself rather than the data product, which makes UX a little more nebulous. She believes we should break down the UX into three categories: data fluency, access, and documentation. Access is the easiest because it's about making sure people can request access and can grant access easily - yes, automatic access is great too but that's not always possible. Data fluency is about how well the data product communicates its information to target audiences. Can people understand what the data product contains, why it exists, how things are organized, etc.? Lastly, documentation unfortunately seems to be both harder and more important in data product management than in software product management. In software there is a tangible user interface, but that's not really feasible for data products, so documentation is the way to guide people to understanding.

For Frannie, data discovery platforms are really valuable for consumers to understand what data products exist. And once they find data products that might be of interest, the platform allows them to learn more about each one. Companies should make it easy to find the most useful information about a data product all in one place, e.g. the lineage, the purpose of the data product, sample queries, sample data, etc., as well as request access. Scott note: this is partly a trapped metadata challenge. Many tools trap their metadata, so it's hard to bring things like SLA observability directly into the data discovery platform.

There is a big difference between data discovery and product discovery - or in this case data product discovery - in Frannie's terminology. In software product management, product discovery is about finding the needs users have and looking to fill them with product(s) - basically, discovering what products or features of products should exist. This is often overlooked in data product management because so much of the genesis for data products is defined use cases, where the consumers come to the producers with their use case at least somewhat well scoped and understood. But data product discovery is a very useful practice to at least investigate, if not lean more heavily into. Scott note: I've been saying this for literally over a year - communicate and spark the art of the possible with data consumers!

Frannie talked more about the process of discovering the right data product(s) to create. You want to dig in deep with the potential users. They might have ideas about what data they want, but the most important aspect is: what is the day-to-day challenge or opportunity they are looking to address? What is the actual business process and how do they want it improved?
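As an illustration of the kind of "all in one place" information a data discovery platform might surface for a single data product, here is a minimal sketch. All names, fields, and URLs are hypothetical, not from the episode:

```python
# Hypothetical sketch: the metadata a data discovery platform might
# surface for one data product (all names and fields are illustrative).
customer_orders_product = {
    "name": "customer_orders",
    "purpose": "Daily snapshot of confirmed orders for downstream analytics",
    "owner_team": "order-management",
    "lineage": ["orders_service.orders", "payments_service.settlements"],
    "sample_queries": [
        "SELECT order_id, total FROM customer_orders WHERE order_date = CURRENT_DATE"
    ],
    "sla": {"freshness_hours": 24, "availability_pct": 99.5},
    "access_request_url": "https://example.com/request-access/customer_orders",
}

def discovery_summary(product: dict) -> str:
    """Render the short summary a consumer might see before digging deeper."""
    return (
        f"{product['name']}: {product['purpose']} "
        f"(owned by {product['owner_team']}, "
        f"freshness SLA {product['sla']['freshness_hours']}h)"
    )

print(discovery_summary(customer_orders_product))
```

The point of bundling purpose, lineage, sample queries, SLA, and an access-request path together is exactly what Frannie describes: a consumer can form an initial understanding of the data product before digging deeper or requesting access.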
That will help tell you what data product(s) you need to create for the use case and how you need to shape them to address the business process the users want to make better. Essentially, consumers are experts in what they are trying to fix; don't rely on them to be experts in what data to share and how - that's for the producers to own.

The 'Product Trio' concept from Teresa Torres has really been helpful to Frannie and team around data product management. While it comes from the product management space - where the trio is the product manager, the technical lead, and the product designer - it needs to be altered for data products, since the designer role doesn't make as much sense when there isn't a direct UI. So you still have a product manager and a tech lead of some kind, and then someone more aligned to the user, maybe a data scientist or a data architect, to help ensure a positive data user experience (DUX). Again, Frannie breaks that DUX down into three components: data fluency, access, and documentation.

Frannie recommends looking at product management frameworks for leading product discovery sessions, e.g. the value proposition framework. Frameworks can help you get to actual tangible information about business processes and their challenges. Think about concrete ways to qualify and potentially even quantify aspects of the business challenges. Get pretty specific: instead of a laundry list of potential use cases, drill into one or two and figure out how to help. Scott note: this seems to be a recurrent issue in data - lack of specificity around use cases and needs. It's part of why requirements gathering fails. Do not skimp on this process :D

Failing fast is a concept Frannie believes we need to adopt - and maybe adapt - in data. That's what prototyping is about: being able to quickly get to value. Failure has historically been somewhat of a catastrophe in data because so many things have been essentially all or nothing with huge budgets. But with things like data mesh, and in general with collaborative, iterative data work between producers and consumers, failure doesn't mean failing forever - it leads us to places to improve. So while we might need new terminology around fast fail - data people seem to hate the concept of failing at all - it's a practice that will help us quickly iterate towards better solutions and more value. A fast fail culture with smaller blast radii can really highlight how people will actually use a data product, instead of relying on a pre-planning phase where you try to settle on how both sides _think_ a consumer will use it.

Frannie shared what probably all data product managers feel but few hear: it's very difficult - and probably a bit squishy - to define success for a data product and then also define the metrics to measure that success. E.g. just pointing to the number of users doesn't necessarily equate to value. User satisfaction is a useful measure, but obviously it doesn't cover every aspect. How well are you doing at hitting your SLAs? It's okay to start with non-perfect metrics as long as they are directionally valuable. Scott note: I recommend listening to episode #95 with Dave Colls on fitness functions if you want to dig deeper into success metric tracking.

Quick tidbit: it's important to build out your data consumer self-serve capabilities to give consumers a better feeling of being able to actually get necessary access. And it's even more important to make sure that is handled by the platform and not the individual data products.

Data Mesh Radio is hosted by Scott Hirleman.
If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/

If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/

If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here

All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf
