Please Rate and Review us on your podcast app of choice!
Get involved with Data Mesh Understanding's free community roundtables and introductions: https://landing.datameshunderstanding.com/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here. You can download their Data Mesh for Dummies e-book (info gated) here.
Jimmy's LinkedIn: https://www.linkedin.com/in/jimmy-kozlow-02863513/
In this episode, Scott interviewed Jimmy Kozlow, Data Mesh Enablement Lead at Northern Trust. To be clear, he was only representing his own views on the episode.
Also, FYI, there were some technical difficulties in this episode where the recording kept shutting down and had to be restarted. So thanks to Jimmy for sticking with it, and hopefully it isn't too noticeable that Scott had to ask questions without hearing the full answer to the previous one.
There is a lot of philosophical discussion in this conversation, but it's tied to very deep implementation experience. It is hard to sum up in full without writing a small novel. Basically, this is probably one to listen to rather than just reading the notes.
Also, Scott came up with a terrible new phrase, asking people to "get out there and be funky."
Some key takeaways/thoughts from Jimmy's point of view:
Jimmy's role is a rather unique one in that he is literally tasked with enabling data mesh to happen and making sure they are focusing on the right things. That means a myriad of different things, but a lot of it is ensuring the communication and collaboration happens where it needs to while still focusing on the big picture. It might be like an American football coach who coaches the entire team but is still calling plays and making the minute decisions during the game too. It's a big set of tasks to take on.
At Northern Trust, they started their mesh implementation with the innovators according to Jimmy. That's a common pattern for all tech innovation, not just in data mesh - find the people enthusiastic to try something new so you don't have to spend half your time driving buy-in. There obviously also had to be a need where data mesh could work and would be a differentiator.
For Jimmy, one big complexity factor he is seeing is around data products. He believes you really start to drive the value of a data product higher the more relevant and interoperable data sets you can include in it - within reason of course. But as you add that fourth or fifth data set, it gets complex to maintain interoperability and consumability even at the data product level. That complexity is where a big part of the value in data mesh really lies - it pushes organizations to take on that complexity, but in a scalable way.
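To make that concrete, here is a minimal, hypothetical sketch - all names (DataProduct, join_keys, the example domains) are invented for illustration, not anything described in the episode. Each data set in a product declares the keys it can join on; the shared join surface is what keeps the product consumable, and it tends to shrink as that fourth or fifth data set arrives:

```python
from dataclasses import dataclass, field

@dataclass
class DataSet:
    """One data set inside a data product, with the keys it exposes for joins."""
    name: str
    join_keys: list[str]

@dataclass
class DataProduct:
    """A hypothetical data product bundling several interoperable data sets."""
    name: str
    owner_domain: str
    data_sets: list[DataSet] = field(default_factory=list)

    def shared_keys(self) -> set[str]:
        """Keys exposed by every data set - the join surface that keeps the
        product consumable as more data sets are added."""
        if not self.data_sets:
            return set()
        keys = set(self.data_sets[0].join_keys)
        for ds in self.data_sets[1:]:
            keys &= set(ds.join_keys)
        return keys

# As data sets are added, the shared join surface can shrink - one way the
# maintenance complexity Jimmy describes shows up concretely.
product = DataProduct(
    name="client_positions",
    owner_domain="custody",
    data_sets=[
        DataSet("accounts", ["client_id", "account_id"]),
        DataSet("holdings", ["account_id", "instrument_id"]),
    ],
)
print(product.shared_keys())  # {'account_id'}
```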
There is a bit of a push/pull in data mesh for Jimmy: organic data product growth from new use cases versus the centralized team pushing for the value of more and more interoperable data - basically creating data products that fill the gaps between existing data products to enable additional use cases. New use cases emerge the more data products you have that are well crafted and that link data across domains; but the question becomes: do you push for new data products that don't have a specific use case - inorganic growth - in order to create a tipping point where those use cases can quickly emerge? It would mean less work for consumers and faster time to market if the data is already available instead of having to work with the producing team. But will the data products be made well enough to serve use cases? Are people ready to go and discover pre-made data products instead of ones tailored to their needs? Should this only apply to new data products? And how would a central team spot when additional attributes are needed to take an emerging data product from serving only one use case to being more globally valuable? It's still an open question in data mesh whether to pursue inorganic data product creation.
At the start of their journey - and even though they are two-plus years in, it's still relatively early - Jimmy and team are focusing on existing, known use cases as they build out the available data and improve their capabilities. Capabilities not just to deliver new data products but to deliver incremental data that fits well into the set of data already available on the mesh. It's about building out the entire picture instead of focusing too much at the micro level. Scott note: balancing that micro and macro level is hard - extremely hard, even - but the earlier you get good at adding value at the overall mesh level while serving use cases, the more value you will deliver with each incremental data product.
Jimmy talked about how, with data mesh, even though we can deliver scalable data products relatively quickly, there can be more initial friction for new use cases. E.g. teams have to go and collect the necessary information to do the governance well upfront instead of trying to add the governance at the end. Some might be frustrated or not bought in that the upfront friction is worth it, so he's trying to show that the value far exceeds that initial extra friction - but people will of course resist new ways of working. Such is the nature of working with humans 😅
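As a hedged illustration of that upfront governance friction - the required fields below are assumptions for the sketch, not Northern Trust's actual checklist - a simple pre-publish check might look like:

```python
# Hypothetical governance metadata a data product must carry before it ships,
# gathered upfront instead of bolted on at the end. Field names are invented.
REQUIRED_GOVERNANCE_FIELDS = {"owner", "data_classification", "retention_policy", "contains_pii"}

def missing_governance(metadata: dict) -> list[str]:
    """Return required governance fields not yet provided - the upfront
    friction Jimmy describes, traded for less cleanup later."""
    return sorted(REQUIRED_GOVERNANCE_FIELDS - metadata.keys())

print(missing_governance({"owner": "custody-team", "contains_pii": True}))
# -> ['data_classification', 'retention_policy']
```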
When asked what success looks like for them, Jimmy pointed to showing value. It's important to note that isn't merely delivering value but being able to show that value. As Jerry Maguire said, "SHOW ME THE MONEY!" Being able to show value generation helps build momentum and adoption. If you are proving value, it's not nearly as difficult to get incremental investment. Teams want to participate and capture value too. Excitement builds. But it's also important to note that what success looks like will change - maybe not wholesale but at least in part - in different phases of your implementation.
For Jimmy, there are two important aspects of your implementation, essentially the setup and the knockdown. You have to set your implementation up for success by building out the platform and capabilities, but getting teams to actually adopt is still crucial. Even if you've built something amazing, you still have to work with people on the shift in mindset and approach to get them to buy in and adopt. A great platform that no one is using isn't really a great platform…
Trying to keep up the momentum of the whole mesh implementation until you reach critical mass is very challenging. Jimmy talked about how hard it can be to get teams with quite different capability and speed levels to work together. No organization is built to all move together as one - it's not a car built to move as one unit, it's more like a group of cars - so you need to really focus on the coordination, collaboration, and especially communication. ABC - Always Be Communicating 😎
Jimmy believes that the central data team in data mesh should be a key point of leverage. They can jump in to help a domain early in their journey to get something delivered while raising that domain's capabilities. A central team can bring repeatable patterns to find easy paths for new domains. That way, the domains still learn by doing but they don't have to learn by repeatedly failing. But once a domain is capable enough, the central team tries to move out quickly to give the domains autonomy to do what's valuable. They are also building a community of practice for practitioners to share insights with each other, providing even more leverage and discovering more repeatable, high value patterns.
When asked about bringing a domain up to speed and the question of complexity, Jimmy strongly believes you should start simple. You don't hand the knife to the 7-year-old who wants to help you cook and let them go wild on day 1. Get them into a groove, build their confidence and their understanding of how to deal with data, and then you can start to think about adding complexity. Dealing with data is complex enough - keep it simple for them to deliver value initially.
When thinking about success and metrics like a new domain's time to first data product, Jimmy believes you can learn where there is friction, but the actual times vary quite a bit. You might see lengthening times for new domains to launch a data product, and that can be a good thing: it means you are dealing with domains that really aren't sure what they are doing and have to learn a ton about data in general and their own data in particular - you are penetrating the less data-savvy parts of the organization. All things being equal you want to go faster, but take the raw numbers with a grain of salt.
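As a toy illustration of taking those numbers with a grain of salt - the domains, dates, and figures below are entirely invented, nothing like this was shared in the episode - you might report the spread of time-to-first-data-product rather than a single average:

```python
from datetime import date
from statistics import median

# Hypothetical onboarding records: (domain, onboarding start, first data product live).
onboarding = [
    ("treasury",   date(2022, 1, 10), date(2022, 3, 1)),
    ("custody",    date(2022, 4, 5),  date(2022, 5, 20)),
    ("fund_admin", date(2022, 9, 1),  date(2023, 1, 15)),  # a less data-savvy domain
]

days_to_first = [(live - start).days for _, start, live in onboarding]

# Report the spread, not just one number: per Jimmy's point, lengthening times
# can simply mean you are reaching less data-savvy parts of the organization.
print(f"median: {median(days_to_first)} days, "
      f"min: {min(days_to_first)}, max: {max(days_to_first)}")
```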
There's also the fun of trying to thread the needle of data modeled for the initial use case - so fit for purpose - yet also modeled so it fits well with and is interoperable with the rest of the data in your available mesh of data products. Jimmy said this is especially true of domains just getting up to speed with their data and data modeling, so prepare for them to move slower and need more help.
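One common way to thread that needle - shown here as a hypothetical sketch, not Jimmy's described approach; the field names and IDs are invented - is to keep the domain's fit-for-purpose local model but re-key records with a mesh-wide identifier other data products can join on:

```python
# Hypothetical mapping from a domain-local key to the mesh-wide client_id.
LOCAL_TO_MESH_CLIENT_ID = {
    "CUST-0042": "client_8812",
    "CUST-0099": "client_1077",
}

def to_mesh_record(local_row: dict) -> dict:
    """Add the shared client_id without dropping the fields the original
    use case needed, keeping the data both fit for purpose and interoperable."""
    return {**local_row, "client_id": LOCAL_TO_MESH_CLIENT_ID[local_row["cust_ref"]]}

print(to_mesh_record({"cust_ref": "CUST-0042", "balance": 1250.00}))
# -> {'cust_ref': 'CUST-0042', 'balance': 1250.0, 'client_id': 'client_8812'}
```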
When asked about where there is friction in the process of bringing on new domains that we _shouldn't_ try to reduce, Jimmy pointed to learning and understanding. People need to take the time to understand how to do data work, but also the new ways of working and why the organization is going in this direction. It's the old question: are you trying to get them to follow the steps you prescribe, or to achieve the target outcome you set? Learning takes time, don't rush it.
Jimmy's role is pretty unique as far as other organizations telling their story. He's focused on helping people focus in the right areas but also on helping them connect the dots. There is such a big picture when you think about the entire information landscape of an organization, and helping people connect to each other and see where they could enhance that bigger picture is highly valuable. It drives better value while also reducing miscommunication, duplication of work, and wasted time. Scott note: this is somewhat similar to the 'Data Sherpa' concept I've mentioned repeatedly - just about everyone looks at me like I'm a madman when I bring it up.
The question of what to decentralize versus centralize is a tough one for every organization doing data mesh. Jimmy pointed to the fact that where