
Data Mesh Radio

Latest episodes

Apr 20, 2022 • 11min

#62 Can We Make Data Mesh 'The Good Place': What We Owe Each Other - Mesh Musings 11

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
Apr 19, 2022 • 1h 16min

#61 Driving Value Through Participating in the Data Economy - Data Innovation Summit Takeover Interview w/ Jarkko Moilanen

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.
Jarkko's LinkedIn: https://www.linkedin.com/in/jarkkomoilanen/
Open Data Product Spec: http://opendataproducts.org/#open-data-product-specification
Jarkko's Data Product Business website: https://www.dataproductbusiness.com/
This episode is part of the Data Innovation Summit Takeover week of Data Mesh Radio.
Data Innovation Summit website: https://datainnovationsummit.com/; use code DATAMESHR20G for 20% off tickets
Free Ticket Raffle for Data Innovation Summit (submissions must be by April 25 at 11:59pm PST): Google Form

Scott interviewed Jarkko Moilanen, a Data Economist and Country CDO ambassador for Finland. Jarkko will be presenting on "Data Monetization and Related Data Value Chain Requires Both Data Products and Services" on May 6th in track M6.

Jarkko approaches measuring the value of data - and then extracting that value - from many different angles: data products, data as a service, data as a product, etc.

Per Jarkko, treating data like a product lets us apply a LOT of the learnings from the API revolution - this time around we can skip many of the sharp edges. APIs are an interface to value creation - how can we treat data products the same way?

We discussed the difference between return and return on investment. A data initiative may have a very high return, but if the investment required to get that return is too high, it's a bad initiative. How do we figure out what quality level we need to solve our challenges? There is no reason to go for five 9s of quality if that doesn't move the needle.

Jarkko coined a new concept on the call: the half-life of data value. For a large percentage of data, Jarkko believes its value starts to fall considerably over a relatively short period of time. How can we extract the value when the data is most valuable - if the half-life is weeks, days, hours, or even less? And how do we set ourselves up to get the most "bang for the buck"? (A simple illustrative decay sketch follows these notes.)

Jarkko is firmly in the camp of intentionality when it comes to data. We can't keep betting on "this data might have value" or collecting data for the sake of collecting it. Cleansing data after the fact is difficult - what was the context at the time? Can you enrich the data further? Etc. - and the cost to do so is typically quite high compared to the value. And you keep incurring costs just to keep data around. If you subscribe to his data half-life theory, the value diminishes quickly, so stop keeping around so much data you aren't using!

Jarkko's data economy model has three layers that he adapted from the API economy. The bottom layer is private, internal-only use. In Jarkko's view, this is typical of organizations that don't have the capabilities to productize their data - those with low data maturity. If they can move toward productizing, it will enable reuse, not just for themselves but potentially for third parties. The middle layer is closed sharing agreements with other organizations, creating data sharing ecosystems. These ecosystems are typically very limited in the number of organizations involved and are often about creating value for a joint purpose, e.g. two suppliers sharing data specifically to meet a customer need. To do this, you need to productize your data enough to make it generally understandable and relatively easy to use. The top layer is the completely public data marketplace layer. Organizations participating at this layer are packaging data or even algorithms for sale.

Jarkko also has four key elements that define a data product - whether a data mesh type data product or not - in his mind:
The technical data flow layer - how the data is processed/created/handled by the underlying infrastructure
The business plan layer - descriptions and plans for the data; really, what is the business objective of the data product
The legal layer - what are the conditions for using the data
The ethical layer - while this is becoming more important in the AI space, we should think about ethical use in all data products

Jarkko talked about trust as a measure of value - if you can't trust data, its value is significantly less. Lack of trust raises the total cost, as consumers have to do extra work to verify and enrich the data, and thus lowers the data's net value. Jarkko has been working on a trust index for data products, which is especially applicable in a data exchange scenario.

For Jarkko, there are three keys to managing data. First, treat every bit or set of data as if you'd share it externally - enrich it, make it trustable, usable, secure, etc. Data has a habit of going external in some way. Second, make your data actually usable in your scenario - what level of data literacy do you have, so you know what bar you have to meet? How can you find that core 80% in a 10/80/10 split that will drive insights with data? Third, have a ready-made toolkit to mock up data products at the business layer with consumers. This is more about process than tooling: have a set of canvases so you can share ideas about new data products and get good feedback. That feedback from users before creating a data product is very useful.

Jarkko summarized his thoughts with: let the business people lead the way; if they aren't yet enabled to lead, educate them so they can leverage the data.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
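The half-life framing above can be made concrete with a back-of-the-envelope decay model. The sketch below is an illustration of the idea only - the episode does not define a formula - and simply assumes value decays exponentially with a configurable half-life.

```python
# Back-of-the-envelope sketch of the "half-life of data value" idea.
# Assumption (mine, not from the episode): value decays exponentially,
# V(t) = V0 * 0.5 ** (t / half_life).

def remaining_value(initial_value: float, age_days: float, half_life_days: float) -> float:
    """Estimated value left in a dataset after age_days."""
    return initial_value * 0.5 ** (age_days / half_life_days)

if __name__ == "__main__":
    # Hypothetical example: clickstream data assumed to lose half its value every 7 days.
    for age in (0, 7, 14, 30, 90):
        pct = remaining_value(100.0, age, half_life_days=7)
        print(f"day {age:>2}: {pct:5.1f}% of initial value")
```

Whatever the exact curve, the practical point Jarkko makes is the same: get data in front of consumers while it still sits high on that curve, and stop paying to store data that has already decayed past usefulness.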
Apr 18, 2022 • 1h 8min

#60 Managing Organizational Structure in a Traditional Company: Can You Have Two Solid Lines - Data Innovation Summit Takeover Interview w/ Daniel Engberg

In this discussion, Daniel Engberg, Head of AI, Data, and Platforms at Scandinavian Airlines, shares insights on reshaping traditional organizational structures for modern agility. He highlights how clear communication and tailored goals can break down silos and foster cross-functional teams. Engberg also addresses the complexities of dual reporting lines and the challenges of governance in data-driven environments. The conversation offers innovative approaches to enhance collaboration and retain talent, especially in the evolving landscape of data governance.
Apr 17, 2022 • 34min

DIS Takeover - Weekly Episode Summaries and Programming Notes - Week of Apr 17, 2022 - Data Mesh Radio

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
Apr 15, 2022 • 1h 15min

#59 Knowledge Graphs as the Engine for Collaboration Across Data - KGC Takeover Interview w/ Philippe Höij and Guest Host Ellie Young

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.
Ellie's LinkedIn: https://www.linkedin.com/in/sellieyoung/
Philippe's LinkedIn: https://www.linkedin.com/in/hoijnet/
Philippe's Twitter: @hoijnet / https://twitter.com/hoijnet
DFRNT website: https://dfrnt.com/
Knowledge Graph Conference website: https://www.knowledgegraph.tech/
Free Ticket Raffle for Knowledge Graph Conference (submissions must be by April 18 at 11:59pm PST): Google Form

In this episode of the Knowledge Graph Conference takeover week, special guest host Ellie Young (Link) interviewed Philippe Höij, Founder at DFRNT.

At the wrap-up, Philippe mentioned that data architects should be able to communicate in ways other than PowerPoint. We need new and better ways to express ourselves and the way things are connected. We will always need metadata around our data, and we need text to express our ambiguity; but we don't have great ways to express things that are slightly ambiguous - not fully formed but also mostly known. A good tool allows you to more easily query your model of the world so you can iterate and increment on it. That is where knowledge graphs can be the most helpful. Ellie responded, "It's not difficult, it's just complicated."

Philippe shared his journey towards knowledge graphs, especially thinking about the AIDITTO project he and a team built out of the "Hack the Crisis Sweden" event in 2020 around COVID-19. He needed a way to prototype, visualize, and collaborate on data and the connections between data at scale. A regular data model does not convey enough information about what the data is and how it relates.

Ellie then shared some insight into the difficulties around collaborating on data across organizations and people in her climate change work at Common Action. When collaborating across organizations that all have different ways of working, you need a common "language" for communicating about data, but you can't easily develop a shared schema. Knowledge graphs provide incremental capabilities for collaboration.

Philippe talked about how, when collaborating across organizations, you still have the needs that master data management (MDM) tries - and often fails - to address, but with zero ability to manage how the other organizations' data flows into the shared data "pool". Philippe was having issues with open-ended knowledge graph approaches like SPARQL or OWL - they needed composable data structures to stay flexible when they can't fully decompose a concept, especially as the concept or their understanding of it evolves.

For Philippe, TerminusDB was a big win because it allowed for composable data structures and much easier querying across the graph. Ellie discussed the origins of TerminusDB being about collaboration across many entities/organizations, so it has a much different approach to accepting data that doesn't necessarily conform to a schema or data model. The "git for data" concept in TerminusDB was also a big win for Philippe, as it made experimentation much easier.

Ellie shared some of the challenges in her work at Common Action around working with many different entities, many of which are small and not that data literate - or "data native" - and how they need to enable collaboration without rigidity because things are so dynamic. Philippe discussed the need to enable people to collaborate in a "messy" environment - the world is changing, and trying to spend all your time and effort categorizing it into a single schema isn't realistic. He believes you need to enable collaboration in a truly distributed environment; the value is driven by micro-level action via autonomy - people making progress in their own domain - which creates global value. Too often, we've tried centralized collaboration and it doesn't work. The collaboration shouldn't be a heavy overhead to driving that value - how can we flip the script to make the collaboration the enabler?

Philippe shared how knowledge graphs can be used to manage compliance with security standards. You can map out much more easily who has responsibility for what and even identify gaps in your compliance adherence process. Being able to query that information easily makes it far easier to make sure you are identifying and mitigating risk. (A toy example of this kind of query follows these notes.) Ellie talked about breaking things down into paths for what is happening, what is not happening, and then what needs to happen to actually hit future goals. She mentioned it's a new way to interact with change and the unknown.

For Philippe, we need to start somewhere in breaking down the complexity and visualize what's going on; what are the patterns that we can see? Let's model and share them in a low-enough-complexity way. You can start to see the concepts and connect them in a way that we as humans can understand. It's almost like building a hivemind - each brain has its own context, and when you share that context into the greater whole, it's impossible to know what incremental information or knowledge will be generated, but it's almost an inevitability that it will happen. Many more patterns will emerge - patterns of patterns. But we need to be able to share that context in some way to have those patterns emerge.

Ellie shared thoughts about what complexity is - something so big and/or gnarly that it is, at best, difficult for a human to understand it all. Things are so interconnected, you can't just adjust one piece or aspect. BUT it's okay to not understand the absolute complete picture; we can move forward with confidence and/or identify the most likely best challenges to address. Philippe believes it is often sufficient to understand directionality to move forward and make progress. Knowledge graphs help us deal with complexity and capture aspects of it in a way that is more understandable. How can we unravel the giant knotted ball of yarn one bit at a time instead of mapping out the entire unraveling process ahead of time? Ellie talked about knowledge graphs creating better information flow about what impact changes - or prospective changes - to the data we are sharing will have on downstream consumers.

Philippe mentioned that knowledge graphs aren't great for every use case - look for the places where they really make sense and look at the specifics. While you can use graphs to manage the interconnectivity between data, not all relational structures benefit.

Then Philippe discussed what he is working on and when he plans to release it - he's focused on making data modeling much easier without it being overly technical. He wants to guide and enable the changemakers. It will be about enabling the collaborative aspects of knowledge graphs so people can have much better conversations about the data - and then people can express changes in data instead of in a PowerPoint.

Ellie mentioned what Veronika discussed in her episode: in a knowledge graph you can model your data in the same way you are used to, but the model is itself data immediately, rather than something you later have to transform into data. We have to talk to each other to discuss our data conventions and develop a new relationship between business users and data.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
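Philippe's compliance example - and his broader point about being able to query your model of the world rather than redrawing it in slides - can be illustrated with even a toy graph. The sketch below is my own minimal illustration with hypothetical facts; it is not TerminusDB, SPARQL, or any product mentioned in the episode, just subject-predicate-object triples with a wildcard pattern match.

```python
# Minimal in-memory "knowledge graph": facts stored as (subject, predicate,
# object) triples plus a tiny wildcard pattern query. Purely illustrative -
# real systems (TerminusDB, SPARQL stores, etc.) offer far richer modeling.
from typing import Iterator, Optional, Tuple

Triple = Tuple[str, str, str]

class TinyGraph:
    def __init__(self) -> None:
        self.triples: set[Triple] = set()

    def add(self, s: str, p: str, o: str) -> None:
        self.triples.add((s, p, o))

    def match(self, s: Optional[str] = None, p: Optional[str] = None,
              o: Optional[str] = None) -> Iterator[Triple]:
        """Yield triples matching the pattern; None acts as a wildcard."""
        for triple in self.triples:
            if all(want is None or want == got for want, got in zip((s, p, o), triple)):
                yield triple

g = TinyGraph()
# Hypothetical compliance facts, in the spirit of the security-standards example.
g.add("control:access-review", "requiredBy", "standard:iso27001")
g.add("control:backup-testing", "requiredBy", "standard:iso27001")
g.add("team:platform", "responsibleFor", "control:access-review")

# Who owns each control the standard requires? Controls with no owner are gaps.
for control, _, _ in g.match(p="requiredBy", o="standard:iso27001"):
    owners = [s for s, _, _ in g.match(p="responsibleFor", o=control)]
    print(control, "->", owners or "NO OWNER (gap)")
```

Even at this toy scale, the "backup-testing" control surfaces as an ownership gap with one query - the kind of answer that otherwise lives only in someone's slide deck.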
Apr 14, 2022 • 15min

#58 No, You Don't Sell a Data Mesh: Vendor BS in Data Mesh - Mesh Musings 10

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Scott gets grumpy and calls out some misbehaving vendors. And adds some fun new music to the episodes.
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
Apr 13, 2022 • 1h 10min

#57 Using a Knowledge Graph for a Data Marketplace and Data Mesh for Retail - KGC Takeover Interview w/ Olivier Wulveryck and Guest Host Ellie Young

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.
Ellie's LinkedIn: https://www.linkedin.com/in/sellieyoung/
Olivier's LinkedIn: https://www.linkedin.com/in/olivierwulveryck/
Knowledge Graph Conference website: https://www.knowledgegraph.tech/
Free Ticket Raffle for Knowledge Graph Conference (submissions must be by April 18 at 11:59pm PST): Google Form

In this episode of the Knowledge Graph Conference takeover week, special guest host Ellie Young (Link), founder of Common Action, interviewed Olivier Wulveryck, Senior Consultant and Manager at OCTO Technology.

In the first two thirds of the interview, Olivier and Ellie chatted about concepts specifically around data mesh, then linked in knowledge graph concepts in the last third.

For Olivier, a knowledge graph is the map of the data that is available - each data product, or node in a data mesh, is a representation of knowledge within the organization. The knowledge graph is the abstraction of that knowledge across the data mesh - a logical representation on top of the data mesh nodes to help people make sense of the mesh. (A small illustrative sketch of this "map" role follows these notes.)

Currently, Olivier is working with a client that shares their data in a data marketplace. They are implementing a knowledge graph on the marketplace data but not on their internal data; if they see value from applying a knowledge graph externally, they may apply it to their internal usage as well.

Olivier shared his view that it's easier to start with a data mesh than a knowledge graph - any first steps with a data mesh will bring you value. It is not the same with knowledge graphs - you have to do more work before you get to value.

Olivier previously worked on the operational side of software engineering. He realized they had lots of data sitting in databases, but the data was just a consequence - it was state data, with no temporal dimension. He wanted to apply machine learning to the data but was suffering from low quality data, just like the data team. The producing team never really thought about the data being shared with others. For Olivier, the way to fix this is to put data back at the center of the domain and flip the script - make the operational datastore the consequence of changes in the data.

Olivier believes there is a need to think about the semantics of the data as it will be used by the rest of the organization - data can no longer be just an internal asset of the domain. If we believe the data needs to actually be used, we need that data to actually provide value. Make the data usable and useful.

To get specific, Olivier shared a use case of a clothing retailer. They might be collecting people's body type and measurements to help them choose clothes that fit better. But the company could also use that data to make better-fitted clothes for a broader range of their customers - they might gain insights that change the way they tailor or design their clothing.

Ellie asked whether we can standardize how we capture and share data. Olivier is not sure we can harmonize how we capture data, so instead we need to harmonize on aggregation and integration. Olivier also talked about the challenges of finding the equilibrium between data consumer and data producer needs/wants.

Per Olivier, adapt is better than adopt. There is no by-the-book way to implement data mesh, because that would never be applicable to any real organization. Olivier also mentioned the need for a framework for how teams will communicate and work together, e.g. Team Topologies.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
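Olivier's "knowledge graph as a map over the mesh" framing can be sketched as a small index that links data products to the business concepts they expose, so consumers can ask which products carry a given concept. The sketch below is my own simplification with hypothetical product names (borrowing the clothing retailer example above), not a description of his client's actual marketplace.

```python
# Hypothetical sketch: a knowledge-graph-style index over data mesh products.
# Each data product declares the business concepts it exposes; inverting that
# gives the discovery "map" Olivier describes on top of the mesh nodes.
from collections import defaultdict

# Hypothetical data products for a clothing retailer.
product_concepts = {
    "customer-measurements": {"Customer", "BodyMeasurement"},
    "order-history": {"Customer", "Order", "Garment"},
    "garment-designs": {"Garment", "SizePattern"},
}

# Invert into a concept -> products index.
concept_index: defaultdict[str, set[str]] = defaultdict(set)
for product, concepts in product_concepts.items():
    for concept in concepts:
        concept_index[concept].add(product)

# Which products would a "better-fitting clothes" use case start from?
fit_products = concept_index["BodyMeasurement"] & concept_index["Customer"]
print(sorted(fit_products))              # ['customer-measurements']
print(sorted(concept_index["Garment"]))  # products that share the Garment concept
```

A real knowledge graph would also capture relationships between the concepts themselves; the point here is only the "logical layer on top of the nodes" idea.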
Apr 12, 2022 • 1h 18min

#56 Insights from Deploying Data Mesh and Knowledge Graphs at Scale - KGC Takeover Interview w/ Veronika Haderlein-Høgberg and Guest Host Ellie Young

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Transcript for this episode (link) provided by Starburst. See their Data Mesh Summit recordings here and their great data mesh resource center here.
Ellie's LinkedIn: https://www.linkedin.com/in/sellieyoung/
Veronika's LinkedIn: https://www.linkedin.com/in/veronikahaderlein/
Knowledge Graph Conference website: https://www.knowledgegraph.tech/
Free Ticket Raffle for Knowledge Graph Conference (submissions must be by April 18 at 11:59pm PST): Google Form

In this episode of the Knowledge Graph Conference takeover week, special guest host Ellie Young (Link) interviewed Veronika Haderlein-Høgberg, PhD. Veronika was employed at Fraunhofer-Gesellschaft at the time of recording but was representing only her own views and experiences. She was invited for her special mix of both data mesh and knowledge graph know-how.

At Fraunhofer-Gesellschaft, her employer until recently, she and her team were implementing a knowledge graph to help with decision support for the organization. Previously, Veronika worked on a data mesh-like implementation in the Norwegian public sector at the Norwegian tax authority, before the data mesh concept was congealed into a singular form by Zhamak.

Veronika and Ellie wrapped the conversation with a few key insights: to share data, groups need to agree on common standards to represent it, and they also need to be able to share information with each other about that data into the future. To develop these initial data standards, and to build the relationships to coordinate around that data long term, different departments in the enterprise have to converse with each other. Building conversations across departments also requires building trust, and curiosity is a crucial ingredient for that - at the individual level, but also at the domain and organizational levels. If people don't feel comfortable asking questions, they can't understand each other's perspectives well enough to contribute to that shared context. What does this look like in practice? Different departments discussing the different definitions they have for the same terms, finding out what data they need from each other, and therefore what data they must collect and what protocols they must develop. And computer scientists discussing data with business people - understanding what the business requirements are, and conveying what data systems need in order to provide organized, quality data.

Veronika's recent organization, Fraunhofer, is using a knowledge graph because they need to make their investment decisions much more data driven. They need to do analysis across many different sources - they have some slight control over internal data sources but essentially none over external sources. They were repeatedly doing harmonization across these sources, often the same harmonizations. Veronika believes they shouldn't have to do that harmonization manually, so they needed a translation layer - the knowledge graph.

To build out their knowledge graph, they need business experts to work with the ontology experts - however, it is a struggle to get time and attention from the business experts, who also need to learn why ontologies matter and how to do them. This is where Ellie mentioned that centralizing the integration might cost a lot of effort up front, but it's necessary if you only want to do the harmonization work once.

For Veronika, thinking in the data-as-a-product mindset and having data owners is crucial to getting a knowledge graph implementation right. She said a knowledge graph is a different way of expressing and sharing your knowledge - just in a way that computers can also understand, not just humans. That framing helps people understand why knowledge graphs are useful and important.

Data mesh implementation background and insights:

At the tax authority, Veronika's team of information architects was working to translate tax law into data models. They discovered the need for a common methodology to create the models - they chose UML - and then other authorities needed to use the data, so the team began working to create data standards for efficient data sharing. The Norwegian data mesh implementation has even extended into a public/private partnership. Veronika also mentioned that Denmark now has standards for how bills are written so the legal aspects can be translated into data.

Per Veronika, their data mesh journey started from pains - they really struggled to consume data from other entities as well as internally. They started by making it easy to consume data from those other entities without creating a large burden on the producers. They learned about good - and bad - practices for sharing data and which tools were best for data modeling. All the decisions were made in small partnerships.

For Veronika, a big key to success was taking on small partnerships and creating an environment where asking questions was highly encouraged and even making mistakes was okay - this has been a recurring theme in many DMR episodes. Ellie made the point that community and communication are key to making something like data mesh work. Veronika followed up that culture and "fuzzy factors" are more important than tooling and even methodologies.

Veronika discussed the silent fear that change brings and how that is such an impediment to getting things done. There is also a fear of looking silly when asking questions. So we need to work with people to get to a place far more comfortable with change, where curiosity is rewarded instead of shamed or looked down upon. Asking questions gives people the ability to grow as they learn new things and provides a conduit for far more information sharing - documentation can't be the end-all, be-all.

Ellie mentioned the need to stop focusing so much on the specific data in data processes. There needs to be a much bigger focus on people and the data creation process. Think about creating data for others to ingest, with intentionality.

Veronika saw the organizational impact of implementing a data mesh very strongly - it led to a greater sense of doing meaningful work, which led to far lower churn across the organization. Those cross-functional implementations had a strong impact, and the people working on them were far more engaged. Veronika believes the fear of silos in data mesh makes sense, but people naturally want to prevent those silos, so allow people to focus on consumer needs.

Veronika believes knowledge graphs are key to preventing silos in data mesh - and somewhat vice versa: it is very difficult to maintain a knowledge graph across an organization that doesn't think of data as a product.

The two discussed the importance of data reuse, even in the same use case, to prevent manual work. It's like a golden-source concept - you don't have to figure out which data you can trust. That manual repetition of harmonization is what kills productivity and people's desire to work with outside data.

Veronika made an interesting point that any time you model data, you are building something like a mini knowledge graph - each data model contains a tiny ontology of the domain.

At the Norwegian Tax Authority, the working groups around the initial data mesh implementation started very informally - people who knew each other beforehand. In a larger org, Veronika believes the key to success is talking: sharing what you are working on and how it might work. Specific goals are very important - you need concrete deliverables, which make it easier to get funding.

In the Norwegian government, they worked closely with lawmakers to define business concepts and initially thought that everything had to harmonize, but found it was more important for each domain to define their business concepts the way they understand them and make that transparent - it isn't possible to make all data harmonizable; the world is full of variation. You need to think about your business concepts and then not force them on others - focus on the translation. Otherwise, you will never get to publishing anything. They didn't go quite as far as the Danish government though, which has created a role in the law-making process for someone who understands data well enough to word the law so it can be translated into data or a data model.

Veronika believes you shouldn't use terms to identify concepts - use URIs. A term is just a label, not the concept. (A small sketch of this idea follows these notes.) She also believes it is often not necessary to make things computer readable, and that you need to focus on creating a living organization.

To finish up, Veronika reiterated: don't be afraid, be curious - curiosity is a prerequisite for success with data mesh or knowledge graphs.

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
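Veronika's two points - identify concepts with URIs rather than labels, and focus on translation rather than forcing one harmonized vocabulary - can be illustrated with a small mapping layer. The sketch below is my own illustration with hypothetical department vocabularies and example.org URIs; it is not the tax authority's or Fraunhofer's actual model.

```python
# Hypothetical translation layer: each department keeps its own local terms,
# but every term maps to a shared concept URI. The harmonization work is done
# once, in the mapping, instead of repeatedly in every consuming analysis.

TERM_TO_URI = {
    ("finance", "client"): "https://example.org/concepts/Customer",
    ("support", "caller"): "https://example.org/concepts/Customer",
    ("finance", "invoice"): "https://example.org/concepts/Invoice",
}

def to_concept_uri(department: str, term: str) -> str:
    """Translate a department-local term into the shared concept URI."""
    try:
        return TERM_TO_URI[(department, term.lower())]
    except KeyError:
        raise KeyError(
            f"No mapping for {term!r} in {department!r}; "
            "add it to the shared translation layer"
        ) from None

# Two departments use different labels, but both resolve to the same concept.
assert to_concept_uri("finance", "Client") == to_concept_uri("support", "caller")
```

The URI stays stable even as the local labels, or the departments' understanding of them, drift - which is exactly why a label alone is a poor identifier for a concept.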
Apr 11, 2022 • 20min

#55 Just What is a Knowledge Graph - Short Answers to a Tough Question

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
Knowledge Graph Conference website: https://www.knowledgegraph.tech/
Free Ticket Raffle for Knowledge Graph Conference (submissions must be by April 18 at 11:59pm PST): Google Form

Thank you to our contributors! You can find additional introductory resources below.

Contributors and their contact info:
Karen Passmore, CEO and Founder of Predictive UX - LinkedIn: https://www.linkedin.com/in/karenpassmore/ - DMR Episode 30
Xhensila Poda, Machine Learning Engineer at CARDO AI - LinkedIn: https://www.linkedin.com/in/xhensilapoda/
Jens Scheidtmann, Lead Architect at Bayer - LinkedIn: https://www.linkedin.com/in/jens-scheidtmann/
Juan Sequeda, Principal Scientist at Data.world - LinkedIn: https://www.linkedin.com/in/juansequeda/ - DMR Episode 14
Steve Stesney, Senior Product Lead and Data Practice Lead at Predictive UX - LinkedIn: https://www.linkedin.com/in/stephenstesney/ - DMR Episode 30
Tim Tischler, Principal Engineer at Wayfair - LinkedIn: https://www.linkedin.com/in/timtischler/ - DMR Episode 43

Further introductory resources (provided by Ellie Young and Juan Sequeda):
"What is a Knowledge Graph" video by Martin Keen: https://www.youtube.com/watch?v=y7sXDpffzQQ
"Knowledge graphs: Introduction, history, and perspectives" (paper): https://onlinelibrary.wiley.com/doi/10.1002/aaai.12033
Knowledge Graphs book (free): https://kgbook.org/
"Knowledge Graphs" paper by Juan Sequeda and Claudio Gutierrez: https://cacm.acm.org/magazines/2021/3/250711-knowledge-graphs/fulltext
Juan's book "Designing and Building Enterprise Knowledge Graphs", 25% discount using code DATAWORLD

Data Mesh Radio is hosted by Scott Hirleman. If you want to connect with Scott, reach out to him on LinkedIn: https://www.linkedin.com/in/scotthirleman/
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
Apr 10, 2022 • 34min

KGC Takeover - Weekly Episode Summaries and Programming Notes - Week of Apr 10, 2022 - Data Mesh Radio

Sign up for Data Mesh Understanding's free roundtable and introduction programs here: https://landing.datameshunderstanding.com/
Please Rate and Review us on your podcast app of choice!
If you want to be a guest or give feedback (suggestions for topics, comments, etc.), please see here.
Episode list and links to all available episode transcripts here.
Provided as a free resource by Data Mesh Understanding / Scott Hirleman. Get in touch with Scott on LinkedIn if you want to chat data mesh.
If you want to learn more and/or join the Data Mesh Learning Community, see here: https://datameshlearning.com/community/
All music used this episode was found on PixaBay and was created by (including slight edits by Scott Hirleman): Lesfm, MondayHopes, SergeQuadrado, ItsWatR, Lexin_Music, and/or nevesf.
