
In Their Own Words
Interviews with members of The Deming Institute community, including industry leaders, practitioners, educators, Deming family members and others who share their stories of transformation and success through the innovative management and quality theories of Dr. W. Edwards Deming.
Latest episodes

Jul 1, 2024 • 35min
Quality, Back to the Start! Misunderstanding Quality (Part 1)
Bill Bellows, with 31 years of experience in Dr. Deming's philosophy, shares his quality journey. Topics include challenges in understanding quality management teachings, first encounters with quality circles, Taguchi method for gear wear problems, and Deming's philosophies on quality and systems thinking.

Jun 17, 2024 • 38min
Goal Setting is Often an Act of Desperation: Part 6
In the final episode of the goal setting in classrooms series, John Dues and Andrew Stotz discuss the last three of the 10 Key Lessons for implementing Deming in schools. They finish up with the example of Jessica's 4th-grade science class. TRANSCRIPT 0:00:02.4 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode six about goal setting through a Deming lens. John, take it away. 0:00:26.4 John Dues: Hey, Andrew, it's good to be back. Yeah, for the past handful of episodes or so, we've been talking about organizational goal setting. We covered these four conditions of healthy goal setting and then got into these 10 key lessons for data analysis. And then we've been looking at those 10 key lessons applied to an improvement project. And we've been talking about a project that was completed by Jessica Cutler and she did a Continual Improvement Fellowship with us here at our schools. And if you remember, Jessica was attempting to improve the joy in learning of her students in her fourth grade science class. So last time we looked at lessons five through seven. Today we're gonna look at those final three lessons, eight, nine and ten applied to her project. 0:01:15.7 AS: It's exciting. 0:01:17.1 JD: Yeah. So we'll jump in here. We'll kind of do a description, a refresher of each lesson. And we'll kind of talk about how it was applied to her specific project, and we'll look at some of her data to kind of bring that alive for those folks that have video. Let's jump in with lesson number eight. So we've talked about this before, but lesson number eight was: more timely data is better for improvement purposes. So we've talked about this a lot. We've talked about something like state testing data. 
We've said, it can be useful, but it's not super useful for improvement purposes, because we don't get it until the year ends. And students in our case, have already gone on summer vacation by the time that data comes in. And you know that the analogous data probably happens in lots of different sectors where you get data that lags, to the point that it's not really that useful for improvement purposes. 0:02:15.8 JD: So when we're trying to improve something, more frequent data is helpful because then we can sort of see if an intervention that we're trying is having an effect, the intended effect. We can learn that more quickly if we have more frequent data. And so it's, there's not a hard and fast rule, I don't think for how frequently you should be gathering data. It just sort of needs to be in sync with the improvement context. I think that's the important thing. Whether it's daily or a couple times a day or weekly, or monthly, quarterly, whatever, it's gotta be in sync with whatever you're trying to improve. 0:02:50.5 AS: You made me think about a documentary I saw about, how they do brain surgery and how the patient can't be sedated because they're asking the patient questions about, do you feel this and they're testing whether they're getting... They're trying to, let's say, get rid of a piece of a cancerous growth, and they wanna make sure that they're not getting into an area that's gonna damage their brain. And so, the feedback mechanism that they're getting through their tools and the feedback from the patient, it's horrifying to think of the whole thing. 0:03:27.7 JD: Yeah. 0:03:28.3 AS: It's a perfect example of why more timely data is useful for improvement purposes 'cause imagine if you didn't have that information, you knock the patient out, you get the cancerous growth, but who knows what you get in addition to that. 0:03:43.7 JD: Yeah, that's really interesting. I think that's certainly an extreme example, [laughter], but I think it's relevant. 
No matter what our context, that data allows us to understand what's going on, variation, trends, whether our system is stable, unstable, how we should go about improving. So it's not dissimilar from the doctors in that example. 0:04:06.8 AS: And it's indisputable I think, I would argue. But yet many people may not, they may be operating with data that's not timely. And so this is a reminder that we would pretty much always want that timely data. So that's lesson eight. Wow. 0:04:22.6 JD: Lesson eight. Yeah. And let's see how we can, I'll put a visualization on the screen so you can see what Jessica's data look like. All right. So now you can see. We've looked at these charts before. This is Jessica's process behavior chart for joy in science. So just to reorient, you have the joy percentage that students are feeling after a lesson on the x-axis, sorry, on the y-axis. On the x-axis, you have the school dates where they've collected this survey information from students in Jessica's class. 0:04:57.0 AS: Can you put that in Slide Show view? 0:05:00.4 JD: Yeah. I can do that. Yeah. 0:05:02.7 AS: Just it'll make it bigger, so for the... 0:05:06.5 JD: There you go. 0:05:07.8 AS: For the listeners out there, we're looking at a chart of daily, well, let's say it looks like daily data. There's probably weekends that are not in there because class is not on weekends, but it's the ups and downs of a chart that's ranging between a pretty, a relatively narrow range, and these are the scores that are coming from Jessica's surveying of the students each day, I believe. Correct? 0:05:34.2 JD: Yeah. So each day where Jessica is giving a survey to assess the joy in science that students are feeling, then she's averaging all those students together. And then the plot, the dot is the average of all the students sort of assessment of how much joy they felt in a particular science lesson. 0:05:54.7 AS: And that's the average. 
So for the listeners out there John's got an average line down the middle of these various data points, and then he's also got a red line above and a red line below, above the highest point and slightly below the lowest point. Maybe you can explain that a little bit more. 0:06:15.4 JD: Yeah. So with Jessica, you remember originally she started plotting on a line chart or a run chart when we just had a few data points just to kind of get a sense of how things are moving so she could talk about it with her class. And over time what's happened is she's now got, at this point in the project, which she started in January, now this is sort of mid-March. And so she's collected two to three data points a week. So she doesn't survey the kids every day just for time sake, but she's getting two, three data points a week. And so by March, she started just a couple months ago, she's got 28 data points. So that sort of goes back to this idea of more timely data is better for improvement. 0:07:00.9 JD: And a lot of times, let's say a school district or a school does actually survey their students about how, what they think of their classes. That might happen at best once a semester or maybe once a year. And so at the end of the year you have one or two data points. So it's really hard to tell sort of what's actually going on. Compared to this, Jessica's got these 28 data points in just about two months or so of school. So she's got 28 data points to work with. And so what she and her students are doing with this data then, one, they can see how it's moving up and down. So we have, the blue dots are all the plotted points, like you said, the green line is the average running sort of through the middle of the data, and then those red lines are our process limits, the upper and lower natural process limits that sort of tell us the bounds of the system. 0:07:50.4 JD: And that's based on the difference in each successive data point. 
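For readers who want to try this with their own data: the limits John describes follow Donald Wheeler's XmR (process behavior) chart convention, where the natural process limits sit at the average plus or minus 2.66 times the average moving range, that is, the mean of the absolute differences between successive points. A minimal Python sketch, using made-up joy percentages rather than Jessica's actual data:

```python
# Natural process limits for an XmR (process behavior) chart.
# The 2.66 scaling factor is Wheeler's standard constant for
# individuals charts; the joy percentages below are illustrative.

def process_limits(values):
    """Return (average, lower limit, upper limit) for a series."""
    mean = sum(values) / len(values)
    # Moving ranges: absolute difference between successive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return mean, mean - 2.66 * avg_mr, mean + 2.66 * avg_mr

joy = [74, 78, 71, 76, 80, 73, 77, 75, 79, 72]  # daily class averages (%)
avg, lower, upper = process_limits(joy)
print(f"average {avg:.1f}%, limits [{lower:.1f}%, {upper:.1f}%]")
```

Plotted over time with the average line and these two limit lines, this gives the kind of chart the class is reading: points bouncing between the limits are routine common-cause variation.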
But the most important thing is that as Jessica and her students are looking at this, initially, they're really just studying it and trying to sort of see how things are going from survey to survey. So one of the things that Deming talked about frequently is not tampering with data, which would be if you sort of, you overreact to a single data point. So let's say, a couple of days in, it dips down from where it started and you say, oh my gosh, we gotta change things. And so that's what Deming is talking about. Not tampering, not overreacting to any single data point. Instead look at this whole picture that you get from these 28 data points and then talk about... 0:08:41.5 JD: In Jessica's case she's talking about with her students, what can we learn from this data? What does the variation from point to point look like? If we keep using the system, the fourth grade science system, if we leave it as is, then we'll probably just keep getting data pretty similar to this over time, unless something more substantial changes either in the negative or the positive. So right now they... 0:09:10.1 AS: And I think for the listeners, it's, you can see that there's really no strong pattern that I can see from this. It's just, there's some, sometimes that there's, seems like there's little trends and stuff like that. But I would say that the level of joy in the science classroom is pretty stable. 0:09:32.1 JD: Pretty stable. Yeah. Pretty high. It's bouncing around maybe a 76% average across those two and a half months or so. And so, they, you kind of consider this like the baseline. They've got a good solid baseline understanding of what joy looks like in this fourth grade science classroom. Did that stop sharing on your end? 0:10:00.2 AS: Yep. 0:10:00.2 JD: Okay, great. So that's lesson eight. So clearly she's gathered a lot of data in a pretty short amount of time. It's timely, it's useful, it's usable, it can be studied by her and her students. 
So we'll switch it to lesson nine now. So now they've got a good amount of data. They got 28 data points. That's plenty of data to work with. So lesson nine is now we wanna clearly label the start date for an intervention directly in her chart. And remember from earlier episodes, not only are we collecting this data, we're actually putting this up on a screen on a smart board in the classroom, and Jessica and her students are studying this data together. They're actually looking at this, this exact chart and she's explaining sort of kind of like we just did to the listeners. She's explaining what the chart means. 0:10:54.2 JD: And so over time, like once a week she's putting this up on the smart board and now kids are getting used to, how do you read this data? What does this mean? What are all these dots? What do these numbers mean? What do these red lines mean? That type of thing. And so now that they've got enough data, now we can start talking about interventions. That's really what lesson nine is about. And the point here is that you want to clearly, explicitly, with literally a dashed line in the chart, mark the day that you're gonna try something new. So you insert this dashed vertical line, we'll take a look at it in a second, on the date the intervention started. And then we're also gonna probably label it something simple so we can remember what intervention we tried at that point in time. 0:11:42.7 JD: So what this then allows the team to do is then to very easily see the data that happened before the intervention and the data that happened after the implementation of this intervention or this change idea. And then once we've started this change and we start plotting points after the change has gone into effect, then we can start seeing or start looking for those patterns in the data that we've talked about, those different rules, those three rules that we've talked about across these episodes. 
And just to refresh, rule one would be if we see a single data point outside of either of the limits, rule two is if we see eight consecutive points on either side of that green average line, and rule three is if we see three out of four dots in a row that are closer to one of the limits than they are to that central line. 0:12:38.3 JD: So that again, those patterns tell us that something significant, mathematically improbable has happened. It's a big enough magnitude in change that you wouldn't have expected it otherwise. And when we see that pattern, we can be reasonably assured that that intervention that we've tried has worked. 0:12:56.0 AS: And let me ask you about the intervention for just a second because I could imagine that if this project was going on, first question is, does Jessica's students are, obviously know that this experiment is going on? 0:13:08.3 JD: Yes. 0:13:09.8 AS: Because they're filling out a survey. And my first question is, do they know that there's an intervention happening? I would expect that it would be yes, because they're gonna feel or see that intervention. Correct? 0:13:25.1 JD: Sure. Yep. 0:13:25.2 AS: That's my first point that I want to think about. And the second point is, let's imagine now that everybody in the classroom has been seeing this chart and they're, everybody's excited and they got a lot of ideas about how they could improve. Jessica probably has a lot of ideas. So the temptation is to say, let's change these three things and see what happens. 0:13:46.5 JD: Yeah. 0:13:47.1 AS: Is it important that we only do one thing at a time or that one intervention at a time or not? So maybe those are two questions I have in my mind. 0:13:58.6 JD: Yeah, so to the first question, are you, you're saying there there might be some type of participant or... 0:14:02.3 AS: Bias. 0:14:03.3 JD: Observer effect like that they want this to happen. That's certainly possible. 
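The three patterns John refreshes here are Wheeler-style detection rules, and they are mechanical enough to express in code. A hedged sketch, with function names of my own invention rather than from any particular library:

```python
# The three signal rules described above, for a process behavior
# chart with central line `centre` and limits `lower`/`upper`.

def rule_one(points, lower, upper):
    """Rule 1: any single point outside the natural process limits."""
    return any(p < lower or p > upper for p in points)

def rule_two(points, centre, run=8):
    """Rule 2: eight or more consecutive points on one side of the
    central line (a point exactly on the line breaks the run)."""
    streak, last_side = 0, 0
    for p in points:
        side = 1 if p > centre else -1 if p < centre else 0
        streak = streak + 1 if side == last_side and side != 0 else 1
        last_side = side
        if side != 0 and streak >= run:
            return True
    return False

def rule_three(points, centre, lower, upper):
    """Rule 3: three out of four successive points closer to one of
    the limits than to the central line."""
    def near_limit(p):
        return min(abs(p - lower), abs(p - upper)) < abs(p - centre)
    return any(sum(near_limit(p) for p in points[i:i + 4]) >= 3
               for i in range(len(points) - 3))
```

If any of these fires, something mathematically improbable has happened, a likely special cause rather than routine bouncing around the average.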
But speaking to the second question, what intervention do you go with? Do you go with one or you go with multiple? If you remember a couple of episodes ago we talked about, and we actually looked at a fishbone diagram that Jessica and her students that they created and they said, okay, what causes us to have low joy in class? And then they sort of mapped those, they categorized them, and there were different things like technology not working. If you remember, one was like distractions, like other teachers walk into the room during the lesson. And one of them was others like classmates making a lot of noise, making noises during class and distracting me. And so they mapped out different causes. I think they probably came up with like 12 or 15 different causes as possibilities. 0:14:58.7 JD: And they actually voted as a class. Which of these, if we worked on one of these, which would have the biggest impact? So not every kid voted for it, but the majority or the item that the most kids thought would have the biggest impact was if we could somehow stop all the noises basically. So they came up with that as a class, but not, it wasn't everybody's idea. But I think we've also talked about sort of the lessons from David Langford where once kids see that you're gonna actually take this serious, take their ideas serious and start acting on them, they take the project pretty seriously too. So maybe not a perfect answer, but that's sort of what we... 0:15:38.0 AS: I was thinking that, ultimately you could get short-term blips when you do an intervention and then it stabilizes possibly. That's one possibility. And the second thing I thought is, well, I mean ultimately the objective, whether that's an output from a factory, and keeping, improving that output or whether that's the output related to joy in the classroom as an example, you want it to go up and stay up and you want the students to see it and say, wow, look, it's happening. So, yeah. 
0:16:11.7 JD: And there's different ways you can handle this. So this joy thing could go up to a certain point. They're like, I don't know if we can get any more joy, like, it's pretty high. And what you could do at that point is say, okay, I'm gonna assign a student to just sort of, every once in a while, we'll keep doing these surveys and we will sort of keep plotting the data, but we're not gonna talk about a lot. I'm just gonna assign this as a student's job to plot the new data points. And we'll kind of, we'll kind of measure it, but we won't keep up with the intervention 'cause we got it to a point that we're pretty happy with. And now as a class we may wanna switch, switch our attention to something else. 0:16:45.2 JD: So we started getting into the winter months and attendance has dipped. Maybe we've been charting that and say, Hey guys, we gotta, gotta kinda work on this. This is gone below sort of a level that's really good for learning. So let's think about as a group how we could come up with some ideas to raise that. So maybe you turn your attention to something else, 'cause you can't pay attention to everything at once. 0:17:07.2 AS: Yeah, and I think I could use an example in my Valuation Master Class Boot Camp where students were asking for more personal feedback and I realized I couldn't really scale this class if I had to get stuck into hundreds of grading basically. And that's when I came up with the concept of feedback Friday, where one student from each team would present and then I would give feedback, I would give a critique and they would be intense and all students would be watching, it would be recorded, and all of a sudden all the issues related to wanting this personal feedback went away. And therefore, once I instituted it on a regular basis, I went on to the next issue and I made sure that I didn't lose the progress that I had made and continue to make feedback Friday better and better. 0:17:56.2 JD: Yeah. Yeah. That's great. 
That's great. I'll share my screen so you can kinda see what this looked like in Jessica's class now, what the chart looks like now. So now you see that same chart, that same process behavior chart, exact same one we were just looking at except now you can see this, this dashed vertical line that marks the spot where the intervention was started that we just talked about. And what the kids are actually doing, and Jessica are running a PDSA cycle, a Plan-Do-Study-Act cycle. That's the experimental cycle in her class. And what they're running that PDSA on is, again, how can we put something in place to reduce the distracting noises. And so what the students actually said is if we get a deduction for making noises, then there will be less noises. And so in the school's sort of management system, a deduction is sort of like a demerit. 0:19:00.0 JD: If you maybe went to a Catholic school or something like that, or some public schools had demerits as well, but basically it's like a minor infraction basically that goes home or that gets communicated to parents at the end of the week. But the kids came up with this so their basic premise is, their plan, their prediction is if there are less noises, we'll be able to enjoy science class. And if we give deductions for these noises, then there'll be less noises. So some people may push back, well, I don't think you should give deductions or something like that, but which, fine, you could have that opinion. But I think the powerful point here is this is, the students created this, it was their idea. And so they're testing that idea to see if it actually has impact. 0:19:44.8 JD: And they're learning to do that test in this scientific thinking way by using the Plan-Do-Study-Act cycle, and seeing if it actually has an impact on their data. So at the point where they draw this dashed line, let's call that March 19th, we can see a couple of additional data points have been gathered. 
So you can see the data went up from 3/18 to 3/21. So from March 18th to March 21st, rose from about, let's call it 73% or so, up to about 76% on March 21st. And then that next day it rose another percent or two and let's call that 78%. 0:20:28.1 JD: And so the trap here is you could say, okay, we did this intervention and it made things better. But the key point is the data did go up, but we haven't gathered enough additional data to see one of those patterns that we talked about that would say, oh, this actually has had a significant change. Because before the dashed line, you can see data points that are as high or even higher than some of these ones that we see after the PDSA is started. So it's too early to say one way or another if this intervention is having an impact. So we're not gonna overreact. You could see a place where you're so excited that it did go up a couple of days from where it was on March 18th before you started this experiment, but that's a trap. Because it's still just common cause data, still just bouncing around that average, it's still within the bounds of the red process limits that define the science system. 0:21:34.2 AS: I have an experiment going on in my latest Valuation Master Class Boot Camp, but in that case, it's a 6-week period that I'm testing, and then I see the outcome at the end of the six weeks to test whether my hypothesis was right or not. Whereas here it's real time trying to understand what's happening. So yes, you can be tempted when it's real time to try to jump to conclusion, but when you said, well, okay, I can't really get the answer to this conclusion until I've run the test in a fixed time period, then it's you don't have as much of that temptation to draw a conclusion. 0:22:14.1 JD: Yeah. And if I actually was... I should have actually taken this a step farther. I marked it with this Plan-Do-Study-Act cycle. 
What I should have done too is write "noises" or something like that, deduction for noises, some small annotation, so it'd be clear what this PDSA cycle is. 0:22:32.1 AS: In other words, you're saying identify the intervention by the vertical line, but also label it as to what that intervention was, which you've done before on the other chart. I remember. 0:22:42.1 JD: Yeah. And then it'd be sort of just looking at this when she puts this up on the smart board for the class to see it again too. Oh yeah yeah, that's when we ran that first intervention and that was that intervention where we did deductions for noises. But the bigger point is that this never happens where you have some data, you understand a system, you plan a systematic intervention, and then you gather more data right after it to see if it's having an impact. We never do that in education, ever. Never have I seen anything like this before. Just this little setup combining the process behavior chart with the Plan-Do-Study-Act cycle, I think, is a very, very, very powerful and very different approach to school improvement. 0:23:33.4 AS: Exciting. 0:23:34.6 JD: Yeah, very different from the typical approach. So I'll stop that share for a second there, and we can do a quick overview of lesson 10 and then jump back into the chart as more data has been gathered. So lesson 10 is: the purpose of data analysis is insight. Seems pretty straightforward. This is one of those key teachings from Dr. Donald Wheeler who we've talked about. He taught us that the best analysis is the simplest analysis, which provides the needed insight. 0:24:08.1 AS: So repeat lesson 10, again, the purpose of... 0:24:11.6 JD: The purpose of data analysis is insight. 0:24:14.7 AS: Yep. 0:24:15.6 JD: So just plotting the dots on the run chart and turning the run chart into the process behavior chart, that's the most straightforward method for understanding how our data is performing over time. 
We've talked about this a lot, but it's way more intuitive to understand the data and how it's moving than if you just stored it in a table or a spreadsheet. Got to use these time sequence charts. That's so very important. 0:24:42.2 AS: And I was just looking at the definition of insight, which is a clear, deep, and sometimes sudden understanding of a complicated problem or situation. 0:24:51.6 JD: Yeah. And I think that can happen, much more likely to happen when you have the data visualized in this way than the ways that we typically visualize data in just like a table or a spreadsheet. And so in Jessica's case, we left off on March 22nd and they had done two surveys after the intervention. And so then of course what they do is they continue over the next four, five, six weeks, gathering more of that data as they're running that intervention, then we can sort of switch back and see what that data is looking like now. 0:25:28.3 AS: Exciting. 0:25:30.3 JD: So we have this same chart with that additional data. So we have data all the way out to now April 11th. So they run this PDSA for about three to four weeks. 0:25:47.9 AS: And that's 11 data points after the intervention. Okay. 0:25:54.0 JD: Yep. Purposeful. So what was I gonna say? Oh, yeah. So three, four weeks for a Plan-Do-Study-Act cycle, that's a pretty good amount of time. Two to four weeks, I've kind of found is a sweet spot. Shorter than that, it's hard to get enough data back to see if your intervention has made a difference. Longer than that, and you're getting away from the sort of adaptability, the ability to sort of build on an early intervention, make the tweaks you need to. So that two to four week time period for your PDSA seems like a sweet spot to me. So she's continued to collect this joy in learning data to see... Basically what she and her class are doing is seeing if their theory is correct. Does this idea of giving deductions for making noises have an impact? 
Is it effective? 0:26:44.0 JD: So if they learn, if the data comes back and there is no change, no indication of improvement, then a lot of people will say, well, my experiment has failed. And my answer to that is, no, it hasn't failed. It might not have worked like you wanted, but you learn very quickly that that noise deduction is not going to work and we're gonna try some other thing, some other intervention. We learn that very very quickly within 3 or 4 weeks that we need to try something new. Now, in the case of Jessica's class, that's not what happened. So you can actually see that dotted line, vertical dotted line is still at March 19th, we have those 11 additional data points. And you can actually see, if you count, starting with March 21st, you count 1-2-3-4-5-6-7-8-9-10-11 data points that are above that green average line from before. 0:27:45.5 JD: So originally the red lines, the limits and the central line would just be straight across. But once I see that eight or more of those are on one side of that central line, then I actually shift the limits and the average line, 'cause I have a new system. I've shifted it up and that actually is an indication that this intervention has worked, because we said... Now for those that are watching, it doesn't appear that all the blue dots are above that green line, but they were before the shift. Remember the shift indicates a new system. So I go back to the point where the first dot of the 8 or more in a row occurred, and that's where I have indicated a new system with the shift in the limits and the central line. So this, their theory was actually correct. This idea of giving a deduction for noises actually worked to improve the joy in Jessica's science class. It was a successful experiment. 0:28:52.7 AS: Can I draw on your chart there and ask some questions? 0:29:00.5 JD: Sure. Yeah. 
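The shift John describes, going back to the first dot of the run of eight or more and recomputing the average and limits from that point forward, can be sketched as follows. All numbers are invented for illustration, and `find_shift_start` is my own helper name:

```python
# Shift the limits when a sustained signal appears: find the first
# point of a run of 8+ consecutive points on one side of the old
# average, then recompute the average and limits from there onward.

def xmr_limits(values):
    """Average and natural process limits (mean +/- 2.66 times the
    average moving range) for a list of values."""
    mean = sum(values) / len(values)
    mrs = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(mrs) / len(mrs)
    return mean, mean - 2.66 * mr_bar, mean + 2.66 * mr_bar

def find_shift_start(values, centre, run=8):
    """Index where a run of `run`+ consecutive points on one side
    of `centre` begins, or None if no such run exists."""
    start, last_side = 0, 0
    for i, v in enumerate(values):
        side = 1 if v > centre else -1 if v < centre else 0
        if side == 0 or side != last_side:
            start, last_side = i, side
        if side != 0 and i - start + 1 >= run:
            return start
    return None

baseline = [74, 76, 73, 77, 75, 74, 76, 73, 75, 77]   # before the PDSA
after = [80, 82, 81, 83, 80, 82, 81, 83]              # after the change
data = baseline + after
old_avg, _, _ = xmr_limits(baseline)
shift = find_shift_start(data, old_avg)
if shift is not None:
    # New system: limits recomputed from the start of the run onward.
    new_avg, new_lo, new_hi = xmr_limits(data[shift:])
```

With these invented numbers, the detected run actually begins one point before the intervention, which mirrors John's point about going back to the first dot of the run rather than to the date of the change itself.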
0:29:00.6 AS: So one of my questions is, is it possible, for instance, in the preliminary period, let's say the first 20 days or so that things were kind of stabilized and then what we saw is that things potentially improved here in the period before the intervention and that the intervention caused an increase, but it may not be as significant as it appears based upon the prior, the most recent, let's say 10 days or something like that. So that's my question on it. I'll delete my drawings there. 0:29:46.3 JD: Yeah, I think that's a fair question. So, the reason I didn't shift those before, despite you do see a pattern, so before the dotted line, I considered that period a baseline period where we were just collecting 'cause they hadn't tried anything yet. So Dr. Wheeler has these series of four questions. So in addition to seeing a signal, he's got these other sort of questions that he typically asks and that they're yes/no questions. And you want the answer to all those to be yes. And one of 'em is like, do you know why an improvement or a decline happened? And if you don't, then you really shouldn't shift the limits. So that's why I didn't shift them before. I chose not to shift them until we actually did something, actually tried something. 0:30:33.2 AS: Which is basically saying that you're trying to get the voice of the students, a clear voice, and that may be that over the time of the intervention, it could be that the... Sorry, over the time of the initial data gathering, that the repetition of it may have caused students to feel more joy in the classroom because they were being asked and maybe that started to adjust a little bit up and there's the baseline, so. Yep. Okay. 0:31:01.6 JD: Yeah. And so this is sort of where the project ended for the fellowship that Jessica was doing. 
But if we could sort of see what happened further out in the school year, either Jessica and the class could then be sort of satisfied with where the joy in learning is at this point where the improvement occurred, or they could run another cycle, sort of testing a tweaked version of that noise reduction PDSA, that intervention, or they could add something to it. 0:31:43.0 AS: Or they could have gone back to the fishbone. The students thought the noise would be the number one contributor, but maybe by looking at the next one they could see, oh, hey, wait a minute, this may be a higher contributor or not. 0:32:01.2 JD: Yeah. And when you dug into the actual plan, the specifics of the plan, how that noise deduction was going to work, there may be something in that plan that didn't go as planned and that's where you would have to lean on, 'cause we've talked about the three sort of parts of the improvement team that you need. You need the frontline people. That's the students. You need the person with the authority to change the system. That's Jessica. And then someone with the knowledge of the system, profound knowledge. That's me. Well, Jessica and her students are the ones in that classroom every day. So they're gonna have learning about how that intervention went, that would then inform the second cycle of the PDSA, whatever that was gonna be, whatever they're gonna work on next. The learning from the first cycle is gonna inform that sort of next cycle. 0:32:51.4 JD: So the idea is that you don't just run a PDSA once but you repeatedly test interventions or change ideas until you get that system where you want it to be. 0:33:01.1 AS: So for the listeners and viewers out there, I bet you're thinking gosh, Jessica's pretty lucky to have John help her to go through this. 
And I think about lots of things that I want to talk to you about [laughter] about my testing in my own business, and I know in my own teaching, but also in my business. So that I think is one of the exciting things about this is the idea that we just, we do a lot of these things in our head sometimes. I think this will make a difference and, but we're not doing this level of detail usually in the way that we're actually performing the tests and trying to see what the outcomes are. 0:33:43.9 JD: Yeah I think that for school people too, I think when we've attempted to improve schools, reform schools, what happens is we go really fast and the learning actually happens very slowly and we don't really appreciate what it actually takes to change something in practice. And what happens then is to the frontline people like teachers... The reformers have good intentions but the people on the front line just get worn out basically, and a lot of times nothing actually even improves. You just wear people out. You make these big changes go fast and wide in the system and you don't really know exactly what to do on the ground. The opposite is what's happening in Jessica's classroom. They're actually learning fast by trying very small changes and getting feedback right in the place where that feedback needs to be given, right in the classroom, and then they can learn from that and make changes. 0:34:49.8 JD: And again, it may seem smaller. Maybe it doesn't seem that revolutionary to people but to me, I think it's a completely revolutionary, completely different way to do school improvement that actually kind of honors the expertise of the teacher in the classroom, takes into account how students are experiencing a change, and then I'm kind of providing a method that they can use to make that classroom better for everybody. And I think in doing so, students are more likely to find joy in their work and in their learning, and teachers are more likely to find joy in their work as well. 
So to me it's a win-win for all those involved. 0:35:34.9 AS: Fantastic. Well, should we wrap up there? 0:35:40.6 JD: Yeah, I think that's a good place to wrap up this particular series. 0:35:45.1 AS: And maybe you could just review the whole series of what we've done, just to make sure that everybody's clear, and if somebody just came in on this one they know a little bit of the flow of what they're gonna get in the prior ones. 0:36:00.4 JD: Yeah. So we did six episodes, and in those six episodes we started off just talking about what you need to have in place for healthy goal setting at an organizational level, and we laid out four conditions: before you ever set a goal, you have to understand the capability of your system, you have to understand the variation within your system, you have to understand if the system that you're studying is stable, and then you have to have a logical answer to the question, by what method? By what method are you gonna bring about improvement, or by what method are you gonna get to this goal that you wanna set? So we talked about that, you gotta have these four conditions in place, and without those, we said, goal setting is often an act of desperation. 0:36:49.7 JD: And then from there what we did is start talking about these 10 key lessons for data analysis, so as you get the data about the goal and you start to understand the conditions of that system or process, we could use those 10 data lessons to interpret the data that we're looking at or studying, and we basically did that over the first four episodes. In the last few episodes, what we've done is look at those lessons applied to Jessica's improvement project, and that's what we just wrapped up, looking at those 10 lessons. 0:37:23.7 AS: I don't know about the listeners and viewers, but for me this type of stuff just gets me excited about how we can improve the way we improve. 0:37:33.4 JD: Yeah. For sure. 0:37:34.9 AS: And that's exciting.
So John, on behalf of everyone at the Deming Institute, I want to thank you again for this discussion, and for listeners, remember to go to deming.org to continue your journey. You can find John's book Win-Win: W. Edwards Deming, the System of Profound Knowledge and the Science of Improving Schools on amazon.com. This is your host Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, "People are entitled to joy in work."

Jun 11, 2024 • 29min
Goal Setting is Often an Act of Desperation: Part 5
In this episode, John Dues and Andrew Stotz apply lessons five through seven of the 10 Key Lessons for implementing Deming in classrooms. They continue using Jessica's fourth-grade science class as an example to illustrate the concepts in action. TRANSCRIPT 0:00:02.2 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode five about goal setting through a Deming lens. John, take it away. 0:00:23.2 John Dues: Yeah, it's good to be back, Andrew. Yeah, like you said, for the past few episodes we've been talking about organizational goal setting. We covered four conditions of healthy goal setting and 10 key lessons for data analysis. And then what we turned to in the last episode is looking at an applied example of the 10 key lessons for data analysis in action. And, if you remember from last time, we were looking at this improvement project from Jessica Cutler. She's a fourth grade science teacher, and she did the improvement fellowship here at United Schools Network, where she learned the tools, the techniques, the philosophies, the processes behind the Deming theory, continual improvement, that type of thing. And in Jessica's specific case, in her fourth grade science class, what she settled on improving was the joy in learning of her students. And we looked at lessons one through four through the eyes, or through the lens, of her project. And today we're gonna look at lessons five through seven. So basically the next three lessons of those 10 key lessons. 0:01:34.8 AS: I can't wait. Let's do it. 0:01:37.3 JD: Let's do it. So lesson number five was: show enough data in your baseline to illustrate the previous level of variation.
Right. So the basic idea with this particular lesson is that, you know, let's say we're trying to improve something. We have a data point or maybe a couple data points. We wanna get to a point where we're starting to understand how this particular concept works. In this case, what we're looking at is joy in learning. And there's some different rules for how many data points you should have in a typical baseline. But, you know, a pretty good rule of thumb is, if you can get 12 to 15, that's... That's pretty solid. You can start working with fewer data points in real life. And even if you just have five or six values, that's gonna give you more understanding than just, you know, a single data point, which is often what we're... What we're working with. 0:02:35.6 AS: In other words, even if you have less data, you can say that this gives some guidance. 0:02:40.9 JD: Yeah. 0:02:41.1 AS: And then you know that the reliability of that may be a little bit less, but it gives you a way... A place to start. 0:02:46.9 JD: A place to start. You're gonna learn more over time, but at least even five or six data points is more than what I typically see in the typical, let's say, chart that has last month and this month, right? So even five or six points is a lot more than that. You know, what's... What's typical? So I can kind of show you, I'll share my screen here and we'll take a look at Jessica's initial run chart. You see that right? 0:03:19.3 AS: We can see it. 0:03:21.2 JD: Awesome. 0:03:22.3 AS: You wanna put it in slideshow? Can we see that? Yeah, there you go. 0:03:24.9 JD: Yeah, I'll do that. 0:03:25.4 AS: Perfect. 0:03:26.3 JD: That works better. So, you know, again, what we're trying to do is show enough data in the baseline to understand what happened prior to whenever we started this improvement effort. And I think I've shared this quote before, but I really love this one from Dr.
Donald Berwick, he said "plotting measurements over time turns out, in my view, to be one of the most powerful things we have for systemic learning." So what... That's what this is all about really, is sort of taking that lesson to heart. So you can look at Jessica's run chart for "joy in science." So just to sort of orient you to the chart. We have dates along the bottom. So she started collecting this data on January 4th, and this is about the first 10 days of data she has collected. So she's collected this data between January 4th and January 24th. So, you know, a few times a week she's giving a survey. You'll remember she's actually asking her kids, how joyful was this science lesson? 0:04:24.4 JD: Mm-hmm. 0:04:27.2 JD: And so this is a run chart 'cause it's just the data with the median running through the middle, that green line there. The data is the blue dots connected by lines, and the y axis there along the left is the joy in learning percentage. So out of a hundred percent, sort of what are kids saying? How are kids sort of evaluating each of these science lessons? So we've got 10 data points so far, which is a pretty good start. So it's starting to give Jessica and her science class a decent understanding about, you know, when we define joy in science and then we start to collect this data, we really don't have any idea what that's gonna look like in practice. But now that she started plotting this data over time, we have a much better sense of what the kids think of the science lessons basically. So on the very first day... 0:05:25.4 AS: And what is the... What is the median amount, just for the listeners out there that don't see it? What would be the... Is that 78%? 0:05:33.8 JD: Yeah, about 78%. So that very first day was 77%. The second day was about 68%. And then you sort of see it bounce around that median over the course of those 10 days.
So some of the points are below the median, some of the points are above the median. 0:05:50.4 AS: And the highest point above is about 83, it looks like, roughly around that. 0:05:54.4 JD: Yeah. Around 82, 83%. And one technical point: since it's a run chart, we don't have the process limits, those red lines that we've been taking a look at. With a run chart and, you know, fewer data points, we only have 10, it's fairly typical to use the median, just so you can kind of better control for any outlier data points. We really don't have any outliers in this particular case, but that's just sort of a technical point. So, yeah, I mean, I think, you know, what you start to see, you start to get a sense of what this data looks like, you know, and you're gonna keep collecting this data over an additional time period, right? And she hasn't at this point introduced any interventions or any changes. Right now they're just learning about this joy in learning system, really. Right. 0:06:51.8 JD: And so, you know, as she's thinking about this, this really brings us to... To lesson six, which is, you know, what's the goal of data analysis? And this is true in schools and it's true anywhere. We're not just gonna look at the past results, but we're also gonna, you know, probably more importantly, look to the future and hopefully be able to predict what's gonna happen in the future with whatever concept we're looking at. And so as we continue to gather additional data, we can then turn that run chart from those initial 10 points into a process behavior chart. Right. You know, it's sort of the run chart on steroids, because not only can we see the variation, which you can see in the run chart, but now, because we've added more data, we've added the upper and lower natural process limit, we can also start to characterize the type of variation that we see in that data.
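[Editor's note: for listeners who want John's technical point about the median in numbers, here is a tiny sketch. The joy percentages are invented for illustration, not Jessica's actual data. With a short baseline, one bad survey day drags the average around far more than it moves the median, which is why the median is the safer central line for a small run chart.]

```python
from statistics import mean, median

# Nine invented joy-in-learning percentages, plus one outlier, say a
# day when a few kids misread the survey scale.
baseline = [77, 68, 80, 75, 82, 79, 74, 81, 78]
with_outlier = baseline + [10]

# The outlier pulls the mean down about 7 points but barely moves the median.
print(round(mean(baseline), 1), median(baseline))          # 77.1 78
print(round(mean(with_outlier), 1), median(with_outlier))  # 70.4 77.5
```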
0:08:00.1 AS: So for the listeners out there, John just switched to a new chart, which is just an extension of the prior chart, carrying it out for a few more weeks, it looks like, of daily data. And then he's added in a lower and upper natural process limit. 0:08:18.9 JD: Yeah. So we're still plotting the data for joy in science. So the data is still the blue dots connected by the blue lines. Now, because we have 24 or so data points, the green line, the central line, is the average of that data running through the data. And we have enough data to add the upper and lower natural process limit. And so right now we can start to determine: do we only have natural variation, those everyday ups and downs, that common cause variation, or do we have some type of exceptional or special cause variation that's outside of what would be expected in this particular system? We can start making... 0:09:00.7 AS: Can you... 0:09:02.2 JD: Go ahead. 0:09:02.8 AS: I was gonna ask you if you can just explain how you calculated the upper and lower natural process limits, just so people can understand. Is it max and min, or is it standard deviation, or what is that? 0:09:18.3 JD: Yeah, basically what's happening is we've plotted the data and then we use that data, we calculate the average, and then we also calculate what's called the moving range. So we just look at each successive data point and the difference between those two points. And basically there's a formula that you use for the upper and lower natural process limits that takes all of those things into account. So it's not standard deviation, but it's instead using the moving range between each successive data point. 0:09:52.9 AS: In other words, will the data that's on this chart always fall within the natural upper and lower limits? Or will data points fall outside of that? 0:10:05.7 JD: Well, it depends on what kind of system it is.
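[Editor's note: to make the moving-range calculation John describes concrete, here is a minimal sketch. The joy percentages are made up for illustration, and the 2.66 scaling constant is the standard factor for individuals (XmR) process behavior charts, not something stated in the episode.]

```python
# Invented joy-in-learning percentages standing in for the survey data.
values = [77, 68, 80, 75, 82, 79, 74, 81, 78, 83]

average = sum(values) / len(values)

# Moving range: absolute difference between each successive pair of points.
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
average_mr = sum(moving_ranges) / len(moving_ranges)

# Standard XmR formulas: limits sit 2.66 average moving ranges on either
# side of the central line (not a standard deviation calculation).
upper_limit = average + 2.66 * average_mr
lower_limit = average - 2.66 * average_mr

print(f"average {average:.1f}, limits {lower_limit:.1f} to {upper_limit:.1f}")
```

With these invented numbers the central line comes out at 77.7% and the natural process limits at roughly 61% and 94%.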
0:10:07.8 AS: Right. Okay. 0:10:09.8 JD: If it's a stable system, that means all we see is sort of natural ups and downs in the data. And we use those formulas for the process limits, so the magnitude of the difference between each successive data point is not necessarily big or small, it's just based on what you're seeing empirically. It's basically predictable. Right. And if it's not predictable, then we'll see special causes. So we'll see special patterns in the data. So I think maybe last time, or you know, in some episode, we talked about the three patterns that would suggest there's a special cause that warrants study. Those three patterns that I use are: a single one of these joy in science data points outside of either the upper or lower natural process limit, that'd be a special cause. 0:11:05.4 JD: If you see eight data points in a row, either above the central line or below the central line, that's a special cause. And if I see three out of four in a row that are either closer to the upper limit or to the lower limit than they are to that central line, that's a pattern of the data that suggests a special cause. So we don't, in this particular dataset, we don't see any special causes. So now we have a very solid baseline set of data. We have 24 data points. And, getting technical, when you're using an average central line, once you get to about 17 data points, those upper and lower natural process limits start to solidify, meaning they're not gonna really change too much, 'cause you have enough data, unless something really significant happens. And if you're using the median, that solidification happens when you get to about 24 data points. 0:12:07.5 JD: So when you're getting to 17 to 24 data points in your baseline, you're really getting pretty solid upper and lower natural process limits.
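[Editor's note: the three detection rules John lists can be sketched as a short function. This is an illustrative version assuming simple list inputs, not software from the project; the thresholds are just the ones stated in the conversation.]

```python
def special_cause_signals(values, center, lower, upper):
    """Flag the three special-cause patterns described in the episode."""
    signals = []

    # Rule 1: any single point outside the natural process limits.
    if any(v < lower or v > upper for v in values):
        signals.append("point outside limits")

    # Rule 2: eight points in a row on the same side of the central line.
    for i in range(len(values) - 7):
        window = values[i:i + 8]
        if all(v > center for v in window) or all(v < center for v in window):
            signals.append("eight in a row on one side")
            break

    # Rule 3: three out of four successive points closer to a limit
    # than to the central line.
    for i in range(len(values) - 3):
        window = values[i:i + 4]
        near_upper = sum(abs(v - upper) < abs(v - center) for v in window)
        near_lower = sum(abs(v - lower) < abs(v - center) for v in window)
        if near_upper >= 3 or near_lower >= 3:
            signals.append("three of four nearer a limit")
            break

    return signals

# A stable run like Jessica's returns no signals; a crash to 50% trips rule 1.
print(special_cause_signals([80, 78, 83, 79, 81, 77, 82, 80], 81, 66, 95))
print(special_cause_signals([80, 78, 50, 79, 81, 77, 82, 80], 81, 66, 95))
```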
So, as of this March 1st date, which is the last date in this particular chart, there are 24 data points. So you have a pretty solid baseline set of data. Right now, the upper natural process limit is 95%. That lower limit is sitting at 66%, and the average running through the middle, that green line, is 81%. So this basically tells us that if nothing changes within Jessica's fourth grade science system, her classroom, we can expect the data to bounce around this 81% average and stay within the bounds of the limits. So we would call this a common cause system, because we don't see any of those rules that I just talked about for special causes. And that's important. 0:13:07.4 JD: So do we have an unstable system or a stable system? We have a stable system. A stable system means that the data is predictable, and unless something happens, you know, and this could be something within the control of the teacher in the class, or it could be out of the control of the teacher in the class, but unless something significant happens, this data is just gonna keep humming along like this over the course of March, April, May of this particular school year. Right. So once we get to this point, so we have baseline data we've collected in a run chart, we start to understand how that data is moving up and down. We got some more data and we added the upper and lower natural process limits. Now we can assess not only the variation, but also the stability and the capability of the system. All of those questions can start to be answered now that we have this process behavior chart. 0:14:09.3 JD: And this brings us to the final lesson for today, which is lesson 7: the improvement approach depends on the stability of the system under study.
So that's one of the reasons why the process behavior chart is so powerful: now I have an understanding of what I need to do, like what type of approach I need to take to improve this particular system. Right? So in this particular case, I have a predictable system. And so the only way to bring about meaningful improvement is to fundamentally change this science system, right? 0:14:52.6 JD: The flip side would be if I did see a special cause, let's say it was an unpredictable system and we saw a special cause on the low side. I'd wanna study that, what happened on that particular day. Because if I see a special cause, let's say on February 2nd I saw a special cause, let's say I saw a single data point below the lower natural process limit that's so different and unexpected, I'd actually wanna go to her classroom and talk to her and her class and say, okay, what happened on that day? I'm gonna try to remove that special cause. Study of that specific data point is warranted. If you don't see those special causes, then even though there are ups and downs, increases and decreases, they're within, you know, the expected bounds of this particular system. Right. 0:15:46.9 AS: And I was gonna say, I can't remember if I got this from Dr. Deming or where it came from, but I know as an analyst in the stock market analyzing tons and tons of data in my career, I always say if something looks like a special cause or looks strange, it's probably an error. [laughter] 0:16:03.2 AS: And it could just be, for instance, that a student came in and they didn't understand how to fill it out, or they refused to fill it out, or they filled out the form with a really bizarre thing, or maybe they thought that number 10 was good and number one was bad, but in fact on the survey it was number one that was good and number 10 that was bad. And you find out that, you know, that special cause came from some sort of error. 0:16:26.6 JD: That's certainly possible.
That's certainly possible. 0:16:29.5 AS: Whereas another special cause could be, let's just say, that the school had a blackout and all of a sudden the air conditioning went off for half of the class and everybody was just really frustrated. They were burning hot, it was really a hot day, and that special cause could have been a legitimate cause, as opposed to, let's say, an error cause, but it causes an extreme, you know, response on the survey. 0:16:56.9 JD: Yeah. And the thing is, it could be a number of different things. Maybe she had gotten some feedback about her lessons and maybe she tried a different lesson design and it was new to her and it just didn't work very well. Maybe she tried to use some new technology or a new activity and it just didn't go well. But you know, if I'm seeing that data show up as a special cause, and let's say I'm seeing that the next day or a couple days later, it's still fresh in my mind and I can even go into my chart and label what happened that day. Okay. And now, okay, I'm gonna remove that thing, or, you know, if it's a lesson I'm trying, maybe I don't wanna give up on it, but I know I need to improve it 'cause it led to some issues in my classroom. But it's close enough to the time it actually happened that I actually remember what happened on that particular day and I can sort of pinpoint that issue. 0:17:52.9 AS: Yeah. 0:17:54.5 JD: And the data told me it was worth going in and studying that particular data point, because it was so different than what I had seen previously in this particular 4th grade science system. 0:18:06.5 AS: Makes sense. 0:18:09.9 JD: But in this case, we don't see that, that was a hypothetical. So all we see is sort of the data moving up and down around that green average line. So we have a stable system. So again, that tells me I need to improve the science system itself. There's no special causes to remove.
So, the next question I think I would ask, and if you remember one of the data lessons is that we sort of combine the frontline workers, which is the students in this case. We have the manager or the leader, that's the teacher, and then someone with profound knowledge from the Deming lens, that's me, we're bringing these people together and we're saying, okay, you know, we're seeing this hum along this joy in science thing, hum along at sort of like an 81% average. So I think it's a reasonable question to ask, is that good enough? And should we turn our attention to something else. Now, there could be some situations where it's not good enough or some situations where that is good enough. They chose to keep moving to improve that joy in learning. But I think it'd be perfectly reasonable in some context to say, well, you know, sure, maybe we could get a little better here, but maybe it's not worth the effort in that particular area. Maybe we're gonna turn our attention to something else. You know. 0:19:23.7 AS: So you learn something from the chart and that could be... 0:19:26.4 JD: Learn something from the chart. Yeah, yeah. 0:19:27.9 AS: Because when I look at this chart, I just think hard work is ahead. 0:19:31.2 JD: Yeah. Yeah. 0:19:34.7 AS: 'Cause in order to, if you have a stable system with not a lot of extreme... Firefighting is kind of a fun thing, right? When you got special causes, you feel really important. You go out there, you try to figure out what those individual things are, you're the hero. You fix it, you understand it, you see it, whatever. But then when you get a stable system, it's like, oh man, now we got to think about how do we make some substantial changes in the system. It doesn't have to be substantial, but how do we make changes in the system, you know? And then measure whether that has an impact. 0:20:06.4 JD: Yeah. And to your point about fire... 
Fighting fires, like, I didn't know... We had never measured joy in learning like this before, so I didn't know what we were gonna get with Jessica. And so, you know, what I think you also see here is a pretty well-run classroom. These are kids that are finding a pretty high amount of joy in their lessons. I think that you can kind of objectively say that, but they did choose to move on with the project and keep focusing on this particular system. And I thought it was really interesting. They actually... I'll flip slides here. 0:20:45.6 JD: They actually made this sort of rudimentary fishbone diagram. So if you're viewing the video here, you can see that Jessica just took a pen and a piece of paper and put this on the overhead in the classroom, and basically just drew a fishbone. A fishbone diagram is also called a cause and effect diagram. So out on the right it says effect, and she wrote low enjoyment, meaning low enjoyment of science class. And they started brainstorming, those are the bones of the fish, what's causing the effect of low enjoyment in science class. So she did this brainstorming activity with the kids. Some of the things they came up with for why there's low enjoyment in science class: well, the computers are sometimes lagging when the kids are trying to use them. They're mad at Ms. Cutler for one reason or another. There's a lot of writing in a particular lesson. There's a lot of reading in a particular lesson. 0:21:58.2 AS: Other teachers coming into the room. 0:22:00.7 JD: Other teachers coming into the room and disrupting the class. 0:22:02.7 AS: Stop bothering me. 0:22:04.1 JD: Yeah. I mean, you know, these are the things you don't often think about. And then they talked about classmates making noises throughout classes, another distraction. And they basically categorized these into different categories. So there were sort of things that made the lesson boring. That was one category.
Accidents happening, those are like the computers not working correctly. Scholars, we call our students scholars, so students getting in trouble was one, and then distractions was another category. And so then they did another activity after they had this fishbone. They basically did a voting activity where they would figure out which of these is the most dominant cause of low enjoyment. And what they came up with is their classmates making noises, like students making random noises throughout the lesson. They identified that particular thing as the thing that they're gonna design a Plan-Do-Study-Act cycle around: how are we gonna reduce the amount of noise in the class? 0:23:12.7 JD: And this is all the students coming up with these ideas. Of course, Jessica's guiding these conversations as the adult in the room, but the kids are coming up with this. Like, I never would have, well, maybe I shouldn't say I never would have, but it probably wouldn't have been on my radar that other teachers coming into the room was a main source of distraction. You know, who knows what they're doing, dropping off papers that have to be passed out at dismissal, or coming to find a kid for this thing or that thing. Who knows why they're stopping by. But schools are certainly rife with all kinds of disruptions, announcements, people coming into the room, those types of things. 0:23:51.0 AS: It's interesting too to see mad at Miss Cutler, because... I was just reading a book, or some research, about how to get rid of anger and that type of thing. And they talk about meditation, and I do a breathing exercise before every class, when every class starts. And it's a way of just kind of calming down and separating the class time from the chaos outside, but it also could be something that could help with feeling mad. 0:24:27.9 JD: Yeah.
And I think in certain classrooms that certainly could have risen to the top. And then what you do is design the PDSA around that. So how do you do meditation? How do you know if you're doing it right? How long do you do it? You know? Does it have the intended impact? You could study all kinds of different things with meditation, but... 0:24:52.4 AS: And are you really mad? Are you really mad at Ms. Cutler, or are you frustrated about something else? Or that... 0:24:58.1 JD: Exactly. Yeah. Is it warranted? Is there actually something that she should stop doing or start doing? There's all kinds of possibilities there. But the main point, and I think this kind of brings us to the wrap up, is that taking this approach is very different. Even just Jessica's step of saying, I'm gonna work on joy in science, joy in learning in science class. That's a very different approach. And then a step beyond that, I'm gonna involve my students in this improvement approach. And we have these various methods and tools for systematically collecting the class's input, and those are improvement science or continual improvement tools that we're using. And then we're applying some of the knowledge about variation, sort of Deming data methods, to understand that data that we've systematically collected from students. 0:25:58.7 JD: And now students are involved. So they're actively coming up with both the reasons, the problems that are happening, and then, like what we'll get into in the last few lessons, their input into the solutions, the change ideas that are gonna make things better. But all of this represents a very different approach than what's typical when it comes to school improvement. These things are not being handed down from on high from someone that has no connection to this classroom whatsoever. Instead, it's actually the people in the classroom that are developing the solutions.
0:26:36.9 AS: I was just thinking about the idea of imagining that this group of students is working really hard on that, and they come up with so much knowledge and learning about how to create a more joyful classroom. And then imagine that they've now codified that together with Ms. Cutler to create kind of the standard operating procedures. Like we put up a sign on the door outside that says, do not disturb until class is over, or... 0:27:06.1 JD: Something simple. Yeah. 0:27:07.0 AS: And that they come up with, and a breathing exercise or whatever that is. And then you imagine the next group of students coming in for the next year, let's say, or whatever, that next group who can then take the learning that the first group had and then try to take it to another level, and then upgrade how the operations of the room is done. And you do that a couple of iterations, and you've now accumulated knowledge that you are building on until, in a business, you're creating a competitive advantage. 0:27:40.4 JD: Yeah, absolutely. And another thing that these guys did was they didn't say we're gonna improve X, Y, or Z and then set an arbitrary goal, which is one of the things we've talked about that often happens at the outset of any type of improvement. They sort of avoided this act of desperation. We talk about how goal setting is often an act of desperation. They avoided that completely. Instead, what they did was gather some baseline data to understand: what is the capability of our system when it comes to joy in learning? That's what they did first. They didn't set the goal first. A lot of wisdom, a lot of wisdom in 10-year-olds for sure. 0:28:22.1 AS: That's interesting. Well, John, on behalf of everyone at the Deming Institute, I want to thank you again for the discussion and for listeners, remember to go to deming.org to continue your journey. You can find John's book Win-Win: W.
Edwards Deming, the System of Profound Knowledge and the Science of Improving Schools on amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, and it's absolutely applicable to today's discussion: "People are entitled to joy in work."

8 snips
May 22, 2024 • 37min
How to Test for Understanding: Awaken Your Inner Deming (part 22)
Bill Bellows, who has spent 30 years applying Dr. Deming's ideas, discusses testing for understanding transformation with Andrew. They talk about little tests to measure learning impact and share amusing anecdotes. Topics include effective questioning, incentives in motivation, and fostering comprehension during transformation processes.

May 7, 2024 • 33min
Transparency Among Friends: Awaken Your Inner Deming (Part 21)
Bill Bellows shares his experience of implementing Dr. Deming's ideas with a small group at a large company. They discuss transparency, decision-making, and the importance of open communication to drive organizational change.

Apr 30, 2024 • 31min
Goal Setting is Often an Act of Desperation: Part 4
Can a 4th grade class decide on an operational definition of "joy in learning"? In part 4 of this series, educator John Dues and host Andrew Stotz discuss a real-world example of applying Deming in a classroom. This episode covers the first part of the story, with more to come in future episodes! 0:00:02.3 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today, I'm continuing my discussion with John Dues, who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode four about goal setting through a Deming lens. John, take it away. 0:00:22.6 John Dues: Good to be back, Andrew. Yeah, we've been talking about organizational goal setting the last few episodes. A couple episodes ago, we talked about those four conditions that organizations should understand prior to setting a goal. Then we sort of introduced this idea of trying to stay away from arbitrary and capricious education goals. And then we got into these 10 lessons for data analysis. And so now that we've got that foundation in place, what I thought we could do is take a look at an applied example, in real classrooms, of those 10 key lessons in action to kind of bring those alive. And I ran this project a few years ago with a teacher named Jessica Cutler. She's a fourth grade science teacher in our network. And she was going through something we call a Continual Improvement Fellowship. So we do this sort of internal fellowship where people can learn that way of thinking, the tools, techniques, the theories related to the science of improvement. And then they actually take that right away and apply it to a problem in their classroom or their department or their school, depending on who it is.
0:01:55.0 JD: And so what Jessica was doing, what her project ended up being was she was trying to improve the joy in learning in her fourth grade science class. So it's interesting to see how that sort of project evolved. So I thought we could revisit each of the 10 lessons and how that lesson was applied in Jessica's improvement project. And we'll maybe get through three or four of the lessons today. And then over the course of the next few episodes, kind of get to all 10 lessons and think through how they were... How that went in her improvement project. 0:02:08.1 AS: Sounds like a good plan, practical application. 0:02:12.0 JD: Yeah. I mean, it was interesting too, because she didn't initially sort of consider joy as a possibility. She was thinking like, I'm gonna work on improving test scores, or something like that was sort of her initial brainstorm. And then sort of pivoted to this when we kind of talked through what was possible from the Deming philosophy type of standpoint. So it's interesting to see how things evolve. But just to kind of revisit, so we talked through these 10 lessons. Lesson one was "data has no meaning apart from their context." So we talked about these questions that are important, like who collected the data? How was it collected? When was it collected? Where was the data collected? What do the values themselves represent? What's that operational definition for the concept under measurement? Have there been any changes to that operational definition as the project unfolds? And so even with a project with a teacher and her students, all of those questions are relevant. They're still important; just because you're dealing with students doesn't mean anything changes on that front. So it was important for her to sort of think through all of those things as she thought through the start of her project.
0:03:28.9 JD: And what her and her students came up with after they sort of decided that they were gonna focus on joy, they focused on this problem statement. And they were like, well, what do we want science class to look like? 'Cause that was sort of their starting point. And what her and her students...Oh sorry go ahead. 0:03:45.9 AS: One thing you started off talking about her, now you're talking about her students. So she got her students involved in this process. Is that what you're saying? 0:03:56.2 JD: Yeah. So they were working together from the very outset even... 0:04:02.0 AS: As opposed to a teacher talking through this with a principal or something in a faculty room and then thinking of how do I... Okay. 0:04:09.2 JD: Yep. That's right. Yeah. And so what they came up with is the sort of desired future state of science classes. "We are able to stay focused through science, enjoy science class and remain engaged." And so to give some context, what was happening is that she taught science and social studies and it was sort of like a back-to-back class period. And they would do science second. And so by the time they were doing science, sometimes the students were getting off task, disengaged. They weren't as engaged as either the students wanted to be or the teacher wanted them to be in that second lesson. So they, they came up with that as the thing they were gonna focus on. And then because they were gonna focus on joy in learning, they had to define what that meant. So what did joy in learning mean to that fourth grade science class? And what they came up with as a definition, which I really like, is "we wanna have fun learning, finding things we like to learn and have fun completing classwork and activities." So they came up with this operational definition. And keep in mind, these are fourth graders and Jessica's having these conversations like, what's the operational definition? 
That's probably not typical language you're gonna use with fourth graders. But if you walk them through these things, they actually pick up on it pretty quickly. 0:05:26.1 JD: It's actually pretty cool to see. 0:05:28.1 AS: And to them, a simpler word, it sounds like, was fun. 0:05:35.0 JD: Yeah, right. They wanted that to be a part of the science learning process. So basically, once they had the operational definition, they had to think through, well, how are we going to measure that concept that we've defined? And what they did was they just developed a simple survey. Jessica did it in Google Forms. It just had, really just had two questions. The first question was, on a scale of 1 to 10, how much did you enjoy science class today? And then there was a second open-ended question that said, what made you enjoy or not enjoy class today? So it was fresh in the kids' minds. So basically, at the end, each kid has a Chromebook in Jessica's science class. She would just sort of share the link to the survey, and the kids would complete that as the closing activity for the lesson. So she would get two things out of it. So 1 to 10, just a real quick sort of numerical quantified value, how much the kids enjoyed science class that day. And then, because it had just happened, the students could say what they did and didn't like about the lesson. Oh, we haven't used computers in a few days. Or it'd be nice if I had a video to help bring this concept alive. Or there's a few words that you use that I don't know the definitions to. Could you add those definitions to the glossary? So just things like that, simple things like that. 0:06:55.9 JD: Right away. And then what Jessica could do is take that information and actually adjust her lessons. As she planned, maybe for the next week, she could make those adjustments based on this feedback she was getting from the students. So that's sort of the application of lesson number one. So what are we measuring?
How are we gonna measure it? When are we collecting this data? That type of thing. Lesson two, if you remember back from when we covered the lessons, was "we don't manage or control the data. The data is the voice of the process," right? 0:07:28.9 JD: So we talked about this idea that while we don't control the data, we do manage the system and the processes from which the data come, right? And this is a really key conception of the systems view. You say you're going to improve this particular classroom. So that's the system. So you're not necessarily controlling the data. You're not controlling how the kids are evaluating, the numbers that they're putting one through 10 to assess joy in learning, but what the teacher and then the students, because of this project, do have control over are the learning processes that are happening throughout science class, right? And so back to your point about how you switched from talking about Jessica, the teacher, to the students. And then you said "we." That's also a key conception of taking this approach, right? 0:08:24.4 JD: So what I think Deming would say is that when you're going to improve an organization, you have to combine three critical pieces. One thing is you need someone from the outside, from outside the system, that has Profound Knowledge. And then that person or persons has to be collaborating with the people working in the system. So those are the students; they're working in the science class system. And then that third group or that third person is the manager or managers that have the authority to work on the system. So in this case, Jessica has the authority to change what's happening in her science class. 0:09:10.2 JD: The students are the workers working in that science system.
And then that third part is that person that has the understanding of the System of Profound Knowledge, and it's bringing all of these parts together that really is how you begin to transition conventional classrooms to those guided by the Deming quality learning principles, right? 0:09:33.1 JD: So in the case of Jessica's project, that person that had a System of Profound Knowledge lens was me. So I was sort of acting as the outsider, 'cause I'm outside of the science system. But I have this understanding of the System of Profound Knowledge. And I'm working with Jessica as she's working with her students, to sort of bring that lens to the project. 0:10:00.4 AS: And what's the point of doing all that if she doesn't have the ability to make the changes necessary to test whether, if we change this, it's gonna result in something? Why go and do all this if you're just stuck in a system that you simply cannot change because of government regulation or whatever, maybe? 0:10:17.5 JD: Right. Yeah. So it's bringing all those pieces together. But thinking about the three parts of a team that's working toward organizational improvement, what I've found in the past, in my experience, whether it's a school improvement team or a district-based improvement team, is that most of them are devoid of at least one of those components, usually two of those components, 'cause usually students aren't involved. 0:10:45.9 JD: And then in most school systems, there's no one with this outside knowledge, the System of Profound Knowledge lens, right. And I think what we're really doing is the students can identify the waste, the inefficiency, the things that aren't going well from their perspective, but we don't often ask them. Or if we do, we do it in a way where it's an end-of-year survey or an end-of-semester survey, but this is collecting that feedback in real time and then acting on it.
We're not planning to do something next year with this feedback; we're actually planning to do something the next day, or maybe the next week, to adjust the science lessons. 0:11:24.8 AS: There are two things that come into my mind. How do you handle the idea that what's causing the impact on joy in learning could be that the student had a bad night the night before? And I guess by doing many samples, that starts to kind of wash out. And then the other question is, since the students know that the teacher could likely make an adjustment, is there any possibility that they could be gaming or playing the system? 0:11:57.1 JD: Well, that's interesting. On the first point, my experience with this, and David Langford, who I know you've talked to, has echoed this sentiment: he was working with high school students, this is an elementary project, but either way, you may get some students that don't take this seriously. At first. And you may get some kind of crazy answers, crazy brainstorms or crazy survey submissions, although I don't think Jessica got much of that. 0:12:30.2 JD: But in other projects I've gotten some stuff at the outset that was a little bit off the wall. But like David said to me when I first started this, and then it's been my experience since, kids, once they realize that you're actually gonna act on the feedback, as long as the feedback is in good faith, they actually start to take it seriously, pretty quickly. And so I think pretty quickly, those sorts of worries go by the wayside. Now, I will say, I did say that the... 0:13:01.1 JD: One of the components that has to be on this improvement team is the person that has the authority to change the system.
So at the end of the day, even though we're gathering this input, Jessica's really the person, as the teacher of that classroom, that has the authority to make the changes to the system based on her judgment, her professional judgment as a science teacher, of what should happen. And so the students certainly offer feedback and inform that process, but ultimately it's Jessica that's gonna determine the changes to the system. 0:13:34.0 AS: I hear David in my ears saying, you know what, Andrew? You don't trust the students? They probably have a more honest view of what's going on than most adults do. So yes, I hear the voice of David Langford. 0:13:49.2 JD: Yeah. Well, and interestingly, and we'll get into this towards the end, not today, but when we get to some of the other lessons, not to give away the story, but one of the things that was getting kids off track was a lot of noise during class, kids making noises. And they actually came up with this system where they were kind of penalizing each other. This was their own idea. And so, kids know exactly what's going on in class. And so it was interesting to see how they came up with some ideas to rectify that. But yeah, so it was really just bringing together these three groups: the group of students, and then Jessica, and then myself. It's that combination that's really where the power for improvement lies. And again, that type of partnership is just not typical in school improvement situations. 0:14:45.7 JD: So that's lesson two, applied. Lesson number three is "plot the dots for any data that occurs in time order." Right? So we've talked about this a lot. The primary point of "plot the dots" is that plotting data over time helps us understand variation, and that in turn leads us to take more appropriate action.
I think what we decided to do with Jessica's project is start plotting the points on a run chart and connect those points with a line, and then it becomes pretty intuitive, as we're looking at that data, what joy in learning looks like in this science class. And then once we have enough data, we can turn that run chart into a process behavior chart and actually add the limits. 0:15:40.2 JD: So, like I said, once Jessica and the class determined that what they were going to improve was joy in learning, and they defined that concept operationally and created the survey, right away they started gathering this survey data as a part of the project. And usually they would gather the data maybe two or three times a week across the course of this particular improvement project. So maybe I'll share my screen just so you can see what that initial run chart looked like. So, you have this run chart, and I left this in the spreadsheet so you could see the actual data. So as she began administering these surveys, she would send me the data and then I would create the run chart for her, start plotting that data so that both of us could see the variation in that survey data over time. And then she could actually take this, she would put this run chart on a slide, and every week or so she would actually show the students what the data looked like. 0:16:49.7 AS: And just to be clear, for those that are listening, we've got a chart that has a blue line and it's going up and down kind of around the level of about 79. So they've got points that are days. Some days are below that 79, some days are above. But also I'm assuming that those points are the output of all the surveys. So the average answer on that day from the survey, as different from the average or median of all the days' output, correct? 0:17:31.1 JD: Yeah, that's right. So this is the run chart from Jessica's class that's displaying the survey results.
And what they're measuring is joy in science class as assessed by the students. 0:17:44.3 AS: On the first day, the students basically said, 75% of the respondents said that they had joy in science. 0:17:51.9 JD: That's right. So in this particular school year, which was two years ago, we had done some of the project planning before kids went on winter break, and then when they came back from winter break, they were ready to start administering the survey. And we started plotting the dots, charting the data over time. So the X axis, for those who are listening, is the dates. 0:18:16.2 JD: The Y axis is the joy in learning: the rating of the kids from one to 10, and then I just turn it into a percent. And so you have the green line, the central line running through, which is the median. We're using the median 'cause that's fairly typical for a run chart, because typically run charts don't have as much data as a process behavior chart. And so outliers can have a greater impact. So we're using the median to sort of control for that. Although this data's fairly tight. So on day one, like you said, on January 4th of this school year, the kids sort of rated the joy in learning of that particular lesson as 75% out of 100. And then you sort of see it bounce around. 0:19:04.7 JD: That median of 79. And so what I'm showing is the data from the first 10 surveys that Jessica administered at the end of class. So over the course of 20 days from January 4th through January 24th, she administered this survey 10 different times. So about two to three times a week. And so we see a high of about 83% joy in learning and a low of 67% joy in learning. And you have about half the points above the median, about half the points below the median. So even though it's only 10 data points, Jessica and her class, and then myself, we were starting to learn about what did joy in learning, joy in science class, actually look like?
Now that we have this definition and we're measuring it with these surveys and then plotting these data points. So again, she's actually putting this up on the screen so kids can actually see this. And what she said was, after the 5th or 6th survey, and she's plotted this and put this up on a screen a few times, the kids are actually getting excited. And they're wanting to see their data. They're wanting to see what the results look like for each survey as she started plotting this. 0:20:30.6 AS: It's funny because when I was a loading supervisor at Pepsi, I started putting up the percent correct for each of the loaders in the warehouse. And I didn't make any comment or anything, I just put it up there. And yeah, people are interested when they start seeing numbers, they start thinking, they start asking questions. 0:20:54.3 JD: Yeah, and you can see too, at a school, in a fourth grade science classroom, you can build in all types of lessons: reading graphs, calculating percentages, when do you use line graphs versus some other type of graph? 0:21:10.2 AS: And why use median versus mean? Because a small amount of data could be distorted if you have a huge outlier. 0:21:18.8 JD: Yep, all kinds of practical lessons. So this brings us to the last lesson for this particular episode. Lesson four is "two or three data points are not a trend," right? So, we've said that you should start plotting the dots as soon as you've decided to collect some type of data that occurs over time. And really, when people ask me what type of data you can put on a run chart or a process behavior chart, almost any data that you're interested in improving in schools unfolds over time, almost all of it. Whether that's a daily cadence, a weekly cadence, monthly, quarterly, yearly, whatever it is, right?
But the problem is the vast majority of data that we look at as educators, and really probably most people, is typically two or maybe three data points. But that doesn't tell you anything about how the data is varying naturally. So when we start thinking about this particular data, we start learning quite a bit. For one, as a teacher, I would have no idea how my kids would evaluate their joy in my classroom. 0:22:29.9 JD: And so I think if I was Jessica, I'd be pretty happy with that, that the sort of average or the median rating is close to 80%; basically each lesson has a rating of eight out of 10. Right. I think a second thing is, let's say we were a school district and we did systematically give our kids some type of survey that assesses their satisfaction with the school. Maybe they do it twice a year or annually. And so at the end of the year, you have two data points, but you don't really have any idea for what to do with that data. You have no idea, if you collected three or four or five data points, what that would look like. And here she is, in just 20 calendar days and a couple of school weeks, she's got 10 data points to work with already. So she's building that baseline of data. So I think what this is to me is just a very different approach to school improvement. 0:23:39.4 JD: And the tools are relatively simple. The ideas are relatively simple. But overall, the takeaway I want for folks is that this project really illustrates a very different approach to school improvement, guided by these sound Deming principles for how to use data, how to understand variation, how to include the people working in the system, right? We've talked about these arbitrary targets throughout this series, and you could see that when Jessica and her class would go to maybe set a goal for joy after collecting some of this data, that goal would be tied to something real. It's tied to actual data from the classroom.
And you can avoid goal setting as an act of desperation when you take this type of approach. 0:24:38.4 AS: Joy in the joy of bringing joy in science. 0:24:44.2 JD: Yeah, it's really all about this process, right? It's the kids getting into this process, that's the psychological part. They're involved in their educational process. And so that is completely different than what's happening in the typical classroom, I think, in the United States. 0:25:00.5 AS: You can imagine somebody not wanting to do this because they're afraid of what they're going to see. 0:25:07.0 JD: Certainly [laughter] 0:25:08.3 AS: Yeah. 0:25:08.7 JD: Certainly. Yeah. Hopefully they would be open to collecting the data and being reflective as a professional. But I could see, maybe that's not always the case. And another question, I kind of shared this project with some folks in different settings, and one of the questions I typically get is, well, what about the science test scores? Like, this is great if kids have joy, I guess is kind of the reaction. But what... How does that impact the academics? 0:25:44.0 JD: And my response is, well, if kids don't find joy in their learning and they're not engaged, what kind of results are you gonna get? [laughter] To me this is part of the process that leads to academic outcomes. When you enjoy the things that you're doing, when you feel like you have some control over a process, maybe not full control, but when you have some control, when you have input into something that you're doing all day long, you're gonna have more investment, because you're seeing that your input has meaning. That's really that psychological component. 0:26:18.2 AS: It's obvious, but maybe not proven. 0:26:21.9 JD: Yeah, I think so. I think so. 0:26:28.4 AS: Okay. [SILENCE] 0:26:30.9 JD: Yeah. I think that's a pretty good spot to wrap up this opener with the...
We covered those first four lessons and started to look at how this project unfolded in Jessica's classroom. And I think next we can see, as she gathered more data, what this looked like over time. And then, once she had that baseline in place, the next thing we'll look at is: what did she do as a change idea or an intervention to try to make these rates go higher in her classroom? 0:27:08.4 AS: That's interesting. I mean, in my wrap-up of this, I think, how lucky is Jessica to have someone from the outside? I think a lot of teachers and a lot of people in business, they don't really have anybody to go to. And the company's not providing that type of stuff, or the school is not providing that. And so you just kind of make it up as you go along. And I think, like probably other listeners and viewers who are listening to this, they're thinking, I wonder if John could help me do that in my area? We all know there's places that we could improve that we may not be. And if a school system can provide that, wow, that's a big... That's exciting. 0:27:56.6 JD: Yeah. I'd be happy to. And it was definitely a mutual effort. Jessica put a lot of work into it. 'Cause she had gone through that fellowship, she had to learn all of these tools and then actually turn around and put them into practice in her classroom. And she found ways to do this in a way where she could still do the things she was required to do, like delivering the lessons that she was required to deliver and those types of things. But then she found ways to incorporate what she learned in the fellowship to make her classroom better. Seeing that, seeing her openness to feedback, that really made this, I think, a mutually beneficial experience. And I think the kids enjoyed it too.
0:28:39.2 AS: And the purpose of this series, too, is the idea of how can you do this at home, and how can you start doing it in your own school, in your own classroom, in your own life? And so I'm looking forward to the next session where we're gonna go deeper into... I've already got a series of questions and things that I'm wondering, and then I saw some tabs in your worksheet that I thought, okay, there's gonna be some more interesting stuff. So I think we're all gonna see you in that next section. 0:29:09.9 AS: And on behalf of everyone at the Deming Institute, I want to thank you again for this discussion and taking the time to go through these steps with us. And for listeners, remember to go to deming.org to continue your journey. You can find John's book, Win-Win: W. Edwards Deming, the System of Profound Knowledge, and the Science of Improving Schools, on Amazon.com. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, and it's particularly apropos: "People are entitled to joy in work."
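For readers who want to try the charting John describes, here is a minimal sketch of the arithmetic behind a run chart and its upgrade to an XmR process behavior chart. The `daily_joy` values are invented stand-ins, not Jessica's actual class data, and the limit formula (mean ± 2.66 × average moving range) is the standard XmR calculation from Wheeler-style process behavior charts, which the episode refers to but does not spell out.

```python
# Minimal sketch of the run chart / process behavior chart arithmetic.
# NOTE: these survey values are invented stand-ins, not the class's real data.
from statistics import mean, median

# Daily class-average "joy" scores: 1-10 survey ratings converted to percents,
# one value per survey day (roughly 2-3 surveys per week).
daily_joy = [75, 80, 78, 83, 67, 79, 81, 77, 82, 80]

# Run chart center line: the median, which resists outliers when you
# only have a handful of points (the reason John gives for using it).
center_line = median(daily_joy)

# With enough data, the run chart becomes an XmR process behavior chart.
# Natural process limits: mean +/- 2.66 * average moving range.
moving_ranges = [abs(b - a) for a, b in zip(daily_joy, daily_joy[1:])]
mr_bar = mean(moving_ranges)           # average moving range
x_bar = mean(daily_joy)                # overall average
upper_limit = x_bar + 2.66 * mr_bar
lower_limit = x_bar - 2.66 * mr_bar

print(f"median center line: {center_line}")
print(f"natural process limits: {lower_limit:.1f} to {upper_limit:.1f}")
```

Plotted with survey dates on the X axis (a simple matplotlib line plot works), this reproduces the picture described in the episode: points bouncing around a median center line, with the limits showing the range of routine variation. A point outside the limits, or a long run on one side of the center line, would signal something worth investigating rather than routine noise.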

Apr 16, 2024 • 46min
System of Profound Wisdom: Awaken Your Inner Deming (Part 20)
Dr. Deming developed his philosophy over time and in conversations with others, not in isolation. As learners, we tend to forget that context, but it's important to remember because no one implements Deming in isolation, either. In this conversation, Bill Bellows and host Andrew Stotz discuss how there's no such thing as a purely Deming organization and why that's good. TRANSCRIPT 0:00:02.2 Andrew Stotz: My name is Andrew Stotz, and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today, I'm continuing my discussions with Bill Bellows, who has spent 30 years helping people apply Dr. Deming's ideas to become aware of how their thinking is holding them back from their biggest opportunities. Today is episode 20, entitled, System of Profound Wisdom. Bill, take it away. 0:00:31.6 Bill Bellows: But not just for 30 years. I forgot to say I started when I was 12. 0:00:36.6 AS: Yes. [laughter] Yes. And you've got the hair to prove it. [laughter] 0:00:43.7 BB: All right. Now, actually, I was thinking about the proposal and the title. I mean, System of Profound Wisdom is cool, System of Profound Questions. Either one of those is good. Let's see which title comes out. 0:00:57.6 AS: Yeah. And I think we'll also have to understand that there may be some listeners that don't even know what System of Profound Knowledge means. They've been listening. They do. But if today's their first episode, we also gotta break that down, just briefly. 0:01:10.9 BB: Yeah. Okay, let's do that. All right. Well, let me give an opening quote from Dr. Deming from chapter three, and then we can explain this SoPK, System of Profound Knowledge, thing. But in chapter three of Dr. Deming's last book, The New Economics, the last edition, edition three, came out in 2018. And in chapter three, Dr. Deming says, "We saw in the last chapter, we are living under the tyranny of the prevailing style of management. Most people imagine that this style has always existed.
It is a fixture. Actually, it is a modern invention, a trap that has led us into decline. Transformation is required. Education and government, along with industry, are also in need of transformation. The System of Profound Knowledge to be introduced in the next chapter is a theory for transformation." So you wanna... 0:02:15.4 AS: That's good. 0:02:16.7 BB: So let's say something. Let's just say something about SoPK. How would you explain that? 0:02:23.1 AS: Yeah. Well, actually, I wanna talk very briefly about what you just said, because it's just... 0:02:27.1 BB: Oh, sure. 0:02:29.6 AS: At one point, I thought, "It's a system of knowledge." But he just said it was a system of transformation. 0:02:38.7 BB: It's a theory for transformation. 0:02:40.1 AS: A theory for transformation. Okay, got it. I see. And one of the things that I... I look at Toyota so much just 'cause it's so fascinating and how they've survived all these years, the continuity in the business, the continuity and the profitability of the business, the continued march to become the number one auto producer in the world, and having faced all the ups and downs and survived. And I just think that what they have is a learning organization. No matter what the challenge is, they're trying to apply learning tools, like System of Profound Knowledge, like PDSA, to try to figure out how to solve this problem. And I think that many companies, including at times my companies, [chuckle] we sometimes will scramble and we'll lose knowledge and we won't gain knowledge. And so the System of Profound Knowledge, to me, is all about the idea of how do we build a base of knowledge in our business and then build upon that base of knowledge rather than destroy it when the new management comes in or when a new management idea comes in. 0:04:00.7 AS: And that's something I've just been thinking about a lot.
Because I do know a company that I've been doing some work with, and they basically threw away a huge amount of work that they did on System of Profound Knowledge and stuff to go with the prevailing system of management, is like going back. And now, they just produced a loss in the first quarter, and I just think, "Interesting. Interesting." 0:04:27.6 BB: Well, a couple things come to mind based on what you said. One is I would propose that Toyota, I'm in agreement of "Toyota's a learning organization." And that'll come up later. I've got some other thoughts on learning organizations. And we know that they were influenced by Dr. Deming. To what degree, I'm not sure of. Shoichiro Toyoda, who is one of the sons of the founder of the Toyota Motor Car Company, was honored with a Deming prize in 1990. And I believe it came from JUSE, as opposed to the American Society for Quality. One or the other. He was honored with a Deming Prize. 0:05:32.0 AS: Yep. 0:05:33.5 BB: Again, I don't know if it's Deming Prize or Deming Medal. But I know he was honored. What's most important, the point I wanna make is, upon receiving it he said, "There is not a day that goes by that I don't think about the impact of Dr. Deming on Toyota." But, if I was to look at the Toyota Production System website, Toyota's Toyota Production System website, which I've done numerous times, I'd be hard-pressed to find anything on that page that I could say, "You see this word, Andrew? You see this sentence, Andrew? You see this sentiment? That's Deming." Not at all. Not at all. It's Taiichi Ohno. It's Shigeo Shingo. I'm not saying it's not good, but all those ideas predate Deming going to Japan in 1950. Taiichi Ohno joined Toyota right out of college as an industrial engineer in 1933, I believe. The Japanese Army, I mentioned in a previous episode, in 1942, wanted him to move from Toyota's loom works for making cloth to their automobile works for making Jeeps. 
This comes from a book that I would highly recommend. Last time we were talking about books. I wanted to read a book, I don't know, maybe 10 years ago. I wanted to read a book about Toyota, but not one written by someone at MIT or university. I didn't wanna read a book written by an academic. I've done that. 0:07:15.1 BB: I wanted to read a book by somebody inside Toyota, get that perspective, that viewpoint. And the book, Against All Odds, the... Wait I'll get the complete title. Against All Odds: The Story of the Toyota Motor Corporation and the Family That Changed it. The first author, Yukiyasu Togo, T-O-G-O, and William Wartman. I have a friend who worked there. Worked... Let me back up. [chuckle] Togo, Mr. Togo, born and raised in Japan, worked for Toyota in Japan, came to the States in the '60s and opened the doors to Toyota Motors, USA. So, he was the first person running that operation in Los Angeles. And it was here for years. I think it's now in Texas. My late friend, Bill Cummings, worked there in marketing. And my friend, Bill, was part of the team that was working on a proposal for a Lexus. And he has amazing stories of Togo. He said, "Any executive... " And I don't know how high that... What range, from factory manager, VPs. But he said the executives there had their use, free use, they had a company car. And he said Togo drove a Celica. Not a Celica. He drove a... What's their base model? Not a... 0:08:56.2 AS: A Corolla? 0:08:57.7 BB: Corolla. Yes, yes, yes. Thank you. He drove a Corolla. He didn't drive... And I said, "Why did he drive a Corolla?" Because it was their biggest selling car, and he wanted to know what most people were experiencing. He could have been driving the highest level cars they had at the time. Again, this is before a Lexus. And so in this book, it talks about the history of Toyota, Taiichi Ohno coming in, Shigeo Shingo's contributions, and the influence of Dr. Deming. 
And there's a really fascinating account how in 1950, a young manager, Shoichiro Toyoda, was confronted with a challenge that they couldn't repair the cars as fast as they could sell them. This is post-war Japan. They found a car with phenomenal market success. Prior to that, they were trying to sell taxicabs, 'cause people could not... I mean, buying a car as a family was not an option. But by 1950, it was beginning to be the case. And the challenge that Shoichiro Toyoda faced was improving the quality, 'cause they couldn't fix them as fast as they could sell them. And yet, so I have no doubt that that young manager, who would go on to become the chairman, whatever the titles are, no doubt he was influenced by Dr. Deming. But I don't know what that means. 0:10:23.4 BB: That does not... The Toyota Production System is not Deming. And that's as evidenced by this talk about eliminating waste. And those are not Deming concepts. But I believe, back to your point, that his work helped create a foundation for learning. But I would also propose, Andrew, that everything I've read and studied quite a bit about the Toyota Production System, Lean, The Machine That Changed The World, nothing in there explains reliability. To me, reliability is how parts come together, work together. 'Cause as we've talked, a bunch of parts that meet print and meet print all over the place could have different levels of reliability, because meeting requirements, as we've talked in earlier episodes, ain't all it's cracked up to be. So I firmly believe... And I also mentioned to you, I sat for 14 hours flying home from Japan with a young engineer who worked for Toyota, and they do manage variation as Dr. Taguchi proposed. That is not revealed. But there's definitely something going on. But I would also say that I think the trouble they ran into was trying to be the number one car maker, and now they're back to the model of, "If we are good at what we do, then that will follow." 
0:11:56.8 BB: And I'm gonna talk later about Tom Johnson's book, just to reinforce that, 'cause Tom, a former professor of management at Portland State University, has visited Toyota plants numerous times back before people found out how popular it was. But what I want to get into is... What we've been talking about the last couple episodes is Dr. Deming uses this term, transformation. And as I shared an article last time by John Kotter, the classic leadership professor, former, he's retired, at the University... Oh, sorry, Harvard Business School. And what he's talking about for transformation is, I don't think, [chuckle] maybe a little bit of crossover with what Dr. Deming is talking about. What we talked about last time is, Deming's transformation is a personal thing that we hear the world differently, see the world differently. We ask different questions. And that's not what Kotter is talking about. And it's not to dismiss all that what Kotter is talking about, but just because we're talking about transformation doesn't mean we mean the same thing. 0:13:10.6 BB: And likewise, we can talk about a Deming organization and a non-Deming organization. What teamwork means in both is different. In a Deming organization, we understand performance is caused by the system, not the workers taken individually. And as a result of that, we're not going to see performance appraisals, which are measures of individuals. Whereas in a non-Deming organization, we're going to see performance appraisals, KPIs flow down to individuals. [chuckle] The other thing I had in my notes is, are there really two types of organizations? No, that's just a model. [chuckle] So, really, it's a continuum of organizations. And going back to George Box, all models are wrong, some are useful. But we talked earlier, you mentioned the learning organization. 
Well, I'm sure, Andrew, that we have both worked in non-Deming organizations, and we have seen people as learners in a non-Deming organization, but what are they learning? [chuckle] It could be learning to tell the boss what they want to hear. They could be learning to hide information that could cause pain. [chuckle] Those organizations are filled with learners, but it's about learning that makes things worse. It's like digging the pit deeper. What Deming is talking about is learning that improves how the organization operates, and as a result, improves profit. In a non-Deming organization, that learning is actually destroying profit. 0:14:51.8 BB: All right. And earlier I spoke of Russ. Russ and Dr. Deming spoke for about three hours in 1992. It got condensed down to volume 21 of The Deming Library, which our viewers, if you're a subscriber to DemingNEXT, can watch in its entirety. All the Deming videos produced by Clare Crawford-Mason are in that. You can see excerpts of volume 21, which I believe is "Theory of a System of Education," and it's Russ Ackoff and Dr. Deming for a half hour. So you can find excerpts of that on The Deming Institute's YouTube channel. 0:15:37.0 BB: And what I wanted to bring up is, in there, Russ explains to Dr. Deming the DIKUW model that we've spoken about in previous episodes, where D is data. That's raw numbers, Russ would say. I is information. When we turn those raw numbers into distances and times and weights, Russ would say that information is what the newspaper writer writes, who did what to whom. Knowledge, the K, could be someone's explanation as to how these things happened. Russ would say that knowledge is what you're using to take apart a car or a washing machine and see how all these things work together. U is understanding. Understanding is when you step back and look at the container.
But understanding is needed to explain why the driver sits on the left versus the right, why the car is designed for a family of four, why the washing machine is designed for a family of four. That's not inside it. That's the understanding-looking-outward piece that Russ would also refer to as synthesis. And then the W, that's the wisdom piece. What do I do with all this stuff? And what Russ is talking about is, part of wisdom is doing the right things right. So, what I wanted to touch upon in this episode is why did Dr. Deming refer to his system as the System of Profound Knowledge? Why not the System of Profound Understanding? Why not the System of Profound Wisdom? And I think, had he lived longer, maybe he would have expanded. Maybe he would have had... 0:17:28.4 BB: And I think that's the case. I think it's... 'Cause I just think... And this is what's so interesting, is, if you look at Dr. Deming's work in isolation and not go off and look at others' work, such as Tom Johnson or Russ, you can start asking questions like this. 0:17:45.7 AS: One thing I was going to interject is that I took my first Deming seminar in 1989, I believe, or 1990. And then I took my second one with Dr. Deming in 1992. And then soon after that, I moved to Thailand and kind of went into a different life, teaching finance and then working in the stock market. And then we set up our factory here for the coffee business. But it wasn't until another 10 years, maybe 15 years, that I reignited my flame for what Dr. Deming was doing. And that's when I wrote my book about Transform Your Business with Dr. Deming's 14 Points. And so I was revisiting the material that had impacted me so much. And I found this new topic called the System of Profound Knowledge. I had never heard of that. And I realized that it really came out fully fledged in 1993, in The New Economics, which I didn't get. I only had Out of the Crisis. 0:18:49.9 BB: '93. 0:18:49.9 AS: Yeah.
And so that just was fascinating, to go back to what was already the oldest teacher I ever had in my life in '92, leave it, come back 10, 15 years later and find out, wait a minute, he added on even more in his final book. 0:19:10.4 BB: Well, Joyce Orsini, who was recruited by Fordham University at the encouragement of Dr. Deming, or the suggestion of Dr. Deming, to lead their Deming Scholars MBA program in 1990. Professor Marta Mooney, professor of accounting, who I had the great fortune of meeting several times, was very inspired by Dr. Deming's work. And she was able to get his permission to have an MBA program in his name called the Deming Scholars MBA program. And when she asked him for a recommendation, "Who should lead this program?" it was Joyce Orsini, who at the time I think was a vice president at a bank in New York. I'm not sure, possibly in human resources, but I know she was in New York as a vice president. 0:20:10.0 BB: And I believe she had finished her PhD under Dr. Deming at NYU by that time. And the reason I bring up Joyce's name: I met her after Dr. Deming had died. Nancy Mann, who was running a company called Quality Enhancement Seminars with, at the beginning, one product, Dr. Deming's 4-day seminar. When Dr. Deming died, and I had mentioned I was at his last seminar in December '93, she continued offering 4-day seminars. And I met her later that year when she was paired with Ron Moen and they were together presenting it, and others were paired presenting it. And at one point, as I got to know Joyce, she said, "His last five years were borrowed time." I said, "What do you mean?" She said he started working on the book in, evidently, the '87, '88 timeframe, when he started to articulate these words, Profound Knowledge. 0:21:11.0 BB: And I know he had, on a regular basis, dinner engagements with friends including Clare Crawford-Mason and her husband. And Clare has some amazing stories of Deming coming by with these ideas.
And once she said, "What is this?" She discreetly took out a napkin and wrote down "an understanding of the difference between intrinsic motivation and extrinsic motivation, the difference between special causes versus common causes." And she just wrote all this stuff down, typed it up. When he showed up the next week, she, this is Clare, greeted him at the door, and he said, "What's that?" She says, "Well, I took notes last week." 0:21:54.2 BB: And he says, "I can do better." [chuckle] And so, week by week by week, as he interacted with the people around him, he whittled it down. And I'm guessing he put it into some... there's a technique for grouping things, where you put ideas on Post-it notes and you come up with four categories and these things all go over here. One of the elements of that, one of the 16, or 18 or so, had to do with Dr. Taguchi's loss function. So that could have gone into maybe the variation piece, maybe the systems piece. But Joyce said, basically, he was frustrated that the 14 Points were essentially kind of a cookbook, where you saw things like "cease dependence on inspection" interpreted as "get rid of the inspectors." And so, I'd say guided by his own production-viewed-as-a-system mindset, he knew that what he was articulating and the feedback were inconsistent. 0:23:01.9 BB: And, "I've gotta keep trying." And she said his last five years on borrowed time, as he was dying of cancer, were just trying to get this message out. So I first got exposed to it in the spring of '90 when I saw him speaking in Connecticut. And I was all about Taguchi, expecting him to... I didn't know what to expect, but I knew what I was seeing and hearing from Dr. Taguchi. When I heard Dr. Deming talk about Red Beads, I didn't know anything about that, common cause and special cause, I didn't know anything about that.
And so for me, it was just a bunch of stuff, and I just tucked it away. But when the book came out in '93, then it really made sense. But I just had to see a lot of the prevailing style of management in the role I had as an improvement specialist, become, [chuckle] a firefighter or a fireman helping people out. 0:24:01.5 AS: I noticed as I've gotten older that, I do start to connect the pieces together of various disciplines and various bits of knowledge to realize, so for instance, in my case, I'm teaching a corporate strategy course right now at the university. Tonight's, in fact, the last night of this particular intake. And my area of expertise is in finance, but now I see the connection between strategy and finance, and how a good strategy is going to be reflected in superior financial performance relative to peers. And of course, I know how to measure that very well. So I can synthesize more and more different areas of things that I know things about, that I just couldn't do when I was younger. So I can see, and he was always learning, obviously. So I can see how he, and also I can also see the idea of, I need bigger principles. I need bigger as you said, theory for transformation. I need, I need to be able to put this into a framework that brings all that together. And I'm still feeling frustrated about some of that, where I'm at with some of that, because I'm kind of halfway in my progress on that. But I definitely can see the idea of that coming later in life as I approach the big 6-0. 0:25:37.3 BB: The big 6-0, [chuckle] Well, but a big part, I mean, based on what you're talking about, it ended up... Previously we spoke about Richard Rumelt's work, Good Strategy/Bad Strategy, and I mentioned that I use a lecture by Richard Rumelt, I think it was 2011 or so. It was right after his book, Good Strategy/Bad Strategy came out. 
He spoke at the London School of Economics, and our listeners can find it if you just do a Google search for Richard Rumelt, that's R-U-M, one M, E-L-T, Good Strategy/Bad Strategy, LSE, London School of Economics. Brilliant, brilliant lecture. And I've seen it numerous times for one of my university courses. And he is like Deming, he doesn't suffer fools. And it finally dawned on me, Deming organizations... if we can use this simple Deming versus non-Deming, or Red Pen versus Blue Pen, and as George Box would say, all models are wrong, some are useful. If we can use that model, I think it's easy to see that what frustrates Rumelt is you've got all these non-Deming companies coming up with strategies without a method. 0:27:00.0 BB: What Rumelt also talks about is, not only do you need a method, but you have to be honest on what's in the way of us achieving this. Again, Dr. Deming would say, if you didn't need a method, why aren't you already achieving the results? And so it just dawned on me, thinking the reason he's so frustrated, and I think that's one word you can use to describe him: if he's talking to senior staff lacking an understanding of Deming's work, then he's getting a lot of bad strategies. And organizations that would understand what Dr. Deming's talking about would greatly benefit from Rumelt's work. They would, one, have the benefit of having an organization that is beginning to understand, or understands, what a transformation guided by Dr. Deming's work is about. And then you could look up and you're naturally inclined to have good or better strategies rather than worse strategies. 0:28:02.2 BB: And then you have the benefit that profit's not the reason, profit is the result of all that. And the next thing I wanna point out is, and I think we talked about it last time, but I just wanted to make sure it was up here, is I've come across recently, and I'm not sure talking with who, but there's this... what's in vogue today? Data-driven decisions.
And again, whenever I hear the word data, I think back to Ackoff's DIKUW model. I think, data-driven? Well, first, Dr. Deming would say the most important numbers are unknown and unknowable. So if you're doing things in a data-driven way, then you're missing the rest of Dr. Deming's theory of management. But why not knowledge-driven decisions? Why not understanding-driven decisions? And beyond that, why not wisdom-driven, right? How long... [laughter] I guess we can... Part of the reason we're doing these, Andrew, is that we'd like to believe we're helping people move in the direction from data-driven decisions to wisdom-driven decisions, right? 0:29:13.1 AS: Yeah. In fact, you even had the gall to name this episode the System of Profound Wisdom. 0:29:24.0 BB: And that's the title. 0:29:24.9 AS: There it is. 0:29:28.9 BB: But in terms of... I'll give you a fun story from Rocketdyne years ago. I was talking with a manager in the quality organization and he says, "You know what the problem is? You know what the problem is?" I said, "What?" He says, "The problem is the executives are not getting the data fast enough." And I said, "What data?" He says, "The scrap and rework data, they're just not getting it fast enough." So I said, "No matter how fast they get it, it's already happened." [laughter] 0:30:00.0 BB: But I just couldn't get through to him that if we're being reactive and talking about scrap and rework, it's already happened. By the time the... If the executives hear it a second later, it's already happened. It's still old news. 0:30:14.7 AS: And if that executive would've been thinking, he would've said, "But Bill, I want to be on the cutting edge of history." 0:30:23.1 BB: Yeah, it's like... 0:30:24.6 AS: I don't want old information, really old. I just want it as new as it can be, but still old. 0:30:32.9 BB: Well, it reminds me of an Ackoff quote. Instead of... It's "Change or be changed."
Ackoff talked about organizations that, instead of being ready for what happens, create what's gonna happen, which would be more of a Deming organizational approach. Anyway, we talked about books last time and I thought it'd be neat to share a couple of books, as I've shared the Against All Odds book about Toyota. 0:31:08.8 AS: Which I'll say is on Amazon, but it looks like it's only available used, and it's priced at about 70 bucks. So I've just... 0:31:16.2 BB: How much? 0:31:16.8 AS: Got that one down? 70 bucks. Because I think you're buying it from someone who has it as their own edition or something. I don't know. 0:31:23.8 BB: It's not uncommon. This is an insider used-book thing. It's not uncommon that you'll see books on Amazon for 70, but if you go to ThriftBooks or AbeBooks, you can... I have found multi-$100 books elsewhere. I don't know how that happens, but it does. Anyway, another book I wanted to reference in today's episode is Profit Beyond Measure, subtitled Extraordinary Results through Attention to Work and People, published in 2000. I don't know if you can get that new; you can definitely get it used. It's written by H. Thomas Johnson. H is for Howard; he goes by Tom, Tom Johnson. Brilliant, brilliant mind. He visited Rocketdyne a few times. 0:32:17.1 BB: On the inside cover page, Tom wrote, "This book is dedicated to the memory of Dr. W. Edwards Deming, 1900-1993. May the seventh generation after us know a world shaped by his thinking." And in the book, you'll find this quote, and I've used it in a previous episode, but for those who may be hearing it first here... Tom's a deep thinker. He, as well as his wife Elaine, they're two very deep thinkers. They've both spoken at Rocketdyne numerous times. But one of my favorite quotes from Tom is, "How the world we perceive works depends on how we think. The world we perceive is the world we bring forth through our thinking."
And again, it goes back to, we don't see the world as it is. We see the world as we are. We hear the world as we are. I wrote a blog for The Deming Institute. If our listeners would like to find it, if you just do a search for Deming blog, Bellows and Johnson, you'll find the blog. And the blog is about the book Profit Beyond Measure. And in there, I said, “In keeping with Myron Tribus' observation that what you see depends upon what you thought before you looked, Johnson's background as a cost accountant, guided by seminars and conversations with Dr. Deming, prepared him to see Toyota as a living system,” right? You talk about Toyota. 0:33:53.9 BB: He saw it as a living system, not a value stream of independent parts. And that was, that's me talking. I mean, Tom talked about Toyota's living system. And then I put in there with the Toyota Production System, people talk about value streams. Well, in those value streams, they have a defect, good part, bad part model that the parts are handed off, handed off, handed off. That is ostensibly a value stream of independent parts 'cause the quality model of the Toyota Production System, if you study it anywhere, is not Genichi Taguchi. It's the classic good parts and bad parts. And if we're handing off good parts, they are not interdependent. They are independent. And then I close with, "instead of seeing a focus on the elimination of waste and non-value added efforts, Johnson saw self-organization, interdependence, and diversity, the three, as the three primary principles of his approach, which he called Management By Means." And so what's neat, Andrew, is he, Tom was as a student of Deming's work, attending Dr. Deming seminars, hearing about SoPK, System of Profound Knowledge, and he in parallel developed his own model that he calls Management By Means. But what's neat is if you compare the two, there's three principles. So he says self-organization. 0:35:31.0 BB: Well, that's kind of like psychology and people. 
So we can self-organize. Interdependence, the other one: we're self-organized, but we're connected with one another. So that's kind of a systems perspective there as well. And the third one, diversity. So when I think of diversity, I think of variation. I can also think in terms of people. So what I don't see in there explicitly is Theory of Knowledge. But Tom's developing this model in parallel with Dr. Deming's work, probably beginning in the early '80s. And part of what Tom had in mind, I believe, by calling it Management By Means, is juxtaposing it with that other "management by," right? You know the other one, Andrew, management by? 0:36:33.8 AS: You mean the bad one or the good one, Management By Objective? 0:36:37.8 BB: Or Management By Results. Or, Dr. Deming once said, MBIR, Management by Imposition of Results. But what's neat is, and this is what I cover in my online courses, Tom is really... it's just such insight. Tom believes in treating the means as the ends in the making. So he's saying that the ends are what happen when we focus on the means, which is like, if you focus on the process, you get the result. But no, with MBIR, as we focus on the result, we throw the process out the window. And so what I've asked students in one of my classes is, why does Tom Johnson believe that treating the means as an ends in the making is a much surer route to stable and satisfactory financial performance than to continue as most companies do? You ready, Andrew? To chase targets as if the means do not matter. Does that resonate with you, Andrew? 0:37:44.1 AS: Yes. They're tampering. 0:37:46.8 BB: Yeah. I also want to quote... I met Tom in 1997. I'm not sure if this... Actually, this article is online and I'll try to remember to post a link to it. If I forget, our listeners can contact me on LinkedIn and I'll send you a link to find the paper. This is when I first got exposed to Tom. It just blew me away.
I still remember there at a Deming conference in 1997, hearing Tom talk. I thought, wow, this is different. So, Tom's paper that I'm referencing is A Different Perspective on Quality, the subtitle, Bringing Management to Life. Can you imagine? “Bringing Management to Life.” And it was in Washington, DC, the 1997 conference. And then Tom says, this is the opening. And so when Tom and his wife would speak at Rocketdyne or other conferences I organized. 0:38:44.0 BB: Tom read from a lectern. So he needed a box to get up there and he read, whereas Elaine, his wife, is all extemporaneous. Both deeply profound, two different styles. So what Tom wrote here is he says, "despite the impression given by my title, Professor of Quality Management, I do not speak to you as a trained or a certified authority on the subject of quality management. I adopted that title more or less casually after giving a presentation to an audience of Oregon business executives just over six years ago. That presentation described how my thinking had changed in the last five years since I co-authored the 1987 book, Relevance Lost, the Rise and Fall of Management Accounting, and the talk which presaged my 1992 book, Relevance Regained." And this is when he... After he wrote, Relevance Lost, he went on the lecture circuit, he met the likes of Peter Scholtes and Brian Joiner, got pulled into the Deming community. 0:39:45.4 BB: And then he wrote this scathing book called Relevance Regained and the subtitle is... I think our audience will love it, From Top-Down Control to Bottom-Up Empowerment. Then he goes on to say, "in that I told how I had come to believe that management accounting, a subject that I had pursued and practiced for over 30 years." Over 30 years, sounds familiar. Then he says, "could no longer provide useful tools for management. I said in essence that instead of managing by results, instead of driving people with quantitative financial targets, it's time for people in business..." 
And this is 30 years ago, Andrew. "It's time for people in business to shift their attention to how they organize work and how they relate to each other as human beings. I suggested that if companies organize work and build relationships properly, then the results that accountants keep track of will what? Take care of themselves." 0:40:50.8 AS: It's so true, it's so true. 0:40:54.1 BB: Yeah, it sounds so literally Tom was writing that in 1999, 2000. Well, actually no, that was 1997, that was 1997, but the same sentiment. 0:41:03.4 AS: It just makes me think of the diagram that we see and that Deming had about the flow through a business, it's the same thing as of the flow from activity to result. 0:41:20.6 BB: Yes. 0:41:21.9 AS: And when we focus on the result and work backwards, it's a mess from a long-term perspective, but you can get to the result. It's not to say you can't get to the result, but you're not building a system that can replicate that. But when you start with the beginning of that process of how do we set this up right to get to that result, then you have a repeatable process that can deliver value. In other words, you've invested a large amount in the origination of that process that then can produce for a much longer time. Um, I have to mention that the worst part of this whole time that we talk is when I have to tell you that we're almost out of time 'cause there's so much to talk about. So we do need to wrap it up, but, yeah. 0:42:09.3 BB: All right. I got a couple of closing thoughts from Tom and then we'll pick this up in episode 21. 0:42:21.3 AS: Yep. 0:42:22.9 BB: Let me also say, for those who are really... If you really wanna know... I'd say, before you read The New Economics... I'm sorry, before you read Profit Beyond Measure, one is the article I just referenced, “Bringing Quality to Life” is a good start. I'd also encourage our readers to do a search. I do this routinely. 
It shouldn't be that hard to find, but look for an article written by Art Kleiner, Art as in Arthur, Kleiner, K-L-E-I-N-E-R. And the article is entitled, Measures... The Measures That Matter. I think it might be What Are The Measures That Matter? And that article was brilliantly written by Kleiner, who I don't think knows all that much about Deming, but he knows a whole lot about Tom Johnson and Robert Kaplan, who together co-authored Relevance Lost and then moved apart. Tom became more and more Deming, Kaplan became more and more non-Deming, and Kleiner finally wrote this article. 0:43:35.6 AS: Is this article coming out in 2002, "What Are The Measures That Matter? A 10-year Debate Between Two Feuding Gurus Shed Some Light on a Vexing Business Question?" 0:43:46.4 BB: That's it. 0:43:47.2 AS: There it is and it's on the... 0:43:47.4 BB: And it is riveting. 0:43:50.8 AS: Okay. 0:43:50.8 BB: Absolutely riveting. Is it put out by... 0:43:54.0 AS: PwC, it looks like and it's under strategy... 0:43:58.5 BB: Pricewaterhouse... 0:43:58.8 AS: Yeah, strategy and business. 0:44:00.2 BB: PricewaterhouseCoopers? Yeah. 0:44:01.3 AS: Yeah. 0:44:03.1 BB: And 'cause what's in there is Kleiner explaining that what Tom's talking about might take some time. You can go out tomorrow, Andrew, and slash and burn and cut and show instant results. Now what you're not looking at is, what are the consequences? And so... But... And then... But Kleiner I think does a brilliant job of juxtaposing and trying to talk about what makes Kaplan's work, the Balanced Scorecard, so popular. Why is Tom so anti that?
Let me just close with this, and this comes from my blog on The Deming Institute about Profit Beyond Measure, and I said, "for those who are willing and able to discern the dramatic differences between the prevailing focus of systems that aim to produce better parts with less waste and reductions to non-value-added efforts," that's my poke at Lean and Six Sigma, "and those systems that capitalize on a systemic connection between parts, Tom's book, Profit Beyond Measure, offers abundant food for thought." The difference also represents a shift from profit as the sole reason for a business to profit as the result of extraordinary attention to work and people, a most fitting subtitle to this book. 0:45:35.9 AS: Well, Bill, on behalf of everyone at The Deming Institute, I want to thank you again for the discussion. And for listeners, remember to go to deming.org to continue your journey. If you wanna keep in touch with Bill, just find him on LinkedIn. This is your host, Andrew Stotz, and I'll leave you with one of my favorite quotes from Dr. Deming, "People are entitled to Joy in work," and I hope you are enjoying your work.

Apr 9, 2024 • 36min
Transforming How We Think: Awaken Your Inner Deming (Part 19)
Explore the transformation of thought processes through Dr. Deming's principles, with insights on vision improvement and system thinking. Discover the role of laughter in the workplace and pitfalls of designing for averages. Delve into Deming's principles for transformative thinking and systemic problem-solving, with references to influential experts and resources for further exploration.

Apr 2, 2024 • 30min
Goal Setting Is Often An Act of Desperation: Part 3
In part 3 of this series, John Dues and host Andrew Stotz talk about the final 5 lessons for data analysis in education. Dive into this discussion to learn more about why data analysis is essential and how to do it right. TRANSCRIPT 0:00:02.4 Andrew Stotz: My name is Andrew Stotz and I'll be your host as we continue our journey into the teachings of Dr. W. Edwards Deming. Today I'm continuing my discussion with John Dues who is part of the new generation of educators striving to apply Dr. Deming's principles to unleash student joy in learning. This is episode 23 and we're talking about goal setting through a Deming lens. John, take it away. 0:00:30.8 John Dues: It's good to be back, Andrew. Yeah, in this first episode of this four-part series, we talked about why goal setting is often an act of desperation. And if you remember early on, I sort of proposed those four conditions that organizations should understand about their systems prior to ever setting a goal. Those four were capability, variation, stability, and then by what method are you going to improve your system? And then in the last episode, I introduced the first five lessons of the 10 key lessons for data analysis. And remember, these lessons were set up to avoid what I call these arbitrary and capricious education goals, which are basically unreasonable goals without consideration of those four things, the system capability, variation, and stability, and then not having a method. So, it might be helpful just to recap those first five lessons. I'll just list them out and folks that want to hear the details can listen to the last episode. 0:01:31.8 JD: But lesson one was data have no meaning apart from their context. So, we've got to contextualize the data. Lesson two was we don't manage or control the data. The data is the voice of the process. So, it's sort of, you know, the data over time shows us what's happening and we don't really have control over that data. 
We do have control over that underlying process. Lesson three was plot the dots for any data that occurs in time order. So, take it out of a two-point comparison or take it out of a spreadsheet and put it on a line chart that shows the data over time. Lesson four was two or three data points are not a trend. So again, get beyond the typical two-point limited comparison this month and last month, this year and last year, this same month, last year, those types of things, this week, last week. 0:02:25.6 JD: And then lesson five was, show enough data in your baseline to illustrate the previous level of variation. So, we want to get a sense of how the data is changing over time and we need a baseline amount of data, whether that's 12 points, 15 points, 20 points, there's sort of different takes on that. But somewhere in that 12-to-20-point range is really the amount of data we want to have in our baseline. So, we understand how it's moving up and down over time sort of naturally. At the outset of those two episodes, we also talked about centering the process behavior chart, like the ones we viewed in many of our episodes. And we put it at the center because it's a great tool for looking at data over time, just like we've been talking about. 0:03:11.4 JD: And I think when we use this methodology, and when you start to fully grasp the methodology, you start to be able to understand messages that are actually contained in the data. You can differentiate between those actual special events, those special causes, and just those everyday ups and downs, what we've called common causes. And in so doing, we can understand the difference between reacting to noise and understanding actual signals of significance in that data. And so, I think that's sort of a good primer to then get into lessons six through 10. 0:03:51.2 AS: Can't wait. 0:03:53.3 JD: Cool. We'll jump in then. 0:03:56.1 AS: Yeah.
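Lessons three through five, plot the dots in time order and show a 12-to-20-point baseline, are easy to sketch in code. This is not from the episode; it is a minimal illustration with a hypothetical `run_chart` helper and made-up weekly attendance figures, rendering the dots as a quick text chart instead of a two-point comparison.

```python
def run_chart(data, width=40):
    """Render a crude text run chart: one row per data point, in time order."""
    lo, hi = min(data), max(data)
    span = (hi - lo) or 1  # avoid division by zero for flat data
    rows = []
    for i, x in enumerate(data):
        pos = int((x - lo) / span * (width - 1))  # scale value to a column
        rows.append(f"{i + 1:>3} | {' ' * pos}*  ({x})")
    return "\n".join(rows)

# A made-up 12-point weekly attendance baseline (percent)
baseline = [91, 93, 92, 94, 90, 92, 93, 91, 92, 94, 91, 93]
print(run_chart(baseline))
```

Even this crude plot shows the natural up-and-down movement across the whole baseline that a this-week-versus-last-week comparison hides.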
I'm just thinking about my goal setting and how much this helps me think about how to improve my goal setting. And I think one of the biggest ones that's missing that we talked about before is by what method. And many people think that they're setting strategy, when in fact, they're just setting stretch targets with nothing under it. And they achieve it by luck or are baffled why they don't achieve it. And then they lash out at their employees. 0:04:31.4 JD: Yeah, there was really... I mean, that goes back to one of those four conditions of goal setting: capability. You have to understand how capable your system is before you can set a goal. It's fine to set a stretch goal, but it has to be within the bounds of the system. Otherwise, it's maybe not an impossibility, but a mathematical improbability. That's not good. Like you're saying, it's not a good way to operate if you're a worker in that system. So, lesson six then, to continue the lessons. 0:05:06.8 JD: So, lesson six is "the goal of data analysis in schools is not just to look at past results, but also, and perhaps more importantly, to look forward and predict what is likely to occur in the future," right? So that's why centering the process behavior chart is so important, because it allows you to interpret data in a way that takes variation into account, allows you to classify the data into the routine, that's common cause variation, or the exceptional, that's special cause variation, and allows us to turn our focus to the behavior of the underlying system that produced the results. And it's this focus on the system and its processes that's then the basis for working towards continual improvement. 0:06:00.6 AS: And I was just thinking about number six, the goal is to predict what is likely to occur in the future. And I was just thinking, what's likely to occur in the future is exactly what's happening now, or the trend that's happening, unless we change something in the system, I guess.
0:06:16.4 JD: Yeah. And that's why just setting the stretch goal is often disconnected from any type of reality, because we have this idea that somehow something magical is going to happen in the future that didn't happen in the past. And nothing magical is going to happen unless we are intentional about doing something differently to bring about that change. 0:06:39.5 AS: And that's a great lesson for the listeners and the viewers. It's like, have you been just setting stretch targets and pushing people to achieve these stretch targets? And not really understanding that your role is to understand that you're going to get the same result unless you start to look at how do we improve the method, the system, that type of thing. 0:07:05.0 JD: Yeah. And usually when you have those stretch goals, you've looked at what happened last year, and then you base the stretch goal on last year. But perhaps you're seeing, for the last three or four years, the data has been steadily decreasing, right? And you can't realize that if you haven't charted that over the last three or four years, hopefully beyond that. So, you have no idea. Or it could have been trending positively, and you may undershoot your stretch goal because you missed a trend that was already in motion because of something that happened in the past. 0:07:44.8 AS: You made a chart for me, a run chart on my intake for my Valuation Masterclass Bootcamp. And we've been working on our marketing, and I presented it to the team, and we talked about how that's the capability of our system. Based upon that, for me to say I want 500 students when we've only been getting 50 is just ridiculous. And that helped us all to see that if we are going to go to the next level of where we want to be, we've got to change what we're doing, the method we're using to get there, the system that we're running and operating, or else we're going to continue to get this output.
And so if the goal is to predict what is likely to occur in the future, if we don't make any changes, it's probably going to continue to be like it is in that control chart. 0:08:42.8 JD: Yeah. And that example is, in a nutshell, the System of Profound Knowledge in action in an organization, where you're understanding variation in something that's important to you, enrollment in your course. You're doing that analysis with the team. So, there's the psychological component. And you're saying, well, what's our theory of knowledge? So, what's our theory for how we're going to bring about some type of improvement? And so, now you're going to run probably something like a PDSA. And so now you have all those lenses of the System of Profound Knowledge that you're bringing together to work on that problem. And that's all it is really in a nutshell. 0:09:22.2 AS: Yeah. And the solution's not necessarily right there. Sometimes it is, but sometimes it's not. And we've got to iterate. Okay. Should we be doing marketing in-house or should we be doing it through an outsourced service? What if we improve and increase the volume of our marketing? What effect would that have? What if we decrease the... What if we change to this method or that method? Those are all things that we are in the process of testing. I think the hardest thing in business, in my opinion, with this is to test one thing at a time. 0:09:58.5 JD: Yeah. 0:09:58.7 AS: I just want to test everything. 0:10:00.4 JD: Yeah. Yeah. I read in Toyota Kata, which I think we've talked about before here, a book about Toyota's improvement process. I read this in the book, so I don't know if this is totally always true, but basically they focus on single factor experiments for that reason. Even in a place as complex and as full of engineers as Toyota, they largely focus on single factor experiments. That way, they can actually tell what it is that brought about the change.
I mean, I'm sure they do other more complicated things. They would have to run a design of experiments and those types of things, but by and large, their improvement process, the Toyota Kata, is focused on single factor experiments for that reason. 0:10:48.1 AS: And what's that movie, the sniper movie where they say, slow is smooth and smooth is fast, or something like that, like slow down to speed up. I want to go fast and do all of these tests, but the fact is I'm not learning as much from that. And by slowing down and doing single factor experiments, trying to think about how we influence the future, is fascinating. 0:11:20.9 JD: Yeah, absolutely. 0:11:22.4 AS: All right. What about seven? 0:11:23.2 JD: Lesson seven. So "the improvement approach depends on the stability of the system under study," and there are really two parts to this. What approach am I going to take if the system is producing predictable results and it's performing pretty consistently, it's stable, there's only common cause variation? And then what happens if you have an unpredictable system? So two different approaches, depending on what type of system you're looking at in terms of stability. So the one thing to recognize in thinking about something like single factor experiments is that it's a waste of time to explain noise, or common cause variation, in a stable system, because there's no simple single root cause for that type of variation. There are thousands or tens of thousands of variables that are impacting almost any metric. And you can't really isolate that down to a single cause. 0:12:17.5 JD: So instead, we don't try to do that in a common cause system that needs improvement. Instead, if the results are unsatisfactory, what we do is work on improvements and changes to the system, right? We don't try to identify a single factor that's the problem.
So what we do then is we work to improve a common cause process or system by working on the design of that actual system, including the inputs and throughputs that are a part of it. And to your point, based on your content knowledge of that area, or maybe by bringing in a subject matter expert, you start to think about what's going to make the biggest difference. And then you start testing those things one at a time, basically. That's sort of the approach. And then if you're working in an unpredictable system, and that unpredictable system is unpredictable because it has special causes in your data, then it's really a waste of time to try to improve that particular system until it's stable again. And at that point, there is something so different about the special cause data that you try to identify the single cause behind one or two of those data points. And then once you've identified it, you study it, and then you try to remove that specific special cause. And if you've identified the right thing, what happens then is it becomes a stable system at that point, right? 0:13:51.9 AS: I was thinking that there's no sense in trying to race your boat if you've got a hole in it. You've got to fix the special cause, the hole, and then focus on, okay, how do we improve the speed of this boat? 0:14:06.5 JD: And the key is recognizing the difference between these two roadmaps towards improvement. And I think in education for sure, there's a lot of confusion, a lot of wasted effort, because there's really no knowledge of this approach to data analysis. And so people do their own things. There's a mismatch between the type of variation that's present and the type of improvement effort being undertaken. I think the most typical thing is there's a common cause system, and people think they can identify a single thing to improve. And then they spend a lot of time and money on that thing.
And then it doesn't get better over time because it was the wrong approach in the first place. 0:14:55.9 AS: Number eight. 0:14:57.6 JD: Number eight. So, number eight is, "more timely data is better for improvement purposes." So we've talked about state testing data a lot. It's only available once per year. Results often come after students have gone on summer vacation. So, it's not super helpful. So, we really want more frequent data so that we can understand if some type of intervention that we're putting in place has an effect. I think the most important thing is that the frequency of the data collection needs to be in sync with the improvement context. So, it's not always that you need daily data or weekly data or monthly data, or quarterly data, whatever it is. It's just that it has to be in sync with the type of improvement you're trying to bring about. And no matter what that frequency of collection is, the other big thing to keep in mind is don't overreact to any single data point, and again, I see that over and over in my work. I think ultimately the data allows us to understand the variation and the trends within our system, whether that system is stable or unstable, and then what type of improvement effort would be most effective. And, again, in my experience, just those simple things are almost never happening in schools. Probably in most sectors. 0:16:25.9 AS: Can you explain a little bit more about in sync with the improvement process? Like, maybe you have an example of that so people can understand. 0:16:34.2 JD: Well, yeah. So, you mean the frequency of data collection? 0:16:39.0 AS: Yeah. And you're saying, yeah, this idea of like, what would be out of sync? 0:16:44.7 JD: Well, one, you need to... A lot of times what happens is there might be a system in place for collecting some type of data. Let's say, like, attendance. They report student attendance on the annual school report card.
So, you get that attendance rate, but that's like the state test scores. Like, it's not that helpful to get that on the report card after the year has concluded. But the data is actually available to us in our student information system. And so, we could actually pull that at a different frequency and chart it ourselves, and not wait on the state report card that has attendance... 0:17:27.5 AS: Because attendance is happening on a daily basis. 0:17:31.0 JD: Happening on a daily basis. So, daily would be pretty frequent, but if we wanted to collect the data daily, we certainly can do that. That could help us see patterns in the data on certain days of the week. That could be something that goes into our theory for why our attendance is lower than we'd want it to be. You could do it weekly if the daily collection is too onerous on whoever's being tasked with doing that. Weekly data builds pretty quickly; twelve points would take you 12 weeks. But in 12 weeks, you have a pretty good baseline of what attendance is looking like across this particular school year. So I think when you're talking about improvement efforts, something daily, something weekly, I think that's the target, so that you can actually try some interventions along the way. And... 0:18:29.3 AS: And get feedback. 0:18:31.1 JD: And get feedback. Yeah, yeah. And you could also peg it to something that's further out. And you could see over time if those interventions that are impacting more short-term data collection are actually impacting stuff on the longer term as well. 0:18:49.1 AS: And I guess it depends also on what the priority of this is. Let's say that attendance is not a big issue at your particular school. Therefore, we look at it on a monthly basis and we look to see if something significant is happening. But otherwise, we've got to focus on another idea.
And if attendance becomes an issue, we may go back to daily and say, is it a particular day of the week? Or is it something else? What can we learn from that data? 0:19:20.0 JD: Yep, that's exactly right. And then the next step, in lesson nine, and this is why the charts are so important, is that you can clearly label the start date for an intervention directly on the chart. So, what you want to do is, once you've chosen an intervention or a change idea, you clearly mark that in your process behavior chart. I just use a dashed vertical line on the date the intervention is started and also put a simple label that captures the essence of that intervention. So, that's right on the chart. So, I can remember what I tried or started on that particular day. And then that allows the team to easily see, because you're going to continue adding your data points, the stuff that comes after the dotted line. It becomes pretty apparent, based on the trends you're seeing in the data, if that intervention is then working, right? 0:20:21.2 JD: If it's attendance, I may try a weekly call to parents to tell them what their individual child's attendance rate is. And then we can see, once we started making those weekly calls, over the next few weeks, does that seem to be having an impact on attendance rates? And then I can actually see too, we've talked about the patterns in the data, there are certain patterns I'm looking for to see if there's a significant enough change in that pattern to say, yeah, this is a signal that this thing is actually working. So, it's not enough just because it increased; that attendance rate could go up, but that in and of itself isn't enough. I want to see a signal. And by signal, I mean a specific pattern in the data, a point outside the limits.
0:21:17.3 JD: I want to see eight points in a row above the central line, in the case of attendance, or I want to see three out of four that are closer to a limit, the upper limit, than they are to that central line. And again, we've talked about this before, those patterns are so mathematically improbable that I can be reasonably assured, if I see them, that an actual change has occurred in my data. And because I've drawn this dotted line, I can tie the time period of the change back within that dataset to determine if something positive happened after I tried that intervention. 0:21:56.7 AS: You just think about how many cycles of improvement and interventions you can do in a system, and how far along you will be a year later. 0:22:12.3 JD: Yes, yeah. And "cycles" is exactly the right word, because, I didn't mention it here, but at the point you draw that vertical line and run an intervention, you're going to do that through the PDSA cycle, the Plan-Do-Study-Act cycle. So that's your experiment, where you're testing one thing to see what impact it has on the data. So if I were going to boil continual improvement per Dr. Deming down to two things, it's: put your data on a process behavior chart, and combine it with a PDSA to see how to improve that data. And that's continual improvement in a nutshell, basically, those two tools. 0:22:51.7 AS: Gold, that's gold. All right. Number 10. 0:22:55.3 JD: Last one, lesson 10, "the purpose of data analysis is insight." So this comes from Dr. Donald Wheeler, but he basically just teaches us that the best analysis is the simplest analysis which provides the needed insight. What he would say is plot the dots first on a run chart. Once you have enough data, turn it into a process behavior chart. And that's the most straightforward method for understanding how our data is performing over time.
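The limits and signal rules John describes (a point outside the limits, eight in a row on one side of the central line, three of four closer to a limit than to the central line) can be sketched for an XmR (individuals) process behavior chart. This is not from the episode; the attendance data are made up, and 2.66 is the standard XmR constant for deriving natural process limits from the average moving range.

```python
def xmr_limits(data):
    """Central line and natural process limits for an XmR chart."""
    center = sum(data) / len(data)
    # Average moving range between consecutive points
    mr_bar = sum(abs(b - a) for a, b in zip(data, data[1:])) / (len(data) - 1)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def signals(data, center, lower, upper):
    """Flag the improbable patterns that count as signals, not noise."""
    found = []
    # Rule 1: a single point outside the limits
    for i, x in enumerate(data):
        if x < lower or x > upper:
            found.append((i, "outside limits"))
    # Rule 2: eight in a row on the same side of the central line
    for i in range(len(data) - 7):
        w = data[i:i + 8]
        if all(x > center for x in w) or all(x < center for x in w):
            found.append((i, "eight in a row one side of center"))
    # Rule 3: three of four consecutive points closer to a limit than to center
    for i in range(len(data) - 3):
        w = data[i:i + 4]
        hi_n = sum(1 for x in w if x > center + (upper - center) / 2)
        lo_n = sum(1 for x in w if x < center - (center - lower) / 2)
        if hi_n >= 3 or lo_n >= 3:
            found.append((i, "three of four near a limit"))
    return found

baseline = [91, 93, 92, 94, 90, 92, 93, 91, 92, 94, 91, 93]  # made-up weekly %
center, lower, upper = xmr_limits(baseline)
print(signals(baseline, center, lower, upper))  # → [] (stable system: noise only)
```

In practice you would compute the limits from the baseline only, mark the intervention date, and then run the rule checks over the points that come after it.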
And so this approach, I think, is much more intuitive than storing the data in tables, and the patterns become much more apparent because we're using these time sequence charts. And again, I know I've said this before, but I keep repeating it because I think it's the essence of continual improvement to do those two things. Yeah. 0:23:47.1 AS: And what's the promise of this? If we can implement these 10 points that you've highlighted in relation to goal setting, what do you think is going to change for me? I mean, sometimes I look at what you've outlined and I feel a little bit overwhelmed, like, God, that's a lot of work. I mean, can I just set the freaking goal and people just do it? 0:24:13.2 JD: Yeah. Well, I think this is, in essence, a better way. I mean, this is really the wrap-up here: one, when you understand the variation in your chart, you actually understand the story, the true story that's being told by your data. And so many people don't understand the true story. They sort of make it up, that's too strong, but they don't have the tools to see what's actually happening in their system. So if you really want to see what's happening in your system, this is the way to do it. That's one thing. I think it also... I tried many, many things before I discovered this approach, but I didn't have any way to determine if something I was trying was working or not. 0:25:07.1 JD: I didn't have any way to tie the intervention back to my data. So what most people then do is tell the story that this thing is working if they like it. And if they don't want to do it anymore, they tell the story that it's not working, but none of it's actually tied to scientific thinking, where I tie the specific point I try something to my data. So that's another thing. I can actually tell if interventions are working or not, or can have a... I always try not to use definitive language.
Scientifically, I have a much better likelihood of knowing whether an intervention is working or not. 0:25:47.7 JD: So I think the process behavior chart, and the way of thinking that goes with the chart, is probably the single most powerful tool that we can utilize to improve schools. And we can teach this to teachers. We can teach this to administrators. We can teach this to students; students can learn how to do this. 0:26:07.1 AS: Yeah. And I think one of the things I was thinking about is start where you have data. 0:26:12.3 JD: Yeah. Start where you have data. 0:26:14.2 AS: Don't feel like you've got to go out there and go through a whole process of collecting all this data and all that. Start where you have data. And even if attendance is not your major issue, let's say, but you had good attendance data, it's a good way to start to learn. And I suspect that you're going to learn a lot as you start to dig deeper into that. And then that feeds into, I wonder if we could get data on this and that to understand better what's happening. 0:26:41.4 JD: There are so many applications, so many applications. I mean, even just today, we were talking about how a hundred percent of our students qualify for free and reduced lunch, because we have a school-wide breakfast and lunch program. And so we get reimbursed for the number of meals that are distributed. And sometimes there's a mismatch between the number that are distributed and the number we order, just because of attendance and transportation issues and things like that. But the federal government only reimburses us for the meals we actually distribute to kids. And so if we over order, we have to pay out of our general fund for those meals that we don't get reimbursed for. And so, I'm just bringing this up because we were looking at some of that data just today, that mismatch, and even an area as simple as that is ripe for an improvement project. 0:27:40.7 JD: Why is there a mismatch?
What is happening? And prior to having this mindset, this philosophy, I would just say, well, they just need to figure out how to get the numbers closer together. But you actually have to go there, watch what's happening, and come up with a theory for why we're ordering more breakfasts and lunches than we're passing out. It could be super, super simple: maybe no one ever told the person distributing the lunches that we get reimbursed this way, and so they didn't know it was a big deal. I don't know whether that's the case or not, right? That's purely speculation. Or it could be, oh, we want to make sure every kid eats, so we significantly over order each day. Well, that's a good mindset, but maybe we could back that off to make sure we never... We're always going to have enough food for kids to eat, but we're also not going to spend lots of extra money paying for lunches that don't get eaten. So there are all different things; even something operational like that is ripe for an improvement project. And the great thing is, if you can study that problem and figure out how to save that money, which by the end of the year could be thousands of dollars, you could reallocate that to field trips or class supplies or to books for the library or art supplies, whatever, you know? So that's why I think this methodology is so powerful. 0:29:02.1 AS: Fantastic. That's a great breakdown of these 10 points. So John, on behalf of everyone at the Deming Institute, I want to thank you again for this discussion. And for listeners, remember to go to deming.org to continue your journey. And you can find John's book, Win-Win: W. Edwards Deming, the System of Profound Knowledge, and the Science of Improving Schools, on Amazon.com. This is your host, Andrew Stotz. And I'll leave you with one of my favorite quotes from Dr. Deming, "People are entitled to joy in work."

Mar 26, 2024 • 35min
Organizations are Holograms: Awaken Your Inner Deming (Part 18)
Bill Bellows, an expert in the teachings of Dr. W. Edwards Deming, discusses seeing organizations as holograms and using the System of Profound Knowledge to identify transformation opportunities. Topics include organizational culture shifts, strategic KPI approaches for innovation, measuring progress through small indicators, and promoting teamwork through a holistic view of organizations.