Stop the self-sabotage and help executives understand simple variation
In 2006 I attended a talk by a world leader in quality that included a bar-graph summary ranking 21 U.S. counties from best to worst (see figure 1). Each county was ranked from 1 to 21 on 10 different indicators, and these ranks were summed to produce a total score for each county (minimum possible 21, maximum 210, average 110; a smaller score is better). Data presentations such as this usually result in discussions where terms like “above average,” “below average,” and “who is in what quartile” are bandied about. As W. Edwards Deming would say, “Simple… obvious… and wrong!” Any set of numbers needs a context of variation within which to be interpreted.
| Rank sum | County |
|---------:|-------:|
| 42 | 1 |
| 76 | 2 |
| 84 | 3 |
| 87 | 4 |
| 92 | 5 |
| 99 | 6 |
| 101 | 7 |
| 102 | 8 |
| 105 | 9 |
| 105 | 10 |
| 107 | 11 |
| 108 | 12 |
| 112 | 13 |
| 113 | 14 |
| 114 | 15 |
| 121 | 16 |
| 128 | 17 |
| 131 | 18 |
| 145 | 19 |
| 157 | 20 |
| 181 | 21 |
Figure 1: Summary of 21 U.S. counties
I asked for the original data (i.e., the individual sets of rankings for each of the 10 characteristics), and the presenter was kind enough to supply it. My analysis showed there was one “above average” county (No. 21) and one “below average” county (No. 1). Counties 2–20 were indistinguishable. (If you’re interested in the statistics involved, you can view them here.)
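For the curious, the gist of such an analysis can be sketched as an Analysis of Means on the rank sums. Under the hypothesis that all 21 counties are alike, each county's total is the sum of 10 ranks drawn from 1 to 21, which has a known mean (110) and standard deviation; rank sums falling outside Bonferroni-adjusted limits signal a special cause. The sketch below is my reconstruction of that logic, not necessarily the exact method used in the original analysis:

```python
from math import sqrt
from statistics import NormalDist

# Rank sums from figure 1 (21 counties, 10 indicators, ranks 1..21 each)
rank_sums = [42, 76, 84, 87, 92, 99, 101, 102, 105, 105, 107,
             108, 112, 113, 114, 121, 128, 131, 145, 157, 181]

n, k = 21, 10                       # counties, indicators
mean = k * (n + 1) / 2              # expected rank sum if no county differs: 110
sd = sqrt(k * (n**2 - 1) / 12)      # sd of a sum of k uniform ranks, about 19.1

# Bonferroni-adjusted two-sided limits for 21 simultaneous comparisons
z = NormalDist().inv_cdf(1 - 0.05 / (2 * n))
lower, upper = mean - z * sd, mean + z * sd

special_causes = [s for s in rank_sums if s < lower or s > upper]
print(f"Limits: [{lower:.1f}, {upper:.1f}]  Special causes: {special_causes}")
```

Only the rank sums 42 and 181 fall outside the limits (roughly 52 to 168), which is why the 19 counties in between are statistically indistinguishable from one another.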
I then shared my analysis with him. Our e-mail correspondence follows:
World quality leader: A subtle issue you did not tackle is the political-managerial issue of communicating such insights to [the two special-cause counties] and the counties that thought they were “different” but, statistically, aren’t. I wonder what framework one could use to approach that psychological challenge?
Balestracci: As I say to my audiences, “Hey, I’m just the statistician, man!” I think the issue is how people and leaders like you are going to facilitate these difficult conversations. This is the leadership that quality gurus keep alluding to and seems to be in very short supply.
My job is to keep you all out of the “data swamp,” but I would be a willing participant. I would love to pilot some of these analyses with you or other leaders. We need to figure out what this process should be. This is potentially very exciting and could quantum-leap the quality improvement movement.
My point is that this “language” needs to be a fundamental piece of any improvement process and led by leaders who understand it and are promoted into leadership positions only if they understand it. If this could become culturally inculcated, then the rampant shoot-from-the-hip analyses and resulting defensiveness would stop, period. The discussion would then focus, as it should, on process. We need new conversations, and this could be a key catalyst.
World quality leader: Nope. I don’t buy it. Yes, I am a leader and need to carry the message. But I know you too well to let you off the hook. I’d love to see you try to lead these conversations and experiment with approaches. You’re a leader, too.
Balestracci: Give me an opportunity, and I will do my best to lead that conversation. Have you fathomed the potential of this?
Real root causes?
That last e-mail of mine was never answered. I'm still waiting for the promised opportunity; I reminded him every once in a while but have since given up. During the past four years, further e-mails from me have gone unanswered. At his insistence, I even sent the analysis, with explanation, to the executive group that originally collected and summarized the data. No reply.
Many of this example’s statistical principles are ones Deming demonstrated during his seminars. After more than 20 years of trying to teach similar concepts, I am still amazed at the abject cowardice (yes, cowardice) of, and fierce resistance from, (alleged) leaders who abdicate the responsibility to comprehend the power of a simple understanding of variation. As many of us know, Deming had zero tolerance for such ignorance or arrogance.
Let me tie this reaction to the current hot topic of root cause analysis. An excellent article by John Dew, “The Seven Deadly Sins of Quality Management” (Quality Progress, 2003), examines the true root causes of quality problems. They are entrenched in a “quality as a bolt-on” culture, of which the correspondence above is symptomatic. These root causes include:
1. Placing budgetary considerations ahead of quality
2. Placing schedule considerations ahead of quality
3. Placing political considerations ahead of quality
4. Being arrogant
5. Lacking fundamental knowledge, research, or education about improvement
6. Pervasively believing in entitlement
7. Practicing autocratic behaviors that result in “endullment” rather than empowerment
Regarding items 4 and 5, I believe quality professionals have made huge strides in speaking the language of senior management. Maybe too well, in fact: I’m seeing an increasing emphasis on “bottom-line results.” In many organizations, senior management still does not know the fundamental lessons of quality and, frankly, shows no interest in learning them beyond insisting, “Get to the punchline and give me the 10-minute overview and the bottom-line results.”
Promotions self-perpetuate the status quo. Could it be that few quality managers make it into senior management positions because senior management does not really believe in quality concepts?
Am I the only one who sees the potential implications of this simple example?
Mark Graham Brown, a balanced scorecard and measurement expert, thinks that 50 percent of executive meetings where data are involved are a waste of time—as is middle management spending an hour a day poring over useless operational data. (Put that into a dollar figure.)
Why is it that the only people who truly don’t seem to get it, or want to get it, tend to:
- Look at tables of raw data and draw circles around numbers they don’t like
- Look at data summarized by smiley faces, bar graphs, trend lines, and traffic lights
- Compare a number to an arbitrary goal and throw a tantrum
- Brag about reading the latest airport best-seller, leadership-fad book
Sigh. Passionate lip service continues to be alive and well.
What can you do?
Herein lies the opportunity for quality professionals: earning the respect we deserve by bringing “data sanity” to organizations, which would free up precious time to consider and make quality an organizational “build-in.” People in quality must stop seeing themselves as victims or being complacent because they are “so busy.” Activity is not impact. (See my 2009 column on this subject here.)
Join me and watch like a hawk for opportunities to convert everyday executive data presentations into this “funny statistical way” of doing things. This will keep you from doing yet another self-sabotaging seminar simulating Deming’s red bead experiment. We need to stop whining that people “just don’t get it” and think more formally about how to stop boring execs to death.
Getting mad and focusing that energy wouldn’t hurt, either.