CLICK HERE to request a free copy of the preface and chapter summary of my book…

DATA SANITY: A Quantum Leap to Unprecedented Results

What they’re saying about Davis…

"WOW! A must attend for everyone! Totally changed my thinking on analysis of data and making it truly meaningful."

"Very dynamic speaker!"

"This session was awesome! Presenter was very engaging and simplified what people fear as a complex topic. I consider myself good at statistics but am always looking for away to do better and teach others. My two key take aways were a better understanding of the moving range and the calculation of the control limits. One of the most succinct and useful presentations on improving data reporting that I have ever attended. Thank you."

"I have heard Davis speak many times - message is excellent!"

Listings for Category: Articles

A Statistician’s Favorite Answer: ‘It Depends,’ Part 1

Quality improvement people sure love those tools. A particular favorite, of course, is the control chart, of which, I think, seven are usually taught. Two questions I’m always asked are, “Which chart do I use for which situation?” and “When and how often should I recalculate my limits?”

Wrong questions!

Regarding the first (we’ll deal with the second question in part 2), I’ve seen many flowcharts in books to help you determine which chart to use for which situation. I find them far too confusing for the average user. (They even give me sweaty palms.) I don’t even teach this in my work.
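As a concrete companion to the moving range and control-limit calculation mentioned in the testimonial above, here is a minimal sketch of the individuals (XmR) chart limits, assuming the standard 2.66 × average moving range formula; the data values are invented purely for illustration.

```python
# Minimal sketch: natural process limits for an individuals (XmR) chart.
# The 2.66 and 3.268 constants are the standard XmR scaling factors;
# the data values below are invented purely for illustration.

values = [42, 45, 39, 47, 44, 41, 48, 43, 46, 40]

# Moving ranges: absolute differences between consecutive points
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]

mean = sum(values) / len(values)
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Natural process limits for the individual values
upper_limit = mean + 2.66 * mr_bar
lower_limit = mean - 2.66 * mr_bar

# Upper limit for the moving ranges themselves
mr_upper_limit = 3.268 * mr_bar

print(f"Mean: {mean:.2f}")
print(f"Average moving range: {mr_bar:.2f}")
print(f"Natural process limits: {lower_limit:.2f} to {upper_limit:.2f}")
print(f"Moving range upper limit: {mr_upper_limit:.2f}")
```

Running the sketch prints the centerline and natural process limits; a point outside those limits would be a candidate special cause worth investigating, while points inside them are routine variation.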

Uh-oh… Time for the (Dreaded?) Third Quarter Review Meeting

You know what the third-quarter review meeting means: a packet will be handed out with bar graphs and, no doubt, trend lines on each of about a zillion “key performance indicators” that show:

• This month vs. last month vs. 12 months ago (maybe year-to-date as well)
• The three months’ performance of the current quarter
• The first three quarters of the year
• This quarter vs. last quarter vs. third quarter a year ago

Time to Lose the 10-Minute Overview

I attended a talk in 2006 given by a world leader in quality that contained a bar graph summary ranking 21 U.S. counties from best to worst (see figure 1). The counties were ranked from 1 to 21 for 10 different indicators, and these ranks were summed to get a total score for each county (e.g., minimum 10, maximum 210, average 110; smaller score = better). Data presentations such as this usually result in discussions where terms like “above average,” “below average,” and “who is in what quartile” are bandied about. As W. Edwards Deming would say, “Simple… obvious… and wrong!” Any set of numbers needs a context of variation within which to be interpreted.
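To make the scoring arithmetic concrete, here is a minimal sketch of the rank-sum calculation described above. The ranks are generated at random purely for illustration; the point is that the best, worst, and average totals (10, 210, and 110) follow from the setup itself, not from anything the counties did.

```python
# Minimal sketch of the rank-sum scoring described above: 21 counties,
# each ranked 1-21 on 10 indicators, with the ranks summed per county.
# The ranks are assigned at random purely to illustrate the arithmetic.
import random

random.seed(0)
n_counties, n_indicators = 21, 10

# For each indicator, assign the ranks 1..21 to the counties in a random order
scores = [0] * n_counties
for _ in range(n_indicators):
    ranks = random.sample(range(1, n_counties + 1), n_counties)
    for county, rank in enumerate(ranks):
        scores[county] += rank

print(f"Best possible total:  {1 * n_indicators}")            # 10
print(f"Worst possible total: {n_counties * n_indicators}")   # 210
print(f"Average total: {sum(scores) / n_counties:.0f}")       # always 110
print(f"Simulated totals, sorted: {sorted(scores)}")
```

By construction the totals always average 110, so roughly half the counties must come out “below average” no matter how well all of them perform; the labels say nothing about whether the differences between counties are meaningful.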

Are You Becoming a ‘Qualicrat’?

During my recent travels, I have noticed an increasing tendency toward formalizing organizational quality improvement (QI) efforts into a separate silo. Even more disturbing is an increasing (and excruciating) formality. Expressions such as “saving dark-green dollars” are creeping into justifications for such “programs,” usually referred to as Six Sigma, lean, or lean Six Sigma. As always, Jim Clemmer pinpoints this trend perfectly:

“The quality movement [has given] rise to a new breed of techno-manager—the qualicrat. These support professionals see the world strictly through data and analysis, and their quality improvement tools and techniques. While they work hard to quantify the ‘voice of the customer,’ the face of current customers (and especially potential new customers) is often lost. Having researched, consulted, and written extensively on quality improvement, I am a big convert to, and evangelist for, the cause. But some efforts are getting badly out of balance as customers, partners, and team members are reduced to numbers, charts, and graphs.”

Given Two Numbers, Only One Can Be Larger

Customer satisfaction data resulting in various quality indexes abound. The airline industry is watched particularly closely. The April 10 Quality Digest Daily had an article with the title “Study: Airline Performance Improves” and the subtitle “Better on-time performance, baggage handling, and customer complaints.”

The analysis method? In essence, a bunch of professors pored over some tables of data and concluded that some numbers were bigger than others… and gave profound explanations for the (alleged) differences. If I’m not mistaken, W. Edwards Deming called this “tampering”: they treated all differences (variation) as special cause.
