Thursday, November 12, 2009

Measurement-Context is Important

Fellow blogger and Leadership Change Agent Brad Kolar posted last week about something that got me thinking. Brad shared an experience with a team that had data showing a tremendous productivity gain from an improvement. However, Brad got the team to look at the data a different way, which showed that the improvement was not nearly as dramatic and that more work needed to be done on adoption. Read Brad's full post here.

I have had all kinds of experiences similar to Brad's story. The most recent one occurred this week. I was doing some coaching with a friend who is also a Six Sigma Green Belt. He asked me to review his data and help him present a graphic and set of statistics showing the improvement in performance of a process he has been working on for several months. He had done most of the work ahead of time, so there was very little for me to add, but one thing struck me during our discussion. He was comparing process performance in September with process performance in October, since those represented his before and after improvement conditions.

On the surface, this seemed a valid comparison, until you consider that the process interacts with outside weather conditions. So I asked the question: were September and October different or the same in terms of outside weather? Turns out September was warmer and drier than October, making the comparison suspect. So what to do? We looked for a month in the before period with comparable weather conditions and ran the analysis on that month instead. Turns out March was very similar to October, so that became our before comparison month.
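The "find a comparable before month" step can be sketched in a few lines. This is a minimal illustration with entirely made-up weather numbers (the month names and values are hypothetical, not the actual project data): summarize each month's weather, then pick the before-period month whose profile sits closest to October's.

```python
# Hypothetical monthly weather summaries: (mean temp in deg F, total rain in inches).
# All numbers are invented for illustration only.
weather = {
    "Mar": (48.0, 3.1),
    "Apr": (55.0, 3.6),
    "May": (63.0, 4.0),
    "Sep": (68.0, 2.2),  # the original "before" month: warmer and drier
    "Oct": (49.0, 3.0),  # the "after" month
}

def distance(a, b):
    """Euclidean distance between two (temp, rain) profiles."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

target = weather["Oct"]
# Search only the before-period months (exclude the study months themselves).
candidates = {m: v for m, v in weather.items() if m not in ("Sep", "Oct")}
baseline = min(candidates, key=lambda m: distance(candidates[m], target))
print(baseline)  # "Mar" for this made-up data
```

The distance measure here is deliberately crude; the point is only that the baseline month is chosen on weather similarity rather than calendar convenience.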

Completing the analysis showed us that the process had indeed improved in terms of both variation and centeredness. My friend was on solid ground to claim improved performance. Just goes to show: you have to think hard about your data and question the validity of your assumptions if your conclusions are going to stand up to scrutiny and convince others that your improvement is real.
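To make the "variation and centeredness" check concrete, here is a minimal sketch with invented measurements (the numbers, the target value, and the units are hypothetical, not my friend's actual data): compare each period's mean against a process target and compare the spreads.

```python
import statistics as st

# Hypothetical process measurements (say, cycle time in minutes).
# "before" is the weather-matched March data, "after" is October.
before = [21.4, 19.8, 23.1, 20.5, 24.0, 18.9, 22.7, 21.0]
after = [20.1, 19.7, 20.4, 19.9, 20.6, 20.0, 20.3, 19.8]

target = 20.0  # assumed process target, for the centeredness check
for name, data in (("before", before), ("after", after)):
    mean = st.mean(data)
    sd = st.stdev(data)
    print(f"{name}: mean={mean:.2f} "
          f"(off target by {abs(mean - target):.2f}), stdev={sd:.2f}")
```

In this made-up data the after period is both closer to target and less variable, which is the pattern of a genuine improvement. A real analysis would also confirm the shift with an appropriate hypothesis test.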


Brad said...

Hey Jim,

Great post. So many people use data without context. It's like they forget that there is an entire world outside of their data that the data needs to fit into.

Another type of context I think people miss is the context of what an analysis or statistic means. The biggest error is confusing statistical significance with importance. People often claim victory for a "significant" change in their results even when that change makes no practical difference to the business.

I remember a stats class in college. We had to run a multiple regression. One of my friends got a result that had an r-squared of .23. He was bragging about his result. I challenged him that he really didn't find anything useful. He said, "But my finding was significant to .001" I said, "Exactly, you know with a high degree of certainty that you can't explain over 75% of the variance in this data!"
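That anecdote has a tidy arithmetic core. For a simple regression, the t statistic for the slope is t = r * sqrt(n - 2) / sqrt(1 - r^2), so with r-squared fixed at .23, "significance" grows with sample size while the unexplained 77% of variance never shrinks. A quick sketch (the sample sizes are arbitrary choices for illustration):

```python
import math

# r-squared of .23, as in the classroom story; r is its square root.
r_squared = 0.23
r = math.sqrt(r_squared)

for n in (10, 50, 500):
    # t statistic for testing whether the regression slope is zero
    t = r * math.sqrt(n - 2) / math.sqrt(1 - r_squared)
    print(f"n={n}: t={t:.1f}, variance unexplained={1 - r_squared:.0%}")
```

With enough observations the t statistic becomes enormous and the p-value microscopic, yet 77% of the variance remains unexplained either way, which is exactly the significance-versus-importance trap.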

Programs like Excel and SPSS have made it easy for people to crunch numbers. The problem is that those people don't always take the time to tie the results back to the business context or the statistical context in which the numbers are being crunched.

Anonymous said...

Greg Pronger said on Facebook:

Either alone is unlikely to move you forward. This is akin to the guy with an IQ of 180 busing tables due to lack of drive, or the VP who "Yes-manned" himself into his job (though not likely to remain).

The Nov 12 post is a good example: temp change was a large part of the measured improvement. It was a factor outside of the measurement system, but not out of knowledge, so the issue came forward.

Without measurement and analysis, though, intellect will often jump to false conclusions based on erroneous or insufficient information.