This post originally appeared on BetterProjects.net on July 1, 2010.
“Yeah, that sounds about right.”
“Those numbers are pretty accurate.”
“Oh, no, I didn’t look at the data, that’s just what I know happens.”
If you’re like me, statements like those above make you cringe in horror whenever they issue forth from the mouth of a stakeholder. Gut instincts are often great for directional guidelines but are not sufficient for precise analysis.
Despite the necessity of precise analysis for quality decision making, precision in a vacuum can blind us to the direction of the changes that will occur due to our process or system changes. It's at these times that the 'gut' instincts of people who are intimately familiar with the processes, the very 'cringe' statements at the top of this post, really come in handy.
There is a concept in physics called the Uncertainty Principle. In short, the more precisely you pin down a particle's position, the less precisely you can know its momentum. You have two options and can only pick one: you can know exactly where something is right now, or you can know where it's going.
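For readers who want the formal statement, Heisenberg's relation puts a hard lower bound on the product of the uncertainty in position (Δx) and the uncertainty in momentum (Δp):

```latex
\Delta x \, \Delta p \ge \frac{\hbar}{2}
```

Shrinking one uncertainty necessarily inflates the other, which is the trade-off the analogy above leans on.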
When I read this article earlier today about probability, I was instantly reminded of the previous paragraph about the Uncertainty Principle and how all of this impacts project analysis. What our gut tells us can often be misleading, but close enough to act as a starting place for our analysis. The difference between 1/2 and 1/3 is close enough when we're looking for a direction to take our project, but once the real analysis begins, we must move past the approximation and truly focus on finding the real answers at the root of the problem.
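To see how a gut answer of 1/2 can differ from a true answer of 1/3, consider a classic probability puzzle (chosen here purely as an illustration, not necessarily the one from the linked article): a family has two children, and you learn that at least one is a girl. Intuition often says the chance both are girls is 1/2, but a quick simulation shows it is closer to 1/3:

```python
import random

random.seed(0)

# Simulate 100,000 two-child families, each child independently "B" or "G".
families = [(random.choice("BG"), random.choice("BG")) for _ in range(100_000)]

# Condition on what we were told: at least one child is a girl.
at_least_one_girl = [f for f in families if "G" in f]

# Among those families, how often are both children girls?
both_girls = [f for f in at_least_one_girl if f == ("G", "G")]

ratio = len(both_girls) / len(at_least_one_girl)
print(f"P(both girls | at least one girl) ≈ {ratio:.3f}")
```

The gut estimate of 1/2 is close enough to tell you the question is worth investigating, but the simulation (or a short enumeration of the equally likely cases GG, GB, BG) is what gives you the answer you can act on.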
The article also gives us a few guidelines for doing this analysis, although they are not spelled out explicitly. First, find experts in the field. Don't just go to managers and directors who think they know what goes on; find the people who are closest to the data, who do the process every day, and ask them.
Second, get a second opinion. Sometimes experts can interpret the data or perform the process in different ways. It's always helpful to have someone else look at your conclusions as a sanity check.
Third, challenge your assumptions. As Yuval Peres points out, where we start our analysis often biases the outcomes it can reach. Never be afraid to start over from a different place if your answers seem to fit the question too well.
So what about you? What other ways have you found to challenge your own analysis?