The High Cost of Bad Data in Lean Six Sigma
You may remember when speed limits were lowered to 55 MPH to "save lives." Yet a study by the Cato Institute found just the opposite: the fatality rate on the nation's roads declined steadily over 35 years, except during the period from 1976 to 1980. After the laws were changed in 1995, the fatality rate dropped to the lowest in recorded history. There were also 400,000 fewer injuries.
Furthermore, there's no evidence that states with higher speed limits had increased deaths. States with speed limits of 65 to 75 MPH saw a 12% decline in fatalities; states with a 75 MPH speed limit saw over a 20% decline in fatality rates.
Wrong Root Cause
What does this data suggest? Higher speed limits weren't the cause of highway fatalities. Those of us who can remember the seventies may have owned a Fix Or Repair Daily (i.e., FORD) or some other clunker. The main reason the roads are safer than ever before is that cars and roads are built better than ever before. Anti-lock brakes, power steering, and crash protections all help prevent fatalities.
To paraphrase a recent political campaign: It's the car, stupid!
The other main cause is bad driving habits: It's the driver, stupid!
What did this slowdown in the delivery of people and goods over the nation's highways cost? Although it's almost impossible to connect all of the dots, the stock market was down and interest rates soared. It did, however, create an overnight market for CB radios and radar detectors.
Similarly, in 1995, Denver initiated an emissions testing program to reduce carbon monoxide and other emissions. The program costs $44 million per year but has reduced emissions by only 4%, far less than the 33% projected from the initial data.
Not surprisingly, 6.7% of the 833,122 cars tested in 2001 failed. That is almost exactly the 3-Sigma defect rate (66,807 defects per million opportunities, or about 6.7%).
One of the assumptions was that owners would have their cars fixed after they failed. In reality, about 75% of owners simply brought the car back through on another day, when it passed because of variability in the testing process. In other words, the testing process itself was barely at a 1-Sigma level (less than 30% accurate).
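These sigma figures can be checked with the standard DPMO-to-sigma conversion used in Six Sigma, which adds the conventional 1.5-sigma long-term shift to the normal quantile of the yield. This is a minimal sketch (not from the original article), using only the Python standard library:

```python
from statistics import NormalDist

def sigma_level(defect_rate: float) -> float:
    """Convert a defect rate (fraction of units failing) to a sigma level,
    using the conventional 1.5-sigma long-term shift from Six Sigma practice."""
    return NormalDist().inv_cdf(1.0 - defect_rate) + 1.5

# A 6.7% failure rate, as in Denver's 2001 test data, is roughly 3 sigma.
print(round(sigma_level(0.067), 1))  # ≈ 3.0

# A process that is right only about 31% of the time is roughly 1 sigma.
print(round(sigma_level(0.69), 1))   # ≈ 1.0
```

The 1.5-sigma shift is a convention, not a law of nature; without it, the same defect rates map to sigma values about 1.5 lower.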
But what did it cost to squeeze a few more pounds of emissions out of the air? Were there other ways to spend $44 million/year that might have reaped greater gains? For those of us who waited in line for up to an hour to have our emissions tested, what did that cost? Time that could have been spent making money, spending money, being with family and friends?
Not surprisingly: "The real difference over the past 10 to 15 years is technology has surpassed the ability of cars to pollute. Cars are running cleaner and staying cleaner longer - and that has made the biggest difference."
It's the car, stupid!
Here's my point:
Data can provide an excellent rear-view mirror into the past. But it can be misused in the same way a drunk uses a lamppost: for support, not illumination.
Forcing your data to support your point of view can lead to more defects, delay, and cost. In the case of the 55 MPH speed limit, there were more fatalities, more delay, and more cost. In the case of the emissions testing program, the 100% inspection process took time and money from taxpayers for a minimal reduction in emissions. And it was too error prone to deliver consistent results.
Identifying the wrong root cause can lead you down a path of waste, rework and expense that can be avoided.
Let your data guide you; don't force the data to fit your pet hypothesis. Then, once you've implemented your solution, verify that you've actually succeeded at reducing the root cause and its effects. Otherwise, you're not doing Six Sigma; you're just conning your company and customers, and hastening the day when the business will be shuttered forever.
Rights to reprint this article in company periodicals are freely given with the inclusion of the following tag line: "© 2008 Jay Arthur, the KnowWare® Man, (888) 468-1537, firstname.lastname@example.org."