- Companies and organizations exchange numbers with no context for the data, and the result sucks. Are the sites even truly comparable? I do not believe we are all that different (in fact, there is a blog post on that subject here), but I do think certain key differences are important enough to note and communicate in the survey. For instance, you would not compare the Overall Equipment Effectiveness (OEE) of a continuous-process chemical plant with that of a batch-process specialty paint plant. Both are in the chemical arena, but they would have significantly different OEE numbers. So comparing by industry alone is not a good idea without more information.
- Companies and organizations exchange numbers with no standards for calculation, and the results suck. If you try to compare two metrics that are calculated from different data or with different equations, you cannot draw meaningful conclusions. Sticking with the OEE example: if one site includes preventive maintenance downtime in its calculation and another does not, then there is no valid comparison between their numbers. If you are an SMRP member, you can gain access to their standards for calculating many of the common metrics and improve the validity of your comparisons.
- Benchmarking studies that look only at metrics, not processes and practices, might suck. While metrics are important, we also have to look at processes and practices. If all we do is compare numbers without examining the systems and processes that generate those numbers, then it is very hard to identify the best practices that can get you to a new level of performance. Many times you can learn more by walking around and talking to staff than could ever be gained from a metrics survey. All I am saying here is that you need to do both.
- The benchmarking sample is not representative of the best practices that exist, so sites get a false sense of security, and that sucks. Many organizations only want to look at surveys and practices from their own industry or niche. Over the years I have seen a couple of industries in particular that love to do this. The problem is that while they were comparing themselves against their "peers," other industries passed them by and elevated the best practices. They are now left to be the best of a sorry lot. Study other organizations' and other verticals' metrics and practices to see how you compare with the best in the world, not just the best in your vertical.
- The metrics in the study are not targeted to the problems or situations you face at your site, therefore they provide little actionable information, and you guessed it... it sucks. Sometimes a benchmarking survey leads people to believe that the metrics in the survey are all they should focus on, when in reality the metrics they should be focused on are entirely different. The metrics you focus on are the ones that drive the behavioral change you need; if you change the behaviors, then the other metrics will improve on their own. For instance, if you are having trouble getting failure history for FRACAS, and that is the behavior you are trying to change, then that is what you should measure and focus on. But you will be hard pressed to find work order failure code completion percentages in a benchmarking study.
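To make the calculation-standards point concrete, here is a minimal sketch of how the same shift can yield very different OEE numbers depending on whether preventive maintenance time is counted inside planned production time. The numbers and the specific policy split are illustrative assumptions, not data from any real site; the OEE formula used (Availability x Performance x Quality) is the standard textbook form.

```python
def oee(planned_min, downtime_min, ideal_rate_per_min, total_count, good_count):
    """OEE = Availability x Performance x Quality (standard textbook definition)."""
    run_time = planned_min - downtime_min
    availability = run_time / planned_min
    performance = total_count / (ideal_rate_per_min * run_time)
    quality = good_count / total_count
    return availability * performance * quality

# Identical 480-minute shift at two sites: 60 min of PM, 30 min of breakdowns,
# 380 parts produced at an ideal rate of 1 part/min, 370 of them good.
# (All figures are made up for illustration.)

# Site A counts PM inside planned production time, as downtime.
site_a = oee(planned_min=480, downtime_min=90, ideal_rate_per_min=1.0,
             total_count=380, good_count=370)

# Site B excludes PM from planned production time entirely.
site_b = oee(planned_min=420, downtime_min=30, ideal_rate_per_min=1.0,
             total_count=380, good_count=370)

print(f"Site A OEE: {site_a:.1%}")  # ~77.1%
print(f"Site B OEE: {site_b:.1%}")  # ~88.1%
```

Same machines, same shift, same output: an eleven-point gap in OEE that reflects nothing but a calculation convention. That is exactly why a shared standard (such as SMRP's metric definitions) matters before any comparison.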
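The failure-code completion percentage mentioned in the last point is simple to compute from CMMS data, even though you will rarely find it in a benchmarking survey. Below is a hypothetical sketch; the record layout and field names are assumptions, since every CMMS exports work orders differently.

```python
# Hypothetical closed work orders exported from a CMMS.
# "failure_code" field name and values are assumptions for illustration.
closed_work_orders = [
    {"id": 101, "failure_code": "BRG-WEAR"},
    {"id": 102, "failure_code": None},        # closed with no failure code
    {"id": 103, "failure_code": "SEAL-LEAK"},
    {"id": 104, "failure_code": ""},          # blank counts as missing too
]

# Count work orders closed with a usable (non-empty) failure code.
coded = sum(1 for wo in closed_work_orders if wo["failure_code"])
completion_pct = 100.0 * coded / len(closed_work_orders)

print(f"Failure-code completion: {completion_pct:.0f}%")  # 50% in this sample
```

A metric like this, tracked weekly, targets the specific behavior you are trying to change (capturing failure history for FRACAS) far better than a generic survey metric ever could.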
Tuesday, August 27, 2013
Five Reasons Your Benchmarking Survey Might Suck