The Problem with Metrics

“If you can’t measure it, you can’t manage it.” I don’t know who coined that phrase, but I have to tell you, I have some problems with the concept. My primary problem is that “it” often refers to a person, or at least to the activity of a person, and I think people should be managed more holistically. My other problem is that since we are managing people, we are subject to the behavior of people who know they are being managed.

My first job after college was as a programmer/analyst for Burroughs Corporation, working in one of their manufacturing plants. System responsibilities had been hastily reassigned during the period between my accepting the position and my start date, so I found myself responsible for payroll, HR and work management systems. Our plant made memory sub-systems, and I was exposed very quickly to two problems with metrics. Problem one was the ease with which metrics can be manipulated. Our plant manager, aware that one of his key metrics was the number of days between receiving components and shipping an assembly, simply purchased a used trailer from a trucking company. Material that arrived before we were ready to deal with it was unloaded onto our loading dock and reloaded into his trailer. When we were ready to start using the components, they were “received.” This little bit of subterfuge kept the metric that determined his bonus nicely in the acceptable range.

Inside the plant, and much more problematic for me, was the fact that we measured every operation conducted by every person on the assembly floor. If five people were operating machines that stamped memory chips into circuit boards, one of the systems I was responsible for would calculate and report the cost per chip inserted. Later, I would use that value in other calculations. Unfortunately, the collection system didn’t differentiate between new assemblies and rework. So, while the guy building new circuits was feeding racks of memory chips into an automated press, the woman at the next station was unsoldering and re-soldering defective chips by hand. He might insert hundreds of chips per hour, while she struggled with five. Mine wasn’t an ethical dilemma, though. The nature of the repair business meant that sometimes we would receive a part for repair that was no longer in active production. On the reporting side of my system, when I had to do the math that involved “average insertion times,” the underlying counts were sometimes zero, and the calculation would fail on a divide-by-zero error. Discarding those results made the woman look bad; including them crashed the system.
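Looking back, the fix was simple enough that I can sketch it from memory. The snippet below is only an illustration, with made-up field names and numbers, but it shows the two things I wish that system had done: keep new-build and rework in separate buckets, and guard the average so a zero insertion count reports “no data” instead of crashing the report.

```python
# A minimal sketch, with hypothetical operator records; not the original system.
from typing import Optional

def average_seconds_per_insertion(total_seconds: float,
                                  insertions: int) -> Optional[float]:
    """Return seconds per chip inserted, or None when there were no insertions."""
    if insertions == 0:
        return None  # report "no data" instead of dividing by zero
    return total_seconds / insertions

# Hypothetical operator records: (name, operation type, seconds worked, chips inserted)
records = [
    ("operator A", "new-build", 3600, 450),  # automated press
    ("operator B", "rework",    3600, 5),    # hand re-soldering defective chips
    ("operator C", "rework",    3600, 0),    # out-of-production part, nothing inserted
]

for name, op_type, seconds, count in records:
    avg = average_seconds_per_insertion(seconds, count)
    label = "n/a" if avg is None else f"{avg:.1f} s/insertion"
    # Lumping new-build and rework together is what made the comparison unfair;
    # carrying the operation type into the report keeps the context visible.
    print(f"{name} ({op_type}): {label}")
```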

I am recalling those early struggles as we study the results of our newly implemented analysis of engineering inspection reports. One of the changes we have to make is to properly reflect inspections that people performed, prepared and participated in. These aren’t just three categories, or three bits of metadata; they are three distinctly different statements about someone’s role in a very important process. Not only is the distinction important, it’s also important to know which is which once these values are calculated, so the right metric ends up in the right formula. For instance, the lead time on scheduling the inspection is properly attributed to the person who was in charge, so a guilt-by-association attribute should not be applied to every participant. There also has to be a distinction drawn between the person preparing the report and the person who performed the inspection. We aren’t talking about anything on the scale of squirreling stuff away in a trailer, but if the relative performance of individuals is being compared, the right metric has to be used.
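If I had to explain the distinction to a developer, it would look something like the sketch below. The field names are hypothetical (our actual columns differ), and I’m assuming for the example that the person who performed the inspection is the one in charge of scheduling it, but it shows why “performed,” “prepared” and “participated” need to be separate pieces of data, and why the lead time attaches to only one of them.

```python
# Illustrative only: hypothetical field names, not our real SharePoint columns.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class InspectionReport:
    performed_by: str    # who actually performed the inspection (assumed to be in charge)
    prepared_by: str     # who wrote the report, not necessarily the same person
    requested_on: date   # when the inspection was requested
    scheduled_on: date   # when it was actually scheduled
    participants: List[str] = field(default_factory=list)  # everyone else in the room

    def scheduling_lead_time_days(self) -> int:
        """Lead time belongs to the person in charge, not to every participant."""
        return (self.scheduled_on - self.requested_on).days

report = InspectionReport(
    performed_by="J. Smith",
    prepared_by="A. Jones",
    requested_on=date(2012, 11, 1),
    scheduled_on=date(2012, 11, 20),
    participants=["B. Lee", "C. Park"],
)

# The lead-time metric is attributed to one person; the participants only get
# credit for participating.
print(report.performed_by, "owns", report.scheduling_lead_time_days(), "days of lead time")
```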

So why am I trying to commit this to memory (that is, after all, why some of us have blogs)? Because it’s so easy to get this stuff wrong, and once your mistake is incorporated into a management report, nobody will ever know. I don’t know about you, but when I get a number to appear in the box on a screen prepared in SharePoint Designer, I am happy, and I feel close to being done. When I look back at that code and see “@Prepared_x0020_By” I know that it’s the right column, but when my coworker is reviewing the code where I refer to “$EngName”, she might be tempted to simply assume it’s the right variable. This is where SharePoint development has to mirror systems development. We have to document what these reports are supposed to show, and we have to document what they actually show. Then we have to test the process, and the results.
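What I mean by “document and test” doesn’t have to be elaborate. The sketch below isn’t real SharePoint code; it just records, in one place, which report variable is supposed to be fed by which internal column (the “Performed By” column name is invented for the example), and it gives a reviewer something that fails loudly if a variable like $EngName ever gets wired to the wrong column.

```python
# A sketch of the documentation-plus-test idea, not actual SharePoint code.

# Documented mapping: report variable -> SharePoint internal column name.
REPORT_COLUMN_MAP = {
    "EngName": "Prepared_x0020_By",     # who prepared the report
    "Inspector": "Performed_x0020_By",  # hypothetical column for who performed it
}

def column_for(variable: str) -> str:
    """Look up the internal column a report variable is supposed to read."""
    return REPORT_COLUMN_MAP[variable]

def test_eng_name_maps_to_prepared_by():
    # If a future change points $EngName at a different column, this check
    # catches it before the number lands in a management report.
    assert column_for("EngName") == "Prepared_x0020_By"

test_eng_name_maps_to_prepared_by()
print("report column mappings verified")
```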

Big Data

Earlier this week, I was part of a panel with Chris McNulty, moderated by Marc Anderson, at the Gilbane Conference in Boston, talking about “Successful SharePoint Adoption Strategies.” Despite being on the program, I found myself feeling a bit out of place at Gilbane. The program seemed to be targeted at larger shops than ours. I always feel that you can take something away from any good presentation, and there were many good presentations, so I did my best, but I want to talk about one where I felt like I was about to be tossed onto the island of misfit toys. One of the keynote presentations was given by Christer Johnson from IBM, who was talking about advanced analytics and Big Data. Now, with just over 600 policies, nobody is ever going to accuse me of knowing anything about big data, but I seriously wonder if I am really all that alone.

One of the facts that Christer referenced several times was that there are over 361 billion gigabytes of messages floating around the universe. What caught my attention was the fact that he chose to express the quantity as 361 billion gigabytes. He did briefly mention that he was really talking about exabytes, but nobody understands what an exabyte is. Then it occurred to me that nobody can really comprehend the concept of 361 billion gigabytes of messages either. We are used to messages somewhere between 140 characters and one page of text. Saying “361 billion gigabytes of messages” to me is like Ebenezer Scrooge saying “tens of thousands of £100 notes” to Bob Cratchit. Bob would have known about £100 notes, but as my good friend David Pennington pointed out, they would have had a “mythical quality” associated with them. I know what a gigabyte is, but I can’t picture the point during the day/week/month when my message traffic exceeds one. Now, in fairness to Mr. Johnson, his keynote was fascinating and thought-provoking, but clearly, there will never be an IBM “Let’s Build a Smarter Planet” ad featuring our results.
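For anyone who, like me, needs the arithmetic spelled out: an exabyte is a billion gigabytes, so “361 billion gigabytes” and “361 exabytes” are the same quantity expressed at two very different human scales. A throwaway sketch, where the 140-byte message size is my own rough assumption:

```python
# Unit arithmetic only; the 361-billion-gigabyte figure is from the keynote.
GIGABYTE = 10**9   # bytes
EXABYTE = 10**18   # bytes

total_bytes = 361e9 * GIGABYTE            # "361 billion gigabytes"
print(total_bytes / EXABYTE, "exabytes")  # -> 361.0 exabytes

# For scale: a tweet-sized message is on the order of 140 bytes,
# so a single gigabyte holds roughly 7 million of them.
print(round(GIGABYTE / 140 / 1e6, 1), "million short messages per gigabyte")
```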

So the cynic in me is asking the optimist in me, “What did you get out of that keynote?” Well, I focused on the portion of his presentation where he was talking about the analytics around Kraft Vegemite. The early results pointed to a product that was perhaps too salty. Additional analytics revealed that, more than for any other Kraft product, people talking about Vegemite used the word “love” in their messages. Mr. Johnson went on to talk about how important it is to take feedback (messages) in its proper context, and how we shouldn’t rely too heavily on the apparent relationship between individual metrics and customer satisfaction. That is a very important lesson, and one that scales to any size shop. If I made business decisions based on SharePoint’s metrics, I would tear down our Internet-facing server. The traffic on that server is minuscule, but the quality of the content being moved around is what is important. I don’t receive feedback on a continuous basis, certainly not gigabytes of message traffic, but I did receive two “Thank You” emails this week. Two satisfied customers; that’s a metric that I value.

I told Marc that I wouldn’t be blogging about our Gilbane session today, but I think that both he and Chris would be supportive of the other lesson I am taking from the sessions I attended (including ours). I am trying to build out our SharePoint farm more for a quality user experience than to garner more and more hits. As Chris said, “if you wanted hits, you could lock everyone into the SharePoint main page when they launch their browser,” which is not something that would score any points for me among my peers. To a certain degree, we simply ignore the quantity portion of the quantity/quality ratio. Obviously, if we built out a page that nobody ever used, I would question the value of that page, but if we build a page that is seldom used but satisfies a business requirement each time, I’m good with that. I probably can’t build a case for that page on ROI, but I think we have to consider the cost of not having the service to offer. I may not have exabytes of messages to process, but if I can’t send, or can’t deliver, or in 10 years can’t find the important messages that we do have, I’ve failed at my job.