Wednesday, May 2, 2012

ITSM Maturity Assessments: A Value-based Approach

I am not a big fan of ITSM maturity assessments.  Don't get me wrong, I think we should DO maturity assessments; it's just that they are so frequently done poorly, built on useless, maybe even dangerous, starting assumptions:

  1. Improved process maturity is an end goal in and of itself

  2. Interdependence between individual processes is either negligible or non-existent, in the context of measuring maturity

A typical maturity assessment starts with the assumption, implicit or explicit, that improved process maturity is an end goal in and of itself.  Here is a great example of what I mean, based on the CMMI model of evaluating maturity.  The definition of Level 1 maturity for Incident Management indicates there is no defined owner of the process.  Part of reaching Level 2 maturity is identifying a single process owner.  I agree that having a single owner of the process is a good thing, but what goal does it help achieve?  Moving to Maturity Level 2?  Congratulations.  My concern is that many maturity assessments end there.  You're at Level 2 (or 4, it doesn't matter).  Now what?

The answer to this question depends entirely on why you were doing an assessment in the first place.  Did you just want a benchmark to compare against other companies, or against a later point in time to measure improvement?  Fine.  Let's say you came out slightly ahead of your peer companies in most process maturity levels.  Does that make you a better service organization?  Not necessarily.  It is certainly one characteristic you could use towards making that determination, but there is no guarantee that you are better just because your maturity levels are higher.

Think of it like a basketball team.  Your team, across all five positions, is slightly taller than the other teams.  That very well could help you beat your competitors.  Or maybe your teamwork assessments are higher than your competitors'.  Your team passes very well, shoots accurately, and plays great team defense.  However, another team has 2 undisciplined players with raw talent so great that any normal attempt to stop them is useless.  They make up for their lack of discipline with pure athletic talent and effort.  On defense, wild leaps at the ball lead to frequent steals.  If your players tried that, they would look like buffoons and end up in lopsided defeats.  This team, however, wins 2/3 of the time.  They might lose big every once in a while, when one or both of their stars are off their game; but their overall results speak for themselves.

IT departments can function like that.  We have to accept that disciplined process adherence doesn't always work well.  I've seen IT shops where a few insanely talented engineers or developers make amazing things happen, with no concern over following procedure.  It works because it works.  The business outcomes are fast and strong enough that they can withstand the occasional failure.  Besides, the super-techs are so good that they can fix their failures and still look like heroes in the process.

In reality, no one wants to run their shop that way.  We know that well-defined processes that also allow for calculated risk taking ultimately lead to better results.  Ultimately we are talking about Value.  What is the value of knowing your process maturity levels?  Very little, if it doesn't lead to creating business value.  What is the business value of having a single Incident Management process owner?  In and of itself, absolutely nothing.

And that's where most maturity assessments fail.  They tell you how well you follow ITIL, COBIT, ISO/IEC 20000, and other standards and guidelines, but not whether that adherence creates any business value.

As part of my presentation for the upcoming ServiceNow Knowledge12 conference, I will present my company's example of how we looked at value:  The old technology focused way, and a newer business value focused way.

Several years ago, we did a self-assessment for a new CEO, presenting the results as the <Insert booming voice here> STRATEGIC TECHNOLOGY READINESS.

We assigned a color code to each block, Red, Yellow, or Green, indicating how "ready" each component was for the future.  There was a time we could get away with this sort of presentation to the business.  Remember the days when we could tell the CEO that, unless the flux capacitor was upgraded this year, the future of the business was in trouble, and they bought it?  Believe me, that's how they heard the message.  We were probably right, too; but that's beside the point.
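If you wanted to see that old-style readiness scorecard in miniature, it boils down to something like the sketch below.  The component names and ratings here are made up for illustration (flux capacitor included), not our actual assessment:

```python
# A minimal sketch of the old "readiness" scorecard: each technology
# component gets a Red/Yellow/Green rating.  Names are hypothetical.
READINESS = {
    "Network":        "Green",
    "Servers":        "Green",
    "Security":       "Green",
    "Applications":   "Yellow",
    "Flux Capacitor": "Red",   # the "upgrade it this year or else" item
}

def needs_attention(scorecard):
    """Return the components rated anything other than Green."""
    return [name for name, rating in scorecard.items() if rating != "Green"]

print(needs_attention(READINESS))  # the Yellow and Red components
```

Notice what's missing: nothing in that structure says anything about business outcomes.  A component can be Green and still contribute no value.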

We presented the STRATEGIC TECHNOLOGY READINESS to the CEO so that he could better understand our strengths and challenges.  You read that right: It was about IT's strengths and challenges, as we saw them.  Here's what we saw:

Wow, that looks pretty good.  IT has awesome people, and we excel at the things the business used to expect from IT: Reliability, Security, and Architecture.  To be fair, Architecture was probably more for show, but it sounds cool.  The areas of struggle were the things the business was responsible for: Priorities, the Applications, and Business Processes.

Unfortunately for us, the IT landscape in the U.S. changed soon after this time.  We were an awesome IT Department, yet we started seeing signs of trouble:

  • An internal survey included IT as an area needing improvement

  • Frustration from senior leadership regarding technology modernization

  • Slow implementation of requested new technology

  • "IT is expensive"

It hit me that a new paradigm was needed.  The way we measured maturity needed to change.  ITSM (instead of ITIL) was becoming a term used by industry visionaries.  Along with that came discussions around the Value of IT.  The writing was on the wall.  Business leaders across the country were starting to question what they were actually getting out of the seeming money pit of IT.  Maturity needed to be measured and demonstrated from the perspective of value.  Immature IT led to negative business outcomes.  We needed to be net outcome neutral or (hopefully) net outcome positive.  The new way to show maturity looked a lot more at business outcomes.

This is something much closer to what the business was expecting from IT.  How much was IT support impacting overall productivity?  To what extent were we providing and/or enabling innovation?  How quickly could we change gears towards new technologies?

The new assessment looked more like this:

That's more realistic.  We were excellent technology implementers, and the systems continued to run very well.  We were OK at balancing security with allowing sufficient accessibility.  No one, including IT, was taking responsibility for how well the technology matched current business processes.  The old perspective was that we just implement; it's up to the business to use it well or not.  You get the idea.

I'm not suggesting this as a perfect model for measuring the value of IT maturity.  It is still very rough, and doesn't do a great job of showing the interdependence between the elements.  You are welcome to use it, however.  I recommend that you first determine whether these elements of value are appropriate for your organization.  They work well for me; you may find the need to drop some and add others.

The point is that a maturity assessment is only helpful when it helps define the value of the relevant processes to the business.  You could start assigning costs to these value elements, and you'd have a great start towards developing a common understanding of how IT costs lead to business value (or not).  For example, the visual helped me see that we were spending a lot of money and time on security, without really identifying business benefits.  Risk is a much better concept, and one business executives understand far more readily.  Maybe our spend on security gives us more risk avoidance than we need.
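The cost-to-value exercise can be sketched very simply: attach an annual cost and a demonstrated-value rating to each element, then surface the elements where spend is high but demonstrated value is low (the security example above).  The element names, costs, and ratings below are invented for illustration:

```python
# Sketch: pair each value element with its cost and a demonstrated-value
# rating, then flag high-cost / low-value elements for a risk conversation.
# All figures and names here are hypothetical.
elements = [
    {"name": "Productivity", "annual_cost": 400_000, "value": "High"},
    {"name": "Innovation",   "annual_cost": 150_000, "value": "Medium"},
    {"name": "Agility",      "annual_cost": 100_000, "value": "Medium"},
    {"name": "Security",     "annual_cost": 600_000, "value": "Low"},
]

def questionable_spend(elements, cost_threshold=300_000):
    """Elements costing at least the threshold but showing low value."""
    return [e["name"] for e in elements
            if e["annual_cost"] >= cost_threshold and e["value"] == "Low"]

print(questionable_spend(elements))
```

Even a toy like this reframes the conversation: instead of "how mature is the process," the question becomes "what are we buying with this spend, and do we need that much of it?"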

As I said, this is only a start.  I'd love to hear how others bridge the gap between process maturity and business value.  Please leave your ideas in the comments!