Hazy ITSM

<h3>3 things any metric should accomplish (2015-03-21)</h3>
<div dir="ltr">
Do you ever struggle with what to measure, or whether your metrics are worth the effort? Anything you measure should accomplish one or more of these three things:</div>
<div dir="ltr">
</div>
<ol>
<li>Drive action</li>
<li>Guide decisions</li>
<li>Measure progress towards a tangible goal or objective</li>
</ol>
<br />
<div dir="ltr">
It's very simple: If something you measure does not do at least one of these things, why are you measuring or reporting on it?</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Measuring and reporting on those measures takes time and money. Don't waste either by gathering metrics that someone might find useful someday. </div>
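The three-part test above can even be scripted against a metrics catalog. Here is a minimal Python sketch; the metric names and purpose tags are hypothetical examples, not a prescribed taxonomy:

```python
# Each metric is tagged with the purposes it actually serves.
# Valid purposes, per the three-part test: drive action, guide a
# decision, or measure progress toward a tangible goal.
VALID_PURPOSES = {"action", "decision", "progress"}

def audit_metrics(catalog):
    """Return the metrics that serve none of the three purposes
    and are therefore candidates for the chopping block."""
    return [name for name, purposes in catalog.items()
            if not (set(purposes) & VALID_PURPOSES)]

# Hypothetical example catalog.
catalog = {
    "mean time to restore": ["progress"],         # tracks a service target
    "tickets logged per month": [],               # gathered "just in case"
    "change success rate": ["decision", "action"],
}

print(audit_metrics(catalog))  # → ['tickets logged per month']
```

Anything the audit flags is costing time and money without driving action, guiding a decision, or measuring progress.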
<h3>The Business - IT relationship is more like a love triangle (2015-03-17)</h3>
<div dir="ltr">
I've come across some great articles recently using the Valentine's Day analogy for the relationship between IT and the rest of the business. See "<a href="http://americas.g2g3.com/blog/bid/99697/ITIL-s-BRM-How-Business-Relationship-Management-Shows-the-Love">ITIL®’s BRM: How Business Relationship Management Shows the Love</a>" by Julie Montgomery for an excellent example.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Most of these articles, however, either gloss over or skip entirely the reality that there is a third actor in this relationship: the consumer of IT services. To clarify, I mean the employees of the business that consume transactional services provided by internal IT, HR, Facilities, and any other internal service provider organizations. This could be the caller on an incident, the department manager requesting provisioning of accounts for an employee, the business executive sponsor of a large and complex implementation project, etc.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
What IT (and other internal service providers) frequently miss is the point that this third actor is a separate and unique entity. We assume that this third player is just an extension of "the business" entity.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
An example: you are an IT manager working with the marketing director, who is constantly complaining about an ongoing issue causing the marketing team pain. In your mind, a reasonable workaround exists that, while not perfect, keeps the marketing people productive. You haven't given up on resolving the problem, but your team is now almost exclusively focused on a high-visibility project sponsored by the COO. When you bring up the complaining marketing director in a staff meeting, you are met with reassuring comments like, "Which one does the business want done? They have to get their priorities straight." This is a common IT complaint: the business can't make up its mind about priorities.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
This perspective has one serious flaw: it assumes the business and the consumers of IT services are one and the same.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Look at it this way: All three actors are part of the single entity called "the business." We share the same overall mission, bottom line, etc. In terms of the relationship between IT and the rest of the business, however, there are at least three distinct relationships:</div>
<div dir="ltr">
</div>
<ul>
<li>The relationship between IT and business leadership</li>
<li>The relationship between IT and the transactional internal consumer of IT services</li>
<li>The relationship between the internal service consumer and business leadership</li>
</ul>
<br />
<div dir="ltr">
The idea is that while we share the same big picture goals and objectives, we each have our own goals, objectives, and motivations. To pretend otherwise is, at best, naive.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
The IT Skeptic, Rob England, likes to use a <a href="http://www.itskeptic.org/organisations-have-failed-their-it-bad-parents" target="_blank">parent - child <u>analogy</u></a> to describe the relationship between business and IT. You could extend that analogy to call the consumers of IT services our "siblings." As such, we are always jockeying for the love and attention of our business parents. I think there is something to this, but if we are to position IT as a partner in the business rather than a butler, we need to look further.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Each partner relationship is different. We can't pretend the behaviors that nurture and grow the IT - Business leadership relationship will equally nurture and grow the relationship between IT and internal service consumers.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
IT manages two distinct kinds of relationship:</div>
<div dir="ltr">
</div>
<ol>
<li>The customer relationship with business leadership. Business leadership is our customer. Business leadership pays IT to provide services. The number one concern is "Are we receiving good value for our investment?"</li>
<li>The consumer relationship with the transactional recipients of the services. Internal users are our consumers. The number one concern here is "Do I have the services I want when I want them?"</li>
</ol>
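The two relationships above can be made concrete with a small sketch. The counterparts and key questions come straight from the list; the Python structure itself is just an illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Relationship:
    counterpart: str   # who IT is dealing with
    key_question: str  # the concern that dominates the relationship

RELATIONSHIPS = [
    Relationship("business leadership",
                 "Are we receiving good value for our investment?"),
    Relationship("internal service consumer",
                 "Do I have the services I want when I want them?"),
]

# Two counterparts, two different success questions -- answering one
# tells you nothing about the other.
for r in RELATIONSHIPS:
    print(f"{r.counterpart}: {r.key_question}")
```

The point of modeling them separately is that a health measure for one relationship cannot be reused as a health measure for the other.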
<br />
<div dir="ltr">
The Facebook business model teaches us a lot about the difference between customers and consumers of services. Users of Facebook are clearly not the customers of Facebook. Facebook makes money from advertising, application developers, and third parties to whom Facebook sells all the amazing data it collects on us. Those are their customers. Facebook users are consumers of the Facebook service, but we are definitely not the customer Facebook needs to keep happy. For the business model to work, they just need a sufficient number of people consuming their services in order to generate enough data to satisfy their customers (companies buying the consumer data). Those who complain about Facebook not listening to their customers regarding things like privacy miss the point. Facebook listens to their customers all the time. It's just not us, the consumers of Facebook, that they're listening to. Facebook needs to keep their customers happy, and their consumers <i>just happy enough</i> that they keep consuming the service. They simply don't care if they lose a few angry consumers, as long as the customers continue buying the data being generated.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
The internal service provider model is not all that different. We, as internal service providers, need to keep the consumers of our services just happy enough that they don't rise up in revolt against us.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
In reality, the business leadership entity probably HAS made up its mind. The high visibility project is the priority.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
(That doesn't mean it is the correct priority choice, just a very likely outcome of the given scenario)</div>
<h3>Keeping Employees Engaged With ITSM (2014-05-28)</h3>
<div dir="ltr">
Employee engagement has been a popular corporate buzzword the past few years. I've been a bit leery of how the term is applied, since it often appears to mean whatever a given organization wants it to mean. I've seen engagement used to mean productivity (productivity has decreased, so employees must be less engaged). I've also seen employee satisfaction surveys used to measure engagement (employees hate working here, so they must not be engaged). Engagement has frequently come to mean the attitude of the employee and how they feel about their direct supervisor. </div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I just came across an interesting article from Forbes titled "<a href="http://www.forbes.com/sites/stevemeyer/2014/05/28/ceo-news-flash-if-your-workers-arent-engaged-its-your-own-fault/">CEO News Flash: If Your Workers Aren't Engaged, It's Your Own Fault</a>", which gives the most useful framing of engagement I've seen. The idea is that humans are intrinsically motivated to be valued participants in the workplace. The message is that corporate culture most frequently squelches that intrinsic motivation, and leaders have the responsibility to reestablish it. The author suggests we start by looking at two key aspects of leadership:</div>
<div dir="ltr">
</div>
<ol>
<li>Setting high standards</li>
<li>Creating a culture of recognition</li>
</ol>
<br />
<div dir="ltr">
IT in general and many ITSM initiatives in particular can work against these tactics. Who is more recognized for achievement in your organization? The diligent engineer who always plays by the Change process rules, or the maverick who puts out the dramatic IT fire, often created by their own sloppiness? I hope it's the former; but in many organizations I've worked for and with, the latter unintentionally receives the accolades. And don't assume your organization doesn't reward the arsonist/firefighter. Rewards can come in many forms. <a href="http://www.hazyitsm.com/2014/05/the-big-3-questions-of-consequence.html" target="_blank">Some obvious, some not</a>.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
What about SLAs? Are they used to measure individual performance in addition to organizational adherence to agreements? Too often they are. We must remember that SLAs are <i>minimally acceptable</i> targets when it comes to individual performance. Meeting one is the equivalent of earning a C grade: meets expectations. If all employees strive to merely meet your SLA targets, that leaves no room for the occasional task that fails to meet minimum expectations. In order to meet organizational SLAs, we need performance on individual tasks to exceed minimal expectations more often than not. Do your expectations around employee performance reflect a culture of high standards? Look carefully at how you set expectations around individual performance. If they are the same as the standards around organizational performance (i.e., SLAs), you may be unintentionally creating a culture of low employee engagement.</div>
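The arithmetic behind "minimally acceptable" is easy to demonstrate. In this sketch (the 90% SLA, ticket volumes, and hit rates are all hypothetical numbers, not benchmarks), agents who aim exactly at the SLA floor leave the organization missing its monthly target roughly half the time, while agents held to a higher individual standard clear it almost every month:

```python
import random

def months_meeting_sla(individual_hit_rate, sla=0.90,
                       months=1_000, tickets_per_month=200, seed=7):
    """Fraction of simulated months in which the organization meets the
    SLA, when each ticket independently hits its target with the given
    individual rate."""
    rng = random.Random(seed)
    met = 0
    for _ in range(months):
        hits = sum(rng.random() < individual_hit_rate
                   for _ in range(tickets_per_month))
        if hits / tickets_per_month >= sla:
            met += 1
    return met / months

# Aiming at the floor leaves no margin for the occasional miss...
print(f"aim at 90%: org meets SLA in "
      f"{months_meeting_sla(0.90):.0%} of months")
# ...while a higher individual standard absorbs the bad days.
print(f"aim at 97%: org meets SLA in "
      f"{months_meeting_sla(0.97):.0%} of months")
```

Individual performance at the organizational floor is a coin flip for the organization; individual performance above the floor is what actually delivers the SLA.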
<div dir="ltr">
<br /></div>
<div dir="ltr">
Think of it this way. Performance against the standards you set on an individual basis is a key leading indicator of overall organizational performance. Make your standards high, clear, and reachable.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Provide reinforcement. Publicly recognize performance excellence, focusing on the "why" and "how" over the "what." The "what" of recognition just says "well done". "How" takes it a step further and indicates how the performance enables a greater goal or outcome. Most important, "why" personalizes the experience to say "I get why you are good".</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Engagement doesn't have to be a nebulous concept. Managed purposefully from the top, it can create tremendous value. At a time when IT departments continue to struggle for the favor of business partners, we need all the employee engagement we can muster. And it starts with you.</div>
<h3>The Big 3 Questions of Consequence (2014-05-19)</h3>
I had a great conversation with a new client today. It was a pre-workshop call to solidify the agenda for our upcoming process re-engineering workshop. The discussion turned to how we transition from the old processes to new ones. It's one thing to design a new business process. It's another thing entirely to put that process into practice in a manner that optimizes the likelihood of success.<br />
<br />
In addition to some standard organizational change management suggestions, such as communication and training, I mentioned the importance of identifying consequences associated with the old procedures. I said something along the lines of, "It's just as important to understand the positive consequences some staff get for behaving inappropriately." I didn't mean legal or ethical inappropriateness, but behavior that is inconsistent with the desired process. I asked these questions:<br />
<br />
<ul>
<li>Do people on your teams ever get praise for putting out day-to-day fires?</li>
<li>How often do they receive praise for doing the right thing so that the fire-fighting situation never arises?</li>
</ul>
<br />
The answer to the first question is frequently "all the time," while the answer to the second is frequently "never."<br />
<br />
A while back I wrote about some practical considerations for <a href="http://www.hazyitsm.com/2013/05/process-and-culture-change.html" target="_blank">process and culture change</a>. One part of that post was about consequences for appropriate and inappropriate behavior. I find that consequences and rewards are the most overlooked part of organizational change. I thought about it some more today, and realized that almost every failed process change I've seen addressed the following three questions poorly or not at all. Conversely, in addition to having clear measurable goals, every successful major process or organizational change addressed each of them thoroughly.<br />
<br />
<h3>
Three Questions of Consequence</h3>
<br />
<ol>
<li>When developing new processes, are you including positive consequences for behaving appropriately?</li>
<li>Does your current process have positive consequences for behaving inappropriately?</li>
<li>Does your current process have negative consequences for behaving appropriately?</li>
</ol>
<br />
Let's take a brief look at each question.<br />
<br />
<h4>
When developing new processes, are you including positive consequences for behaving appropriately (and negative consequences for behaving inappropriately)?</h4>
This can manifest in many ways, but it is critical to include clear expectations of positive and negative behavior. How do you appraise employees? Do you include appraisal criteria for process changes? For example, when redefining change processes, have the relevant employees' job descriptions and/or performance assessment criteria been updated to reflect desired behaviors under the new process? I am surprised how often this is overlooked or considered minimally important. Process folks all too often assume that everyone else gets the importance of process change the same way we do. At the very least, we assume that process compliance is out of scope for any process (re-)engineering project. We must work with the personnel supervisors to ensure that compliance expectations are documented on a role-by-role basis. An additional benefit is that you can also determine whether the supervisors are on board.<br />
<br />
I can't state this strongly enough: Your process re-engineering or improvement project <i>will fail</i> without clear behavior expectations.<br />
<br />
<h4>
Does your current process have positive consequences for behaving inappropriately?</h4>
These are the most dangerous consequences, and the most important to uncover. As you address organizational change, look for the hidden positive and negative consequences embedded in the current process. Are there a few "star" performers who are always called out for exceptional fire fighting? If I'm consistently rewarded for my fire fighting efforts, why on earth would I want to help make a transition to a more consistently applied process? Don't forget that everyone else notices the rewards and accolades given to those who work outside the boundaries of the preferred system. What makes it even harder is that we're also talking about <i>perceived</i> positive consequences. Of course I don't intend to reward the system administrator who routinely takes on requests that should go through the service desk. As a manager, however, I may forget the occasional process lapses and focus my promotion efforts on all the glowing customer feedback. What happens when other staff perceive that the rule-bender gets more positive attention and even promotions?<br />
<br />
Equally important in this assessment is a determination of why the rule-bender bends the rules in the first place. Are there problems with the service desk to the point where customers understandably seek out help elsewhere? Are there issues where, in the best interests of your business, <a href="http://www.hazyitsm.com/2011/12/should-help-desk-be-place-to-start-it.html" target="_blank">the service desk <i>should</i> be bypassed</a>?<br />
<br />
<h4>
Does your current process have negative consequences for behaving appropriately?</h4>
Just as baffling is the idea that there are actually negative consequences for performing appropriately. These can be the most difficult to uncover, as they are usually the least intentional. Again, these can be perceived or purposeful, and are frequently associated with positive consequences for behaving inappropriately. Is someone following expectations around change request lead time also being punished by their boss for slow throughput? Maybe a service desk agent is evaluated poorly due to a lower volume of incidents handled, while all along they were following the defined expectation of minimizing ticket re-assignments. Can you add any of your own examples?<br />
<br />
Being purposeful about formal and informal consequences is a critical part of any process re-engineering, improvement, etc. program. The most thoughtful and well designed process changes are doomed to fail without thorough assessment and, if needed, remediation of competing and conflicting consequences.<br />
<br />
What are your thoughts?
<h3>Wherefore art thou, service desk? (2014-04-06)</h3>
<div dir="ltr">
This started as a reply to a <a href="http://goo.gl/Q6IyeL" target="_blank">blog post</a> written by ITSM consultant extraordinaire Barclay Rae. Go back and read that if you get the chance. My thoughts are intended to complement what Barclay shared and expand on an area that's had a lot of my attention lately. Where has the service desk gone, what have we done to it, and how do we give it its proper place in ITSM? </div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
We've worked so hard to define ITSM and ITSM software as "more than (just) the help desk" that we ourselves started to believe the service desk doesn't matter. We (the ITSM industry) have such a strong inferiority complex that we needed to show everyone else, the business and the rest of IT, that what we do is more than logging calls and managing trouble tickets. Along the way, we've forgotten that the people we put between the rest of IT and the business actually matter. Instead we looked for easier and cheaper ways to perform that function. This has given rise to:</div>
<div dir="ltr">
</div>
<ol>
<li>Outsourcing the service desk function entirely, and/or</li>
<li>Scaling back on the quantity and quality of people we use to staff internal service desks</li>
</ol>
Over the past year I've come across more and more organizations downplaying the importance of the service desk. Even when facilitating Incident Management workshops, the service desk is barely mentioned. The service desk manager may even be present, but he or she recognizes that the people taking the calls are treated as virtually irrelevant.<br />
<div dir="ltr">
<br /></div>
<div dir="ltr">
We've justified the easier, cheaper goal based on <a href="http://goo.gl/rjB9FY" target="_blank">research</a> that tends to imply that customers don't want to talk to us anyway. The service desk is an anachronism of days long past when customers had no other options.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I'd argue that internal technology support is significantly different from most transactional customer support, but I'm willing to set that aside for now. Let's go with the assumption that most customers of internal IT services prefer not to interact with your staff. So customers try to resolve issues on their own. What happens when they can't? They've googled the issue, searched through your knowledge base, asked coworkers for help, and the issue remains. It's clear that by this point their issue is not a simple password reset that self service can easily solve. </div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Now they need to call in professional help, so who do we have them call? Outsourced staff armed only with scripts for common issues, or internal staff of whom we expect little more than being able to transcribe the issue into a ticket. These are the people you want representing the face of your services to your business?</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Yes, we're probably right that automating simple service transactions is a good idea. But that doesn't mean we can scale back on the people who do engage with our customers. What it means is that the people who still call for help can be divided into two groups:</div>
<div dir="ltr">
</div>
<ul>
<li>Those with more complex issues that aren't easily resolved by standard repeatable steps, OR</li>
<li>Those who prefer not to use self help, and need more hand holding</li>
</ul>
We need people taking those calls to have a broad technical base, so they understand how to triage and diagnose issues that may have multiple causes across technology AND business disciplines. They must understand how our specific business works. And we need people with the skills to empathize and listen, who actually enjoy helping less tech savvy customers achieve results.<br />
<div dir="ltr">
<br /></div>
<div dir="ltr">
Instead, we give our customers the polar opposite of what they need. We give them outsourced staff who know nothing about our business, and staff with minimal technical breadth.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
The old model was to staff the Service desk with entry-level systems analysts. The current model staffs the Service desk with the cheapest resources we can find. I propose that what is needed going forward are service desks staffed with junior business analysts or junior relationship managers. Let's use people who are focused on technology breadth and customer outcomes. </div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Crazy idea, huh? </div>
<h3>ITSM Cannot Live on Process Alone (2014-02-14)</h3>
I've received great feedback on my article, "<a href="http://www.hazyitsm.com/2014/02/process-improvement-is-not-service.html" target="_blank">Process Improvement is not Service Improvement</a>". I was in the process of responding to a comment when I realized the response probably needed its own article.<br />
<br />
Process improvement is frequently, maybe even almost always, a component of service improvement. What I'm saying is that it's a bad idea to use process CSFs and KPIs as the desired outcome of an improvement project. I've come into client projects where the goal was something akin to "improve Problem Management to CMMI level 3." That sounds like a noble purpose. The conversation might look like this.<br />
<blockquote class="tr_bq">
Me: What is your project goal?</blockquote>
<blockquote class="tr_bq">
Client: To improve the services we provide to the business. </blockquote>
<blockquote class="tr_bq">
Me: Why do you need to improve services? Is there a specific business driver? </blockquote>
<blockquote class="tr_bq">
Client: Reach maturity level 3 in Problem Management. </blockquote>
<blockquote class="tr_bq">
Me: OK, how will you know when you've reached it? </blockquote>
<blockquote class="tr_bq">
Client: We have some internal targets set. Once we've reached those, we'll have an outside audit done. </blockquote>
<blockquote class="tr_bq">
Me: Great! What do you hope to achieve by doing this? </blockquote>
<blockquote class="tr_bq">
Client: I told you. CMMI level 3. </blockquote>
<blockquote class="tr_bq">
Me: No, what I mean is, what is the business driver causing you to do this now?</blockquote>
<blockquote class="tr_bq">
Client: The CIO talked about SOX compliance. There was a finding in our internal audit that needs to be addressed. </blockquote>
<blockquote class="tr_bq">
Me: OK, so the business driver is the remediation of a SOX audit finding? </blockquote>
<blockquote class="tr_bq">
Client: Yes ... (Sigh) ... but our outcome is to reach CMMI level 3.</blockquote>
Then we discuss how maturity level 3 may have nothing to do with addressing SOX compliance. The client agrees with that, but says executives determined that CMMI level 3 would provide what they needed to remove the audit finding.<br />
<br />
OK, now we're getting somewhere. It turns out that HOW achieving level 3 remediates the audit finding was never shared with the project team. All they know is that they need to achieve level 3 to remediate an audit finding.<br />
<br />
Does having a process success factor, reaching Problem Management level 3 maturity, inherently improve the service provided to the business? No. There's an excellent chance it will even increase the cost of providing services. If we're going to increase costs, there had better be a clear business purpose for doing so, and that purpose should clearly provide more benefit than the added costs.<br />
<br />
How many different process changes could you implement in order to achieve CMMI level 3? What does it mean to achieve level 3? How do we know that the process changes put in place in order to reach level 3 will actually resolve the audit finding? It's possible that your process changes help you achieve level 3, but do not address the audit finding. This is a recipe for disaster. The "service improvement" effort is based on a process CSF, which was selected in order to meet a compliance issue, and we know that making the necessary process changes may not even resolve the compliance issue!<br />
<br />
This isn't a unique example. Plug in terms like "reduce incidents", "improve SLA achievement %" or just about any other process based metric you want.<br />
<br />
<h3 style="text-align: center;">
<span style="font-family: Georgia, Times New Roman, serif;">There is no direct correlation between achievement of the process goal and improvement of services provided to the business.</span></h3>
<br />
You might end up improving services, but you could just as easily increase costs of providing services with no business-visible improvement in those services.<br />
<br />
Process improvement is almost always part of service improvement. The two complement each other very well, but they are not the same thing. Before embarking on any sort of service improvement program, whether continual or one-time, make sure the desired business value is clearly defined before you start defining any process CSFs or KPIs.<br />
<br />
Failure to do so dooms your program before it even starts.
<h3>Process Improvement is not Service Improvement (2014-02-08)</h3>
<div dir="ltr">
What is the statute of limitations on ITSM transgressions? I hope it has long since passed, because I am now confessing some of my past sins.<br />
<br /></div>
<div dir="ltr">
I used "process improvement" interchangeably with "service improvement". </div>
<div dir="ltr">
<br />
There. I've said it on the Internet, where everything is indisputable fact. Good to unburden myself like that. It's like a good cleanse. </div>
<div dir="ltr">
<br />
I'm amazed by how often I come across CSI (Continual Service Improvement) efforts that list things like fewer incidents or process efficiency as the primary goals. But again, I once did the same thing. Much ITIL and process maturity guidance tries to sell the idea that "better" process is all we need to reach better service offerings. Process metrics like first contact resolution, mean time to restore, and self service growth are constantly presented as THE way to measure success in IT service management. Of course there are outliers who offer different measures, but the fact that they are outliers speaks volumes.</div>
<div dir="ltr">
<br />
Here's the confusing part: We are providing a service that is also a product. Customer interactions with individuals delivering the service are part of that service. We call those interactions "providing service" or "customer service". This word service is used all over ITSM, so it's easy to confuse the service product with the activity of customer service.</div>
<div dir="ltr">
<br />
Let's be clear. Incident management is not a service. It controls a process or series of activities done in order to restore a service to normal working state. Measures of incident management or customer service have no direct correlation to the willingness of customers to consume your service-product. Those measures may have an indirect connection. Please allow me to paraphrase Deming:</div>
<div dir="ltr">
<b><br /></b>
<blockquote class="tr_bq">
<b>You can have great customer service and zero service customers. </b></blockquote>
</div>
<div dir="ltr">
<br />
(Or in ITIL speak: You can have great service processes and zero service customers.) </div>
<div dir="ltr">
<br />
This is also true: <b>You can have lots of service customers and lousy customer service </b>(or lousy service processes). </div>
<div dir="ltr">
<br />
While service process quality is certainly a variable of overall service quality, they are not <u>synonymous</u>. More or better process definition does not mean better service. More process can even lead to degraded overall service-product. (See "<a href="http://goo.gl/ISZfSF">Is your IT Team and Budget a Victim of Process Over-Engineering</a>?") </div>
<div dir="ltr">
It's well documented that I am <a href="http://goo.gl/M480RI">not a fan of First Contact Resolution as a success metric</a>. I'll take it a step further and say that ITIL process metrics should never be used to measure service success. Process improvement does not mean service improvement. Service Request process improvement to CMMI level 4 doesn't mean you are delivering an improved service. It can be a tactic used to reach improved service, but it can just as easily end up being an expensive boondoggle that does nothing to improve your service-product.</div>
<div dir="ltr">
<br />
Service Improvement is about value. Do my customers feel they receive sufficient benefit from their cost investment? Do they like doing business with me? </div>
<div dir="ltr">
<br />
I've come across ITSM practitioners whose CIO had set a goal of reducing incidents by 2%. And that's their CSI initiative. What the heck does fewer incidents have to do with service value? Enough!</div>
<div dir="ltr">
<br />
I've come clean. Are you ready to as well?</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Edit: I've posted a <a href="http://www.hazyitsm.com/2014/02/itsm-cannot-live-on-process-alone.html">follow up article</a> to address some questions.</div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com15tag:blogger.com,1999:blog-2700395448511469789.post-7438019394961896072013-12-31T11:35:00.000-06:002013-12-31T11:35:10.367-06:00The ITSM Value Proposition is Incomplete<br />
Business value of IT services is often defined as a simple formula.<br />
<blockquote class="tr_bq">
<span style="font-size: large;">Value = Efficiency + Effectiveness</span></blockquote>
ITIL puts it this way.<br />
<blockquote class="tr_bq">
<span style="font-size: large;">Value = Utility + Warranty</span></blockquote>
<br />
Over the past few years I've come to believe that we are missing a key ingredient: Customer Experience. Customer service is not the same as customer experience, although they are related. Customer service is interested in making the customer as happy as possible after a service-impacting event has occurred. Customer experience is interested in how a consumer of a service interacts with the service through the life cycle of a service event.<br />
<br />
If I order a laptop for a new employee, customer service wants me to feel good about the outcome of the request. Customer experience is concerned about how I go about making the request, and any interactions I have during the process. Can I check the status through a portal? How easy or difficult is that to do? Was I able to find what I wanted quickly and easily? Was the actual request easy, and comprehensive enough for me to feel comfortable that I will receive value for what I paid?<br />
<br />
A recent experience got me thinking about the relationship between service efficiency, effectiveness, and customer experience.<br />
<br />
My family ran into some problems at dinner several weeks ago. Two of our three entrees were wrong to the point of needing to have them sent back and re-made. The third entree was prepared below expectation, but not to the point where my wife was willing to send it back and wait. Considering the moderately upscale restaurant context, I expected better accuracy in the orders, and faster turnaround when the two entrees were re-made. The server did an admirable job in taking care of us after the errors, and she did not throw anyone else under the bus -- a good lesson for all service practitioners.<br />
<br />
Then the bill came. It itemized our entire meal, including making the two incorrect entrees complimentary, or so I thought. It turns out that the two incorrect items were on the bill twice each. The comp entrees appeared a little further down. I asked the server about it, since it looked like they had intended to comp the two entrees but accidentally added them back in a second time. The server thought it was weird, too, and went to check with her manager. The manager stopped by (a loooooong time later) and explained that the bill was right. For inventory purposes they added each item that had been prepared, and simply removed the cost of the items we had returned. In other words, no comps. We simply didn't have to pay for the incorrect entrees that we returned. To keep the inventory accurate, they needed to add each item to the bill, subtract the returned items, and then add back in the items we accepted.<br />
<br />
To be fair, the manager did end up making our appetizer complimentary; but it made me think about efficiency, effectiveness, and customer experience.<br />
<br />
As a customer, I would have been OK if the bill simply included the items we kept. Why did my bill need to reflect the items prepared but sent back and comped? The answer was inventory accuracy, but was that a good answer? Including the returned items was good for service efficiency. Depending on how the restaurant sees service effectiveness, it could be considered either good or bad. Customer experience, however, was negatively impacted by using this method of keeping inventory straight. I wouldn't have thought twice about it if the bill simply itemized the food that was accepted. My expectations changed when I saw that some items on my bill were comped. At that point I started to wonder why the entrees weren't comped or discounted.<br />
<br />
Did the restaurant do themselves a disservice by making something irrelevant to the customer (accurate inventory levels) so clearly visible to the customer? This was a minor issue to me, but I wonder how often IT service providers do something similar in the name of process efficiency. Do we properly consider customer experience while designing and delivering services? I believe the answer is frequently "No".<br />
<br />
This can be as simple as how we respond to a customer status inquiry. Which response provides a better customer experience?<br />
<br />
<ul>
<li>Let me look up your tickets. I see that Bob updated the request 2 days ago, but I'm not sure what's happening now. He sometimes forgets to keep tickets updated, so I'll have him send you an update.</li>
<li>Let me check into this and get back to you. When is the best time to call you back?</li>
</ul>
<br />
It can also be more ingrained into service design. I come across many clients that ask their customer to fill in numerous, possibly confusing, fields looking for specific details during an initial request. It certainly is more efficient, and you could argue about effectiveness as well. I doubt, however, that it adds to a positive customer experience.<br />
<br />
I once overheard an IT staffer comment, "We need to get these people to understand how to make a request we can work with". One goal the team identified was to reduce the number of incoming requests. Seriously. During the discussion it became clear that what they really wanted was to increase the value of each request. It was simply the difference between looking at it from an IT efficiency perspective versus looking at it from a business value perspective.<br />
<br />
Business value of IT service is no longer just about utility and warranty. Customer experience is a crucial component of value. Nobody outside of IT is excited by the reduced hardware costs of virtualization, while the overall cost of IT continues to grow. That's no better than subtracting a line item from my bill, and then adding it back in a few lines later. There might be a good explanation, but the business customer doesn't care unless it adds to their experience.Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com1tag:blogger.com,1999:blog-2700395448511469789.post-88036100684706610862013-10-04T15:22:00.004-05:002013-10-04T15:28:42.142-05:00What makes for a compelling metrics story?<i>This article is cross posted at <a href="http://www.theitsmreview.com/2013/09/compelling-metrics-story/" target="_blank">The ITSM Review</a>.</i><br />
<br />
In my first article “<a href="http://www.theitsmreview.com/2013/09/metrics-story/">Do your metrics tell a story?</a>” I discussed the “traditional” approach to reporting metrics, and why that approach is ineffective at driving action or decisions.
<br />
<br />
Personal observations are far more effective than objective data alone. Personal observations that appear to conflict with the data presented can actually strengthen opposition to whatever decision or action the data suggests. Presenting data as part of a story reboots the way we receive data. Done well, it creates an experience very similar to personal observation.<br />
<br />
So how can we do this well? What makes a compelling metrics story?<br />
<br />
<h3>
<b>Every element must lead to a singular goal</b></h3>
This cannot be stressed enough. Any metrics story we tell must have a singular purpose, and every element of the package must exist only to achieve that purpose. Look at any report package you produce or consume. Is there a single purpose for the report? Does every piece of information support that single purpose? Does the audience for the report know the singular purpose? If the answer to any of these questions is no, then there is no good reason to invest time in reading it.<br />
<br />
ITSM legend <a href="http://www.itsmpa.org/itsmpa/ITSMPABioMalcolmFry.htm">Malcolm Fry</a> provides an excellent example of the singular goal approach with his “Power of Metrics” workshops. If you haven’t been able to attend one of his metrics workshops, you are truly missing out. I had the honor of attending when Fry’s metrics tour came through Minneapolis in August 2012. The most powerful takeaway (of many) was the importance of having a singular focus in metrics reporting.<br />
<br />
In the workshop, Fry uses a “Good day / Bad day” determination as the singular focus of metrics reporting. <a href="http://thoughtrock.com/">ThoughtRock</a> recorded an <a href="http://bit.ly/1dbj7ha">interview</a> with him that provides a good background of his perspective and the “Good day / Bad day” concept for metrics. The metrics he proposed all roll up into the determination of whether IT had a good day, or a bad day. You can’t get clearer and more singular than that. The theme is understood by everyone: IT staff, business leaders … all the stakeholders.<br />
<br />
There are mountains of CSF/KPI information on the Internet, and organizations become easily overwhelmed by all the data when trying to decide which CSFs and KPIs to use. Fry takes the existing CSF and KPI concepts and adds a layer on top of CSFs. He calls the new layer “Service Focal Point”.
<blockquote>
The Service Focal Point (SFP) provides a single measurement, based on data collected through KPIs. Good day, bad day is just one example of using SFPs. We only need to capture the KPIs relevant to determining the SFP.</blockquote>
(Fry also recently recorded a webinar: <a href="http://bit.ly/17Qg6Nk">Service Desk Metrics — Are We Having a Good Day or a Bad Day?</a> Sign up, or review the recording if you are reading this after the live date).<br />
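To make the rollup concrete, here is a minimal sketch of the Service Focal Point idea: several KPIs feed a single good day / bad day answer. The KPI names and thresholds below are invented purely for illustration; Fry's workshops do not prescribe these specific numbers.

```python
# Hypothetical sketch: roll several KPIs up into one Service Focal Point.
# KPI names and targets here are made up -- use whatever your organization
# actually captures and cares about.

def good_day(kpis: dict) -> bool:
    """Return True only if every KPI met its target for the day."""
    targets = {
        "major_incidents": lambda v: v == 0,         # no major incidents
        "sla_breaches": lambda v: v <= 2,            # at most two breaches
        "avg_speed_to_answer_sec": lambda v: v < 30, # calls answered quickly
    }
    return all(check(kpis[name]) for name, check in targets.items())

today = {"major_incidents": 0, "sla_breaches": 1, "avg_speed_to_answer_sec": 24}
print("Good day!" if good_day(today) else "Bad day.")
```

The point is the shape of the thing, not the formula: every KPI exists only to answer the one question, and anyone in the organization can read the answer.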
<br />
<h3>
<b>Create a shared experience</b></h3>
A good metrics story creates a new experience. Earlier I wrote about how personal histories – personal experiences – are stronger than statistics, logic, and objective data in forming opinions and perspectives. Stories act as proxies for personal experiences. Where personal experiences don’t exist, stories can affect opinions and perspectives. Where personal experience does exist, stories can create additional “experiences” to help others see things in a new way.<br />
<br />
If the CIO walks by the service desk, and sometimes observes them chatting socially, her experience may lead to a conclusion that the service desk isn’t working hard enough (overstaffed, poorly engaged, etc.) Giving her data demonstrating high first contact resolution and short caller hold times won’t do much to change the negative perception. Instead, make the metrics a story about reduced costs and improved customer engagement.<br />
<br />
A great story creates a shared experience by allowing us to experience similarities between ourselves and others. One of the most powerful ways to create a shared experience is by being consistent in what we report and how we report it. At one point in my practitioner career I changed metrics constantly. My logic was that I just needed to find the right measurement to connect with my stakeholders. It created the exact opposite outcome: My reports became less and less relevant.<br />
<br />
The singular goal must remain consistent from reporting period to reporting period. For example, you may tweak the calculations that lead to a Good day / Bad day outcome, but the “storyline” (was it a good day or a bad day?) remains the same. We now have a shared experience and storyline. Everyone knows what to look for each day.<br />
<br />
Use whatever storyline(s) works for your organization. Fry’s Good day / Bad day example is just one way to look at it. The point is making a consistent story.<br />
<br />
<h3>
Make the stakeholders care</h3>
A story contains an implied promise that the story will lead me somewhere worth my time. To put it simply, the punch line – the outcome – must be compelling to the stakeholders. There are few experiences worse than listening to a rambling story that ends up going nowhere. How quickly does the storyteller lose credibility as a storyteller? Immediately! The same thing happens with metrics. If I have to wade through a report only to find that there is ultimately nothing compelling to me, I’ll never pay attention to it again. You’ll need to work pretty hard to get my attention in the future.<br />
<br />
This goes back to the dreaded Intro to Public Speaking class most US college students are required to take. When I taught that class, the two things I stressed more than anything were:<br />
<ul>
<li>Know your audience</li>
<li>Make your topic relevant to them</li>
</ul>
If the CIO is your primary audience, she’s not going to care about average call wait times unless someone from the C-suite complained. Chances are good, however, that she will care about how much money is spent per incident, or the savings due to risk mitigation.<br />
<br />
<h3>
Know your ending before figuring out the middle of the story</h3>
This doesn’t mean you need to pre-determine your desired outcome and make the metrics fit. It means you need to know what decisions should be made as a result of the metrics presentation before diving into the measurement.<br />
<br />
Here are just a few examples of “knowing the ending” in the ITSM context:<br />
<ul>
<li>Do we need more service desk staff?</li>
<li>How should we utilize any new headcount?</li>
<li>Will the proposed process changes enable greater margins?</li>
<li>Are we on track to meet annual goals?</li>
<li>Did something happen yesterday that we need to address?</li>
<li>How will we know whether initiative XYZ is successful?</li>
</ul>
<br />
<ul>
</ul>
<h3>
<b>A practical example</b></h3>
Where should we focus Continual Service Improvement (CSI) efforts? The problem with many CSI efforts is that they end up being about process improvement, not service improvement. We spend far too much time on siloed process improvement, calling it service improvement.<br />
<br />
For example, how often do you see measurement efforts around incident resolution time? How does that indicate service improvement by itself? Does the business care about the timeliness of incident resolution? Yes, but only in the context of productivity, and thereby cost (losses or savings).<br />
<br />
A better approach is to look at the kind of incidents that cause the greatest productivity loss. This can tell us where to spend our service improvement time.<br />
<br />
The story we want to tell is, “Are we providing business value?”<br />
<br />
The metric could be a rating of each service, based on multiple factors, including: productivity lost due to incidents; the cost of incidents escalated to level 2 & 3 support; number of change requests opened for the service; and the overall business value of the service.<br />
<br />
Don’t get hung up on the actual formula. The point is how we move the focus of ITSM metrics away from siloed numbers that mean nothing on their own, to information that tells a compelling story.<br />
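As a purely illustrative sketch (the weights, dollar figures, and service names below are all made up), the rating described above might look something like this:

```python
# Hypothetical per-service rating from the factors listed above. The
# weights and inputs are invented for illustration -- the rollup pattern,
# not the formula, is the point.

def service_rating(productivity_lost_hrs, escalation_cost, open_changes,
                   business_value):
    """business_value: 1 (low) .. 5 (critical). Higher rating = more
    pain on a more valuable service = better CSI candidate."""
    pain = (productivity_lost_hrs * 50   # rough cost of lost productivity
            + escalation_cost            # L2/L3 support spend
            + open_changes * 100)        # churn from change requests
    return pain * business_value

services = {
    "Email": service_rating(40, 2000, 12, 5),
    "Time tracking": service_rating(60, 5000, 3, 2),
}
# Focus CSI effort where the combined pain and business value is highest.
print(max(services, key=services.get))
```

A single ranked list like this answers the story's question ("Are we providing business value?") far better than four separate siloed charts ever could.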
<br />
If you would like guidance on coming up with valid calculations for your stories, I highly recommend “<a href="http://amzn.to/13bUyJC">How to Measure Anything: Finding the Value of Intangibles in Business</a>” by <a href="http://www.hubbardresearch.com/our-staff/">Douglas Hubbard</a>.<br />
… and a few more excellent resources:<br />
<ul>
<li><a href="http://amzn.to/181kv1V">Keeping Up with the Quants: Your Guide to Understanding and Using Analytics</a></li>
<li><a href="http://youtu.be/XVMYTplQ158">Stories vs. statistics: Professor John Allen Paulos at TEDxTempleU</a></li>
<li><a href="http://bit.ly/176IAna">Storytelling – A Tale of ONE Hot Dog</a></li>
</ul>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-75449580819453190922013-10-04T15:22:00.003-05:002013-10-04T15:29:03.736-05:00Do your metrics tell a story?<i>This article was cross posted at <a href="http://www.theitsmreview.com/2013/09/metrics-story/" target="_blank">The ITSM Review</a> and <a href="http://kpilibrary.com/topics/do-your-metrics-tell-a-story" target="_blank">KPI Library Expert</a> blogs.</i><br />
<br />
Do your service management metrics tell a story? No? No wonder nobody reads them.<br />
<blockquote class="twitter-tweet" width="500">
Do your <a href="https://twitter.com/search?q=%23metrics&src=hash">#metrics</a> tell a story? No? No wonder nobody reads them. <a href="https://twitter.com/search?q=%23ITSM&src=hash">#ITSM</a> <a href="https://twitter.com/search?q=%23ITIL&src=hash">#ITIL</a> <a href="https://twitter.com/search?q=%23CIO&src=hash">#CIO</a><br />
— Dan Kane (@hazyitsm) <a href="https://twitter.com/hazyitsm/statuses/362039090864074754">July 30, 2013</a></blockquote>
<script async="" charset="utf-8" src="//platform.twitter.com/widgets.js"></script><br />
<div style="background-color: white; color: #444444; font-family: Arial, Helvetica, sans-serif; font-size: 12px; line-height: 16.796875px; padding: 0px 0px 10px;">
</div>
That was a tweet I posted a few weeks ago, and it’s had some resonance. I know that during my practitioner days, I missed many opportunities to tell a compelling story. I wanted everyone else to get the message I was trying to communicate, and couldn’t figure out why my metrics weren’t being acted upon. I had a communications background before getting into IT, so I should have known better.<br />
<br />
<h3>
<b><span style="font-family: inherit;">Facts are not the only type of data</span></b></h3>
<div style="padding: 0px 0px 10px;">
I’ve blogged about metrics a few times before. In “<a href="http://www.hazyitsm.com/2012/11/lies-damned-lies-and-statistics-7-ways.html">Lies, Damned Lies, and Statistics: 7 Ways to Improve Reception of Your Data</a>” I shared a story about how my metrics had gone astray. I was trying to make a point to reinforce my perspective on an important management decision. In what became a fairly heated meeting, I found myself saying at least three different times, “the data shows…” Why wasn’t it resonating? Why was I repeating the same message and expecting a different result?<br />
<span style="background-color: transparent; color: #444444; font-family: inherit; line-height: 16.796875px;">Go back and read that article to see how it resolved. The short answer: I lost.</span><span style="font-family: inherit;"><br /><br />I’d love to live in a world where only objective, factual data is considered when making decisions or influencing others; but we have to recognize two important realities:</span><br />
<br />
<ol>
<li><span style="font-family: inherit;">Other types of data, especially personal historical observations that often create biases, are more powerful than objective data ever could be.</span></li>
<li><span style="font-family: inherit;">Your “objective” factual data can actually reduce your credibility, if it is inconsistent with the listener’s personal observations. As the information age moves from infancy into adolescence, we are becoming less trusting of numbers, not more.</span></li>
</ol>
<span style="font-family: inherit;"> So, giving reasons to change someone’s mind is not only ineffective, it can also make things worse. Psychological research indicates that providing facts to change opinion can cement opposing opinion more deeply than before.</span><br />
<span style="font-family: inherit;">Information, whether accurate or not, can be found that backs up almost any perspective. Why should I trust your data any more than the data I already have? Read the comments section from almost any news story about a controversial subject. How many minds get changed?</span><br />
<span style="font-family: inherit;"><br /></span>
<br />
<h3>
<span style="font-family: inherit;">We need a reason to care</span></h3>
<br />
<br />
<span style="font-family: inherit;">Why should I pay attention to, act on, or react to your metrics if there is no compelling reason for me to do so? We have to give our audience a reason to care. We want the audience of ITSM metrics to do something as a result of the metrics. The metrics should tell a story that is compelling to your intended audience.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">Let’s look at a fairly common metric – changes resulting in incidents. Frequently we look at the percentage of changes that generated major incidents (or any incidents at all).</span> Standing alone, what does this metric say? Maybe it shows a trend of the percentage going up or down over time. Even so, what action or decision should be made as a result of that data? Without context we can expect several different responses:<br />
<br />
<br /></div>
<ul>
<li>Service Desk Manager: “Changes are going in without proper vetting and testing.” </li>
<li>Application Development Manager: “We need to figure out why the service desk is creating so many incidents.” </li>
<li>IT Operations Director: “Who is responsible for this?” </li>
<li>CIO: “zzzzzzzz” </li>
</ul>
Who has the appropriate response? The CIO, of course (and not just because she’s the boss)! The reality is that the metric means nothing at all. Which is kind of sad, really, since there may actually be something to address.<br />
<div>
<br /></div>
<div>
Maybe the CIO will initiate some sort of action, but not until she hears a compelling story to accompany the metric. If the metric itself doesn’t tell the story, decisions will be made based on the most compelling anecdote, whether or not it is supported by the metric.<br />
<br />
<h3>
Metrics need to tell a story</h3>
At a new job around 15 years ago, I inherited a report that had both weekly (internal IT) and monthly (business leadership) versions. Since the report was already being run, I assumed it must be useful and used. The report consisted of the standard ITSM metrics:<br />
<ul>
<li>number of calls opened last month vs. historical </li>
<li>incident response rate by team and priority </li>
<li>incident resolution rate by team and priority </li>
<li>highest volume of incidents by service </li>
<li>etc. </li>
</ul>
However, after a few months I realized that nobody paid attention to these reports, which surprised me. According to ITIL these are all good metrics to pull. I saw useful things in the data, and even made some adjustments to support operations as a result. However, my adjustments were limited in scope, and the improvements I saw initially didn’t hold, so everyone simply went back to the “old ways”. The Help Desk team that reported to me did experience a sustained, significant improvement in their first contact resolution rate, but all other areas of support saw nothing but modest improvements over time.<br />
<br />
The fact is that the reports didn’t tell a compelling story. There were other factors as well, but looking back now I can see that the lack of a consistently compelling metrics story held us back from achieving the transformation for which we were looking.<br />
<br />
<h3>
So your metrics need to tell a story, but how?</h3>
The traditional ITSM approach to presenting data does a poor job at changing minds or driving action, and it can actually strengthen opposing perspectives. Can you think of an example where presenting numbers drove a significant decision? Most likely, the numbers had a narrative that was compelling to the decision maker. It could be something like, “our licensing spend will decrease by 25% over the next three years, and 10% every year after.” That would be a pretty compelling story for a CFO decision maker.<br />
<br />
In my <a href="http://www.hazyitsm.com/2013/10/what-makes-for-compelling-metrics-story.html" target="_blank">next article</a>, we’ll look at how metrics can tell a compelling story.</div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-55174034312809574632013-09-02T18:08:00.000-05:002013-09-02T22:55:46.359-05:00First Contact Resolution is the last refuge of a scoundrel<blockquote class="tr_bq">
<blockquote class="tr_bq">
"<a href="http://bit.ly/1dIW93R" target="_blank">Patriotism is the last refuge of a scoundrel</a>"<br />- Samuel Johnson, April 1775</blockquote>
</blockquote>
This quote came to mind after reading a comment on another ITSM blog. The comment indicated that the core service desk metrics needed for senior management were first contact resolution (FCR), mean time to repair/resolve (MTTR), SLA compliance, and customer satisfaction. This was just a given. I come across that and similar perspectives frequently through client interactions and other online discussions, so I assume this perspective remains fairly mainstream.<br />
<br />
In his famous quote, Samuel Johnson decried the use of insincere patriotism, especially as a means to sway public opinion or change parliamentary votes. The end may occasionally justify the means (One could argue that the unbridled use of "patriotic" messaging in the USA during World War II to raise money and strengthen support for the war effort was a critical element in defeating Hitler and his allies. I don't bring that up to spark a debate over that justification -- only that this is a fairly mainstream opinion here in the States). On the other hand, everyone reading this article can likely come up with MANY examples of governments using patriotism as a means to justify horrific outcomes. Fill in your own examples.<br />
<br />
The point I wish to make is that Johnson wasn't denouncing patriotism. He was denouncing patriotism as the <i>means</i> to an end. This is where we come back to ITSM. Can we really use MTTR or FCR as a means to determine the quality of service, and more importantly, the value to our business? Don't assume a great FCR means the service desk is maximizing their value to the business. What if the resolutions they provide are nothing but band-aids for an open wound? One of my favorite Dilbert cartoons is one with <a href="http://bit.ly/1dJ1wQu" target="_blank">Dogbert doing tech support</a>. Dogbert interrupts the caller, saying "Shut up and reboot" and then "Shut up and hang up." The outcome is the much revered improvement in average call handle time. The reason it resonates with me is that it so closely matches reality. How often do we promote metrics that, while well intentioned, actually encourage less than optimal behavior?<br />
<br />
Of course relating Johnson's quote to ITSM metrics is an exaggeration. Most ITSM metrics gathering exercises are well intentioned. Many of the mainstream ITSM metrics are popular as a response to perceptions of lousy service from internal IT. It really mattered that we showed improved responsiveness to incidents, because that's what the rest of the business saw every day. Today, however, our business partners expect more from IT, and they want it to cost less. We have to be very careful how and where we spend our resources. Reliance on traditional operating metrics may temporarily improve morale in the "troops", but it can severely hurt where the business really needs IT. If you need to choose how to use your resources, should you put your efforts into getting SLA compliance over 90%, or going live with Marketing's new campaign on time with little-to-no errors? Let's put it another way: what is the ROI of each option?<br />
<br />
IT people, myself included, can act like attention-starved puppies sometimes. We'll do <i>anything</i> to get immediate positive feedback, regardless of the consequences. It feels so good to be the hero that gets the PC working for a coworker, forgetting that the new customer campaign depends on that kiosk you're developing being ready for UAT by the end of the day.<br />
<br />
Using MTTR or FCR to show executives how well IT performs is certainly not the act of a scoundrel, but we can be easily fooled into behaving like it. Look at what you measure and the targets you set. Do you know the ROI of reaching those targets? You may be surprised at the result.Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-56315059751863147082013-06-11T16:24:00.000-05:002013-06-11T16:31:06.594-05:00What is IT's role in the business? <div dir="ltr">
That's a pretty big question. A recent blog post by Nate Beran ("<a href="http://natetechnically.blogspot.com/2013/06/who-are-we-to-decide.html">Who are we to decide?</a>") and subsequent <a href="https://plus.google.com/u/0/102949431461137494152/posts/c4h1NGZPW2b" target="_blank">Google+ discussion</a> got me thinking about that. Nate nails the point about IT being a poor business enabler. We take it upon ourselves to save the users from themselves, often rendering the user unable to do their job.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I take a slightly different perspective on what we should do about it. I agree with Nate that we're not the police. We've taken an assumed/outdated mandate from the old paradigm, and continue to enforce it in a completely different business world. Find me an IT shop that has never rejected a request due to an exaggerated or out-of-context risk.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
My take is that, ultimately, IT should be the technology investment advisor/planner. We take time to understand the business goals, and help management determine the level of risk tolerance. Then we offer advice around how to meet those goals and mitigate risk. The executive team and the board decide what to do with the advice. If we've become the trusted advisor, they'll run with our advice more often than not.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
Like their partners in Finance, IT takes on a dual role of investment advisor and operational implementer. At times we'll need to be the operational enforcer, also like Finance. Unlike the old paradigm, we now have a board level mandate, based on real choices. IT enforcement can stay away from the petty concerns like whether an account executive can use Evernote, and even less petty concerns like BYOD. If the proper risk analysis has been done, management has already determined the scope of BYOD. IT doesn't need to be the gatekeeper.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
This may take some time, because while most IT shops are pretty good at macro-level risk assessment, we tend to be lousy at the micro level. We apply the same level of rigor to both contexts -- the old paradigm. We've got a lot of learning to do.</div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com1tag:blogger.com,1999:blog-2700395448511469789.post-78442658468191205842013-05-14T16:36:00.001-05:002013-05-14T16:36:38.449-05:00Process and Culture Change<body>I was re-reading an IT Skeptic blog post from a couple months ago,
called "<a href="http://www.itskeptic.org/content/obtaining-compliance-policies-processes-and-procedures" target="_blank">Obtaining
compliance for policies, processes and procedures</a>". It's a quick
read, so I recommend you check it out before (or just after) finishing
this article. While reading the post, I was reminded of a class I used
to teach other managers at a previous job. The session was about
handling performance problems on your team, and was based
significantly on the book "<a href="http://www.amazon.com/Coaching-Improved-Work-Performance-Revised/dp/0071352937" target="_blank">Coaching
for Improved Work Performance</a>" (Amazon.com link) by Ferdinand
Fournies. Fournies discusses pragmatic coaching tools that can be used
to achieve optimal performance from employees. It is the most useful
management book I've ever come across, and I've found many management
books useful. What I like best is that, while the book covers a lot of
theory behind Fournies' approach, it is very action-oriented. There
are specific steps laid out clearly to address different scenarios. I
highly recommend the book for managers of all types, including process
managers who may or may not have actual supervisory authority.<br />
<br />
My class was focused on addressing work performance problems, using a
modified version of Fournies' coaching analysis tool. The coaching
analysis tool is a great complement to the IT Skeptic's compliance
steps, in that it amplifies them and gives some examples of practical
application. The tool is applied as a flowchart, asking the manager
simple "yes/no" questions with actions to take depending on the
answer. In Fournies' model there are 16 questions for a manager to
answer, starting with "Is it worth your time and effort?" and <span style="font-style: italic;">ending</span>
with "Could they do it if they chose to do it?". There are 15
questions to address before the one we often jump to: could they do it
if they wanted to? It's actually a very simple tool to apply, and you can
often run through the questions very quickly. Fournies intends this to
be used in an individual coach-employee scenario, but I find it useful
to guide myself with all staff. The difference is that you can drop
people out along the way when they become compliant with the new
procedures.<br />
<br />
I'll provide the list of questions, but I'll leave the detailed discussion to those who pick up
Fournies' book. I'll highlight a few favorites, however.<br />
<ol>
<li>Is it worth your time and effort?</li>
<li>Do they know what they're supposed to do?</li>
<li>Do they know how to do it?</li>
<li>Do they know why they should do it?</li>
<li>Are there obstacles beyond their control?</li>
<li>Do they think your way will not work?</li>
<li>Do they think their way is better?</li>
<li>Do they think something else is more important?</li>
<li>Are there positive consequences for performing appropriately?</li>
<li>Are there negative consequences for performing appropriately?</li>
<li>Do they anticipate future negative consequences for performing
appropriately?</li>
<li>Are there positive consequences to them performing
inappropriately?</li>
<li>Are they performing inappropriately without receiving negative
consequences?</li>
<li>Are personal problems interfering?</li>
<li>Could they do it if they chose to do it?</li>
</ol>
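The flowchart nature of the tool makes it easy to sketch in code. Here's a toy sketch in Python: the question text comes from the list above, but the trigger answers and suggested actions are my own illustration, not Fournies' exact guidance.

```python
# Toy sketch of the coaching analysis as an ordered yes/no flow.
# Question text is from the list above; the trigger answers and
# suggested actions are illustrative, not Fournies' exact guidance.

COACHING_FLOW = [
    # (question, answer that stops the flow, suggested action)
    ("Is it worth your time and effort?", "no", "Let it go."),
    ("Do they know what they're supposed to do?", "no", "Clarify expectations."),
    ("Do they know how to do it?", "no", "Train and demonstrate the 'how'."),
    ("Do they know why they should do it?", "no", "Explain the reasons."),
    ("Are there obstacles beyond their control?", "yes", "Remove the obstacles."),
    ("Do they think your way will not work?", "yes", "Ask, listen, and resolve the doubt."),
    ("Do they think their way is better?", "yes", "Compare the approaches openly."),
    ("Do they think something else is more important?", "yes", "Reset priorities."),
    ("Are there positive consequences for performing appropriately?", "no",
     "Add positive consequences."),
    ("Are there negative consequences for performing appropriately?", "yes",
     "Remove the hidden penalty for doing it right."),
    ("Do they anticipate future negative consequences for performing appropriately?", "yes",
     "Address the feared outcome."),
    ("Are there positive consequences to them performing inappropriately?", "yes",
     "Remove the hidden reward for doing it wrong."),
    ("Are they performing inappropriately without receiving negative consequences?", "yes",
     "Introduce appropriate consequences."),
    ("Are personal problems interfering?", "yes", "Support or refer as appropriate."),
    ("Could they do it if they chose to do it?", "no", "They may be miscast; rethink the role."),
]

def coaching_analysis(answers):
    """Walk the questions in order; return the first (question, action) that fires.

    `answers` maps question text to "yes" or "no".
    """
    for question, trigger, action in COACHING_FLOW:
        if answers.get(question) == trigger:
            return question, action
    return None, "No blocker found: this is now a choice-and-consequence conversation."
```

The point of the ordering is the same one Fournies makes: you never reach the "could they if they chose to?" conversation until the fourteen cheaper explanations have been ruled out.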
<h4>
#3: Do they know how to do it?</h4>
This question immediately follows "Do they know what they're supposed
to do?". I like this question because we so frequently assume that
someone who knows what they're supposed to do MUST know how to do it.
Is this true in any other context besides business? Let's say you are
a brand new golfer. If I tell you that you're supposed to hit the ball
into the cup, you know what you're supposed to do. So how come you do
it so badly? You must be unwilling to change! It's an exaggeration, I
know, but I've had similar things occur. I implemented a new standard
where service desk reps needed to summarize the incident or request
back to the caller to ensure understanding. There was one rep who
refused to do it, which frustrated me. I found out, however, that
she had a specific flow to her calls that was different from
everyone else's. The end-of-call summary completely broke her flow and
slowed her down to the point where she couldn't keep up with even a
moderate call volume. She needed help with the "how" part.
<h4>
#6 and #7: Do they think your way won't work? / Do they think
their way is better?</h4>
The key here is to actually ask the employee these directly. Not "Why
won't you do it?", but "Do you think the new way doesn't work?". The
difference is that it invites the employee to share something that
would otherwise be seen as negative or complaining. They might even
have a better way of doing something; if not, you will need to convince
them otherwise.<br />
<h4>
#12: Are there positive consequences to them performing
inappropriately?</h4>
This is in the midst of several questions around consequences. The
point is to not just look at positive consequences for compliance and
negative consequences for non-compliance. There are also hidden,
unintentional consequences that could easily go unnoticed. A client
couldn't figure out why everyone was "too busy" to get to incidents
and requests sitting in queue in a timely manner. One of the things we
found was that several employees were taking in requests for work that
they handled outside of the standard processes. These requests were
usually not documented anywhere, except for a few emails between the
tech and the requester. These techs received all kinds of praise from
the requesters, who also made sure the tech's manager and the
requester's business unit management knew what great service they
received. It was so much easier than going through the service desk.
We had to work hard at cleaning up this "black market" of consequences.
It also pointed out some glaring issues with the standard
processes, since there were legitimate reasons why people didn't want
to use the service desk.<br />
<br />
Also note that the question about applying negative consequences is at
the <i>end</i> of the consequences section. The point is that, before you
start applying negative consequences to non-compliance, make sure
there are not hidden consequences (positive or negative) driving the
non-compliance.<br />
<br />
This is an excellent tool to apply in any cultural change
process, especially cultural change necessitated by ITSM journeys.
We've been talking for years about people/culture change being
arguably the most important part of any ITSM undertaking, but there has
been very little practical guidance on how to actually do it. The
available guidance tends to focus around training and attitude. As in:
"I'll train them, and, if staff do not follow the new procedures, it
must be their poor attitude"; or some variant of this.<br />
<br />
Go back to the golf example. An instructor shows and tells you how to
swing the driver. From that point on, all your shots are long and
straight, right? Of course not. It takes study, repetition, and
review. Determine what's going wrong, and repeat the cycle. Sounds a
lot like Deming's Plan-Do-Check-Act. When it comes to ITSM, consider
yourself lucky when the Plan stage consists of actual hands-on
training. And we wonder why ITSM projects fail.<br />
</body>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-74263258444349327552013-05-10T20:04:00.001-05:002013-05-10T20:04:29.532-05:00Service Level Management - Measure the Outcomes, Not the ProcessI hate SLAs. Or rather, I hate the way SLAs are commonly used. All too often, they are used as the outcome of Service Level Management; much the way the Service Desk is frequently equated to Incident Management. Yes, each is very important to the other, but they don't define each other.<br />
<div class="p1">
<span class="s1"><br /></span></div>
<div class="p1">
<span class="s1">I wrote about the relationship between Service Management and Service Level Agreements (SLAs) <a href="http://www.hazyitsm.com/2012/12/service-management-goals-and-service.html" target="_blank">a few months back</a>. I won't rehash the whole thing, but the point was that SLAs, especially when done as formal contracts with internal customers, create more harm than good.</span></div>
<div class="p1">
<span class="s1"><br /></span></div>
<div class="p1">
<span class="s1">Service Level Management, or SLM, is far more complex than simply meeting contractual targets. The latest episode of the Practitioner Radio podcast, "<a href="http://www.servicesphere.com/blog/2013/4/29/the-evolution-and-children-of-service-level-management-pract.html" target="_blank">The Evolution and Children of Service Level Management</a>", explains it very well. It was nice of Chris and Troy to cover this while I was writing this entry. I highly recommend that you take the 23 minutes to listen. They have a great discussion about the role of SLM, compared to its "children": Business Relationship Management, Service Catalog, and Service Owner. I'll post some of my own thoughts about the role of SLM sometime soon.</span></div>
<div class="p1">
<br /></div>
<div class="p1">
This time, however, I'd like to look at metrics around SLM. How do you measure the relative success of SLM? All too frequently we look at adherence to SLAs as the beginning and end of measurement.<br />
<br />
"If I meet my defined SLAs 90% of the time, we have a solid service management practice. Right?"<br />
<ul>
<li>When was the last time an irate customer was placated by that statement?</li>
<li>When was the last time your partners in the business recognized IT's significance through that statement?</li>
<li>When was the last time you generated business value with that statement?</li>
</ul>
I didn't think so. Now an argument could be made that the SLA is a fine internal measurement for identifying process efficiencies, but no one outside of your IT shop cares about efficiencies. They might pay it some lip service, but only because they see IT as a cost to be minimized. It's time to focus SLM on what it does for your business. I've got a few ideas based on my own observations and the aggregated observations of ITSM pros around the world.<br />
<br /></div>
<div class="p1">
<ul>
<li><u>Kill the SLA</u></li>
</ul>
<blockquote class="tr_bq">
First, remove the concept of contracts between service providers and their internal customers. SLAs done this way create an us-versus-them mentality right from the start. You work for the same business with the same business goals. Contracts with your business peers are a primary reason <a href="http://www.pcpro.co.uk/features/371254/why-everyone-hates-the-it-department" target="_blank">everyone else hates IT</a>. If you are an external service provider that must have SLAs, make it clear that SLAs are the minimally acceptable target, not the desired goal.</blockquote>
<blockquote class="tr_bq">
Replace the formal SLA with documented targets. It may sound like just a semantic change, but it's more than that. It's a completely changed mindset. These targets should NOT be based on what IT can do; they should be based on meeting and exceeding the three foundations of business strategy: Business Goals, Business Objectives, and Business Mission.</blockquote>
<ul>
<li><u>New Measures</u></li>
</ul>
<blockquote class="tr_bq">
Based on these new targets, start measuring the impact on the current:</blockquote>
<ol>
<li>Business Goals</li>
<li>Business Objectives</li>
<li>Business Mission</li>
</ol>
<blockquote class="tr_bq">
Let's say a business unit has a goal to increase revenues by 20% this year. Meet with the B.U. to determine how they are planning to hit their goal, and how your services can help them get there. Focus your service targets specifically around these things. </blockquote>
</div>
<ul class="ul1">
<li class="li1"><u>Continuous Review</u></li>
</ul>
<blockquote class="tr_bq">
You now have targets that you can measure and share with your partners. Meet with these partners regularly to review how well the targets have been met. Then ask the crucial question: are they on track to meet or surpass their targets? If the answer is yes, and you've met your own targets, you now have a clear connection between your SLM and the value being delivered. If the answer is no, but you are meeting your service targets, there is a clear indication of a disconnect between SLM and business goals.</blockquote>
<blockquote class="tr_bq">
Honestly, who cares if IT hits their targets if the business is not hitting theirs? Service level targets must be regularly reviewed and updated as needed in order to continue meeting <i>business</i> goals, objectives, and mission.</blockquote>
If you're doing anything else, you are failing. Period.<br />
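The continuous review boils down to reading two scoreboards side by side each period: IT's service targets and the business's own goals. A minimal sketch of that reading, where the wording of each quadrant is my own illustration of the interpretations discussed above:

```python
# Toy sketch of the continuous-review reading: compare IT's service-target
# attainment with the business's own goal attainment. The quadrant wording
# is illustrative.

def review_reading(it_targets_met: bool, business_on_track: bool) -> str:
    if it_targets_met and business_on_track:
        return "Clear connection between SLM and delivered business value."
    if it_targets_met and not business_on_track:
        return "Disconnect between SLM and business goals: revisit the targets."
    if not it_targets_met and business_on_track:
        return "Targets missed yet goals met: do these targets matter at all?"
    return "Both missed: re-plan the targets around the business goals."
```

Notice that three of the four quadrants end in questioning or reworking the targets themselves; only one is a pat on the back.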
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-47911339660588195522013-04-13T17:49:00.000-05:002013-04-13T17:49:26.330-05:00Automate With CareIn my continuing role as the <a href="http://goo.gl/6zdiB" target="_blank">Jiminy Cricket</a> of IT Service Management, I frequently find myself pulling the reins in on the futurists, and pushing the traditionalists along. My take on the role of automation fits that perspective very well.<br />
<br />
A recent tweet by <a href="http://twitter.com/WindUpBird/" target="_blank">Mark Kawasaki</a> got me thinking about the role of automation in ITSM, as well as the broader field of Customer Experience Management. Keep in mind that I work for a process automation software company, so apply appropriate filters as you see fit.<br />
<br />
Anyway, here's the original tweet.<br />
<blockquote class="twitter-tweet">
Robots, Self-Service, and Automation are only removing the kinds of human interaction that no one wants anyway. <a href="https://twitter.com/search/%23ITSM">#ITSM</a> <a href="https://twitter.com/search/%23custserv">#custserv</a><br />
— Mark Kawasaki (<a href="http://twitter.com/WindUpBird/" target="_blank">@WindUpBird</a>) <a href="https://twitter.com/WindUpBird/status/309108239797190657">March 6, 2013</a></blockquote>
Mark is a talented and insightful ITSM thinker. If you don't follow him, you really should. As much as I would love to agree with his tweet, I have some reservations that are more than minor clarifications. First, automation efforts frequently remove human interactions that customers may well want. Second, and more important, is that decisions around automation tend to have more to do with cost savings than customer experience.<br />
<br />
<h3>
Automation removes interactions desired by your customers</h3>
I believe it's inherently dangerous for service providers to assume too much about their customers. Yes, those of us in the tech community tend to view human interaction, especially in regard to customer service, as a roadblock. It's the "I know what's wrong with my cable service, please don't make me jump through customer service hoops to schedule a service call" mentality. Is that true for all customers? Are we forced to go through human customer service just to cater to the lowest common denominator? Maybe in some cases, but I doubt that's true most of the time.<br />
<br />
I am suggesting, shockingly, that IT pros may not be the best judges of the value of human interaction in customer service and service management. Just as shocking: your CFO may not be either.<br />
<br />
I think of the credit card commercial from the past year or so, where a guy is calling customer service while walking on a lonely beach at sunset. I tried to find a link to the commercial, but I can't find it. Feel free to add it in the comments if you do. Anyway, as soon as the line starts ringing, the guy starts hitting buttons on the phone, only to discover that he's talking to an actual live human being. The point of the commercial is that, for specific card programs, you can talk to a person instead of going through the IVR system. The guy is pleasantly surprised, as would be the intended audience of this commercial.<br />
<br />
Before you automate, make sure you are not removing something your customers like about you!<br />
<h3>
Decisions to automate based on cost savings, not customer experience</h3>
<div>
I also had the following Twitter exchange with Daniel Billing following an excellent Google Hangout discussion around Knowledge Management Systems:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-UnQk2qsNmeo/UWnByYVh2cI/AAAAAAAAATQ/f_G367mKe98/s1600/Screen+Shot+2013-04-12+at+4.38.22+PM.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" src="http://1.bp.blogspot.com/-UnQk2qsNmeo/UWnByYVh2cI/AAAAAAAAATQ/f_G367mKe98/s1600/Screen+Shot+2013-04-12+at+4.38.22+PM.png" height="459" width="640" /></a></div>
<br />
My point is this: Why do you want to automate something? If your primary reason is <i>not</i> improving customer experience, start over. Period. Even if the primary reason <i>is</i> improving the customer experience, how do you know? Are you rationalizing a decision that's really based on cost savings? We are in an "IT cost-savings" phase for many enterprises. We talk about business-enabling ITSM initiatives, but how often is that really true? CIOs are under tremendous <a href="http://goo.gl/rQT9E" target="_blank">pressure to reduce costs</a>, while <a href="http://goo.gl/aNVfA" target="_blank">innovating at the same time</a>. Most of us find innovation difficult, and even harder to explain to our business partners unless it comes from them. Cost savings, however, live in the very core of the modern CIO. They were bred on cost savings and efficiencies. We're not so good at identifying the customer experience impact of cost-savings efforts, and business partners tend to be distrustful of IT attempts to show negative impacts of decisions. Heck, IT always says the sky is falling; what's new this time?</div>
<div>
<br /></div>
<div>
Don't guess at how an automation decision will impact customer experience. Make Marketing your friend. Make Sales your friend. Ask them what is most important to the customers. Ask what their short- and long-term goals are. Describe what the automation will do in regard to the customer experience, and ask them to tell you the likely impact.<br />
<br />
Automation of all forms can provide amazing benefits, <i>and</i> it must be applied judiciously. Make sure it is done for the right reasons, and with the right measures of success.</div>
<script async="" charset="utf-8" src="//platform.twitter.com/widgets.js"></script>Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com2tag:blogger.com,1999:blog-2700395448511469789.post-86089344351655887992013-02-05T12:15:00.001-06:002013-02-05T16:03:11.385-06:00If you only knew the power... <div dir="ltr">
<a href="http://pinterest.com/pin/121104677452030459/" style="clear: right; float: right; margin-bottom: 1em; margin-left: 1em;" target="_blank"><img border="0" src="http://media-cache-lt0.pinterest.com/550/ff/69/3a/ff693abddbb254b8b3cc4ad0c7d0123d.jpg" height="300" width="300" /></a>I've gone and done it. I've joined the dark side.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
And I couldn't be happier. </div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
This week marks my final week as a pure ITSM practitioner. After 12 years, I'm leaving an organization I love to join one I admire, in a job I can't wait to start doing. I start my new life as an ITSM consultant on February 11. I can't tell you the new company's name, but it rhymes with Nervous Pow.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
I anticipate continuing this blog, and continuing to speak for the real world of practitioners. The ITSM community remains far too theoretical, with a lack of clear purpose. We have a lot of great thinkers out there who do excellent and necessary work; AND we need more people willing to share their practical, everyday ideas.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
There are several areas where I think we can, and should, be more proactive in helping practitioners navigate the real-world issues they encounter.</div>
<div dir="ltr">
</div>
<ul>
<li>What questions should I start asking, and to whom, to help determine where my company should start our service management initiative?</li>
<li>Give me some examples of how companies have started their journey, and why they chose that route.</li>
<li>When all the analysts and consultants talk about "value", what should I be sharing with my IT colleagues and business partners so we understand what that means?</li>
<li>What does a real service catalog look like? Don't just explain that a catalog should include services "in business terms". That's not helpful. Give actual examples of how companies have defined their services.</li>
<li>Examples, examples, examples, and more examples. We know that one organization's workflow won't be best for all organizations. But most of us are smart enough to look at a few examples of what others have done, and figure out how to change those examples to best fit our organizations.</li>
</ul>
<br />
<div dir="ltr">
We have way too much "Don't give away the milk for free" thinking in ITSM consulting. As in, "Why should I buy the cow when I can get the milk for free?", meaning I shouldn't share details or examples in social forums or blogs, because then no one will want to buy my services. Seriously? Is the extent of your value so shallow that reading 3-4 of your ideas online will tap out your good stuff? Patrick Lencioni's book <a href="http://amzn.to/XKhv4f" target="_blank">Getting Naked: A Business Fable About Shedding The Three Fears That Sabotage Client Loyalty</a> offers great practical advice. Instead of selling consulting, why not just start consulting to demonstrate why the client can't live without you? Social media outlets let us "just start consulting" to audiences far broader than we could reach on our own.</div>
<div dir="ltr">
<br /></div>
<div dir="ltr">
We're here to enable positive outcomes for businesses. Helping practitioners make their services better is what this is all about. Don't ever lose that perspective.</div>
<br />
<div style="line-height: 0px; padding-bottom: 2px;">
</div>
<div style="float: left; padding-bottom: 0px; padding-top: 0px;">
<div style="color: #76838b; font-size: 10px;">
Source: <a href="http://www.amazon.com/Come-Dark-Cookies-Funny-Poster/dp/B005MJ5RQQ/?keywords=star+wars+cookies&qid=1340319353&ref=sr_1_55&ie=UTF8&sr=8-55" style="color: #76838b; font-size: 10px; text-decoration: underline;">amazon.com</a> via <a href="http://pinterest.com/tsgeek/" style="color: #76838b; font-size: 10px; text-decoration: underline;" target="_blank">Dan</a> on <a href="http://pinterest.com/" style="color: #76838b; text-decoration: underline;" target="_blank">Pinterest</a></div>
</div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com1tag:blogger.com,1999:blog-2700395448511469789.post-29778255948016433472013-01-21T15:57:00.002-06:002013-01-21T15:57:37.963-06:00Pragmatic BYOD - 6 Considerations to Enable Employee-Owned DevicesThis blog is about, if nothing else, pragmatism. How do we take the conventional wisdom and best practices of the world outside my company, and make them useful in the real world of supporting real end users. With that in mind, I was annoyed to come across an article on CFO.com <a href="http://bit.ly/yrP0q0" target="_blank">(CFO, Don't Buy That Phone!</a>) which presents the benefits of BYOD, or "Bring Your Own Device", with the rosiest of rose-colored glasses. A particular paragraph got my attention.<br />
<blockquote>
Jim Buckley, CFO of mobile-device management firm MobileIron, points out other savings: “In a BYOD program, end users take more responsibility for their devices, taking the initiative to fix them themselves rather than involving support, and, because it’s their personal device, they take better care of them.” Another cost benefit Buckley identifies is that “companies no longer have to deal with the device life cycle. Smart phones and tablets generally change every 18 months. That’s a lot of new technology the enterprise no longer has to keep up with.”</blockquote>
<div>
This is consistent with the conventional wisdom around BYOD right now: that BYOD <em>inherently</em> reduces the cost of IT support because users will be more prone to take care of themselves. I ran that by one of our service desk reps, and he just started laughing. Uncontrollably.</div>
For BYOD to have any chance of driving real business value, several other things must also be done.<br />
<ol>
<li>You must invest in a robust mobile device management (MDM) system, that allows you to keep corporate data and apps secure, regardless of the device being used. Good MDM tools do exist, but they take money to buy and people to maintain the system and support the end users.</li>
<li>You must invest, possibly significantly, in the technology infrastructure to make the data and apps available to a broad, diverse set of devices, with users needing to access the systems from anywhere. Cloud will help with that, and help a lot. How many companies can make their legacy systems available in the cloud quickly and securely? We're moving in that direction, but it will take time, money, and people to get there.</li>
<li>Unless all your legacy systems and data already exist in a cloud-accessible app, you must invest heavily in virtualization technologies. You are probably providing some virtual services already, but just wait. The investment is about to get huge.</li>
<li>Who owns the device? The "YO" in BYOD stands for Your Own, meaning that we are frequently talking about a device owned by the employee themselves. This isn't a shared assumption in many cases, however, so policies must clearly identify ownership. At the very least, identify the differences in expectations and support between employee-owned and business-owned devices.</li>
<li>Expect investment in support services to increase in the near term. This is not like letting a corporate VIP use their MacBook instead of your standard HP laptop. You've probably invested in a highly standardized environment, which allowed you to minimize the FTE needed to support desktops and laptops. Now you have hundreds of different devices that employees may use, and they will expect you to know how to assist them with each and every one.</li>
<li>Keep expectation-setting positive. The last message you want to send your business partners is that IT needs to limit their (presumed) productivity. Start talking about what can be done. For example, you could create an internal blog of the adventures of a user's BYOD journey.</li>
</ol>
Don't get me wrong. BYOD is a necessary evolution of knowledge work. It is not a fad, and it is not going away. What does concern me is the sense that CFOs can just start allowing users to bring their own devices, and we can instantly reduce a few IT support headcount, or reallocate them to growth-enablement roles. Hey, these devices support themselves! When they need help, the users can just go to the service provider or manufacturer, right?<br />
BYOD, cloud, mobile, consumerization ... whatever you want to call this wave ... it will soon become mainstream, standard operating procedure. This is a very good thing, which will provide significant value to most organizations. It opens up knowledge workers to more individual control, and more creativity, leading to better ways to achieve results. Don't be afraid.Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-24919425926226383652013-01-18T13:45:00.002-06:002013-01-18T13:45:52.106-06:00Social Media in the Enterprise? That'll Never Happen!I just had an interesting exchange during an identity management planning discussion. We were reviewing fields included in a new HR system for possible mapping into the ID management system, and saw that fields to capture Facebook and Twitter IDs were included. Most folks in the room started laughing. I mentioned that social media handles or IDs will be included in internal employee directories sooner rather than later. The reactions ranged from scoffing to comments like "scary."<br />
<br />
Whether Twitter or Facebook accounts specifically will transition to the corporate staff directory remains to be seen.<br />
<blockquote class="tr_bq">
<span style="font-size: large;">Do you doubt that some method for sharing social media accounts across the mainstream enterprise will happen </span><span style="font-size: large;">soon?</span></blockquote>
I'm sure some forward thinking organizations already do this, since it's made it to at least one HR system's default fields list.<br />
<br />
Sidebar: I know there are social media management tools that allow companies to create internal social networks. Often it's an add-on to a suite of tools for monitoring and managing your company's social presence. That's an interesting start, but what individual wants to actively participate in a closed social network with <i>yet another</i> social account or persona? There are only so many people out there who participate in more than one social network/tool, especially when that network doesn't allow you to auto-link with Facebook. I have active <a href="http://www.facebook.com/DanielJKane" target="_blank">Facebook</a>, <a href="http://twitter.com/hazyitsm" target="_blank">Twitter</a>, <a href="http://plus.google.com/107804869521924906601" target="_blank">Google+</a>, and <a href="http://www.linkedin.com/in/danieljkane/" target="_blank">LinkedIn</a> accounts. I'm not normal (insert your joke silently here). Enterprise social management tool vendors should design their tools so that internal staff can easily choose what content to include or not include from their existing social network of choice.<br />
<br />
Back on topic, what do you think? Will (or should) companies look to integrate links to employee social media profiles? Obviously it would need to be voluntary, but the upcoming dominance of digital natives will take care of that. What are the benefits? The drawbacks?Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com3tag:blogger.com,1999:blog-2700395448511469789.post-29056354837817811962013-01-11T12:31:00.002-06:002013-07-08T17:42:54.695-05:00Rethinking the Role of Incidents in Service Management<div dir="ltr">
</div>
<div dir="ltr">
<div dir="ltr">
I once had my accounts at a bank whose customer service was very good at resolving errors. However, I ended up needing to call them almost every single month to get an error resolved. I don't bank there anymore. Imagine this bank using the slogan, "We fix account errors faster than you may expect!" Do you want to invest in that bank? Then why does ITSM's primary message often sound similar?</div>
<div dir="ltr">
</div>
<ul>
<li>We outperform Incident SLAs 90% of the time</li>
<li>We've reduced the negative impact of changes by 60%</li>
</ul>
Do you realize that what we're saying in business terms is, essentially, "We fail less frequently"? Is that the message you want to send?<br />
<br />
I'm not saying these measures are bad, or that we shouldn't tell our business partners about them. It's just that focusing our metrics around these types of measures implies that the reason we get a paycheck is that we can fix the problems created by the very systems we deployed. In other words: we're good at fixing our failures. I know the reality is more complicated than that, but is your business partner <i>wrong</i> to arrive at that conclusion?<br />
<br />
<div>
Service value is based on business value. Period. Business value means increasing revenue, decreasing costs, increasing goodwill, or improving outcomes around a corporate strategic plan. Even at the non-profit where I work, value is based on one or more of these four factors. You might replace "goodwill" with "mission impact", but it is effectively the same thing.</div>
<div>
<br /></div>
<div>
Why then do we put so much emphasis on self-reported issues as a proxy for value? Self-reported issues were the easiest way to collect data regarding the value of our services. That doesn't make it a better way, or even a good way; just an easy way. Let me ask it another way: are self-reported incidents a good way to measure the effectiveness of service management? Of course not, and for three very good reasons:</div>
<div>
<ol>
<li>Self-reported incidents only tell us about things that hurt service consumers to the point that they have no choice but to contact us. Most people don't care enough to reach out, until the pain is so great that it cannot be carried any longer. What about all the non-reported defects?</li>
<li>Service management is about ensuring the service consumer is getting what they need from the service. Incidents only tell us about broken stuff, which barely scratches the surface of ensuring the service consumer is getting what they need.</li>
<li>Self-reported incidents tell us almost nothing about service value. How do they tell us about increased revenue, decreased costs, or increased goodwill? They might tell us a little bit about increasing or decreasing costs, but even that is a stretch. Even if tracking self-reported defects could tell us something useful about cost control, it's still effectively telling the business that IT is getting better at fixing our own failures. Again, not a bad thing, AND nothing to crow about, either.</li>
</ol>
</div>
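To put point 1 in concrete terms, the gap between what monitoring observes and what users actually report can be estimated with simple arithmetic. A minimal sketch, with numbers invented purely for illustration:

```python
# Illustration of point 1: comparing failures seen by monitoring against
# the incidents users actually reported. All numbers are invented.

monitored_failures = 480   # failures observed by application monitoring
reported_incidents = 60    # tickets users actually opened

unreported = monitored_failures - reported_incidents
report_rate = reported_incidents / monitored_failures

print(f"Users reported {report_rate:.1%} of observed failures;")
print(f"{unreported} failures caused pain we never heard about.")
```

If anything like that ratio holds in your shop, self-reported incidents are sampling only the tip of the pain.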
<div>
I propose that incidents should not be limited to service interruptions; and even if they are, they should cover ALL service interruptions, not just the self-reported ones. I want to take it further, however. Service Management needs to be more closely tied to Customer Experience Management. The book "<a href="http://www.amazon.com/Outside-Putting-Customers-Center-Business/dp/0547913982" target="_blank">Outside In: The Power of Putting Customers at the Center of Your Business</a>", written by analysts from Forrester, provides an excellent overview and model for implementing customer experience management. Forrester provides a <a href="http://blogs.forrester.com/kerry_bodine/12-05-22-outside_in_the_power_of_putting_customers_at_the_center_of_your_business" target="_blank">mini overview on their blog</a>. One key practice from the book is mapping the intended customer experience for your product or service.<br />
<br />
My idea is to take the intended experience for a service and compare it to the actual customer/user experience. An "incident" then becomes ANY variance between the expected and actual experience. These variances could be good or bad. I want to know about errors, of course, but I also want to know about the ways users do things differently from, and possibly better than, the way we expected the service to be used. I'll know a lot more about the value of the service this way.<br />
<br />
This requires a bit of imagining the future, but it can be a near-term future. Developers can focus efforts on gathering and reporting customer experience with their apps. Would there be a market for applications that could allow configuration of customer experience standards? I know I would be interested in purchasing an app that, in addition to its user functionality, included this sort of capability.<br />
<br />
In the meantime, we can use many of the tools we already have for measuring customer experience. First, however, we need to document what we expect that experience to be and what it should provide.</div>
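The "incident as variance" idea above can be sketched in a few lines. The metric names, values, and the 10% tolerance below are hypothetical; real customer experience tooling would be far richer:

```python
# Sketch: an "incident" as ANY variance between the intended and actual
# customer experience, good or bad. Metric names, values, and the 10%
# tolerance are hypothetical; this is an illustration, not a real API.

def find_experience_variances(intended, actual, tolerance=0.10):
    """Return every metric whose observed value deviates from the
    intended value by more than `tolerance` (as a fraction).
    Whether "above intent" is good news depends on the metric --
    both directions are worth investigating."""
    variances = []
    for metric, expected in intended.items():
        observed = actual.get(metric)
        if observed is None:
            variances.append({"metric": metric, "kind": "not measured"})
            continue
        delta = (observed - expected) / expected
        if abs(delta) > tolerance:
            variances.append({
                "metric": metric,
                "expected": expected,
                "observed": observed,
                "kind": "above intent" if delta > 0 else "below intent",
            })
    return variances

intended = {"task_completion_rate": 0.95, "avg_checkout_seconds": 60}
actual = {"task_completion_rate": 0.80, "avg_checkout_seconds": 75}
for v in find_experience_variances(intended, actual):
    print(v["metric"], "-", v["kind"])
```

Note that a slower-than-intended checkout shows up as "above intent" numerically; interpreting whether a given direction is good or bad remains a per-metric judgment.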
</div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com3tag:blogger.com,1999:blog-2700395448511469789.post-13682126186723719412012-12-02T22:45:00.001-06:002012-12-02T22:51:27.289-06:00Service Management Goals and Service Level Agreements<br />
<span style="font-family: inherit; text-align: -webkit-auto;">(Much of this article is also posted as part of a <a href="http://www.servicemanagers.org/2012/11/sla-are-service-management-goals.html" target="_blank">co-authored piece on ServiceManagers.org</a>)</span><br />
<span style="font-family: inherit; text-align: -webkit-auto;"><br /></span>
<span style="font-family: inherit; text-align: -webkit-auto;">What is the relationship between Service Level Agreements (SLAs) and Service Management goals? A common misconception is that your SLAs are your goals, or at least a part of them.</span><br />
<span style="font-family: inherit;"><br /></span>
<span style="font-family: inherit;">I'd take the goals discussion a level higher and focus on missional goals first. Using a soccer analogy (sorry, I'm an American ... "football"), SLAs are not the goal. Scoring more than the opponent is the goal. SLAs are more like defining the positions for the players, and communicating what the purpose is for each position. Then you have tactics, the designed plays and maybe a few pre-planned what-if scenarios. These are your defined processes, which should, AT A MINIMUM, meet the SLAs. As the coach, you have selected who should be on the field at any specific time, usually based on how well they are suited for the current/adjusted objective. It could be a push for scoring when you're behind late in the game, or it could be defending a lead while taking as much time as possible off the clock.</span><br />
<br />
The point is that SLAs are not the primary means to meet the overall goals. They are certainly one of the tools you have for communicating expectations, but we also understand that the real goal is to make money, to delight the customer, to maximize their experience, etc.<br />
<br />
To summarize this perspective on SLAs:<br />
<br />
<ol>
<li>SLAs are one tool, out of many in your SM toolkit.</li>
<li>Poorly done SLAs (badly written, poorly negotiated, poorly communicated, etc.) will often inhibit excellent service management far more than any benefits you get from simply having a defined set of expectations.</li>
<li>Even well-written and communicated SLAs often degrade the customer experience. Too much focus on SLAs in order to meet goals can inhibit talented service delivery people who can read a situation and determine a better means to achieve the customer experience goals. I've seen too many scenarios where talented, creative people were ignored or pushed aside because SLA adherence was treated as the exclusive way to meet goals. I know it sounds far-fetched, but it does happen.</li>
</ol>
<br />
Customer experience strategy can be a more effective way to define expected outcomes, while taking individual talents into consideration. I highly recommend the book "<a href="http://amzn.com/0547913982" target="_blank">Outside In: The Power of Putting Customers at the Center of Your Business</a>" as a great approach to this perspective.<br />
<br />
It's not "either-or", but rather "yes-and".Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com1tag:blogger.com,1999:blog-2700395448511469789.post-41364703741565117622012-11-11T23:06:00.000-06:002012-11-11T23:07:19.809-06:00Is ITIL the Enemy?<span style="font-family: Tahoma; text-align: -webkit-auto;">Is ITIL the enemy? At the end of ServiceNow's Knowledge12 conference in May, a presenter noted that the term "ITIL" had barely been heard the whole week. A decent amount of cheering followed. My take was that ITIL concepts had become such an ingrained part of IT, that talking about those concepts as "ITIL" was unnecessary. Now I'm not so sure.</span><br />
<div style="font-family: Tahoma; text-align: -webkit-auto;">
<br /></div>
<div style="font-family: Tahoma; text-align: -webkit-auto;">
I was messaging with a former colleague the other day, when she commented on how poorly ITIL is perceived at her current company. "ITIL ... (was) executed in such a half-witted fashion that the amount of overhead and time wasted significantly increased." Like most ITIL failures, it was done as an "ITIL implementation" project. It got me thinking about my own perspective, and this was my response:</div>
<blockquote class="tr_bq">
<span style="background-color: white; color: #333333; line-height: 17px;">Believe it or not, I take a pretty cynical approach to ITIL; not so much in the framework itself, but in the way most companies try to implement it. First and foremost, any attempt to "implement ITIL" is doomed to failure. Most companies try to implement it like it's an ISO standard, like the closer you match the guidelines, the better. I laugh at CMMI assessments of ITIL maturity. The base assumption is that more adherence to the framework = more "mature". That is not the point of IT service management at all. You could have the most clear processes in the world, and still not be meeting or exceeding corporate expectations. I use ITIL as a means to help bridge gaps between business/customer expectations of IT service, and their perception of the actual service provided. If they expect Apple Store service and get what they see as Sam's Club outcomes, there's a huge gap that ITIL (and CoBIT, ISO 20000, etc.) can assist in bridging.</span></blockquote>
<div style="font-family: Tahoma;">
<span style="color: #333333;"><span style="line-height: 17px;">My experience is that ITIL cannot be a goal, and should not be used as a measure of service management success. However, elements of ITIL, such as Service Strategy, can be an incredibly useful tool after you figure out where the customer-provider disconnects exist. </span></span><span style="color: #333333; line-height: 17px;">What do you think? Is ITIL getting in the way, or does it remain a useful tool for helping address service problems?</span></div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com4tag:blogger.com,1999:blog-2700395448511469789.post-82047345276276084882012-11-05T22:46:00.000-06:002013-07-08T18:26:37.492-05:00Lies, Damned Lies, and Statistics: 7 Ways to Improve Reception of Your Data<span style="background-color: white; font-family: Arial, Helvetica, sans-serif; line-height: 24.75px;"><a href="http://www.blogger.com/blogger.g?blogID=2700395448511469789#new8">*</a> Now with a bonus 8th suggestion! Thanks to <a href="http://plus.google.com/u/0/115870293431423290969/posts" target="_blank">Stephen Alexander</a>.</span><br />
<span style="background-color: white; font-family: Arial, Helvetica, sans-serif; line-height: 24.75px;"><br /></span>
<span style="font-family: Arial, Helvetica, sans-serif;"><span style="background-color: white; line-height: 24.75px;">"There are three kinds of lies: lies, damned lies and statistics</span>." I hate this quote almost as much as "first thing, kill all the lawyers", and for essentially the same reason. They are both applied wildly out of context, to the point that the meanings assigned to them are almost the exact opposite of what the original quotes intended. Shakespeare was not talking about a hatred for lawyers. His character, the comic-villain Dick The Butcher, was talking about how a world without lawyers would be a great way to start the utopia dreamed of by murderers and thieves. Shakespeare was not espousing the virtues of lawyers, as some have claimed; but he certainly wasn't saying that killing all the lawyers would be good for humanity either. There's nuance in the comedic moment.</span><br />
<span style="font-family: Arial, Helvetica, sans-serif;"><br /></span>
<span style="font-family: Arial, Helvetica, sans-serif;">The same is true for the "lies" quote. It is from a Mark Twain article (later included with a series of articles to <a href="http://www.amazon.com/Chapters-Autobiography-1906-1907-Oxford-Twain/dp/0195101561" target="_blank">form a book</a>), in which he was supposedly quoting former British Prime Minister Benjamin Disraeli. There is much debate over who really originated the saying, but that's not my concern here. What bothers me is the haphazard application of the saying, as if it is sufficient proof to indicate the unworthiness of data-based decision making. Of course, all decisions are based on data. Even a hunch is, essentially, a data point.</span><br />
<span style="font-family: Arial, Helvetica, sans-serif;"><br /></span>
<span style="font-family: Arial, Helvetica, sans-serif;">Even if it bothers me, we live in a world where statistical data is both revered and disdained. If the data supports your idea, it's great! If it doesn't, it is suspicious. In the data-obsessed world of IT Service Management, of which I am one of the chief obsessors, we need to keep some perspective when it comes to statistical information. This became relevant to me one day as decisions were being made around me that were based not on statistical data, but on a series of anecdotes instead. It was assumed that, because the anecdotes appeared to contain similar themes, we could/should make high-impact decisions based on them. The statistical data was presumed to be irrelevant due to the fact that the anecdotes indicated that the data was missing critical information.</span><br />
<span style="font-family: Arial, Helvetica, sans-serif;"><br /></span>
<span style="font-family: Arial, Helvetica, sans-serif;">It hit me that I was not entirely right, and the others were not entirely wrong. There is significant nuance involved, where the two types of data (statistical and anecdotal) are both needed. That led me to consider some suggestions regarding how we position "data" in the context of ITSM decision making.</span><br />
<ol>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Decisions are based on data.</i></b> All conclusions are inherently based on data of some sort, some qualitative and some quantitative. In the absence of trusted, useful statistical data, decisions will be based on anecdotes, whether or not they represent truth.</span></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Statistical data must move from an untrustworthy state to a trustworthy state</i></b>. It cannot and will not be used for decision making until the decision-maker trusts the data. We cannot assume that because we have numbers that the intended audience for the numbers will believe them.</span></li>
<li><b><i><span style="font-family: Arial, Helvetica, sans-serif;">Don't get bent out of shape when your numbers are not immediately received as Truth.</span></i></b></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Presenting data consistently is far more important than the precision of the data</i></b>. Be persistent and consistent in how you present and interpret data. This cannot be stressed enough. There is no such thing as the perfect data, so stop looking for it. I used to constantly change the data I presented, hoping that the "next version" would catch on, that everyone else would suddenly get it. The opposite is true. When the data presented is changing all the time, you come across as someone with something to hide. Your credibility is shot.</span></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Find out how your data is being received</i></b>. "Build it and they will come" does not apply to metrics. Ask intended recipients what they think of the data as presented. Is there a way to make it more clear? Are there concerns about data accuracy?</span></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Your data presentation must be actionable</i></b>. Take action on your data, and teach others how to take action based on the data. If the information is not actionable, it will be ignored and mistrusted.</span></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;"><b><i>Anecdotes provide great information</i></b>. Complaints are amazing opportunities to focus your data queries. For example, I found that not all requests were coming through the Help Desk, so the data regarding quality of Help Desk service was not complete. Before I could make decisions based on the data as captured, I had to understand why requests were not coming through the Help Desk. Of course, you also need to find out whether that is a good or bad thing, but that's <a href="http://www.hazyitsm.com/2011/12/do-users-prefer-to-contact-help-desk.html" target="_blank">another discussion</a>.</span></li>
<li><span style="font-family: Arial, Helvetica, sans-serif;">*<b><i> Your data must be relevant to the recipient and the context. </i></b></span>Often overlooked, but essential to the credibility of your data. Before publishing or presenting your data, make sure you can answer this important question: Why will my intended audience care or find this relevant? If you don't have a clear and concise answer (no more than one brief sentence), your data is probably not ready for consumption.</li>
</ol>
<div>
<span style="font-family: Arial, Helvetica, sans-serif;">What would you add to the list?</span></div>
Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com6tag:blogger.com,1999:blog-2700395448511469789.post-26959400016962375892012-06-07T17:40:00.000-05:002012-07-17T22:21:15.372-05:00Social is changing the game, in more ways than you think <br/><br/>Allow me to move off ITSM a bit. Trust me, I'll get back to it.<br/><br/>I am not a social media expert, and I'm skeptical of many self-described experts. I did, however, recently hit a Klout score of 51; so I must be doing something right, at least regarding social influence and reputation (see "The Reputation Economy is Coming - Are You Prepared?"), if you believe in Klout's interesting, if flawed, influence algorithm.<br/><br/>What is most interesting about Klout is not the actual score, but the IDEA that the value of your sharing is based on the usefulness of what is shared, and less on speed and frequency. Feel free to disagree with how Klout calculates that value. It's much harder to disagree that the value of social sharing is far more based on the perceived quality of the shared content, as opposed to the speed with which you can register a comment, opinion, or decision.<br/><br/>A recent <a href="http://bit.ly/L1QN1V">Forbes article on the coming reputation economy </a>makes the following point:<br/><blockquote>The economy is moving in one direction and one direction only. Take time to invest in your online reputation and you will be more confident, more connected, and more desirable to work with.</blockquote><br/>But how do you invest in your online reputation? Think about the people and organizations you follow online. I'm not just talking about Twitter follows, but that's part of it. Whose posts do you pay the most attention to on Facebook, Google+, Pinterest, LinkedIn, and yes, Twitter? We follow people who provide the most value. It could be entertainment value, professional value, home improvement value, etc. 
We tend to value quantity of posts, artfulness of the presentation, self-promotion, and quick judgements much lower in the social media context, compared to in-person interactions.<br/><br/>Your character/reputation/influence is becoming strengthened by the value of the content you create and shepherd. The 20th century "conventional wisdom" rewarded extroverts, to the point where introversion had essentially become a handicap to overcome. The Old Boy network/corporate boardroom ideal that grew from the Harvard Business School's over-reliance on extroversion as an essential trait for success is starting to die. Most people just haven't noticed it yet.<br/><br/>Think about it. How many folks reading this post are more introverted than extroverted? If you've got a high Klout score, how much of that is based on dominating the social world with quantity over quality? Unless you are a celebrity, no one cares what you have to say if you don't have useful content. No one caring = lower Klout. Limited, thoughtful, useful sharing = higher Klout.<br/><br/>Sounds like an introvert to me.<br/><a name='more'></a><br/><br/><strong>The Culture of Personality</strong><br/><br/>We have deified extroversion to the point of allowing extroverts to lead in arenas where they actually detract from productivity and quality, primarily because we have agreed that extroverts make better leaders. This conventional wisdom continues to assert that introverts can only be effective leaders when they fake extroversion well enough to be convincing. This hasn't always been the natural order of things.<br/><br/>Susan Cain, in her book <em><a href="http://amzn.to/K2U8Aj">Quiet: The Power of Introverts in a World That Can't Stop Talking</a></em>, discusses the transition near the turn of the 20th century from a "Culture of Character" to the current "Culture of Personality". The Culture of Character was shaped by a shared sense of the rules of interaction. People knew when it was right to speak. 
Speaking out of turn made you look less enlightened. You spoke only when you had something of value to share. Slow, thoughtful decision making was a sign of integrity and intelligence. Fast, in-the-moment decision making was a sign of weakness and poor character. Most of the people we would call extroverts today were viewed with suspicion until they proved they were of good character.<br/><br/>Something changed as we started the 20th century. Societies were becoming more familiar with the reality that, at least in the United States, we were moving from an economy based on family-run agriculture to corporate-owned businesses. More people were working in offices and factories than in the fields. Dale Carnegie helped bring the values of aggressive, confident communication styles into the mainstream. Over the next hundred years, fast, confident decision-making became more and more acceptable, not just in business, but in religion, social circles, and almost every arena of our lives. Teachers now reward participation over accuracy, and, I would suggest, thoughtfulness. Preschools regularly counsel parents about their "withdrawn" children quickly falling behind in social circles.<br/><br/>In business, group effort is revered, even when it comes to finding new ideas. According to Cain, study after study over the past 40 years shows with great clarity that group-based brainstorming reduces BOTH the quantity and the quality of ideas generated, compared to people brainstorming individually. With all the evidence to the contrary, we still force most people to do creative work, including assessing pros and cons of various decisions, in group settings. When has there ever been an innovation created by committee? Yet, in group settings, how do we determine the best way to solve a problem? The most confident and outspoken person convinces everyone else to follow their path.<br/><br/>And the problem of group decision making is not just around ideas for creativity and innovation. 
Cain references a study that looked at accuracy of decisions in group vs. individual contexts. Groups of people were asked to answer several relatively simple questions around current events and popular culture. When they answered the questions on their own, most people got 100% of the answers correct. When they formed into small groups to address the same questions, "ringers" were added to the groups. Some of the ringers were coached to be charming, confident, and decisive, while purposefully advocating for incorrect answers. They found that every such group got more answers wrong than almost any individual did on their own. A disturbing number of individuals changed their correct answers to wrong answers after the group decision-making exercises.<br/><br/>The one context where group brainstorming and decision-making did not suffer from what Cain calls "The New Groupthink"? Online group interactions.<br/><br/>That's a good teaser for the next post. I'll pick up and expand more on how social networking is not only changing how we share and make decisions, but also who most influences those decisions.<br/><br/>The introvert revolution (which will, of course, be slow and steady) is nigh!Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0tag:blogger.com,1999:blog-2700395448511469789.post-84334961451245207782012-05-02T11:57:00.000-05:002012-07-22T19:14:01.778-05:00ITSM Maturity Assessments: A Value-based ApproachI am not a big fan of ITSM maturity assessments. Don't get me wrong, I think we should DO maturity assessments; it's just that they are so frequently done poorly, with useless, maybe even dangerous, starting assumptions:<br />
<ol><br />
<li>Improved process maturity is an end goal in and of itself</li>
<br />
<li>Interdependence between individual processes is either negligible or non-existent, in the context of measuring maturity</li>
</ol>
<br />
A typical maturity assessment starts with the assumption, implicit or explicit, that improved process maturity is an end goal in and of itself. <a href="http://www.tarrani.com/pix/ITSMMaturityModel_v3.PDF" target="_blank">Here</a> is a great example of what I mean, based on the <a href="http://www.sei.cmu.edu/cmmi/solutions/svc/" target="_blank">CMMI model</a> of evaluating maturity. The definition of Level 1 maturity for Incident Management indicates there is no defined owner of the process. Part of reaching Level 2 maturity is identifying a single process owner. I agree that having a single owner of the process is a good thing, but what goal does it help achieve? Moving to Maturity Level 2? Congratulations. My concern is that many maturity assessments end there. You're at Level 2 (or 4, it doesn't matter). Now what?<br />
<br />
The answer to this question depends entirely on why you were doing an assessment in the first place. Were you just wanting a benchmark to compare against other companies, or against some later point to measure improvement? Fine. Let's say you came out slightly ahead of your peer companies in most process maturity levels. Does that make you a better service organization? Not necessarily. It is certainly one characteristic you could use towards making that determination, but there is no guarantee that you are better just because your maturity levels are higher.<br />
<br />
Think of it like a basketball team. Your team, across all five positions, is slightly taller than players on the other teams. It very well could help you beat your competitors. Or maybe your teamwork assessments are higher than your competitors'. Your team passes very well, shoots accurately, and plays great team defense. However, another team has two undisciplined players with raw talent that is so great, any normal attempt to stop them is useless. They make up for the lack of discipline with pure athletic talent and effort. On defense, wild leaps at the ball lead to frequent steals. If your players tried that, they would look like buffoons, and end up in lopsided defeats. This team, however, wins 2/3 of the time. They might lose big every once in a while, when one or both of their stars are off their game; but their overall results speak for themselves.<br />
<br />
IT departments can function like that. We have to accept that disciplined process adherence doesn't always work well. I've seen IT shops where a few insanely talented engineers or developers make amazing things happen, with no concern over following procedure. It works because it works. The business outcomes are fast and strong enough that they can withstand the occasional failure. Besides, the super-techs are so good that they can fix their failures and still look like heroes in the process.<br />
<br />
In reality, no one <em>wants</em> to run their shop that way. We know that well-defined processes that also allow for calculated risk taking ultimately lead to better results. Ultimately we are talking about <em>Value</em>. What is the value of knowing your process maturity levels? Very little, if it doesn't lead to creating business value. What is the business value of having a single Incident Management process owner? In and of itself, absolutely nothing.<br />
<br />
And that's where most maturity assessments fail. They tell you how well you follow ITIL, COBIT, ISO/IEC 20000, and other standards and guidelines, not how much business value you create.<br />
<br />
As part of my presentation for the upcoming ServiceNow <a href="http://knowledge.service-now.com/k12/" target="_blank">Knowledge12 conference</a>, I will present my company's example of how we looked at value: The old technology focused way, and a newer business value focused way.<br />
<br />
Several years ago, we did a self-assessment for a new CEO, presenting the results as the &lt;insert booming voice here&gt; STRATEGIC TECHNOLOGY READINESS.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-4GOD36eFu74/UAyVnxAYeZI/AAAAAAAAAE8/WSqB-n6KJZI/s1600/STR_Graph_1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://3.bp.blogspot.com/-4GOD36eFu74/UAyVnxAYeZI/AAAAAAAAAE8/WSqB-n6KJZI/s320/STR_Graph_1.jpg" width="288" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<span style="background-color: white;"><br /></span><br />
<span style="background-color: white;">We assigned color codes for each block, Red, Yellow, or Green, indicating how "ready" each component was for the future. There was a time we could get away with this sort of presentation to the business. Remember the days when we could tell the CEO that, unless the flux capacitor was upgraded this year, the future of the business was in trouble, and they bought it? Believe me, that's how they heard the message. We were probably right, too; but that's beside the point.</span><br />
<br />
We presented the STRATEGIC TECHNOLOGY READINESS to the CEO so that he could better understand our strengths and challenges. You read that right: It was about IT's strengths and challenges, as we saw them. Here's what we saw:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="http://1.bp.blogspot.com/-vv4mmNhDp58/UAyV088PP3I/AAAAAAAAAFM/WikoGr6ntws/s1600/STR_Graph_2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://1.bp.blogspot.com/-vv4mmNhDp58/UAyV088PP3I/AAAAAAAAAFM/WikoGr6ntws/s320/STR_Graph_2.jpg" width="288" /></a></div>
<br />
Wow, that looks pretty good. IT has awesome people, and we excel at the things the business used to expect from IT: Reliability, Security, and Architecture. To be fair, Architecture was probably more for show, but it sounds cool. The areas of struggle were the things the business was responsible for: Priorities, the Applications, and Business Processes.<br />
<br />
Unfortunately for us, the IT landscape in the U.S. changed soon after this time. We were an awesome IT Department, yet started seeing signs of trouble:<br />
<ul><br />
<li>An internal survey included IT as an area needing improvement</li>
<br />
<li>Frustration from senior leadership regarding technology modernization</li>
<br />
<li>Slow implementation of requested new technology</li>
<br />
<li>"IT is expensive"</li>
</ul>
<span style="background-color: white;">It hit me that a new paradigm was needed. The way we measured maturity needed to change. ITSM (instead of ITIL) was becoming a term used by industry visionaries. Along with that came discussions around the </span><em style="background-color: white;">Value</em><span style="background-color: white;"> of IT. The writing was on the wall. Business leaders across the country were starting to question what they were actually getting out of the seeming money pit of IT. Maturity needed to be measured and demonstrated from the perspective of value. Immature IT led to negative business outcomes. We needed to be net outcome neutral or (hopefully) net outcome positive. The new way to show maturity looked a lot more at business outcomes.</span><br />
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://4.bp.blogspot.com/-_Gk0PQStW-g/UAyV2kobtWI/AAAAAAAAAFk/4RdqKzKz8es/s1600/VOT_Graph_1.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://4.bp.blogspot.com/-_Gk0PQStW-g/UAyV2kobtWI/AAAAAAAAAFk/4RdqKzKz8es/s320/VOT_Graph_1.jpg" width="277" /></a></div>
</div>
<br />
This is something much closer to what the business was expecting from IT. How much is IT support impacting overall productivity? To what extent were we providing and/or enabling innovation? How quickly can we change gears towards new technologies?<br />
<br />
The new assessment looked more like this:<br />
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="http://3.bp.blogspot.com/-iKf2JnFS41I/UAyV3IBXiWI/AAAAAAAAAFs/zEQqEh1de8U/s1600/VOT_Graph_2.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="http://3.bp.blogspot.com/-iKf2JnFS41I/UAyV3IBXiWI/AAAAAAAAAFs/zEQqEh1de8U/s320/VOT_Graph_2.jpg" width="277" /></a></div>
</div>
<br />
That's more realistic. We were excellent technology implementers, and the systems continued to run very well. We were OK at balancing security with sufficient accessibility. No one, including IT, was taking responsibility for how well the technology matched current business processes. The old perspective was that we just implement; it's up to the business to use the technology well or not. You get the idea.<br />
<br />
I'm not suggesting this is a perfect model for measuring the value of IT maturity. It's still very rough, and it doesn't do a great job of showing the interdependence between the elements. You are welcome to use it, however. I recommend that you first determine whether these elements of value are appropriate for your organization. This set works well for me; you may find the need to drop some and add others.<br />
<br />
The point is that a maturity assessment is only helpful when it helps define the value of the relevant processes to the business. You could start assigning costs to these value elements, and you'd have a great start toward a common understanding of how IT costs lead to business value (or don't). For example, the visual helped me see that we were spending a lot of money and time on security without really identifying business benefits. Risk is a far better concept, and one business executives understand far better. Maybe our spend on security buys us more risk avoidance than we need.<br />
<br />
As I said, this is only a start. I'd love to hear how others bridge the gap between process maturity and business value. Please leave your ideas in the comments!Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com3tag:blogger.com,1999:blog-2700395448511469789.post-28531012210331938382012-03-11T17:17:00.000-05:002012-07-17T22:21:15.373-05:00ITIL Secrets Revealed! Is it an Incident? A Request? A Problem? The
definitive answer!There is no value in arguing the difference between Incidents and Problems. That doesn't mean you shouldn't have a clear distinction; just don't waste time debating where it lies. I mean this quite literally: within your organization, spend no more than 10 minutes deciding where you draw the line between Incidents and Problems. The same goes for the difference between Incidents and Requests.<br/><br/>The outpouring of protracted discussions, especially on LinkedIn, about what makes for the "right" distinction between these ticket types has gotten out of hand. These discussions are exactly what gives ITIL such a bad name. They also go a long way toward revealing <a href="http://www.pcpro.co.uk/features/371254/why-everyone-hates-the-it-department" target="_blank">why everyone hates the IT department</a>. Honestly, who decided that ITIL dictates how we MUST define these tickets? Yes, it provides <em>guidance</em>, but there is nothing magical in that guidance. A much better debate is whether it is important to distinguish between the two at all. Even then, what value is created by debating that at length?<br/><br/>Ultimately it is far more important that you decide what the distinction between Incidents, Problems, and Requests should be, and then communicate that decision consistently and frequently. When defining the difference between Incidents and Requests, for example, I would look at the answers to three questions, in declining order of importance:<br/><ol><li>What is best for your customer?</li><li>What is best for your business? This is really about the metrics you need in order to know whether you are doing what is best for your customer.</li><li>What will create the least amount of confusion for your IT staff?</li></ol>What are your thoughts and experiences? How do you like to determine where an Incident ends and a Request begins?Anonymoushttp://www.blogger.com/profile/09929890999944016287noreply@blogger.com0