Have you ever stopped to think about what IT service management (ITSM) maturity is and how you assess it? A quick search of the phrase brings back a number of different opportunities to apply ITSM maturity assessment models to your organization, but what are you measuring and what does it all mean?
Any organization that has implemented ITSM will want to know if it has achieved a return on its investment. Did all those processes, tools, training sessions, and new structures deliver any improvement? Looking around, there are many vendors offering maturity assessments that promise to show how 'good' an organization's ITSM actually is. But are they relevant? How much do we need to worry about ITSM maturity?
This blog looks at ITSM maturity, and the assessment of it, from a number of angles to try to make sense of it all.
Consider this simple example – one company “operates” a dozen ITSM processes, all of which are deemed to be pretty immature when local management conducts an ITSM maturity assessment. Another company operates just six ITSM processes and is considered to be above average maturity for the majority of them. Which of these companies is the more mature in its use of ITSM?
For me it’s one of the most confusing things about ITSM maturity. If I were to ask an ITSM tool vendor – and I have – which company is more mature (in terms of ITSM), I imagine they would say the company that operates the most ITSM processes. Probably because that company would need an ITSM tool designed for mature organizations, i.e. one that supports more processes beyond service desk and incident management. On the other hand, I would argue that the company operating fewer ITSM processes well could be the more mature company. It has not seen the need to increase the number of ITSM processes it operates; instead, it has recognized the need, and built the capabilities, to run a smaller set of processes well.
I’d be willing to bet that many would disagree with me though, especially when ITSM maturity assessments look across ITIL’s 26 processes and four functions to determine how good companies are.
The simple answer is that we test our operations and processes against an industry ITSM maturity model. But which one, given that there are many?
Many maturity models, ITSM-related or otherwise, build on the work of the CMMI Institute, a subsidiary of ISACA, and its Capability Maturity Model Integration (CMMI) program. This takes a stepped view of maturity, with five levels:

1. Initial
2. Managed
3. Defined
4. Quantitatively Managed
5. Optimizing
For example, AXELOS – the custodians of the ITIL ITSM best practice framework – offers two different CMMI-flavored ITIL maturity models: a high-level self-assessment service and a full self-assessment service. Both score each process through a series of pertinent questions.
The high-level model contains a set of 30 questionnaires, i.e. one questionnaire for each of the 26 ITIL processes and the four ITIL functions.
The maturity of each process or function is calculated from the answers given to the questions within each questionnaire, placing the process or function at one of the CMMI maturity levels.
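To make the mechanics concrete, here is a minimal, hypothetical sketch of questionnaire-based scoring. The averaging rule and the level mapping are my own assumptions for illustration – AXELOS does not publish its scoring algorithm in this form.

```python
# Hypothetical sketch: map a questionnaire's answer scores to a CMMI
# maturity level. The averaging rule is an assumption, not AXELOS's
# actual method; real assessments may weight or gate questions.

CMMI_LEVELS = {1: "Initial", 2: "Managed", 3: "Defined",
               4: "Quantitatively Managed", 5: "Optimizing"}

def maturity_level(answers):
    """Reduce a list of answer scores (each 1-5) to a CMMI level.

    Here we simply truncate the mean score to a whole level and
    clamp it to the 1-5 range.
    """
    if not answers:
        raise ValueError("questionnaire has no answers")
    mean = sum(answers) / len(answers)
    level = max(1, min(5, int(mean)))
    return level, CMMI_LEVELS[level]

# e.g. a change management questionnaire scoring mostly 3s
level, name = maturity_level([3, 3, 4, 2, 3])
print(level, name)  # 3 Defined
```

The point of the sketch is that a single number per process falls out of many individual answers – which is exactly why two organizations can "score a 3" for very different underlying reasons.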
Gartner – the global research firm – offers an alternative, process-based view of ITSM, or ITSSM (IT service support management) as it refers to it, with its own range of maturity levels.
There are, of course, many other ITSM or ITIL maturity models out there to choose from, all of which allow organizations to measure how well, or how poorly, they are doing. But in many ways, whichever method you use to assess ITSM maturity, it’s not so much how you score your company the first time that matters, but how you score it the second, third, and fourth time you assess maturity.
Clear as mud? I’ll try to be a little clearer…
One needs to be careful about ITSM maturity assessments and how they are used. For instance, if all an assessment is used for is to show how well an IT team is doing, then it’s a missed opportunity rather than a successful investment of time and money. In these one-off assessment scenarios, who’s to say that the results aren’t a foregone conclusion, with scores dictated by the need at hand, i.e. inflated when “proving” brilliance and, conversely, understated when investment is needed to help drive expansion and improvement?
And even if maturity assessments are run every x years, there might be little or no continuity between them, and little improvement activity happening in the interim.
ITSM maturity assessment should be so much more than just taking a snapshot in time, or a set of unrelated snapshots taken over time, that are used merely to prove a point. Instead it needs to be an input to some larger initiative – ideally some form of improvement activity that seeks to better the performance (across a number of parameters), services, governance, and customer experience of the assessment-undertaking organization.
If it helps, try not to think of ITSM maturity assessment as a singular thing, i.e. the answers to a number of questions. Instead, think of it as a voyage of improvement with a number of checkpoints along the way.
I’ll leave you with some words – okay, tweets – of wisdom from a few smart people:
“Maturity scales that defer process improvement to level 5 = mis-applied Taylorism. One of the most harmful ‘best practices’ out there” ~ Charles Betz
“ITIL assessments end up being assessment of whether you are doing #ITIL, not whether you are doing what you should be doing” ~ The IT Skeptic
Hopefully these, and my thoughts before them, will make you stop and think about what you hope to achieve the next time your company is considering an ITSM maturity assessment.