One of the impediments to achieving Agility is the lack of an objective way of measuring L&D Agility. And we all know that what can’t be measured can’t be improved!

So, here you go – we have created an Agility Index to help L&D organisations (or, for that matter, any organisation that has to create and manage digital content) on their improvement journey. We have drawn parallels with a modern newsroom, where things no longer happen only in a “linear”, planned manner. There is plenty of breaking news and crowd-sourced news (one has to be careful of fake news!) being put on air, so things happen in a more “spatial” manner.

To measure L&D Agility, we need to look at two key factors:

  1. Responsiveness in Planned Situations (RPS)
  2. Responsiveness in Unplanned Situations (RUS)

Typically, there is a tradeoff between the above two factors.

An organisation that is highly responsive in planned situations (long cycle) finds it difficult to be responsive in ad hoc/unplanned situations, and vice versa. The ability to be good at both, without violating the cardinal laws/guiding principles of learning and communication, is critical to an organisation being Agile!

An organisation’s responsiveness can be classified as Low, Medium, or High depending on its capability across the Process, Technology and People aspects to respond in a consistent, repeatable and predictable manner. Based on these classifications, there are five levels on the Agility Index.

Since technology plays a huge role in driving Agility, especially in unplanned situations, reaching Level 5 may not be practical given where tech developments stand today! Level 4 should be a good state to be in. In our experience, most L&D organisations are at Level 2, some are at Level 1, and very few are at Level 3. We haven’t come across any that’s at Level 4! A Level 4 organisation mirrors a well-functioning modern newsroom!

How do we determine Low/Medium/High in an objective manner? A little bit of arithmetic always helps! Each of the key aspects (Process, Technology, People) is measured via five key questions (15 questions in total), and the overall score is divided into three bands:

  • Low: less than or equal to 5
  • Medium: greater than 5 but less than or equal to 10
  • High: greater than 10
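The banding arithmetic above can be sketched in a few lines. This is a minimal illustration, assuming each of the 15 questions is answered yes/no and a “yes” scores one point (the article does not specify the per-question scale, so that scoring is an assumption), with the band thresholds taken directly from the list above.

```python
def responsiveness_band(answers):
    """Classify responsiveness as Low/Medium/High from 15 yes/no answers.

    Assumes 5 questions per aspect (Process, Technology, People) and
    1 point per "yes" answer -- a hypothetical scoring scheme, since
    the article only specifies the band thresholds.
    """
    if len(answers) != 15:
        raise ValueError("expected 15 answers (5 per aspect)")
    score = sum(1 for a in answers if a)  # 1 point per "yes"
    if score <= 5:
        return "Low"       # less than or equal to 5
    elif score <= 10:
        return "Medium"    # greater than 5, up to 10
    else:
        return "High"      # greater than 10

# Example: 7 "yes" answers in total -> score 7 -> Medium
answers = [True] * 7 + [False] * 8
print(responsiveness_band(answers))  # Medium
```

The same thresholds would apply whether the questions are scored by self-assessment or by an external reviewer; only the answer list changes.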

In the interest of brevity, I am skipping further details. Do reach out if you’d like to know more. Meanwhile, I would love to hear your views and reactions to the Index per se. Does it make sense? Does it fill a void? Would you be open to using it?

And how does an organisation move levels? It’s a complex journey where we’ll have to work with the old horses – Process, Technology and People! Technology has a big role to play, but one has to be careful that Tech doesn’t end up driving the process – it should stay the other way round! Again, happy to discuss offline, unless you’d like to wait for my next article 🙂
