Quantified KM value case story 140 – construction time saved at AFCONS

This story describes how AFCONS Infrastructure saved 100 days on a major construction project.

Chenab Bridge, an AFCONS project
Image from Wikimedia Commons, by Bhisham Pratap Padha from Jammu, India – IMG_7770b, CC BY-SA 2.0

AFCONS Infrastructure is an infrastructure company that works on large public infrastructure projects such as ports, metros, expressways, bridges, and tunnels. AFCONS won the MAKE award in 2016 and 2017, and the MIKE awards at Global, Asia, and India levels in 2018 and 2019. The AFCONS KM model is a traditional practice-based one.

In a recent article, Rudolf D’Souza, the CKO at AFCONS, describes the value KM has brought to the organisation as follows:

“We do have big impact stories to share of how KM has benefited the company. The completion of India’s first twin underground metro tunnels under the Hooghly River 100 days before time is an example. Until a few years back, the typical life cycle of a project was five years. Today, characteristically the time frame is three years, and in some cases even 24 months. This time frame includes the lean period when work is reduced because of the weather”. 

View Original Source (nickmilton.com) Here.

3 ways to estimate the value of lessons learned

Many organisations attempt to assign value to lessons in a lessons management system, and there are three ways you can do this. 

A screen sub-panel from the lessons management hub
showing value assigned to lessons

Assigning value to lesson-learning has three main advantages;

  • It reassures the people using the system that there is value in lesson learning. A panel on the front page of your lessons management system, such as the one shown here, reassures people logging into the system that sharing and re-using lessons is a valuable thing to do.
  • Lessons can be high-graded according to value, with the most valuable lessons getting highest priority.
  • It reassures management that there is value in lesson learning, and makes them think twice before axing the lessons management team and closing down the system.

However there are also counter-arguments;

  • Assigning value to lessons can be subjective (see below)
  • Value is yet another piece of metadata that needs to be added when documenting a lesson

If you decide to go down the path of assigning value to lessons, you can estimate this value in three ways;

  1. You estimate a projected value, where you look at the impact the lesson had on the project where it was identified (measured through lost time, saved time, wasted cost, saved cost etc.), and then forecast that forward by estimating the frequency of recurrence. Imagine a new way of working was developed which saved a project $100k on a particular activity. Imagine that activity is repeated in other projects a total of 10 times a year. If you document that new way of working as a lesson, and/or embed it into project procedures, then the projected value of that lesson is $1 million per year ($100k times 10, assuming all future projects re-use the lesson). This is of course only an estimate – maybe the activity is repeated only 5 times, or 20 times; in some cases it might save $50k, in others $200k.
  2. You record an actual value, where the lesson is re-used (either directly, or through embedding into process) and you can then measure the improvement that results. This is more accurate than the projected value, but it can sometimes be difficult to isolate the impact of one lesson when many lessons are applied together. The reporting is often anecdotal – “We started our project by reviewing and adopting all the lessons from the past. The project was delivered in X time/cost, which is a saving of Y compared to the benchmark”. This approach was used by the Ford Best Practice Replication system, where each manufacturing-plant contact who received a lesson needed to report what had been done with this knowledge in their local plant and, if they applied the lesson, what value it had added.
  3. You record an aggregate value, by looking at the improvement in results over time. There are several examples on this blog of the aggregate improvement that comes through learning and re-use of knowledge, for example in Trinidad offshore platforms, oil wells in Oman, nuclear power stations in Korea, and jumping frogs in California.
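As a rough illustration, the projected-value estimate in option 1 can be expressed as a one-line calculation. The function name and figures below are hypothetical, not taken from any real lessons management system:

```python
def projected_annual_value(saving_per_use, uses_per_year, adoption_rate=1):
    """Projected annual value of a lesson: the saving observed on the
    originating project, times how often the activity recurs each year,
    scaled by the fraction of future projects expected to re-use it."""
    return saving_per_use * uses_per_year * adoption_rate

# The worked example from the text: $100k saved on an activity repeated
# 10 times a year, assuming every future project re-uses the lesson.
print(projected_annual_value(100_000, 10))       # 1000000
# A more conservative estimate: only half of future projects re-use it.
print(projected_annual_value(100_000, 10, 0.5))  # 500000.0
```

Varying the recurrence and adoption figures gives the range of outcomes the text describes (5 to 20 repeats, $50k to $200k per use), which is why this remains an estimate rather than a measurement.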

If you can, assign value to lessons. This reassures both managers and workers that lesson-learning is a good investment of time and effort.

View Original Source (nickmilton.com) Here.

Quantified KM value case studies numbers 130 through 139

In the embedded video below you can find 10 case studies which illustrate the value delivered by health librarians, who are a key component of the KM systems used in the UK National Health Service.

The video is entitled “Health Librarians and Knowledge Specialists Impact Case Study Vignettes” and is provided by HEE Knowledge for Healthcare. As the YouTube page says,

Health librarians and knowledge specialists ensure that decisions are based upon the best available evidence and encourage knowledge to be captured, shared and re-used. Multiple benefits are possible for NHS staff and organisations when they work closely with health librarians and knowledge specialists. This selection of case studies gives a flavour of this work.  

These case studies include:

  1. £16,800 savings from a reduction in the length of hospital stay for a patient, due to a decision based on summarised evidence provided by a health librarian 
  2. Librarian alerts clinician to an innovative treatment leading to an effective cure (potentially 9 lives saved)
  3. £100,000 savings on agency costs due to evidence provided by a specialist librarian
  4. 303 hours of a senior midwife’s time released (worth £12,389) by using the expertise of a clinical librarian
  5. Librarian input essential to achieve £1.9 million savings
  6. Change in practice leading to a 2-week reduction in waiting times from using summarised best practice from a specialist librarian
  7. Cost savings (no value quoted) and reduction in risk by changing practice informed by evidence provided by a specialist librarian
  8. Transformed approach to discharge of frail elderly patients, as a result of an evidence summary provided by the librarian (no value quoted)
  9. £48,500 savings and improved patient experience by implementing a change based on evidence supplied by the specialist librarian
  10. Low-cost intervention leading to a “significant” reduction in length of stay for critically ill patients, informed by evidence provided by a specialist librarian (no value quoted).

Librarians and Knowledge Specialists are therefore a core part of the knowledge supply chain in the health sector, bringing the evidence to inform healthcare decisions and saving time, increasing efficiency, and improving practice.

View Original Source (nickmilton.com) Here.

How to apply Knowledge Management to Mergers and Acquisitions

Knowledge management delivers maximum value when applied to high value knowledge, to support high value decisions, and in areas where that knowledge is otherwise at risk of being lost. A typical high value area where major decisions will be made is Mergers and Acquisitions. 

Image from wikimedia commons,
Merger of KCR and MTR operations 2007-12-02

Mergers and Acquisitions are high cost, complex operations, where crucial decisions need to be made very well, and yet which happen relatively rarely, so it is easy for tacit knowledge to be lost. People caught up in the high pressure activity can easily forget the detail of how the decisions were made, and fail to pass the knowledge on to future mergers and acquisitions teams.

This combination of high value decisions made relatively infrequently, where human memory alone cannot be relied on as a knowledge store, means that there is great value in documenting the learning for use in future mergers and acquisitions.

In addition, many mergers and acquisitions are conducted for knowledge reasons, in order to acquire competence and capability.

The approach to KM for Mergers and Acquisitions (and let’s include Divestments as well) should be as follows;

  • Learning Before. Through Peer assists and other knowledge-seeking activities, the M&A team should seek to acquire lessons, guidance, success factors and pitfalls from previous Mergers, Acquisitions and Divestments. They may conduct a Knowledge Gap Analysis, to map out the knowledge they need to acquire, and should create a Knowledge Management Plan for the M&A project. 
  • Learning During. If possible, and for high value Mergers, Acquisitions and Divestments only, a Learning Historian should be seconded to the M&A team to capture lessons, practices and key documents as the exercise progresses, perhaps through a series of After Action Reviews.
  • Learning After. Interviews and Retrospects of the key players should be held to determine the success factors to repeat, and the pitfalls to avoid. The focus here will be on capturing tacit knowledge and experience; the ‘golden rules’, ‘top tips’, recommendations and advice that will allow success to be repeated routinely. This might include interviewing any external consultants who had been an integral part of the team. 
  • Knowledge Asset. Output from the interviews and the Retrospect should be packaged into a knowledge asset; a web-based package or secure wiki that provides helpful accessible advice for future re-use. This advice will be at varying levels of detail; from the managerial overview of the “top 10” bullet points, down to operational advice. 
  • Post-merger KM framework. If we assume that each of the two merged companies had an approach to Knowledge Management, in other words their own Knowledge Management Frameworks, then after the merger these two frameworks will need to be merged. This should be done through a combination of an external assessment to look for the strengths and weaknesses in each approach, and a set of Knowledge Exchange meetings to decide on a best combined approach. This would be piloted and rolled out like a conventional knowledge management implementation. Of course, the framework should be compliant with ISO 30401:2018.
  • Post-merger Knowledge Mapping. In a knowledge-based acquisition, where one company acquires another in order to gain access to their knowledge, there will have been some pre-acquisition work done to ensure that the knowledge gain is worth the expense of acquisition. However once the deal is done, there needs to be detailed knowledge mapping to identify the critical knowledge areas, to determine how well managed each of these is, and to put in place actions to improve the management of critical topics. After all, if you spent the money to acquire the competence, you would want to make sure this new asset was well managed.

Case History 

The learning approach outlined above was applied by Company A to their merger with Company B, and again with Company C two years later.

In the latter case, 14 of the core acquisition team were interviewed, and lessons were derived on leadership, the team and team processes, the role of the investment bank, the transaction process, communication, and staff issues. One year after acquisition, 31 staff were interviewed to gather lessons on the Integration process. Interviewees included the Chief Counsel, one of the Country unit Presidents, and several other very senior staff. Learnings were captured on topics such as managing the Integration project, dealing with delay, and getting ready for Day 1 of the integrated company.

Lessons from the first merger were used to guide the process for the second, and lessons from the second acquisition and integration were used to guide the process for several further acquisitions. The value of the knowledge management approach was seen in the reduced involvement of external consultants in successive mergers. Initially the bill to Big 5 consultants had been very high, but this reduced from one acquisition to the next as Company A internalised the knowledge, with savings on external consulting spend in the order of $1 million.

View Original Source (nickmilton.com) Here.

Why "Finding Better Knowledge" is 100 x more valuable than "Finding Knowledge Better"

2 years ago I posted an article where I suggested that a KM strategy based on “finding better knowledge” was more valuable than a strategy based on “better ability to find knowledge”. Now we have a figure for how much more valuable. 

In the 2018 post I suggested that there are two basic ways in which Knowledge Management can add value to an organisation:

  1. Finding Knowledge Better;
  2. Finding Better Knowledge.

The first approach focuses on better search, better content management, tagging, taxonomy, portal structure, and so on. The intent is to have “documented knowledge at your fingertips”, and the result is faster and better access to documents and documented knowledge. The value of this approach is that it saves people time in searching for relevant material, and so drives operational efficiency.

The second approach focuses on learning from experience, on capturing lessons, on connecting people into networks and communities of practice, on collaboration, and on synthesising knowledge into current “best” practices. The intent is to create learning loops and channels in the organisation, for improvement of practice, so that knowledge is continually improved. The value of the second approach is in delivering better decisions, and delivering better results, not just faster decisions. The value comes through improved operational effectiveness.

Often the choice between these two approaches is a stark one, and the two options can be mutually exclusive. Getting good knowledge requires time and conversation; good knowledge is rarely fast, and fast knowledge is rarely good.

In my 2018 article I suggested that the value of the second approach is at least an order of magnitude greater than that of the first; maybe two orders of magnitude. I did not have the statistics to test this estimate at the time.

Now I do.

In the three Knoco KM surveys in 2014, 2017 and 2020 we asked people to tell us (among other things) their business drivers for KM, in order of priority. Business drivers included operational efficiency and operational effectiveness, as discussed above. We also asked them, where they could, to tell us how much value in $USD their KM program has delivered. So we now have the data to test the value of these two approaches.

The graph below shows the average value for the organisations grouped by their priority business driver.

  • Organisations whose primary driver was to increase organisational efficiency, delivered on average $1 million from KM. For these organisations, the most common KM strategy was to improve access to documents.
  • Organisations whose primary driver was to increase organisational effectiveness, delivered on average $106 million. For these organisations, the primary KM strategy was divided between improved access to documents, connecting people through communities and networks, and better lesson learning.
  • The business driver of “providing a better service to customers and clients” is also a type of operational effectiveness driver.

So we can see that in this dataset, operational effectiveness, which comes from finding better knowledge, is actually 100 times more valuable than operational efficiency, which comes from finding knowledge better and more easily. That's two orders of magnitude.

Bear this in mind when you set out your KM strategy and business case. If the business case is based on saving people time through better access to knowledge, you may be underselling the value by a factor of 100. 
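To make the comparison concrete, here is a minimal sketch of the grouping behind the graph, using made-up survey rows chosen so that the averages match the $1 million and $106 million figures above; it is not the actual Knoco survey dataset:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical survey responses: (primary business driver, value delivered in $M).
# Illustrative figures only, not the real Knoco survey data.
responses = [
    ("efficiency", 0.5), ("efficiency", 1.5), ("efficiency", 1.0),
    ("effectiveness", 90.0), ("effectiveness", 120.0), ("effectiveness", 108.0),
]

# Group reported value by the organisation's primary driver.
by_driver = defaultdict(list)
for driver, value in responses:
    by_driver[driver].append(value)

# Average value per driver group, as plotted in the graph.
averages = {driver: mean(values) for driver, values in by_driver.items()}
print(averages)  # {'efficiency': 1.0, 'effectiveness': 106.0}

# The ratio between the two groups: roughly two orders of magnitude.
print(round(averages["effectiveness"] / averages["efficiency"]))  # 106
```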

View Original Source (nickmilton.com) Here.

Helping people FEEL the value of KM – the KM "test drive"

If you want someone to buy something, they need to be convinced that it is worth the investment. If your product is a good one, then you can convince people by letting them try before they buy.

That’s why Apple allows you to play with all its products in the Apple store. That’s why cheese-stalls in the market give away free samples. That’s why car salesmen let you take a test-drive in that new Mercedes.

KM, Before and After

But how can you test-drive knowledge management?

For the past 20 years, we have been running a Knowledge Management exercise called Bird Island which acts as a KM test-drive. The purpose of the exercise is to allow people to experience personally the value of Knowledge Management by seeing (and feeling) the impact it has on their performance.

The exercise is a simple one – the delegates are divided into teams, given a small set of materials, and asked to build as tall a tower as possible (with some environmental constraints). Then knowledge is brought into the equation, first through an after action review within the team, secondly through a peer assist with another team, and finally through presentation of a best practice knowledge asset showing the secrets of building the tallest towers from previous courses.

Armed with a full set of knowledge, they build the tower again, and frequently treble or quadruple their previous performance. 

Behind the exercise is a very simple KM system;

  • Every time a team makes a new modification and improvement to the tower design, we photograph it 
  • We update the Best Practice knowledge asset to include the new modification 
  • We present the updated knowledge asset in the next training course 
  • People use this as the basis for their own design, and often innovate even further (and the innovations feed the next improvement cycle). 

It is the emotional impact in the exercise that sells KM.

This impact comes at three points:

  1. When a team with a small tower, who have defined what they think is the limit of possibility for tower height,  holds a peer assist with a team which has already exceeded that limit. You can almost hear the minds opening at this point.
  2. When the teams are shown a picture of the current world record tower, which is FAR beyond their perceived limits. You can definitely hear minds opening here, and at this point I tell them that the only thing the winning team had which the current teams don’t yet have, is knowledge. Everything else is equal – knowledge is the only difference.
  3. When the teams look in wonder and pride at their second towers, built with a full set of knowledge, which are usually close to the world record, and sometimes set a new record.

This is the KM test drive; it’s an emotionally engaging mind-opener for the participants, and never fails to convert people to the value of KM. 

View Original Source (nickmilton.com) Here.

The value of mentoring programs in KM – quantified story number 129

Mentoring is a valuable component of KM when it comes to onboarding new staff. A recent article tells us just how valuable it is. 

Image from wikimedia commons

The article is entitled “The secrets of leveling up junior employees“, is written by Miriam Kharbat, and deals with the software industry (but is applicable to other industries as well). Miriam describes the value of mentorship in transferring knowledge to new staff, and makes the following points:

  • Mentorship can be beneficial for both parties. Miriam quotes a 2006 Sun Microsystems study which found that mentors were promoted six times more often than those not in the program, and mentees were promoted five times more often than those not in the program. 
  • They also found that retention rates were 72% higher for mentees and 69% higher for mentors than for employees who did not participate in the mentoring program.
  • Start mentorship by giving the junior staff real tasks. Miriam describes asking new staff to download the source code, run it on their local machine and update any dead links or new issues to the knowledge base or ReadMe file.
  • Listen carefully, explain simply, and beware of the curse of knowledge.
  • Teach them where to look for, and ask for, answers. Show them the knowledge base and get them into the community of practice. 
  • Conduct reviews of real work. Miriam suggests code reviews, and says that “Code reviews can be an excellent opportunity for knowledge sharing. They are a great way to teach best practices and good programming patterns. During a code review, ask questions and suggest alternatives. If you think something is not correctly implemented, explain why you think your way is better. Learn to understand the difference between personal preference and essential changes.”
  • Let the mentee drive the schedule of mentorship, but if you haven’t heard from them in a while, check in to see if everything is OK. 

Mentoring new staff, as a component of the KM framework, therefore not only benefits the organisation by getting new staff up to speed quickly; it benefits the mentor and the mentee as well. 

View Original Source (nickmilton.com) Here.

What KM can learn from business start-ups 2 – an effective business model

Yesterday I started a set of blog posts likening KM implementation to a business start-up. Here is number 2 in the series. 

Picture by Tumisu (pixabay.com) on Needpix

In many ways, the initial implementation of Knowledge Management within an organisation is like the launch of a new product into a market by a start-up organisation, and there are many lessons KM can learn from start-ups; their failures and their successes.

(If you want to make a bad pun, you could call KM implementation a “Smart-up”).

This blog series uses this analogy to inform KM implementation by reviewing 5 common reasons for start-up failure and suggesting ways in which KM programs can avoid these failure modes. These common reasons are taken from a great article by David Skok, and are as follows:

  1. Little or no market for the product; 
  2. The business model fails; 
  3. Poor start-up management team; 
  4. Running out of cash; 
  5. Product problems.

Failure of the business model

The business model for a start-up fails if the cost of acquiring new customers exceeds the value each customer brings. If this happens, the start-up will lose more and more money until the investors remove their support.

As David Skok says, “One of the most common causes of failure in the startup world is that entrepreneurs are too optimistic about how easy it will be to acquire customers. They assume that because they will build an interesting web site, product, or service, that customers will beat a path to their door. That may happen with the first few customers, but after that, it rapidly becomes an expensive task to attract and win customers, and in many cases the cost of acquiring the customer (CAC) is actually higher than the lifetime value of that customer (LTV)”.

KM also has a business model. KM receives funding from senior managers, who are the investors in the “KM start-up”. Those investors want a return on their investment, in whatever way they define “return”. The KM implementation must deliver more value to the business than it costs (i.e. deliver a positive ROI), and those costs include the costs of the KM team, KM software, and the costs of implementation, roll-out and support (“acquiring the KM customers”). This positive return must be documented and justified so that senior management do not remove their support and their money.

In addition, KM is a long term commitment; survey data shows that it takes over a decade before KM is fully integrated in the majority of companies. This is a long time to take KM value on faith, and the viability of the business model needs to be demonstrated on a regular basis throughout that decade if funding is to be secured.

Also, during that time there may well be a change in senior management, and the new bosses may need convincing that KM has a positive business model for the organisation. You need to have your evidence ready that KM is a wise investment. Internal reorganisation, and a change in investor, is one of the most common reasons for KM failure, and you need to protect against this.

In order to demonstrate a positive return on investment, the KM implementation team needs to:

  • Understand the metrics that will convince senior management that KM is delivering a return on investment. Know what they want to see from KM, and know how this will be demonstrated or measured 
  • Conduct short term pilot projects throughout the implementation that deliver demonstrable value, as proofs of concept that KM has a positive business model. These pilot projects should solve business problems, and ideally should impact business metrics. Early in KM implementation the KM Framework will still be in process of development, so use a Minimum Viable Framework – one that does just enough to deliver real value. The pilot will also deliver practical lessons that allow you to elaborate the framework.
  • Collect and regularly report all examples of value delivered through KM, in the form of success stories and/or metrics. These will not only allow you to demonstrate a positive business model to your sponsors, showing that KM delivers more value than it costs; it will also provide valuable marketing collateral for further KM roll-out. 
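As a sketch of the return-on-investment arithmetic behind the first bullet, here is a minimal calculation using entirely hypothetical cost and value figures (none of these numbers come from a real KM programme):

```python
def km_roi(value_delivered, team_cost, software_cost, rollout_cost):
    """Simple ROI sketch: value delivered by the KM programme minus its
    total cost, expressed as a multiple of that cost. A positive result
    means KM returned more than it consumed."""
    total_cost = team_cost + software_cost + rollout_cost
    return (value_delivered - total_cost) / total_cost

# Hypothetical year: $2M of documented value against $500k of
# team, software and roll-out costs.
print(km_roi(2_000_000, 300_000, 100_000, 100_000))  # 3.0
```

The point of the sketch is that the denominator includes all the "customer acquisition" costs of roll-out and support, not just the KM team's salaries; a business case that omits them will overstate the return.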

Bear in mind the business model for KM, and how you will demonstrate that it is viable. Without this demonstration, you are vulnerable to losing your funding. 

View Original Source (nickmilton.com) Here.

Why "Knowledge for action" is better than "knowledge for storage"

Knowledge has to lead to action in order to add value. 

call to action by Sean MacEntee on Flickr

As the blogger Bill Wilson says (in the context of root cause analysis) “Learning without action is mere mental trickery, while action without learning is simply useless physical exercise”.  If knowledge management is to deliver more than mere mental trickery and to live up to its promise of adding value, then it must lead to action.

A few years ago we worked with a client who was developing a lesson learning system from projects. The collection of lessons had been going well, but the client had the firm view that lessons should be stored in a library that future projects could review if they wanted. For them, the knowledge would be stored “for future reference”.

Of course, few people have time to read through the lessons, and there are now so many lessons that reading through them is becoming more and more daunting. 

We are now helping the client to move to a different philosophy, where lessons are forwarded to the owners of the organisational processes, so they can continually update the processes, procedures and guidance in the light of the new learning. This is “knowledge for action”, and if we assume that people follow the updated guidance, it should result in less “useless physical exercise” and in more efficient ways of working.

This philosophy is that wherever possible, every piece of new knowledge should lead to an action. The action might be;

  • Fix a problem,
  • Investigate further (especially if the learning is not yet clear),
  • Document a new procedure, process or guidance document,
  • Update an existing documented procedure, process or guidance document,
  • Update a training course or other training or e-learning material,
  • Circulate the lesson for others to decide on an action.
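As a sketch of this "every lesson leads to an action" philosophy, the routing step could look something like the following. The lesson, the action types and the owner name are all hypothetical, and a real system would more likely create tasks in a workflow tool:

```python
from dataclasses import dataclass

@dataclass
class Lesson:
    title: str
    action: str        # e.g. "update_procedure", "fix_problem", "investigate"
    process_owner: str # the person accountable for acting on the lesson

def route(lesson: Lesson) -> str:
    """Turn a lesson into an assigned task for the relevant process owner,
    rather than filing it in a library 'for future reference'."""
    return f"{lesson.process_owner}: {lesson.action} based on '{lesson.title}'"

# A hypothetical lesson routed to a hypothetical owner.
task = route(Lesson("Crane lift delays in high wind", "update_procedure",
                    "lifting-ops owner"))
print(task)
```

The design choice this illustrates is that every lesson record carries an action and an owner as mandatory fields; a lesson with neither is "knowledge for storage" and should prompt the question of why it is being kept.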

Communities of practice, as well, should focus on creating and managing actionable knowledge. Actionable knowledge can be stored on the community wiki, and includes:

  • Advice and guidance
  • Good practices
  • Improved practices
  • Solutions to problems
  • Answers to questions
  • New approaches
  • Recommendations
  • Tips and Tricks

Non-actionable knowledge is:

  • Interesting articles
  • Links to interesting articles
  • Musings
  • Quotes and aphorisms
  • Descriptions of what you are doing (unless you analyse this to bring out actionable learning)
  • Descriptions of what you have done (unless you analyse this to bring out actionable learning)
  • Large document stores

Communities that circulate non-actionable knowledge, or “knowledge for interest”, are classified as Communities of Interest rather than Communities of Practice, the clue being in the title. 

CoPs deliver more value when they focus on solving the problems of the members than when they circulate “interesting links and ideas”. CoPs that operate through a Pull process – where members with problems or issues ask questions and receive recommendations and support from other members – know they are adding value.  Each answered question represents a solved problem; knowledge which the person who asked the question can immediately put into action.

So when you are sharing knowledge in a CoP, ask yourself whether you are sharing “something that others will find interesting” or “something that will help people do their job better” – something actionable.

And when you are designing lesson learning systems, make sure each lesson leads to action, rather than being retained “for interest”.

We recommend “knowledge for action” rather than “knowledge for storage” as being a far more effective system.

View Original Source (nickmilton.com) Here.

How to identify a knowledge "near miss"

In organisational safety management, they identify a “near miss” as evidence that safety practices need to be improved.  We can do the same in knowledge management.

Image from safety.af.mil

I have often used Safety Management as a useful analogue for KM, and here’s another good crossover idea.

In safety management they identify safety breaches (accidents, injuries, “lost time incidents”) as metrics and indicators that safety management needs to be improved.

They also track “near misses” – incidents where nobody was harmed, but only by luck, or “unplanned events that did not result in injury, illness or damage – but had the potential to do so“. A hammer dropped from height and landing a few feet away from a worker on the ground, a bolt blown past someone’s head by an escape of compressed gas, a near collision between two aircraft, all are examples of near misses indicating that safety management needs to be improved. 

In KM we can track lost knowledge incidents, where time, money or effort was wasted because knowledge should have been available but “got lost” along the way. The knowledge is, or was once, available to the organisation, but failed to reach the person who needed to act upon it, with resulting cost to the organisation in terms of recovery cost, rework, lost sales, delay etc. If you are lucky you can quantify this cost as part of the Cost of Lost Knowledge, aka the Cost of Ignorance, and use this in your KM business case.

But we can also track Knowledge Near Misses. This is where the knowledge was not lost, and therefore no cost was incurred, but it was only found or transferred by lucky chance.

I heard a great example recently in a client organisation (and I paraphrase below).

The organisation was planning an activity. It seemed a little risky but quite doable, and there was management pressure to go ahead. They were discussing this activity in a meeting, and someone from another part of the business who happened to be in the meeting by chance (he was not invited to discuss this particular activity) spoke up and said “I was part of a team that tried this before. It was a complete disaster, and we are still recovering from the mess it created”.

The lessons from this previous project had not been captured, they were not in the lessons database, and the project report was not findable but buried in a mass of project files on a hard drive somewhere. Had that person not by chance been at the meeting, the “complete disaster” would most likely have been repeated with resulting costs in manpower, money and reputation.

This was a knowledge near miss. This event did not result in cost to the organisation through lost knowledge, but had the potential to do so, and was only avoided through luck. With a proper KM framework in place, and followed by all staff in a systematic way, this knowledge would not have been lost, and the planned activity could have been assessed in the full light of historic lessons.

You can find another KM near-miss story here.

The knowledge near miss is a useful metric which provides evidence of the value of, and need for, effective KM.

View Original Source (nickmilton.com) Here.