It is active – with 5 million users, 14 million page views per year, and plenty of new edits and content added in the last few days. Its 7,500 articles cover topics such as project planning, project activities, legislation and standards, and industry context. It also contains overviews of iconic buildings such as Number 10 Downing Street, the White House, the Palm Atlantis and the Bank of China Tower.
It looks like it uses MediaWiki technology, and is edited by Designing Buildings Limited. It has useful features such as a “related articles” box and a “featured articles” section. It even contains a section on Knowledge Management in construction.
This is a really good showcase for the power of wikis.
McKinsey is one of the leading Knowledge Management organisations in the world. Here is how they got there.
Image from Wikimedia Commons
I have referred to McKinsey a few times on this blog, describing their approach to knowledge centers, and some of the KM roles they have in place. McKinsey are one of the leading firms in KM terms, and a many-time winner of the MAKE awards.
Below, from this case study, I have distilled a roadmap of how they got to their leading position. The case study was written in 2006, but seems to take the story up to the end of the millennium.
1970s – period of slow growth and increased competition for McKinsey; decision to invest in expertise and strategy. “One Firm” policy. Various informal networks, but no attempt to reuse knowledge from one assignment to the next.
1976 – new MD and increased focus on skill development, training and functional expertise. Creation of a standard classification of key tasks.
1980 – Fred Gluck (later to become MD) starts a group focusing on knowledge building. Practice bulletins introduced.
1982 – Creation of 15 “centres of competence” covering business areas such as finance, logistics, and strategic management. Experts appointed as practice leaders, each area with a full-time practice coordinator.
1980s – still no mechanism or process to capture knowledge. Consultants encouraged to publish and share, but contributions are rare.
1987 – formal KM project set up, with three objectives: a database of project details (Practice Information System – PIS), a knowledge base covering the main areas of practice (Practice Development Network – PDNet), and an index of specialists and core documents (Knowledge Resource Directory – KRD).
Late 1980s – KRD evolves into the McKinsey yellow pages; PIS and PDNet populated.
1990 – Introduction of client service teams focused on working long term with clients and developing deep understanding of client issues. Staff encouraged to publish books and articles.
1993 – McKinsey spending $50 million annually on knowledge building
1994 – Rajat Gupta becomes MD, strong supporter of KM
1995 – “Practice Olympics” created in order to promote development of practice knowledge.
1995 – establishment of first Knowledge Center in India, focused on provision of research and knowledge to client-facing consultants.
Late 1990s – McKinsey develop a “personalisation” strategy for KM, with a focus on dialogue and knowledge sharing between individuals, and the development of knowledge sharing networks. Editors employed to convert client PowerPoint decks into reference documents with quality ratings.
“Our work is founded on a rigorous understanding of every client’s institutional context, sector dynamics, and macroeconomic environment. For this reason, we invest more than $600 million of our firm’s resources annually in knowledge development, learning and capability building. We study markets, trends, and emerging best practices, in every industry and region, locally and globally. Our investment in knowledge also helps advance the practice of management”.
In this video, Ed Hoffman, ex-CKO of NASA, talks about the role of the CKO, and the role of KM in major projects. This is an audio interview – the pictures on the video are a loop of screenshots of the major projects knowledge hub – so you could treat this as a commute podcast.
He covers key focus areas for the CKO in project-based organisations:
Engagement, particularly with risk and safety management as well as with knowledge
Accelerated learning, especially from other organisations
Conversation (what Ed calls “the sound of success”)
In conclusion, Ed sees conversation, digitalisation, engagement and rapid learning as the response to increasingly complex and fast projects and missions.
This video, hosted by SearchContentManagement.com, is a talk given at KM World 2011 by Rosemary Amato, the Deloitte program director for global client intelligence, in which she describes how Deloitte keeps its KM strategy current.
Deloitte surveys their staff to test how people use knowledge, share knowledge and collaborate. Based on the responses, they can make changes to their KM strategy. The interesting thing here is the way they focus on the needs of the users.
As Amato says:
“We want to understand the people using our knowledge assets; what they want and how they want to work. The end user needs to value how knowledge can serve them, and without this no KM department can succeed. They need to know what knowledge they need, who to call, where to look for it and how to search for it, and most importantly they get an answer that solves their need.”
She also talks about how knowledge sharing is embedded in the way people work, including the need to capture knowledge from departing experts. She describes how one expert worked one-on-one with younger consultants for six months, to share and capture his knowledge.
Taken from this document, here is a great insight into lesson management from Emergency Management Victoria.
Emergency Management Victoria coordinates support for the state of Victoria, Australia during emergencies such as floods, bush fires, earthquakes, pandemics and so on. Core to their success is the effective learning of lessons from various emergencies.
The diagram above summarises their approach to lesson learning, and you can read more in the review document itself, including summaries of the main lessons under 11 themes.
They collect observations from individuals (sometimes submitted online), and from monitoring, formal debriefs, After Action Reviews and major reviews.
These observations are analysed by local teams and governance groups to identify locally relevant insights, lessons and actions required to contribute to continuous improvement. These actions are locally coordinated, implemented, monitored and reported.
The State review team also take the observations from all tiers of emergency management, and analyse these for insights, trends, lessons and suggested actions. They then consult with subject matter experts to develop an action plan, which is presented to the Emergency Management Commissioner and Agency Chiefs for approval.
The State review team supports the action plan by developing and disseminating supporting materials and implementation products, and will monitor the progress of the action plan.
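As a rough illustration, the observation-to-action flow described above can be modelled in a few lines of Python. This is only a sketch of the process as I understand it: the class names, field names and sample observations are all invented for illustration, not taken from EMV's own systems.

```python
from dataclasses import dataclass, field

# Hypothetical model of EMV's observation-to-action flow.
# All names and sample data are invented for this sketch.

@dataclass
class Observation:
    source: str   # e.g. "individual", "formal debrief", "After Action Review"
    tier: str     # "local" or "state"
    text: str

@dataclass
class ActionPlan:
    insights: list = field(default_factory=list)
    actions: list = field(default_factory=list)
    approved: bool = False  # approval by the Commissioner and Agency Chiefs

def state_review(observations):
    """State review team: analyse observations from all tiers into a draft plan."""
    plan = ActionPlan()
    for obs in observations:
        plan.insights.append(f"Insight from {obs.source}: {obs.text}")
        plan.actions.append(f"Action addressing: {obs.text}")
    return plan  # plan still needs approval before implementation

obs = [Observation("formal debrief", "local", "radio channels overloaded"),
       Observation("individual", "state", "evacuation maps out of date")]
plan = state_review(obs)
print(len(plan.actions))  # one candidate action per observation
```

The point of the sketch is simply that every observation, whatever its source or tier, feeds the same analysis step, and nothing becomes an approved action until it has passed through the governance layer.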
This approach sees lessons taken through to action both at local level and at State level, and is a very good example of Level 2 lesson learning.
The need for Knowledge Management and Lesson Learning is most obvious where the consequences of not learning are most extreme. Fire-fighting is a prime example of this – the consequences of failing to learn can be fatal, and fire fighters were early adopters of KM. This includes the people who fight the ever-increasing numbers of grass fires and forest fires, known as Wildland fires.
The history of lesson learning in the wildfire community is shown in the video below, including the decision after a major tragedy in 1994 to set up a lessons learned center covering wildfire response across the whole of the USA.
The increase in wildland fires in the 21st century made it obvious to all concerned that the fire services needed to learn quickly, and the Wildland Lessons Learned Center began a number of activities, such as the introduction of After Action Reviews and the collection of lessons from across the whole of the USA. A national wildfire “corporate university” is planned, of which the Lessons Learned Center will form a part.
The Wildland Lessons Learned Center can be found here, and its website includes lessons learned reports from various fires, online discussions, a blog (careful – some of the pictures of chainsaw incidents are a bit gruesome), a podcast, a set of resources such as recent advances in fire practice, a searchable incident database, a directory of members, and the ability to share individual lessons quickly. This is a real online community of practice.
Many of the lessons collected from fires are available as short videos published on the Wildland Lessons Learned Center YouTube channel, available to firefighters on handheld devices. Each video shares lessons from a particular fire, speaking directly to the firefighter and asking them to imagine themselves in a particular situation. See the example below from the “close call” deployment of a fire shelter during the Ahorn fire in 2011, which includes material recorded from people actually caught up in the situation.
Sometimes lessons can be drawn from multiple incidents, and combined into guidance. Chainsaw refueling operations are a continual risk during tree felling to manage forest fires, as chainsaw fuel tanks can become pressurised, spraying the operator with gasoline when the tank is opened (the last thing you want in the middle of a fire). Lessons from these incidents have been combined into the instructional video below.
This video library is a powerful resource, with a very serious aim – to save lives in future US Wildland fires.
As a result of this post I was invited to join a NASA webinar on lesson learning, which you can review here, and which provides a more up-to-date overview of the NASA approach to lesson learning. Here are my take-aways (and thank you, Barbara, for the opportunity to attend).
Each NASA project is required to conduct lessons capture meetings, which they call “Pause and Learn”. These Pause and Learn meetings generally use an external facilitator. Lessons are entered into LLIS in a standard template, which contains the following sections:
Recommendation(s) (there is some variation in the way that Recommendations are differentiated from Lessons)
Evidence of Recurrence Control Effectiveness
Although LLIS is essentially a passive database, there is an external process to control the recurrence of lessons, and many lessons seem to be referenced or referred to in standards and guidance. However, even when a lesson has been referenced in standards it still remains in the database, and LLIS contains lessons all the way back to the Apollo program. I submitted a question to the webinar about how NASA deals with the archival of embedded, obsolete or duplicate lessons, but this was not one of the questions selected for discussion.
Some parts of NASA take the lesson management process further. Dr Jennifer Stevens, the Chief Knowledge Integrator of the Marshall Space Flight Center, described the work of the distilling team, who look through the database of lessons and distil out the common factors and underlying issues which need correction. They see lessons as an immediate feedback system from operations, and they compartmentalise and group lessons until they can identify a corrective action, often updating a policy or guidance document as a result. Some lessons, which they cannot act on immediately, go into what they call a Stewpot, where they look for trends over time. A lesson, or type of lesson, which is seen many times is indicative of some sort of systemic or cultural issue which may merit action.
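The Stewpot trend-spotting amounts to counting recurring lesson types and flagging the frequent ones as potentially systemic. The sketch below illustrates the idea only: the lesson categories and the threshold of three are invented for illustration, not taken from NASA's actual process.

```python
from collections import Counter

# Hypothetical sketch of "Stewpot" trend analysis: lessons that cannot be
# acted on immediately are grouped by type, and a type seen many times is
# flagged as a possible systemic or cultural issue. The categories and the
# threshold are invented for this example.

def find_systemic_issues(lesson_types, threshold=3):
    """Return lesson types that occur at least `threshold` times."""
    counts = Counter(lesson_types)
    return [t for t, n in counts.items() if n >= threshold]

stewpot = ["schedule pressure", "interface mismatch", "schedule pressure",
           "schedule pressure", "interface mismatch"]
print(find_systemic_issues(stewpot))  # ['schedule pressure']
```

The design choice worth noting is that no single lesson triggers action here; it is the accumulation of similar lessons over time that signals an underlying issue.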
Projects at NASA are required to create a Knowledge Management plan, which they refer to as a Lesson Learning Plan, as described by Barbara Fillip, KM lead at Goddard Space Flight Center. This plan documents:
How the project intends to learn from others
How the project intends to learn through its lifecycle
How the project will share lessons with others.
The plan is built on a basic template of three pages, one for each section, and there is no requirement for a planning meeting. Each project completes the plan in their own way. This is similar to the Knoco KM plan – drop me a message if you want a copy of our free KM plan template.
A few more snippets I picked up:
NASA, in their Pause and Learn sessions, use “We” language rather than “They” language. The conversation is all about what WE did, and what WE should do, rather than what THEY did and how THEY need to fix it.
A motto they use to promote Learning before doing is “Get smart before you start”. NASA do not refer to success and failure in their Lesson Learning system – they talk about Events. An Event is what happened – a Mistake or Failure or Success is just a label we put onto events. NASA seeks to learn from all events.
In conclusion, the NASA lesson learning system is a well-developed Level 2 system, and lessons are used to systematically drive change. Although LLIS does not seem to have the functionality to automate this driving of change, there are enough resources, such as the Distillation team, to be able to do this manually.
The RCAF have the following roles and accountabilities, shown in the diagram to the right, and described below:
A senior sponsor, known as the Lessons Learned Command Authority – this is actually the Commander of the RCAF, and is accountable to the Vice Chief of the Defence Staff for implementing and overseeing the Lesson Learned Programme. Note that the Chief of Defence Staff requires the RCAF to establish processes that add value to the existing body of knowledge, or attempt to correct deficiencies in concepts, policy, doctrine, training, equipment or organizations, and the Lessons Learned Programme is one response to this requirement.
A delegated customer/custodian for the Lessons Learned programme, known as the “Lessons Learned Programme Authority”. This is the Deputy Commander, who is responsible for all Air Lessons Learned matters, including maintenance and periodic review of the programme.
A leader for the Lessons Learned programme, called the Lessons-Learned Technical Authority. This is the Commanding Officer of the Canadian Forces Aerospace Warfare Centre, who reports to the Lessons Learned Programme Authority for lessons-learned matters, and who is responsible for executing all aspects of the programme with the help of a dedicated Analysis and Lessons Learned team.
Clear accountabilities for the leaders of the various divisions in their roles as Lessons Learned Operational Authorities, to effectively operationalize and implement the programme within their command areas of responsibility.
Each of these appoints a Lessons Learned point of contact to coordinate the Lessons Learned activities and functions for their organization, as well as to address issues that have been forwarded along the chain of command.
Wing Lessons-Learned Officers embedded in the organisation at wing and formation levels, who provide lesson-learning advice to the wing commander related to missions and mission-support activities.
Unit Lessons-Learned Officers within the RCAF units who coordinate the documentation and communication of what has been learned during daily activities; liaising directly with their relevant Wing Lessons-Learned Officer. These are like the Lesson Learned Integrators in the US Army.
You can see how accountability for lesson learning comes down the chain of command (the red boxes in the diagram) from the RCAF Commander right down to Unit level, and how enabling and supporting roles are created at many levels – the LL Programme, the Divisional points of contact, the Wing LLOs and the Unit LLOs.
The principle of delegated accountability down the line management chain, enabled by supporting resources, is a good one which can be applied in many organisational settings.