Leaving lessons in a lessons database doesn’t work – an example from NASA.
NASA found out the hard way that just collecting lessons into a database is not enough.
Image from Wikimedia Commons
Five years ago, NASA conducted an audit of lesson-learning. At the time, NASA was spending $750,000 annually on their lesson-learning approach, centred around a tool called LLIS (the Lessons Learned Information System). NASA was at the time (and still is) one of the world's leaders in Knowledge Management, and they wanted to know whether this money was well spent, and if not, what could be done. (Note, of course, that lesson learning is only one part of NASA's KM approach, and thanks to Barbara Fillip for bringing me up to speed.)
According to the levels of use and application found by the auditors, there was plenty of room for improvement in lesson-learning. Specifically:
“We found that NASA program and project managers rarely consult or contribute to LLIS even though they are directed to by NASA requirements and guidance.
In fact, input to LLIS by most Centers has been minimal for several years. Specifically, other than the Jet Propulsion Laboratory (JPL), no NASA Center consistently contributed information to LLIS for the 6-year period from 2005 through 2010.
For example, the Glenn Research Center and the Johnson Space Center contributed an average of one lesson per year compared to the nearly 12 per year contributed by JPL …..
Taken together, the lack of consistent input and usage has led to the marginalization of LLIS as a useful tool for project managers”
With minimal contributions (other than at JPL) and only rare consultation, the system was simply not working.
Why did it not work?
The project managers that were surveyed offered a variety of reasons for not using or contributing to LLIS, including:
- A belief that LLIS is outdated and not user-friendly
- A belief that LLIS does not contain information relevant to their project
- Competing demands on their time in managing their respective projects
- Policy requirements that have been weakened over time
- Inconsistent policy direction and implementation
- Lack of monitoring
Levels of lesson learning
Earlier this year I described 3 levels of lesson learning, and the approach reviewed by the auditors is Level 1: the reactive capture of lessons in the hope that others will review them and learn from them.
Ideally any organisation should aim for Level 2, where lessons lead to changes in designs, practices or procedures. A lesson is therefore an increment of knowledge, and those little increments are used to build an ever-improving body of knowledge. Once a lesson has been embedded as a change to a practice, a design principle or a checklist, it can be removed from the database.
Ideally the NASA database would be empty, with every lesson incorporated into some body of knowledge somewhere. The only lessons in the system would be those pending incorporation.
If this system works well and quickly, then there should be no need for anyone to consult a lessons database – instead they should go to the designs, the checklists, and the design principles.
By relying on a Level 1 lesson-learning system, NASA were already making things difficult for themselves.