Tacit Knowledge and cognitive bias
Is that really Tacit Knowledge in your head, or is it just the Stories you like to tell yourself?
IMAGINATION by archanN on Wikimedia Commons
All Knowledge Managers know about the difference between tacit knowledge and explicit knowledge, and the difference between the undocumented knowledge you hold in your head and the documented knowledge which can be shared. We often assume that the “head knowledge” (whether tacit or explicit) is the Holy Grail of KM; richer, more nuanced, more contextual and more actionable than the documented knowledge. However, the knowledge in our heads is far from reliable: it is subject to a whole range of cognitive biases and memory failures.
These biases and failures are tendencies to think in certain ways that can lead to systematic deviations from good judgement, and to remember (and forget) selectively and not always in accordance with reality. We all create, to a greater or lesser extent, our own internal “subjective social reality” from our selective and flawed perception and memory.
Cognitive and memory biases include:
- Confirmation bias, which leads us to take on new “knowledge” only when it confirms what we already think
- Gambler’s fallacy, which leads us to believe that past chance events influence future ones (such as expecting tails to be “due” after a run of heads)
- Post-investment rationalisation, which leads us to think that any costly decisions we made in the past must have been correct
- Sunk-cost fallacy, which makes us more willing to keep pouring money into big failing projects than into small ones, because we feel we have already invested too much to stop
- Observational selection bias, which leads us to think that the things we notice are more common than they really are (like when you buy a yellow car, and suddenly notice how common yellow cars are)
- Attention bias, where there are some things we just don’t notice (see the famous “invisible gorilla” attention experiments)
- Memory transience, which is the way we forget details very quickly, and then “fill them in” based on what we think should have happened
- Misattribution, where we remember something but attribute it to the wrong source, time or person
- Suggestibility, where leading questions or other people’s accounts lead us to create false memories of things that never happened
Do these biases really affect tacit knowledge?
You would expect experts at the World Bank to hold a reliable store of tacit knowledge about investment to alleviate poverty, yet a chapter in one of the World Bank’s own reports shows that these experts are as prone to cognitive bias as the rest of us. Particularly telling, for me, was the graph that compared what the experts predicted poor people would think against the actual views of the poor themselves. The chapter points to several sources of bias in expert decision making, including:
- the use of shortcuts (heuristics) in the face of complexity;
- confirmation bias and motivated reasoning;
- sunk cost bias; and
- the effects of context and the social environment on group decision making.
So what are the implications?
- We can test Individual Knowledge against the knowledge of the Community of Practice. The World Bank chapter suggests that “group deliberation among people who disagree but who have a common interest in the truth can harness confirmation bias to create ‘an efficient division of cognitive labor’. In these settings, people are motivated to produce the best argument for their own positions, as well as to critically evaluate the views of others. There is substantial laboratory evidence that groups make more consistent and rational decisions than individuals and are less ‘likely to be influenced by biases, cognitive limitations, and social considerations’. When asked to solve complex reasoning tasks, groups succeed 80 percent of the time, compared to 10 percent when individuals are asked to solve those tasks on their own. By contrast, efforts to debias people on an individual basis run up against several obstacles (and) when individuals are asked to read studies whose conclusions go against their own views, they find so many flaws and counterarguments that their initial attitudes are sometimes strengthened, not weakened”. Therefore community processes such as Knowledge Exchange and Peer Assist can be ideal ways to counter individual biases.
- We can routinely test community knowledge against reality. Routine application of reflection processes such as After Action Review and Retrospect requires an organisation to continually ask the questions “What was expected to happen?” vs “What actually happened?”. With good enough facilitation, and then careful management of the lessons, reality can act as a constant self-correction mechanism against group and individual bias.
- We can bring in other viewpoints. Peer Assist, for example, can be an excellent corrective to group-think in project teams, bringing in others with potentially very different views.
- We can combine individual memories to create a team memory. Team reflection such as Retrospect is more powerful than individual reflection, as the team notices and remembers more things than any individual can.
- We can codify knowledge. Poor as codified knowledge is, it acts as an aide memoire, and counteracts the effects of transience, misattribution and suggestibility.
Tags: Archive, cognitive bias