Mistakes

Avoidable mistakes have been responsible for the loss of billions of dollars and thousands of lives. Psychological experiments have thrown light on their causes, and other studies have suggested methods by which their incidence may be reduced.

Definition

With hindsight, any decision that has unintended consequences may be considered to have been a mistake. It is more useful, however, to define a mistake as a misguided decision, whatever its consequences. Chance may avert the unintended consequence that is to be expected of a misguided decision, and chance may also produce an unintended consequence that could not have been anticipated. A driver may escape a collision when he crosses a busy junction against a red light. He may also be the victim of someone else who does so, but in that case his decision to cross the junction legally cannot usefully be considered a mistake, despite its unintended consequence.

Overview

A practical outcome of studies of mistakes has been the discovery of ways of avoiding them. Experimental psychology has revealed the existence of characteristics of the human brain that are conducive to misguided choices, but it has also revealed the possibility of conscious control over those subconscious tendencies. Other studies have explored the effects of organisations, organisational environments and administrative procedures. A number of measures have been adopted or proposed for the avoidance of mistakes.

The causes of mistakes have been categorised as misinformation, the misinterpretation of information, and decision errors.

Misinformation

Misinformation by journalists, politicians and businessmen is widely believed to be commonplace[1]. Misinformation in the form of "creative accounting" is a common feature of company accounts, and it has not been uncommon for the demise of a company to follow closely upon a report of its good health. Outright deception (such as that reported to have been practised in 2008 by the management of the Lehman Brothers bank[2]) has seldom been established, but is widely suspected. The professions are generally believed to be trustworthy, but there have been numerous cases of inadvertent expert error. Cases in which expert witnesses and lawyers have committed the error known as the prosecutor's fallacy have resulted in serious miscarriages of justice[3], and there is evidence to suggest that doctors fall victim to an error known as the false positive fallacy when evaluating blood test results[4]. Misinformation in the form of ill-founded forecasts by credit rating agencies has been revealed as one of the causes of the recession of 2008-11[5].
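
Both fallacies amount to neglecting the base rate when interpreting a positive result, a point that a short calculation with Bayes' theorem makes concrete. The Python sketch below uses illustrative numbers chosen for this example; they are not drawn from the cited studies.

  # Illustrative (assumed) figures: a condition affecting 1 in 1,000 patients,
  # a test that detects it 99% of the time, and a 5% rate of false positives
  # among healthy patients.
  prevalence = 0.001          # P(condition)
  sensitivity = 0.99          # P(positive | condition)
  false_positive_rate = 0.05  # P(positive | no condition)

  # Bayes' theorem:
  # P(condition | positive) = P(positive | condition) * P(condition) / P(positive)
  p_positive = sensitivity * prevalence + false_positive_rate * (1 - prevalence)
  p_condition_given_positive = sensitivity * prevalence / p_positive

  print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
  # Prints about 1.9%: even after a positive result from a "99% accurate" test,
  # the patient is overwhelmingly likely to be healthy. The false positive
  # fallacy is to read the result as a 99% chance of having the condition.

The same arithmetic underlies the prosecutor's fallacy: the probability of the evidence given innocence is mistaken for the probability of innocence given the evidence.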

Misinterpretation

Some errors of interpretation by decision-makers have been due to ignorance or inattention, but many have occurred subconsciously. There have even been cases of failure to recognise visual evidence. Experiments have demonstrated that the brain may be aware of only a censored version of the information that it receives[6] (graphically demonstrated in a video[7]). The interpretation of inputs may also be hampered by "cognitive dissonance", which is the psychologists' term for the conflict that is experienced when existing beliefs are contradicted by new information[8]. It has been found that humans are endowed with a subconscious drive, almost as strong as the sex drive, to avoid that unpleasant sensation, and that it can prompt decision-makers to discount, or ignore, unwelcome information. An observed tendency to perceive causal connections when none exist has been attributed to attempts of the brain to relate observations to experience. (An example of its distorting effect upon decision-making is said to be the widespread use of the Rorschach inkblot test by clinical psychologists, despite extensive evidence that it is worthless[9].)

Decision error

The distinguished cognitive psychologist Massimo Piattelli-Palmarini has identified barriers to rational decision-making that have the following characteristics[10]:

  • they are found in all who have not been trained to avoid them, including acknowledged experts in the matter under consideration;
  • they are innate and thus distinct from ordinary errors of judgement;
  • they affect decisions in a variety of different situations.

The cognitive illusions that have been identified by Piattelli-Palmarini and others include:

  • tunnel vision, which is a tendency to give exclusive attention to what is immediately apparent;
  • loss aversion, which is a greater sensitivity to the danger of loss than to the prospect of gain;
  • omission bias, which is a reluctance to act even when inaction is more harmful;
  • probability blindness, which is the inability of the brain to make intuitive estimates of probability (illustrated in the sketch after this list);
  • accountability bias, which is a tendency to give weight to the effect of a decision upon the repute of the decision-maker.
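
Probability blindness can be seen in the "birthday problem", a standard textbook illustration (chosen for this sketch; it is not one of Piattelli-Palmarini's examples). Intuition suggests that a shared birthday among 23 people should be rare; a short Monte Carlo simulation in Python shows otherwise.

  import random

  def shared_birthday(group_size, trials=100_000):
      """Estimate the probability that at least two people in a group
      of the given size share a birthday (ignoring leap years)."""
      hits = 0
      for _ in range(trials):
          birthdays = [random.randrange(365) for _ in range(group_size)]
          if len(set(birthdays)) < group_size:  # a duplicate birthday occurred
              hits += 1
      return hits / trials

  print(f"P(shared birthday among 23 people) = {shared_birthday(23):.2f}")
  # Prints about 0.51, far above most people's intuitive estimate.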

Experiments by the psychologists Daniel Kahneman and Amos Tversky[11] have provided evidence of the existence of cognitive illusions and have drawn attention to a tendency to adopt unreliable short-cuts (or "heuristics") as a substitute for logical analysis:

  • the availability heuristic is the tendency to choose the more familiar of two alternatives;
  • the simulation heuristic is a tendency to choose the more readily imagined of two explanations;
  • the representativeness heuristic is a tendency to base judgements on recollections of situations deemed to be similar.

Avoidance

The discovery of ways of exerting conscious control over the behaviour of the brain (termed "neurofeedback")[12] suggests that there is a prospect of overcoming the subconscious barriers to rational decision-making. Other work suggests that many accidents could be avoided by removing the administrative and cultural obstacles that prevent organisations from "learning from their mistakes". Among the barriers that had been found to be restricting the ability of Britain's health service to avoid the recurrence of past mistakes were[13]:

  • an undue preoccupation with the immediate event rather than the root cause of failure;
  • a tendency to "scapegoat" rather than addressing organisational failings;
  • the formation of alliances whose members agree to cover for each other's mistakes and act defensively against intervention from outsiders.

To address those barriers the British health service introduced a confidential reporting system[14], and there was a similar development in the United States[15]. Other organisational studies have shown that small groups make fewer mistakes than their members would have made on their own[16], but there is also evidence that groups tend to suppress minority views and adopt extreme positions, a phenomenon termed "groupthink"[17]. The problem of groupthink can, however, be countered by increasing the diversity of function and educational background among group members[18].
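
One mechanism by which pooling judgements can reduce mistakes is the statistics of majority voting, illustrated in the idealised Python sketch below (a simplification assumed for this example, not a model of the Blinder and Morgan experiment). It also suggests why groupthink is damaging: the benefit depends on members judging independently, which is precisely what the suppression of minority views removes.

  import random

  def majority_accuracy(members=5, p_correct=0.6, trials=100_000):
      """Estimate how often a simple majority of independent members,
      each individually right with probability p_correct, is right."""
      wins = 0
      for _ in range(trials):
          correct_votes = sum(random.random() < p_correct for _ in range(members))
          if correct_votes > members / 2:
              wins += 1
      return wins / trials

  print("Individual accuracy:       0.60")
  print(f"Five-member majority vote: {majority_accuracy():.2f}")
  # Prints about 0.68: the group outperforms its members, but only
  # because their errors are independent.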

References

  1. Ipsos-Mori Veracity Index 2008
  2. Anton Valukas: In re Lehman Brothers Holdings Inc, report to the United States Bankruptcy Court, March 2010
  3. Evaluating Legal Evidence, Department of Computer Science, Queen Mary College, University of London, 2009
  4. Michael Eysenck and Mark Keane: Cognitive Psychology, page 567 (Google extract)
  5. Hearing on the Credit Rating Agencies and the Financial Crisis, Committee on Oversight and Government Reform, United States House of Representatives, October 22 2008
  6. Research at the University of Illinois Visual Cognition Lab
  7. Christopher Chabris and Daniel Simons: The Invisible Gorilla, 2010
  8. L A Festinger: Theory of Cognitive Dissonance, Stanford University Press, 1957
  9. Robin Dawes: "Giving up Cherished Ideas: The Rorschach Test", Journal of the Institute for Psychological Therapies, 1991
  10. Massimo Piattelli-Palmarini: Inevitable Illusions, John Wiley & Sons, 1994, page 140
  11. Daniel Kahneman and Amos Tversky (eds): Judgement Under Uncertainty, Cambridge University Press, 1982
  12. D. Corydon Hammond: Comprehensive Neurofeedback Bibliography: 2007 Update, Journal of Neurotherapy, Vol. 11(3) 2007
  13. An Organisation With a Memory, Report of an expert group on learning from adverse events in the UK's National Health Service, The Stationery Office Limited, 2000
  14. National Reporting and Learning Service, National Health Service, 2009
  15. To Err is Human: Building A Safer Health System, US Institute of Medicine, 1999
  16. Alan S. Blinder and John Morgan: Are Two Heads Better than One?: An Experimental Analysis of Group vs. Individual Decisionmaking, National Bureau of Economic Research, September 2000
  17. Cass Sunstein: The Law of Group Polarisation, University of Chicago, 1999
  18. Mannix and Neale: Diversity at Work, Scientific American Mind, August/September 2006