Learning from mistakes
Psychologists from ETH Zurich have been studying what prompts mistakes in hospitals and elsewhere, how they can be avoided, and what doctors and nurses can learn from them.
This article was published in Globe, no. 4/December 2013.
The operation is in full swing. Doctors and surgical nurses are working intently; everything is going according to plan. Suddenly, the patient’s oxygen saturation changes and his values start falling. Will the anaesthetist notice in time before there is any risk of complications? If not, will one of his colleagues alert him? Whatever the outcome, however, this particular patient will be fine, because the team is working on a dummy to simulate an operation that went wrong in real life.
Unaware of the real case, the team is merely instructed to perform a particular operation. The project supervisors transferred all the data from the real patient to the dummy to create an identical situation. Will the result be the same? Will the operation fail?
Occupational and organisational psychologists Gudela Grote and Theo Wehner from ETH Zurich have supervised similar simulations professionally in several projects, and evaluated video recordings of them with the respective surgical teams. Together with the doctors and nursing staff, they are trying to find out what prompts mistakes, how they can be avoided and what we can learn from them.
"It’s everyone’s actions that favour or prevent mistakes, not one individual’s level of exper?tise."Theo Wehner, Professor for occupational and organisational psychology at ETH Zurich
Near-errors are sufficient
In the simulations, making mistakes is explicitly allowed – even though Wehner is convinced that a near-error is sufficient to learn from, especially as "phew, that was a close one!" is regarded more positively than "I’ve made a mistake." After all, for Wehner, making a mistake is all too often still equated with failure. "And in European culture, failure is a big no-no that has to be avoided at all costs." As a result, mistakes are only reluctantly made public – even though others might benefit from them.
Take hospitals, for instance, where mistakes are alarmingly common, as was recently revealed by a report from the scientific research institute of the AOK, one of the largest health insurance companies in Germany. It states that around 19,000 people die in Germany every year as a result of medical errors – that’s five times as many as on the roads. Organisational problems, stress, the wrong drugs, infections: the list of reasons is long.
Practising on dummy patients
One fundamental reason, as the psychologists from ETH Zurich have discovered in diverse studies, is a lack of communication. Based on simulations, for instance, Grote and her colleagues were able to demonstrate that the performance of anaesthesia teams depends greatly on their ability to communicate openly with one another and constructively express doubts regarding the performance of colleagues – "speaking up", as the psychologists call it.
Around 30 teams comprising one doctor and one anaesthesia nurse participated in a study conducted at University Hospital Zurich. They were set the task of anaesthetising dummy patients for an operation and inserting a breathing tube into the windpipe – a routine situation, in other words. As in the case mentioned at the beginning, however, the study supervisors complicated the exercise by manipulating the blood pressure, pulse or respiratory rate. Based on the video recordings, the psychologists from ETH Zurich studied how the participants had communicated, while doctors evaluated the team performance from a medical perspective.
Grote’s research team focused on typical hospital situations such as the following: during an operation, an anaesthesia nurse gets the impression that something is wrong or suspects that the assistant physician is making a mistake. But she doesn’t voice her concerns – either because she daren’t, given her position, or because she fears negative repercussions. The experts observed the same reluctance among assistant doctors towards the senior or chief physician.
Open communication helps
However, one thing is clear from the studies: fewer mistakes occur in the operating theatre in teams that communicate more, and more openly. "It’s everyone’s actions that favour or prevent mistakes, not one individual’s level of expertise", says Wehner. In other words, errors normally can’t be blamed on only one person, as we are all too ready to do in everyday life – even if, as in the above examples, "it was the anaesthetist’s fault" would be the easiest and most obvious conclusion when things go wrong. Evidently, however, it isn’t that simple. It’s the cooperation, the team effort, that is the key to the success or failure of an operation.
It is primarily rigid hierarchical structures that stand in the way of an open error culture. Ever so slowly, however, a culture of destigmatising mistakes and making them public is emerging in medicine, too. For instance, we are starting to see something in hospitals that has long been routine in aviation: "critical incident reporting systems". They enable doctors to report critical incidents anonymously, which can then be viewed by other medical practitioners so that they can learn from them and avoid similar mistakes in future. Nonetheless, according to Wehner, these systems are still few and far between.
Successful mistakes
In certain situations, however, errors can paradoxically also lead to success. Sometimes it is precisely the unconventional actions that go against all the rules which make someone succeed instead of fail. Take the pilot who defied all the regulations and performed an emergency landing on the Hudson River when his plane developed engine problems shortly after take-off in New York, saving the lives of all 150 passengers on board: today, he is hailed as a hero. If the landing had gone wrong, however, he would have failed at his job and been lambasted for insubordination.
As well as mistakes, there are also misconceptions, as Wehner explains. The former are made by people who actually know better; the latter by those who lack the necessary know-how. To put it another way: if I know how to enter a motorway but suddenly find myself driving down the wrong side of the road, I am making a mistake. When Columbus christened America the West Indies, however, he was under a misconception; he didn’t know any better at the time.
Mistakes are considerably more difficult to understand and analyse than misconceptions. Discovering their causes is one of Wehner’s favourite topics. As an expert, he sometimes spends years beavering away at a case to find out what prompted someone to act in a certain way. And what might seem utterly incomprehensible at face value often turns out to be something quite plausible in the end: we’re simply human.
Error-friendly technology
Error-friendly technology can "forgive" many human operating errors. This is why, as Wehner points out, collaboration between engineers and scholars in the humanities and social sciences is so important. We have to give erroneous ideas more leeway in the development of machines and equipment – which is one reason why Wehner joined ETH Zurich in the first place.
However, despite all our scientific considerations, there is one thing we should never forget, as Wehner sums up: mistakes, misconceptions and thus failure, too, are all part of life. In fact, it is even a privilege to be able to fail: "If I manage to achieve everything straight off, I haven’t got any incentive to change anything and expand my horizons."
And no chance to "fail better" next time. As the Irish author and Nobel Prize winner Samuel Beckett once put it so nicely: "Ever tried. Ever failed. No matter. Try again. Fail again. Fail better."