Dr Jayant Patel, Butcher of Bundaberg, was a foreigner. What a relief! It gives us a nice distant target for our horror and outrage. Memo to ourselves: foreigners can be sneaky and we should check their credentials better. Case closed.
But, though we could and should have prevented Dr Patel’s rampage altogether by checking his credentials (as we do with vets!), the bigger scandal is that internal systems failed to detect his rampage within a few weeks, or months at most.
Finding and punishing the Dr Patels of this world is just fine with me. But it would be a whole lot better - literally more than ten times better - if we also used their outrages to radically improve hospital quality control.
Better information systems are the start. In the early 1990s experts in the New York State hospital system meticulously built a clinical database of cardiac by-pass surgery. All events were “risk rated” so that outcomes were related to operations’ “degree of difficulty”.
It turned out that 27 surgeons who only occasionally performed cardiac surgery were performing poorly. They were moved out of cardiac surgery to specialise, and so improve their performance, elsewhere. A win-win.
One well-regarded hospital had mysteriously high mortality rates for emergency heart patients. The data revealed the anomaly, and some simple detective work discovered the cause, which was remedied with new procedures. The change has saved around 11 lives a year since. The whole system saw a 41 per cent decline in mortality rates over three years.
In Australia, a quite different program based on similar principles yielded similar benefits at the Wimmera Base Hospital in Horsham, Victoria.
The Royal Commission should not shy away from holding responsible those who might have stopped Dr Patel long before he fled. Yet examples like New York State’s experience show that this is a sideshow compared with improving the performance of the systems under which all the well-motivated members of our health professions work.
Some of the most successful initiatives have been surprisingly similar to the techniques the Japanese introduced into manufacturing in the 1970s and 80s. Firms like Toyota dramatically ramped up their productivity around a cluster of simple but subtly revolutionary ideas, based ultimately on the insight that they were not so much making things as structuring a system in which people controlled, constantly revised and optimised the complex processes of which they were a part.
Central principles included:
- When given the choice and appropriate encouragement, people prefer to work well rather than to shirk.
- Given that complex systems are difficult to manage with surveillance from above, setting people to work solving problems in teams helps unleash creativity and makes bad behaviour more difficult - because well-motivated groups police their own members.
- In this context, fear and punishment must be driven out of the workplace, so that people can be motivated to identify and fix problems instead of watching their backs and passing the buck.
- Systems - particularly systems of control and information - should be built not so much to help management direct and monitor workers, as to help teams of workers improve the quality of their work.
In the American state of Utah, similar principles appear to have dramatically improved the clinical quality of the hospital system. Fear is driven out by encouraging practitioners to report all adverse incidents within 48 hours in return for immunity from legal liability for negligence.
The culture of safety that this engenders generates far more information about adverse events to be analysed and encourages professionals constantly to improve and optimise their own systems and performance.
The Utah system is predicated on the idea that well over 90 per cent of adverse events arise from systems that can be improved rather than from individual idiosyncrasies and inadequacies - let’s face it, we all have those!
So this system reduces errors far more effectively than a punitive approach based on identifying individual wrongdoing. Indeed, it turns out to be much better at detecting rogues: not only do generally lower accident rates and better information systems mean that rogues stand out like a politician at Gallipoli on Anzac Day, but there’s also a virulent culture of identifying problems and fixing them.
And, just as the Japanese discovered building cars, better quality needn’t cost money. Getting it “right first time” saves squillions in rework and all the disruption that goes with it. It also facilitates constant improvement further down the production line.
If we’d had such a system in Bundaberg we’d have prevented most or all of the outrages of the Dr Patels. But we’d also have prevented more than ten times as many problems arising from mundane errors by well-intentioned and well-credentialled professionals working in systems that could be improved out of sight.
Building such a system would be the most fitting monument to the victims - alive and dead - of the Butcher of Bundaberg.