Physician training has very little impact

Jishnu Das writes about research on physician training in low-resource countries. His disheartening conclusion is that the training has very little impact on improving quality of care.

The research was as follows:

Our approach has been to try and decompose the quality of medical advice into two components—what doctors know and what doctors do. What doctors know—measured by testing doctors—represents the maximum care that a doctor could provide. What doctors do—measured by watching doctors—represents the care they actually provide to real patients. We call the first “competence” and the second “practice quality”.

And the depressing conclusion:

In Tanzania we find that two additional years of school and three additional years of medical school buys an increase of only 1 point in the percentage of essential tasks completed. Results are similar for other countries.

Training doctors has been a standard way to improve the quality of health care for years. It’s a major shock to discover this minimal impact. I wonder if the quality of training makes a difference. Perhaps competency-based training would.

While this is depressing research, it’s not necessarily telling us things we didn’t already know. If you want to change a physician’s behavior, you don’t just give her training. You change the system she is part of. Good health projects recognize that, and so do American HMOs.

Lesson: Don’t try to change individuals, try to change the system they are part of.

Two on Tuesday – Systems Failure

Two on Tuesday is a new feature where I take a couple of examples of a phenomenon or issue that interests me, and try to learn something useful from them.

I recently ran into two examples of systems failure, both of which offer useful lessons about how organizations function.

Example #1 – New Orleans. A community program to identify and report blighted houses gets canceled. Why? Because no one ever connected the web-based reporting system to the team that did the investigating. It would have been easy to synchronize blight investigations with the complaints logged on the web, but it never happened. My guess is that the web site was designed by an IT department that had little or no contact with the people who actually did the investigations.

Lesson learned: Don’t create a communications interface if you have no way of using the information you get from it.

Example #2 – the FAA. Safety investigator Mark Lund discovered that Northwest Airlines mechanics were so incompetent they couldn’t close a cabin door or test an engine. When he tried to ground the planes, the FAA retaliated against him, not the airline. Why? Because the FAA was more invested in its role as an agency that keeps American aviation flying than in its role as safety watchdog.

Lesson learned: You can’t be all things to all people. Give your investigators the independence they need to do their jobs right.

I spend a lot of time thinking about systems and setting them up for success, nearly as much time as I spend thinking about behavior change. It’s easy to blame individual people when things go wrong, but we should design important processes to help people make the right choices, and to catch errors. No system should ever depend on everybody doing their job right, because human beings just aren’t consistent enough.