Last week I took my son to the circus. Specifically, a traveling troupe of Chinese acrobats. It was quite clearly the troupe that plays Dushanbe, not the troupe that plays Moscow. They attempted the big stunts, but they didn’t always make it. Spinning plates got dropped, a human pyramid crashed, and one tumbler tumbled right off.
This is what interested me: it didn’t affect the show. They were ready for failure. They had spare plates standing by for quick replacement after droppage. The ribbon twirlers had fresh ribbon at hand in case of tangling. The aerial acrobats had truly fantastic spotters. Everyone who fell had at least one person gracefully rush up to soften their fall. They responded to errors so quickly and smoothly that it was like a dance.
Ever since I saw the show, I’ve been wondering how we can build that kind of resilience into development interventions. How can we make sure our errors don’t wreck our work? One thought: maybe ongoing monitoring is the equivalent of those dedicated spotters who saved the falling acrobats. Collecting implementation data will let you know if your human pyramid is going askew, or keep the guy on the springboard from bouncing onto hard ground. Another: you have to be profoundly humble and honest to prepare for failure that way. You have to admit, up front, that mistakes are possible. If your spotters are hiding in the back room, they won’t catch the tumbler in time. You can’t seamlessly replace a knotted ribbon if the new one isn’t right next to you.
It’s a beautiful analogy. Would it be allowed in real life? True, some people do call this industry a circus. But do our donors actually want us to be honest and humble? Would people think we were just incompetent if we visibly prepared for failure? And what, exactly, would preparing for failure look like?
Now picture the girl at the top of that pyramid falling, and landing in the arms of a costumed spotter.