I just spent three days in a training on data use. The trainer made a distinction between information and data. Data is the stuff you collect – raw numbers and observations. Information is what data turns into after you analyze it. Information is stuff you can act on.
The distinction affects most of what we do. I’ve written about this before, but monitoring and evaluation is a constant struggle to actually use the data we collect. Your indicators are useless if you don’t know what their results mean for your program.
It’s also the reason I get less excited than other people about crowd-sourced data tools. True, at times we have a genuine shortage of data. But we always have a shortage of information. Adding crowd-sourced data doesn’t fix that unless it comes with the analysis to make it information.
When we talk about evidence-based medicine, or evidence-based policy, the same questions come up. How does a physician use a new study to guide his clinical practice? If a Ministry of Health official reads a report on urban health, what should she do next?
Sometimes it is clear who should turn data into information. In any project or intervention, the person(s) responsible for monitoring and evaluation should translate monitoring data into something that can be acted on. A crowdsourcing project, though, may have no plan for processing or analyzing data; it may just make the dataset available for others to analyze.
For health care providers, it’s more difficult. When study authors include practice recommendations in published papers, they can’t hope to cover every medical specialty and client population. Sometimes professional associations step in, developing practice guidelines. In publicly funded systems, the government can develop treatment regulations. Sometimes outside organizations like the Cochrane Collaboration get involved.
And for policy? Well, think tanks try. And lobbyists, advocacy groups, industry collaborations, trade associations, and dozens of others. We expect, somehow, that government officials will weigh it all and make the best choice. Does that work? Your guess is as good as mine.
(yes, I am an enormous geek)
Good point indeed; an example from water development projects in Africa: many NGOs are now discovering that Africa is full of broken hand pumps. But instead of actually doing something about this, they start gathering DATA about where all these broken hand pumps are located. These data are now just used to get more funding to INCREASE the problem, installing the same fragile water pumps again instead of using more reliable water pumps.
The poor in Africa now PAY A FORTUNE to maintain the fragile water pumps that donors imposed on them. The poor have no voice in choosing better, durable water pumps. Now they spend most of their income to keep these fragile India pumps running; about US$50,000,000 per year is wasted in Africa alone this way. Good business for those who profit, bad news for the poor families that have to pay this every year.
Gathering DATA on broken pumps alone also clearly misses the crucial & cruel point of what people have to pay, their capacity to pay, and their willingness to pay. These are clearly important issues. Our studies show that poor communities often can’t pay and don’t want to pay for unreliable water pumps, but often have NO CHOICE. As long as these hidden data are not used as INFORMATION to actually tackle the problem, things will never change for the poor in Africa who need reliable water pumps.
And the poor get poorer & the aid business gets more business. Obama, Bill Gates, etc.: how can you let that happen?
FairWater Foundation has a solution that is fairer to the poor, and it is rather simple: rehabilitate fragile water pumps with durable water pumps like the BluePump, which can be maintained at low cost.
See also our website, fairwater.org
I agree. This is probably where we need to be ruthless! Do we even need all the indicators that we are told to include? I have been involved in programmes where community-based volunteers/staff who are paid a basic honorarium collect the MIS data. It is important to have realistic expectations – there is only so much that they can do! It is unfair to burden them with additional formats, especially if they could use the time more effectively. Also, there are organisations that invest in specialised packages/software for data analysis. These provide reports, including follow-up lists. But, again, these outputs are often not used as much as they should be. It is much more important to prioritise what data we need, how we translate that into information, and how we use that information in our current and future initiatives.