As you may have noticed, monitoring and evaluation is a topic close to my heart. One thing that strikes me is that we repeat the same errors, over and over and over. I’ll elaborate on these in my next three posts, but for now, I will tease you with some lists:
The Top Three Monitoring and Evaluation Mistakes Experienced NGOs Make
- Using the same indicators they’ve always used, even as projects change
- Too much evaluation, not enough monitoring
- Leaving M&E up to the M&E team
The Top Three M&E Mistakes New NGOs Make
- Choosing really great indicators that are nearly impossible to measure
- Confusing a program with an RCT
- Focusing on the donor’s data needs when choosing indicators
The Top Four M&E Mistakes Everyone Makes
- Too many indicators
- Not focusing on data use
- Too many process indicators, not enough impact indicators
- The IKEA effect
A few more for the “everyone makes” list:
* Starting to think about the design of the evaluation at the end of a programme rather than the beginning
* Wanting to isolate the effects/impacts of a given NGO programme rather than seeing it as part of a broader set of inputs/influences (focussing on attribution rather than contribution)
* Forgetting that the origin of the word Indicator is “indicate” (and not “prove”)
Great post! In addition to yours and Matt’s:
* Confusing project design with proposal writing.
* Ignoring time and resource implications of what goes into a logframe, i.e. a logframe needs a data collection plan to be functional.
* Increasing reporting requirements on “implementing partners” without giving them the necessary support.
* Focusing so much on the purpose of M&E for accountability that it fails to result in improved programming.
* Giving in to the sector’s increasing desperation to “know” by counting widgets rather than focusing on qualitative (and messy) indicators such as ownership.
Reading this blog entry made me really interested in learning more. I hope you still intend to elaborate on this list?
Great post. I agree with most of what has been said, but I suggest that you substitute “outcome” for “impact” in “Too many process indicators, not enough impact indicators.” Projects rarely last long enough to demonstrate meaningful impact (long-term, sustained change), but outcomes (changes in state or behaviour) should be demonstrable within the project lifetime. People often talk about demonstrating “impact” when they mean “outcomes”. A linguistic compromise could be to talk about “outcomes towards impacts”, which makes it explicit that we are ultimately concerned with impact even if we cannot meaningfully measure it until after an intervention is over.