My blog has moved to my website evalfacil.eu. Come and look up my latest posts under the "blog" tab over there!
people-centred development
a notebook about human development and evaluation by michaela raab
Friday, 16 June 2023
Come and see me @evalfacil.eu
Monday, 13 September 2021
Moving to evalfacil.eu
In its 13th year (can you believe it?!), this blog moves to my new website, on a dedicated blog page. Over the coming months, I will review the content on developblog.org and move some favourites to the new virtual space. And then, in a year or two or three, I will close down developblog.org, this very page you're on right now.
I hesitate to praise Blogger/Google but I can't help being grateful for such a low-threshold space to publish content, keep a (now defunct) library of links to interesting sites, share information on my activities and so forth. It has been fun to check the analytics, realising that my posts have attracted readers from all world regions, from Iceland to the Pacific Islands!
There are much better-resourced online libraries on monitoring and evaluation now. But I will continue writing, on my new blog, because blogging is an excellent way to organise one's thoughts. Please do pay a visit to the evalfacil blog, and if you have followed me here, start following me there! There will also be an option to share comments.
Many thanks for reading me!
Wednesday, 18 August 2021
Discussion forum: Evaluation and scientific rigour (in German)
On 17 September, from around 13:45, we - Bernward Causemann, Ines Freier and I - will host a discussion forum at the 24th annual conference of the Deutsche Gesellschaft für Evaluation (DeGEval). As experienced evaluators who have seen it all and are well versed in scientific methods, we will examine the tension that has built up between the standardisation and norming of evaluation (as one way of understanding scientific rigour) and the demands placed on evaluation's usefulness, and we will facilitate in-depth discussions on the topic.
This year, the conference will again take place online; it is possible to register for individual days. Our discussion forum is part of session D7 (lightning-talk session and discussion forum). You will find more details on the conference website. We look forward to an interested audience!
Monday, 29 March 2021
Finally! Thoughtful guidance on applying the DAC criteria
Long-awaited new guidance on applying the evaluation criteria defined by the Development Assistance Committee of the Organisation for Economic Co-operation and Development (OECD-DAC) is finally available in this publication! Long-awaited, because evaluators and development practitioners have grown desperate with assignments that are expected to gauge every single project against every single OECD-DAC criterion, regardless of the project's nature and of the timing and resources of the evaluation. This new, gently worded document is a weapon evaluators can use to defend their quest for focus and depth in evaluation.
If you commission evaluations, please go straight to page 24, which states very clearly: "The criteria are not intended to be applied in a standard, fixed way for every intervention or used in a tickbox fashion. Indeed the criteria should be carefully interpreted or understood in relation to the intervention being evaluated. This encourages flexibility and adaptation of the criteria to each individual evaluation. It should be clarified which specific concepts in the criteria will be drawn upon in the evaluation and why."
On page 28, you will find a whole section titled Choosing which criteria to use, which makes it clear that evaluations should focus on the OECD-DAC criteria that make sense in view of the needs and possibilities of the specific project, and of the evaluation process. It provides a wonderful one-question heuristic: "If we could ask only one question about this intervention, what would it be?" And it reminds readers that some questions are better answered by other means, such as research projects or a facilitated learning process. The availability of data and resources - including time - for the evaluation helps determine which evaluation criteria to apply, and which not. Page 32 reminds us of the necessity to use a gender lens, with a handy checklist-like table on page 33 (better late than never).
About half of the publication is dedicated to defining the six evaluation criteria - relevance, coherence, effectiveness, efficiency, impact, and sustainability - with plenty of examples. This is also extremely helpful. Each chapter comes with a table that summarises common challenges related to each criterion - and what evaluators and evaluation managers can do to overcome them. It also shows very clearly that a lack of preparation on the evaluation management side makes it very hard for evaluators to do a decent job - see for example table 4.3 (p.55) on assessing effectiveness.
The document is a bit ambiguous on some questions: The chapter on efficiency still defines efficiency as "the conversion of inputs (...) into outputs (...) in the most cost-effective way possible, as compared to feasible alternatives in the context" (p.58), which makes it extremely hard to assess the efficiency of, say, a project that supports litigation in international courts - interventions that may take decades to yield the desired result. However, the guidance document states that resources should be understood in the broadest sense and include full economic costs. On that basis, one can indeed argue, as Jasmin Rocha and I have on Zenda Ofir's blog, that non-monetary costs, hidden costs and the cost of inaction must be taken into account. Yet table 4.4 on efficiency-related challenges remains vague (p.61). Has anyone read the reference quoted in the table (Palenberg 2011)? I did, and found it very cautious in its conclusions. My impression is that in many cases, evaluators of development interventions are not in a position to assess efficiency in any meaningful manner.
On the whole, I would describe the new OECD-DAC publication as a big step forward. I warmly recommend it to anyone who designs, manages or commissions evaluations.
For German speakers: Online Moderieren - Lebendig und Produktiv (Facilitating online - lively and productive)
My online workshop about online workshops goes online on 26 May! More information and registration options are available at the PME-Campus.
Apologies to those who don't speak German - my first workshop on online facilitation will be in German. But if it works out nicely, I might offer sequels in English and in French! À bon entendeur, Michaela
Friday, 12 March 2021
Join my workshop on online facilitation in PME
After a year of lockdown-induced life in cyberspace, web-based workshops have become routine in planning, monitoring and evaluation (PME). Workshops are about exchange, about developing something together. But I have often witnessed online workshops that were so virtual you hardly noticed the participants. Seemingly endless pages of screen-shared text being read out, word by word, in a soothing voice. No breaks. Confusion about technical refinements, links posted to inaccessible clouds. The loneliness of the person who finds herself alone in the main channel, without any pre-assigned breakout group...
But there are also online workshops that actually work, that engage us in an exhilarating process, and that produce results. They can be more efficient than real-life workshops. And they surely save enormous amounts of CO2 and travel costs. Even though many miss the informal encounters at the coffee machine, over lunch, in the bathroom line (we're in 2021 and people identified as male still enjoy better access to toilets than the rest of us) - I suspect that online workshops are here to stay.
It takes deliberate planning, adaptive pacing and plenty of participation to make an online workshop work. After more than a year of developing, facilitating and documenting a range of workshops on different platforms, I am distilling key insights into a short workshop for German speakers - online, of course! Not the technical stuff - the providers' video tutorials take care of that - but key principles and ways to apply them. I'd be delighted to meet you. More details (in German) are available here.
Wednesday, 30 December 2020
Gender equality in organisations: a good resolution for 2021
Gender equality is a key element of sustainable development – as illustrated in the Sustainable Development Goals (SDGs), which weave gender across virtually all 17 SDGs. It makes sense that 'mainstream' organisations, which are not specialised in promoting gender equality, have developed gender policies and related activities. Where are they at, and what should come next?
Friday, 25 September 2020
More handy tips for videoconferences
An addendum to yesterday's post - ICA:UK, a reliable source of materials and training on highly participatory facilitation, has summarised 10 principles to prevent online fatigue. I've been using all of them. They work.
At any rate, avoid text-filled slide shows with voices droning on in the background! Visual aids are great, but if you just show pages and pages of text that you read to your audience, they'll end up muting you and joining a different event on their other computer. I admit that is what I did at a recent conference full of half-hour, text-rich presentations by invisible voices. I couldn't help it. Since the other conference followed the same mode, I still felt I would have been better off reading an article, in my own time, at my own (rather energetic) pace.
Tuesday, 22 September 2020
Easy socialising in tight video conferences
How to recreate a sense of a "real life" team event in a video conference? In real life (IRL, as nerds put it), people usually linger near the coffee/tea kitchen or in the hallway for a quick chat - one reason why it tends to be so hard to get participants back from "real" breakout rooms.
"Random" virtual breakout rooms - if they don't come with too burdensome assignments - can recreate this atmosphere.
Like most facilitators I know, I have facilitated more video conferences in 2020 than ever before. I have discovered that participants tend to hijack virtual breakout rooms: Before getting started on the small group assignment, they'd have an informal chat on totally different subjects. Or, in other cases, they'd get the assignment done as fast as possible so as to spend the rest of the small group chat on their own agendas.
Thursday, 10 September 2020
Know what you need to know
Evaluations often come with terms of reference (TOR) that discourage even the most intrepid evaluator. A frequent issue is a long list of evaluation questions that oscillates between the broadest interrogations – e.g. “what difference has the project made in people’s lives” – and very specific aspects, e.g. “what was the percentage of women participating in training sessions”. Sometimes I wonder whether such TOR actually state what people really want to find out.
I remember the first evaluation I commissioned, back in the last quarter of the 20th century. I asked my colleague how to write TOR. She said, “Just take the TOR from some other project and add questions that you find important”. I picked up the first evaluation TOR I came across, found all the questions interesting and added lots, which I felt showed that I was smart and interested in the project. Then I shared the TOR in our team and others followed suit, asking plenty more interesting questions.
I wonder whether this type of process is still being used. Typically, at the end, you have a long list of “nice to know” questions that make it very hard to focus on the questions that are crucial for the project.
I know I have written about this before. I can’t stop writing about it. It is very rare that I come across TOR with evaluation questions that appear to describe accurately what people really want and need to find out.
If, as someone who commissions the evaluation, you are not sure which questions matter most, ask those involved in the project. It is very useful to ask them, anyway, even if you think you know the most important questions. If you need more support, invite the evaluator to review the questions in the inception phase – with you and all other stakeholders in the evaluation – and be open to major modifications.
But please, keep the list of evaluation questions short and clear. Don’t worry about what exactly the evaluator will need to ask or look for to answer your questions. It is the evaluator’s job to develop indicators, questionnaires, interview guides and so forth. She’ll work with you and others to identify or develop appropriate instruments for the specific context of the evaluation. (The case is somewhat different in organisations that attempt to gather a set of data against standardised indicators across many evaluations - but even then, they can be focused and parsimonious to make sure they get high quality information and not just ticked-off boxes.)
Even just one or two evaluation questions is perfectly fine. Anything more than ten can get confusing. And put in some time for a proper inception phase, when the evaluation specialists will work with you on designing the evaluation. Build in joint reflection loops. You’ll get so much more out of your evaluation.
Monday, 13 July 2020
My first hackathon #EvalHack
Sunday, 10 May 2020
Five tips for remote facilitation
Sunday, 29 March 2020
Facilitating with care in lock-down situations
Wednesday, 11 March 2020
International Evaluation in Times of the New Coronavirus
Monday, 17 February 2020
Proud to be Rosa Marina Flores Cruz's mentor
(1) What are the main issues that you are currently working on?
I am an Afro-indigenous activist and researcher from the Isthmus of Tehuantepec in the State of Oaxaca, Mexico. I work on topics like rural feminism, environment and energy, and the autonomy and rights of indigenous peoples. I have worked in training projects for community health and human rights promoters, and for the defence of the Right to Free, Prior and Informed Consultation.
Photograph by Shirley Kimmayong