Saturday, 12 December 2015
A few weeks ago I attended another public discussion on the (potential) role of evaluation in policy making. The brief conference - basically, a panel discussion followed by a question-and-answer session - was hosted in Berlin by the Hanns Seidel Stiftung and CEval, the Centre for Evaluation at Saarland University. The panel was made up of German-speaking evaluation specialists from Austria, Germany and Switzerland.
Policy uptake of evaluation findings has been a main topic of the International Year of Evaluation 2015 - for instance at the Paris conference I wrote about in October. Evidence gathered in evaluations and research is supposed to support political decision-making.
Monday, 23 November 2015
Virtual Workshopping
Earlier this week I facilitated an internal reflection and planning meeting with evalux, a Berlin-based evaluation firm which celebrated its 10th anniversary this year. One of the workshop participants was based in Beijing. It would have been too onerous to fly her over to Berlin, so we found a way to beam her into the workshop via the internet.
I like highly participatory workshops, where people work in alternating configurations –
Wednesday, 14 October 2015
Participatory research!
Wow - read this presentation of participatory research by 16-24-year-old girls and young women in Kinshasa. An exciting piece of work supported by the UK Department for International Development (DFID) and Social Development Direct (SDD). The initiative turns the "objects" of research into researchers. I trust it will yield much richer information than what you would get from a "top-down" externally-designed survey on young women in Kinshasa. And the young people who collect and analyse the information will gather skills, knowledge and strength in the process! I would expect their interviewees to benefit from the process, too.
Governments which fund development want to see "evidence-based" approaches, that is, research needs to be built into development. Fortunately, the widespread misconception that only large-scale quantitative surveys and experiments yield reliable evidence appears to be fading.
Monday, 12 October 2015
Good things happen in the short term and bad things happen in the long term
This is a long title but I love that sentence, culled from Elliot Stern's intervention on the Benefits and Barriers to Evaluation Use at the recent evaluation conference in Paris. The one-day conference, convened jointly by the European Evaluation Society, France's evaluation society, the United Nations Educational, Scientific and Cultural Organization (UNESCO) and the Organisation for Economic Co-operation and Development (OECD), took place at the quite extraordinary UNESCO headquarters in Paris.
Wednesday, 23 September 2015
My data
A couple of days ago a colleague working on an interesting new e-learning tool invited me to test an early, not yet official version of that tool. I clicked on the link they had sent me. A screen appeared which asked me for my full name, my e-mail address and my company. Every single field was mandatory, that is, I could not move to the subsequent screen without providing my name, my e-mail address and a company name.
That is a threshold. When you open a book or a newspaper, no-one asks you to send your name, your e-mail address or other personal data. You open the thing and you read it. The publisher can track the number of books sold and - to some extent - the places where they have been sold, and that's it. Has anyone ever complained about that?
Wednesday, 16 September 2015
Interesting debate on evaluating human rights work
Who is evaluation of human rights work for? How about "strategic plausibility" as an evaluation criterion? How do we measure success when protecting civilians in conflict? These are the kinds of questions discussed in this web debate on evaluating human rights work. Very commendable!
Workshops that work: Six essential tips for facilitators
It is delightful to get plenty of positive feedback on the workshops I design and/or facilitate. A few weeks ago one participant even described the workshop I facilitated as a "once in a lifetime experience"! Since I would love all workshops people attend to be useful, I have started asking participants to tell me what exactly they like about "my" workshops, so that I can share it here. Some of the points below have been made in earlier posts on this blog, others have come up in recent conversations.
Monday, 14 September 2015
10 things to know about evaluations
The UK Department for International Development (DFID) has produced this wonderful short guide for everyone who uses evaluations. Have a look at it and spread it around! It'll make evaluations more useful. The guide also refers to the Better Evaluation site, which is highly commendable for evaluators and everybody interested in the topic.
Thursday, 10 September 2015
Emerging Evaluators meeting virtually and in real life
My occasional associate Wolfgang Stuppert is part of the Emerging Evaluators' Network of the European Evaluation Society (EES). He has invited me to broadcast this invitation to the First Virtual Conference for Emerging Evaluators. Here it is:
On 19 September 2015, the First Virtual Conference for Emerging Evaluators will take place. On that day, from 3 pm to 8 pm [Berlin time, I presume], more than 100 emerging evaluators will gather on-line to discuss the bright and not-so-bright sides of their profession.
Tuesday, 25 August 2015
What is a sound theory of change?
The term "theory of change" (ToC) has established itself in the development world. Agencies invite consultants (including your blogger) to facilitate workshops which would help them develop the theory of change for a particular programme; donors ask prospective grantees to come with a sound theory of change; evaluators (including your blogger) bemoan the absence thereof. There are companies which have developed theory of change software and dedicated websites that propose to help you build your own ToC. The picture below shows a fraction of what I get when I ask a popular search engine to find images of "theory of change":
Monday, 20 July 2015
Value for money in training?
In an evaluation of a large initiative designed to help change social norms on gender-based violence (GBV), I found out that each of the many different organisations involved carried out training workshops. The training participants were mainly police officers, staff in health services, religious leaders, social workers and other people who had spent little or no time at universities or other academic venues. That is, the trainees were people who were not used to sitting and listening attentively to complicated presentations.
Yet, most of the training workshops were organised the "academic" way: The trainer would take the audience, seated in neat rows, through a more or less lengthy set of "PowerPoint" slides with plenty of text. Often, the content of the slides was highly theoretical, presenting definitions full of terms people would never use in everyday language. The audience would sit and listen and ask the occasional question, if they dared to. People don't want to look stupid - so if a presenter uses lots of big words the audience are not familiar with, chances are that there won't be many questions.
Arguably, that type of training is a waste of time and money. There is a large and growing body of experience on adult education - and education in general - which shows that one of the most effective ways to acquire new knowledge and skills is learning by doing, by solving problems.
Tuesday, 26 May 2015
Too quick, too dirty
Today I came across the (virtual) file of a "quick and dirty" evaluation I carried out a while ago. That made me feel a bit queasy - because I should have turned down that offer, or advised my clients to go for something radically different. Quick and dirty can be quite wasteful.
Someone had approached me for the job at very short notice - the expectation was that the consultant would start travelling within a couple of weeks. I happened to have time - some other assignment had been postponed - and I found the subject matter interesting. That's why I accepted the job. I was worried about the terms of reference, though. They came with plenty of specific ideas as to how the evaluation would have to be carried out - including requirements to visit three countries, to conduct a survey and to complete the job within some 30 person-days spread across two months. Within that time frame, the relevance, effectiveness, efficiency and sustainability of a project covering several rural areas across three countries was to be assessed. Oh, impact, too, but I talked them out of that.
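Just to illustrate how tight that budget is, here is a back-of-envelope calculation (in Python, with a purely hypothetical split of my own - the actual terms of reference did not break the days down like this):

# Illustrative only: the split of days below is my own assumption,
# not taken from the actual terms of reference.
total_days = 30  # person-days budgeted for the whole evaluation

budget = {
    "inception and desk review": 4,
    "survey design, translation and analysis": 5,
    "travel and fieldwork in three countries": 12,  # roughly four days per country
    "draft and final report, debriefings": 7,
}

criteria = ["relevance", "effectiveness", "efficiency", "sustainability"]
remaining = total_days - sum(budget.values())

for item, days in budget.items():
    print(f"{item}: {days} days")
print(f"left for cross-country analysis: {remaining} days")
print(f"about {remaining / len(criteria):.1f} days of dedicated analysis per criterion")

Half a day of focused analysis per evaluation criterion, for a project spanning three countries - that is what "quick and dirty" looks like once you do the arithmetic.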
Saturday, 21 March 2015
Rape is not about sex
Every so often, I facilitate gender training. One of my favourite ways to get discussions started is a quiz, inspired by the 1994 classic, The Oxfam Gender Training Manual (Suzanne Williams with Janet Seed and Adelina Mwau). I read a set of statements - for instance, recent research findings related to the field the participants work in - about sex and gender. For each statement, the participants are asked to determine whether it is about sex (as in biologically male, female or a bit of both - not sexual activity) or gender (roughly speaking, the behaviour societies expect from people according to their sex).
The discussions are always interesting. In a recent quiz of that type, I read the statement: "A survey in Botswana revealed that 67% of school girls interviewed had been subjected to sexual harassment by teachers; 10% had consented to sex for fear of reprisals." (That was about a decade ago.)
One participant argued the statement was about sex.
Thursday, 29 January 2015
A written survey with people who don't read and write
Last year - ah, no, in 2013 - my colleague Wolfgang Stuppert and I carried out an evaluation of services for survivors of violence against women and girls in Mozambique. We felt it was important to gather feedback from many women and girls who used the services. We had very little time in Mozambique and no resources to train enumerators who would interview large numbers of service users.
We decided to organise a written survey. But some service users, we were told, could not read and write well enough.