Tuesday, 26 May 2015

Too quick, too dirty

Today I came across the (virtual) file of a "quick and dirty" evaluation I carried out a while ago. That made me feel a bit queasy - because I should have turned down that offer, or advised my clients to go for something radically different. Quick and dirty can be quite wasteful.

Someone had approached me for the job at very short notice - the expectation was that the consultant would start travelling within a couple of weeks. I happened to have time - another assignment had been postponed - and I found the subject matter interesting. That's why I accepted the job. I was worried about the terms of reference, though. They came with plenty of specific ideas as to how the evaluation would have to be carried out - including requirements to visit three countries, to conduct a survey and to complete the job within some 30 person-days spread across two months. Within that time frame, the relevance, effectiveness, efficiency and sustainability of a project covering several rural areas across three countries were to be assessed. Oh, impact, too, but I talked them out of that.
There was hardly any time for me to study the project documentation; desk study, travel planning and country visits had to happen in parallel. As a result, I knew only some things about the project when devising my interview guides and survey questions. There was plenty more I read about too late to build it into the survey, or to contact interesting interlocutors I had not thought of before. Everything and everybody was so rushed. Local counterparts in the first country on the visiting list had less than a week to come up with an itinerary. I had hardly any influence at all on the selection of my interviewees, as I had only just started the desk review when the interview dates were set. There was no time to mull over the data and over the encounters in the countries, either - the draft evaluation report had to be ready within a week of completing my trips; the final report a week later. Oh, and I think I never quite worked out which were the final versions of the manuals the project was supposed to develop: the person who could have told me which draft was the ultimate one was on leave throughout the evaluation period.

The project came with design flaws that threatened its relevance. Those flaws were quite easy to detect from the written materials that were made available to me, starting with the project proposal. I believe the most useful findings I came up with, at the end of that rushed evaluation, were about those design flaws. I would not have needed to travel at all to see that.

I trust I could have found out much more, even within the same evaluation budget, if two or three times as much time had been made available for the evaluation. I could have read the full project documentation before deciding which sites to visit, whom to talk to and what to ask. I could have asked the local counterparts to run a simple survey with the users of their services (see also my post "A written survey with people who don't read or write"), and thus gathered interesting information from plenty of people and from all project sites - rather than relying on quick, rather contrived conversations with those who could be drummed up at short notice. Not that anyone complained about the quality of my evaluation. I got plenty of friendly feedback throughout. But a less rushed process would have yielded so many more insights. And it wouldn't have needed to cost more.

My plea to those who commission evaluations: Don't rush it. Most evaluations are built into the project cycle. Nothing can keep you from planning an evaluation one year before it is actually supposed to start. You will get so much more out of it.
