By Noni Mumba
One would think that a two-day workshop on evaluation would be boring, mentally taxing and just tiresome. Indeed, I, being one of the delegates at that meeting, was uncertain about how it would turn out; after all, some people get squeamish when evaluation discussions start. However, the serene setting of Sawela Lodges, the scenic gardens, good food and excellent ambiance helped to start us off on a positive note. Which reminds me that I need to rate this great hotel on TripAdvisor.
Prof. Jim Lavery set the tone of the workshop on a high note, starting off by using architecture to describe the evaluation process. Just like putting up a well-designed building, developing engagement approaches and evaluating them is challenging; and I agree. Perhaps before moving forward, I should state here that there is a continuing debate on whether community and public engagement are different things. For the purposes of this blog, I will use ‘community’ to mean both. So, back to the challenges of evaluating community engagement; Jim (all protocol observed) gave us two premises:
(i) that community engagement has many goals, which are sometimes conflicting; that there are many ethical issues that go with engagement for research; and that engagement must be responsive to the context within which it is being conducted; and,
(ii) that perhaps it is high time we employed the same rigour in engagement that is applied in science. These premises grounded our discussions during the workshop.
We then moved on to listening to and critiquing four case studies presented over the two days (two on day one, and another two on the second day). We got an opportunity to showcase our (KWTRP) wonderful evaluation plan, which earned an interesting description: a forest. Meaning? Very good work going on, and a great mix of wonderful approaches being used; however, we need to be able to tell our story better. Food for thought for our whole team.
We also got to hear about Malawi’s Exhibitions, Thailand’s Puppet Shows, and Vietnam’s Schools project, all with great evaluation methodologies, and all of which received an equal dose of good critiquing. Those who had specific evaluation challenges shared them with a ‘peer group’ for support and input in small group discussions, conducted out in the very lush gardens. What was clear in all these presentations is that we all have such a rich array of engagement initiatives, and are obviously evaluating them and using the findings to improve these activities.
As all these discussions and presentations went on, we debated which theoretical framework is best suited for our evaluations. There were some simple and some really complex diagrams describing the famous Theory of Change; and there were those amongst us who were brave enough to use Realist Evaluation theory. At a personal level, I guess Realist Evaluation is something I will have to read (perhaps several times) in order to grasp concepts such as CMOs (rhymes with GMOs? I will allow you to find out what this is!)
By the end of the two days, we all appreciated the fact that evaluating community engagement is no walk in the park! “Two or three people can look at one approach, and come to totally different conclusions,” said Jim, which is not a bad thing at all. One other thing that struck me in this workshop was that we must feel comfortable sharing the negative findings of our evaluations too. Now that’s a tough one to do! But doable.
Generally, the workshop was, in my opinion, excellent and fun, and all the delegates were just superb. Our great facilitator Robin Vincent ensured that we all participated actively. His vast experience in the evaluation of engagement came in handy during this workshop, and we all appreciated his efforts in developing a ‘Pathways to Impact’ diagram, baptised ‘The Snail’. The ‘snail’ and Jim’s ‘architecture’ diagram are perhaps frameworks that we may want to explore using, even as we reflect on moving our evaluation work in Kilifi forward.
Looking forward to more interaction on MESH!