
The following interview was conducted remotely with the help of yEES! Volunteers Lea Corsetti, Enes Kacatopcu, Esther Winston Ngulwa, and Supun Sandanayaka.

To learn more about the EES Conference and Professional Development Workshops visit: https://europeanevaluation.org/events/ees-goes-virtual-evaluation-in-an-uncertain-world-complexity-legitimacy-and-ethics-6-10-september-2021/

 

1. Please give a short introduction of yourselves for any readers who may not already be familiar with you.

Alix and Tom met while working at CARE International, where they both started their careers in development. While at CARE International UK, they were both staff union reps and bonded over their mutual love of dogs. They transitioned into consulting practice around 2016, working together on several MEL contracts with a specific thematic focus on complex governance and advocacy programming, mostly with INGOs and funders. Since becoming consultants, they have developed particular interest and expertise in theory-based and participatory evaluation methods such as process tracing and outcome harvesting. Alix is based in Vancouver, Canada, and Tom resides in London, UK.

For more about us and our musings on evaluation, find us on Twitter at @traffyaston and @alixwadeson, and on LinkedIn: https://www.linkedin.com/in/alix-wadeson-11275452/ and https://www.linkedin.com/in/tom-aston-consulting/

2. Given the Conference’s theme “Evaluation in an Uncertain World: Complexity, Legitimacy and Ethics”, what are, in your view, the most pressing issues for evaluation?

Our focus and emphasis are on programmes that are inherently complex – for example, explaining changes related to women’s empowerment, or how policy change happens. These cases require more in-depth analysis and nuance about how and why change happens, not simply inputs and outputs (people attending trainings or receiving cash transfers). Rather, they require us to dig into a given context to unpack the nature of relationships among key stakeholders and the way this fosters behaviour change (or not). We have increasingly learned to appreciate that change processes are often non-linear in nature and require evaluative methods that are fit-for-purpose. Therefore, we needed to use appropriate methods and build relevant expertise in response to these challenges. The paper we wrote last year was an attempt to take stock of our learning.

Our workshop on Process Tracing is designed to offer participants a way to address some of these challenges around evaluating complexity in an uncertain world. We’ve learned that it’s helpful to harness the different perspectives of evaluation stakeholders, which adds to the rigour as well as the legitimacy and utility of theory-based evaluations. Additionally, we would argue that these methods are enhanced when they include the participation of the people implementing programmes, including those on the frontline, thereby increasing the legitimacy of both evaluation processes and findings.

3. What are your expectations for the 2021 EES Online Conference?

This is the first time either of us has attended or presented at the EES Conference. We are excited to see so many high-quality workshops and sessions available, with peers in the evaluation space whom we deeply respect. We’re also interested to see how the virtual platforms will work, since normally we would attend such conferences in person.

4. Can you give us a sneak peek of what your PDW and sessions will discuss?

For a teaser of the flavour of our workshop “Practical Applications of Process Tracing Tools for Complex Interventions” (Sept 6 & 7 at 2-4 CET) and our perspectives, take a peek at Tom’s blog on the basics of process tracing here: https://thomasmtaston.medium.com/miracles-false-confessions-and-what-good-evidence-looks-like-e8a49f7ff463 (see also our answer to question 2 above).

For a taste of what to expect from the session on Contribution Rubrics on Wednesday, Sept 8:

https://thomasmtaston.medium.com/rubrics-as-a-harness-for-complexity-6507b36f312e

5. If you could give one piece of advice to young and emerging evaluators, what would it be? 

a) To learn more about evaluation?

Consume content voraciously – there are so many ways to learn about evaluation in today’s world, from journal articles and books to YouTube videos, blogs, and podcasts, in addition to numerous dedicated evaluation websites, communities of practice, and listservs. Twitter is a good starting point as well – follow thought leaders and organizations in the evaluation space to keep abreast of what’s current and the ongoing debates in the field.

On both ethical and practical grounds, we would argue these resources need to be made more accessible and available to everyone. For example, not a lot of content is available for people who do not speak English, and the ‘jargon’ doesn’t always travel well across languages. Secondly, training is often limited by cost and access to those who can afford it or who have organizations to support their capacity building, which is difficult if you are just starting out in evaluation.

b) To begin a career in evaluation?

Find your mentor – someone you genuinely connect with, who can champion you and your career progression in evaluation; someone you can ask difficult questions, receive advice from, and learn from over time. For us, one mentor who laid the foundation for our expertise in process tracing was Gavin Bryce of Pamoja Evaluation Services (follow him on Twitter: https://twitter.com/pamojabryce).

Build a support network – for example, the Canadian Evaluation Society BC chapter (of which Alix is a member) hosts a monthly meet-up for emerging and student evaluators, led by experienced evaluators. This has been a great forum for them to make connections, ask questions, and receive advice. Tom and I have a few evaluators we work with closely, not just on assignments, but also to discuss and learn about new trends, debates, and discourse in the evaluation field. Having a group of trusted confidants is very useful at any stage of one’s career, but building it early is recommended.

Choose a lane – it’s definitely useful and important when starting out to read across the evaluation spectrum and to be aware of the different types of evaluation methods and how they work (for instance, how inferential and multiple-regression statistics work). However, at some point early in one’s evaluation career, it’s critical to decide what excites you and then use that to narrow in on the methods and types of programmes (thematic/sector) that you intend to work on. There is so much out there in the evaluation field that one cannot be an expert at everything. So, decide where to prioritize the investment of your limited time and energy to become adept and build expertise in a few methods and types of evaluation.