Search on EES

Conducting a Remote Evaluation: Insights from the Front Line

https://www.khulisa.com/conducting-a-remote-evaluation-insights-from-the-front-line-part-1/ 

https://www.khulisa.com/conducting-a-remote-evaluation-insights-from-the-front-line-part-2-its-a-new-world-but-not-everywhere/ 

The arrival of remote working has upended professional lives around the world, and evaluators have not been spared. In this two-part series, the authors explore the difficulties of remote evaluation and the wildly different forms that these evaluations are often required to take. In some cases, in-person evaluations were abruptly moved online, while in others the evaluation was planned and implemented remotely from the outset. Either way, the articles lay out practical tips and technologies that every evaluator should be familiar with if they are to manage expectations, maximise the opportunities and minimise the costs of conducting their work. The series is also useful in that it provides examples of the reactions of those involved in and studied by the evaluation, some reluctant, some positive. The second article deals with the challenges faced by a specific evaluation of a nutrition programme in Zambia and the problems it faced in disseminating its findings; it is particularly relevant to evaluations working in regions with poor internet connectivity.

Evidence impact: Improving education worldwide through the use of systematic review evidence

https://www.3ieimpact.org/blogs/evidence-impact-improving-education-worldwide-through-use-systematic-review-evidence 

This brief entry in 3ie’s Evidence Matters blog is a useful refresher on the role of evidence synthesis in situations of data abundance, an increasingly common problem for evaluators. Taking the example of an influential education-focused systematic review, the blog reads like a proof of concept that such reviews can achieve wide policy impact, being explicitly credited with informing debate and spurring further analysis. In that sense it also serves as an introduction to 3ie’s commendable online knowledge management: its evidence impact summaries, part of its broader Evidence Hub, are an invaluable resource for evaluators interested in proving the value of evaluation to peers and confirming the continuing impact of labour-intensive research on high-level policymaking.

Addressing social norms in development programmes – Learnings from the WISH programme

https://www.itad.com/article/addressing-social-norms-in-development-programmes-learnings-from-the-wish-programme/ 

One of the key challenges facing evaluators is the increasing pressure to tackle new concepts for which there is little precedent in professional evaluation and even less agreement on best practice. One such concept is social norms, which undoubtedly play an important role in the outcomes of many interventions but often confound rigorous analysis. This blog is a useful attempt to gather practical tips for dealing with the issue, drawing on the experience of reviewing the Women’s Integrated Sexual Health (WISH) programme funded by the UK Foreign, Commonwealth and Development Office (FCDO). The blog takes a deliberately pragmatic position, eschewing the long-winded abstract theorising that too often burdens discussion of social norms in favour of affordable, understandable and rapidly implementable actions. These include an easily communicable definition of a social norm, the adoption of audio-visual formats for informing participants, and the replacement of static theory with interactive, practical guidance such as checklists.

Assessing climate-smart farming: a new framework

https://www.evalforward.org/index.php/blog/CSA 

Over the last decade, M&E has seen the emergence of a rapidly developing cohort of technical frameworks to assist evaluators. This blog provides a primer on one of these: the Climate-Smart Agriculture (CSA) Farm Sustainability Assessment Framework being developed by the UN’s Food and Agriculture Organization (FAO). Although a complicated piece, the blog could be useful to those following ongoing efforts to mainstream climate sensitivity into evaluations and reconcile it with other key metrics. As a step towards a more standardised approach, in which comparability of results between projects, communities and countries is made easier, the CSA framework could be an important contribution.

Better data on agricultural innovations: a journey into the Ethiopian Socioeconomic Survey

https://cas.cgiar.org/spia/news/better-data-agricultural-innovations-journey-ethiopian-socioeconomic-survey 

This blog, which summarises “Shining a Brighter Light: Comprehensive Evidence on Adoption and Diffusion of CGIAR-Related Innovations in Ethiopia”, is a welcome addition to the growing literature on research impact assessment. It is especially commendable for its focus on collecting high-quality cross-sectional data in challenging conditions. Indeed, the report provides a standard against which similar assessments, particularly in the field of agricultural research, can be measured. Among its highlights are close analysis of the socio-economic characteristics of adopters, exploration of synergies between different innovations, and extensive mapping of where adopters live and work over an extended period.

And of course, don’t miss EES’ interviews with Conference Presenters!

https://europeanevaluation.org/2021/08/26/ees-goes-virtual-evaluation-in-an-uncertain-world-complexity-legitimacy-and-ethics-6-10-september-2021-an-interview-with-alix-wadeson-and-tom-aston/

https://europeanevaluation.org/2021/08/26/ees-goes-virtual-evaluation-in-an-uncertain-world-complexity-legitimacy-and-ethics-6-10-september-2021-an-interview-with-juha-uitto-director-of-the-independent-evaluation-office-of-the/