Evaluating under fire: Managing impact during coronavirus

These are tough times for our sector. Voluntary organisations are having to adapt rapidly to the coronavirus pandemic. Many are operating in totally different ways or shifting to provide new services. Other organisations are struggling to survive.

Given these constraints, now may not be the time to use traditional impact and evaluation approaches that often look at what programmes have achieved against long-term plans, and which assume the operating environment remains relatively unchanged during the evaluation. It may be necessary to put some evaluation on hold, if it can be done without causing harm.

However, rather than abandoning evaluation totally, now is the time to find ways to do it differently, to meet our own and our users’ needs. Evaluation is a flexible tool that can be adapted to any situation.

Thanks to our colleagues at NPC and Inspiring Impact for contributing ideas to this blog post.

Use evaluation for decision-making

Many of us are having to make tough decisions, rapidly, about how to prioritise resources or develop new services. Where possible, decisions should be based on the best information available, with real-time data being key. We might learn from developmental evaluation approaches, which focus on informing decision-making and supporting adaptation (our January impact roundup has more on this).

Focus your key questions on what is essential for you to know for rapid decision-making. What data do you need to help you adapt? Evaluation can help you assess the immediate needs, adjust services accordingly and then check the extent to which you are effectively meeting them.

Adapt your data collection

It may not be possible to collect the best-quality data when testing out new ways of working or new interventions. Impressionistic and qualitative data, collected quickly from a small sample of your key audiences, may be all that is possible – but it may also be good enough for rapid decision-making. Just be honest about its limitations and how it was collected.

Basic monitoring data you could collect includes data on the people you are, or are not, reaching, and how they are engaging with you. Seeking user feedback is still important; now is the time to really engage with your users or target audiences, and find out what’s working and what’s not as you deliver your work in new ways.

When we emerge from this crisis, if ongoing monitoring and evaluation has not been possible, you might consider retrospective evaluation. There are a number of robust qualitative methods (for example process tracing or most significant change) that can be done retrospectively. But simpler tools like after action reviews (tools for supporting group learning after a piece of work) can be highly effective if resources are constrained.

Voluntary organisations may find it useful to use monthly or even weekly team meetings to share what they have learned on key questions such as:

  • who is most in need, and what do they need?
  • how can we best meet this need with our resources?
  • who are we reaching/not reaching?
  • what feedback have we had?
  • what might we do differently?
  • who do we need to collaborate with?

Social distancing means we usually can’t collect data face to face, but phone interviews are very effective and collecting data online is easier than ever. At NCVO Charities Evaluation Services we have used video conferencing apps for interviews for some time – many are free and available to anyone with internet access (eg Skype, Zoom, FaceTime, Google Hangouts). Of course, these methods assume respondents have access to the necessary hardware, so don’t forget those who are harder to reach.

Remember that data collected under these circumstances may not be totally reliable or comparable to that collected six months ago. People’s behaviours and views may be very skewed. The way you deliver your work may have changed significantly, and therefore any outcomes may also be different.

Evaluate new ways of working

This may not be the time for planning processes like theories of change – these can be time-consuming and require at least some stability in context and delivery. However, it’s still important to identify intended changes for your beneficiaries as well as considering others that may be affected by change. Agreeing on the most important outcomes right now will help you better understand what types of activities to prioritise.

In many cases you will be focusing on what you can achieve in the short term, and your evaluation may need to reflect that rather than focusing on longer-term change.

Focusing on process evaluation – on who was reached, how and with what activities – will be especially important. In a crisis, we know that some groups are likely to be affected more than others. When you consider where to focus services, you should monitor who you are reaching, so that existing inequalities in provision are not worsened. To what extent did you reach those most in need?

On the other side of this pandemic, you may find that some of your new ways of working are worth keeping. Collecting some basic evaluation data now on what works and what does not will help you decide.

What next?

  • See our coronavirus pages for guidance and resources for charities and voluntary organisations during the crisis. Note that we have also made our member-only materials available to all during this difficult time.
  • Join us at Inspiring Impact’s webinar on Thursday 16 April, 12.00–13.00, to share what you’re learning on evaluation during the coronavirus pandemic.
  • Our colleagues at NPC and Inspiring Impact and the Charity Evaluation Working Group are developing resources on evaluation in changing times – sign up to the Inspiring Impact newsletter for the latest on this.


Sally Cupitt is Head of NCVO Charities Evaluation Services. She has been a consultant at CES for over 20 years. She specialises in independent evaluations of voluntary sector organisations, research, and helping organisations to develop and implement monitoring and evaluation frameworks and systems.
