Live discussion: Proving our impact – 21 October, 10.30-11.30


Please join us 10.30-11.30 on 21 October for a live discussion on impact. You’ll have the opportunity to pose questions for the panellists, debate and put forward your own experiences and suggestions.

All thoughts will feed into the VCSE Review which is helping to shape the future funding and partnership decisions made by the Department of Health, NHS England and Public Health England.

21 October 2015
10.30-11.30

Background

Defining, measuring and capturing long-term outcomes and social value are crucial to making the most of the VCSE sector’s contribution.

Contributing to social impact is the ‘bread and butter’ of the sector; it is its own unique currency which should be promoted as widely as possible. However, the measurement of social value and improvement of long-term outcomes need development.

Discussion points

The discussion will focus on three key areas:

  • What more can be done to increase the availability of outcomes/social value/impact data?
  • What kinds of outcomes and impact does the VCSE sector need support to measure and demonstrate?
  • How could learning from funded grants and projects be better shared and disseminated?

Panellists

Sally Cupitt, NCVO

Sally Cupitt is Head of NCVO Charities Evaluation Services. She has been a consultant at CES for over 15 years. She specialises in independent evaluations of voluntary sector organisations, research, and helping organisations to develop and implement monitoring and evaluation frameworks and systems. Prior to working at CES, Sally was involved with frontline community work, particularly in mental health and homelessness, and ran an advocacy project.

Alex Van Vliet, Lloyds Bank Foundation

Alex joined the Foundation in April 2015 in the new post of Research and Data Analyst. He works on how the Foundation uses evidence and research to improve their work, understand the difference they make and better support small and medium-sized charities across England and Wales. Prior to joining the Foundation, Alex worked at New Philanthropy Capital, a charity thinktank and consultancy, where he worked on a range of projects to help charities and funders measure and articulate their impact.

Rob Newton, Leeds Beckett University

Rob works in policy with the Institute for Health and Wellbeing at Leeds Beckett University and Leeds City Council. Rob is part of the team delivering the 3 year ‘What Works for Wellbeing’ ESRC research project, contributing to the ‘Community’ evidence programme. Rob specialises in building links between research and practice and leads partnership activity across Leeds in support of the Leeds Health and Wellbeing Board. Prior to working for Leeds Beckett University, Rob worked for the Special Interest Group of Municipal Authorities as a policy lead on local government finance.

Questions and answers

Post your questions in the comment section below and join us on 21 October 10.30-11.30 to see live responses from our panellists.


Nick Davies was NCVO's public services manager until March 2017. He is also a trustee of the South London Relief in Sickness Fund.

42 Responses to Live discussion: Proving our impact – 21 October, 10.30-11.30

  1. Please could you tell me how to listen in to this discussion?

    • Hello Rachel – the live discussion will be taking place here in the comments. Just come back here tomorrow (21 October, 10.30-11.30).

      All the best,

      Jack Garfinkel, Web Production Manager

  2. Nick Davies says:

    Good morning and thank you for joining us for today’s live chat on impact. We are joined by Sally Cupitt from NCVO, Alex Van Vliet from Lloyds Bank Foundation and Rob Newton from Leeds Beckett University.

    This is an interactive discussion so if you’ve dropped in and have some thoughts please post a comment. Remember to keep refreshing the page to see new comments.

    I’d like to kick things off by asking our panellists and others dropping in what support the VCSE sector needs to better measure impact. Are there particular types of outcome that the sector struggles with?

    • Sally says:

      Morning all.

      What outcomes are trickier to measure? From NCVO Charities Evaluation Services’ perspective, these would include:
      1. Outcomes of preventative work – how do you show something hasn’t happened?
      2. Outcomes for end-of-life care, or some other situation where things will deteriorate over time.
      3. Outcomes of capital spend projects – can you claim the outcomes of services provided in, for example, a community centre, if you have invested in it?

      All possible, just require a bit more planning. Ask if interested!

      • Rob Newton says:

        It is worth noting that all three of these are about measuring long-term impacts. There are perhaps two challenges here:
        1. For policy makers and commissioners – to take a more long-term view
        2. For the sector and providers, in their evidence – to consider how we account for long-term impacts. Economics has the principles of discounting and positive time preference, and these have been utilised by health economists. There is perhaps more scope for recognising this on both sides of the producer/consumer relationship around evidence.

    • Alex Van Vliet says:

      Morning all

      I would highlight two aspects that seem a particular challenge for the VCSE. The first is accessing information on long term changes – often held by public bodies, but not accessible to the VCSE. Working through data protection issues will be critical.

      The other is how to focus on the most important intended outcomes of your work, and not casting the net too widely – it can be tempting to think a funder will be impressed by measuring everything, rather than what really matters to your work.

      – Alex

      • Sally says:

        Morning Alex,
        On the ‘wide net’ point, I very much agree. It’s easy when doing a theory of change or equivalent to come up with loads of exciting outcomes. Prioritising is key, at least when starting out on an outcomes- and impact-focused approach.

        From our perspective, getting all VCSE orgs to be able to identify and measure five key outcomes, and then use that data, would be a massive step forward!

    • Nick Davies says:

      Thanks for those thoughts. One of the things we’ve heard throughout the consultation is that it is difficult to demonstrate impact in projects which are only funded for short periods. Does this ring true?

      • Sally says:

        In theory, short term projects should not be more difficult to assess than others; the issue is about what you try to measure and how realistic you are about what can be done with evaluation.

        We often see organisations that have agreed to demonstrate impact within a two-year project; this is in most cases unreasonable. Impact simply will not have been achieved in that time period: instead, focusing on some early, intermediate outcomes is likely to be the best possible option. (This is assuming no one is able to track long term outcomes, beyond the funded project.)

        Theory of change is vital in this, in that it helps you identify those early outcomes.

        • Alex Van Vliet says:

          Having a clear articulation of why early, intermediate outcomes are important for longer-term change is so important, but this often depends on service design having a clear reference to wider evidence/literature, which in our experience is not widely used in the VCSE.

          • Sally says:

            Good point Alex. In an ideal world, there would be a brilliant bank of robust evidence, openly accessible to all, to help VCSE organisations evidence the causal linkages and assumptions behind their theory of change.

            In the absence of that utopia, many have to draw on their own data or experience to make a reasonable case as to how the change will happen.

            Of course, when they start collecting data, they will do so against that theory of change, which will make it stronger in the long run.

      • Rob Newton says:

        Absolutely. Obviously, if your funding is short term and the outcomes are longer term, then there is a mismatch. There is an issue of whether demonstrating impact is practically possible and useful for the VCS org. Who are we producing this evidence for? Is it to learn and to improve, or is it always to demonstrate impact in the hope of getting funded again? Hopefully both, but there is an issue of the power dynamic of who the evidence is being produced for.

        • Rob Newton says:

          There is also the issue of skills and time on this practicality point. Smaller organisations don’t necessarily have the time or the skills to undertake heavy-duty measurement work. But perhaps tools like the ones that Sally mentions make it accessible and useful.

          • Sally says:

            I think there is a very interesting discussion in here about whose responsibility it is to assess impact. We define impact as the broad, long-term social change that orgs contribute to, whereas outcomes are changes more immediately linked to one’s intervention.

            True impact evaluation is really hard, often expensive, and actually seldom done. In many cases I think you can argue that people funding programmes of work could focus on impact, while the funded focus primarily on outcomes.

            Certainly in smaller orgs, people running, for example, a children’s service, usually need to focus on service delivery and not becoming quasi researchers.

            Having said that, I am not arguing that monitoring and evaluation isn’t central to everyone’s work; simply that there are levels of evaluation, and that they are not all appropriate in all circumstances.

  3. Rob Newton says:

    Good morning all and pleased to be chipping into this discussion. Looking forward to seeing what people have to say.

    From my perspective, we’ve just completed an initial phase of the What Works for Community Wellbeing evidence programme, where we’ve been asking people from various fields across the country about what their aims and challenges are in their work, how wellbeing evidence can support this and their challenges in evidencing wellbeing outcomes. So for a starter, one outcome that is challenging to evidence is broader wellbeing benefits. This came out (unsurprisingly) as a particular challenge for VCS orgs because they are often providing broader, more personalised services around the needs of a whole person/community.

    • Alex Van Vliet says:

      One issue that we have come up against as a funder of many small charities providing a holistic service is how to capture and articulate the benefit of that approach, and the ‘added’ social value that charities deliver in the communities, through volunteers and so on.

      • Rob Newton says:

        A need for pragmatism perhaps here. A service can have as its core purpose to improve broader ‘wellbeing’, and there are plenty of good examples of these, but to provide a specific service you need some key outcomes which indicate broader wellbeing outcomes; otherwise not only will the evidence be hard to pin down, so will the design of the service.

        • Rob Newton says:

          And to learn from some work in Leeds, the city council has utilised Outcomes Based Accountability a lot in service design and partnership work. This has enabled quite broad and at times nebulous outcomes to be drilled down to a tighter definition and clear outcomes. Simplicity within the complexity, without over-simplifying things. This approach uses 2 or 3 ‘obsessions’ or key indicators to indicate much broader change.

  4. Rob Newton says:

    So leading on from this, ‘wellbeing’ as a term is potentially quite ill defined and means different things to different people, with different understanding of the appropriate evidence base. So perhaps for some programmes, it’s not just trying to measure the outcome effectively, but trying to define the outcome itself in the first place. This is a particular challenge to proponents of ‘wellbeing’ as an achievable and useful outcome.

    • Sally says:

      Morning Rob! One of the things we do a lot of with people is planning for evaluation – often a stage that is left out, but at one’s peril.

      This may mean using a planning tool – like theory of change, logic models or the CES planning triangle – to help people work out what their intended outcomes are. Very often orgs come to us with fairly vague statements about the changes they wish to achieve, and we spend time helping people break these down into something specific, clear and measurable. This needs to be done before measurement systems can be put in place.

  5. Nick Davies says:

    We’ve been told that while some large organisations have invested in their impact assessment – for example by calculating their Social Return on Investment (SROI) – this is harder for smaller organisations to do. Does this create an uneven playing field for organisations competing for contracts? If so, how can it be levelled?

    • Rob Newton says:

      As an answer to the first question, yes! This was something which was reflected in our What Works Wellbeing workshops. There was an impression that larger organisations had more capacity to evidence impact, and were also very effective at communicating that evidence in a convincing and accessible way.

    • Sally says:

      Several issues in here I think.

      Re size, biggest is not always best. I have met smaller orgs that do evaluation really well – and even dip a toe into impact evaluation. Conversely, I have met massive national charities who cannot even report accurately on how many users they have. Sometimes, bringing in new approaches like outcomes-focused evaluation can be easier in smaller organisations.

      However, you are right in that some VCOs have their own internal research and evaluation teams, and are far advanced with this work.

      Some of the levelling has to come at the point of funding, I think. A number of funders are adopting really helpful funder-plus approaches that build the capacity of smaller VCOs to self evaluate. Some are also prepared to pay for the evaluation they are requesting.

      It’s also about managing expectations as to what is reasonable and appropriate to evaluate, given the size of the funding, the nature of the intervention, and the developmental stage of the organisation.

      • Alex Van Vliet says:

        Our ‘funder plus’ programme has supported many small charities to develop their outcomes measurement – we’re conscious, however, that we always need to emphasise that measurement should be about learning first and proving second.

    • Alex Van Vliet says:

      SROI can be a red herring – a meaningful assessment of value depends on solid underlying outcomes measurement, which even large organisations struggle to do well. It also creates an arms race to ever greater return ratios – it’s now not unusual to see claims of £14 of ‘value’ generated for every £1 spent.

      • Sally says:

        I agree Alex; SROI can be powerful, but relies (as do pretty much all evaluation methods) on the collection of good outcomes data.

        As a very rough rule of thumb, I look with suspicion on claims over 1:5, or perhaps 1:10, but I may be cynical!

  6. Nick Davies says:

    We know that huge amounts can be learnt from evaluations of previous projects but often findings are not widely shared. How do you think funders and VCSE organisations could improve dissemination?

    • Alex Van Vliet says:

      As a funder, we sit on a tremendous amount of information about the projects we have invested in, but lack the resource ourselves to disseminate the learning – it’s often done ad hoc, but rarely results in anything being published.

      I’m interested in how the mooted ‘Health Data Lab’ model could build a culture in the VCSE of ‘publish first’, but it will take a big culture change from our current approach.

      • Alex Van Vliet says:

        The Health & Social Care Information Centre has been putting together a business case for a health analytical service/data lab, which would measure impact on secondary care: A&E, admissions, readmissions, lengths of stay and costs.
        http://www.thinknpc.org/our-work/projects/data-labs/health-data-lab/

        • Rob Newton says:

          I hadn’t seen that before. Linked to it (but perhaps slightly off the point), there are so many different places for evidence gathering across PHE, NHSE and local JSNAs and other projects. I find it difficult to know where to access what, and I work in this kind of area every day.

      • Rob Newton says:

        Yes, it would need some culture change and some good thinking behind the approach and standards, but the potential is there. A model would have to get widespread support. I can think of plenty of forums and Google groups which start off with good intentions but in reality nobody keeps investing time after a few months!

    • Nick Davies says:

      This is clearly a particular issue with evaluations that find poor results, which organisations can be understandably reluctant to share. Yet we know that learning from failure is one of the best ways to improve. How can we overcome this reluctance and create a positive failure culture?

      • Sally says:

        Although it’s worth noting that even people who have ‘good’ evaluations might be reluctant to share them, particularly if there is a heavy emphasis on process evaluation. Many process evaluations will find things ripe for improvement.

        Things are improving. It’s worth noting that when CES first started 25 years ago, we couldn’t even name our clients publicly – even being associated with evaluation was thought to be commercially sensitive!

      • Sally says:

        I wonder if there is some work to be done in terms of funder/funded relationships. (I use the term funder widely, to include investors and commissioners.)

        We often hear VCOs saying they are scared to report bad findings to their funders. And yet all the funders we work with are really positive about transparency and usually see the sharing of such findings as a sign of a learning organisation. This may of course be because the funders we work closely with are a self-selecting bunch, interested in quality and evaluation.

    • Rob Newton says:

      Specifically on Ageing, but the Centre for Ageing Better is a long term well funded project which I’d hope would go some way to answering issues in your question about dissemination and learning. http://www.centreforageingbetter.com/

  7. Rob Newton says:

    Good question. Not sure if I have the answer! We’re planning for What Works Wellbeing to be doing a lot of evidence synthesis and secondary data analysis, for the purpose of filtering and disseminating learning. There is a challenge here about grey literature and other non-published learning, and how we incorporate this. But it is important, because much of the knowledge we’re interested in is that created in and by communities and community organisations.

  8. Sally says:

    This is tricky. For us, our clients own the data and the final report, so although we strongly encourage publication, this is not always possible. There are often good reasons why people may not want an evaluation published.

    However, there is a case for publishing some of the findings, still in a transparent way, without an organisation having to reveal, for example, aspects of its internal processes.

    Where the funder of the evaluation is not the evaluand, the funder may be in a position to demand publication. This of course requires the evaluator to report her/his findings with integrity, while having to negotiate a slightly tricky relationship with the evaluand.

  9. Nick Davies says:

    Above, Alex mentioned the Health and Social Care Information Centre (HSCIC), which is a national provider of data for health and social care.

    We’ve heard from respondents to this consultation that VCSE organisations often struggle to access and use the data they need with many unaware of the HSCIC. Even those who are aware of the HSCIC can find it difficult to navigate.

    What more do people think can be done to increase the availability of outcomes/ social value/ impact data?

    • Rob Newton says:

      Making data more open as a start. Plenty of pushes for this over the last few years. In Leeds we have the Leeds Data Mill, which is trying to get lots of data published in an open way, and in the same place, and for anyone to be able to play around with it. http://leedsdatamill.org/ The city dashboard is the kind of thing that you can start to do. http://dashboard.leedsdatamill.org/canvas/55f2b8cf18f0066c0d784e86 Perhaps some of these principles can be adopted elsewhere.

    • Alex Van Vliet says:

      I think we need mechanisms/protocols on data sharing both at a local level (eg between a small VCSE provider and their CCG) and national level (eg the Health Data Lab project I mentioned above) – this could help to demonstrate outcomes in a way that is meaningful both to local commissioners and policy makers in central government.

  10. Sally says:

    Online sharing of data, tools and learning is of course an excellent way. Check out Inspiring Impact (http://inspiringimpact.org/) – this initiative, of which we are a part, has just been refunded for another three years.

    One of the aims of II over the next few years is to try and reach the places we haven’t yet reached, with messages about outcomes, impact and how to get information and support.

  11. Nick Davies says:

    That’s the end of the live chat but we’d love to hear your comments on any of the issues which have been discussed above.