Dave Thomas, Volunteering Development Officer at NCVS writes...
Whenever I go to network meetings or conferences, I enjoy a game of Buzzword Bingo as much as anyone. But there’s a word that has been creeping into the vocabulary of volunteering which deserves rather more than a tick on the bingo card. That word is “Impact”.
It’s become very popular, although its meaning is sometimes a little blurred. James Noble is the Impact Management Lead at New Philanthropy Capital, whose blogs on this topic make for interesting reading. His very readable summary sets out the view that impact measurement should be long-term, comparative and robust:
- Long-term. The difference that our volunteer programme makes is not instant. We should collect our data and evidence over a long period.
- Comparative. In order to see a difference, we must compare two or more things. This could be a comparison of before and after a volunteer intervention.
- Robust. To actually ‘measure’ impact, we need to do both of the above with enough rigour and scale to be confident in the results.
In trying to measure the difference our volunteer programme makes, very few of us have the resources or the time to carry out detailed research that meets all three of James’ criteria. But this doesn’t remove our need to produce some evidence, because this is something that we really need to know.
NCVO’s KnowHow website has a helpful set of pages about impact to guide us through the various stages.
Planning
Before we can decide how to measure a difference, we need to know what difference we want our volunteers to make. This is pretty obvious, but unless we have a clear idea at the start of the process, how can we know what and who we should be monitoring and measuring? The planning stage of the Impact cycle includes:
- Deciding what difference you are going to make
- Developing Policies
- Developing Role Descriptions
- Recruiting Volunteers
- Induction and Training
… what else would you include in your own plan?
Baseline
In order to make a change, we need to know where we are starting. Even if your volunteer project has been running for years, it will still be seeking to make improvements to the service, to the lives of service users, or to whatever its “cause” may be. Wherever you are going, you start from here, so let’s begin by understanding where we are right now.
“Mapping Exercise” is more than another tick on the “Buzzword Bingo” card. It is a way to be clear about the situation that you want to change.
Measuring
As a Leader of Volunteers, you will probably count the number of volunteer hours, how many times activities have taken place, how much money has been raised, and so on. Things like: the team planted 250 trees and gave 2,000 hours of time, worth well over £20,000. But… so what? That isn’t impact.
We still haven’t found out what difference we have made. The difference lies in the case studies that tell people’s stories. We collect stories by asking questions. But make sure that you record and store them securely, especially if they contain personal data.
You probably already have some stories in:
- Volunteer Support and Supervision Records
- Beneficiary feedback
- Feedback from referrers, partners, parents, carers, other organisations, statutory services … and anyone else
- Complaints
- Previous evaluations, especially if they have been carried out by someone independent of the volunteer programme
- Comments about your service on social media
- Coverage in local media
… how else can you collect stories?
Another useful measure is “Distance Travelled”. This tries to turn stories and “soft outcomes” into numbers. Some people like the straight lines of the Rickter Scale, but the Outcomes Star is also widely used. Salford CVS also has a good page about outcomes measurement.
Analysing the data
You now have a set of numbers and a collection of stories. In New Philanthropy Capital’s Well-being Measure presentation, John Copps highlights: No stories without numbers, and no numbers without stories.
Stories are not as easy to analyse as sets of numbers and getting to grips with what is known as “unstructured data” can be very daunting; however, there are some proven approaches. This blog from Michigan State University tells us that there is no right or wrong way to analyse your data, and provides advice and tips.
The key message in analysis is to be organised and consistent. I would also suggest that this analysis should start to take place while the data is still being gathered. Remember that this is a long-term process.
Reporting the findings
Traditionally, evaluation reports are written, but why not think more creatively? Could your volunteers produce a video showing the difference that they have made? How else could you tell their story? Most of us will stick to the written report – if only because it’s what managers and funders understand. Remember to include useful numbers as well as the stories.
Using the evaluation
This is the stage that can be far too easy to overlook. Our volunteers, service users and other stakeholders will have invested a great deal of time and energy in monitoring and reporting. They have shared their experiences and stories with you. You owe it to them to make good use of all this data.
So recycle your learning. Go back to the planning that you did at the start of the process. Did you achieve all that you said you would? What worked well? What didn’t? How will you do things better over the next month, year, three years?
What next?
I will soon be offering a two-hour training session on Volunteering Impact, so keep an eye on our e-bulletin and our Training Courses page for dates and costs.
Meanwhile, please share your thoughts about the impact of your own volunteer programme with me. Email davet@nottinghamcvs.co.uk or call 07564 040767.