More than evaluation


Holly Donagh, Partnerships Director at A New Direction reflects on the challenges and rewards of managing a large-scale evaluation programme

Our story in numbers

A New Direction (AND) has been shepherding and stewarding the national evaluation of the first phase of Creative People and Places (CPP) for the last three years, and I have had the sometimes honour, sometimes stress, of chairing the steering group for this process and overseeing the work. We are now about to publish the last few pieces of evaluation and hand the baton on to others, so it seems like a good time to reflect on some learning.

When we started the work in 2013 there were only a handful of places involved in CPP and no programme structure to draw upon as such. So AND picked up the challenge of coordinating the national evaluation programme on behalf of the places (we were part of the Creative Barking and Dagenham consortium), and I am immensely glad that we did. Being involved in CPP has taught us a lot, helped us stop being so focused on London (we are primarily a regional charity) and meant we could be part of an exceptional network that has achieved a huge amount in a short period of time.

We were very clear at the outset that AND was leading the evaluation on behalf of the 21 CPP places and would play a largely facilitative role. It is to the Arts Council's credit that they trusted the places with their own evaluation and let us get on with it.

It was established early on, across the places and the Arts Council, that we wanted the evaluation to be about producing knowledge that would inform practice in real time and that would be useful, not just a document that ticked a funder's box or could be used to spin a positive story for the programme.

This meant we commissioned a large-scale meta-evaluation, largely to collate monitoring information and provide consistency across the programme, alongside more experimental 'deep dives' into areas we were particularly interested in, provocative 'think pieces' and artist-led evaluation.

In this way we tried to make sure that the evaluation itself embodied the experimental and ‘action-research’ nature of CPP and I hope this will be part of the positive legacy of our work, encouraging others to think about the real purpose of evaluation and the range of methods that can be employed.

Crucially, we spent as much time talking about the way we would synthesise learning and share it as we did on commissioning the learning itself. We also thought hard about supporting the evaluation and research skills of CPP professionals so the whole process could be one defined by learning.

I would not say everything worked. We were often frustrated that the evaluation could not seem to keep up with the diversity of stories and impacts that the programme was having; we struggled to bring the voice of the 'non-engaged' into the picture; and there were technical elements of the structure of the process that we could have improved. But, with the time and money we had, I think we helped shape a narrative for CPP that is hugely powerful and provides much rich content for the sector in its widest sense.

There were a number of principles that worked well that are worth sharing:

1. Get the governance right. Setting up a steering group for the evaluation early on, and being very clear about the process for joining the group and the role of members, meant that the issues that came up as we went along (procurement, transparency, the respective roles of funders and 'fund-ees', objectivity) had to a large extent been covered off by a rigorous memorandum of understanding and a set of procedures laid down at the start.

2. Set the overarching intention of the evaluation, even if it seems obvious. Having a clear strategy for the evaluation (devised with the steering group) that prioritised understanding and conveying the narrative of CPP, producing useful information that could benefit the sector, and a real-time link to CPP professionals was a very different model to simply contracting an agency and waiting for a report. Having an overarching ambition and strategy for the three-year evaluation led to better and more appropriate decisions.

3. Try to be holistic. Linking evaluation, communications (especially prioritising the website as a key hub of knowledge) and peer learning via regular meetings and the annual conference meant that we could understand and respond to the needs of the network as well as explore ways of engaging beyond CPP. This is different to an evaluation that is only understood and used by its stakeholders.

4. Get a communications system up and running quickly, then improve it. It may be controversial, but I think that setting up a number of projects on Basecamp (an online project management platform), so that we could cheaply and quickly connect all CPP professionals across the country, was very important.

5. If you really want critique, you need trust. Much is said about the need for professionals to share stories of 'failure', reviewing the bad as well as the good in order to encourage learning. In reality, it is all too easy to focus on a positive story in any evaluation. CPP seems to me like an environment where people are able to share without fear of sanction, and it is this culture of trust, more than anything, that makes for good evaluation. You need to pay attention to how you breed this culture if you want it to happen.

I am particularly excited that the CPP learning is now part of a bigger conversation looking at the civic role of arts organisations (with the Gulbenkian inquiry), the nature of cultural democracy and capability (King's College London) and thinking on the nature of 'place-based' working (see Lankelly Chase, amongst others).

This is all about how we better understand the relationship between people, place, art and culture, not as a series of hierarchies and professional categories but as a fluid territory where individuals and communities can develop their humanity and creativity. This seems like a big shift in arts policy away from the tired dialectics of 'access vs excellence', and hurray for that. The work in many CPP places is genuinely pioneering in this respect, and it has been a privilege to work alongside people who are changing the debate as well as doing the work ('building whilst flying', as you might say…).

We will be taking a lot of this learning into our work at AND and across the Bridge network nationally, and I can't sign off without saying a huge thank you to everyone we have worked with.

Holly Donagh, A New Direction