Arts Impact Assessment

Meeting 1

A list of participating organisations is available.


Further information about the Seminar event in November 2013 is also available.

15th May 2013

at Rich Mix

In Meeting 1, our key theme was Drivers for Impact Assessment. The meeting was structured around four questions:

1. What do we want to know when we assess impact?

2. Why do we want to know it?

3. Whom do we want to tell?

4. What are the opportunities and the barriers?

Guest speaker: Elizabeth (Lizzie) Crump, Strategic Lead at What Next? and the Cultural Learning Alliance.

Discussion and Learning

To what extent does our current evaluation and impact assessment activity support our organisations’ mission and priorities?

Download Meeting 1 Agenda
Lizzie Crump Biography

Why do Impact Assessment?

The group identified the following reasons for assessing impact:

  • To report back to funders about activities that have taken place

  • To make a case for the appropriateness of an activity when applying for funding

  • To advocate for an organisation and its work, and for the arts more broadly

  • To support quality and to highlight areas for improvement

  • To inform future strategy and planning


While the group agreed that impact assessment could support quality assurance and strategy, the emphasis at Meeting 1 was on fulfilling the reporting requirements of funders.

Is it possible to have a dialogue with funders about what we evaluate and how we do it?

What role can Impact Assessment play in strategic planning?

What overlaps and differences are there between what funders want to know and what organisations want to know?

Guest Speaker:

Lizzie Crump


"Why is Impact Assessment important in the current landscape?"

What role can impact assessment play in quality assurance?

We asked the following questions:

  • What are the overarching aims of our organisations?

  • What are the overarching research questions that sit alongside these aims?

  • Does our current practice in evaluation and impact assessment contribute as effectively as possible to our organisations’ development?


We also discussed the following points:

  • There is a lack of strategic planning around evaluation and impact assessment: we rarely define the overarching aims of an organisation or a department and then plan research and evaluation to support those aims

  • The tendency is instead to evaluate many different projects discretely, often driven by funders' reporting requirements

  • This creates a patchwork of evaluation, with multiple frameworks and findings that are difficult to aggregate

  • This approach makes it hard to see the bigger picture of what an organisation is achieving and which areas have the most potential for development

Reporting to funders is an important aspect of impact assessment. But what are the risks if we focus solely on reporting?


Our discussion raised the following points:

  • Our perception is that funders only want to know whether their investment has achieved the expected outcomes and demonstrated value for money

  • Organisations additionally want to look at unexpected outcomes and be open to negative results in order to learn

  • We are nervous of showing funders results that may appear negative. Therefore we tend to construct evaluations quite narrowly according to funders' requirements. This has implications for the quality and breadth of information gathered and ultimately for organisational learning and development

  • We may get caught in a loop where we are only finding out the things that we believe funders want to hear and then applying for the same type of funding for the same type of work. As a result, opportunities for new approaches and new sources of income may be missed

Whether it is possible to have a dialogue with funders about what we evaluate was a recurring question at meetings, and one that we discussed with guest speakers Dougie Lonie of Youth Music (see Meeting 2) and Tim Joss of The Rayne Foundation (see Meeting 3).

  • Group members were nervous about suggesting to a funder that they might take a different approach to that outlined in the funder's guidelines

  • Our guest speakers indicated that they were open to all forms of research and reporting that could effectively demonstrate the outcomes and value of their investment

  • They recognised that organisations have their own approaches, and that their priority was the quality of the research

  • The funders we spoke to also said that they wanted to hear about what hadn't worked well and why. This information contributed to their own learning and development.

On quality assurance, the group noted the following:

  • Impact Assessment plays a vital role in developing the quality of our work and services

  • It enables an organisation to understand the outcomes it generates and how it generates them

  • Understanding how outcomes are generated enables an organisation to refine models of working and to improve results

  • Understanding ‘how’ also means that an organisation can better articulate its unique approach. This can form the basis of a case for support and distinguish the organisation from others. This identity strengthens the organisation in terms of advocacy and fundraising

  • Negative findings are vital in quality assurance. If we are not able to identify and interrogate negative findings then we lose important opportunities to learn and develop.

On strategic planning, the group noted the following:

  • Impact Assessment enables an organisation to see where its work is making the most difference and where its greatest value lies

  • It also enables an organisation to pinpoint what is effective about its particular approach.

  • This knowledge is vital for business planning and allocation of resources. It helps the organisation to prioritise areas for development. It may open up new sources of income and suggest new partnerships. It may also point to possibilities for growth and replication of services (discussed in more detail at Meeting 3)

  • It enables an organisation to be specific about results and how it achieves them, and therefore be more competitive

Key points from the guest speaker session:

  • Lizzie Crump gave an overview of current policy in education, with further information relating to health and social issues

  • Local Authority commissioning and Payment by Results was discussed

  • For arts organisations to be competitive in the new landscape they need to have robust evidence for the effectiveness of their approach and the ability to measure accurately their progress against outcomes

  • Lizzie also discussed advocacy and evidence. While some agencies want an overview and statistics, others are more responsive to case studies highlighting outcomes for individuals. It is therefore best to collect both types of evidence and to have a range of ways of demonstrating the outcomes of your work, used individually or in combination as required

  • A discussion followed around the subject of attribution. Arts education tends to work in environments and with individuals where there are multiple interventions, so attribution is extremely difficult to establish. Citing ‘correlation’ between activity and outcomes was felt to be the best way of expressing impact


Shortly after Meeting 1, Arts Council England announced the Cultural Commissioning Programme, which will specifically support the arts and cultural sector to collaborate with public services and contribute to the health and wellbeing, and the social and economic outcomes and priorities of local communities, councils and health authorities.


Barriers to assessing impact

A number of barriers to assessing impact were identified by the group, including:


  • Timeframe and short-term planning

  • Volume of projects

  • Number of funders

  • Lack of clarity and mission

  • Staff turnover

  • Tools and methods

  • A lack of consistent metrics

  • Requirements of funders and how they are perceived

  • Attribution

  • Quality of evidence

  • Ethics

  • Collection and storage of sensitive information


These are given in further detail in the following download:

Barriers to impact assessment


The group undertook two pieces of homework:


1. To carry out a reflective meeting with a colleague within their organisation, looking at drivers, goals, barriers and actions. An agenda was provided (see download).


2. To write up a research aim, showing the basic questions that underpin the work that they do.

Meeting 1 Homework