


How to think about/do evaluation?

A practitioner reflection on generative evaluation

Much of the evaluation work in the CreaTures project has been led and analysed by researchers in discussion with artists and creative practitioners. The researchers worked in depth to understand the priorities and goals emanating from the experimental and open-ended work of our partners and what these might mean to policymakers and funders. This has been crucial in our attempts to transform the way that evaluation is legislated for and performed through policy and funding.  

However, equally important is a practice that enables practitioners to judge whether their designs are successful on their own terms and then to articulate these aspects to others, such as funders. This practice needs to be lightweight, unobtrusive and non-repetitive; it has to work alongside tight schedules and limited funding as well as an interest in experimentation. Yet it can also be designed to demonstrate rigour in execution and to allow goals of transformation to be assessed on their own merits.

I liken the inputs and outputs of transformation-oriented creative practice to the Copenhagen train map. Lots of trains come in from diverse places and go out to lots of other destinations; however, there is a point (near Østerport) where almost all the trains go on a particular section of the track – so it is with the processes of working to make cultural change. We don’t know in advance what influences and concerns people will bring, and the focus of the practitioner will vary (from alternative economies, to improving biodiversity, to post-fossil futures and beyond); but, in the end, there are only certain tracks to follow if cultural change (rather than innovative expression for the sake of it) is to emerge.  

Of course, not every creative practitioner is seeking to make transformation. Here, we are only addressing the learning from CreaTures, which is concerned with transformative futures through creative practice and has been working with people who are seeking change, however undefined. In this case, a fundamental tenet is that some change has taken place. In other parts of our work, we talk about what change we see happening and the paths used to make it; here we talk about how to know it is being made.

The best generative evaluation practices do not feel like judgment; they are woven into other work and become part of the experience. They nonetheless include a practical step that allows some before-and-after reflection. Arts Council England offers advice on how to conduct ‘thoughtful evaluation’ (see below), suggesting organisations first ask themselves about the ‘aims of the work’ and ‘the mission behind your organisation’. However, their focus is on designing a good survey, one that allows success to be measured. Surveys are not easy to incorporate into practice, and participants find them unattractive to complete. How can we gather information on transformation in a more lively, reflective fashion? One way that is more attuned to both practitioners’ aims and participants’ experiences is to make the evaluation part of the experience.

To give an example, Democratising Technology (DemTech) was a project to engage older people in conceptualising novel technology, which resulted in emergent long-term transformation (of course, that potential was hard to see even at the finish of the project in 2008, but is clearer now: knowing about impact can take many years). The Geezers, a self-named group of participants supported by the artist Lorraine Leeson, undertook a 10+ year initiative to build a novel water turbine design and tackle environmental concerns through diversifying renewables research. The turbine was launched on the River Thames in 2013, followed by a floating, pollution-busting waterwheel on the Lea River in 2017. In 2022, the collection of projects won a Times Higher Education Knowledge Exchange award.

DemTech’s potential for making transformation was based on equipping groups with several different elements that contribute to transforming attitudes and sense of self: 

  • Forum – a space to contribute and people to listen 
  • Motivation – the desire to contribute 
  • Articulacy – the vocabulary and fluency to present one’s ideas in a particular domain 
  • Confidence – the assurance to become involved 
  • Knowledge – enough understanding to have an opinion 
  • [Sense of] Agency – an awareness that change is possible and of oneself as an agent of change 
  • Association – the ability to interpret things together or see links, such as: old and new, people and things, etc. (Light et al. 2009).

The evaluation conducted to assess which aspects were important, and how, had many dimensions, including discourse analysis of exchanges with participants, comparison between the practices of four different artists interested in social engagement and cultural change, and long-term monitoring of participants’ continuing journeys. These first elements were labour-intensive and costly, requiring considerable researcher time.

Here, let’s look at the other mechanism: running exercises before and after the series of workshops to learn whether new meaning had been instilled by participation.

Participants were given a simple timeline stretching back to their youth and reaching into an open future, with a marker for the present. They were asked to put significant things and events that they remembered from their past along the timeline. They were then asked to put potential future things and events into the future space. After participation in a series of activities, they were given back their timelines and asked to add other things and events into the future section, reflecting on what expectations for the future the work had given rise to.  

The quality of these later contributions was completely different from those that they had started with. Not only were the participants demonstrating greater insight into the range of things to discuss, but there was more depth to the contributions and greater relevance to the global discussions taking place at the time. Single-word answers had become mini-essays. Even the commitment with which they undertook the follow-up task was noticeable, with more confidence, energy and enthusiasm.

One of the key things about the timeline task is that it is content-free and only provides structure; it is non-directive. Unlike a question that might shape any experience that follows, offering an exercise as simple as this has the benefit of establishing participants’ interest, ideas and commitment without ‘leading’ the answers they give. This also makes it possible to demonstrate that the experience itself shaped the transformation. Think of the difference between a) giving people a timeline; b) asking people what they think are important issues in the future; c) asking if they think that something particular will be important in the future (such as the focus of the work); or d) asking whether they are worried about this in the future. Each has a different orientation, and some of these questions will be more determining of people’s next thoughts and experiences than others. The determining quality builds from a) to d), getting less open and more directive as it goes.

An open style is good for collecting responses in a before-and-after context, where people’s second answers can be compared with their earlier responses. More focused, directive questions work better for evaluation conducted only after an event has taken place. Many people will not know that they are being transformed unless an insight was very dramatic or they are truly perceptive about their own thinking. Thus, questions along the lines of ‘are you thinking differently or feeling different as a result of participating?’ may only go so far – though they are often worth asking – and the value of before-and-after exercises is that they reveal whether someone has been changed by your intervention even if they have not noticed it themselves.

Non-directive reflective practices – seeding without leading – require structures for evaluation that avoid introducing too much of the practitioner’s own thinking into the process. However you come at it, it is always better to ask the broad question first and then follow up with ‘how?’ or ‘why?’, rather than going straight in with ‘are you going to behave differently as a result of participating today?’ or some other action question. Of course, asking that question may plant the idea of changing something in participants’ minds, so it has that value at least, but it tends to set the agenda for everything else we might want to know.

More recently, CreaTures researchers have been working with the arts organisation Furtherfield. Here the focus has been on analysing Furtherfield’s long-term project for Finsbury Park in London, which raises the contested nature of governance there, as well as questions about the future of other open and green spaces. ‘The Treaty of Finsbury Park 2025’ invites local people (and anyone else interested) to participate in setting up an interspecies celebration of life in the park. After a sorting activity that gives each person a role to play, the next hours are spent acting as one of several species that inhabit the park and negotiating a choice of site for the festival.

While being briefed on their role, participants are asked for a little information on their attitudes. This starting point for evaluation by the hosts is not distinct from the other preparation work, but it does not contribute to the game; we introduced it so that debriefing later would show how participants’ thoughts had moved on.

That means we also had to hear from participants at the end of the game to find out what had changed in their thinking. That debrief is important. It reveals whether feelings and thoughts have changed. But it also does some work in making the experience transformative.

Reflection embeds experience. It gives feedback to the hosts about the experience and what could be developed further. But if done collectively (and sensitively, for opening up requires safety and a sense of trust), we can reflect together on:

  • Other people’s views and how they might differ from or complement our own impressions
  • Our own thought processes and what we value
  • How this relates to our worlds and what is salient to take away/act on
  • What it might mean/require in terms of change
  • Who we might want to collaborate with – become friends and allies with

Mainstream monitoring and evaluation approaches do not serve the types of change processes discussed here. As we explore creative practice as a medium of transformative change, we might ask:

  • What is judged as evidence? What is success? 
  • Over what time frame and range of stakeholders can/should success be judged?
  • How can evaluation value risk-taking and the related implications for creative practice?
  • How do we account for emergence?
  • How can evidence be gathered of the impacts of creative cultural interventions without reducing context-specific, often highly personal and difficult-to-capture qualitative outcomes to product- or outcome-focused analysis?
  • How can evidence be expressed so it supports the practitioner’s practice, the learning of other participants, and other professional groups’ understandings of its value (for instance, policymakers or climate scientists)? 

Doing this helps creative practitioners to support claims of impact with respect to other professions and disciplines. This is not to instrumentalise art, design and other creative work, but to formalise some processes for understanding what that work might mean on its own terms. Methods like play and workshopping do not lead to specified and quantifiable outcomes in the way that simple structures do, so we need good processes that are relevant to our choices of approach and the questions we might want answered. These are generative evaluation practices: generative because they are not an external judgment at the end of a process, but a part of the engagement, iteration and understanding that makes up the reflective work of creating socially-engaged practices.

References

Light, A., Simpson, G., Weaver, L. and Healey, P.G. (2009) “Geezers, Turbines, Fantasy Personas: Making the Everyday into the Future”, Proc. Creativity and Cognition 2009, Berkeley, Oct 2009.

Light, A. (2011) “Democratising Technology: Inspiring Transformation with Design, Performance and Props”, Proc. CHI 2011, 2239-2242.

The Arts Council’s pages on Evaluation for Artists are here: https://impactandinsight.co.uk/docs/getting-started-with-thoughtful-evaluation/

An interesting contrast can be found at the Centre for Cultural Value: https://www.culturehive.co.uk/research-and-evaluation-practice/

And also on the pages of the Happy Museum: https://happymuseum.gn.apc.org/resources/measure-what-matters/

Each set of resources handles evaluation for practitioners in very different ways.