SALLY FORT
  • Home
  • ABOUT
    • About Me
    • Right now...
    • Clients
    • Client Feedback
    • Contact
    • Privacy Policy
    • Fees
    • BLOG
  • IMPACT
    • What is Impact?
    • Impact Development
    • Evaluation
    • Example Reports
    • Measuring & Managing Impact Quick Guide
    • Theory of Change Planning Template
  • ENGAGEMENT
    • Heritage
    • Visual Arts
    • Education
    • Creative Industries
    • Culture, Education & Community Guidance
    • Teacher Resource Packs
  • CHANGE
    • Research
    • Professional Development
    • Organisational Development

6/10/2021

WHAT IS THE POINT OF EVALUATION?

Nothing makes me happier as an evaluator than clients who use and share the evaluation to improve their work and the work of others. It's the direct opposite of sending your report off to trustees or funders and forgetting about it. So I was delighted to find out Leeds Dance Partnership (LDP) had done exactly that. I recently completed their impact and process evaluation for the first four official years of the partnership - a three-quarters-of-a-million-pound initiative supported by Arts Council England's (ACE) Ambition for Excellence scheme.

It was a very complicated partnership, programme and evaluation. Partners had to be honest, not only identifying successes but also looking at where things hadn't gone to plan. We included it all. Participants, the local and national dance sector, Leeds cultural decision makers, and regional freelance artists all contributed to ensure a really balanced and practical set of perspectives.
There was a lot to say about the achievements, pitfalls and learning along the way. At the same time, we wanted the report to be accessible, making it easy for different readers to find what they needed.

As soon as the report was completed, LDP sent it off to ACE ahead of a follow-up meeting. I rarely expect funders to read evaluation reports, knowing how stretched everyone's workload is. So I was delighted to hear ACE had not only read the report but also fed back their appreciation that "the report was more thorough than we expected - very good, and we welcomed the SWOT which explored the flaws as well as stating the positives."
Those investing in your work really do want to see the learning process not just the good news stories (of course they want to see those too!).

I thought that was the end of the story, but no. I was even more pleased when I received a message out of the blue via LinkedIn from an Organisational Development Consultant now working with Leeds Dance Partnership who said the report had been shared with her and, "I found this such a helpful and insightful piece of work that I wanted to write to say thank you as it has enabled me to engage with LDP faster and in a more informed way than would otherwise have been the case."

LDP has also made the summary and full reports available to anyone via their website here, or you can read on screen / download directly from my own collection here. A variety of other examples of my evaluation reports are also available on the Example Reports page.

So - these are just a couple of examples of what the point of evaluation is. It's a way to reflect, learn and evolve. It's a way to pass the memory of what happened, what worked and what didn't on from one set of people to another, to save time, stop reinventing wheels, and make the most of the resources you have. There are other reasons to do evaluation, do it well, and put it to good use. But making it publicly available and actively sharing it are a couple that really make me feel the work has been worthwhile.


22/7/2020

Understanding Impact


What is Impact?

It's a word used frequently by organisations, academics, practitioners and others looking to prove their worth, but what does it actually mean?

There is no single definition, but at its simplest, it means the significant difference your activity makes.
Examples might be a long-term difference to individuals, a big difference to the locality, an important change in regional or national policy with long-lasting implications, the creation of new jobs, a reduction in poor mental health, stronger visibility for hidden voices and communities, a stronger economy, or a culture change in another group or organisation.

This might be expressed as:
- social impact or social value
- economic impact or return on investment
- academic or research impact

Impact is not:
  • Small, everyday changes or differences. These are outcomes: just as important, but different in scale
  • Monetary, or at least not always
  • Purely statistical. Impact can be measured, but numbers alone may not be impact, and statistics make no sense without the story of how they came about. Outputs are the numbers of things that happened as a result of your work (such as the number of workshops, people involved, or things created) but they don't show how things have changed
  • Purely descriptive. Anecdotes, stories, observations and other descriptions alone need some form of measurement to prove that change has happened
  • Always positive. Negative impact can happen too, and it needs including in impact assessment
  • Easy to demonstrate. Or at least not without some knowledge of how to do so reliably
  • Easy to claim. Impact often happens as a combination of many other experiences and activities as well as yours. It's important to understand how much credit you can take, or not.

Why is it important?

As well as being able to demonstrate the difference you've made as a result of the resource you have, understanding, evaluating and managing your impact will help you:
  • achieve the most you can
  • do it as well as you can
  • make your resources go as far as they can

Understanding impact also helps with morale and job satisfaction, as it helps everyone involved to see tangible results of their work. This is especially important for those who spend more days at their desk than seeing the real difference 'on the ground'.

By having a thorough understanding of what your impact is, what it takes to create it and how it relates to your organisational purpose, you are in a stronger position for making strategic and operational decisions. You will have a much stronger knowledge of what can be improved in your processes, vision and models; what could happen when things are changed; and what it would take to create deeper or broader impact.

As social investment grows as a means of revenue, being able to evaluate and manage your impact will become ever more vital, in order to return and / or grow the investments made.

It also enables you to show how you support the Social Value Act, which not only demonstrates an ethical commitment, but can bring a competitive edge when tendering for commissioned services that receive public funding.

In short, understanding, evaluating and managing impact strengthens your organisation.


9/8/2019

Evaluation: What Do You Want and How Much Does it Cost?

One of the most common questions I'm asked is: can you help us with our evaluation?
My response is always: very probably. What are the parameters you're thinking of regarding timeframes and budget? And what exactly do you want or need?

Often, people don't exactly know. They know evaluation is a good thing, or at the very least that they should be doing / getting some. But sometimes that's all they know. So here are some things to consider when you want to commission some evaluation (or put it out for tender).
  • Typically, evaluation should take around 7% of the cost of your project - give or take. There's no real hard and fast rule, but that's roughly what it takes to get something worthwhile. You can spend more, you can spend less, and this partly depends on the size and duration of the activity.
  • Most medium to large cultural projects tend to cost between £5,000 and £20,000 to evaluate. Smaller ones understandably cost a bit less - again, depending on what you want. (A large agency will cost more than an individual consultant, partnership or small company - there are pros and cons either way. And specialists within the London area tend to cost more than those elsewhere.)
  • You needn't actually spend 7% or £5-20k, but do think about the work taking that much resource. Some of this you might want to allocate to in-house staff so the cost is absorbed back into your budgets. But it's helpful to think of the monetary value to make sure the evaluation is properly allocated, planned in, and given the internal value it needs to be done well.
  • What are you buying - is it capacity or expertise? If it's capacity you might want to outsource the whole thing. So you need to allocate a bigger budget.
  • If it's expertise, an external specialist can guide and mentor someone in the organisation. Your specialist might help set up a framework, templates and methodologies, or they could run some internal training, and then let the team run with it. They might step back in to analyse the data and write your final report. So you can do more for less, money-wise.
  • Discuss costs together. If a specialist hasn't quite 'got' what you're asking for, tell them. They will be happy to adapt their suggestion. Most will give you two or three options with a breakdown of costs to help you focus on what you really do or don't want.
  • If you do know exactly what you have available to spend, tell them and ask them to advise you on how it can be best spent to get what you most need.
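The 7% rule of thumb above boils down to a very simple calculation. A minimal sketch - the function name and the 7% default are my own illustrative choices, not a fixed standard:

```python
# Rough evaluation budget based on the ~7% rule of thumb (illustrative only).

def evaluation_budget(project_cost, rate=0.07):
    """Return a rough evaluation allocation for a given project cost."""
    return project_cost * rate

# For example, a £100,000 project suggests roughly £7,000 for evaluation:
for cost in (100_000, 250_000, 750_000):
    print(f"£{cost:,} project -> roughly £{evaluation_budget(cost):,.0f} for evaluation")
```

Swap the rate for whatever you and your evaluator agree reflects the scale and duration of the activity.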
Two last pieces of advice:
Firstly - if the evaluation is because an external funder expects / requires it, please do be prepared to let an evaluator see your application. They will treat it confidentially, but it is a very quick way for them to give you guidance on exactly what will work best for you. A good specialist won't be pushing a big sale, but they can help you decide which options are going to be best.
Secondly - no matter how much you decide to outsource or not, an evaluator cannot do everything for you. You need a good, consistent, honest working relationship to get the best results possible. The more you put in and own it, the better the relationship and the results will be. Ideally it works as a partnership.

All of this shows how I try and work with organisations wanting evaluation. This is what you can expect from me. You can also just say: "We have £x. We'd like X. Could you do that for us?"


15/1/2016

Ten Top Tips for Evaluation

Photo of a paper tag hanging from a small metal tree. The tag shows a Victorian illustration of a monkey. Under the monkey someone has written two words to describe the museum, which say fantastic and upbeat.
Creative consultation / visitor research activity: The Whitaker Museum & Art Gallery 2015
A few years back, I worked on a three-year contract supporting organisational change in a group of universities that were starting to come to terms with a then brand-new agenda, in which academics and researchers needed to become more outward-facing, connecting with the public on their doorstep and at large. Part of my role was to mentor internal directors and project managers, and the departments they worked with, in looking for the impact of their activities. Like many major programmes, the initiative had a quite intense, technical, formal, robust evaluation system underpinning it. Like many organisations, this was not the fun part of anyone's work and, on top of everything else going on, was not generally what most people were interested in prioritising. In my mentoring role, I wanted to increase people's confidence about carrying out evaluation that was realistic and meaningful, and reduce their fear of becoming overwhelmed. At the same time, some of the community groups involved had been saying their previous experiences of evaluation in university programmes had, at times, been overwhelming, invasive and one-sided.
As a result, I created a simple, practical set of suggestions to make evaluation do-able, useful, positive and meaningful. It simply offered these ten top tips and, five years later, with all the learning about evaluation I've developed since, they still absolutely stand the test of time...
  1. Be Selective: stay focused – keep your aims and objectives in mind. Don’t try and capture everything about everything. 
  2. Impact: always aim to answer these questions: What happened that wouldn’t otherwise have happened? What difference did it make – to us and to our participants, partners or colleagues? What did we learn from this? What do we intend to do next?
  3. Achievability: think about what's realistic and achievable with the resources you have; this may mean you select only one thing to explore in depth, and a few other areas to cover briefly.
  4. Breadth and depth: include quantitative and qualitative information. Quantitative methods lead to facts and figures; qualitative methods capture experiences and personal impacts. 
  5. Varied questioning: use open and closed questions. Closed questions allow you to create statistics to show particular patterns of what’s happening. Open questions mean people can say what’s really important to them. 
  6. Bespoke and appropriate: every project or event is unique. Create methods that work for you and your partners/participants.  Think about what will make it easy for people to complete any evaluation requests and, where appropriate, make it fun and integrate it into the main activity of your work. 
  7. Combined approaches: where possible combine a range of evaluation methods and techniques. This will allow you to cross-reference your findings to develop a clearer understanding of what is happening. 
  8. People-centred: Remember to explain to participants why you are collecting the information and be mindful of protecting private information. 
  9. Analysis: analyse the information you have. Statistics and conclusions are just the start; keep going until you can answer: why is that important? What does it show or prove?
  10. Applying the understanding: use your evaluation. Tell people what you found out and the lessons you have learned. Consider how it will inform future plans and strategies. Reflect on how you might do things differently next time.


31/10/2014

The Future Histories of Manchester Communities

Black and white picture of Ardwick Green. Image courtesy of Manchester Libraries, Information and Archives, Manchester City Council
I've just recently started work on the evaluation of a year-long programme hosted by Manchester Metropolitan University's Institute of Humanities and Social Science Research. Entitled Creating Our Future Histories, the scheme sees 'early career researchers' (usually those who are completing a PhD, or are just about to start or have recently finished one) working with Manchester community organisations. Each partnership is mentored by a more experienced academic. The partnerships are punctuated along the way by a series of weekend workshops, which combine into a professional development course on how community engagement between academics, researchers and communities might take shape. Each partnership is also expected to meet at least once between each workshop.

The partner-groups are developing co-constructed plans and activities which research previously uncharted areas of each organisation's heritage, and look towards incorporating their future in a way which will become part of their heritage in years to come - there's the 'Future Histories' part. Late next spring each group will showcase their findings in creative and public ways - many yet to be decided, though ideas are already circulating about film, video, exhibitions, time capsules and more.

I'm about a month in and I'm once again struck by the many rich and hidden histories of Manchester - industry, architecture, battle and radical action, many, many things which show the inventiveness and resilience of this sometimes bloody-minded and often ingenious city.

You can find out more about the project here and I particularly recommend the research group pages and project blogs to find out more about the organisations involved and the progress and reflections taking place.


23/5/2012

Researching Digital Engagement for School Visits in Museums

Wordle picture of the words contained in a list of case studies compiled on digital learning for school visits in museums
I'm just wrapping up some research for a museum. They asked me to collate case studies of good and innovative practice in how comparable venues (which in this case include medium-large scale museums and galleries) use digital technology to support school visits, in workshops, self-directed studies and potentially back in school.

They also wanted to find out about the ways such activity can be evaluated. They absolutely do not want to have form after form handed to teachers and students, and wondered how else really good evaluation might take place.

The brief contains phrases like blended learning and e-learning. That's problematic because there are no clear definitions of what those are or where they start and end. And it's a real rabbit hole - an entire, massive area of specialisation and expertise.

It's a small piece of work, just skimming the surface to help the museum think in new and different ways about what they might do, and how to monitor its impact well.

I've collated 64 pages, over 32,000 words, of case studies of applications, programmes, projects, reviews and industry expertise opinions on contextual issues such as evaluation, future proofing and general good practice in digital learning and engagement. I've visited more websites, read more conference papers, searched more forums than I can count and interviewed some really insightful and inspiring colleagues in the field.

Eventually, if the museum in question has no objections, I'll upload the collated set of case studies and expertise here for anyone else who might like it. It will be in a very rough and ready format - just my research notes really, in no particular order. But it may be of some help to someone so watch this space...

In the meantime, it seemed much easier to put all 32k+ words into Wordle and see what happened. There it is above - that's what the whole shebang amounts to. It's interesting at this stage that 'online' is so prominent, given that I wasn't specifically looking at just online options. Interesting too that if 'conversation', 'collaboration' or 'participation' are in there, they certainly don't jump out.
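For anyone curious about the mechanics, a word cloud like this is just word frequency underneath: count how often each word appears, then scale each word by its count. A minimal sketch, with made-up sample text and an assumed stopword list:

```python
# Illustrative sketch of the counting behind a word cloud such as Wordle.
from collections import Counter
import re

def word_frequencies(text, stopwords=frozenset()):
    """Lower-case the text, pull out runs of letters, and count each word."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in stopwords)

# Made-up research notes; 'and' is filtered out as a stopword.
notes = "Online learning and online workshops support digital learning online."
print(word_frequencies(notes, stopwords={"and"}).most_common(2))
# -> [('online', 3), ('learning', 2)]
```

The bigger the count, the bigger the word in the cloud - which is why 'online' dominating the image above tells you something real about the 32,000 words behind it.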


14/3/2011

Public Engagement Evaluation Guide

I spent a huge amount of last year working with some amazing academic staff, researchers and community groups as they learned more about one another through creative projects as part of the Manchester Beacon for Public Engagement. My role was (and still is) to help what's happening at practice level link with a rather complex overarching evaluation framework.

The Manchester Beacon programme is, in a nutshell, about encouraging learning institutions to better understand how to open themselves up to communities more effectively. An important part of that process is to trial new approaches and reflect on what works, or what could be improved. It's an action learning model really. 

The Beacon team and I identified that those involved in the practice needed support in being able to report back on their work in ways that fit the evaluation framework. So we set about producing some guidance for them, based on the input of community groups and a pilot cohort of academics and researchers.

Fast forward many months and the evaluation guidance pack / toolkit I created with their help, and the help of other colleagues, is now freely available. It contains some basic principles of evaluation, hints and tips, templates, and examples of creative consultation. 

You can read or download it below; contact me for a copy; or read / download it here.
On this page, you can also find some very short podcasts and top tips from some of the staff and community groups who have used the document. At the end of the pack there are also lots more links to further evaluation guidance in public engagement and also support created specifically for the fields of science communication;  community engagement; arts and heritage; and health and wellbeing.

All thoughts or comments welcome...


12/7/2010

Social Returns on Investment

I'm now part of the network exploring the SROI model of social return on investment. It's a way of putting monetary values on the sorts of evaluation and participation outcomes that occur in the projects I work with, and demonstrating what difference the investment has really made - what happened that couldn't or wouldn't have happened without it.

It's also a way for me to offer organisations the sort of robustness they might expect from an academic research team. 

A lot of it will be new to me, but I'm hoping it will add a layer of technical formality to the likes of MLA's Generic Social Outcomes. Indeed, MLA have already commissioned large-scale pilot evaluation projects testing SROI's capacity for analysing the value of specific library and museum schemes.

Exploring qualitative outcomes, impacts and benefits is something I really enjoy. There are times though, when stakeholders need more quantifiable results and this should enable me to provide that too.

SROI has been developed, and continues to be developed by its members, to support arts and culture, public services, science and education - all fields I work with. It also supports employment and business, environment and climate, and health and care. So I hope this means it will become a tool that is recognised across all organisations.

I should add, though, that for me this will be an additional way of identifying what works and where things can be improved. It won't be the only one. I don't think any single tool can really capture everything that's important about a project. It's important to create a variety of tools bespoke to each project, to cross-reference information and help pick out trends, similarities and patterns, but also individual stories - the illustrations of people who have really felt something change as a result of taking part. But this does offer a way to create evidence of quantifiable outcomes alongside the qualitative benefits I'm always keen to advocate based on evidence.

Interestingly, DEMOS have just released a report suggesting that the approaches underpinning SROI are sound, though a range of shared outcome measurements across the public sector - gathered more simply than SROI allows - is desperately needed to help smaller organisations demonstrate their worth. I can't argue with that logic, but until it arrives, SROI seems to be the closest there is to different sectors speaking the same language.

It's the start of a new learning journey for me, and a welcome addition to what I will be able to offer the organisations I work with.


5/7/2010

Illustrating evaluation

Wordle image of the Early Years Foundation Stage six areas of learning
In evaluation, looking at outcomes is vital. When you've invested in a project of course you'll need to know what was achieved and where there are still gaps. But it can be cumbersome reading mounds and mounds of text in a report. Charts, graphs and percentages can help clarify, but can still make for dull reading (and for some feel too much like maths homework).

To help busy partners to any project, finding a variety of ways to report on activity can make all the difference between people remembering what went well and being able to advocate the value of their work... or not.

I've recently been creating a toolkit for early years practitioners looking at how creative engagement can help achieve new and unexpected results. Its aim is to be quickly digestible and highly practical. The toolkit is based on the activity of ten creative early years projects in schools and Children's Centres in the North of England.

The projects were mapped against the Early Years Foundation Stage six areas of learning. To illustrate which of the EYFS outcomes the projects really brought to life, I used Wordle to create this at-a-glance illustration. The larger the word, the more presence it had in the projects. You can see here that Personal, Social and Emotional Development was the strongest feature across the programme overall.

It doesn't replace the need for writing other information in the report of course. However the teachers, children's centre staff and creative practitioners involved, and readers of the toolkit, can now instantly see where the projects thrived and what kinds of outcomes similar work might expect to achieve, so much more quickly and easily than deciphering a big chunk of writing or trying to analyse a graph.


11/6/2010

Using the Arts with Literacy and Numeracy: evaluation report available

For several years there has been debate about the potential for using the arts to help improve literacy and numeracy (and other subjects). For many arts organisations, finding ways to achieve this has been a necessity for survival. For some this raises discomfort: they feel it's not what the arts are for, and that it risks people losing sight of other benefits that are perhaps more intrinsic to artistic practice. Personally I don't choose one side or the other of the argument; there are truths and benefits (and no doubt pitfalls) either way.

Though I do know this - for children and young people who, for whatever reason, are not as developed as their peers in language and numeracy skills, the arts can present a more accessible way to unpick learning than some other formats. I've seen it happen first hand. I can't say for sure it's specific to the arts - I think it's something about a creative approach generally, and the opportunity to work in different environments and include kinaesthetic activity. All of which is common, but not exclusive, to arts activity.

Last year I was asked to work with the inspirational arts producer Elizabeth Lynch to evaluate Performing for Success, an arts-based project building on the proven achievements of Playing for Success (which used sports to improve young people's numeracy and literacy skills). It was a unique programme in that it met Extended School agendas and relied on partnerships between extended school deliverers experienced in sport, and arts or cultural organisations. There was no national model, however; each pilot area approached the structure in different ways, some more effectively and successfully than others. It was a DCSF-funded initiative, but not via the 'usual' channels (such as Find Your Talent or Creative Partnerships) - it ran through an independent education contractor, Rex Hall Associates.

In the current climate, who can say what will happen to these kinds of initiatives? However, if you'd like to read our response to the programme, you can download it *here*


UPDATE: I have learned that Rex Hall sadly passed away on May 31st. My experience of working with him was brief, but it was inspiring to see first hand the difference one person can make. My thoughts and wishes go to all of those close to him.
