
Impact measurement tools: using Government and other nationally developed questions in your surveys

Becky Nixon • Aug 18, 2023

Why and how to use resources and evidence from the Office for National Statistics and other verified sources to improve your impact measurement and needs analysis, including a download of suggested indicators

Research and evidence have always been important to how I work, both for my own interest and to improve the activities of the people and organisations I work with. For example, after undertaking an impact assessment for a project working around loneliness I wanted to dig deeper into the research about what affects loneliness and isolation and what is most likely to work in service design. That deeper knowledge base adds extra value to my work with clients.


This blog gets into the weeds a bit about some of the practical issues you might encounter when using these types of indicators - you may not be interested in all of it, but the sections are titled so that you can pick out what you want. A pdf of suggested indicators is available in the link below. One of the main factors is making the indicators fit with the outcomes of your programme and the outcomes that people accessing your service want (so consult them on what's best!) - if they are meaningfully linked in with assessment and review for people, they are most likely to be implemented usefully and correctly.


I was interviewed earlier in the year by the Campaign to End Loneliness for a Department for Culture, Media & Sport report on the Evaluation of interventions to tackle loneliness. It discusses some of the issues in this blog but, above all, recommends that more support is provided to the sector to understand, select and implement the various measures.


I don't want to read the blog, can I just download the indicators?

The value of adding standard survey questions

When developing impact frameworks and tools I generally encourage organisations to use indicators that have already been developed by other organisations. The main advantages of this are:


  • They have already been robustly tested with a wide audience to ensure that the questions are understandable, valid - i.e. they measure what they are supposed to, with minimal overlap between different questions in the same instrument - and reliable - i.e. they give the same answer in the same situation over time. (As an example, lack of reliability is a common criticism of the Myers-Briggs Type Indicator, which people may be familiar with. Whilst it can be really useful in some situations to help people understand relationships and dynamics in teams - I have run a workshop deconstructing it - categorising people into 16 groups may not be so useful. People may be familiar with taking it on different occasions and getting different results, although research shows it to be more reliable than people often believe.) Note that not all indicators will have been developed to measure before-and-after change as opposed to taking a snapshot at a particular time, but they are commonly used in this way by a variety of organisations.


  • There are robust benchmarks to compare against - for example the Community Life Survey includes the Office for National Statistics' four wellbeing questions and the single loneliness question. There is also a Government dashboard of various questions and current measurements. With any of these, the data changes from year to year, so you should always check for the most up-to-date figures. You can use this data right at the start of any project to demonstrate that your clients have particular needs, with scores below national averages (a rough illustration of this kind of comparison follows below).
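
As a rough illustration of benchmarking, here is a minimal sketch in Python using entirely hypothetical figures - both the response scores and the national average are made up for the example, so always look up the latest published ONS or Community Life Survey figure rather than relying on a hard-coded number.

```python
# Minimal sketch with hypothetical data: comparing a project's baseline scores
# on one ONS wellbeing question against an assumed national average.
from statistics import mean

# Hypothetical baseline answers to "Overall, how satisfied are you with your
# life nowadays?" (0 = not at all, 10 = completely), collected at project start.
baseline_scores = [4, 5, 3, 6, 5, 4, 2, 5, 6, 3]

# Illustrative benchmark only - check the latest published figure each year.
national_average = 7.5

project_average = mean(baseline_scores)
print(f"Project baseline average: {project_average:.1f}")
print(f"Assumed national average: {national_average:.1f}")
print(f"Gap below the benchmark: {national_average - project_average:.1f} points")
```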


The Government is currently reviewing the national wellbeing indicators, which may change existing indicators and/or add new ones.


Which questions do I recommend to clients most frequently?

I often mix and match questions from different places. Many instruments are designed to be used as a whole rather than in this way (although not the questions I'm recommending below) - but again, for the purposes that most charities need them for, this is not likely to be a problem. I most commonly recommend the ONS four wellbeing questions and the single loneliness question from the Government's loneliness strategy. I also recommend questions from the Community Life Survey. These choices are partly because of the benchmarks, as above, but also because they are questions used with the general public, whereas some questionnaires can feel as though they are pathologising people and are only used where a deficit is perceived.


Like many charities, I used to recommend the short (7-item) or longer (14-item) Warwick-Edinburgh Mental Wellbeing Scale, but found some of the indicators problematic: asking older people who are aware they might be in the last years of their life whether they feel optimistic about the future seems insensitive; people can struggle to understand what feeling useful means (and in any case it seems to espouse a particular type of Protestant work ethic - why should someone feel as though they need to be useful?! Perhaps not a priority when you're coping with poor health or poverty); and feeling close to other people could feel wrong for women experiencing domestic abuse or coercive control, when not feeling close to someone abusive is actually the aim. There are also issues with licensing that many organisations are unaware of, as is the case with other academic or commercial surveys.

Common issues and problems you might face and what you can do about them

Choosing questions with different measurement scales

Strictly speaking, the questions and answers should keep the same phrasing as the version you are copying from. Sometimes, though, you might decide that questions from different places are perfect for your particular project but the answer options are phrased differently: one asks people to rate their answers from 0-10, another asks them to agree or disagree with a statement, and a further one has five answers from never to always. This can be confusing for people (although on the other hand it does make them read the questions properly rather than just answering "slightly disagree" to everything).


As you are probably not conducting academic research, making slight changes for consistency is one solution. It does mean that it may be more difficult to benchmark your answers (though not impossible to produce an approximation, with a note about your methodology), but it also means that you are asking the questions you think matter. An example of doing this is VASL's Community Champions project, where we chose five key questions. For three we kept the original answers: we felt it was important that the single loneliness question stayed as it was, as the project is focused around loneliness, and another of the indicators we felt didn't fit well with the 0-10 format. We adapted the other two questions so that the scale (0-10) was the same as for the ONS wellbeing happiness question. For the "I am content with my friendships and relationships" question from the Harvard Flourishing scale this was a minimal change, renaming the 0-10 endpoints from "strongly disagree"/"strongly agree" to "not at all"/"completely". For "I have enough people I feel comfortable asking for help at any time" this was a larger change, as the original scale was purely text based with five points from strongly disagree to strongly agree. These questions are below.
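
If you do combine questions whose answer scales differ, a small script can at least keep the conversion consistent when you analyse responses. Here is a minimal sketch, assuming an illustrative mapping of a five-point agreement scale onto 0-10 - this is an approximation for internal analysis, not a validated equivalence, so note any rescaling in your methodology.

```python
# Minimal sketch: mapping a five-point "strongly disagree" to "strongly agree"
# scale onto 0-10 so answers can sit alongside the ONS 0-10 wellbeing questions.
# The numeric values are one illustrative choice, not a validated equivalence.
FIVE_POINT_TO_TEN = {
    "strongly disagree": 0.0,
    "disagree": 2.5,
    "neither agree nor disagree": 5.0,
    "agree": 7.5,
    "strongly agree": 10.0,
}

def rescale(answer: str) -> float:
    """Convert a five-point text answer to an approximate 0-10 score."""
    return FIVE_POINT_TO_TEN[answer.strip().lower()]

responses = ["Agree", "Strongly agree", "Neither agree nor disagree"]
print([rescale(r) for r in responses])  # [7.5, 10.0, 5.0]
```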

Not being able to get a consistent before and after

The idea behind these scales is taking two or more measures at different times to measure the change between them. If you're undertaking ongoing keywork with someone it can be relatively easy to get before, during and after measurements of distance travelled, and this can be done as part of a broader review process that helps people reflect on what has changed for them - in this way it can be meaningful for the person surveyed and not just something they do predominantly for the benefit of the organisation. Note, though, that people often find endings difficult and may not turn up for a final session, so it's good to do the scale before this.


However, sometimes getting two or more measures is difficult. An example from my experience has been measuring changes amongst volunteers, some of whom complete surveys post-training and some of whom complete annual surveys, but who don't necessarily complete both - and it can be fiddly and time consuming to link up respondents unless you have a database set up for this purpose. Another example is Growing Together, a community centre in Northampton, where people may drop in repeatedly but on a casual basis, so being consistent about surveying people at different times can be difficult.


It may be possible to be more organised about getting people to fill things in - for example getting volunteers to complete a feedback survey before they can start volunteering. This can be combined with a general feedback questionnaire about their experience of recruitment and induction, and you can make sure that not all questions are compulsory, so that whilst you're getting them to look at it, you're not compelling them to answer every question - which doesn't quite fit with the spirit of volunteering!


Another alternative is to adapt questions so that you are only asking them at the end of a time period and not doing a before measure - you can use this at an event, or choose a week/two weeks/a month in which you'll focus on asking everyone accessing the service the same questions. Here is an example of a form designed with Growing Together.


Using questions repeatedly with the same people and not seeing movement after the initial period

Often the aim of projects is to work with people for a specific length of time and then for them to move on to other things. However, there are some situations in which people are always going to need involvement, such as frail older people or people with long-term conditions, disabilities or circumstances. In my experience, distance travelled measures can show a lot of progress initially as the person comes in needing more intense support, but then as things improve and the relationship with the service becomes less intense, or is about maintaining changes rather than improving things, measures can plateau. There are a few possibilities here. You could have a different measure for the first year (or whatever intervention period you are using) that uses distance travelled and then swap to something else after that. You could continue measuring distance travelled for everyone but analyse it separately for people who have been involved for different lengths of time (a rough sketch of this follows below). Or you can switch to a single measure rather than distance travelled, as above.
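
To make the idea of analysing by length of involvement concrete, here is a minimal sketch using entirely hypothetical 0-10 scores. It assumes the pandas library, and the cohort boundaries are illustrative - the point is simply that early gains and later plateaus are reported separately rather than averaged together.

```python
# Minimal sketch with hypothetical data: looking at change scores separately
# for people who have been involved for different lengths of time.
import pandas as pd

records = pd.DataFrame({
    "person": ["A", "B", "C", "D", "E", "F"],
    "months_involved": [3, 4, 8, 14, 20, 26],
    "first_score": [3, 4, 5, 6, 7, 7],    # e.g. 0-10 score at first review
    "latest_score": [6, 6, 7, 7, 7, 7],   # score at most recent review
})

records["change"] = records["latest_score"] - records["first_score"]

# Illustrative cohort boundaries - adjust to match your intervention period.
records["cohort"] = pd.cut(
    records["months_involved"],
    bins=[0, 12, 24, 120],
    labels=["first year", "second year", "longer term"],
)
print(records.groupby("cohort", observed=True)["change"].mean())
```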


Stopping things from getting worse but not making them better - not knowing the counterfactual - what would have happened otherwise

This is a similar problem to the one above.  What would have happened anyway without the intervention is often called the "counterfactual" and links in with attribution - what your service can claim in relation to the changes and what's caused by something else, for example a health condition that would slowly improve anyway or a change in the jobs market making it easier for people to find work. 


Sometimes all charities can do is try to prevent or delay things getting worse, for example with progressive health conditions, but using before and after measures is then likely to show "backwards" movement. In randomised controlled trials, often called the "gold standard" of research (although there are many reasons why they are not always useful!), the study population is randomised into two (or more) groups, one of which does not receive any intervention at all, to see what happens to them in comparison with people who have received the intervention(s). However, this is not really ethical in day-to-day charity services! One solution is not to use distance travelled, but instead to ask questions at specific times, as in the Growing Together example. If you do want to use distance travelled, then think carefully about the questions that are asked and where you might feel more positive change is possible - for example people feeling less stressed about their situations, having more people to talk to, or having greater connections with others in similar situations.


Testing and validation of questions has often only been done in English

One downside is that these questions have often only been tested in English rather than in other languages. This is a problem we ran into when running a Money Advice Service What Works? project - we recruited people who were, apparently, too financially excluded because of language barriers for robust measurement to be possible, since the surveys were only validated in English. The Money Advice Service weren't able to help us through this and didn't want to change the methodology, so the opportunity to find out more from a population that is rarely heard from (including in this particular programme) was lost. It was also an issue that the National Lottery Community Fund ran into with its Ageing Better programme, particularly in Leicester, where we put together a project with a huge focus on areas of the city with a large South Asian population. However, in both of these instances there was a need for more academic rigour than is needed for most voluntary sector impact measurement, so translating or interpreting questions into other languages should be fine, although you might want to pilot them first with speakers of those languages to check. In any case, piloting any questionnaire is generally good practice.


Scales not being sensitive enough to measure change

This is a problem that we found with the UCLA loneliness scale, also recommended in the Government loneliness strategy. With only three answer points to the questions (hardly ever or never; some of the time; often), it wasn't picking up enough change, which is why I tend not to recommend it. If you do like the statements in it (How often do you feel that you lack companionship? How often do you feel left out? How often do you feel isolated from others?) then, as above, you could change the answers - making them match the five-point answers in the single loneliness question recommended in the same strategy if you are using that, or using the 0-10 rating of the ONS four wellbeing questions.


Knowing what's changed but not why, and what can be attributed to your service?

Scales and measures have their place, and they will tell you that a change has been made, but not necessarily why - for that you need to ask more open-ended questions about what has helped and hindered. Good questions are outside the scope of this blog, but possibly a subject for a further one if it would be useful.


Attribution is another issue, as above: what might have changed anyway that is nothing to do with you? You can quantify this by asking, for example, what percentage of the change people would attribute to the organisation, as well as asking questions about what else has made a difference (a rough sketch of using such a percentage follows below).
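
As a small illustration, here is a minimal sketch, with made-up names, scores and percentages, of weighting each person's reported change by the percentage they attribute to the organisation. This is just one simple way of presenting attribution, not a formal method.

```python
# Minimal sketch with hypothetical figures: weighting each person's reported
# change by the percentage of that change they attribute to the organisation.
people = [
    {"name": "A", "change": 4, "percent_attributed": 75},
    {"name": "B", "change": 2, "percent_attributed": 50},
    {"name": "C", "change": 3, "percent_attributed": 100},
]

for p in people:
    attributed = p["change"] * p["percent_attributed"] / 100
    print(f"{p['name']}: {attributed:.1f} of {p['change']} points attributed to the service")

total_change = sum(p["change"] for p in people)
total_attributed = sum(p["change"] * p["percent_attributed"] / 100 for p in people)
print(f"Overall: {total_attributed:.1f} of {total_change} points attributed")
```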


How to implement questionnaires

Also a complete topic in itself!  But here are some brief suggestions:

  • Provide paper and online versions of questionnaires to meet different needs.
  • Make sure any online survey is functional on a mobile phone, as this may be the most common way that people complete it.  Provide tablets if possible if people need them.
  • Don't ask too many questions, just those that are most crucial.  Having multiple choice questions can reduce the length of time it takes to get information and make analysis quicker, but you can lose detail with only multiple choice, so carefully focused open-ended questions are also helpful.
  • I currently use Microsoft Forms, which is "free" (at least if you have a Microsoft subscription), or Google Forms is another free alternative. There is other software such as SurveyMonkey, which has greater functionality for analysis, but if you go over a certain number of questions or responses there is a significant charge, which may not be value for money if you are not using it much.
  • Where there is capacity it can be helpful to have workers or volunteers go through the scales with people.  As well as the obvious advantage of motivating people to complete them, other people can often see changes that people accessing services may forget.  There are issues here about impartiality and conflict of interest so people need to be properly trained, supported and monitored to implement questionnaires effectively.
  • Make it clear why you're asking the questions and what benefit it might have to people accessing the service (and people who might access the service in the future), and to the organisation. Be clear about what you're doing with the information, and if possible feed back the results and what you are doing to the people who have taken their time to fill it in.
  • Be aware of any questions that may be distressing to people, which questions about mental wellbeing or loneliness might be, and consider how to support them.  Academic surveys will have a bit at the beginning about possible detriment to respondents as part of their ethical processes, and may direct people to other sources of support.  For charities, this is more likely to be support provided by the charity itself.

How Ideas to Impact can help

Support is available from a Power Hour to give people a sounding board and some pointers about what you can do to embed impact assessment into your work, through to working alongside organisations longer term as a learning partner, helping to devise an impact / evaluation framework and supporting data collection, analysis and reporting. Providing an evidence briefing for service design and funding is another option for people who don't have time or access to academic journals to do the research themselves.

Other related Ideas to Impact blogs

Other wellbeing indicators resources

Please let me know if there are other links that you have found useful.


What Works Wellbeing measures bank

Money and Pensions Service financial wellbeing indicators

Impactasaurus


Is there any other good practice that people would like to suggest based on what's worked for you, or any topics that it would be useful to have further information about?  Let me know!
