How to Demonstrate Community Centre Impact to Funders

A practical guide to collecting, analysing and presenting impact data that satisfies grant funders, local authorities and trustees.

By Plinth Team

TL;DR: The best impact reports combine hard numbers (attendance, demographics, financial reach) with human stories. Centres that capture data as part of daily operations — using tools like Plinth — can produce funder-ready reports in minutes rather than weeks and are more likely to secure repeat funding.

  • 41% of community centre managers could not provide impact data when a funder requested it (Locality, 2024).
  • Centres with digital reporting tools are significantly more likely to submit grant reports on time.
  • The National Lottery Community Fund explicitly states that evidencing existing impact strengthens new applications.

Who this is for: Community centre managers, trustees, and funding officers who need to report impact to funders.

Why impact reporting matters more than ever

Grant funding for community centres has become more competitive. The National Lottery Community Fund received over 12,000 applications for its Reaching Communities programme in the 2023-24 cycle, with a success rate below 30%. Local authority grants have contracted in real terms since 2010, and trust and foundation funders increasingly require detailed monitoring data before releasing second-year payments.

In this environment, centres that can clearly articulate their impact have a significant advantage — not just for winning new grants, but for retaining existing ones and building relationships with commissioners.

The Charity Commission's CC49 guidance on charities and public benefit also reminds trustees that they should be able to demonstrate the impact of their charity's work. Impact reporting is a governance obligation as well as a fundraising tool.

Key takeaway: impact data is not just for funders — it shapes better decisions, attracts support and demonstrates accountability to the community.

What funders actually want to see

Different funders emphasise different things, but most expect a common core of evidence.

The four pillars of impact evidence

  • Reach: how many people you serve and who they are. Example data: total attendance, unique users, postcode spread, age and ethnicity breakdown.
  • Activities: what you deliver and how often. Example data: number of sessions, types of programme, hours of operation.
  • Outcomes: what changed for participants. Example data: reduced isolation scores, improved wellbeing, skills gained, employment outcomes.
  • Value: how efficiently you use resources. Example data: cost per beneficiary, volunteer hours leveraged, income generated per pound of grant.

Most UK grant reports ask for a combination of these pillars. The most sophisticated funders — including the National Lottery Community Fund, the Esmée Fairbairn Foundation and many local authorities — expect all four.

Key takeaway: structure your data collection around these four pillars from the start, and you will be able to satisfy almost any funder's reporting requirements.

Step 1: Build data collection into daily operations

The biggest mistake centres make is treating impact measurement as a separate activity that happens at reporting time. By then, the data is incomplete, unreliable or simply unavailable.

Instead, embed data collection into the tasks your team already performs.

Attendance and reach data

  • Use a digital check-in system at every session. Plinth captures attendance automatically when participants book or check in, with no separate register needed.
  • Collect basic demographic data (age band, postcode, ethnicity) at first registration. Explain clearly why you collect it and how it will be used — most participants are willing when they understand the purpose.
  • Track unique users, not just total visits. A centre with 5,000 annual visits from 200 unique users tells a very different story from one with 5,000 visits from 2,000 unique users.

NCVO estimates that the average UK community centre serves between 300 and 1,500 unique users per year, depending on size and programming.
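The arithmetic behind the unique-users distinction can be sketched in a few lines of Python. The check-in records below are invented for illustration; the point is simply that deduplicating by participant ID gives a different headline figure than counting rows:

```python
# Hypothetical check-in log: one row per visit, keyed by participant ID.
visits = [
    ("P001", "2024-03-01"), ("P001", "2024-03-08"), ("P002", "2024-03-08"),
    ("P001", "2024-03-15"), ("P003", "2024-03-15"), ("P002", "2024-03-22"),
]

total_visits = len(visits)                                   # every row counts
unique_users = len({pid for pid, _ in visits})               # deduplicate by ID
visits_per_user = total_visits / unique_users

print(f"{total_visits} visits from {unique_users} unique users "
      f"({visits_per_user:.1f} visits per user)")
```

Both numbers belong in a report: total visits show activity levels, while unique users show breadth of reach.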

Activity data

  • Log every session delivered, including room hire and externally led activities. Your booking system should capture this automatically.
  • Categorise activities by type (health, social, educational, creative, employment) to make reporting easier.
  • Record cancellations and the reasons — this data helps you improve programming and demonstrates responsiveness to funders.

Key takeaway: if data collection requires extra effort, it will not happen consistently. Choose tools that capture it as a by-product of operations.

Step 2: Measure outcomes, not just outputs

Outputs are what you deliver (50 yoga sessions, 200 advice appointments). Outcomes are what changes as a result (participants report improved physical health, clients resolve debt problems). Funders care far more about outcomes.

Practical outcome measurement tools

  • Pre- and post-surveys — short questionnaires completed at the start and end of a programme. Use validated scales where possible, such as the Short Warwick-Edinburgh Mental Wellbeing Scale (SWEMWBS) for wellbeing or the ONS4 subjective wellbeing questions.
  • Distance-travelled tools — simple scales (1-10) on relevant dimensions, measured at intervals. These work well for ongoing programmes where there is no clear "end."
  • Case studies — detailed stories of individual journeys, with consent. Three to five strong case studies per year add human depth to quantitative data.
  • Feedback forms — quick post-session ratings. Even a one-question "How did this session help you today?" yields useful data at scale.

The New Economics Foundation (NEF) suggests that outcome measurement should consume no more than 5% of programme delivery time. If your measurement approach is more burdensome than that, simplify it.
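As a sketch of how pre/post or distance-travelled data turns into a reportable figure, the snippet below computes headline statistics from hypothetical 1-10 ratings (the participant IDs and scores are invented, not real SWEMWBS data):

```python
# Hypothetical distance-travelled scores (1-10) before and after a programme.
pre_scores  = {"A": 3, "B": 5, "C": 4, "D": 6}
post_scores = {"A": 6, "B": 7, "C": 4, "D": 8}

changes = [post_scores[p] - pre_scores[p] for p in pre_scores]
improved = sum(1 for change in changes if change > 0)        # participants who moved up
mean_change = sum(changes) / len(changes)

print(f"{improved}/{len(changes)} participants improved; "
      f"mean change {mean_change:+.1f} points")
```

A sentence like "three out of four participants reported improvement, with a mean gain of 1.8 points" is exactly the kind of outcome evidence funders look for.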

Key takeaway: start with two or three simple outcome measures per programme and build from there. Perfection is the enemy of progress.

Step 3: Analyse and present the data

Raw data does not persuade funders. Clear analysis and compelling presentation do.

Structuring a funder report

Most funders provide reporting templates, but when they do not, use this structure:

  1. Executive summary — one paragraph highlighting headline numbers and the most significant outcome.
  2. Reach and activity data — tables and charts showing attendance, demographics and programme delivery against targets.
  3. Outcome evidence — survey results, distance-travelled data and case studies. Link outcomes to the funder's stated priorities.
  4. Financial summary — how the grant was spent, cost per beneficiary and any leveraged income (e.g., room-hire income generated as a result of the funded programme).
  5. Learning and adaptation — what you changed in response to data or feedback. Funders value honesty and responsiveness more than perfection.
  6. Plans and sustainability — how the work will continue beyond the grant period.

Plinth generates sections 1-4 automatically from live data, allowing managers to focus their time on the narrative sections (5 and 6) that require human insight.
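The financial summary in section 4 reduces to simple arithmetic. A minimal sketch, using invented figures:

```python
# All numbers are hypothetical, for illustration only.
grant_spent = 25_000           # pounds spent from the grant
unique_beneficiaries = 480     # unique users of the funded programme
leveraged_income = 6_200       # e.g. room-hire income attributable to the programme

cost_per_beneficiary = grant_spent / unique_beneficiaries
leverage_per_pound = leveraged_income / grant_spent

print(f"Cost per beneficiary: £{cost_per_beneficiary:.2f}")
print(f"Leveraged income per grant pound: £{leverage_per_pound:.2f}")
```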

Visualisation tips

  • Use bar charts for comparisons (attendance by month, demographics by age band).
  • Use line charts for trends over time (cumulative reach, wellbeing scores across sessions).
  • Use pie charts sparingly — they work for simple proportions (income sources) but are hard to read for complex data.
  • Include one or two photographs with consent — they bring the report to life.

According to a 2023 study by the Institute for Voluntary Action Research (IVAR), funders spend an average of 12 minutes reading a monitoring report. Lead with the most compelling data and keep the narrative concise.

Key takeaway: a clear, well-structured report with strong visuals communicates impact far more effectively than a dense spreadsheet.

Step 4: Use impact data beyond funder reports

Impact data has value far beyond satisfying grant conditions.

  • Trustee reporting — quarterly impact dashboards keep the board informed and engaged. Trustees who understand the centre's impact are better advocates.
  • Marketing and communications — share headline stats and case studies on social media, your website and local press. "Last year, 1,200 local residents used our centre" is a powerful message.
  • Business planning — data on which programmes are most popular, which demographics are under-served and which activities generate the best return on investment directly informs strategic decisions.
  • Staff and volunteer motivation — people who can see the impact of their work are more engaged and more likely to stay.
  • Commissioning and partnerships — local authorities and health commissioners increasingly use evidence of community-level impact when designing services. A centre with strong data is a more attractive partner.

Public health research shows that community-based interventions can deliver a social return on investment (SROI) of £1.20 to £8.00 for every pound invested, depending on the programme. Being able to quantify this return makes your centre a compelling proposition for commissioners.
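An SROI ratio is ultimately the total monetised social value divided by the investment. The sketch below uses invented outcome valuations; a real SROI study would evidence, deadweight-adjust and discount each one:

```python
# Hypothetical SROI calculation: social value per pound invested.
investment = 40_000  # grant plus in-kind costs, in pounds

# Monetised outcome values (assumed figures for illustration)
outcome_values = {
    "reduced GP visits": 18_000,
    "volunteer hours leveraged": 22_000,
    "improved employment outcomes": 30_000,
}

sroi = sum(outcome_values.values()) / investment
print(f"SROI: £{sroi:.2f} of social value per £1 invested")
```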

Key takeaway: impact data is a strategic asset, not a compliance burden. Use it everywhere.

Common impact reporting frameworks

Several established frameworks can guide your approach.

  • Theory of Change: best for mapping how activities lead to outcomes. Complexity: medium (requires an initial workshop).
  • Logic Model: best for a simple input-output-outcome chain. Complexity: low (suitable for most centres).
  • SROI (Social Return on Investment): best for quantifying the financial value of social outcomes. Complexity: high (often requires external support).
  • Outcomes Star: best for individual-level distance travelled. Complexity: medium (a licensed tool with training).
  • ONS4 Wellbeing Questions: best for population-level subjective wellbeing. Complexity: low (four standard questions).

For most community centres, a simple logic model combined with ONS4 wellbeing questions and regular attendance data provides a proportionate and credible evidence base.

Key takeaway: choose a framework that matches your capacity. A simple approach applied consistently is better than a complex one applied sporadically.

Mistakes to avoid

  • Collecting data you never use — every data point should serve a purpose. If you cannot explain why you collect it, stop.
  • Waiting until reporting time to start — by then, the data is incomplete. Embed collection into daily operations from day one.
  • Reporting only what went well — funders value honesty. Describing what did not work and how you responded demonstrates learning and maturity.
  • Ignoring qualitative data — numbers tell part of the story; quotes and case studies tell the rest. The best reports weave both together.
  • Over-claiming causation — be careful with language. "Participants reported improved wellbeing" is defensible. "Our yoga class cured depression" is not.

Key takeaway: honest, proportionate reporting builds trust. Over-claiming undermines it.

FAQs

What if we don't have baseline data?

Start collecting it now. For existing programmes, you can do a retrospective baseline by asking participants to recall their situation before they started attending. This is imperfect but better than nothing. For new programmes, build baseline measurement into the first session.

How do we get consent for case studies?

Use a simple written consent form that explains how the story will be used, who will see it and how the person can withdraw consent. Offer anonymity as an option. Most participants are happy to share their story when asked respectfully.

Do we need specialist evaluation software?

Not necessarily. Plinth handles attendance, demographics and outcome surveys as part of its core platform. For more complex evaluations (such as full SROI analysis), you may want to engage an external evaluator.

How often should we report to trustees?

Quarterly is standard practice. Provide a one-page dashboard with headline attendance, income and outcome data, plus a narrative summary of highlights and challenges.

Can small centres produce meaningful impact data?

Absolutely. A small centre serving 300 people per year can produce compelling evidence with basic attendance tracking, a simple feedback form and two or three case studies. Scale the approach to your capacity.

What tools does Plinth provide for impact reporting?

Plinth automatically generates attendance reports, demographic breakdowns and outcome summaries from data captured during normal operations. Dashboards are funder-ready and can be exported as PDFs or shared via secure links.
