Why Impact Reporting Improves Donor Confidence
The evidence that transparent, specific impact reporting increases donor retention, gift size, and trust — and what good reporting looks like for donors, funders, and the public.
Clear, honest impact reporting demonstrably increases donor confidence — and the evidence for this link is growing stronger. Donors who receive specific, credible information about what their gift achieved are more likely to give again, to give more, and to advocate for an organisation among their networks. Donors who do not receive that information gradually disengage, often without explaining why.
This is not merely theoretical. The Charities Aid Foundation's 2024 UK Giving Report found that the number of people donating to charity has fallen to the lowest level since CAF began tracking in 2016, with only 50% of respondents reporting that they had donated in the previous 12 months (CAF, 2024). In a contracting giving environment, the charities that retain donors most effectively are those that make donors feel their contribution mattered — and specific impact reporting is one of the most direct tools for doing that.
This guide examines the evidence for the link between impact reporting and donor confidence, explains what good reporting looks like for different audiences, and sets out practical steps for charities to improve their reporting without overwhelming small teams.
The Trust Gap in UK Charitable Giving
The UK charitable giving landscape faces a structural trust challenge. While those who do give are giving more generously — CAF found that active donors gave an average of £72 per month in 2024 — the pool of donors is shrinking. CAF chief executive Neil Heslop has noted that "we are relying on an ever-smaller group of people to give while the challenging economic environment continues to place significant strain on charities" (CAF, 2024).
Research from Give.org, which surveys giving attitudes across the English-speaking world, has found that a substantial majority of respondents cite lack of clarity about how funds are used as a major barrier to giving. Among the top factors that build trust in a charity, Give.org research shows that respondents most frequently cite evidence of accomplishments shared by the organisation, followed by evidence that the charity makes a real difference for the people and communities it serves (Give.org).
Both of these trust-building factors depend on effective impact reporting. Donors cannot assess how funds are used if they are not told; they cannot verify that a charity makes a real difference without evidence. The trust gap is, in large part, an information gap — and impact reporting is the primary mechanism for closing it.
What the Evidence Shows on Donor Retention
The Fundraising Effectiveness Project (FEP), which analyses giving patterns across thousands of organisations, found that the overall donor retention rate in 2023 was 57.1% — meaning that roughly four in ten donors who gave in 2022 did not give again in 2023 (FEP, 2024). For new donors, the picture is considerably worse: only 28% of first-time donors in 2023 made a second gift (FEP, 2024).
The business case for improving retention is straightforward. Acquiring a new donor typically costs five to ten times more than retaining an existing one. Even modest improvements in retention rate translate directly into reduced fundraising costs and more stable income. A charity that raises its new-donor retention from 28% to 35% — a seven-percentage-point improvement — has materially changed its income trajectory for years to come.
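The arithmetic behind that claim can be sketched directly. The short Python example below is illustrative only: the acquisition and retention costs, and the cohort of 1,000 donors, are hypothetical assumptions chosen for the sketch, not figures from the cited research.

```python
# A rough sketch of the retention arithmetic above. All figures here are
# hypothetical assumptions for illustration, not values from the cited research.
ACQUISITION_COST = 50.0  # assumed cost to recruit one new donor (GBP)
RETENTION_COST = 7.0     # assumed cost to keep one existing donor (GBP)

def second_year_cost(new_donors: int, retention_rate: float) -> float:
    """Cost of entering year two with the same number of donors:
    replace everyone who lapsed, retain everyone who stayed."""
    lost = new_donors * (1 - retention_rate)
    kept = new_donors * retention_rate
    return lost * ACQUISITION_COST + kept * RETENTION_COST

saving = second_year_cost(1000, 0.28) - second_year_cost(1000, 0.35)
print(f"Raising retention from 28% to 35% saves about £{saving:,.0f} per 1,000 new donors")
```

On these assumed costs, the seven-point improvement saves roughly £3,000 per 1,000 new donors every year — before counting the extra lifetime value of the donors who stayed.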
Impact reporting is one of the most cost-effective levers for improving retention. Research cited in practitioner guidance suggests that donor renewal rates can improve significantly after introducing clearer impact reporting formats, particularly where donors receive specific, causal evidence connecting their gift to outcomes — rather than generic statements about the charity's work.
Why Generic Reporting Fails Donors
Most charity annual reports contain impact information, but most impact information in annual reports fails donors. The most common failure modes are:
Vagueness over specificity. "We helped thousands of people last year" tells a donor almost nothing compared with "2,847 people attending our employment service in 2024–25 progressed into paid work within six months." Numbers need context; context needs specificity.
Outputs masquerading as outcomes. Reporting the number of sessions delivered, leaflets distributed, or people who attended a workshop is not impact reporting — it is activity reporting. Donors want to know what changed for participants as a result of the charity's work, not just what the charity did.
Anecdote without evidence. A single case study, presented without supporting data, tells a donor nothing about whether the charity's work routinely produces similar results or whether the case study was exceptional. Stories need to be placed in context by data; data needs to be made human by stories. As research on donor engagement has consistently found, quantitative data proves scale while qualitative stories prove significance — donors need both.
No financial transparency. Donors who cannot understand what proportion of their gift is spent on programmes versus administration are left to assume the worst. Publishing a simple breakdown of expenditure — ideally alongside evidence of the value generated — is essential for trust.
One-size-fits-all communication. An individual who gave £25 online has different information needs from a trust that made a £50,000 multi-year grant. Sending both the same annual report PDF almost certainly fails at least one of them.
What Good Impact Reporting Looks Like for Individual Donors
Individual donors range from small one-off givers to major donors giving tens of thousands of pounds per year. Reporting should be proportionate to gift size and relationship depth.
For most individual donors, good impact reporting includes:
- A direct connection between their gift and an outcome. "Your donation of £30 funded three sessions of our after-school reading programme" is more motivating than "our reading programme helped 400 children this year." Individualisation drives emotional connection.
- One or two specific outcome statistics. Not a wall of numbers, but one or two carefully chosen metrics that demonstrate the scale and significance of the charity's work.
- A brief story from a beneficiary. With appropriate consent and anonymisation, a short narrative about one person whose life changed as a result of the programme makes abstract statistics concrete.
- Honest acknowledgement of challenges. Donors who receive only positive information become sceptical. A brief, professional acknowledgement of what proved harder than expected and what the charity learned from it builds credibility.
- A clear ask or invitation. Every piece of donor communication should close with a natural next step — a renewal, an upgrade, a volunteering opportunity — that maintains the relationship.
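The first item in the list above — connecting a specific gift to a specific outcome — is straightforward to automate. The sketch below is a minimal illustration: the £10 unit cost, the function name, and the message wording are invented assumptions, not part of any real platform.

```python
# A minimal sketch of the personalisation idea in the list above.
# The unit cost and message wording are invented for illustration.
SESSION_COST = 10  # assumed cost of one reading session, in GBP

def personalised_impact_line(first_name: str, gift_gbp: float) -> str:
    """Translate a gift amount into a concrete, individual outcome statement."""
    sessions = int(gift_gbp // SESSION_COST)
    return (f"{first_name}, your donation of £{gift_gbp:.0f} funded "
            f"{sessions} sessions of our after-school reading programme.")

print(personalised_impact_line("Asha", 30))
```

Even a simple mail-merge rule like this turns a generic thank-you into the individualised statement that drives emotional connection.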
What Good Impact Reporting Looks Like for Institutional Funders
Institutional funders — grant-making trusts, foundations, statutory bodies — have different information needs from individual donors. They are typically:
- Accountable to their own boards and stakeholders for the quality of grants they make.
- Required to demonstrate that their portfolio is achieving the outcomes their strategy commits to.
- Comparing performance across multiple grantees and potentially multiple funders co-funding the same work.
Good impact reporting for institutional funders therefore requires:
- Alignment with agreed outcomes. Reports should directly address the outcomes and indicators agreed at grant award stage, not generic organisational achievements. See How to Standardise Impact Reporting Across Programmes for how funders can design consistent frameworks.
- Honest variance reporting. Where planned outcomes are not being achieved, a funder needs to know — and why. A grantee who reports 60% of planned outcomes achieved alongside a clear explanation of the shortfall and a mitigation plan is more credible than one who presents 100% achievement against indicators that have been quietly adjusted.
- Financial compliance reporting. Confirmation that funds have been spent as agreed, with a summary of variance, is non-negotiable for institutional funders.
- Evidence quality. Not just what outcomes were achieved, but how the charity knows. Were beneficiaries surveyed? At what point? Against what baseline? Funders increasingly distinguish between high-quality evidence (pre-post surveys with validated tools) and low-quality evidence (retrospective self-report without baselines).
What Good Impact Reporting Looks Like for the Public
Public transparency serves multiple purposes: it builds sector-wide trust, it enables informed charitable choice, and it satisfies the regulatory expectation that charities demonstrate accountability to the public benefit test. The Charity Commission Business Plan 2024–25 reinforces the Commission's expectation that charities are "transparent and accountable" (Charity Commission, 2024).
For public audiences, good impact reporting includes:
- A clear, jargon-free summary of what the charity does and who it reaches. Not a mission statement, but a concrete description of activities and the populations they serve.
- Consistent year-on-year comparisons. Isolated annual figures tell the public little; trends over three to five years show whether the charity is growing its impact or stagnating.
- Publicly available financial information. The Charity Commission's register makes accounts available for all registered charities, but charities can go further by publishing a clear, accessible summary of income, expenditure, and reserves alongside their impact report.
- Honest discussion of challenges and uncertainty. Public trust in charities depends partly on perceived honesty. Organisations that only report success invite scepticism; those that acknowledge complexity earn credibility.
The SORP 2026 Context
The revised Charities SORP, effective from January 2026, strengthens expectations around impact reporting for all UK charities. The new framework explicitly requires that charities explain not only what they did but "what changed as a result," with progressively greater detail required at higher income levels (ICAEW, 2025). For Tier 2 charities (income £500,000–£15 million), this means an explanation of impact on individual beneficiaries and on society more broadly, alongside a summary of indicators used and outputs achieved.
For charities that have historically relied on activity reporting, SORP 2026 represents a significant shift — and for funders designing reporting requirements for their grantees, it provides a minimum standard below which requirements should not fall for larger organisations.
Common Reporting Mistakes and How to Fix Them
| Mistake | Why it undermines confidence | Better approach |
|---|---|---|
| Reporting activities instead of outcomes | Donors cannot assess what changed; feels like output-padding | Report specifically on what was different for beneficiaries before and after |
| Using only percentages without absolute numbers | "90% of participants improved" means nothing without knowing the base | Combine percentages with raw numbers: "90% of 312 participants (281 people)" |
| Annual-only reporting | Donors disengage between annual reports | Supplement annual reporting with brief in-year updates (email, social) |
| Only reporting successes | Undermines credibility; donors sense airbrushing | Include one honest challenge and the organisation's response |
| Same format for all donors | Fails both major donors and small donors | Segment by giving level and relationship; tailor content and format accordingly |
| Jargon-heavy language | Alienates general public and non-specialist donors | Test language with non-sector readers; aim for plain English throughout |
| No call to action | Leaves donor with no next step | Always close with a specific, appropriate ask |
How Technology Supports Better Impact Reporting
Historically, producing high-quality impact reports required substantial manual effort: collecting data from multiple sources, reconciling different formats, and writing and designing a polished document. For small charities, this burden often means that impact reporting is either skipped or produced too infrequently to drive donor engagement effectively.
Modern grant management and impact measurement platforms reduce this burden significantly. Plinth, for example, enables charities to collect impact data from beneficiaries and services throughout the year through structured forms, and to compile that data automatically into funder reports. This means the raw material for a compelling impact narrative — outcome data, participation numbers, demographic breakdowns — is available at any time rather than requiring a data-gathering sprint before the annual report deadline. Explore how Plinth supports impact data collection at Plinth's AI grant management page.
For charities producing impact reports that need to satisfy multiple audiences simultaneously — institutional funders, individual donors, and the public — the key is to build a single authoritative data set and then extract different versions of the story for different audiences. A single beneficiary survey dataset can power a page in an annual report, a paragraph in a major donor thank-you letter, a statistic in a social media post, and a section in a funder's monitoring report — if the data is structured and accessible.
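The "single dataset, many audiences" approach can be sketched in a few lines. In the example below, the survey records, field names, and wording are invented for illustration; the point is that one structured dataset yields both the headline figure (with percentages and absolute numbers together, as recommended above) and the segmented breakdown a funder would expect.

```python
# A sketch of the "single dataset, many audiences" approach described above.
# The survey records and field names are invented for illustration.
from collections import Counter

survey = [
    {"id": 1, "in_work_after_6m": True,  "age_band": "18-24"},
    {"id": 2, "in_work_after_6m": False, "age_band": "25-34"},
    {"id": 3, "in_work_after_6m": True,  "age_band": "18-24"},
    {"id": 4, "in_work_after_6m": True,  "age_band": "35-44"},
]

def headline(records) -> str:
    """Annual-report version: a percentage plus the absolute numbers behind it."""
    n = len(records)
    k = sum(r["in_work_after_6m"] for r in records)
    return f"{k / n:.0%} of {n} participants ({k} people) were in work six months on"

def funder_breakdown(records) -> dict:
    """Funder-report version: the same outcome, broken down by age band."""
    return dict(Counter(r["age_band"] for r in records if r["in_work_after_6m"]))

print(headline(survey))
print(funder_breakdown(survey))
```

The same records could equally feed a social media statistic or a major-donor letter; what changes is the extraction, not the data.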
See also Turning Impact Data into Donor Revenue and How to Write a Charity Impact Report for complementary guidance.
The Long-Term Case: Trust Compounds Over Time
The strongest argument for sustained investment in impact reporting is not the short-term effect on individual donor decisions — it is the compounding effect on organisational trust over time. Organisations that report consistently, honestly, and specifically build a reputation for transparency that is one of their most durable assets.
In the UK context, where overall trust in charities has been under pressure for several years — CAF has tracked declining participation rates since 2016 — the charities that have maintained or grown their donor base are disproportionately those with strong, credible impact reporting. Trust is not built in one annual report; it is built across years of consistent, honest communication.
For institutional funders, the same logic applies. A grantee who has consistently delivered credible, honest reports over three grant cycles is far more likely to receive a renewal — and a larger grant — than one with an equal delivery record but poor reporting quality. Funders make their best bets on organisations they trust, and trust is earned through communication.
Frequently Asked Questions
Does impact reporting actually increase donations?
Yes, the evidence consistently supports this. Give.org research shows that a large proportion of people cite lack of clarity about fund use as a barrier to giving, and that organisations providing specific, credible impact information see higher donor retention rates. The precise magnitude of the effect varies by charity type, donor segment, and reporting quality, but the direction of the relationship is clear across multiple studies.
What is the difference between an output and an outcome in impact reporting?
An output is a direct product of an activity — the number of sessions held, people reached, leaflets distributed. An outcome is a change that results from that activity — improved mental wellbeing, increased employment rates, reduced isolation. Good impact reporting focuses on outcomes while providing outputs as evidence of delivery scale. Funders and donors are increasingly sophisticated about this distinction.
How often should charities report on their impact?
This depends on audience. Institutional funders typically require reports at intervals specified in the grant agreement (quarterly, six-monthly, or annually). Individual donors respond better to more frequent, lighter-touch communications — brief email updates, social media posts, or short video messages — supplemented by a comprehensive annual report. The goal is to maintain the donor's sense of connection and confidence throughout the year, not just at renewal time.
What is a donor impact report?
A donor impact report is a communication specifically designed to show a donor what their gift achieved. It differs from a general annual report in being targeted at the donor's contribution and interests. Well-designed donor impact reports are personalised to the individual (or segment), use accessible language, combine data with stories, and close with a clear next-step ask.
How do I report on impact honestly when outcomes are hard to measure?
Start by being honest about what you know and how you know it. If your best evidence comes from participant self-report surveys, say so — along with the response rate and methodology. If certain outcomes are difficult to attribute to your work alone, acknowledge that. Donors and funders respect intellectual honesty about measurement challenges; what erodes trust is false certainty or inflated claims.
What is the SORP 2026 requirement for impact reporting?
The Charities SORP 2026, effective from January 2026, requires all charities to report on what changed as a result of their activities, with requirements increasing by income tier. Tier 1 charities need a summary of main achievements; Tier 2 must explain impact on beneficiaries and society and describe measurement indicators; Tier 3 must additionally assess return on funds invested. See What Is Impact Reporting for more detail.
How can small charities produce good impact reports without large teams?
By building data collection into programme delivery rather than treating it as a separate reporting exercise. When beneficiary surveys, session records, and outcome tracking are part of standard practice throughout the year, the data for a compelling impact report accumulates automatically. Technology platforms that integrate data collection with reporting make this significantly more achievable for small teams.
Should charities report negative findings or failures?
Yes, selectively and professionally. Reporting on what did not work — and what the organisation learned from it — builds credibility with sophisticated donors and funders who understand that all programmes face challenges. The key is framing: not a catalogue of failures, but a brief, clear acknowledgement of a specific challenge, why it arose, and how the organisation responded. This is very different from reporting general organisational weakness.
Recommended Next Pages
- Turning Impact Data into Donor Revenue
- How to Write a Charity Impact Report
- Proving Your Charity's Impact to Funders
- How to Standardise Impact Reporting Across Programmes
- How Charities Struggle to Collect Impact Data
Last updated: February 2026