How to Align Grants with Organisational Strategy

A practical guide for funders on aligning grant programmes with strategic priorities, from setting outcomes and criteria to reviewing portfolio performance.

By Plinth Team

Most funders have a strategy document. Fewer have a grant-making process that consistently reflects it. The gap between what a foundation says it wants to achieve and where its money actually goes is one of the most common and least discussed problems in UK philanthropy. Applications arrive that seem promising but sit outside strategic priorities. Assessment panels disagree on what "good fit" means. End-of-year reporting reveals a scattered portfolio that is hard to explain to trustees or the public.

Aligning grants with strategy is not about imposing rigid rules that eliminate professional judgement. It is about creating a clear, shared framework that helps everyone involved in the funding process -- from programme officers to assessors to grant recipients -- understand what the foundation is trying to achieve and how each grant contributes to that goal.

The Association of Charitable Foundations (ACF) has made this a central theme of its Stronger Foundations initiative, which identifies strategy and governance as one of its pillars of excellent foundation practice. ACF's framework sets out seven characteristics of strong strategic practice, including being aware of the external context, publicly articulating vision and mission, and continually strengthening governance arrangements (ACF, Stronger Foundations).

The stakes are real. UK charitable foundations distributed a record £8.2 billion in grants in 2023-24, a 12% increase on the previous year (ACF, Foundations in Focus 2025). With application volumes surging by 50-60% at many foundations -- and some reporting increases of 100-400% -- the need for clear strategic alignment has never been greater.

Why does strategic alignment matter for funders?

Without deliberate alignment, grant portfolios drift. Programme officers respond to what comes through the door rather than what the strategy demands. Over time, the portfolio grows to reflect the interests and networks of individual staff rather than the mission of the organisation.

The Charity Commission requires charities with income above £500,000 to explain their strategy for meeting charitable purposes and to list significant activities undertaken as part of that strategy in their annual trustees' report (Charity Commission, CC15). For grant-making trusts specifically, this means explaining how their grant-making policy contributes to the achievement of their aims and objectives. Strategic alignment is therefore not just good practice -- it is a regulatory expectation.

Mission drift is a well-documented risk in the charity sector. Research published in Nonprofit Management and Leadership found that charities regularly adjust their practices and services to match funding opportunities rather than mission priorities, particularly when facing financial pressure (Bennett, 2011). For funders, the equivalent risk is awarding grants that are individually worthy but collectively incoherent. A scattered portfolio makes it harder to learn what works, harder to report to stakeholders, and harder to justify strategic choices to trustees.

How to translate strategy into grant criteria

The first step is converting your strategic plan into language that applicants and assessors can use. A strategy that says "we aim to reduce educational disadvantage in the West Midlands" needs to become specific enough that a programme officer can assess whether a given application fits.

This translation typically involves three layers:

Priority themes and populations. Define the 3-5 areas or groups your funding will focus on during this strategic period. Be specific: "improving literacy outcomes for children aged 5-11 in areas of high deprivation" is more useful than "supporting education."

Eligibility boundaries. State clearly who can and cannot apply. This includes geographic restrictions, organisational type (registered charities, CICs, unincorporated groups), income thresholds, and any excluded activities. Publishing these boundaries openly helps applicants self-select and reduces the volume of ineligible applications.

Assessment criteria linked to outcomes. Each question on your application form should connect to a strategic priority or a cross-cutting concern (such as governance or financial sustainability). When assessors score applications, they should be evaluating strategic fit alongside delivery capability.

The Esmée Fairbairn Foundation provides a useful model here, asking grantees to submit up to three key outcomes with one or two indicators for each -- a proportionate approach that keeps reporting manageable while maintaining strategic focus (Esmée Fairbairn Foundation).

For a detailed guide to writing effective criteria, see How to Write Clear Grant Criteria.

How many outcomes should a funder track?

Less is more. Foundations that try to track 15 or 20 outcomes across their portfolio end up with data that nobody uses. The practical ceiling for most funders is 3-5 strategic outcomes, each with a small number of measurable indicators.

This approach has several advantages. It forces genuine prioritisation at the strategy level -- you cannot claim everything matters equally if you are only tracking five things. It makes reporting simpler for grant recipients, who often submit reports to multiple funders simultaneously. And it makes portfolio analysis meaningful, because you have enough data points under each outcome to identify patterns.

A useful structure looks like this:

| Level | What you define | Example |
|---|---|---|
| Strategic outcome | The change you want to see in the world | Young people in deprived areas achieve better educational outcomes |
| Indicator | How you will know it is happening | % of participants achieving expected reading level by age 11 |
| Data source | Where the evidence comes from | Grantee monitoring reports, standardised assessments |
| Review frequency | When you look at portfolio-level data | Annually, with mid-year check-ins for large grants |
| Threshold | What "good enough" looks like | 60%+ of participants showing measurable improvement |
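For teams that track outcomes in a spreadsheet or a grant management system, the five levels above map naturally onto a simple data record. This is an illustrative sketch only: the field names and the example values are assumptions, not a standard schema.

```python
from dataclasses import dataclass

# Illustrative sketch: field names follow the five levels in the
# structure above; none of this is a prescribed standard.
@dataclass
class StrategicOutcome:
    outcome: str            # the change you want to see in the world
    indicators: list        # how you will know it is happening (1-2 per outcome)
    data_sources: list      # where the evidence comes from
    review_frequency: str   # when you look at portfolio-level data
    threshold: str          # what "good enough" looks like

literacy = StrategicOutcome(
    outcome="Young people in deprived areas achieve better educational outcomes",
    indicators=["% of participants achieving expected reading level by age 11"],
    data_sources=["Grantee monitoring reports", "Standardised assessments"],
    review_frequency="Annually, with mid-year check-ins for large grants",
    threshold="60%+ of participants showing measurable improvement",
)
```

Keeping each outcome to one or two indicators, as above, is what makes later portfolio aggregation tractable.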

Outcomes should be owned collectively -- they belong to the portfolio, not to individual grants. Some grants will contribute directly to a strategic outcome; others will contribute indirectly or to enabling conditions. Both are valid, but the connection should be articulated at the point of award.

For more on building an outcomes framework, see What is a Theory of Change? and What is a Grant Evaluation Framework?.

What does strategic alignment look like in practice?

The gap between strategy and practice often lives in the details of day-to-day grant-making. Here is a comparison of what alignment looks like at each stage of the grant cycle:

| Grant cycle stage | Weak alignment | Strong alignment |
|---|---|---|
| Fund design | Fund objectives written in isolation from strategy | Fund objectives derived directly from strategic outcomes |
| Application form | Generic questions about "what you do" | Questions mapped to specific strategic priorities |
| Eligibility screening | Manual review of every application | Clear published criteria; automated screening where possible |
| Assessment | Assessors score on general quality | Assessors score on strategic fit, delivery capability, and evidence of need |
| Decision-making | Panel approves strongest individual applications | Panel considers portfolio balance across themes and geography |
| Monitoring | Standard progress report at 12 months | Reporting tied to agreed outcomes with proportionate data collection |
| Portfolio review | Annual report lists grants made | Annual review analyses outcomes by theme, geography, and beneficiary group |
| Strategy refresh | New strategy written from scratch | Strategy updated based on evidence from portfolio analysis |

The shift from weak to strong alignment does not happen overnight. Most foundations find it takes 2-3 grant rounds to embed new criteria fully and for the quality of applications to reflect the updated priorities. Publishing your strategic outcomes alongside your funding guidance -- with examples of the kinds of projects that fit -- accelerates this process considerably.

How to build strategic alignment into your assessment process

Assessment is where alignment either holds or falls apart. If your scoring rubric does not reflect your strategic priorities, assessors will default to evaluating general quality -- which means well-written applications from well-resourced organisations tend to score highest, regardless of strategic fit.

A practical approach is to weight your scoring criteria so that strategic alignment accounts for a meaningful proportion of the total score -- typically 25-35%. This might include:

  • Fit with priority themes (does the application address one or more of our stated priorities?)
  • Evidence of need (is there a clear case that this work is needed in this place, for this population?)
  • Contribution to portfolio outcomes (how will this grant contribute to our ability to report on strategic outcomes?)
  • Learning potential (will this grant generate evidence or insight that strengthens the wider portfolio?)
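A weighting like this is straightforward to encode in whatever tool runs your scoring. The sketch below is hypothetical: the criterion names and the 30% strategic-fit weight are assumptions within the 25-35% range discussed above, not a recommended rubric.

```python
# Hypothetical weighting: strategic alignment at 30% of the total,
# within the 25-35% range discussed above. Criterion names are illustrative.
WEIGHTS = {
    "strategic_fit": 0.30,        # priority themes, evidence of need, portfolio contribution
    "delivery_capability": 0.40,  # track record, realistic plan
    "financial_health": 0.15,     # reserves, income mix
    "learning_potential": 0.15,   # evidence or insight for the wider portfolio
}

def weighted_score(scores: dict) -> float:
    """Combine per-criterion scores (each 0-5) into a weighted total (0-5)."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights must sum to 100%
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)

example = {"strategic_fit": 4, "delivery_capability": 3,
           "financial_health": 5, "learning_potential": 2}
print(round(weighted_score(example), 2))  # 3.45
```

The point of making the weights explicit is that a well-written but poorly aligned application can no longer outscore a well-aligned one on polish alone.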

Assessors need training on what strategic fit means in practice. Providing worked examples -- "here is an application that scored highly on strategic alignment, and here is one that scored well on quality but was not a good fit" -- is more effective than distributing a written policy.

IVAR's Open and Trusting Grant-making initiative, which now includes over 140 funders making grants worth over £1 billion annually, emphasises that alignment and trust are not in tension. Funders can have clear strategic direction while still giving grant recipients flexibility in how they deliver (IVAR, Open and Trusting Grant-making). The key is to be clear about the outcomes you are funding towards, while remaining open about the methods organisations use to get there.

How to use monitoring data to check alignment

Monitoring is not just about accountability -- it is the primary mechanism for understanding whether your portfolio is delivering against your strategy. If your monitoring forms are not structured around your strategic outcomes, you will collect data that is interesting but not useful for strategic review.

Proportionate monitoring means asking grant recipients to report on the outcomes that matter to the funder, using a small number of standard indicators supplemented by narrative. The approach should balance consistency (so you can aggregate data across the portfolio) with flexibility (so grantees can tell the story of their work in context).

Standard indicators might include quantitative measures like people reached, sessions delivered, or wellbeing scores. Narrative elements might include short case studies, descriptions of unexpected challenges, or reflections on what has changed for participants. Combining both gives you the numbers for portfolio dashboards and the stories for trustee reports and public communications.
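When the same small set of indicators appears in every monitoring report, aggregation becomes a few lines of code rather than a manual spreadsheet exercise. This sketch is illustrative only: the report records, themes, and indicator names are made up for the example.

```python
from collections import defaultdict
from statistics import mean

# Illustrative monitoring returns: grant IDs, themes, and indicator
# values here are invented for the example.
reports = [
    {"grant_id": "G1", "theme": "literacy",   "people_reached": 120, "wellbeing_score": 6.8},
    {"grant_id": "G2", "theme": "literacy",   "people_reached": 85,  "wellbeing_score": 7.1},
    {"grant_id": "G3", "theme": "employment", "people_reached": 40,  "wellbeing_score": 6.2},
]

# Group reports by strategic theme for portfolio-level aggregation
by_theme = defaultdict(list)
for r in reports:
    by_theme[r["theme"]].append(r)

for theme, rows in by_theme.items():
    print(theme,
          "reached:", sum(r["people_reached"] for r in rows),
          "avg wellbeing:", round(mean(r["wellbeing_score"] for r in rows), 2))
```

The narrative elements sit alongside this, not inside it: the numbers feed the dashboard, the stories feed the trustee report.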

The frequency of monitoring should match the size and duration of the grant. A 12-month grant of £10,000 might require a single end-of-grant report. A three-year grant of £150,000 might require annual reports with a lighter-touch mid-year update. NCVO's Road Ahead 2025 report highlighted the pressures facing small charities, noting the "big squeeze" of rising costs and increasing demands (NCVO, Road Ahead 2025), so calibrating the ask to the grant size matters both for relationships and for data quality.

Tools like Plinth's monitoring and reporting features allow funders to set up outcome-linked monitoring forms that grantees complete online, with responses automatically feeding into portfolio-level dashboards. This removes the manual step of extracting data from PDF reports and entering it into spreadsheets -- a process that, at many foundations, consumes days of staff time each quarter.

How to review and adapt your strategy based on evidence

Strategic alignment is not a one-off exercise. It requires a structured review cycle that uses portfolio evidence to test whether the strategy is working and whether it needs adjusting.

An annual strategic review should answer four questions:

  1. Are we funding what we said we would fund? Compare the portfolio against your stated priorities by theme, geography, beneficiary group, and grant size. Identify any significant gaps or areas of over-concentration.

  2. Are the grants delivering the outcomes we expected? Look at monitoring data in aggregate. Are the indicators moving in the right direction? Are there themes or geographies where outcomes are consistently stronger or weaker?

  3. What have we learned that should change our approach? This might include insights from grantee feedback, shifts in the external environment, new research, or changes in government policy. ACF's Stronger Foundations framework emphasises the importance of foundations being "aware of the external context and understanding their role within the wider ecosystem" (ACF, Stronger Foundations).

  4. What should we do differently in the next funding round? Translate the learning into concrete changes -- updated criteria, revised outcome indicators, new priority areas, or adjusted geographic focus.
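Question 1 in particular lends itself to a simple calculation: compare the share of spend in each theme against the share your strategy intended. The targets and spend figures below are hypothetical, purely to show the shape of the comparison.

```python
# Hypothetical target allocations (shares of total spend) and actual
# spend by theme; both are invented for illustration.
targets = {"literacy": 0.40, "employment": 0.35, "wellbeing": 0.25}
actual_spend = {"literacy": 520_000, "employment": 180_000, "wellbeing": 300_000}

total = sum(actual_spend.values())
for theme, target_share in targets.items():
    actual_share = actual_spend[theme] / total
    gap = actual_share - target_share
    print(f"{theme}: target {target_share:.0%}, actual {actual_share:.0%}, gap {gap:+.0%}")
```

A persistent negative gap in one theme (employment, in this invented example) is exactly the kind of over- or under-concentration the annual review should surface and explain.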

The UKGrantmaking platform, a collaborative project bringing together data on UK grantmaking, enables funders to benchmark their portfolio against the wider landscape and understand where their funding sits relative to other foundations (UKGrantmaking). This kind of external comparison is valuable for identifying both gaps in provision and areas of duplication.

How technology supports strategic alignment

Manual grant management makes strategic alignment harder than it needs to be. When eligibility screening, application assessment, monitoring, and reporting all happen in separate systems -- or in spreadsheets and email -- the connections between strategy and delivery become invisible.

Modern grant management platforms address this by linking each stage of the grant cycle to the foundation's strategic framework. The benefits are practical:

  • Eligibility checks can be automated against published criteria, so programme officers spend less time on applications that do not meet basic requirements
  • Assessment scoring can be structured around strategic priorities, with built-in weighting
  • Monitoring forms can be pre-configured with outcome indicators that map directly to strategic goals
  • Portfolio dashboards can show real-time data on how funding is distributed across themes, geographies, and populations
  • Reporting to trustees and the public can draw on live data rather than requiring manual compilation
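The first of these, automated eligibility screening, is essentially a set of published rules applied mechanically. A minimal sketch, assuming invented criteria (organisation types, a single funded region, an income ceiling) that stand in for whatever a foundation actually publishes:

```python
# Illustrative eligibility rules: the types, region, and income
# threshold below are assumptions, not any foundation's real criteria.
ELIGIBLE_TYPES = {"registered_charity", "cic"}
ELIGIBLE_REGIONS = {"west_midlands"}
MAX_INCOME = 1_000_000  # annual income ceiling in GBP

def screen(application: dict) -> list:
    """Return the list of reasons an application fails basic eligibility."""
    failures = []
    if application["org_type"] not in ELIGIBLE_TYPES:
        failures.append("organisation type not eligible")
    if application["region"] not in ELIGIBLE_REGIONS:
        failures.append("outside funded geography")
    if application["annual_income"] > MAX_INCOME:
        failures.append("income above threshold")
    return failures

app = {"org_type": "cic", "region": "west_midlands", "annual_income": 250_000}
print(screen(app))  # an empty list means the application passes screening
```

Returning the full list of failure reasons, rather than a bare yes/no, lets you give declined applicants a specific explanation, which in turn improves the quality of future applications.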

Plinth's grant management platform takes this further by connecting the full grant lifecycle -- from application intake and eligibility screening through assessment, award, monitoring, and impact reporting -- in a single system. Funders can define strategic themes and tag funds and grants against them, then generate portfolio-level views that show how their funding maps to priorities. The platform's AI capabilities help with tasks like screening applications against eligibility criteria, summarising monitoring submissions, and generating tailored funder reports from the underlying outcome data. Plinth offers a free tier, making this kind of integrated approach accessible to smaller foundations that may not have budgets for enterprise grant management systems.

Common mistakes when aligning grants with strategy

Even well-intentioned foundations make predictable errors when trying to align their grantmaking with strategy. Recognising these patterns can save time and prevent the frustration of a review cycle that reveals persistent misalignment.

Overly broad strategic priorities. If your strategy says you fund "health and wellbeing" with no further definition, almost any application can claim to fit. Narrow your priorities enough that programme officers and assessors can meaningfully distinguish between applications that align and those that do not.

Criteria that exist on paper but not in practice. Publishing strategic criteria on your website is necessary but not sufficient. If your assessment rubric does not weight strategic fit, and if your panel discussions do not reference it, the criteria will not influence decisions.

Ignoring portfolio balance. Approving grants one at a time, based on individual merit, can produce a portfolio that is heavily skewed towards one theme or geography. Building portfolio-level review into your decision-making process -- even if it is just a standing agenda item at panel meetings -- counteracts this tendency.

Collecting monitoring data you never analyse. Many foundations ask grantees to report on outcomes but never aggregate the data or use it to inform strategic decisions. If you are asking for data, have a plan for how you will use it. If you will not use it, do not ask for it.

Changing strategy too frequently. Foundations that revise their priorities every year create instability for applicants and make it impossible to track long-term outcomes. A 3-5 year strategic cycle, with annual review and minor adjustments, is typically more effective than wholesale change.

FAQs

How many strategic outcomes should a foundation track?

Three to five strategic outcomes is the practical range for most foundations. Tracking fewer than three risks oversimplification; tracking more than five typically results in data that is too thin to draw meaningful conclusions. Each outcome should have one or two measurable indicators.

Should we fund applications that fall outside our strategic priorities?

Occasionally, yes -- but with clear governance. Many foundations maintain a small discretionary budget (typically 5-10% of total grant-making) for opportunities that do not fit neatly within strategic priorities but are clearly aligned with the broader mission. Record the rationale for each exception and review the pattern annually.

How do we balance strategic alignment with trust-based grantmaking?

Strategic alignment and trust-based approaches are complementary, not contradictory. Be clear and directive about what outcomes you are funding towards, while giving grant recipients flexibility in how they deliver. IVAR's Open and Trusting Grant-making principles show that over 140 funders have adopted this approach successfully.

How often should we review our grantmaking strategy?

A full strategic review every 3-5 years is standard, with annual portfolio reviews that can trigger minor adjustments. Avoid changing strategic direction more frequently than this, as it creates instability for applicants and makes it difficult to assess long-term outcomes.

Should we publish our strategic outcomes for applicants to see?

Yes. Publishing your strategic outcomes, with examples of the kinds of projects that fit, improves application quality and helps organisations self-select before investing time in an application. Transparency about what you are looking for benefits both applicants and programme staff.

How do we prevent mission drift in our grant portfolio?

Build strategic fit into your assessment scoring (weighting it at 25-35% of the total score), review portfolio balance at every panel meeting, and conduct an annual strategic alignment review. The most common cause of drift is approving grants based on individual quality without considering how each award fits the overall portfolio.

What data should we collect from grantees to assess strategic alignment?

Collect a small number of standard quantitative indicators (linked to your strategic outcomes) plus brief narrative reports. The Esmée Fairbairn Foundation's model of 2-3 outcomes with 1-2 indicators each is widely regarded as proportionate. Avoid asking for data you will not use at portfolio level.

Can small foundations with limited staff still align grants with strategy?

Yes. Strategic alignment does not require complex systems or large teams. A one-page document setting out 3-5 priority outcomes, a simple scoring rubric that weights strategic fit, and an annual review of the portfolio against those priorities can be implemented by a single programme officer. Grant management platforms with free tiers, such as Plinth, can automate much of the tracking.

Last updated: February 2026