How we choose the charities we recommend
Our process
First, we identify Evidence Producers
We are currently using three:
- The Education Endowment Foundation (EEF), a charity which is the UK government’s ‘what works centre’ for education. EEF’s work underlies our charity recommendations in education. Read more about EEF’s research
- The What Works Centre for Children’s Social Care and the Early Intervention Foundation, which have recently merged to create the What Works Centre for Children & Families (WWCF). Both were part of the UK government’s network of What Works centres. Their research underlies our recommendations around children. Read more about WWCF’s research
- The Ministry of Justice Data Lab (JDL), an analytical unit inside the UK Ministry of Justice. The JDL’s work underlies our charity recommendations in crime reduction. Read more about JDL’s research
We call these ‘Evidence Producers’. We plan on adding more sectors over time by working with more Evidence Producers.
Second, we screen each Evidence Producer’s research for studies that are relevant to The Good Giving List
For example, the Ministry of Justice Data Lab produces evaluations of the effect of various programmes on crime (specifically, on the 12-month re-offending rate). It assesses programmes run by the public sector (e.g., prisons), by charities, and by the private sector, and has been doing so for some years. We screen the JDL’s analyses to select only those which:
- are conclusive (many are inconclusive, mainly because the sample is too small for the result to be statistically significant);
- show a positive effect (some programmes that it assesses appear to have no effect, or even to increase re-offending); and
- are run by charities (because donors cannot normally donate directly to a prison or to the national prison service).
This is our Stage One analysis.
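The Stage One screen described above can be pictured as a simple filter over evaluation records. This is only an illustrative sketch: the field names, data shape, and example numbers are hypothetical, not the Justice Data Lab’s actual data format.

```python
# Illustrative sketch of the Stage One screen; fields and figures are
# hypothetical, not the Justice Data Lab's actual data.
from dataclasses import dataclass

@dataclass
class Evaluation:
    charity_run: bool   # run by a charity, not the public or private sector
    conclusive: bool    # result was statistically significant
    effect: float       # change in 12-month re-offending rate (negative = reduction)

def passes_stage_one(ev: Evaluation) -> bool:
    """Keep only conclusive, positive-effect, charity-run evaluations."""
    return ev.charity_run and ev.conclusive and ev.effect < 0

evaluations = [
    Evaluation(charity_run=True,  conclusive=True,  effect=-4.2),  # kept
    Evaluation(charity_run=True,  conclusive=False, effect=-1.0),  # inconclusive
    Evaluation(charity_run=False, conclusive=True,  effect=-3.0),  # run by a prison
    Evaluation(charity_run=True,  conclusive=True,  effect=+2.5),  # increased re-offending
]

shortlist = [ev for ev in evaluations if passes_stage_one(ev)]
print(len(shortlist))  # 1
```

Only the first record survives all three tests; the others fall at one hurdle each, which mirrors how most JDL analyses drop out of Stage One.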
Third, we screen the charities based on desk research
This includes criteria such as:
- Does the charity still exist? Some have ceased operations since the evaluation was published
- Does the charity appear to still run the programme? Some charities have publicly announced that they have ceased to run them. One example is a charity which ran a probation programme that the JDL found to be effective; since then, the government has brought all probation services in-house, so none is now run by charities
We also use desk research to explore other criteria about which we may ask the charity in subsequent stages, e.g.:
- Financial stability: level of reserves, and whether expenditure consistently exceeds income
- Whether the charity regulator has raised any red flags about this charity
- Suitability of its governance set-up, including number and skill of trustees
- Size. We do not recommend any charity with revenue under about £20,000, simply because a single large donation might swamp it
Fourth, we talk with the charity
We apply some criteria here too, e.g., whether the charity wants to be included in The Good Giving List. We explore with the charity issues such as those above, and also:
- The charity’s commitment to the programme
- Its financial stability, and its plans for maintaining a diversity of revenue streams (because that affects its ability to endure)
- Whether the charity can expand the programme if it wants to and has additional funding. Sometimes they cannot, e.g., if they are only allowed to operate in particular prisons
- Changes in the context around this programme and the charity. For example, sometimes there are changes in the law or role of statutory bodies, such that the government decides to fund a programme nationally and charitable donations are no longer really required. (This is a feature of the long history of charities. Originally schools were all funded charitably – i.e., education outside of governesses in stately homes – and then the state took on responsibility to provide them universally, funded by taxation. Same for hospitals. It still happens.)
Our process is as consistent as possible
We try to put all our recommendations through the same selection criteria and process. But the types of research produced by the various Evidence Producers vary, so to be as consistent as possible we tweak our criteria to take account of that variation. For instance, the Education Endowment Foundation commissions experimental evaluations (randomised controlled trials, or RCTs) of various interventions. It normally does these in two stages: first an ‘efficacy trial’, which might include relatively few schools, and then, if that produces good results, an ‘effectiveness trial’, which runs in more schools. By contrast, we do not require charities which come through the Justice Data Lab to have an effectiveness RCT, because the JDL does not produce RCTs: rather, it does a statistical analysis called Propensity Score Matching (PSM). This is similar in spirit to an RCT and can be used when randomisation is not possible (which it often isn’t in prisons, for obvious reasons) and where detailed data about the relevant population are available.
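The core idea of Propensity Score Matching can be shown in miniature: each programme participant is compared with the non-participant whose propensity score (estimated likelihood of joining the programme, given their characteristics) is closest, and the outcome rates of the two groups are then compared. This is a deliberately simplified, toy-number sketch of one-to-one nearest-neighbour matching, not the Justice Data Lab’s actual method, which matches on many covariates and tests statistical significance.

```python
# Toy sketch of propensity-score matching (PSM); numbers are invented,
# not from any Justice Data Lab analysis.
# Each person is (propensity_score, re_offended), re_offended being 0 or 1.
treated   = [(0.30, 0), (0.55, 0), (0.70, 1)]   # went through the programme
untreated = [(0.28, 1), (0.50, 1), (0.72, 1), (0.90, 0)]

# For each treated person, find the untreated person with the closest
# propensity score (nearest-neighbour matching, with replacement).
matched = [min(untreated, key=lambda u: abs(u[0] - t[0])) for t in treated]

rate_treated = sum(r for _, r in treated) / len(treated)
rate_matched = sum(r for _, r in matched) / len(matched)

# Estimated effect: difference in re-offending rates between the
# programme group and their matched comparators (negative = reduction).
effect = rate_treated - rate_matched
print(round(effect, 2))  # -0.67
```

Because each comparator resembles a participant on the measured characteristics, the difference in re-offending rates is read as the programme’s effect, which is the role randomisation plays in an RCT.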
We trust our Evidence Producers
So, for example, if an Evidence Producer says that a Propensity Score Matched evaluation is rigorous (e.g., that the matching ensures the only material difference between the group which received the programme and the comparison group is the programme itself, so the evaluation isolates the programme’s effect), we believe them. We do not second-guess their work.
We use research from independent Evidence Producers, not from the charities themselves
This is to avoid bias in research which is commissioned or conducted and published by the entity being evaluated.
We recommend whole charities
Often charities run many programmes: there may be rigorous evaluations of one or some of a charity’s programmes, but not of all of them. This leaves The Good Giving List with a choice: recommend that you, as donors, fund only the evaluated programme, or recommend the whole charity. We recommend whole charities, for two reasons.
- Dictating that a charity must use your donation as you specify (‘restricting’ it, in charity-industry lingo) is a nonsense, because it is unenforceable: you cannot track what the charity does with your £10, because money is all the same colour
- Restricted funding is an administrative nightmare for charities: so expensive to manage that sometimes the administrative cost outweighs the benefit. So we don’t do it
Much more detail about our selection criteria and method is here.