What role has the Prevent programme played in funding local online radicalisation-prevention projects and how have those projects been evaluated?

Checked on February 6, 2026

Executive summary

The Prevent programme has been a channel for funding local projects that aim to stop radicalisation, including online-focused prevention and counter-narrative activity, while national and European bodies have also financed and promoted research, toolkits and networks for online P/CVE (preventing and countering violent extremism) [1] [2]. Evaluation of these interventions has so far relied on internal toolkits, evaluation training, case studies and academic evidence reviews, but independent impact evidence is patchy and systematic gaps in rigorous evaluation have been repeatedly flagged [3] [4] [5] [6].

1. What Prevent funds locally: online content, skills and outreach

Prevent’s remit explicitly includes stopping people from being drawn into terrorism and supporting early intervention across institutions and communities; online radicalisation is named as a threat the programme must address, and funded activity has included strategic communications, online counter-narratives and community outreach with online components [1] [7]. At the EU level, Commission funding streams, including civil society empowerment programmes and Horizon/ISF grants, have supported projects to explore recruitment mechanisms, develop good practices and promote alternative narratives online [2] [7] [6].

2. How funding flows and local delivery are organised

Delivery is devolved: local authorities oversee Prevent in their areas with Home Office coordination, and higher-risk areas receive regional advisors and dedicated Prevent coordinators [1]. The EU complements national work through project-based grants, research funding and a Knowledge Hub that aggregates and promotes EU-funded materials for practitioners, including online training formats and events [2] [3] [8].

3. What projects look like—practical examples and scope

Local initiatives funded through Prevent grants range from leadership and school ambassador programmes to targeted communications work. The Future Leaders Programme in East London, initially funded by a Home Office Prevent grant, focused on young people, school outreach and counter-narrative messaging alongside offline workshops, illustrating how funded work blends online and offline channels [4]. EU-funded projects and networks similarly span prison rehabilitation, community resilience and digital strategic communications, often packaged as multi-partner, cross-border initiatives [7] [9].

4. Evaluation: toolkits, training and the uneven evidence base

The EU Knowledge Hub offers training on evaluation and promotes materials from EU-funded projects to build practitioners’ capacity to assess implementation [3]. UN and international programmes likewise produce internal evaluations and Monitoring, Evaluation and Learning (MEL) toolkits for online youth strategic-communications projects [10]. Despite these resources, academic and policy reviews repeatedly highlight a shortage of robust impact evaluations: an evidence-and-gap map of criminal-justice interventions stresses the lack of high-quality impact studies, and policy analysis finds a concerning lack of strong empirical evidence for many P/CVE programmes [5] [6].

5. What evaluations typically measure — and miss

Evaluations produced by programmes and hubs often focus on process indicators (training delivered, materials produced, practitioner skills improved) and case-study successes such as increased referrals or participant self-reports, as in the Future Leaders case, where participant engagement and school referrals were cited as outcomes [4] [3]. Systematic measurement of long-term behaviour change or reduced online recruitment risk, and randomised impact assessments, remain rare in the published corpus, leaving questions about causal effects unanswered [5] [6].

6. Tensions, critiques and the road ahead

There is a clear, stated intent across Prevent, national Home Office materials and EU programmes to invest in online prevention and to professionalise evaluation, but independent observers warn that funding has outpaced the evidence: there are more funded projects than rigorous impact studies showing what works at scale [2] [6] [5]. The EU and international actors are moving toward better monitoring and AI-era convenings to refine digital responses, signalling an appetite to close the evidence gap, but current reporting shows the field remains in a phase of experimentation rather than consolidated, evidence-backed best practice [11] [3].

Want to dive deeper?
What rigorous impact evaluations exist for online counter-narrative campaigns funded by Prevent or EU grants?
How do local Prevent-funded programmes measure long-term outcomes like de-radicalisation or reduced online recruitment?
What criticisms have civil-society groups made about Prevent funding and its effects on community trust?