5 March 2025
ISD and Coventry University: Workshop on managing bystander footage online after terrorist attacks

In March, ISD and Coventry University held a closed-door workshop, 'Bystander Content Online: Harms and Responses', to explore an increasingly important yet often overlooked challenge: how to respond to bystander footage in the aftermath of terrorist attacks.
Co-organised by Anne Craanen, Senior Research and Policy Manager at ISD, and Dr Alastair Reed, Professor at Coventry University, the workshop brought together a range of stakeholders from government, regulation, tech companies and civil society. Participants included representatives from Europol, Ofcom, competent authorities from the Netherlands, Spain, Belgium and France, the European Commission (DG HOME and DG CONNECT), the Christchurch Call Foundation, the Oversight Board, and the Global Internet Forum to Counter Terrorism (GIFCT). Their contributions shaped a timely and necessary conversation on the complex role of bystander content in crisis communication and response.
In the wake of the 2019 Christchurch terrorist attacks, governments and tech companies developed a number of protocols to stem the rapid spread of perpetrator-produced content, including the Global Internet Forum to Counter Terrorism's Content Incident Protocol, the EU Crisis Protocol, and the Christchurch Call Crisis Response Protocol. However, the question of how to address bystander content—footage not produced by attackers but captured by members of the public, CCTV systems, or emergency responders—has remained largely unaddressed.
This type of footage is increasingly visible in the aftermath of attacks. The rise of livestreaming and smartphone use means bystander content can reach wide audiences quickly, often ahead of official information. While often not intended to cause harm, it can still be exploited—by violent extremists seeking publicity, by conspiracy theorists, or by those spreading disinformation. At the same time, such content can also serve the public interest, contributing to transparency and accountability.
To explore this complexity in more detail, the workshop was structured into three core sessions, each introduced by short presentations and followed by in-depth roundtable discussions. Held under the Chatham House Rule, the discussions aimed to inform a forthcoming report and policy guidance, while preserving an open and constructive environment for dialogue.
Workshop Agenda
Session 1 – The Challenge of Bystander Content
Chaired by Dr Alastair Reed, this session provided an overview of the kinds of content considered bystander footage—ranging from mobile phone recordings and CCTV to bodycam footage and incidental livestreams. Arthur Bradley (Independent Researcher) discussed case studies of previous terrorist attacks, which framed the conversation. Friederike Wegener (DG HOME, European Commission) outlined the evolving nature of this material and its increasing relevance to online harm. Discussion focused on how bystander content is currently encountered and understood across sectors, and whether it may already form part of an emerging modus operandi in some attack strategies.
Session 2 – What is the Harm?
In this session, participants considered the range of harms associated with bystander footage. Pooja Larvin (Meta Oversight Board) and Paul Schmite (Advisor to the Digital Ambassador of France) reflected on the tension between preventing harm and preserving the public interest. Topics included whether bystander footage can be seen as glorifying or supporting terrorism, how it intersects with victims' dignity and graphic content policies, and the way such content is repurposed by a wide spectrum of malicious actors.
Session 3 – Responding to Bystander Content
The final session, chaired by Arthur Bradley, focused on content moderation and response. Dr Erin Saltman (GIFCT) discussed bystander content in existing crisis protocols and tech platform policies, while Anne Craanen (ISD) outlined alternative moderation tools—such as blurring, downranking or the use of warning screens—as potentially proportionate responses. The session emphasised the need for context-sensitive approaches that balance freedom of expression with safeguarding, particularly where material could serve as evidence or is already widely disseminated.
Main Takeaways and Future Work
One of the central takeaways from the workshop was the complexity of responding to bystander footage. The discussion underscored that this content sits at a crossroads of competing priorities: freedom of expression, journalistic protections, public interest, dignity of victims and security concerns. While no single approach can fully reconcile these tensions, the workshop succeeded in delineating the core challenges and identifying practical needs going forward.
Among the most concrete outcomes was a shared understanding that existing frameworks may already offer partial coverage. In particular, there was discussion of how bystander content might intersect with the Terrorist Content Online Regulation (TCO) in the EU—especially where footage is co-opted to glorify or promote terrorist violence. Participants also noted that many platforms already have relevant Terms of Service policies—such as those addressing incitement, graphic violence or human dignity—which could apply to bystander content but are not always consistently enforced.
There was consensus on the need for clearer guidance for both regulators and platforms. This includes clarifying when and how bystander content could fall under TCO or national legal frameworks; identifying how platform policies could be adapted or harmonised to support more transparent, consistent responses; and outlining a tiered framework for responding to different types of content based on risk, context and potential harm.
ISD and Coventry University will take this work forward in the coming months, producing policy guidance to support a more coherent and rights-based response to bystander content. This will feed into ongoing efforts and complement broader international work on crisis response and digital safety.
This initiative forms part of ISD's wider portfolio on Digital Policy and Hate and Extremism, where we combine deep expertise in online harms with cutting-edge research methods and policy engagement. Our cross-harms approach allows us to examine the intersection of disinformation, extremism and platform governance, and to provide practical recommendations to governments, civil society and the private sector. We are delighted to pursue this work with Coventry University, and particularly with Dr Alastair Reed, given his expertise on strategic communications in the aftermath of terrorist attacks.
We’re deeply grateful to all who contributed to the workshop for their thoughtful and constructive input. This is a complex and evolving area, but through continued collaboration and engagement, we look forward to helping shape a more effective and proportionate response to bystander content online.