I&J contributed

Published on October 5, 2021

On October 13, 2021, the Secretariat of the Internet & Jurisdiction Policy Network presented the I&JPN Toolkits on Cross-border Access to Electronic Evidence and Content Moderation to participants at the African School on Internet Governance (AfriSIG) 2021.

The I&JPN Toolkits are the result of broad consultations with key policy actors since the 1st Global Conference of the Internet & Jurisdiction Policy Network in 2016. They reflect the tireless work of governments, companies, international organizations, civil society, technical operators, and academics who have collaborated to develop interoperable solutions to the challenges of cross-border access to electronic evidence and cross-border content moderation.

AfriSIG’s goal is to develop a pipeline of leading Africans from diverse sectors, backgrounds and ages with the skills to participate in local and international internet governance structures and shape the future of the internet landscape for Africa's development.

The session, “Working through the I&JPN Policy Network Toolkits on cross-border access to electronic evidence and content moderation,” gave participants an opportunity to learn about the concrete policy challenges the Toolkits address and to analyze the resources and their implementation.

About the I&JPN Toolkits

Toolkit: Cross-border Access to Electronic Evidence

The I&JPN Toolkit on Cross-border Access to Electronic Evidence outlines the ways in which data flows and privacy can be reconciled with lawful access requirements to address crime. The Toolkit is intended to inform public, private, and civil society actors in their own activities and interactions as they develop and implement alternative practices for cross-border access to electronic evidence.

Toolkit: Cross-Border Content Moderation

The I&JPN Toolkit on Cross-Border Content Moderation provides an overview of some of the key issues that arise when managing online content across diverse local laws and norms. It is intended to support Service Providers in the design of their content moderation activities, Notifiers in the detection and reporting of problematic or abusive content, and Legislators and Policy-Makers in determining procedures for dealing with different types of content and abusive behavior.