Today, we’re releasing the findings of three independent human rights impact assessments we commissioned in 2018 to evaluate the role of our services in Sri Lanka, Indonesia and Cambodia, along with details on how we’ve responded to the recommendations in each assessment. The assessments build on the work we’ve done over the last two years, beginning with the creation of a human rights team to inform our policies, products, programs and partnerships around the world.
Since then, we’ve formalized an approach to determine which countries require more investment, including increased staffing, product changes and research. We have committed to expanding end-to-end encryption, a security function that is already core to WhatsApp, to all of our messaging products to protect people’s private messages, including those of journalists and human rights defenders. In October of last year, we updated the values that underpin our Community Standards to specifically reference human rights principles. And earlier this year, the Global Network Initiative (GNI) completed its biennial assessment of Facebook, which determined the company is making good-faith efforts to implement the GNI Principles with improvement over time. The assessor reported Facebook had “strengthened its systematic review of both privacy and freedom of expression.”
The assessments we’re releasing today underscore the role our services play in providing voice to people, promoting civic and political engagement, and shining a light on human rights issues and abuse, especially in places where activists, human rights defenders and vulnerable communities don’t otherwise have a platform.
They also highlight the threats to people’s rights and encourage us to respond more resolutely. We accept this responsibility, and acknowledge the human rights impacts outlined in these reports. We will continue to strive to keep people safe and their information secure.
Why and How We Do Human Rights Impact Assessments
We’re committed to understanding the role our platforms play offline and how Facebook’s products and policies can evolve to create better outcomes. Engaging independent experts and evaluating our work through the lens of global human rights principles is key to achieving this goal.
The three assessments we commissioned were conducted in accordance with the UN Guiding Principles on Business and Human Rights. The assessment of Facebook’s role in Cambodia was completed by BSR, and the assessments of Sri Lanka and Indonesia by Article One.
While the three assessments focus on Cambodia, Indonesia and Sri Lanka, the recommendations in each report have implications for other contexts, countries and regions in which Facebook is used. This reflects the universal nature of human rights, the global reach of our products, and the intersectionality of the impacts identified.
HRIA Recommendations and Our Progress
The reports each made similar recommendations to help us better protect human rights, including:
- Improving our corporate accountability around human rights
- Updating our Community Standards and improving enforcement
- Investing in changes to platform architecture to promote authoritative information and reduce the spread of abusive content
- Improving reporting mechanisms and response times
- Engaging more regularly and substantively with civil society organizations
- Increasing transparency so that people better understand our approach to content, misinformation and News Feed ranking
- Continuing human rights due diligence
We welcome the recommendations and have taken steps towards filling these gaps, as our responses demonstrate.
Over the last two years, we have formalized an approach for prioritizing countries at risk of conflict and tailoring policy and product solutions to account for the unique needs of each. In Sri Lanka, for example, we are reducing the distribution of frequently reshared messages, which are often associated with clickbait and misinformation. These demotions seek to respect the guidance on permissible limits to freedom of expression under Article 19 of the ICCPR.
We have also taken lessons from Cambodia, where government surveillance of internet and social media use is pervasive. We expanded the ways that users can keep their accounts secure and started encouraging people to use authenticator apps rather than SMS for more secure two-factor authentication.
We also made key updates to our Community Standards. For example, in 2018, we adopted a policy to remove verified misinformation that contributes to the risk of imminent physical harm, which we later expanded to apply to unverifiable rumors. The policy, which is global in scope, is especially applicable to conflict-affected areas, and has been used to remove content in Sri Lanka and Indonesia. We further updated our policies to protect vulnerable groups, including veiled women, LGBTQ+ individuals and human rights activists whose “outing” might increase risks of offline harm. We began using proactive detection technology to identify potentially violating hate speech, developing machine learning capabilities in Sinhala and Bahasa Indonesia. And we expanded our policies against voter interference, which proved critical ahead of elections in Sri Lanka and Indonesia in 2019, and will be equally important ahead of Cambodian elections in 2022 and 2023.
We increased staffing significantly, hiring policy leads and program managers in Sri Lanka, Indonesia and Cambodia, and expanding the number of content reviewers who speak Sinhala, Tamil, Bahasa Indonesia, Javanese and Khmer. With bigger and more specialized teams, we’ve formalized engagement with civil society organizations, many of which serve as a regular and invaluable source of input as we make updates to our Community Standards to account for new types of abuse. Our investment in civil society has further extended to digital and media literacy programs and economic initiatives like #SheMeansBusiness. We also strengthened our local fact-checking partnerships in Sri Lanka and Indonesia, and we’re always looking to expand the program with organizations certified by the International Fact-Checking Network.
As we work to protect human rights and mitigate the adverse impacts of our platform, we have sought to communicate more transparently and build trust with rights holders. We also aim to use our presence in places like Sri Lanka, Indonesia and Cambodia to advance human rights, as outlined in the UN Guiding Principles on Business and Human Rights and in Article One and BSR’s assessments. In particular, we are deeply troubled by the arrests of people who have used Facebook to engage in peaceful political expression, and will continue to advocate for freedom of expression and stronger protections of user data.
The progress we’ve laid out here represents the beginning of our work in Sri Lanka, Indonesia and Cambodia, not the end. We learn from every human rights impact assessment we undertake, and these reports are critical to changing how we operate to better support communities around the world. We have a long road ahead, but sharing some of the progress we’ve made is part of our commitment to demonstrating action and accountability.
Article One undertook its Human Rights Impact Assessments in Indonesia and Sri Lanka during 2018, using a methodology informed by guidance from the UN Guiding Principles on Business and Human Rights as well as Article One’s award-winning experience conducting human rights impact assessments around the world. The Indonesia HRIA involved interviews with 35 organizations, and the HRIA for Sri Lanka reflects interviews with 29 organizations, as well as focus groups with 150 participants. Both assessments encompassed interviews with relevant Facebook employees, and both were funded by Facebook, though Article One retained editorial control of their contents.
BSR undertook its Human Rights Impact Assessment (HRIA) of Cambodia in late 2018 and early 2019, using a methodology based on the UN Guiding Principles on Business and Human Rights (UNGPs). It involved interviews with 35 affected rights holders and stakeholders in Cambodia, as well as in-country research and interviews with relevant Facebook employees. The HRIA was funded by Facebook, though BSR retained editorial control of its contents.