Measuring and reporting these numbers regularly, with the same frequency as we report our earnings and business results, is important. It lets others see how we’re doing and hold us accountable, and it helps us measure progress and spot places where we’re not doing enough.
But transparency is only helpful if the information we share is useful and accurate. In the context of the Community Standards Enforcement Report, that means the metrics we report must be based on sound methodology and accurately reflect what’s happening on our platform. To this end, over a year ago we worked with international experts in measurement, statistics, law, economics and governance to provide an independent, public assessment of whether the metrics we share in the enforcement report offer accurate and useful measures of Facebook’s content moderation challenges and our work to address them. They broadly agreed that we are looking at the right metrics, and they offered recommendations for improvement, including launching appeals reporting. You can read the full report here.
Now we are taking this one step further to validate that the metrics are measured and reported correctly. Over the past year, we’ve been working with auditors internally to assess how the metrics we report can be audited most effectively. This week, we are issuing a Request For Proposal (RFP) to external auditors to conduct an independent audit of these metrics. We hope to begin this audit in 2021 and to have the auditors publish their assessments once the audit is complete.
No company should grade its own homework, and the credibility of our systems should be earned, not assumed. We believe independent audits and assessments are crucial to hold us accountable and to help us do better.