Undoubtedly, connection, self-expression and the ability to build and be part of online communities remain the enduring virtues of social media. Platforms like Facebook have empowered many people who previously didn’t have the means to express themselves freely. This has played out in politics: political debate has erupted online. Facebook has become a stage for democratic debate – among citizens, between candidates and voters, and for campaigners and advocacy groups making their arguments. Much of that is positive – raucous, often intense, but undoubtedly now part of the fabric of an open democracy.
That means Facebook and other social media companies have a big responsibility. Facebook constantly works to improve the way its services foster democratic participation – for example, this month we launched a new Voter Information Center as part of the biggest ever voter information campaign in the US, with the goal of registering 4 million voters. But given how young this technology is, we have not yet fully researched all of its effects. The US elections in 2016 exposed the risk of social media being abused to interfere in elections, and misinformation and political polarization clearly play out on social media platforms too.
To continue to amplify all that is good for democracy on social media, and to mitigate what is not, we need more objective, dispassionate, empirically grounded research. We need to better understand whether social media makes us more polarized as a society, or whether it largely reflects the divisions that already exist; whether it helps people become better informed about politics, or less; and whether it affects people’s attitudes towards government and democracy, including whether and how they vote.
That’s why today we are announcing a new research partnership to better understand the impact of Facebook and Instagram on key political attitudes and behaviors during the US 2020 elections, building on the initiative we launched in 2018. It will examine the impact of how people interact with our products, including content shared in News Feed and across Instagram, and the role of features like content ranking systems.
The effort is a partnership between Facebook researchers and independent external academics. Externally, the project is led by Professors Talia Stroud and Joshua A. Tucker, two independent academics who serve as chairs of Social Science One committees. Professors Tucker and Stroud selected 15 additional researchers to collaborate on this effort, based on their expertise.
Three principles guide our work, and will continue to do so as we move ahead: independence, transparency and consent.
Independence: The external researchers won’t be paid by Facebook, and they won’t answer to Facebook either. Neither the questions they ask nor the conclusions they draw will be restricted by Facebook. We’ve signed the same contract with them that we do with other independent researchers who use our data (it is publicly posted on Social Science One’s website).
Transparency: The researchers have committed to publishing their findings in academic journals in open access format, which means they will be freely available to the public. Facebook and the researchers will also document study plans and hypotheses in advance through a pre-registration process and release those initial commitments when the studies are published. This means that people will be able to check that we did what we said we would – and didn’t hide any of the results. In addition, we plan to release de-identified data from the studies we run so that other researchers can run their own analyses and further check our homework. We have also invited Michael Wagner, a professor at the University of Wisconsin, to document and publicly comment on our research process as an independent observer.
Consent: We are asking for explicit, informed consent from those who opt to be part of research that analyzes individual-level data. Research participants will confirm both that they agree to the use of their data and that they understand how and why it will be used. As part of our studies, we will also analyze aggregated user data on Facebook and Instagram to help us understand patterns. Finally, the studies – and our consent language – were reviewed and approved by an Institutional Review Board (IRB) to ensure they adhere to high ethical standards.
This research is part of Facebook’s wider effort to protect elections. As a company, we’ve looked hard at what went wrong with Russian interference in 2016 and made some big changes. There are now three times as many people working on safety and security issues, more than 35,000 in total, and we work closely with government and law enforcement. Facebook has helped fight interference in more than 200 elections since 2017 and reduced fake news on its platform by more than 50%, according to independent studies.
This research won’t settle every debate about social media and democracy, but we hope and expect the researchers will advance society’s understanding of the intersection of technology and democracy. The answers will help us all to shape the rules of the road for the internet – for the benefit of our democracy, and society as a whole.
For more information, please see below for an FAQ, or read this announcement from the lead researchers.
Q: Who are the independent researchers and how were they chosen?
A: Facebook is working with a group of 17 independent researchers who are experts in the fields of elections, democracy and social media. Social Science One facilitated the start of the project, and two of its committee chairs, Talia Stroud and Joshua A. Tucker, serve as co-chairs of this project. They selected researchers who represent a variety of institutions, disciplines, areas of expertise and methodological traditions. Facebook did not select the researchers and is taking measures to ensure that they operate independently.
- Hunt Allcott, New York University
- Deen Freelon, University of North Carolina at Chapel Hill
- Matthew Gentzkow, Stanford University
- Sandra González-Bailón, University of Pennsylvania
- Andrew Guess, Princeton University
- Shanto Iyengar, Stanford University
- Young Mie Kim, University of Wisconsin-Madison
- David Lazer, Northeastern University
- Neil Malhotra, Stanford University
- Brendan Nyhan, Dartmouth College
- Jennifer Pan, Stanford University
- Jaime Settle, William & Mary
- Talia Stroud, The University of Texas at Austin
- Emily Thorson, Syracuse University
- Rebekah Tromble, The George Washington University
- Joshua A. Tucker, New York University
- Magdalena Wojcieszak, University of California, Davis; University of Amsterdam
Q: What measures will you take to ensure this research is independent, ethical and well done?
- No financial incentives: While Facebook will continue to pay its employees who work on this research, we will not be compensating our external research partners.
- Pre-registration: Facebook and the researchers will pre-register our research together. This means we will document and publish under embargo the hypotheses we plan to investigate prior to beginning our studies, using pre-analysis plans. These plans will become public upon publication of the results. As such, when the studies are published, everyone can be clear that we are reporting all of the results, not just a small selection.
- No pre-publication approval: Regardless of what is discovered, Facebook will not restrict the researchers from publishing their findings. As with other Social Science One research, Facebook is entitled to review (but not approve or reject) research prior to publication, and to remove any confidential or personally identifiable information. (Our operating contract is public and can be viewed on Social Science One’s website.)
- External Institutional Review Board (IRB) ethics process: The studies that Facebook and the independent researchers conduct were submitted to and approved by an IRB.
- Incorporating both consent and privacy in the research design: The research design is built with consent and privacy in mind. For example, it includes obtaining explicit, informed consent from research participants for analyses of individual-level data. Additionally, as part of our studies, we will analyze aggregated user data on Facebook and Instagram to help us understand patterns. We’ll also implement privacy and security practices in the storage and processing of the data, such as access controls and isolated, privacy-protective storage.
- Enabling replication: Facebook plans on partnering with the external research team to deliver de-identified data on all the studies we run. This means that other researchers will be able to, in effect, check our homework by re-running analyses on data that cannot be reasonably linked to an individual.
- Independent Observer: We invited Michael Wagner, a professor at the University of Wisconsin, to document and publicly comment on our research process as an independent observer. We have asked him to publish, once the project is complete, an account of how the research process unfolded, in order to validate the mechanisms above and, in turn, the independence and credibility of the effort.
Q: What methodologies will the researchers use?
A: The independent academics are collaborating with Facebook researchers to design a diverse set of studies to analyze the role of Facebook and Instagram in the US 2020 election. To collect the information for the study, we are partnering with NORC at the University of Chicago, an objective, non-partisan research institution that has been studying public opinion since 1941. NORC possesses deep expertise in survey research, policy evaluation, data collection, advanced analytics and data science. The study was approved by NORC’s Institutional Review Board.
For people who have explicitly opted in to the study, we plan to combine multiple research methods, including surveys and behavioral data analysis, along with targeted changes to some participants’ experiences with Facebook and Instagram. For example, participants could see more or fewer ads in specific categories such as retail, entertainment or politics, or see more or fewer posts in News Feed related to specific topics. Other participants may be asked to stop using Facebook or Instagram for a period of time. A subset of participants may be asked to install an app on their devices – with their permission – that will log other digital media that they consume. This will allow researchers to understand more comprehensively the information environment that people experience.
Q: Will Facebook make changes to its products as a result of the study?
A: We are continually making changes and improvements to our products, and this research, along with other continuous input we receive from external stakeholders, will be considered in this process.
Q: Is it likely that this research will change the outcome of an election?
A: No. With billions of dollars spent on ads, direct mail, canvassing, organizing and get-out-the-vote efforts, it is statistically implausible that one research initiative could affect the outcome of an election. The research has been carefully designed not to impact the outcome of the election or harm participants. The sample of participants represents approximately 0.1% of the US eligible voting population, spread across the country. By better understanding how people use our platform during an election, we can continually enhance the integrity of the platform moving forward.
Q: How will you monitor the effects of the research?
A: Facebook and our research partners will be monitoring the research at every step. In the highly unlikely event they detect unanticipated effects, they will stop the research and take corrective action as needed.
Q: How are you ensuring this work will be done in a way that safeguards people’s privacy?
A: The research design is built with consent and privacy in mind. For example, our design includes obtaining explicit, informed consent from research participants. Additionally, as part of our studies and consistent with our Data Policy, we will also analyze aggregated data on US based Facebook and Instagram users to help us understand patterns. We’ll also implement privacy and security practices in the storage and processing of the data such as data access controls.
Q: When will the study start and end?
A: The study will start soon and end in December. However, it will take the research teams many months to properly analyze all the data, and as such we do not expect to publish any findings until mid-2021 at the earliest.
Q: How do people opt into the study?
A: Representative, scientific samples of people in the US will be selected and invited to participate in the study. Some potential participants will see a notice in Facebook or Instagram inviting them to take part in the study. Study samples will be designed to ensure that participants mirror the diversity of the US adult population, as well as users of Facebook and Instagram.
Q: How many people will participate in the study?
A: We expect between 200,000 and 400,000 US adults may choose to participate in the study, which could include things like taking part in surveys or agreeing to see a different product experience. We will also study trends across Facebook and Instagram – but only in aggregate.
Q: How are you ensuring that this research is ethically sound?
A: As part of the research design, studies that Facebook and the independent researchers conduct underwent IRB review. The research team also received ethical guidance from the independent firm Ethical Resolve to inform study designs.
Q: Will research participants’ data be used to target ads?
A: None of the survey data collected for this research effort from consented research participants will be used for ads targeting.
Q: Is Facebook paying the researchers? How much are you investing in this project?
A: Facebook is investing significant resources in this effort. However, as part of our efforts to ensure the independence of the external research team, we will not pay them.
Q: When will the results be published?
A: We expect initial papers to be available in mid-2021 at the earliest, and hope that many of them will be published shortly thereafter.
Q: Does Facebook have veto power over publishing the results?