A Partnership for More Responsible Tech: Assessing Facebook’s Role in Elections

Last week, Omidyar Network’s Tech and Society Solutions Lab announced a joint partnership with several leading foundations to fund independent academic research into the role that Facebook plays in elections. We are particularly pleased to join our sister organization, Democracy Fund, in this effort, which demonstrates Pierre Omidyar’s deep commitment to ensuring that his philanthropic efforts are collectively addressing the unintended consequences of technology and its impacts on our democracy.

Over the past week, as Facebook has faced scrutiny in Washington and in the press, we have been heartened to hear support across our network for our efforts to move beyond the problems and focus on finding solutions. With that support have come questions about the partnership, our role, and the motivations that brought us to the table, all of which we endeavor to answer here to better illustrate the potential of this partnership in shaping the future of the tech industry.

Why are you funding this effort?

We firmly believe that technology can be a massive force for good, and we often invest against that thesis, but we have also become increasingly concerned about the unintended consequences that have emerged. Our newly founded Tech and Society Solutions Lab was created with the remit to ensure that technologists take broader responsibility for the implications of their products on society. As we have begun to dig in on this work, we have found a remarkable lack of existing solutions based on sound analysis – it simply has not been possible either to pinpoint the problems or to develop solutions because of a near-complete lack of transparent access to data. The fact that Facebook has agreed to safely and responsibly share its data with an independent group of academics is a critical first step toward clearly defining the problems and identifying robust solutions to address them.

The fact that both the academic selection committee and the researchers will be independent is also of critical importance. The research will be peer-reviewed and, significantly, the results will not be subject to Facebook’s review or approval. To date, the majority of analysis of technology’s impacts has come from within the big tech companies themselves. In this case, an external, impartial, non-partisan, and diverse body of researchers will have the ability to independently assess the problem and offer concrete solutions – which we hope will not only have significant implications for Facebook in the midterm elections and beyond, but will also encourage other tech companies to follow suit.

Finally, we’re excited about the considerable possibilities inherent in the research being done, not just for Facebook, but for the broader social media industry. The potential topics for analysis – misinformation; polarizing content; promoting freedom of expression and association; protecting domestic elections from foreign interference; and civic engagement – have huge implications not only for the tech industry, but for our society at large. While we can neither suggest what the results might be nor guarantee that they will be acted upon, the potential for an improved, more responsible tech industry is so significant that we are willing to invest in making it possible. We hope that this will set a model – leveraging independent research based on actual data to improve products – that other social media platforms and tech companies can follow.

Omidyar Network, through its Governance & Citizen Engagement initiative (GCE), has been supporting transparency and open government for many years. The Omidyar Group (TOG) also recently authored an in-depth paper asking, “Is Social Media a Threat to Democracy?” which identified six ways in which digital platforms may pose direct challenges to our democratic ideals. This important work by the Tech and Society Solutions Lab represents one aspect of our activities to enable a responsible tech industry. The Governance & Citizen Engagement initiative is taking a different and complementary approach, focusing on funding independent media, policy, and advocacy.

Why now?

A brief glance at the daily headlines reveals how important this issue has become. And as we saw when the markets responded to the numerous challenges facing Facebook, there are real implications for tech companies’ bottom lines. We are at the beginning of an industry shift. Instead of simply assuming that our technologies are doing good in the world, technologists need to learn how to better anticipate, prevent, and correct vulnerabilities in their technologies – in order to build stronger products that keep both their company’s and humanity’s best interests in mind. Firms like Facebook have recently recognized this and pledged to responsibly and safely open their data to begin addressing these concerns, so we felt this was an important moment to set up a process. We are hopeful that other tech companies will soon follow this example.

How will the effort be operationalized to ensure transparency and accountability?

As funders, we will be supporting an independent process that leads to independent research. We will not have access to the data, nor will we have input into the results. None of the funders will. An independent committee of scholars affiliated with the Social Science Research Council (SSRC) will create the research agenda, set the decision-making criteria, and select the researchers to perform the work. Regular public updates will be provided, and SSRC will actively seek feedback from ethicists, advocates, and civil society. We believe that with these safeguards, the results that emerge will indeed be transparent and independently created.

Why this coalition of funders?

The William and Flora Hewlett Foundation has assembled a coalition of funders – in addition to Democracy Fund and ourselves, we are joined by the John S. and James L. Knight Foundation, the Charles Koch Foundation, the Laura and John Arnold Foundation, and the Alfred P. Sloan Foundation – representing a range of perspectives. We often join peer funders to advance leading organizations, as is best practice in philanthropy. It doesn’t mean that we agree with our peer funders on every approach, but in this case it does mean we believe in the potential and importance of this particular research effort into Facebook’s role in society.

What happens once the research is finalized?

Put simply, we don’t yet know – but it is our hope that the research will yield important insights that are fully embraced by Facebook and lead to changes in the platform that both protect future elections from being compromised by malicious actors and empower citizens to make the best possible choices.

We are realistic about the range of motivations of the players at the table, but also know that systemic change does not come without a few uncertain bets. We believe this is a critical step forward at a critical juncture, and will remain vigilant for the rest of the journey to ensure that the outcome moves us closer to a tech industry that embraces its fundamental responsibilities to society.

In that vein, we will endeavor to continue sharing information about this project as it progresses – key milestones, significant decisions, even lessons learned. We hope this will be a productive and active dialogue that leads to real and meaningful change and serves as a model that encourages other companies to act.