Collective victories toward a safer digital future and charting the path ahead


By Wafa Ben-Hassine, Principal, Responsible Technology

We spend a great deal of time thinking about how technology affects society. Yet, despite our reflections, a clear disconnect has emerged. In recent decades, we have allowed digital technology to become unmoored from any societal vision for it. Take what’s happening with real-time messaging apps like WhatsApp, Facebook Messenger, and Telegram, which are used daily by over 4 billion people worldwide. These apps are built for convenience, speed, point-to-point communications, and the ability to connect large groups of people.

With this convenience comes an ease with which users can forward messages without verifying their accuracy. This means that misinformation, disinformation, hate speech, scams, and other harmful content spread quickly, secretly, and at significant scale.

Following the recent Global Encryption Day, we must continue to consider the deluge of mis- and disinformation on messaging platforms and beyond, and discuss how we, as a society, can effectively move toward a safer digital future. One of the most effective ways to do so is by distributing the power held today by a handful of technology companies. Engaging a wider group of stakeholders — including nonprofits, researchers, regulators, and investors — brings a diversity of perspectives to scrutinize challenges and produce more viable, favorable solutions.

At Omidyar Network, we take a holistic approach to trust and safety. It is our vision to work together with advocates, designers, and engineers to create best practices that ensure platforms are more trustworthy, safe, and secure.

We know that many private messaging platforms collect troves of information containing tremendous insights, which feed both into how teams design and trial new product features and into enticing investment from advertisers.

But as of today, much of that information is kept in the exclusive hands of tech companies. The aggregated, anonymized data the platforms collect can, without compromising encryption and user privacy, be used by platforms and researchers alike to shed light on important patterns. Sharing such aggregated metadata could lead to game-changing trust and safety improvements through better features and design choices.

Celebrating victories in the evolving landscape

The landscape continues to expand quickly, spanning everything from metaverses to video gaming to social media and consumer AI applications.

In doing our part, our Digital Trust and Safety team has a track record of supporting organizations that work to persuade corporate leaders, regulators, and policymakers to transcend the binary between privacy (and encryption as a means of achieving it) on the one hand and safety and security on the other.

We have been proud to support progress in the space that protects and upholds individual privacy and security. For instance, when WhatsApp’s privacy policy update asked users to accept invasive changes from business accounts or lose platform functionality, our team quickly supported Public Citizen to launch a rapid-response pressure campaign. The campaign mobilized three dozen social justice, labor, and digital rights organizations from around the world to call on Meta (then known as Facebook) to reverse course on the plan, producing a letter signed by 28 groups denouncing it.

Omidyar Network also elevated the salience of harms on private messaging platforms in the mainstream media and in the U.S. Congress. Our grantee, Dr. Samuel Woolley’s Propaganda Research Lab at the University of Texas at Austin, consulted with the production team of John Oliver’s “Last Week Tonight” show, sharing research and insights on misinformation among U.S. diaspora communities on private messaging apps. Their work led to the production of a full-length episode on the topic, viewed by over 8 million people across the globe. Dr. Woolley also testified before Congress on disinformation targeted at communities of color in the U.S.

Our team’s support of the Institute for Strategic Dialogue (ISD) bolstered the work of key leaders like Chloe Colliver, an extremism and disinformation expert who now supports Ofcom, the UK’s communications regulator. ISD research on disinformation and extremism on private messaging platforms globally was featured in The New York Times, The Guardian, The Atlantic, and other far-reaching media outlets.

The work ahead

Going forward, a global infrastructure is imperative to sustain this work beyond any one funder or initiative. Privacy is a fundamental right that users are entitled to everywhere on the Internet, not just on messaging platforms like WhatsApp and Signal.

The evergreen question we all face is: how do we maintain user privacy while also working together to prevent societal harms? The risks and harms are becoming more pernicious and more varied, with generative artificial intelligence taking the world by storm and causing concerns related to privacy, security threats, and bias, among others. As a philanthropic actor with years of experience in the space, we are committed to finding sustainable solutions to this pressing technology issue.

Back to the question of redistribution of power. A robust technological ecosystem demands a deliberate connection between digital innovators and a diverse set of stakeholders, where digital tech makers and owners purposefully engage and are responsive to a broad range of stakeholders. To do so, technologists need more than the hard skills of coding, engineering, and data science; they need supportive work cultures and professional associations, ethics training, and soft skills that will enable them to openly collaborate, work with integrity, and put purpose above profits. Consumers and communities also have important roles to play in this relationship, sharing their lived experiences, advocating for and demanding better design, and working to embed their needs in the design, deployment, and improvement of tech.

We all have the ability — and the obligation — to steer, shape, and govern digital technology in service of a more inclusive and equitable society. All of us — philanthropists, technologists, entrepreneurs, policymakers, academics, advocates, movement leaders, students, consumers, and investors — have different, yet complementary strengths to contribute. It will require us to learn from each other, collaborate, and connect our individual work to the larger system. It will require a fundamentally different, more systematic approach than we have tried before. And it will ultimately determine if we force tech to improve society or allow it to continue to further destabilize our societies and diminish trust in one another.