Point of View

Our Vision for a Responsible Tech Future

Technology, like art, is a soaring exercise of the human imagination.

Daniel Bell, sociologist and writer on the information society
Introduction

Woven into almost every aspect of modern life, digital technology is one of the greatest collective endeavors in history and is radically transforming society. Digital technology helps us respond to global challenges so we can better diagnose and treat diseases and extend life expectancy; make wondrous new things possible, like providing distance and virtual learning environments for students who would otherwise be left behind; accelerate progress to make public services more accessible and accountable; put a powerful computer in the hands of anyone who owns a cell phone; entertain and connect people in powerful new ways; and so much more.

“It’s not that we use technology, we live technology.”

Godfrey Reggio, director who pioneered an experimental documentary genre

For too long, we have allowed digital technology to march forward unmoored from any societal vision, waiting until the damage is done to ask how society can remedy its harms. As this happens, trust in the potential of technology shifts to distrust. This must change. While digital technology and society are increasingly and inextricably linked, we believe digital technology must develop in service of society. We must start by defining the kind of society we want, then ask how we can use technology’s potential to achieve this goal. We have failed to take this approach to date, in part due to outdated mindsets about economic norms, technology, and its inevitability, but also due to our tendency to think of technology as simply a “sector” rather than something bigger. As serial entrepreneur Anil Dash writes, “Technology isn’t an industry, it’s a method of transforming the culture of existing systems and institutions.” Against this backdrop, we must consider much more intentional governance of technology in its broadest terms, including culture, norms, mindsets, institutions, ethics, participation, and more.

Omidyar Network's Journey

We have seen and enthusiastically supported many of the possibilities unlocked by digital technology. Since Omidyar Network’s founding in 2004, we have invested more than $750 million in technology start-ups aimed at improving people’s lives. Our Silicon Valley origins drive our belief in the positive potential of digital technology as well as the importance of start-ups, competition, innovation, and resulting economic dynamism.

We were early investors in Change.org, MeetUp.com, and Indie.VC, as well as digital currency (pre-crypto) and early blockchain innovations. We were the first investor in MOSIP, a modular and open-source identity platform that has worked with seven countries (and counting) to implement a digital, foundational ID in a cost-effective way, while embracing the best practices of scalability, security, and privacy to harness the power of open source. And we have invested in FinTech, EdTech, Geospatial Tech, Civic Tech, Privacy Tech, and more.

During the early days of Omidyar Network, we learned a lot about how companies succeed, and fail, and the “kill zones” around big platforms. We learned that disruptive innovation and blitzscaling can either reinforce or reform the undesirable aspects of systems we are working to change. Ultimately, we also learned that entrepreneurial investments and innovations at scale alone are not sufficient to change systems.

And now, we strongly believe that society must choose to govern the broader digital technology system with true intentionality to advance our society for good.

Guiding Principles

1. Digital technology should be guided by a democratic vision of a good society.

For Omidyar Network, the ideal society is one that recognizes the importance of individual human flourishing, freedom, and capabilities balanced by the role of the common good, community, and collective institutions. Our goal is “sustainable wellbeing with dignity and fairness for humans and the rest of nature.” To achieve this, society must:

  • Ensure individual dignity, including meeting all fundamental human needs such as food, shelter, respect, health, education, privacy, security, voice, and purpose. Everyone should feel like they belong.
  • Enable individuals to have both the freedoms and the capabilities to participate to their fullest.
  • Build connection and recognize interdependence within our communities and within and across generations (including future generations), to allow for meaningful relationships and contributions to community, enabling what Margaret Levi refers to as “an expanded community of fate.”
  • Create and maintain the conditions for a fair distribution of resources, income, and wealth.
  • Develop governance systems that are inclusive, responsive, just, and accountable; instill a shared sense of trust and trustworthiness; and, as Danielle Allen writes, balance individual freedom with non-domination.
  • Steward our planet’s greatest resources and mitigate the impacts of climate change so we are more resilient.

As Esther Dyson cautioned, “Don’t leave hold of your common sense. Think about what you’re doing and how the technology can enhance it. Don’t think about technology first.” We concur. It is past time for society to assert its guidance over digital technology.

2. Digital tech is different. And yet, in many ways the same.

Most major tech revolutions have broad similarities, but the pervasiveness and unique characteristics of digital technology create distinct opportunities and challenges. We need to understand its singular attributes to better anticipate and plan for unintended consequences and new scenarios.

Certain elements make digital technology more challenging for society to manage than past technology revolutions:

  • Digital technology’s speed and virality, lack of friction, scale, pervasiveness, and rapid evolutionary nature;
  • AI’s distinct self-learning capabilities and its ability to replace thinking work and impersonate humans (e.g., bots, natural language processing, deepfakes);
  • Web3 claims and aspirations from some quarters to operate outside the realm of government control;
  • Digital distribution channels and networks that operate with near zero marginal costs, capitalizing on a massive installed, connected base that is increasingly controlled and centralized by a few gatekeepers (increasingly in crypto/Web3);
  • More than most other technologies, digital apps can be (and at times are) designed to hijack or manipulate our emotions and psychology; and
  • Widespread lack of understanding about how complex technology works and what’s inside “the black box.”

These unique attributes may lull us into a deterministic sense that digital technology is exceptional and there is nothing we can do to influence its path. To achieve our vision for society, we must move beyond this narrative.

Every other technology in modern history (e.g., electricity, telephony, cars, TV, radio, biomedicine) has come with idealistic visions. But some were overtaken by bad actors or had unintended consequences. As a society, we have learned—however imperfectly—that to reap the benefits of technology at scale, and manage its downsides, we must be more intentional about how we shape it. We will never foresee every problem, but we can use lessons from the past to better anticipate what is coming. If we had stopped to think about the impact of the mass production of automobiles on the environment 75 years ago, we might have avoided the current climate crisis by investing in different fuel sources and establishing regulations that are proving much harder to enact in today’s entrenched political economy.

We can glean lessons from the societal frameworks established over the 20th century for other powerful technologies—broadcast, free speech, network-effect businesses, and utilities—or dangerous, complex, dual-use technologies like nuclear power. But the novelty of new digital technologies will also require new frameworks.

We must remember that digital technology doesn’t just happen. Both deliberate decision and passive indecision have led us to where we are now. While we are facing the impact (or lack) of these decisions now, what happens or doesn’t happen next is up to us and the choices we make, not a law of physics.

3. We need to build out the system.

By elevating the moral imagination of what’s possible, we can build a system that creates and encourages technology to serve the society we want to live in.

Society has a role in stewarding other technologies, setting ethical codes, bounding uses, and ensuring pro-social outcomes. It is time to reassert this role. As explained by Carlota Perez, after every technological revolution, society has had to work and adjust to build out new social structures, education regimes, and norms to support, accommodate, and channel the new realities wrought by technological disruption. This time should be no different. And, if done right, as Dr. Perez says, a “new golden age of technology” should be possible.

When people hear the term “system,” they tend to think of tech companies, start-ups, government, and perhaps academia. We define it more broadly, to include both the “hardware” and “software.” The hardware includes the companies, institutions, investors, laws, incentives, and policies that shape and drive the system. Shifts in our system’s software—mindsets, ethics and transparency norms, power relationships, and, ultimately, who participates—are just as critical, perhaps even more so.

Six Core Elements to a Healthy Digital Tech System

Establishing a healthy digital technology system requires a holistic approach that considers six critical and deeply inter-related elements and pays attention to the entire system, rather than fragmented pieces of it:

1. Inclusive Participation
2. Stronger Ethics, Greater Transparency
3. New Paradigms
4. Meaningful Oversight
5. Expansive Innovation
6. Empowered Consumers, Responsive Makers

Our primary focus is the United States, but with a global outlook. Silicon Valley continues to have a dominant influence over the digital technology world, and what happens in the US reverberates throughout the world. We say this without hubris, but rather with a clear-eyed realization that to steer digital tech toward the common good, much needs to be done in our own backyard. Action must happen in the US (and Europe, and in many other countries), and the ideas will come from around the world. Technology is so ubiquitous that everyone has a stake.

We believe that successfully creating such a system will advance our society for good.

1. Inclusive Participation

The majority of those participating in building and working on tech come to it sincerely, wanting to have a positive impact and do good. We must build on this impulse. But these intentions can sometimes be subverted elsewhere in the system, and tech still employs only a narrow subset of the broader community whose lives it so deeply shapes.

Representation matters profoundly, and it must permeate the five other elements of the system. Technology can instantly be more responsive and responsible by vastly expanding who finances, creates, governs, and delivers it.

But today, in the US, we have too much concentration of voice:

80% of technology executives are men; 20% are women. (U.S. Equal Employment Opportunity Commission)
82% are white, while only 3% are Hispanic and 2% are Black. (U.S. Equal Employment Opportunity Commission)
92.6% of software developers identify as heterosexual, and only 0.9% identify as transgender. (Dice)

The venture community also lacks diversity:

58% of venture capitalists are white men and 11% are white women. (Forbes)
2% are Black men and 1% are Black women. (Forbes)
1% are Latino and less than 1% are Latina. (Forbes)

And VCs typically fund people in their networks and who look like them:

1.2% of VC dollars are invested in Black startup entrepreneurs. (Crunchbase News)

Like many of the systems in the US today, as digital technologies began to take hold, they did so accounting for only a narrow set of voices, primarily those of straight white men. Women, people of color, LGBTQIA+ people, and people with disabilities and special needs have consistently been under-represented, both as builders and as users. These tech workers and investors are the people who regularly make quiet tradeoffs among product features, usability, design, privacy, security, and beyond, tradeoffs that often place the interests of the product provider over the impact on end users. This fundamental flaw breeds distrust and inequities and must be addressed.

As a McKinsey study notes, diverse teams, including those with greater gender diversity, are on average more creative, innovative, and, ultimately, associated with greater profitability. The correlation between higher levels of employee diversity and stronger financial performance has been demonstrated consistently across sectors and geographies, and tech is no different. The current lack of diversity in the digital technology space is leaving many potential innovations, use cases, and markets on the table.

Diversity also matters in the boardroom, as board members often have significant influence over corporate behavior. Yet, according to the Spencer Stuart Board Index, although 172 new directors were added to the top 200 US tech company boards in 2020 for a total of 1,805 current board seats, Black professionals hold just 55 seats (3% of the total seats available). This is another area that requires action, and not just among the technology companies themselves. A broader, more diverse range of participants from around the world and at all levels of the system—e.g., standards bodies, regulators, policymakers, international organizations—will ensure decisions made about the future of technology reflect the interests, needs, and input of all stakeholders.

This will require:

  • Listening and deeply engaging with the communities that are most often left out of conversations about tech—and are the ones most frequently harmed.
  • Building new and diverse talent pipelines into all levels and aspects of digital technology: computer science and coding, business, policy and regulations, entrepreneurs, civil society, movement leaders, executives and board members, and beyond.
  • Devoting more attention to tech capabilities, judgment, and ethics in our education system to prepare a digitally literate generation. Continuing IT education often focuses more on technical skills than social ones.
  • Creating networks of non-traditional founders and incumbent technology company workers and supporting them with the tools and community they need to succeed.
  • Seeking out and funding start-ups outside the usual locales, communities, and education programs.
  • Taking the same actions for tech workers and governors working outside the technology field (e.g., banking, health care, government).

Additionally, diverse and inclusive participation requires access. And access is still uneven, not surprisingly, along race, class, and geographic lines. According to Pew, at least 18 million Americans nationwide—and perhaps more than 42 million—lack access to high-speed internet service. And millions more cannot afford a broadband connection even if one is available. State and federal governments need to bridge this divide by focusing on the availability, affordability, and adoption of both high-speed internet service and the suite of devices and applications that allow for beneficial use, including digital public infrastructure that gives even the least connected citizens access to the modern digital economy.

2. Stronger Ethics, Greater Transparency

A critical way to ensure future digital technologies serve society is to establish clear ethical codes and norms that are grounded in shared values. As Gene Kimmelman, former senior adviser to the Department of Justice and former president of Public Knowledge, said, “We are constantly trying to adapt market practices and regulations to fit the new technology into old norms and rules (e.g., crypto, fintech) instead of addressing whether the new technology has such profound ethical implications that we must first address whether such technology should be used at all. We simply have no ‘nuclear freeze’ or circuit breaker available to turn this process around.”

Society has built ethical frameworks guiding most other novel technologies in the 19th and 20th centuries: nuclear energy; biomedicine, genetics, and healthcare; and agriculture and genetically modified foods to name a few. Digital technology (across all applications, not just AI) should be no different.

Starting with AI, we must address algorithmic biases in all forms (pre-existing, technical, and emergent) given their tremendous economic and social impact. From hiring to lending, from criminal justice to housing, and from health care to the dissemination of news and information, “artificial algorithms are increasingly being deployed to inform, endorse, and govern various aspects of today’s society.” Beyond biases, various other aspects of AI also require hard questions and new ethical norms about usage, conditions, data aggregation, and governance.
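To make algorithmic-bias auditing concrete, consider a minimal, purely hypothetical sketch of one widely cited check, the “80% rule,” which compares selection rates (e.g., hiring or lending approvals) across demographic groups. The names and data below are invented for illustration; this is one example of the kind of scrutiny such norms would demand, not a method prescribed by this document.

```python
# Illustrative sketch only: the "80% rule" (disparate impact ratio)
# for auditing a model's decisions across demographic groups.
# All names and data are hypothetical.

from collections import defaultdict

def selection_rates(decisions: list[tuple[str, bool]]) -> dict[str, float]:
    """Map (group, was_selected) pairs to a selection rate per group."""
    totals, selected = defaultdict(int), defaultdict(int)
    for group, was_selected in decisions:
        totals[group] += 1
        selected[group] += was_selected
    return {g: selected[g] / totals[g] for g in totals}

def disparate_impact(rates: dict[str, float]) -> float:
    """Ratio of lowest to highest group selection rate; values below
    roughly 0.8 are a conventional red flag for disparate impact."""
    return min(rates.values()) / max(rates.values())

if __name__ == "__main__":
    # Hypothetical outputs of a hiring model: (group, hired?)
    outcomes = [("A", True), ("A", True), ("A", False),
                ("B", True), ("B", False), ("B", False)]
    rates = selection_rates(outcomes)
    print(rates)                    # group A: ~0.67, group B: ~0.33
    print(disparate_impact(rates))  # 0.5, below 0.8, so flag for review
```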

We also need ethical frameworks to contend with novel challenges from open source to cyber to decentralized and automated DAO decision-making, as well as platforms like voice and biometrics, encryption, cryptocurrency (including its impact on the environment), blockchain, and the metaverse. Understanding the “why” behind these new technologies will help us to better guide them. Such ethical frameworks should help us reckon with potential harms during the development phase, rather than ex post, and prevent many harms from coming to fruition by asking “Should I build this?” or, at minimum, “How can I build this better?” These are questions we’ve been asking as they pertain to encrypted messaging platforms.

These ethical frameworks should account for the indirect impact digital technologies have on individuals and communities (e.g., automation and AI replacing workers; data centers and crypto adversely impacting the environment; sharing and selling personal data eroding privacy and trust).

Tech companies must make greater investments in internal teams with meaningful authority to reckon with harms and shore up the safety and integrity of their products and platforms. And they must build new incentives into the system to avoid rewarding bad behavior (perhaps a digital technology equivalent of a Vickrey auction that rewards trustworthiness).
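For readers unfamiliar with the reference: in a Vickrey (sealed-bid, second-price) auction, the highest bidder wins but pays only the second-highest bid, which makes bidding one’s true value the dominant strategy. The sketch below is purely illustrative of that incentive design; the names and numbers are hypothetical, and it is not a proposal for how a “trustworthiness auction” would actually be built.

```python
# Minimal sketch of a sealed-bid second-price (Vickrey) auction,
# the mechanism referenced above. Illustrative and hypothetical.

def vickrey_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Highest bidder wins but pays the second-highest bid, so neither
    overstating nor understating one's true value improves the outcome."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]  # second-highest bid sets the price paid
    return winner, price

if __name__ == "__main__":
    # Hypothetical bids; each bidder's best move is to bid honestly.
    bids = {"alice": 120.0, "bob": 100.0, "carol": 90.0}
    winner, price = vickrey_auction(bids)
    print(winner, price)  # alice 100.0
```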

Again, we cannot overlook the role of government. Government procurement is a massive tool that can be used to encourage trustworthy and ethical norms and behavior that will lead to better outcomes.

To ensure that normative and ethical standards are developed and upheld, civil society organizations such as the Trust & Safety Professional Association, Integrity Institute, Whistleblower Aid, Coworker.org, and the Algorithmic Justice League, professional bodies like IEEE, and many others have a critical responsibility.

People outside the tech sector who rely on technology to go about their daily lives also have a role to play. Improved digital literacy is going to be essential so that users can be better consumers of tech and have true digital agency. To do so, they need to better understand how their own needs are not being well-served by the current system and demand a dramatic shift in digital technology governance. Kimmelman suggests we need a broad educational program—beyond technologists and policymakers—that brings ethical considerations and normative choices into classrooms.

For education and new ethical codes to succeed, they must be accompanied by requirements for and better practices of transparency, which is critical for accountability and oversight—especially as digital technology (AI specifically) gets smarter on its own.

To be clear, we advocate for a broad definition of transparency for digital technology, including technical issues such as algorithms, data, APIs, and privacy, as well as corporate and labor practices like human rights, manufacturing, procurement, hiring and DEI, and harms and violations.

Improving transparency requires increased use of open-source code, greater interoperability, and new protocols that will inherently drive the sharing of knowledge across actors (including potentially creating systems that will enable people to see where their data is being sold or shared) and improve understanding of how key “choke point” decisions are made. Tolga Kurtoglu, former CEO of PARC, has argued that AI needs to develop the equivalent of aviation’s flight data recorders to make it more explainable.
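As a thought experiment, here is a minimal sketch of what such an AI “flight recorder” might look like in practice: an append-only audit log that records each prediction’s inputs, output, and model version for later inspection. Everything here (function names, log format) is hypothetical, and a real system would need the privacy and anonymization protections discussed elsewhere in this document.

```python
# Hypothetical sketch of an AI "flight recorder": an append-only
# audit log of every prediction, for later explanation and review.

import json
import time
from typing import Any, Callable

def with_flight_recorder(model_fn: Callable[[Any], Any],
                         model_version: str,
                         log_path: str = "audit_log.jsonl"):
    """Wrap a prediction function so each call is logged before returning."""
    def recorded(features: Any) -> Any:
        output = model_fn(features)
        record = {"ts": time.time(),
                  "model_version": model_version,
                  "input": features,
                  "output": output}
        with open(log_path, "a") as f:  # append-only, like a black box
            f.write(json.dumps(record) + "\n")
        return output
    return recorded

if __name__ == "__main__":
    # A trivial stand-in model: the mean of its input features.
    score = with_flight_recorder(lambda xs: sum(xs) / len(xs), "demo-v1")
    print(score([0.2, 0.8, 0.5]))  # prediction works normally; call is logged
```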

As noted in The Open Road report by Demos, “More openness means more innovation. More transparency means more scrutiny, which means fewer overlooked security vulnerabilities. Openness favors the development of ‘good technology,’ which embeds privacy, security, and other protections in its design.” Openness also illuminates shortcomings in code and design, leading to more robust applications and solutions.

Transparency should be the default for digital technologies, and anonymized data should be expected by law (with clear use/purpose limitations) to be shared as part of the social license to operate. We believe more anonymized data should be made available to qualified researchers across academia, the media, civil society, and certain government agencies. This will enable various actors in our society to understand trends, impacts, benefits, and harms that can inform future action, protect the public interest, and hold responsible parties accountable.

Concentration of Knowledge

As digital technology gets more complex, there will be fewer people who truly understand the algorithms, codes, and rules that drive its workings and outputs. This is technology’s “black box.” This concentration of knowledge could have dangerous consequences. We must build new mechanisms through which society (and regulators) can inspect issues of critical or systemic importance and alter them as needed (with appropriate privacy and anonymization protections built in from the start).

Dispersing this concentration of knowledge requires a multi-pronged, multi-disciplinary approach, from both the public and private sectors. This includes expanding opportunities for STEM education; increasing funding for basic research; providing greater transparency, open source, and interoperability; mandating publication requirements; creating literate and competent regulatory bodies; and, as Mariana Mazzucato suggests, rethinking the capacities and role of government to focus on the most wicked challenges of our time.

Technology is a useful servant but a dangerous master.

Christian Lous Lange, historian, teacher, political scientist, and Nobel Peace Prize winner
3. New Paradigms

Digital technology does not exist in a vacuum. It is subject to the same mindsets and beliefs that govern private markets, our culture, and even our government.

Digital technology came of age amidst peak neoliberal assumptions, and the prevailing economic paradigm we’ve been living under in the US since the early 1980s is one of the deep root causes of technology’s challenges.

As a result, we still govern (or choose not to govern) our digital technology in the US with a deeply laissez-faire framework built on a naive Magna Carta for the Information Age that incorrectly said, “The coming of the Third Wave turns that equation inside-out. The complexity of Third Wave society is too great for any centrally planned bureaucracy to manage. Demassification, customization, individuality, freedom—these are the keys to success for Third Wave civilization.” Policies established decades ago, driven by misunderstanding and ideology, have harmed consumers, communities, and society.

Shareholder primacy is also a foundational problem, with tech sector profits often coming at the expense of workers, the public square and democracy, our social fabric, and the environment. Additionally, the current economic paradigm incentivizes privatizing the gains and socializing the harms, while avoiding any meaningful accountability. This all contributes to a general fatalism about tech, a sense of its inevitability, and pessimism that it is too late to act.

As Omidyar Network articulated in “Our Call to Reimagine Capitalism,” we believe a new economic paradigm—one that is inclusive of the digital technology sector—must place individual, community, and societal well-being at the center, enabling everyone to meaningfully participate in our economy, democracy, and society. The digital technology system we aim to build must also focus on solutions that support a more equitable, inclusive, and resilient society, and take a multi-stakeholder view of tech companies’ obligations to do more than earn and maximize profits.

And like the economic paradigm, we see similar challenges with the prevailing data paradigm. Most current business models treat data as property that can be traded away with a simple initial click. If data is property, others (and typically not the data subject) can profit greatly from the recombination of that data and the insights it generates. It is a lopsided value proposition that underscores the concentration of power that corporations and governments hold over our data. Consent, cookies, and privacy policies do not solve this challenge. Anyone who opts out is excluded from participating in the modern world, making the system deceptive, coercive, and extractive.

We need a new mindset—and accompanying new business models that are not extractive—to recharacterize data and guide how its economic value is derived and shared. Foundationally, this new mindset should be grounded in fairness and contribution. Instead of individual (or corporate) property, data should be used in the public interest and have a greater benefit for society. This intersects with the imperative for transparency noted above. Currently, what happens to one’s data is a mystery.

Consumers have no means to see or understand where their data are being sold or shared for secondary, tertiary, and other uses. However, establishing this transparency is certainly feasible at a technical level, and solutions to break data monopolies already exist (e.g., more interoperability).

Arguing that data might be better thought of as the new OxyContin, Tim O’Reilly notes, “Like an opioid, data is highly addictive and dangerous when overprescribed, but extremely useful when prescribed correctly. It is harmful when companies turn it against their users to enhance their profits or competitive position, but beneficial when it is used on behalf of the people from whom it is collected. This metaphor…leads us to ask what benefits come from having so much data, what harms it creates when it is misused, and how to limit those harms.”

A complete system will not only challenge the current set of outdated and corrosive mindsets, assumptions, and paradigms, but also develop new paradigms that will result in a more prosocial, responsible digital tech ecosystem.

Mindsets Matter

Mindsets reflect our values and guide how we build our society, shaping everything from incentives to regulation and from who participates to who benefits. Yet what we believe is becoming more important than what we know. Changing how people view digital technology requires shifting the hidden assumptions and underlying (often inexplicit) mindsets about its role in society.

Currently, several dominant mindsets are undermining our belief that technology should serve society, and not the converse:

  • Digital technology shapes and determines who we are and how we live, and there is inexorable progress towards the “singularity” beyond human control.
  • Innovation is always good and regulation stifles innovation, because, in part, technologists know more than regulators.
  • Technology’s benefits are greater than concerns about privacy, monopoly power, mis- and disinformation, exclusion, or its effect on jobs.
  • The Internet (or even digital technology) should be exclusively run by the private sector.
  • US regulation will allow China to control the Internet and outcompete the US.
  • Data is individual property, given up in exchange for services to companies who monetize it.
  • Digital tech’s speed and scale alone determine its success.

These mindsets are a disservice to the society we want to create. To maximize the potential of technology, and minimize its harms, we must sprint toward a new global mindset that reflects an understanding that we have the ability—and the obligation—to steer, shape, and govern digital technology in service of society. The new mindsets we want to instill include:

  • It is not too late to act. We can still be masters of our own destiny in the digital world. The choice is ours.
  • Digital technology should develop in service of a shared vision of society, and not vice versa.
  • All stakeholders should be included in, and benefit from, the digital technology value chain.
  • We can have both innovation and regulation.
  • We can shape and direct the future of digital technology.
  • To ensure digital technology promotes societal good, we can—and must—successfully compete with all other countries.

 

4. Meaningful Oversight

Ensuring that digital tech lives up to its promise and holding it to account requires a strengthened set of competencies, institutions, standards, and statutory authorities.

Across numerous sectors, including government, civil society, media, the academy, and within tech itself, we need stronger and better institutional capabilities to monitor, manage, and implement new and updated frameworks that will address technology’s effects on individuals, communities, and society.

Today in the US, we have inadequate oversight and weak regulation. As noted above, this is partly because digital technologies came of age in an era of laissez-faire neoliberal deregulation. It can also be attributed to too many government policymakers who choose not to properly understand technology and its advances, and/or are financially beholden to or overly influenced by tech lobbying efforts. Adding to the challenge, dominant companies intentionally misdirect policymakers, and our governing institutions lack the capacity to conduct robust oversight. This must change. A lack of meaningful competition policy has resulted in a world where five big tech firms had an August 2022 market cap of almost $8.5 trillion, larger than the sovereign economies of Germany or Japan.

Competition policy is important, but requires other values—such as redundancy, resiliency, localism, and diversity—to play an equal role to counteract a tendency toward monopoly or oligopoly.

Countering another narrative, we believe it is a false dichotomy to pit innovation against regulation and technologists’ smarts against government incomprehension. We can and must have both. As Shannon Ellis, an expert in data science and a teaching professor at the University of California, San Diego, said, “Note that the internet can remain free while data, information, and systems can be regulated. There is space for both.” To wit: for years, FinTech has been among the fastest-growing and largest categories of VC investment, despite the banking sector being among the most heavily regulated. Biomedicine is another heavily regulated sector, yet it took less than nine months to develop and roll out an entirely new class of life-saving mRNA Covid vaccines. Indeed, with better incentives and regulation, we can unleash innovation in business models, products, and competitive features.

After decades of paralysis, both federal and state governments must urgently catch up. The EU has admirably shown leadership here, and its example helps show what’s possible in ways that don’t impede innovation. We need to train and expand the capacity of US regulators and policymakers so they are well-versed in the nuances of digital technology and not beholden to the interpretations of tech executives and their lobbyists. (The five biggest tech firms alone spend roughly $60 million a year on lobbying in the US.)

While this point of view takes a US-centric lens, we recognize that digital technology does not adhere to borders, nor does the capital that often funds it. Oversight and regulation must have a global perspective on issues from data flows to digital trade and from IP to cyber to privacy protections—and create a floor that incentivizes our common beliefs and values.

We are pleased to see some initial progress on competition, anti-monopoly, and privacy policies, but there remains much work to be done to account for the impact of digital technology on speech, trade, banking and currency, licensing, health, and even taxation. Advising and educating policymakers on these broad and significant issues may require new independent, expert commissions, and perhaps even new or revamped institutions with new mandates and capabilities.

Regulations should also consider interventions based on systemic importance, scale and maturity, and real-world harms. At times, this will require adapting or revisiting prior regulatory frameworks, while other circumstances will prompt new theories and frameworks to account, for instance, for business models that have no explicit consumer pricing.

Beyond regulation, there are many other tools that work as both carrot and stick, including:

  • Litigation,
  • Public options (e.g., ride share apps, social media),
  • Protocol and interoperability requirements,
  • Standard setting and government procurements that require such standards,
  • Regulatory sandboxes,
  • Taxes to channel and incentivize behavior and/or address inevitable harms that tech providers may not be able to prevent but are obligated to address or mitigate, and
  • Multistakeholder/multinational institutions to address inherently cross-border and cross-sector tech issues.

We recognize that it is one thing to develop regulations and a regulatory framework, and completely another to do so well. We have certainly seen government regulation implemented badly, creating unintended consequences that become very difficult to unwind (e.g., regulations intended to protect the environment used instead as a smokescreen to thwart housing development).

We cannot simply dust off prior frameworks from the steel and railroad age without modernizing them. And regulation without political reform often suffers from severe regulatory capture by those it is meant to regulate.

However, tech in major segments is already being regulated, just not in the public interest. It is being done invisibly by large, dominant private actors, whether Amazon or Microsoft in cloud services, Apple or Alphabet in app stores, Uber and Lyft on what drivers can be paid, or Facebook and Google in what news appears in one’s feed or search.

5. Expansive Innovation

A healthy digital technology system requires experimentation, variety, participation, volume, and financing. After all, a primary way technology improves is by someone inventing a superior new technology and business model.

Major technological revolutions usually come with their own accompanying financial revolutions, and the US now has a deep and well-established set of investors, systems, and incentives to develop and advance digital technology. Financing, by far the most established and dominant part of our tech system, has fostered a culture and engine of innovation and investment that frequently anchors and drives the digital technology system. There is much to be proud of.

Nonetheless, our currently dominant tech financing model and culture create a premium for exponential growth at all costs, irrespective of societal consequences, which VCs have no incentive to consider. Moreover, the prevailing VC model puts a premium on acquiring users, then subsidizes any losses, undermining competing companies that finance their growth from operating revenues and profits. This model makes VC-backed companies more accountable to their investors than to users, communities, workers and contractors, or markets.

New private financing models with longer horizons are urgently needed, and some VCs are already moving away from 2-and-20 structures. Limited Partners, as they invest more in VCs, can use their significant leverage to encourage VCs to take more responsible approaches. Many already represent broader public interests as worker pension funds, university endowments, and sovereign wealth funds.

More patient funding models will support technologists who embody different values, such as safeguarding rights, promoting justice, and building tech for social good. Financing is starting to see new innovations in revenue models, ownership structures, and the allocation of returns and dividends. These are steps in the right direction, but we would like to see even bigger leaps, as these remain notable exceptions.

Corporate R&D also has an important role to play, but the model is trending toward the pharma model of innovation by acquisition, rather than organic innovation. Tech companies themselves spend a tremendous amount on R&D. In 2020, for example, Amazon spent nearly $43 billion and was granted 2,244 patents.

We also cannot overlook the substantial role the government plays in spurring and subtly directing innovation in digital technology through deep R&D investments that channel billions of dollars each year into developing new tech.

An influx of government funding of tech innovation, especially during the earliest stage, will undoubtedly lead to civilian (by)products as it has in the past (e.g., facial recognition, microchips, touch screens, GPS). As such, government must not only serve as a regulator and enforcer, but also set design standards and priorities for the kind of technology we want to create and support in service of the public good. Additionally, we need to reconsider how to manage the returns from government’s significant early-stage investments in tech—where the losses are socialized and the gains accrue solely to the individual founders and their investors—or how government procurement can be a lever for better tech.

We are pleased to see that cybersecurity technology is receiving significant attention and funding, as it is often ranked as a major threat to national security. But the US government should expend the same level of effort in R&D as it does for national security to make other non-economic technologies feasible (as we know, markets will take care of technologies that are profitable to develop). With broader participation in the system, different incentives, and real competition, we expect to see more innovation.

How should we think about Web3?

  • Web1: Open source, decentralized, protocol-based, static
  • Web2: Collaborative, user-generated content, dynamic…but highly centralized
  • Web3: ???

Those questions need to turn into answers, and quickly. We are still in the early days and have an opportunity to create the next version of the internet in a way that works for society.

On first inspection, we are inclined to be skeptical of the “fadware” that characterizes most Web3 discussion. It is easy to see it as a jumble of unchecked financial speculation, artificial asset creation, technologies existing because they can and not because they need to, massive energy consumption, and dystopian libertarian dreams. And for all its promises of re-decentralization of the internet, it is stunningly concentrated even now, either via VC investment, consolidation of crypto and NFT holdings, or Chinese dominance of token mining. The crypto universe is even less diverse than the broader internet.

However, underneath all of this are two important impulses. The first is a dangerous one that too few have thought through. Web3 and crypto began as a response to the 2008 financial crisis and a deepening distrust of institutions in both banking and government, and increasingly of centralized tech giants, a sentiment we share. That distrust is well-earned, and we have opined elsewhere on the antecedents of that crisis (spoiler: neoliberalism and regulatory capture). However, taken to its extreme, the impulse underlying many cryptocurrencies, decentralized finance, DAOs, and other related efforts is a deliberate bypass of existing institutions. This is dangerous and unwarranted, both in principle and in practice. In principle, there need to be checks anywhere concentrated power exists: in government, the private economy, or elsewhere, including Web3. How can there be checks on power when there are, by design, no accountable institutions? And in practice, when someone irrevocably writes a false, slanderous, or erroneous piece of data to the immutable blockchain, what recourse does the victim have? Where is the accountability? We would do better to fix the institutions in question and hold them to account.

The second is a more promising impulse, and one which—if we build the rest of the tech ecosystem well—could be the basis for a new settlement moving off our hyper-concentrated current Web 2.0 configuration. It is still very early in Web3 time, creating an opportunity to help it evolve and move in the right direction. But that work must capitalize on communitarian impulses rather than libertarian ones. We are starting to see progress on key issues like energy consumption. Our peers are funding bold experiments in more open-source, non-monetized social graphs built on Web3 technology. But on the current trajectory, if we do not adopt the broader elements of the framework in this document, then Web3 is likely to evolve into an even more extreme version of the outcomes we have seen from Web 2.0. However, if we get it right, we can ensure that the best of it works so that it truly serves all of society.

6. Empowered Consumers, Responsive Makers

A focal point of the system must be an expanded role and understanding of the full range of consumers and communities in the design, deployment, and improvement of digital technology.

New legislation and regulations must center consumers and communities. The new data paradigm must not only consider what consumers have to offer to providers, but also what providers and technologies can offer to consumers.

To ensure new digital technology works best for consumers, and for society as a whole, makers and developers should purposefully engage a broad range of consumers and communities—in the US and across the globe—to understand their real needs and lived experiences. Serial tech entrepreneur Anil Dash writes, “Often times, tech creators have enough money funding them that they don’t even notice the negative effects of the flaws in their designs, especially if they’re isolated from the people affected by those flaws.”

Our efforts to promote Good ID and public digital infrastructure taught us a great deal about what individuals need to engage effectively, including:

  • Genuine individual agency and meaningful user choice,
  • Adequate safeguards,
  • Real consent, opt-outs, and recourse – and clarity on who to hold to account,
  • Privacy and anonymized code, and
  • Transparency.

To improve how they do their work, technologists need more than the hard skills of coding, engineering, and data science. As noted in our discussion of ethics, schools and academia must teach the soft skills that will help young technologists learn to work with integrity, focus on consumers and communities, and create for social good, rather than maintain a narrow focus on products and profits.

To move in this direction, consumers and communities have a role to play in advocating for and demanding better design. We are seeing a groundswell from digital natives and their parents who fought and won better protections on the social internet with efforts like the Age-Appropriate Design Code in California. The Code requires companies to ask themselves, “What would you do differently if you knew your end user was a child?” With more than 600 million children online, tech makers must work harder to design their products with the safety and privacy of children in mind.

We typically think of tech companies as the primary makers and providers, but, as noted above, governments at all levels also play this role and require the same checks and accountability.

We applaud tech companies that have in-house ethicists and human-centered designers. Yet, most providers still do this through a narrow product lens rather than a broader frame about a given technology’s real-world effects. As we have learned from experiments like Ethical Explorer, broadening this lens is imperative for responsible technology to gain traction, but it is a long road.

Philanthropy's Role

Philanthropy, from a Greek term meaning “love of mankind,” is an idea, event, or action done to better humanity. And because, as we note, digital technology and society are intertwined, all philanthropy should consider how best to support more responsible technology at the system level to help tilt the playing field toward pro-social outcomes. Every foundation that works to advance racial justice, equitable opportunity, jobs, and a healthy democracy has a stake in this.

Philanthropy is well positioned to do this because we can focus on systems-level change, take the long view, experiment, take risks, and bring people together across sectors and disciplines. Philanthropy can test new incentives and business models, engage policymakers and technologists, and provide a public interest counterweight to the many private incentives already in the system.

Additionally, philanthropy can support organizations that future-proof technology and ask the hard questions. It can expose the harms and hold tech accountable. It can help build and finance new kinds of responsible technology, and new financing models. It can support inclusive and diverse participation. And it can inform lawmakers about what is needed to ensure tech upholds our values.

Beyond Philanthropy: Our Call to Action

Technology is shaping society at an unprecedented level and pace. Whether it improves society or destabilizes it is up to us. Our call to action is simple: Join us. Together, we can channel digital tech to meet its promise. It will require a fundamentally different, more systematic approach than we have tried before. It will require leaving behind outdated mindsets. And it will require action both now and in the long term. Join the many partners who are already doing the hard work to chart the roadmap for a better technology system.

Building a holistic, integrated digital technology system requires everyone’s participation. Omidyar Network calls on philanthropists, technologists, entrepreneurs, policymakers, academics, advocates, movement leaders, students, consumers, investors, and everyone who has a stake in the future to be part of this effort.

The Web as I envisaged it, we have not seen it yet. The future is still so much bigger than the past.

Sir Tim Berners-Lee, computer scientist best known as the inventor of the World Wide Web

We know it will not be quick nor easy. Some of us are focused on the immediate challenges and opportunities, such as addressing harms we are experiencing in real time, mis- and disinformation, bias, the future of AI, and Diversity, Equity, and Inclusion in tech. Others are looking further into the future to explore the infinite opportunities surrounding what is possible. Both are needed, and both should be happening simultaneously. That said, for those working on the nearer-term and specific issues, we encourage you to examine how your efforts level up to the larger system we are ultimately trying to build. Consider how you can connect on a broader scale and with organizations and communities of practice that are working on other parts of the system so that we can learn from each other, build momentum, and create a digital technology system that benefits individuals, communities, and society.

Acknowledgements

Established by philanthropists Pam and Pierre Omidyar, Omidyar Network is a social change venture that has committed more than $1 billion to innovative for-profit companies and nonprofit organizations since 2004. Omidyar Network works to reimagine critical systems and the ideas that govern them, and to build more inclusive and equitable societies in which individuals have the social, economic, and democratic power to thrive.

“Our Vision for a Responsible Tech Future” was completed in September 2022. Contributors include Omidyar Network staff members Anamitra Deb, Michele Lawrence Jawando, Beth Kanter, Mike Kubzansky, and Abiah Weaver.

We also benefited from the expertise of the following individuals:

  • Eric Braverman, CEO, Schmidt Futures
  • Ann Mei Chang, CEO, Candid
  • Katherine Fulton, Strategic Advisor
  • Don Gips, CEO, Skoll Foundation
  • Gene Kimmelman, Former Deputy Associate Attorney General and Former President, Public Knowledge
  • Larry Kramer, President, Hewlett Foundation
  • Tolga Kurtoglu, Former CEO of PARC
  • Jamie Merisotis, President and CEO, Lumina Foundation
  • Ali Noorani, Program Director, US Democracy, Hewlett Foundation
  • Tim O’Reilly, CEO O’Reilly Media
  • Carlota Perez, Honorary Professor at Institute for Innovation and Public Purpose (IIPP), UCL and at SPRU, University of Sussex, UK; Professor of Technology and Development, Nurkse Institute, Taltech, Estonia; Academic in Residence, Anthemis UK; International consultant and lecturer; Author of “Technological Revolutions and Financial Capital: the Dynamics of Bubbles and Golden Ages”
  • Vivian Schiller, The Aspen Institute
  • Eli Sugarman, Fellow, Schmidt Futures
  • Tomicah Tillemann, Chief Policy Officer, Haun Ventures
  • Nicole Taylor, President and CEO of Silicon Valley Community Foundation
  • Nicole Tisdale, Former White House National Security Council; Advocacy Blueprints, LLC

We also wish to thank 16 other readers from philanthropy, the tech sector, nonprofit advocacy groups, and economics who read and provided essential comments to early drafts.

“Our Vision for a Responsible Tech Future” was designed and produced with support from Riveter Communications, Jaime Vazquez, Abiah Weaver, and numerous artists represented by Canva and iStock.