Better Processes Lead to Better Outcomes: Corporate Governance as a Tool to Address Misinformation

By Nathalie Maréchal, Elizabeth Renieris, and Jan Rydzak, Ranking Digital Rights

This is the second installment of a new white paper series on alternative regulatory responses to misinformation. The articles, which were produced with the editing and support of the WIII Initiative, will be presented at a series of events on March 8, 10, and 12. You can find more information and registration details for our launch events here.

I. Introduction

The popularization of the internet has been a boon for journalists, activists, and individuals whose ability to access and impart information had been stymied by traditional gatekeepers in the public and private sectors. The emergence of social media further democratized access to online expression and facilitated the growth of decentralized networks, including a range of technologically mediated social movements.[1] As naive as the previous decade’s enthusiasm for these “liberation technologies” may seem in hindsight, their benefits to human rights and the democratic project should not be discounted.[2] Nonetheless, the same ability to circumvent traditional gatekeepers, including editors and fact-checkers, has also enabled the spread of misinformation and disinformation, with grave consequences for democracy, human rights, and public health.[3]

Misinformation is only one manifestation of structural flaws in a complex socio-technical system for data collection and information dissemination, governed by the market logic that Shoshana Zuboff terms “surveillance capitalism.”[4] To date, many proposed and implemented interventions target symptoms that manifest themselves at the content layer of the internet rather than root causes concentrated at the corporate level.[5] Symptomatic efforts targeting misinformation and other harmful or dangerous content are necessary but not sufficient.

Addressing misinformation alongside other harms (notably surveillance and discrimination), while respecting international human rights (particularly freedom of expression and information), requires a regulatory toolkit that includes antitrust and competition law, data protection, privacy, and consumer protection law. It also requires a careful balancing of intermediary liability with intermediary immunity.[6] This paper examines good corporate governance as a prerequisite for effective content moderation that deals with the root causes of misinformation without threatening freedom of expression.

First, this paper will introduce the relationship between corporate governance and misinformation. The paper will then describe two categories of corporate governance reform that could help platforms improve content moderation while enhancing respect for human rights, namely structural reforms of company leadership and enhanced investor oversight. For each category, the paper outlines the industry status quo, including company behavior demonstrating the need for the reforms in question, before recommending best practices for companies and considering the legal and regulatory levers that could incentivize or even compel companies to alter their practices. The paper draws on prior coverage and analysis of several companies to illustrate our argument, though for consistency it uses Facebook as the primary example.

II. The Relationship Between Corporate Governance and Misinformation

“Corporate governance” refers to the set of policies, processes, and procedures through which the entities and individuals comprising a corporation make decisions impacting the company and hold one another accountable for following through on them.[7] In the case of public U.S. companies listed on a major stock exchange, including large social media and technology companies, these entities and individuals may include the Board of Directors and its Chair; the Chief Executive Officer and other high-ranking executives; managers and employees throughout the corporate hierarchy; and individual and institutional shareholders. Non-company actors may also influence corporate governance; the most notable among them are governments and regulatory bodies, the media, and civil society organizations, including consumer protection and human rights groups.

The Board of Directors determines the company’s priorities with an eye to both long-term profitability and environmental, social, and governance (ESG) concerns.[8] It appoints the Chief Executive Officer and holds that individual accountable for carrying out those priorities. The CEO, executives, senior management, and other employees execute the Board’s vision and produce regular financial reports and other documentation for the Board and company shareholders to demonstrate progress toward pre-established goals such as revenue targets, environmental impact, and control over the supply chain.

Public companies that host large volumes of user-generated content (UGC) and paid advertising, including Alphabet (Google’s parent company), Microsoft, Facebook, and Twitter, long treated content moderation as a relatively low priority, particularly in terms of consistently enforcing rules and effectively responding to contextual nuances.[9] Despite the dire consequences of rampant mis- and disinformation on their platforms in other parts of the world,[10] it is only since the well-documented foreign influence operations targeting the U.K.’s Brexit referendum and the 2016 U.S. general election that the major social media companies have begun to devote significant resources to grappling with these challenges.[11] Steeped in a maximalist interpretation of the First Amendment and in cyberlibertarianism, technology companies have traditionally resisted calls to take a stronger hand in combating misinformation.[12] Even now, a strong discursive current in Silicon Valley and Washington, D.C. insists that free speech takes precedence over all other values in the digital sphere.[13]

But the internet is not a parallel dimension; it is very much part of the “real world.”[14] Governments have long pressured social media platforms to do more to combat child sexual abuse material (CSAM), copyright infringement, and other problematic content. Simultaneously, civil society has been fighting for due process and transparency around content removals, both in response to government demands and in cases where content violates company rules.[15] Forced to confront the ugly realities of “lawful but awful” expression, companies are gradually and somewhat haphazardly developing increasingly sophisticated content-related rules and enforcement processes.[16]

Given the multiplicity of companies, governments, languages, and sociocultural contexts involved, what has emerged is a hodgepodge of international norms, national laws, co- and self-regulation, and seemingly capricious decisions by company leadership. Despite repeated assurances to the contrary, companies seem unable to exercise meaningful control over the (mis)information that circulates on their platforms. In this context, it is little surprise that even democratic governments that are ostensibly committed to free expression are tempted to turn to intermediary liability and restrictions on online speech to hold technology companies accountable. However, for reasons that are detailed in a recent report from Ranking Digital Rights, such measures raise serious freedom of expression concerns and, at least in the United States, would be unlikely to comply with the First Amendment.[17]

Governments should not dictate the processes for, or outcomes of, content governance decisions. Nevertheless, there is a role for governments to encourage better corporate governance, which in turn will translate into better processes and outcomes for content moderation. Many of the social media platforms where the dangers of misinformation are most severe (due in part to their large user bases) are publicly traded on the New York Stock Exchange and NASDAQ and therefore subject to regulation by the Securities and Exchange Commission (SEC) and other federal agencies. Requiring these companies to follow recognized best practices, rules, and regulations for corporate governance will improve internal decision-making and oversight mechanisms and generate better outcomes, while avoiding government intervention into content moderation, which poses serious challenges under both the First Amendment of the Constitution and Article 19 of the Universal Declaration of Human Rights.

Two specific corporate governance reforms that would be particularly impactful are structural reforms of company leadership and mechanisms for enhanced investor oversight. Ideally, these measures would be accompanied by other changes such as mandating company-wide human rights due diligence. To combat misinformation, it is critical to normalize strong oversight while holding companies accountable for their due diligence based on a common set of standards, parts of which already exist in the UN Guiding Principles on Business and Human Rights and guidance published by organizations such as the Danish Institute for Human Rights.[18] The interventions we explore below are only two of the top-level changes needed to stem misinformation, but they would unlock important changes further downstream.

III. Reforming Company Leadership Structures

The case for reform

In August 2020, the Wall Street Journal reported that Ankhi Das, Facebook’s top public policy executive for India as well as South and Central Asia, had overruled her staff to prevent the removal of hate speech and incitement to violence posted on Facebook by politicians belonging to the ruling Hindu nationalist Bharatiya Janata Party (BJP).[19] Das, who herself had previously posted anti-Muslim hate speech on Facebook, reportedly justified her decision by saying that “punishing violations by politicians from Mr. Modi’s party would damage the company’s business prospects in the country.” This was not the first time that Das had interfered in content moderation decisions in the BJP’s favor.

According to the company’s former Chief Information Security Officer, “a core problem at Facebook is that one policy org is responsible for both the rules of the platform and keeping governments happy.”[20] Indeed, its content policy and government relations functions are merged from the C-suite down. As of late 2020, VP of Global Public Policy Joel Kaplan, hired to strengthen the company’s ties to the GOP, is responsible both for lobbying the U.S. government and for overseeing the team that sets content rules.[21] There is evidence of Kaplan torpedoing efforts to increase civility on the platform, fact-check political ads, and generally enforce rules on hate speech, incitement to violence, and voter suppression against Donald Trump and other Republican politicians.[22] In many cases, political considerations and a perceived need to kowtow to those in power have influenced Facebook’s content rules and enforcement decisions.[23]

Weak top-level human rights oversight undermines platforms’ ability to halt the global spread of false information. In particular, Facebook is the world’s dominant platform for organized social media manipulation, in part due to its near-ubiquitous reach and widespread use as a news source.[24] Its problems are exacerbated by a proclivity to underreport political manipulation efforts in the “Global South,” which whistleblowers have partially attributed to a lack of interest in “non-priority” markets among executives.[25] Meanwhile, domestically, Mark Zuckerberg has personally intervened to bend Facebook’s own Community Standards to retain posts by Donald Trump that glorified violence, triggering strong internal and external criticism.[26]

The frequency and scale of such back-room content decisions at the executive level are unknown. Researchers, journalists, and the public all rely on scattered, leaked reports that likely capture only a fraction of such actions. Additionally, Facebook does not disclose any data on “reporter appeals”—appeals over content that is left up after being reported—which shrouds its decision-making in further obscurity.[27] Facebook’s opaque organizational chart and corporate governance processes impede transparency on how individual decisions are made, including on how to respond to misinformation that originates from high-profile individuals.

A company’s top executives are responsible for managing conflicts of interest between the government relations and content policy functions, identifying the human rights risks involved, and mitigating them. If they fail to do so, the Board of Directors should exercise oversight and compel reform. While Facebook claims it oversees human rights at the Board level, it has fiercely resisted efforts to incorporate relevant expertise into its highest echelons of leadership and persuaded its shareholders to reject a 2020 proposal calling for the appointment of a recognized human and civil rights expert to its Board.[28]

Securing an institutional commitment from top leadership to fundamental human rights and oversight requires years of sustained pressure supported by research that exposes crucial shortcomings. While some technology companies have recently disclosed evidence of oversight over human rights risks at the Board and executive levels, this remains the exception.[29] For instance, since its reorganization under Alphabet in 2015, Google’s Board of Directors has not exercised formal, systematic oversight over freedom of expression issues, and only recently brought privacy under the purview of the Board’s Audit Committee.[30] Similarly, Twitter has not made any public statements about Board-level oversight of free expression and only recently announced that its Data Protection Officer provides regular updates to the Board as part of Twitter’s push toward greater transparency on privacy.[31]

Recommended best practices

Companies that are serious about combating misinformation should give themselves the tools to do so, starting at the top of the hierarchy by ensuring that the Board of Directors has the relevant expertise and independence. According to the Business Roundtable, the composition of a Board should reflect a diversity of thought, backgrounds, skills, experiences, and expertise, and a range of tenures that are appropriate given the company’s current and anticipated circumstances.[32] Together, these features should enable the Board to perform its oversight function effectively.

For large social media companies whose platforms are rife with misinformation, the Board should include at least one independent expert on civil and human rights, notably freedom of expression, privacy, freedom from discrimination, and the relationship between these values. The Board should also establish a dedicated committee responsible for overseeing the company’s human rights policies and practices and ensure that similar committees and teams are embedded throughout the company.

In addition to expertise, it is a well-established principle that an independent chairperson of the Board is an essential element of good corporate governance, necessary to ensure proper checks and balances within the company. For example, the Council of Institutional Investors, an authoritative voice on corporate governance, supports the separation of CEO and chairperson roles, cautioning that a “CEO who also serves as a chair can exert excessive influence on the board and its agenda.”[33] This is precisely the case with Facebook, whose CEO’s dual role as Chairman of the Board effectively renders him accountable to himself. If companies are serious about Board independence and oversight, they must avoid this kind of CEO duality and move away from having company executives and insiders serve on the Board altogether.

Companies that concentrate power in one person are also likely to focus on a single set of interests, rather than carrying out a careful balancing of the competing values at play where human rights questions are engaged. Managers, in this context, may prioritize the company’s immediate short-term interests over conflicting ones, even to the detriment of shareholders’ own financial interests.[34] Where one individual wields excessive influence over a Board lacking human rights expertise, incentives for human rights-based decision-making can be further diminished, particularly where they are framed as considerations which do not impact a company’s financial health. All of these factors undermine companies’ ability to combat systemic problems like misinformation, particularly since this challenge manifests differently across different local contexts. The disconnect between First Amendment values and the companies’ global user base further exacerbates the problem.

Social media companies with an independent Board, including Board members with relevant human rights expertise, would be much better equipped to ensure that internal policies and procedures are applied consistently, as opposed to implementing rules in a way that is reactive to political influence and pressure. This includes separating the company’s government relations and content governance functions. A strong “human rights by design” approach would bolster good practices from the top, with transparency into the circumstances in which senior executives may get involved in content decisions, the actual cases in which they did, and the chain of command enabling such decisions to be made.

Legal and regulatory levers for reform

Going beyond best practices, Board expertise, independence, and oversight should also be supported by legal and regulatory measures relating to corporate governance.  

There are a number of sources for corporate governance requirements, including laws of the state of incorporation; federal securities laws, notably the Securities Act of 1933 (the “1933 Act”) and the Securities Exchange Act of 1934 (the “1934 Act”); other federal laws, including the Sarbanes-Oxley Act of 2002 (Sarbanes-Oxley Act) and the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act); SEC regulations, rules, and guidance; and common law rules. In addition, listing standards published by registered stock exchanges such as the New York Stock Exchange Listed Company Manual (NYSE Listing Manual) and the National Association of Securities Dealers Automatic Quotation System Marketplace Rules (Nasdaq Marketplace Rules) require listed companies to maintain specified corporate governance practices.[35] Corporations are also subject to their own bylaws.

In the wake of financial crises, governments often enact more stringent laws and regulations related to corporate governance. For example, the Sarbanes-Oxley Act, which emerged in the wake of numerous large corporate fraud scandals such as Enron, WorldCom, and Tyco, was designed to implement sweeping reforms to corporate disclosures, financial reporting, and other corporate governance practices to restore investor confidence in the U.S. financial markets. Similarly, Dodd-Frank, enacted in response to the collapse of Lehman Brothers and the sub-prime mortgage crisis of 2008, was designed to restore public confidence by restructuring the financial regulatory system and improving corporate governance, including by strengthening shareholder power. 

Due to the nature of these crises, the emphasis of their resulting reform efforts was on strengthening corporations’ expertise, oversight, disclosures, and reporting on financial matters. For example, SEC rules promulgated under Section 407 of Sarbanes-Oxley require a public company to disclose in its annual report whether or not its Board of Directors has at least one “financial expert” on its audit committee to properly understand and assess the company’s financial risks, and, if not, why not.

In the wake of numerous civil and human rights related harms and abuses flowing from the policies adopted by large social media companies,[36] the SEC should adopt similar rules regarding civil and human rights expertise. Specifically, the rules should require companies that pose substantial civil and human rights-related risks to disclose whether they have at least one qualified civil and human rights expert on their Board of Directors and relevant oversight committees—and, if not, why not. This would help shareholders better assess the sufficiency of a company’s oversight over human rights-related risks. 

Expertise alone is not enough without the independence to apply it. While the separation of the roles of Board chairperson and CEO is mandated by regulation in some countries,[37] U.S. federal and state laws typically do not dictate or restrict board composition. However, both the NYSE Listing Manual and the Nasdaq Marketplace Rules do require a majority of Board members to be independent. Per the NYSE Listing Manual, anyone who owns company shares does not qualify as independent,[38] while ownership of company stock is a factor but not dispositive under the Nasdaq Marketplace Rules[39] (note that Facebook chose Nasdaq over the NYSE for its IPO).[40]

Where companies nevertheless insist on CEO duality, they are required to disclose their rationale for doing so. Per 2009 amendments to Regulation S-K of the 1933 Act, which lays out the reporting requirements for the SEC filings of public companies, companies must disclose their reasons for combining or separating the roles of CEO and chairperson of the board in their proxy statement.[41] The amendments were designed to provide investors with more transparency over corporate governance matters. As the form and nature of these disclosures is not standardized, firms use widely divergent wording to describe their rationale, making it difficult to study the true impact of such disclosures on corporate governance and the market.[42] According to at least one analysis, the two most commonly cited reasons for separating the CEO and Chairperson roles are the differences between the roles (and their associated tasks) as well as enhanced oversight and monitoring of management.[43] While there is evidence that CEO duality has implications for shareholder value,[44] more guidance or standardization about the nature of these disclosures would advance research and analysis in the ongoing debate around CEO duality.

Even with these reforms, transparency alone on the part of company leadership is not enough to ensure good corporate governance. Armed with this improved transparency, shareholders must be empowered to exercise meaningful oversight, which is the subject of the next section.    

IV. Empowering Investor Oversight

The case for reform

Independent analyses suggest a growing rift between executives and investors with regard to the time and attention that Boards should place on human rights and Environmental, Social, and Governance (ESG) concerns. In PwC’s 2019 Annual Corporate Directors Survey, only 29% of Directors felt their Board needed more reporting on ESG measures, and a similar percentage felt that public ESG disclosure should be a priority.[45] Directors were much more inclined to declare the Board already had a strong understanding of ESG issues. Additionally, nearly half of those surveyed felt that institutional investors dedicate too much attention to corporate social responsibility (CSR) issues (up from 29% the prior year) and only one in five viewed human rights as a topic that could credibly impact company strategy. In contrast, a majority of institutional investors favors the use of ESG frameworks to assess non-financial disclosures and a rapidly growing number believe companies do not adequately disclose ESG risks.[46]

These results suggest that boardrooms and investors have increasingly disparate visions of companies’ responsibility toward the people who use their services and their impact on civil and human rights. The rift is compounded by the common view among executives that it is difficult to express dissenting perspectives in the boardroom—a trend that appears to be markedly stronger in companies that merge the roles of CEO and Board chair, demonstrating another downside of CEO duality in large technology companies.[47] These trends also illustrate why reforms at the Board level alone are insufficient to improve corporate governance in ways that will meaningfully mitigate ESG risks, and specifically human rights-related risks that manifest themselves through widespread mis- and disinformation and hate speech.

In addition to the Board of Directors, shareholders—individuals and institutions that own stock in a company—also play an oversight role. One vehicle for that oversight is the shareholder resolution, a formal proposal subject to a vote at the company’s annual meeting. Most proposals are non-binding and are typically opposed by management: if management agreed, there would be no need for a resolution. Resolutions typically focus on corporate governance, and increasingly involve human rights as well as environmental and social responsibility issues.[48]

Shareholder activism is a growing vector of influence for activists seeking to change corporate behavior. However, this activism is often hampered by structural limitations such as dual-class or multi-class voting structures that dilute the influence of public investors relative to company insiders. Approximately 90% of all public companies in the U.S. have a “one share, one vote” structure, and IPOs in recent years have largely mirrored the broader market in this regard.[49] But large technology companies buck this trend: in the first half of 2019, seven of the ten largest tech IPOs (including Lyft, Pinterest, and Zoom) adopted a dual-class or multi-class structure.[50] Dual-class shares grant outsized voting rights to a favored class of founders and insiders while leaving standard voting power (or none at all) to an inferior class of public investors.

Typically, the disparity between the voting power of the two classes is ten to one, as in the case of Facebook and Zoom. However, the superior class in companies such as Pinterest holds twenty times the voting power that ordinary shareholders do, while Snap and Alphabet both maintain a triple-class structure in which the lowest tier is given no right to vote whatsoever. The superclass skews the playing field further in its favor when unequal voting rights are supplemented with an imbalance between the company’s ownership and its voting hierarchy. Alphabet, for instance, has a voting superclass that controls more than 60% of the vote but retains less than 7% of the company’s total outstanding equity; Snap’s superclass controls nearly the entirety of the vote with an equity stake of less than 17%.[51] None of these companies have adopted sunset provisions for their dual-class structures, thus concentrating decision-making power in a narrow group of insiders in perpetuity.

[Table data not reproduced. Columns: superclass voting power; standard-class voting power; superclass vote control; superclass equity stake; wedge (vote control minus equity stake); sunset clause. Rows covered eight companies, including Alphabet* and Snap*.]

*Alphabet and Snap both have a third tier of shareholders with no voting rights.

Table 1. Voting rights and ownership figures for eight prominent U.S. technology companies. Source: adapted from Council of Institutional Investors, Dual-Class Companies List: June 2020, Council of Institutional Investors (June 2020).

A structural manifestation of the disproportionate power reserved for top executives and founders, dual-class structures both fossilize existing policy approaches and virtually shut down the prospect of anchoring them in international human rights standards. This further erodes companies’ ability to manage complex and geographically nuanced problems like misinformation.

Recommended best practices

Retaining power through lopsided voting configurations is a choice, not an inevitability. Apple, Amazon, Microsoft, and Twitter all operate without a dual-class share structure.[52] While this approach has not immunized them from human rights shortcomings, it nonetheless demonstrates that balancing the pursuit of profit with accountability to shareholders is possible in the sector without skewing the playing field. While tech giants that are entrenched in the dual-class model view it as a way to insulate themselves against activist investors and initially adopted it to maintain founders’ control,[53] board members of companies across various industries are recognizing the reputational harm of this structure. According to industry surveys, nearly two thirds of corporate directors believe dual-class share structures reflect negatively or very negatively on the Board.[54] Continued pressure from civil society and responsible investors is likely to further this tendency and feed into critical public sentiment about companies operated as fiefdoms.

Early-stage companies should either consider avoiding dual-class structures altogether, or at least consider how to phase them out as the company grows and evolves, particularly to avoid negative public sentiment and reputational harm. Later-stage companies that already have dual-class or multi-class structures should reassess the potential reputational harm and increased scrutiny they create and consider restructuring ownership or else implementing countermeasures to address the risks. For example, where a company cannot easily restructure voting rights, it may consider whether other positive corporate governance measures, such as the separation of the CEO and Chairperson roles, greater openness and receptivity toward shareholder resolutions, or voluntary enhanced corporate disclosures about the rationale for introducing or retaining differential voting structures, may help to improve investor or market confidence and trust in company leadership. While founders are unlikely to voluntarily relinquish control, companies should recognize that government intervention becomes more likely as public pressure mounts and that voluntary measures are in their own self-interest.

In addition, companies should disclose information about the “wedge” between ownership and control. At present, investors cannot easily quantify the difference between insiders’ control of the vote and their equity stake—or the ways in which that difference can change. Companies have a range of options to disclose ownership information and are not required to disclose in their IPO registration statement how the power differential between the superclass and the other class or classes can change. As the SEC’s Investor Advisory Committee remarks, this means that the co-founder of a company like Snap can reduce their stake to below 1% without relinquishing total voting control, and no mechanism compels them to notify investors of the resulting risk.[55] Without a suite of disclosures on these topics, ill-conceived approaches to issues like platform content governance will be harder for investor activism to reverse.
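The “wedge” is simple arithmetic: insiders’ share of total votes minus their share of total equity. A minimal Python sketch, using hypothetical round share counts rather than any real company’s figures, shows how a modest equity stake translates into vote control under a 10:1 dual-class structure:

```python
# Hypothetical illustration of a dual-class "wedge": insiders' share of the
# vote minus their share of equity. Share counts are invented round numbers,
# not any company's actual capitalization.

def voting_stats(share_classes):
    """share_classes: list of (shares_outstanding, votes_per_share, insider_held)."""
    total_votes = sum(s * v for s, v, _ in share_classes)
    total_shares = sum(s for s, _, _ in share_classes)
    insider_votes = sum(s * v for s, v, insider in share_classes if insider)
    insider_shares = sum(s for s, _, insider in share_classes if insider)
    vote_control = insider_votes / total_votes    # insiders' share of the vote
    equity_stake = insider_shares / total_shares  # insiders' share of equity
    return vote_control, equity_stake, vote_control - equity_stake

# 10 insider shares at 10 votes each vs. 90 public shares at 1 vote each:
vote, equity, wedge = voting_stats([(10, 10, True), (90, 1, False)])
print(f"vote control {vote:.0%}, equity stake {equity:.0%}, wedge {wedge:.0%}")
# prints: vote control 53%, equity stake 10%, wedge 43%
```

With a 10:1 voting ratio, insiders holding just 10% of the equity command a majority of the vote; at the 20:1 ratios some companies use, or with a non-voting third class, the wedge grows accordingly.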

Legal and regulatory levers for reform

As with reforming Board composition and expertise, empowering shareholders to exercise adequate oversight and hold company leadership accountable for their social impact, including with respect to the human rights of their users and other impacted stakeholders, may require going beyond corporate best practices to include certain legal and regulatory interventions.

One option may be to limit the ability of publicly traded companies to create and maintain dual-class or multi-class share structures. Typically designed to allow the founders of emerging companies to experiment and innovate as they grow, differential voting schemes effectively decouple voting rights from one’s economic interest in a company’s future success. In fact, research suggests that companies with dual-class share structures are more likely to exhibit problematic corporate governance practices and, in turn, face more corporate governance challenges than companies without such structures in place.[56] As a result, many countries, including Australia, Germany, Italy, Spain, and the United Kingdom, have regulations that prohibit or restrict dual-class shares. For example, the London Stock Exchange’s listing regime prohibits dual-class or differential voting structures within premium-listed companies, a category that spans many of the U.K.’s largest and most influential companies.[57] As with mandatory disclosures about the rationale for CEO duality or non-duality per amendments to Regulation S-K, the SEC may also consider promulgating rules that require companies with dual- or multi-class share structures to explain their rationale for doing so.   

Another option would be for the SEC to promulgate rules that enable shareholder activism by way of shareholder resolutions and other means, while rolling back rules that put up hurdles to this kind of activity. Unfortunately, in recent years, the SEC has chipped away at some of the core tools of activist investors concerned with ESG matters. For example, in September 2020, the SEC adopted a rule that requires shareholders to hold at least $25,000 of stock for a year (up from $2,000) before filing a shareholder resolution, raised the resubmission thresholds (the minimum support a proposal must receive in earlier votes to remain eligible for inclusion in the company’s annual proxy materials), and amended a variety of existing rules in ways that make it harder to vote by proxy.[58] The SEC should reverse these rule changes, which make it more difficult for shareholders to file proposals and to get them on proxy ballots.
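
The tiered ownership thresholds in the amended rule can be expressed as a simple eligibility check. This is a simplified sketch based on the figures cited in note 58; the actual rule imposes further conditions (continuous ownership, documentation requirements, and limits on the number of proposals, among others):

```python
# Simplified sketch of the amended Rule 14a-8 ownership thresholds:
# $25,000 held for one year, $15,000 for two years, or $2,000 for
# three years. Not a complete statement of the rule's requirements.

THRESHOLDS = [
    (1, 25_000),  # (minimum years held, minimum dollar value of stock)
    (2, 15_000),
    (3, 2_000),
]

def may_submit_proposal(value_held: float, years_held: float) -> bool:
    """Return True if the holding satisfies any of the tiered thresholds."""
    return any(years_held >= years and value_held >= minimum
               for years, minimum in THRESHOLDS)

# Under the pre-2020 rule, $2,000 held for one year sufficed; now it does not:
print(may_submit_proposal(2_000, 1))   # → False
print(may_submit_proposal(2_000, 3))   # → True
print(may_submit_proposal(25_000, 1))  # → True
```

As the tiers show, the amendments raise the bar most sharply for small, recently acquired holdings, which is precisely the position of many first-time activist shareholders.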

Finally, the SEC should promulgate rules requiring publicly traded companies to disclose what percentage of their revenue comes specifically from targeted advertising, as opposed to advertising more generally, as targeted advertising creates perverse incentives that feed the proliferation of mis- and disinformation, polarization, and hate speech on large social media platforms. The SEC should also require companies to disclose non-financial information about their ESG impact, including information about the social impact of targeted advertising and algorithmic systems. Such enhanced disclosures empower investors by providing the information they need to make informed voting and investment decisions on areas of risk, including the kind of rampant misinformation that threatens the human rights of the company’s user base.

   V.         Conclusion

This paper argues that the spread of misinformation is a downstream manifestation of upstream corporate governance and management failures. It is settled doctrine that a Board of Directors’ fiduciary duties include establishing that management has an effective corporate compliance program in place, exercising oversight of that program, and taking regular steps to stay informed of the program’s content and operation.[59] The solutions we propose—reforming corporate leadership structures and enabling investors to exercise greater oversight—both address some of the root causes of misinformation and other societal ills that platforms propel. Neither requires an intervention into the actual content that proliferates on platforms, but rather into the systems that enable this proliferation.

At the same time, both measures are internationally scalable. Phasing out dual-class share structures would render Boards less prone to cherry-picking their priority markets and relying on scandal-driven due diligence for markets they relegate to a lower tier of importance.[60] Clear rules preventing executives and upper management from strong-arming content decisions could be implemented in every country of operation if leaders at the highest level are willing to change the status quo. Until top-level incentives are reformed through measures such as these, platforms will continue to address individual conflagrations reactively while ignoring the forest fires they help kindle.

Nathalie Maréchal is Senior Policy Analyst at Ranking Digital Rights, Elizabeth Renieris is the Founding Director of the Notre Dame-IBM Technology Ethics Lab, and Jan Rydzak is Company Engagement Lead at Ranking Digital Rights.

[1] See Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest (2017).

[2] See Larry Diamond, Liberation Technology, 21 J. of Democracy 69 (2010).

[3] In this paper, we use Caroline Jack’s definitions: misinformation is “information whose inaccuracy is unintentional,” while disinformation is “information that is deliberately false or misleading.” As Jack notes, “whether a given story or piece of content is labeled as misinformation or disinformation can depend as much on a speaker’s intent as on the professional standards of who is evaluating.” This distinction can be exceedingly challenging to parse, and while germane to many approaches to combating misinformation, is not particularly relevant to the interventions discussed in this paper. Therefore, for the purpose of this paper, we will refer to misinformation as including disinformation. See Caroline Jack, Lexicon of Lies: Terms for Problematic Information (Data & Society report, 2017).

[4] See Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019). See also Nathalie Maréchal & Ellery Roberts Biddle, It’s Not Just the Content, It’s the Business Model: Democracy’s Online Speech Challenge (Ranking Digital Rights report, 2020). See also Nathalie Maréchal et al., Getting to the Source of Infodemics: It’s the Business Model (Ranking Digital Rights report, 2020).

[5] See Laura DeNardis, The Global War for Internet Governance (2014).

[6] See Jack M. Balkin, How to Regulate (and Not Regulate) Social Media, Knight First Amendment Institute at Columbia University (Mar. 25, 2020).

[7] See California Public Employees’ Retirement System, Global Principles of Accountable Corporate Governance, California Public Employees’ Retirement System (Nov. 14, 2011). Numerous definitions of corporate governance exist, reflecting varying stakeholder priorities. Taking the simplest approach, corporate governance can be defined as “the system of rules, practices, and processes by which a firm is directed and controlled.” James Chen & Margaret James, Corporate Governance, Investopedia (Apr. 12, 2020).

[8] See Olivier Jan, The Board and ESG, Harvard Law School Forum on Corporate Governance (Feb. 25, 2019).

[9] See Kate Klonick, The New Governors: The People, Rules, and Processes Governing Online Speech, 131 Harv. L. Rev. 1598 (2018). See also Sarah T. Roberts, Behind the Screen: Content Moderation in the Shadows of Social Media (2019).

[10] See Amanda Taub & Max Fisher, Where Countries Are Tinderboxes and Facebook Is a Match, The New York Times (Apr. 21, 2018).

[11] See Yochai Benkler et al., Network Propaganda: Manipulation, Disinformation, and Radicalization in American Politics (2018).

[12] See David Golumbia, Cyberlibertarianism: The Extremist Foundations of ‘Digital Freedom.’ Presented at Clemson University, Clemson, SC (Sept. 5, 2013).

[13] See, for example, Mark Zuckerberg’s 2019 speech at Georgetown University, and various proposals from the Trump Administration and congressional Republicans to strip platforms of Section 230 protections when they moderate or fact-check misinformation and other kinds of “lawful-but-awful” content. See Mark Zuckerberg, Standing for Voice and Free Expression, Speech presented at Georgetown University, Washington, D.C. (Oct. 17, 2019). See also Daphne Keller, Trump-Backed Rules for Online Speech: No to Porn, Yes to Election Disinformation and Hate Speech, Slate Magazine (Oct. 1, 2020).

[14] John P. Barlow, A Declaration of the Independence of Cyberspace, Electronic Frontier Foundation (Feb. 8, 1996).

[15] Kate Klonick, supra note 9.

[16] Daphne Keller, If Lawmakers Don't Like Platforms' Speech Rules, Here's What They Can Do About It. Spoiler: The Options Aren't Great, TechDirt (Sept. 9, 2020, 12:00 PM).

[17] Nathalie Maréchal & Ellery Roberts Biddle, supra note 4.

[18] Danish Institute for Human Rights, Human Rights Impact Assessment Guidance and Toolbox (2020).

[19] Newley Purnell & Jeff Horwitz, Facebook’s Hate-Speech Rules Collide with Indian Politics, The Wall Street Journal (Aug. 14, 2020).

[20] Id.

[21] Hayley Tsukayama, Facebook Taps D.C. Office Head to Manage Global Policy, The Washington Post (Oct. 6, 2014).

[22] Jeff Horwitz & Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the Site Less Divisive, The Wall Street Journal (May 25, 2020).

[23] See Jack M. Balkin, supra note 6.

[24] See Samantha Bradshaw & Philip N. Howard, The Global Disinformation Order: 2019 Global Inventory of Organised Social Media Manipulation, University of Oxford Project on Computational Propaganda (Sept. 26, 2019).

[25] Craig Silverman et al., “I Have Blood on My Hands”: A Whistleblower Says Facebook Ignored Global Political Manipulation, BuzzFeed News (Sept. 14, 2020).

[26] Mike Isaac et al., Zuckerberg Defends Hands-Off Approach to Trump’s Posts, The New York Times (June 2, 2020).

[27] Facebook, Understanding the Community Standards Enforcement Report, Facebook, (last visited Sept. 24, 2020).

[28] The proposal garnered 3.71% of the shareholder vote and was not successful. See Securities and Exchange Commission, Schedule 14A Information: Proxy Statement Pursuant to Section 14(a) of the Securities Exchange Act of 1934, Securities and Exchange Commission (Apr. 10, 2020).

[29] See Ranking Digital Rights, 2019 RDR Corporate Accountability Index: Governance at 3.2, Ranking Digital Rights (May 15, 2019).

[30] Ranking Digital Rights, 2019 RDR Corporate Accountability Index at 26-27, Ranking Digital Rights (May 15, 2019).

[31] Damien Kieran & Kayvon Beykpour, What We Have Been Doing to Protect Your Privacy and Data, Twitter Blog (Dec. 2, 2019).

[32] See Business Roundtable, Principles of Corporate Governance, Harvard Law School Forum on Corporate Governance (Sept. 8, 2016).

[33] See Council of Institutional Investors, Independent Board Leadership, Council of Institutional Investors, (last visited Oct. 1, 2020). See also Council of Institutional Investors, Policies on Corporate Governance at 2.4, Council of Institutional Investors (Sept. 22, 2020), (“The board should be chaired by an independent director. The CEO and chair roles should only be combined in very limited circumstances; in these situations, the board should provide a written statement in the proxy materials discussing why the combined role is in the best interests of shareowners, and it should name a lead independent director who should have approval over information flow to the board, meeting agendas and meeting schedules to ensure a structure that provides an appropriate balance between the powers of the CEO and those of the independent directors.”)

[34] Jena Martin, Business and Human Rights: What's the Board Got to Do with It, 3 U. Ill. L. Rev. 959, 970 (2013).

[35] See Stephen Giove & Robert Treuhold, Corporate Governance and Directors’ Duties in the United States: Overview, Thomson Reuters Westlaw (Feb. 1, 2013).

[36] For a U.S. example, see Russell Brandom, Facebook Chose Not to Act on Militia Complaints Before Kenosha Shooting, The Verge (Aug. 26, 2020). For an international example, see Jennifer Whitten-Woodring et al., Poison If You Don’t Know How to Use It: Facebook, Democracy, and Human Rights in Myanmar, 3 Int’l J. of Press/Politics.

[37] See, e.g., India: KPMG, SEBI Defers the Timeline for Separation of the Roles of Non-Executive Chairperson and MD/CEO by Two Years, KPMG (Jan. 17, 2020). See also Organisation for Economic Co-operation and Development, OECD Corporate Governance Factbook 2019 at 119, Organisation for Economic Co-operation and Development (June 8, 2019).

[38] See New York Stock Exchange, NYSE Rule Guide at 303A.02 (Independent Tests) (a)(i), New York Stock Exchange (Nov. 25, 2009), (“No director qualifies as "independent" unless the board of directors affirmatively determines that the director has no material relationship with the listed company (either directly or as a partner, shareholder or officer of an organization that has a relationship with the company).”)

[39] See Nasdaq, Nasdaq Rulebook at IM-5605. Definition of Independence — Rule 5605(a)(2), Nasdaq Listing Center, (“It is important for investors to have confidence that individuals serving as Independent Directors do not have a relationship with the listed Company that would impair their independence. The board has a responsibility to make an affirmative determination that no such relationships exist through the application of Rule 5605(a)(2). Rule 5605(a)(2) also provides a list of certain relationships that preclude a board finding of independence. These objective measures provide transparency to investors and Companies, facilitate uniform application of the rules, and ease administration. Because Nasdaq does not believe that ownership of Company stock by itself would preclude a board finding of independence, it is not included in the aforementioned objective factors. It should be noted that there are additional, more stringent requirements that apply to directors serving on audit committees, as specified in Rule 5605(c).”)

[40] See Jessica Guynn, Facebook Selects Nasdaq over NYSE for IPO, The Los Angeles Times (Apr. 5, 2012).

[41] The SEC amended Item 407 of Regulation S-K and Schedule 14A to require disclosure of whether and why a company has chosen to combine or separate the principal executive officer and Board chairperson positions, and the reasons why it believes that this Board leadership structure is the most appropriate structure for the company at the time of filing. If these positions are combined, and a lead independent director is designated to chair meetings of the independent directors, the amendments require disclosure of why this leadership structure was chosen, as well as the specific role the lead independent director plays in the company’s leadership structure. See 17 CFR § 229.407 (h). (“Board leadership structure and role in risk oversight. Briefly describe the leadership structure of the registrant's board, such as whether the same person serves as both principal executive officer and chairman of the board, or whether two individuals serve in those positions, and, in the case of a registrant that is an investment company, whether the chairman of the board is an “interested person” of the registrant as defined in section 2(a)(19) of the Investment Company Act (15 U.S.C. 80a-2(a)(19)). If one person serves as both principal executive officer and chairman of the board, or if the chairman of the board of a registrant that is an investment company is an “interested person” of the registrant, disclose whether the registrant has a lead independent director and what specific role the lead independent director plays in the leadership of the board. This disclosure should indicate why the registrant has determined that its leadership structure is appropriate given the specific characteristics or circumstances of the registrant. In addition, disclose the extent of the board's role in the risk oversight of the registrant, such as how the board administers its oversight function, and the effect that this has on the board's leadership structure.”).

[42] See Marc Goergen et al., On the Choice of CEO Duality: Evidence from a Mandatory Disclosure Rule (CFR Working Paper No. 18-06, Centre for Financial Research (CFR), University of Cologne, 2018).

[43] Id. at 11.

[44] See id. at 23.

[45] PwC, The Collegiality Conundrum: Finding Balance in the Boardroom – PwC’s 2019 Annual Corporate Directors Survey, PwC, at 31 (Oct. 2019).

[46] EY, How Will ESG Performance Shape Your Future? Why Investors Are Making ESG an Imperative for COVID-19 and Beyond – Climate Change and Sustainability Services (CCaSS) Fifth Global Institutional Investor Survey, EY (July 22, 2020).

[47] PwC, supra note 45, at 29.

[48] See Diana Kearney & Sharmeen Contractor, Investors Embrace Human Rights in the Age of Corona, Responsible Investor (Sept. 4, 2020).

[49] Council of Institutional Investors, Dual-Class IPO Snapshot: 2017—2019 Statistics, Council of Institutional Investors, (last visited Oct. 2, 2020).

[50] See Kosmas Papadopoulos, Dual-Class Shares: Governance Risks and Company Performance, Harvard Law School Forum on Corporate Governance (June 28, 2019).

[51] Council of Institutional Investors, Dual-Class Companies List: June 2020, Council of Institutional Investors (June 2020).

[52] Rani Molla, More Tech Companies Are Selling Stock That Keeps Their Founders in Power, Vox (Apr. 11, 2019).

[53] Vijay Govindarajan & Anup Srivastava, Reexamining Dual-Class Stock, 61 Business Horizons 461 (2018).

[54] PwC, Turning Crisis into Opportunity: PwC’s 2020 Annual Corporate Directors Survey, PwC, at 18 (Sept. 2020).

[55] Securities and Exchange Commission Investor Advisory Committee, Recommendation of the Investor Advisory Committee: Dual Class and Other Entrenching Governance Structures in Public Companies, Securities and Exchange Commission Investor Advisory Committee (Mar. 2018).

[56] See Kosmas Papadopoulos, supra note 50.

[57] See Marc T. Moore, Designing Dual Class Sunsets: The Case for a Transfer-Centered Approach, University of Oxford Business Law Blog (Jan. 16, 2020), (“The general tolerance shown towards DCS by US regulators puts the United States in stark contrast to numerous other jurisdictions including Australia, Belgium, Brazil, Germany, Italy, Spain and the United Kingdom.”)

[58] See Securities and Exchange Commission, SEC Adopts Amendments to Modernize Shareholder Proposal Rule, Securities and Exchange Commission (Sept. 23, 2020). Under the new rules, shareholders must own $25,000 of the company’s securities for at least one year, $15,000 for two years, or $2,000 for three years before they can submit a proposal.

[59] See Robert Biskup et al., Board Oversight of Corporate Compliance: Is It Time for a Refresh?, Deloitte LLP (Oct. 15, 2019).

[60] See Kendyl Salcito, Company-Commissioned HRIA: Concepts, Practice, Limitations and Opportunities, in Handbook on Human Rights Impact Assessment 32 (2019).