To Build a Better Online Community, Initiative Looks Beyond Moderation


Social media platforms hire thousands — even tens of thousands — of contractors to review endless streams of text, photos, and videos. These screeners look for content that breaks the platforms’ rules, and the platforms then remove the offending posts. For now, that’s the industry standard for dealing with harmful content online. But recently, a research center at Yale Law School set out to find other approaches.

These ways to create a healthier online community are detailed in “Beyond Moderation: Emerging Research in Online Governance,” a new collection of essays from the Justice Collaboratory’s Social Media Governance Initiative (SMGI).

“Rather than focus exclusively on removing harmful content after it is posted, we think there are ways to intervene further upstream where online behavior and user interaction start to form, with the goal of promoting online community vitality,” said Matt Katsaros, the initiative’s director.

The Justice Collaboratory works toward a theory-driven, evidence-based justice system in which people and communities feel and experience equity, respect, and trust. The initiative is applying that same standard to the emerging area of online governance, roughly defined as the rules, policies, standards, and practices that coordinate and shape the online world. The initiative’s aim: to create an online environment that is good for society.

READ THE ESSAYS: “Beyond Moderation: Emerging Research in Online Governance”

To spark discussion and collaboration about how an “upstream” approach to online governance would work, the Justice Collaboratory brought together 80 people for two days of presentations and discussions this spring. The event’s theme, “Beyond Moderation,” acknowledges the limitations of the current approach and recognizes the possibility of alternatives, according to the organizers. The collection of essays grew out of the event.

The event’s 80 participants were roughly split between research scholars and industry practitioners. The scholars spanned disciplines including law, human-computer interaction (HCI), sociology, and psychology. The industry professionals included data scientists, product managers, policy managers, and UX researchers, who came from organizations including Jigsaw, Meta, Niantic Labs, Reddit, Roblox, Snap, Spectrum Labs, Spotify, TikTok, Tinder, Twitch, and Wikimedia.

Noting the importance of bringing fresh voices to the conversation, the event’s organizers invited three emerging scholars to share written presentations of their research on online governance. The compilation of their work is now online with an introduction by Katsaros and Justice Collaboratory Executive Committee member Sudhir Venkatesh, in which the coauthors reflect on both the convening and the current state of online trust and safety.

In the first essay, “The Teenaged Adults in the Room: Understanding and Supporting Young Online Community Moderators,” Jina Yoon of the University of Washington examines how teenagers participate as volunteer moderators of online communities. Through qualitative interviews, she explores how these teens use the experience to learn and grow. In doing so, Yoon reframes the dominant narrative, which is limited to protecting teens from harm online, and instead tells a story that recognizes and seeks to enhance their agency.

The next essay, “Learning to Resist Misinformation: A Field Experiment,” is by Monika Yadav of Columbia University. In the study, Yadav and a colleague conduct a rigorous longitudinal field experiment in India, providing participants with weekly digests of factual information addressing viral misinformation that had spread across the country during the previous week.

Finally, Adina Gitomer of Northeastern University explores online activism in her essay “Stop Scrolling!: Youth Activism and Political Remix on TikTok.” Gitomer’s timely research focuses on how TikTok’s specific design and structure are leveraged for political activism on the platform. Through a mixed-methods approach, Gitomer and her colleagues examine the ways that younger and older TikTok users create and spread political messages.

The Justice Collaboratory’s SMGI will continue to convene researchers and practitioners to share knowledge and ideas that can help ensure healthy and safe online communities.