Wednesday, March 29, 2023
Report: Online Platforms Can Stem Problem Content With a Principle from Criminal Justice
Applying a theory from criminal justice about why people obey the law, researchers propose that online platforms can help stem the tide of problem content by being transparent with users about how they enforce rules of online behavior.
The Justice Collaboratory has released the findings of its Social Media Governance Initiative in a new publication: The Procedural Justice Framework for Tech Professionals, a Practical Guide for Building and Maintaining Healthy Online Communities.
The concept central to the guide is procedural justice, which emphasizes the role of the process in interactions with a rule-making or rule-implementing authority. Traditionally, the theory has been used in the criminal-legal setting, such as policing. However, new research demonstrates that procedural justice can also be effective in online environments as part of strategies for content moderation, the process of screening online platforms for inappropriate or harmful content.
“A content moderation approach that is procedurally just helps build healthier relationships between the platform and users, improve users’ understanding of the rules, and reduce violations in the future,” said Vivian Zhao, co-author of the report and Research Assistant at Yale Law School.
Procedural justice consists of four main components:
- Treating the individual with dignity and respect
- Giving the individual a voice
- Maintaining neutrality and transparency
- Acting with trustworthy motives
Currently, the guide explains, online platforms primarily rely on a deterrence approach: using punishment to discourage unwanted behavior. Platforms take down content that violates the rules and may sanction users with punishments ranging from suspension to a permanent ban. The underlying logic is that users follow rules to avoid punishment — much like in the criminal legal system. The guide proposes an alternative.
Studies indicate that platforms can build trust and improve rule adherence by adopting a procedural justice approach to how they design and enforce policies, according to the guide. For example, providing explanations for post removals can help users understand the platform’s decision-making process, making them more likely to respect the outcome even if they initially disagreed with it. When users who violate a platform’s rules feel that they have been treated fairly by the platform, the researchers write, they become less likely to violate these rules. As a result, online platforms help build trust with users while reducing problem content.
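To make this concrete, here is a minimal sketch of how a takedown notice might incorporate each of the four procedural justice components. This is purely illustrative and not from the guide itself; the function name, message wording, and URL are all hypothetical:

```python
# Illustrative sketch (not from the guide): a removal notice that applies the
# four procedural justice components to a content moderation decision.

def removal_notice(username, rule, explanation, appeal_url):
    """Compose a takedown message that treats the user with respect,
    explains the decision transparently, and offers a voice via appeal."""
    return (
        f"Hi {username},\n"                                      # dignity and respect
        f"Your post was removed because it violated our rule on "
        f"{rule}: {explanation}\n"                               # transparency
        f"The same rule is applied to every account.\n"          # neutrality
        f"If you believe this was a mistake, you can appeal at "
        f"{appeal_url} and a person will review the decision.\n" # voice
        f"We enforce these rules to keep this community healthy "
        f"for everyone."                                         # trustworthy motives
    )

notice = removal_notice(
    "alex",
    "harassment",
    "it targeted another user with personal insults",
    "https://example.com/appeals",
)
print(notice)
```

Compared with a bare "your post was removed" message, a notice like this explains the decision, signals even-handed enforcement, and gives the user a channel to respond — the elements the research links to greater rule adherence.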
The Justice Collaboratory is a membership-based social science research center at Yale Law School that brings together an interdisciplinary group of scholars and researchers at Yale and beyond to cooperatively work toward a theory-driven, evidence-informed justice system.