- Saturday, April 2, 2016 at 8:30AM
- Room 127
- Open To The YLS Community Only
The increasing power of big data and algorithmic decision-making—in commercial, government, and even non-profit contexts—has raised concerns among academics, activists, journalists, and legal experts. Three characteristics of algorithmic ordering make the problem particularly difficult to address: the data used may be inaccurate or inappropriate, algorithmic modeling may be biased or limited, and the use of algorithms remains opaque in many critical sectors.
No single academic field can address all the new problems created by algorithmic decision-making. Collaboration among experts in different fields is starting to yield important responses. Researchers are going beyond the analysis of extant data, joining coalitions of watchdogs, archivists, open data activists, and public interest attorneys to ensure a more balanced set of “raw materials” for analysis, synthesis, and critique. As an ongoing, intergenerational project, social science must commit to ensuring the representativeness and relevance of what is documented—lest the most powerful “pull the strings” in comfortable obscurity, while scholars’ agendas are dictated by the information that, by happenstance or design, is readily available. What would similar directions for legal scholars and journalists look like? This conference aims to answer that question, setting forth algorithmic accountability as a paradigm of what Kenneth Gergen has called “future-forming” research.
Algorithmic accountability calls for the development of a legal-academic community that spans disciplines, joining theorists and empiricists, practitioners and scholars, journalists and activists. This conference will explore early achievements among those working for algorithmic accountability, and will help chart the future development of an academic community devoted to accountability as a principle of research, investigation, and action.
Co-sponsored by the Oscar M. Ruebhausen (OMR) Fund and YJOLT.