Wednesday, January 19, 2022
Report: Computers Make Critical Decisions for CT Agencies; How That Works is Kept Secret
Three major Connecticut state agencies have used automated software programs called algorithms to make policy decisions affecting school funding, the removal of children from families, and the hiring of state workers, but those agencies are unable or unwilling to fully disclose how these programs make their decisions, a report has found.
Significant gaps in public oversight of behind-the-scenes decision-making by computers were revealed in the comprehensive research report released by the Media Freedom and Information Access Clinic (MFIA) at Yale Law School, in collaboration with The Connecticut Foundation for Open Government (CFOG) and the Connecticut Council on Freedom of Information (CCFOI).
The report, “Algorithmic Accountability: The Need for a New Approach to Transparency and Accountability When Government Functions Are Performed by Algorithms,” is based on Freedom of Information Act (FOIA) requests made by the MFIA Clinic. The three state agencies — whose responses were generally deficient and untimely — are the departments of Children and Families (DCF), Education (DOE) and Administrative Services (DAS).
The report documents examples outside Connecticut where algorithms produced faulty background checks, made incorrect facial recognition matches, denied benefits erroneously, and wrongfully terminated parental rights. It sought to determine the extent to which the public can know that algorithms used by Connecticut agencies do not suffer from similar flaws.
“The potential for real harm makes transparency surrounding the government’s use of algorithms critical,” said MFIA Fellow Stephen Stich ’17, a co-author of the report. “This transparency does not exist in Connecticut today.”
Speaking to the agencies’ noncompliance with FOIA, Mitchell W. Pearlman, former executive director of the Connecticut Freedom of Information Commission and an officer of CFOG, added, “Unfortunately, government agencies have always used these same techniques of delay, denial and obfuscation to hinder access to public records that would hold those agencies accountable and shed some needed sunlight on their activities.”
Algorithms are sets of computer instructions government agencies and others increasingly use to solve problems or accomplish tasks formerly done by humans. Policymakers use algorithms to conduct government business that has life-changing effects. They are used widely for everything from setting bail to allocating police resources and distributing social welfare benefits.
Although algorithms can improve effectiveness and efficiency, their use has problems, the report explains. Algorithms can make mistakes and worsen preexisting biases. And when government algorithms make mistakes, they can upend lives.
The agencies’ delayed and sparse responses to inquiries led the Yale clinic to conclude that “existing disclosure requirements are insufficient to allow meaningful public oversight of the use of algorithms, and that agencies do not adequately assess [algorithms’] effectiveness and reliability.” Moreover, agency personnel appear unconcerned about the lack of transparency, the report states.
Using the FOIA, the Yale students sought information about what certain algorithms used by the three agencies do, how they were obtained, whether they were evaluated, and the extent to which answers to those questions can be accessed under existing open-government laws.
Among the three state agencies, the most incomplete response came from DAS, according to the report. The agency is arguably one of Connecticut’s most powerful state government departments, having broad authority over services that cover employment practices, procurement, facilities management, and technology.
FOIA requires state agencies to respond to inquiries within four business days. But the report shows that, for several months, DAS ignored the request for information about a new algorithm used in hiring state employees and contractors. Many attempts to set up meetings went unheeded, the report recounts. Ultimately, DAS provided no documentation after invoking FOIA exemptions the report’s authors found dubious and irrelevant.
The report shows that DOE responded, in part, to an inquiry involving an algorithm used to assign students to public schools, an issue of particular importance given the court cases and settlements designed to end racial segregation in Connecticut schools. However, the agency failed to disclose how its school-assignment algorithm worked, and there appeared to be no mechanism allowing parents to challenge its decisions, according to the report.
Citing a trade-secret exemption, DOE refused to produce key data and failed to produce most records regarding its acquisition of the algorithm. The department produced only the procurement announcement and the contract, which totaled $650,000. Thus, it was impossible to evaluate the algorithm’s efficacy or bias, the report concluded.
DCF officials provided the only complete response to the Yale clinic’s request, producing documents on its algorithm used to reduce the number of children experiencing life-threatening episodes. Illinois abandoned that same algorithm after determining it was ineffective. The report documented that DCF had not performed a “robust evaluation of the algorithm’s efficacy or bias” before the algorithm was implemented or in the three years it was used.
The authors say legislative remedies are needed, like those introduced or considered by governmental entities elsewhere. The report discusses options that include forcing state agencies to assess publicly the effectiveness and bias of any algorithm used, mandating the waiver of trade secret protections in certain situations, and requiring disclosures to individuals who are subject to algorithmic decisions.
“While the details of these approaches require study, it is imperative that steps are undertaken now to identify an effective response to the current lack of algorithmic accountability. The potential for serious harm to be inflicted by malfunctioning or biased algorithms is too great to ignore,” said William J. Fish Jr., president of the Connecticut Foundation for Open Government.
The Media Freedom and Information Access Clinic is dedicated to increasing government transparency, defending the essential work of news gatherers, and protecting freedom of expression through impact litigation, direct legal services, and policy work.