The Problem With Tomorrow
A woman in Arkansas with severe cerebral palsy sees her Medicaid-provided caretaker hours reduced from 56 to 32 per week. In Houston, some public school teachers are fired while others receive paycheck bonuses. A teenager receives a harsh sentence in juvenile court in D.C. Police officers in New Orleans patrol the streets with a list of “high-risk” individuals to check in on periodically. Black families in Allegheny County, Pennsylvania, receive disproportionate attention from the county’s Department of Human Services for child abuse-related issues. Twenty thousand individuals were told they had won the Green Card lottery in 2011—only to learn that there was a mistake and that the lottery would be run again, potentially costing them their futures in the United States.
One common strand ties these disparate threads together: none of these governmental decisions were made by a human being.
Instead, they were outsourced to an algorithm in one form or another. Our government is leaning on algorithms more and more to analyze, assess, and decide the fates of its citizens. From access to benefits to the allocation of public resources, sentencing in the criminal justice system, immigration, and more, algorithms increasingly determine how the government affects our lives and livelihoods. And what’s more, these algorithms are rarely developed by the government itself—rather, they are outsourced in the truest sense of the word, purchased from private corporations in return for lucrative contracts, reams of data, and the protections offered by trade secret law.
Companies profit from selling their software products. The government sidesteps the public accountability that transparency—and intelligibility—would provide, in pursuit of a world of technologically driven efficiency and accuracy. We are left to grapple with decisions we struggle to contest, often because they resist explanation by design.
This is the story behind the examples this post began with, and many more. In some cases, attorneys manage to convince the court that the government cannot continue to rely solely on a number-crunching algorithm to decide which teachers keep their jobs or on a complicated decision tree to calculate disability benefits. But in others, companies claim that trade secret law protects their code from dissemination or even examination—including by judges—even when someone’s freedom is on the line.
How is it that the government can prioritize a business’s investment in the secrecy of its product over, for example, one’s constitutional right to confront their accuser or enjoy due process of law in a criminal proceeding? In other words, how is it that government accountability bows to corporate ambition when algorithms are involved?
Trade secret law gives us insight into one aspect of this problem. It’s what enables businesses to contract with the government, gaining access to enormous amounts of data while safely avoiding public oversight of their work by keeping it under lock and key.
Algorithms & Trade Secrets
A trade secret is information of some sort that (1) provides an economic benefit to its possessor by virtue of its secrecy and (2) is the subject of reasonable and successful efforts to maintain that secrecy. If you think that definition sounds vague, you would be correct. It is an expansive protection that covers a great deal of intellectual property in the business world, from Coca-Cola’s soda recipe (allegedly kept in a custom vault in Atlanta that can be opened only by two senior executives, whose names are unknown to the public and who cannot fly on a plane together) to Google’s search algorithm (protected by the less literal vault of nondisclosure agreements, employment contracts, and in-house counsel with trade secret misappropriation suits ready to fire at a moment’s notice). While most states have adopted their own form of the Uniform Trade Secrets Act to further define what a trade secret is, they generally allow information that could qualify as a “formula, pattern, compilation, program, device, method, technique, or process” to be protected from misappropriation. Courts also often refer to § 757, comment b, of the Restatement of Torts (1939) for its more granular six-factor test when combing through the claims at issue in a case where the status of something as a trade secret is in question.
Algorithms have been roundly accepted as trade secrets by courts across the country. In the trade secrets context, an algorithm has been defined as some variation on “a series of commands designed to accomplish a specific task” whose “architecture is a series of functional blocks, each of which makes an independent inquiry and represents an individual step of the algorithm.”1 More simply, they are “the steps taken by a computer to solve a particular problem.”2 This amorphous definition has especially helped corporations claim trade secret protection by describing vaguely what their algorithms do rather than what they are.3
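To make the courts’ “functional blocks” definition concrete, here is a toy sketch of what such an algorithm might look like. Everything in it—the risk-scoring task, the variable names, the weights—is hypothetical and invented for illustration; it is not drawn from any vendor’s actual product.

```python
# A toy illustration of the courts' definition: "a series of functional
# blocks, each of which makes an independent inquiry and represents an
# individual step of the algorithm." All weights and fields are invented.

def normalize_record(record):
    """Step 1: clean the raw input into a uniform shape."""
    return {k: str(v).strip().lower() for k, v in record.items()}

def score_history(record):
    """Step 2: one independent inquiry -- weigh prior incidents."""
    return 10 * int(record.get("prior_incidents", 0))

def score_age(record):
    """Step 3: another independent inquiry -- weigh youth."""
    return max(0, 25 - int(record.get("age", 25)))

def risk_score(record):
    """The 'algorithm': the independent steps chained together."""
    cleaned = normalize_record(record)
    return score_history(cleaned) + score_age(cleaned)

print(risk_score({"prior_incidents": 2, "age": 19}))  # prints 26
```

Note how little the output reveals about the process: a litigant who sees only the final score of 26 learns nothing about which inquiries were made or how they were weighted, which is precisely why describing an algorithm by what it does rather than what it is keeps its workings opaque.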
When determining whether something is a trade secret, some courts focus on the economic benefit of keeping algorithms secret over whether the algorithms, in and of themselves, are discrete protectable types of information.4 Others focus on the level of secrecy at which the disputed algorithms are kept.5 These rationales extend to forms of information we might think of as comprising, resulting from, or adjacent to an algorithm, such as source code.6 On the whole, there seems to be little investigation into what exactly an algorithm is composed of, aside from occasional and superficial walkthroughs of terms like source code, programming language, and so on.7
In sum, courts have found it fairly uncontroversial that algorithms can be trade secrets. This is not a problem in and of itself. As described earlier, however, issues arise when the government uses algorithms to make the kinds of decisions that we hope are carefully reasoned, intelligible, and balanced by empathy. In these situations, the economic rationale for trade secret protection falls away in the face of our commitment to fairness, transparency, and accountability in government.
The Freedom of Information Act includes an exemption for trade secrets. Algorithm proprietors argue against subpoenas for their source code in criminal cases.8 Judges are under pressure to conduct in camera reviews of alleged trade secrets and release redacted documents to opposing parties conducting discovery. These are just some of the ways that the balance of priorities has shifted from protecting the public to protecting private interests when a suit turns on a secret with economic value. As the government continues to outsource its decision-making to proprietary algorithms, this imbalance will chip away at the safeguards that keep it accountable.
A Few Potential Solutions
This problem of diminishing government transparency falls right into the MFIA clinic’s wheelhouse. So what can we do to solve it?
There are at least three forms of the government-algorithm relationship where we might explore inserting some accountability measures to correct the public/private balance of power: procurement, public requests, and litigation.
Procurement: When an arm of the government—state, local, or federal—decides to automate some analysis or decision-making process typically carried out by humans, it could be held to a higher standard of transparency. There is not much in the way of regulation here today: indeed, most politicians in New Orleans were not aware that the city had contracted with Palantir, a Big Data software company, to develop a predictive policing algorithm. Imagine instead if the government had to lay out the terms of the contract; share the goals, training manuals, and datasets of its algorithms; offer a notice-and-comment period to gather public feedback; hire unaffiliated software engineers and academics to examine the source code; and commit to limiting each algorithm’s use to an exploratory test run before signing the full contract. These and other policies could ensure that the public was aware and in control of government automation—and would therefore have the knowledge to decide whether increased efficiency was worth the cost, or whether the algorithm accomplished what it claimed.
Public requests: The Freedom of Information Act’s Exemption 4 protects against the disclosure of trade secrets. Many states have their own versions of FOIA with similar exemptions. While this is an important safeguard to make sure the government can find willing contractors—some companies might be deterred from public works if that meant sacrificing their intellectual property—there ought to be a middle ground between fully disclosing a trade secret and redacting it entirely. If a proprietary algorithm is requested via FOIA or a similar law, the government could respond with something similar to the Vaughn index. Instead of giving away a company’s source code entirely, the government could release a report that details the types of variables used in an algorithm, its general controlling logical principles, the ways in which biases get investigated, snippets of its code taken from open source hubs, training documentation for government employees, and so on. We might also require that, for each algorithm the government uses, it offer black box testing opportunities upon request, along with similar chances for outside experts to probe the validity of algorithms without disrupting their status as trade secrets.
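Black box testing of this kind is possible without ever reading a vendor’s source code. The sketch below is a minimal, hypothetical illustration: `vendor_score` stands in for a sealed proprietary system that an auditor can only call, and the auditor probes it by submitting paired inputs that differ in exactly one attribute. The fields, weights, and the zip-code penalty are all invented for the example.

```python
# A minimal sketch of "black box" testing: probing an opaque scoring
# function for disparate outcomes without ever seeing its internals.

def vendor_score(applicant):
    # Hypothetical stand-in for a proprietary system; in practice an
    # auditor would call the vendor's sealed software, not read it.
    base = 50 - 2 * applicant["years_employed"]
    penalty = 15 if applicant["zip_code"].startswith("70") else 0
    return base + penalty

def disparity(test_cases, attribute, value_a, value_b):
    """Run identical cases through the black box, varying only one
    attribute, and report the average difference in scores."""
    diffs = []
    for case in test_cases:
        a = vendor_score({**case, attribute: value_a})
        b = vendor_score({**case, attribute: value_b})
        diffs.append(a - b)
    return sum(diffs) / len(diffs)

cases = [{"years_employed": y, "zip_code": ""} for y in range(5)]
# New Orleans-area zip codes begin with 70; a consistent gap of 15
# points would surface a geographic bias the vendor never disclosed.
print(disparity(cases, "zip_code", "70112", "10001"))  # prints 15.0
```

An audit like this reveals how the algorithm behaves, not how it is built, which is why it can coexist with trade secret protection.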
Litigation: Finally, courts might consider delving deeper into which parts of an algorithm are protectable by trade secret law and which ought to be, if not publicly accessible, then at least discoverable during litigation. Let’s return, as an example, to the predictive policing algorithm Palantir built for New Orleans. The economic value of this algorithm is its “accuracy”: how well it anticipates the risk levels of certain individuals or neighborhoods, and thus how successfully it helps the police reduce crime. The algorithm’s accuracy results from the mechanisms by which it takes a massive amount of data, draws connections between this piece and that, and “design[s] targeted interventions to protect the city’s most vulnerable populations.” Those mechanisms might be fairly categorized as trade secrets, since they fuel the algorithm’s economic value. However, other aspects of the algorithm—the data it gets fed from criminal records and social media histories, the high-level logic behind its code, the publicly available open source code snippets it uses, the particular insights it provides to the police department—ought to be discoverable, since revealing them would not jeopardize the algorithm’s market position. Greater technical literacy among the judiciary would go a long way: trade secret protection wielded as a scalpel, rather than a club, could help protect us against the tyranny of unaccountable governmental automation.
1. ClearOne Commc'ns, Inc. v. Chiang, 608 F. Supp. 2d 1270, 1273 (D. Utah 2009).
2. Morley v. Square, Inc., No. 4:10CV2243 SNLJ, 2016 WL 1615676, at *2 (E.D. Mo. Apr. 22, 2016).
3. See, e.g., Whetstone Holdings, LLC v. Thorell, No. 13-CV-24138-UU, 2014 WL 11906593, at *4 (S.D. Fla. Feb. 5, 2014) (“Plaintiff describes its trade secret as an algorithm that, based on certain information, more accurately predicts which potential customers own automobiles. Plaintiff need not describe the algorithm in greater detail, and, as such, sufficiently alleges the existence of a trade secret”); DVD Copy Control Assn., Inc. v. Bunner, 75 P.3d 1, 6 (Cal. 2003) (“The algorithm [in this case] is a type of mathematical formula for transforming the contents of the movie file into gibberish”).
4. See, e.g., Vermont Microsystems, Inc. v. Autodesk, Inc., 88 F.3d 142, 149 (2d Cir. 1996) (“Autodesk takes issue with the district court's implicit finding that the triangle shading algorithm qualifies as a trade secret under California law ... This argument is unavailing here. The economic value of VMI's triangle shading algorithm is readily apparent”); Workgroup Tech. Partners, Inc. v. Anthem, Inc., No. 2:15-CV-00002-JAW, 2016 WL 424960, at *22 (D. Me. 2016) (“[T]he inner workings of the Workgroup software program were the essence of the thing of value that Workgroup licensed to Anthem”).
5. See, e.g., Altavion, Inc. v. Konica Minolta Sys. Lab., Inc., 171 Cal.Rptr.3d 714, 738 (Ct. App. 2014) (“[Algorithms and source code] information is unquestionably protectable by trade secret law”).
6. United States v. Aleynikov, 737 F. Supp. 2d 173, 187 (S.D.N.Y. 2010) (“The source code … contains highly confidential trade secrets related to the Trading System. A market for such valuable trade secrets could readily be proven at trial”).
7. See, e.g., Universal City Studios, Inc. v. Reimerdes, 111 F. Supp. 2d 294, 304 (S.D.N.Y. 2000).
8. See United States v. Ocasio, No. EP-11-cr-02728-KC, 2013 WL 2458617, at *4 (W.D. Tex. June 6, 2013); United States v. Ocasio, No. 3:11-cr-02728-KC, slip op. at 2 (W.D. Tex. May 28, 2013). See generally Rebecca Wexler, Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System, 70 Stan. L. Rev. 1343 (2018) (arguing that increasing automation in the criminal justice system should not continue to enjoy the protections of trade secrets law).