Media Freedom and Information Access Clinic Talks to OpenAI’s David Robinson ’12


David Robinson ’12 is the Head of Policy Planning at OpenAI, the artificial intelligence laboratory best known for developing ChatGPT. A dozen years ago, he was one of the early students to work with Yale Law School’s Media Freedom and Information Access Clinic (MFIA). He recently caught up with the clinic for a conversation about his time in law school and his work since. 


Born with a mild case of cerebral palsy that affected his fine motor skills, David Robinson wasn’t very good at writing as a child. It was only when he was given access to a word processor in school that he discovered his love of writing. The policies that brought computers to classrooms had a liberating effect on the young Robinson.

“What I take from that is that the impact and benefits that technology have for people is about getting the rules right,” Robinson said.

After majoring in philosophy at Princeton University and earning a master’s degree at Oxford as a Rhodes Scholar, Robinson returned to Princeton in 2007 to help start the Center for Information Technology Policy, a joint venture between the Computer Science department and the School of Public Policy. The experience made Robinson recognize the importance of the law in shaping the future of technology, leading him to Yale Law School.

In 2011, he was part of the second cohort of the Media Freedom and Information Access Clinic, which is dedicated to increasing government transparency, defending the essential work of news gatherers, and protecting freedom of expression. The clinic is an initiative of the Information Society Project (ISP), a Yale Law School center that supports a community of interdisciplinary scholars who explore issues at the intersection of law, technology, and society.


“Law school gives you a small window into complex domains where people go very deep,” Robinson said. “Clinics in particular give you a flavor of various disciplines and they are such an important component of a legal education.”

Always interested in bridging law and computer science, Robinson co-founded a project during his third year of law school to bring technology expertise to policymakers. The project eventually became Upturn, a Washington, D.C.-based nonprofit that partnered with leading civil rights and social justice organizations to drive design changes at major online platforms like Facebook and Google.

During this period, Robinson examined what was going wrong with various high-stakes technologies, addressing issues including pretrial risk assessment, benefits eligibility, and employment screening systems. He was part of the group that started the Association for Computing Machinery’s Conference on Fairness, Accountability, and Transparency, which has become the leading interdisciplinary venue for computer science and social research related to problems of justice and fairness in computing.

But Robinson, who characterizes himself as “dispositionally interested in approaching issues from various perspectives,” also felt a need to move beyond critique and explore what better models for governing technology might look like. Students in a seminar he taught at Georgetown Law, Governing Automated Decisions, introduced him to a technology that seemed to be doing a lot of things right. That technology was the Kidney Allocation System, an organ transplant matching algorithm built by a diverse group of patients, surgeons, clinicians, data scientists, public officials, and advocates. Its creators had pursued participatory and transparent design, conducted impact assessments, and invited third-party audits. The story of the algorithm became the basis for his 2022 book, Voices in the Code.

Robinson said his goal with the book was “to tell a story that would have room for the moral problems that haven’t been solved; the valiant efforts that have been made, with mixed results, to try and govern things in a wise manner; the people who are disappointed; the people who are pleased; and, ultimately, what the world can learn about how to govern software, if not perfectly, incrementally more wisely than we’ve been doing lately.”

In 2018, Robinson left Upturn to join the Information Science department at Cornell University as a visiting scientist. After three years at Cornell, he shifted to Apple University, the internal learning and development group at Apple. There, he led seminars for executives on values questions connected to technology. 

In May 2023, Robinson joined OpenAI as Head of Policy Planning. In this capacity, Robinson will help shape OpenAI’s engagement in the AI policy debate at a critical moment. The U.S. and the E.U. are each weighing regulations for ChatGPT and other AI tools, while many companies have implemented, or are considering, their own policies on AI use.

“I’m looking forward to bringing all these experiences to bear in the next chapter,” Robinson said of his new role.

Looking back, Robinson sees his time at Yale Law School as a crucial period for developing the ideas, approaches, and community that he draws on today.

“The most important impact on my life from being a student at Yale is the relationships with other students that I formed while I was there, including with people that are connected to me through MFIA and the ISP,” Robinson said. “Today, those people are all over this field in various professional roles. I’ve really forged a connection with them that’s endured.”