Q&A: LEAP Student Grant Winner Diego Ellis Soto on Turning Animal Data into Music


Diego Ellis Soto (Ph.D. ’24) is a Law, Ethics & Animals Program (LEAP) student fellow who researches ways to use emerging technology and satellite imagery to study ecology and conservation biology. He is a fourth-year Ph.D. candidate in Ecology and Evolutionary Biology and affiliated with the Center for Biodiversity and Global Change at Yale.
He is also a former DJ who has been using a LEAP student grant to transform animal motion, behavior, and sounds into music. The process uses the harmonies in schools of fish or flocks of birds to create compelling musical experiences that bring awareness to the beauty of animal behavior.

LEAP Program Fellow Noah Macey spoke with Ellis Soto about his LEAP student grant research, which he also discussed in a talk for the Franke Program in Science and the Humanities. The conversation has been edited for clarity. 


Could you give an overview of your project? How are you transforming animal movement into music?

The project aims to uncover hidden dimensions of nature by using advances in technology and artificial intelligence. Humans have always been interested in how nature operates — in philosophy and science — and particularly how many animals manage to move in unison, like a flock of birds or a school of fish. Until about a hundred years ago, people used to think those animals were using telepathy to coordinate. Now, with better technology, we can uncover those complexities of nature. There’s actually a lot of harmony in the ways some animals move. Nature’s fairly musical. The rules that some of these flocks use, the biological rules, are almost the same as the rules of musical harmony. Take a flock of pigeons, for example. Pigeons don’t want to be too far apart from each other, they don’t want to be too close together, and they want to move at the same speed. Those are all principles of musical harmony. 
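The three pigeon rules Ellis Soto describes (keep some distance, stay with the group, match speed) are essentially the classic "boids" flocking model from computer graphics. A minimal sketch, purely illustrative and not the project's actual code, with all weights and distances invented for the example:

```python
# A minimal, generic sketch of the three flocking rules described above
# (separation, cohesion, alignment). This is the classic "boids" model;
# the weights and distances are illustrative, not the project's values.
import numpy as np

def flock_step(pos, vel, dt=0.1, sep_dist=1.0, w_sep=1.5, w_coh=0.5, w_align=0.8):
    """Advance N birds one time step. pos and vel are (N, 2) arrays."""
    n = len(pos)
    new_vel = vel.copy()
    for i in range(n):
        others = np.arange(n) != i
        offsets = pos[others] - pos[i]
        dists = np.linalg.norm(offsets, axis=1)

        # Separation: steer away from neighbors that are too close.
        too_close = dists < sep_dist
        sep = -offsets[too_close].sum(axis=0) if too_close.any() else 0.0

        # Cohesion: steer toward the center of the rest of the flock.
        coh = offsets.mean(axis=0)

        # Alignment: match the average velocity of the rest of the flock.
        align = vel[others].mean(axis=0) - vel[i]

        new_vel[i] += dt * (w_sep * sep + w_coh * coh + w_align * align)
    return pos + dt * new_vel, new_vel
```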

At the onset of the pandemic, I met with Professor Matthew Sutter at the Yale School of Drama, and we immediately clicked: Matthew had been exploring the collective behavior of animals, and I'd been interested in the intersection of scientific communication, collective behavior, ecology, and technology. We've put together a really interdisciplinary collaboration. It's a big experiment in bridging music theory and biological theory to bring the hidden lives of animals to the public, both musically and artistically.

With the advent of artificial intelligence and better tracking technology, we can now track hundreds of actors in an ecosystem — anything from mosquitos to fish to birds — at the same time. This was very hard before the new tech. But now we can turn their movements into this vast amount of data, then use biological and musical theory to understand the data and transform it into music or artistic visualizations. The ultimate goal is to bring people closer to nature in the midst of the climate crisis and ongoing mass extinction. It’s a way to communicate with nature and visualize it in new ways. 

What kinds of animals have you worked with to make the musical pieces? 

We've been all over the place: we've used homing pigeons, stickleback fish, and golden shiner fish. We've even used eight termites on a petri dish, and I think they've actually been our most successful musicians. We recently did two art exhibits to showcase the interdisciplinary nature of this project, and we had an artist use the data from a single termite to make a drawn piece, which is something I would never have dreamed of. Because of the collaborative nature of this project, new dimensions are added all the time.

In the termite example, we had the termites play the guitar, keyboard, and drums. The challenge is transforming the data about their movement into music in a way that sounds good while preserving the data. We could massage it so much that you don’t get a sense that the termites are driving the art. But that’s not the goal. The goal is to bring people closer to nature using recent advances in technology. 

The project is growing really fast, partly thanks to LEAP, and I’m kind of overwhelmed by the interest people are showing. I’m getting emails about it every week from people all over the world who want to work with us. The next step involves forming a global movement of people who want to make music from natural patterns. I’m planning a few lectures next year both at Yale and outside New Haven. There seems to be a lot of momentum in society to bring people and nature closer together, especially by closing the gaps between STEM and the arts, and this project is well situated for that. 

Could you describe exactly how you’d go from having eight termites on a petri dish to music? 

Yes, let's do a concrete example of how artificial intelligence can transform biology into music. You start with a video of the termites walking around the petri dish. First, we need to transform that video into clean data, and for that we use something called computer vision. It's a form of deep learning, basically an algorithm that tracks each animal separately and gives its coordinates for the entire duration of the video. The AI algorithm gives us the position of each termite in space. With that information, we can determine how fast the termites are moving, how far away they are from each other, and whether their motion is aligned or misaligned. We can transform all that data into musical space using music theory, mapping from the patterns in their movements to the rules of chord progression. So we're able to turn termite positions into different notes on a keyboard, and the distance between the notes is based on the distance between the termites. We've kind of turned the petri dish into a circular keyboard, which is a bit of a simplification but gets the spirit of it, and the choices we make about how to map the data onto music let us put together an opera of termites.
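As a concrete illustration of that last step, here is one way tracked coordinates could be mapped onto notes, treating the dish as a circular keyboard. This is a sketch only; the project's actual mapping and music-theory rules are more elaborate, and the scale, velocity rule, and data layout below are all assumptions made for the example:

```python
# Hypothetical position-to-pitch mapping in the spirit of the "circular
# keyboard" idea. Not the project's real code; the scale, the loudness
# rule, and the array layout are assumptions made for illustration.
import numpy as np

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave of MIDI note numbers

def termites_to_notes(tracks, center, frame):
    """tracks: (n_frames, n_termites, 2) array of (x, y) coordinates from a
    computer-vision tracker. Returns one (pitch, velocity) pair per termite."""
    pos = tracks[frame]
    notes = []
    for i, (x, y) in enumerate(pos):
        # Angle around the dish selects the note, like a circular keyboard.
        angle = np.arctan2(y - center[1], x - center[0])  # in [-pi, pi]
        idx = int((angle + np.pi) / (2 * np.pi) * len(C_MAJOR)) % len(C_MAJOR)

        # Speed since the previous frame sets loudness (MIDI velocity, 0-127).
        speed = np.linalg.norm(pos[i] - tracks[frame - 1][i]) if frame > 0 else 0.0
        velocity = int(np.clip(40 + 20 * speed, 0, 127))
        notes.append((C_MAJOR[idx], velocity))
    return notes
```

Pairwise distances between termites could set harmonic intervals in the same spirit, which is where the chord-progression rules he mentions would come in.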

The interesting bit is that, at the end of the day, we can do this for any species: the code is species-agnostic. I'm really interested to hear the differences between flying, swimming, and walking animals. I have no idea how different they'll sound, but I'm excited to find out.

That's how you'd make music using computer vision. Another way to do it is to use microphones in the wild to record things like birdsong. That also uses AI, but it's a sound recognition algorithm, kind of the same thing that happens when someone says "hey Siri" and their iPhone mic turns on. We use an algorithm that does the same thing when birds sing. So we dropped $150 worth of microphones all over the Yale Farm and were able to record and identify dozens of different species. This would have been unthinkable 15 years ago; it would have been an entire Ph.D. project. And now we've done it for under $200.
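For readers who want to try the microphone approach, a pretrained bird-sound classifier does the heavy lifting. The sketch below assumes the open-source BirdNET model via the birdnetlib Python package; the article does not name the project's actual toolchain, and the filename, coordinates, and date are placeholders:

```python
# Sketch of automated birdsong identification using birdnetlib, an
# open-source wrapper around the BirdNET classifier (an assumption; the
# project's actual tools aren't named in the article).
from datetime import datetime

from birdnetlib import Recording
from birdnetlib.analyzer import Analyzer

analyzer = Analyzer()  # loads the pretrained BirdNET model

recording = Recording(
    analyzer,
    "yale_farm_mic_01.wav",     # placeholder filename for one field recording
    lat=41.31, lon=-72.92,      # roughly New Haven; narrows candidate species
    date=datetime(2023, 5, 1),  # season also narrows candidate species
    min_conf=0.5,               # drop low-confidence detections
)
recording.analyze()

for det in recording.detections:
    # Each detection carries species names, a time window, and a confidence.
    print(det["common_name"], det["start_time"], det["confidence"])
```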

So basically, we make animal music in two ways: we let animals make music with their movements, and we use sounds from nature. The cool thing about the microphone method is that it's place-based. So if you were to teach children to do this, they could have an enhanced experience listening to their own soundscapes and environments. You can also make music for your own farm or backyard and create a sense of space and community.

How have you brought in collaborators on the project?

We just did two exhibitions. One was at Yale's Tsai CITY, where we created visual media and videos from animal data along with music. We were trying to bring people closer to nature through their visual and auditory senses, and that involved real animals generating art. There have been a number of other events, too. We've worked with local youth from New Haven schools on art for a contest, which was really exciting. There was a ton of participation from the youth, and we were able to partner with a hospital and have patients and essential healthcare workers create art for the project. Beyond the artistic and science communication value of this project, there's a ton of educational value. It allows us not just to understand nature in a better way, but also to bring the incredible sounds of nature into human hands to create more art. Anyone can do this.

We’ve had a really diverse array of collaborations from this project already, kind of all over the place in terms of media — which reflects my general mindset. We’re excited to be part of a collective pulse of collaborators making this sort of art. This is probably the most interdisciplinary project I’ve had the luck of working on. We’ve had data scientists, musicians, electrical engineers — shoutout to Ph.D. Candidate Jonathan Koss, who is our artificial intelligence and engineering expert — web developers, artists, sculptors, people from local nonprofits, from social justice groups, from local art spaces, biologists from across the world, computer scientists, cancer patients, and composers. It’s just an incredibly interdisciplinary team. If anyone reading this is interested in hearing more or joining, they should totally reach out to me at diego.ellissoto@yale.edu. The overall principle is to have an interdisciplinary combination of people from the humanities, STEM, and the arts. 

The variety of people involved has taught us really concrete lessons about interdisciplinary collaboration. We've all become much better communicators. For instance, we're better at avoiding jargon whenever we talk, because jargon terms can sound similar while meaning different things in different disciplines. If we each spoke our own silo's language, we'd never understand each other.

Another fun part of working with interdisciplinary teams is how subjective interpretations can be, not just of art and music but also of data. Coming from a very STEM background, I've appreciated how open-ended and open-minded artists and humanities people are. The project is less hypothesis-driven, less concretely goal-oriented. To make art, you often have to be exploratory, which is amazing but may be frowned upon in basic STEM research, where you often need to know your hypothesis before acting. That's just not the case in art, and I think working with this interdisciplinary team is ultimately making me a better scientist.

What’s on the horizon for this project? 

We're trying to make our methods publicly available so that anyone can recreate this sort of project, maybe as a handbook that a music school, interdisciplinary art department, college, or after-school program could use. We might play at a few music festivals soon, and we're branching out of the Yale community to engage with New Haven more broadly. We're also trying to get more kids involved because they're the most curious scientists, and they always ask the hardest questions. It would be incredible to develop the capacity to work with them.

This work matters. Society is becoming increasingly interested in species loss and climate change. To me there’s no greater sin than losing a species without knowing that we lost it, and the same can be said of soundscapes. Nature has been a source of infinite inspiration for every human culture, art, and science, and documenting the rapid change of soundscapes and of species is imperative given the current biodiversity crisis. Presenting people with a bird’s-eye view of the fragility and beauty of nature will motivate more of them to be better stewards of the environment. Not everyone needs to be a biologist. I just want them to be passionate about the neighborhoods and environments they live in. So if you’re not a biologist, you can totally still be involved in this project. We need people who are curious about everything.