The past few years have seen an unprecedented surge in both the availability of bioacoustic data and the sophistication of AI/machine learning models. This convergence presents a unique window of opportunity to revolutionize our understanding of animal communication and biodiversity. Realizing that opportunity, however, requires a deliberate effort to integrate the disciplines of AI/machine learning and ethology.
This workshop will explore the intersection of artificial intelligence (AI) and bioacoustics, addressing challenges in processing complex bioacoustic data and interpreting animal signals in order to advance our understanding of non-human animal communication. Join us for a poster session, keynote talks, and a panel discussion as we explore key opportunities to use AI to decipher animal communication and thereby deepen our understanding of the natural world.
| Time | Event |
|---|---|
| 09:00 - 09:45 | Coffee Provided |
| 09:00 - 09:15 | Opening Remarks |
| 09:15 - 10:15 | Plenary Talk by Laela Sayigh |
| 10:15 - 11:15 | Lightning Talks: Part 1 |
| 11:15 - 11:30 | Break |
| 11:30 - 12:30 | Plenary Talk by Oisin Mac Aodha |
| 12:30 - 14:00 | Poster Session & Lunch |
| 14:00 - 15:00 | Lightning Talks: Part 2 |
| 15:00 - 15:15 | Break |
| 15:15 - 16:15 | Final Plenary Talk by Julie Elie |
| 16:15 - 17:00 | Panel |
| 17:00 | Conclude |
Dr. Sayigh is a leading expert in cetacean behavior and communication, with over 5,000 citations of her work. She holds a Ph.D. from the MIT/WHOI Joint Program and has dedicated her career to understanding the social behavior and acoustic communication of whales and dolphins. For many years she has been involved with a long-term study of bottlenose dolphins in the waters near Sarasota, Florida, where her work has focused on individually distinctive signature whistles and other aspects of dolphin communication.
Recently, this research has involved playback experiments to free-swimming dolphins, filmed with drones. Given the challenges of studying species that spend most of their lives underwater, she is involved in research that utilizes new technologies, such as non-invasive tags, to study cetacean communication systems.
Oisin Mac Aodha is a Reader (equivalent to Associate Professor) in Machine Learning in the School of Informatics at the University of Edinburgh. He was a Turing Fellow from 2021 to 2025, is currently an ELLIS Scholar, and is a founder of the Turing interest group on biodiversity monitoring and forecasting. His current research interests are in computer vision and machine learning, with a particular emphasis on 3D understanding, human-in-the-loop methods, and AI for conservation and biodiversity monitoring.
From 2016 to 2019 he was a postdoc in Prof. Pietro Perona's Computational Vision Lab at Caltech, working with the Visipedia team. Before Caltech, he spent three years (2013-2016) as a postdoc in the Department of Computer Science at University College London (UCL) with Prof. Gabriel Brostow and Prof. Kate Jones. There he worked on interactive machine learning, designing algorithms that enable non-programming scientists to semi-automatically explore events of interest in vast quantities of audio and visual data.
He completed both his MSc (with Dr. Simon Prince) and PhD (with Prof. Gabriel Brostow) at UCL, and holds an undergraduate degree in electronic engineering from the University of Galway in Ireland. Before his PhD, he spent a year as a research assistant in Prof. Marc Pollefeys' group at ETH Zurich.
Dr. Elie is a computational neuroethologist specializing in the neural mechanisms underlying vocal communication in animals. With over 1,400 citations, her work bridges ethology, bioacoustics, neurobiology, and computational neuroscience. She has made fundamental contributions to understanding how animals produce and perceive entire vocal repertoires, particularly in zebra finches and Egyptian fruit bats, both vocal learning species.
Her research combines rigorous behavioral experiments, neural recordings (including wireless recordings from freely behaving animals), and sophisticated computational analyses to decode how the brain represents and processes communication signals. Her work addresses core questions about individual recognition, the relationship between acoustic structure and meaning, and how neural circuits enable flexible vocal production and categorical perception across diverse call types.
We invite short papers and proposals related to using AI for non-human animal communication. The following are some relevant areas for consideration, but submissions are not limited to these topics.
The submission deadline was September 5, 2025. Submissions are no longer being accepted.
We also welcome applications to review; please express your interest by emailing aiforanimalcomms@earthspecies.org.
We have two tracks for submissions of manuscripts: short papers and proposals. Submissions to this workshop are considered non-archival and thus may be submitted elsewhere in the future, but should represent novel work not previously published.
We welcome submissions of brief research papers that summarize the context, methods, and results of a study or set of experiments. Please limit your submission to 4 pages excluding references and supplementary materials. Reviewers will not be required to read or review the supplementary content and will evaluate the papers based on the main content. Papers should have the following components:
The proposals track is intended for articulating new research ideas or forward-looking advancements that do not yet have finalized results, or for proposing how concepts from interdisciplinary or domain research could be applied to AI development or to applications in animal communication.
Submissions should be novel work not previously published at machine learning conferences; however, new findings that build on prior work are welcome. We may consider new presentations of work previously published in applied scientific journals on a case-by-case basis, as long as the findings themselves have not been previously published.
We will be hosting a poster session and potentially a short series of talks on successful papers (the schedule for the day is still TBC). Please note that talks will take place in person; we are not able to support virtual presentations.
Please reach out to aiforanimalcomms@earthspecies.org (cc ellen@earthspecies.org), and we will get back to you.
For any questions or further information regarding the workshop, please contact Ellen Gilsenan-McMahon.
Email: ellen@earthspecies.org