Handbook of AI for Music: Chapter 9: On Making Music with Heartbeats

After two years’ gestation, the Handbook of Artificial Intelligence for Music is now born. Credit goes to the initiative, vision, and gumption of the book’s illustrious editor Eduardo Miranda, composer and Professor in Computer Music at the University of Plymouth. Hailed as the definitive work on AI and music computing, the 34 chapters of the >1000-page book cover current trends and new and emerging topics in AI, including biocomputing and quantum computing. Chapter 9 describes Elaine Chew’s making of arrhythmia music.

© 2021
Handbook of Artificial Intelligence for Music
Foundations, Advanced Approaches, and Developments for Creativity
Editor: Miranda, Eduardo Reck

Book Synopsis: This book presents comprehensive coverage of the latest advances in research into enabling machines to listen to and compose new music. It includes chapters introducing what we know about human musical intelligence and on how this knowledge can be simulated with AI. The development of interactive musical robots and emerging new approaches to AI-based musical creativity are also introduced, including brain-computer music interfaces, bio-processors and quantum computing.
Artificial Intelligence (AI) technology permeates the music industry, from management systems for recording studios to recommendation systems for the online commercialization of music. Yet whereas AI for online music distribution is well advanced, this book focuses on a largely unexplored application: AI for creating the actual musical content.

Chapter 9: On Making Music with Heartbeats
by Elaine Chew

Abstract: Representation and analysis of musical structures in heart signals can benefit understanding of cardiac electrophysiology aberrations such as arrhythmias, which can in turn aid diagnosis and treatment.
The typical time-frequency analysis of electrocardiographic recordings of cardiac arrhythmias yields descriptive statistics that provide useful features for classification, but fails to capture the actual rhythms of the physiological phenomena. Here, I propose to use music notation to represent beat-to-beat and morphological feature-to-feature durations of abnormal cardiac rhythms, with articulation markings where emphasis is warranted. The rhythms and articulations captured in these representations may provide cues to differentiate between individual experiences of cardiac arrhythmia, with potential impact on personalising diagnostics and treatment decisions.
Music generation is presented as an application of these rhythm transcriptions. The physiological origins ensure that music based on heart rhythms, even abnormal ones, sounds natural. The two-part music creation process draws inspiration from music collage practices, and comprises a retrieval component followed by transformation processes, which can be applied at the melody or block levels, and in complex combinations thereof.
The music thus created can not only be used to identify distinct heart signatures and what they mean for different cardiac conditions; it can also provide a visceral record of the experience of an arrhythmic episode. The pounding and fluttering of arrhythmia can often be physically uncomfortable. The music created from arrhythmic traces is not easy listening; it is often provocative, but potentially instructive.
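To make the rhythm-transcription idea in the abstract concrete, here is a minimal sketch (not Chew's actual method; the function name, note-value table, and thresholds are all assumptions for illustration) of how beat-to-beat (RR) intervals from an ECG might be quantized to standard note values relative to a reference beat:

```python
# Hypothetical sketch: map beat-to-beat (RR) intervals, in seconds,
# to the nearest standard note value relative to a reference beat
# (treated as the "quarter note"). Illustrative only; not the
# transcription method described in the chapter.

NOTE_VALUES = {
    0.25: "sixteenth",
    0.5: "eighth",
    0.75: "dotted eighth",
    1.0: "quarter",
    1.5: "dotted quarter",
    2.0: "half",
}

def transcribe_rr(rr_intervals_s, reference_beat_s=0.8):
    """Quantize each RR interval to the closest note value,
    measured as a ratio against the reference beat duration."""
    notes = []
    for rr in rr_intervals_s:
        ratio = rr / reference_beat_s
        closest = min(NOTE_VALUES, key=lambda v: abs(v - ratio))
        notes.append(NOTE_VALUES[closest])
    return notes

# A steady rhythm maps to even quarter notes; a premature beat followed
# by a compensatory pause appears as a short-long (eighth, dotted
# quarter) pattern.
print(transcribe_rr([0.8, 0.8, 0.4, 1.2, 0.8]))
```

Even this toy quantizer shows how an arrhythmic episode becomes a legible rhythmic figure rather than a summary statistic, which is the intuition behind the chapter's notation-based representation.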

Archived on HAL:

Chew, E (2021). On Making Music with Heartbeats. In ER Miranda (ed.): Handbook of Artificial Intelligence for Music – Foundations, Advanced Approaches, and Developments for Creativity. Springer Nature Switzerland AG. doi: 10.1007/978-3-030-72116-9_9.

Additional supporting material: