Date of Submission

Fall 2023

Academic Program

Computer Science

Project Advisor 1

Sven Anderson

Abstract/Artist's Statement

With the widespread proliferation of AI technology, deep architectures, many of which are based on neural networks, have been remarkably successful across a variety of research areas and applications. Within the relatively new domain of Music Information Retrieval (MIR), deep neural networks have also succeeded at a variety of tasks, including tempo estimation, beat detection, genre classification, and more. Drawing inspiration from projects like George E. Lewis's Voyager and Al Biles's GenJam, two pioneering endeavors in human-computer interaction, this project tackles the problem of expressive music generation and seeks to create a Symbolic Music Transformer as a real-time musical improvisation companion, exploring the potential of AI to enhance the human experience of music. We implement a first iteration of a Transformer that can generate musical output. While the model struggles to generalize to a variety of inputs, likely due to the limited computational resources and training data available, it learns the structure of encoded MIDI sequences and can generate expressive MIDI performances. We also present a working prototype of a performance environment built with Max/MSP that parses auditory information in real time and serves as the interface between the model and the musician.

Senior Project submitted to The Division of Science, Mathematics and Computing of Bard College.

Open Access Agreement

Open Access

Creative Commons License

This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

This work is protected by a Creative Commons license. Any use not permitted under that license is prohibited.
