
Work Highlights
Face- & Eye-Controlled Interactive AI Music System for Analog Synths
This video is an excerpt from a lecture-recital I gave at Western Carolina University. In it, I detail an interactive music system that leverages AI and XR technology to enable hands-free homophonic music performance on analog synthesizers. The system uses Python, OpenAI's text-davinci-003 language model, MIDI, OSC, Max for Live, EyeHarp, ZigSimPro, and two Moog synthesizers.
This project endeavored to facilitate a collaborative human-machine interaction in which no composer's work was used as training material. Instead, the prompt simply asked the AI for "three harmonious frequencies." The performer continuously retriggered the prompt, creating a chord progression.
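For readers curious about the plumbing, a minimal sketch of that prompt-to-chord loop is below. It assumes the legacy OpenAI Python completions API and the mido MIDI library; the exact prompt wording, number parsing, and direct-to-MIDI routing are illustrative assumptions on my part, since the performance system itself passed data through OSC and Max for Live on its way to the synthesizers.

```python
# Sketch only: ask text-davinci-003 for "three harmonious frequencies,"
# convert them to MIDI notes, and play them as a chord. Prompt wording,
# parsing, and port routing are assumptions, not the production patch.
import math
import re
import time

import mido
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder


def ask_for_frequencies() -> list[float]:
    """Ask text-davinci-003 for three harmonious frequencies in Hz."""
    response = openai.Completion.create(
        model="text-davinci-003",
        prompt="Give me three harmonious frequencies in Hz, separated by commas.",
        max_tokens=24,
        temperature=0.9,
    )
    text = response.choices[0].text
    # Pull the first three numbers out of the model's free-form reply.
    return [float(f) for f in re.findall(r"\d+(?:\.\d+)?", text)][:3]


def hz_to_midi(freq: float) -> int:
    """Convert a frequency in Hz to the nearest MIDI note number (clamped)."""
    note = round(69 + 12 * math.log2(freq / 440.0))
    return max(0, min(127, note))


def play_chord(port, freqs: list[float], seconds: float = 2.0) -> None:
    """Send the chord as simultaneous note-on messages, then release it."""
    notes = [hz_to_midi(f) for f in freqs if f > 0]
    for n in notes:
        port.send(mido.Message("note_on", note=n, velocity=100))
    time.sleep(seconds)
    for n in notes:
        port.send(mido.Message("note_off", note=n))


with mido.open_output() as port:   # default MIDI output port
    for _ in range(4):             # each retrigger adds a chord to the progression
        play_chord(port, ask_for_frequencies())
```

Each retrigger of this loop yields a new three-note sonority, which is how the chord progression emerges in performance.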
In Harmony: An Interactive Media System
In Harmony was an interactive media installation housed inside a geodesic dome. In the middle of the dome stood a table with a clear surface and a webcam at its base. Museum visitors moved "game pieces" across the table to trigger sounds in a 2nd-order ambisonic sound system (nine speakers) and to change the lighting behavior of hundreds of LEDs. They could control the sound's spatialization, filters, and faders, as well as trigger hidden sounds.
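As a rough illustration of that control flow (not the installation's actual codebase, which is Baltu's stand-alone application), a tracked game piece's table position could be mapped to sound and lighting parameters over OSC roughly as follows; the OSC addresses, scaling, and endpoint here are assumptions made for the sketch.

```python
# Sketch only: map a tracked piece's normalized table position (0..1)
# to spatialization, filter, and LED parameters via OSC messages.
import math

from pythonosc.udp_client import SimpleUDPClient

# Hypothetical OSC endpoint for the audio/lighting engine.
client = SimpleUDPClient("127.0.0.1", 9000)


def piece_moved(piece_id: int, x: float, y: float) -> None:
    """Translate one piece's position into control messages."""
    # Angle around the table's center drives the ambisonic azimuth (degrees).
    azimuth = math.degrees(math.atan2(y - 0.5, x - 0.5))
    # Distance from the center drives a filter cutoff (illustrative scaling).
    distance = math.hypot(x - 0.5, y - 0.5)
    cutoff_hz = 200.0 + distance * 8000.0

    client.send_message(f"/piece/{piece_id}/azimuth", azimuth)
    client.send_message(f"/piece/{piece_id}/cutoff", cutoff_hz)
    # The same gesture can steer LED behavior.
    client.send_message(f"/leds/{piece_id}/intensity", 1.0 - distance)


# Example: piece 3 tracked near the table's upper-right quadrant.
piece_moved(3, 0.8, 0.2)
```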
For this installation, I was project lead, programmer, installation co-designer, technical writer, composer, and sound designer. As project lead, I was responsible for keeping the project on time and within budget, and I coordinated the programming, creative, and fabrication teams.
Together with my colleagues Gino Ceresia and Paul Diaz, I coded the installation's stand-alone A/V software application. The app is owned by Baltu Technologies, where I was employed at the time. That deliverable was licensed to the i.d.e.a. Museum of Mesa, Arizona, which commissioned Baltu to create the exhibit.
In Harmony was open to the public from June 2018 through November 2019. Though we had a debugging and repair clause in our contract, and though our team checked in on the installation through site visits, no repairs were ever needed: there was not a single technical problem during its entire run.
Virtual Reality, 360 Videos, & Spatial Audio


I have worked on multiple virtual reality software applications built with the Unity game engine. On these projects I wrote C# code, performed audio post-production, storyboarded game designs, created 3D spatial sound design, and composed music. Above are two clips from such endeavors.
The first video is a trailer for a VR game made for the Phoenix Natural Discovery Center. For that game, I did sound design, composed music, and wrote C# code. I also storyboarded and directed the trailer.
The second clip is a 360 video of an in-studio string quartet performance, for which I did the audio post-production. It uses a 1st-order ambisonic format, so viewers with VR headsets can look around the environment and hear the spatialization change as they do. If you are viewing on a computer, you can click the navigation compass in the top left to "look around" and hear the sound change.
Accessible Music Technology
I conduct research in the field of accessible music technology. In 2022, I was Artist in Residence with the Phonos Foundation in Barcelona, where I worked in collaboration with the EyeHarp Association and performer Joel Bueno to compose my piece Circles & Circuits. Alongside Gil Dori, EyeHarp's Creative Director, I programmed and patched a system enabling Joel to control RSF Kobol Expander I & II and Moog Werkstatt-01 analog synthesizers through EyeHarp's gaze-based interface.
Currently, I am developing accessible music software using CMake, C, C++, and JUCE. The app targets the Windows operating system, and its development plan outlines stretch goals that include porting it to macOS and making it available as a DAW plugin.
Biography
Justin Leo Kennedy is a software developer, A/V system designer, and researcher. Most recently, he was an Assistant Professor in Western Carolina University's Commercial Music & Audio Production program, and he served as director of that program from 2021 to 2023.
Justin has designed several multimodal A/V systems, including a face- and eye-controlled AI music system and an interactive media exhibit for the i.d.e.a. Museum of Mesa, Arizona. During his doctoral studies, he co-designed a portable A/V system capable of projecting live digital illustration while sending live audio through a PA. The 15 × 11.25 screen and its accompanying PA were made compact enough to fit into a 1991 Toyota Camry.

While working as a professor, Dr. Kennedy taught audio production. In service of university needs during the pandemic, he also expanded the capabilities of the school's Extron classroom technology, and he later implemented the School of Music's livestreaming system.
Justin's research and accompanying musical compositions have been presented on multiple nationally and internationally peer-reviewed platforms, such as GameSoundCon (Los Angeles), MUSLAB (Mexico City), BBC Radio 3 (UK), and Fox 10 News (Phoenix). His work has earned him selection as an Alternate for the Fulbright Fellowship, a Title VIII Fellowship, and Ventura's Endowment for the Arts. He holds BM, MM, and DMA degrees in music composition, with a heavy focus on computer music and live sound reinforcement.
Aside from academic employment, Justin also worked full-time in the commercial market as Creative Director at Baltu Technologies. In addition to his directorship duties, he co-developed software, authored technical documents, composed music, and did sound design for installations and VR experiences.
Believing technology should be for everyone, Justin helps maintain a list of affordable software for producers. His current software development focuses on leveraging XR tools to enhance accessible music technology.