Team Music in Motion Gets Moving

Our team was still arranging our new project room for the semester when our first visitor dropped by. We were lucky enough to receive an introduction to the Theremin from virtuoso Pamelia Kurstin. She made playing the electronic instrument look natural as she demonstrated her control of pitch with one hand and volume with the other. We were surprised by the Theremin’s versatility: it is so often associated with a “sci-fi” sound, but Pam drew out its musicality with graceful, measured hand movements. She let each of us try playing the instrument and explore its responsiveness.

This experience quickly became the inspiration for our first gold spike and prototype. We wanted to mimic Pam’s controlled, movement-oriented performance by creating a Theremin of our own in VR. In the demo, the Vive controllers replaced the hands of the musician, while their distance from two virtual antennas determined volume and pitch. Surprisingly, the experiment felt unsuccessful: the concept did not translate into VR as well as we had hoped, and the magic of the instrument fell flat. Still, it became clear that what we loved about the Theremin was the nuanced hand gestures and intimate movements that shaped the music.
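For anyone curious about the mapping itself, here is a minimal sketch of the kind of distance-to-sound function the prototype revolved around. The function name, distance range, and frequency values are illustrative assumptions on our part, not a copy of our actual project code.

```python
def theremin_params(pitch_dist, volume_dist,
                    min_freq=220.0, max_freq=1760.0, max_dist=0.6):
    """Map controller-to-antenna distances (meters) to a tone.

    Returns (frequency in Hz, gain in 0..1). Moving the pitch-hand
    controller closer to its antenna raises the pitch, and moving the
    volume-hand controller closer to its antenna lowers the gain,
    roughly how a physical Theremin behaves.
    """
    # Normalize each distance into the playable range [0, 1].
    p = min(max(pitch_dist, 0.0), max_dist) / max_dist
    v = min(max(volume_dist, 0.0), max_dist) / max_dist

    # An exponential pitch curve feels more musical than a linear one.
    freq = max_freq * (min_freq / max_freq) ** p
    gain = v
    return freq, gain
```

In a real build, the two distances would simply be read off the tracked controller transforms every frame and fed into the audio engine.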

This prompted us to reconsider our assumption that our project would be built solely for the Vive. We agreed that it was essential to encourage our guests to use more delicate and refined gestures. David Rokeby’s piece “Very Nervous System” (1982-1991), in which performers manipulated musical playback with controlled, dance-like movement, inspired us to focus on more intimate interactions. The natural next step was to consider Leap Motion as an extension to the Vive.

While we had heard some negative things about Leap Motion, we enjoyed exploring the possibilities it might offer. Although there were some evident issues, such as finger occlusion, the tracking felt magical. The most notable problem we encountered, however, was that when a guest’s hands fell out of view of the Leap Motion sensor, their virtual hands temporarily stopped existing. This complicates any interaction that takes place outside the guest’s “view.” Our team has already begun looking for solutions, including using Vive trackers to interpolate hand position.
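To make the tracker idea concrete, here is a rough sketch of one way such a fallback could work. The wrist-mounted-tracker setup and the data types are assumptions for illustration only; we have not built this yet.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __add__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x + other.x, self.y + other.y, self.z + other.z)

    def __sub__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

def estimate_hand(leap_hand: Optional[Vec3], tracker: Vec3,
                  last_offset: Vec3) -> Tuple[Vec3, Vec3]:
    """Return (hand position, hand-to-tracker offset) for this frame.

    While the Leap Motion sees the hand, trust it and remember the hand's
    offset from a wrist-mounted Vive tracker. When the hand leaves the
    sensor's view, reapply the last known offset to the tracker position
    so the virtual hand never blinks out of existence.
    """
    if leap_hand is not None:
        return leap_hand, leap_hand - tracker
    return tracker + last_offset, last_offset
```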

In our second week, we had our next project visit from master’s student Anna Henson. She is currently in the Computational Design program at CMU and has many years of experience producing installation art pieces. She recommended numerous reference works, from dance artists and action painters to creative developers in the machine learning space. She also encouraged us to get away from our computers: because our project focuses on designing experiences that look tactile, Anna helpfully reminded us to play with textures in a hands-on context. We hope to plan some group activities that let us brainstorm outside of our project room.

In terms of project planning, our team nailed down our core hours, roles, and problem statement during our first week. We spent the second week mapping out our semester, including the project phases, goals, and acceptance criteria. We also took time to make sure we had our technology in order. As mentioned, we added a Leap Motion alongside our Vive and trackers, and we procured a 4.1 surround sound system so that we can begin bringing our VR positional sound to life.

Our final group activity before the end of the week was to brainstorm ideas for our first prototype and sprint. It was not easy to agree on a set of interactions and a collective image of our experience. Ultimately, we determined that we would explore the concept of creation and the act of bringing harmony to life out of a chaos of white noise. From this brainstorm, we selected one key interaction to kick off our sprint and the work for week three.