Drawn and Quartered

A lot of energy went into preparing a demo for Quarters. The sound programmers were hard at work Monday trying to make sure that SuperCollider could hook into Unity. Tuesday turned into a technical trial as it became clear that we still could not play more than two sound sources without encountering major distortion. This meant that what we were able to present to faculty was visually appealing, but the interactions felt empty and dull without our audio.
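For anyone curious what "hooking SuperCollider into Unity" involves: SuperCollider's audio server, scsynth, is controlled by sending it OSC messages over UDP, so the bridge amounts to encoding those messages from C#. Here is a rough sketch of one way that can look (the class name and hand-rolled encoder are illustrative, not our actual project code):

```csharp
using System;
using System.Collections.Generic;
using System.Net.Sockets;
using System.Text;

// Illustrative OSC sender for talking to scsynth, SuperCollider's audio
// server, which listens for OSC messages over UDP (default port 57110).
public class ScsynthClient
{
    readonly UdpClient udp;

    public ScsynthClient(string host = "127.0.0.1", int port = 57110)
    {
        udp = new UdpClient(host, port);
    }

    // "/s_new" asks the server to start a synth node:
    // synthdef name, node ID, add action (0 = head), target group (0 = root).
    public void StartSynth(string defName, int nodeId)
    {
        byte[] msg = EncodeMessage("/s_new", defName, nodeId, 0, 0);
        udp.Send(msg, msg.Length);
    }

    static byte[] EncodeMessage(string address, params object[] args)
    {
        var data = new List<byte>();
        WritePaddedString(data, address);

        // Type tag string: ',' followed by one tag per argument (i = int32, s = string).
        var tags = new StringBuilder(",");
        foreach (object a in args) tags.Append(a is int ? 'i' : 's');
        WritePaddedString(data, tags.ToString());

        foreach (object a in args)
        {
            if (a is int i) data.AddRange(ToBigEndian(i));
            else WritePaddedString(data, (string)a);
        }
        return data.ToArray();
    }

    // OSC strings are null-terminated and padded to a multiple of four bytes.
    static void WritePaddedString(List<byte> data, string s)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(s);
        data.AddRange(bytes);
        int pad = 4 - (bytes.Length % 4);
        for (int k = 0; k < pad; k++) data.Add(0);
    }

    // OSC int32s are big-endian.
    static byte[] ToBigEndian(int value)
    {
        byte[] b = BitConverter.GetBytes(value);
        if (BitConverter.IsLittleEndian) Array.Reverse(b);
        return b;
    }
}
```

A bridge like this also needs "/n_set" messages to update a running synth's controls as guests move, but the snippet shows the basic shape of the Unity-to-scsynth link.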

The feedback we received from the faculty reflected this. While many people were interested in the visual environment we had created and the concept we pitched, the overwhelming response was that our world felt untested and flat. So far, we had created a “musical toy that wasn’t much fun”. We began to recognize that we had become too attached to the world, the assets, and even the technologies we had started working with over the first few weeks.

Ricardo Washington testing out our experience. Pictures courtesy of the ETC Facebook page.

Some of the hardest advice we received was to cut loose the Leap Motion. We were intrigued by the way finger movement played into experiences like “Very Nervous System”, and imagined that it might go a long way toward helping guests feel they had nuanced control over music creation. It had performed well through our various tests leading up to Quarters, but on the day of the presentation it was almost unusable at times. We were encouraged to drop the platform and move back to Vive trackers and controllers. Jesse Schell helpfully offered to let us try a pair of gloves he had crafted that use Vive trackers to register finger-to-finger touches, which would allow us to at least use pinching or finger-tapping interactions. We are hoping to test these gloves in the next week, but in the meantime we are back to using trackers only.
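If the gloves pan out, the pinch itself is simple to wire up; a rough Unity sketch (purely illustrative, assuming each fingertip pose shows up as a Transform from a tracker or glove) is just a distance check between two tracked points:

```csharp
using UnityEngine;

// Illustrative pinch trigger: assumes two fingertip poses are exposed
// as Transforms (from trackers, gloves, or anything else tracked).
public class PinchTrigger : MonoBehaviour
{
    public Transform thumbTip;
    public Transform indexTip;
    public float pinchDistance = 0.02f; // metres

    bool pinching;

    void Update()
    {
        bool close = Vector3.Distance(thumbTip.position, indexTip.position) < pinchDistance;
        if (close && !pinching)
        {
            // A real build would trigger a note or grab a sound object here.
            Debug.Log("Pinch started");
        }
        pinching = close;
    }
}
```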

Caleb and Jae trying to catch all the feedback during Quarters.

Another critique we received was that we were not “owning the prototyping process”. We had viewed our project's structure itself as a method of prototyping: build three worlds and iterate on each. But we were not seeking out the interactions that might inspire movement or best connect to concepts of sound creation. We are now pivoting to focus on much more directed prototyping for our next two sprints. We want to explore interactions that encourage people to be playful and performative in VR.

The final hurdle we encountered was the realization that the SuperCollider audio engine might actually be the wrong fit for our project. After four weeks of getting to know the tool, we still could not create even two audio sources reliably in Unity. Post-Quarters, we decided to spend a week testing other audio engine options and verifying that we have the right solution, beginning with ChucK, Pure Data, and Max/MSP. Half of our team has begun investigating these options while the other half is more freely exploring interaction design.

One of the better aspects of our week was our Skype call with Chris Carlson. Chris works in interactive music and interface design. As a pitch team, we had seen a performance of one of his projects back in the fall; we connected with him after the show and he offered to discuss our project. Chris encouraged us to use simple interfaces that everyone understands. Two of his most recognized projects use ‘sliders’ as the primary interface, but he also felt that for us the best musical interface would likely involve full-body movement. His final piece of advice was on how to communicate musical ideas to non-musicians: because many of us are not acquainted with the terminology, it is helpful to be descriptive and use words like “bright” or “crunchy” to explain the character of a sound.

This week, especially with our feedback from Quarters, we had a rude but necessary awakening. We have been laboring over worlds and technology that we will likely need to turn loose. We are now refocusing our energy on the exploratory nature of our project. The hope is to return to our underlying goal: an experience in which guests can create musical soundscapes through body motion.