Super-Hack-Lider

This week we made some exciting strides forward. First, we got into the swing of our prototyping schedule. On Monday we were able to test four separate interactions that were each mapped to the same shared synth. We found that our testers preferred the most responsive, movement-based interaction we created: a guest could play sound by simply waving the Vive controllers, and the notes played were randomly generated within the confines of a scale. Small orbs trailed the controllers, providing some subtle feedback. Another interaction that people seemed to like, at least on a visual level, was the use of “flocking” shapes that feel like birds or fish following your arm movements. These tests prompted us to ask a few new questions about our project. How do we want to balance free-form body movement against object-based inputs like the noise balls from our very first demo?
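
For anyone curious what “randomly generated within the confines of a scale” looks like in practice, here is a minimal sketch. It is Python for readability rather than a copy of our engine code, and the scale, note range, and wave-speed threshold below are all stand-ins we invented for illustration:

```python
import random

# C major expressed as semitone offsets from the root; any scale works this way.
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def random_note_in_scale(scale=C_MAJOR, root=60, octaves=2):
    """Pick a random MIDI note confined to the given scale.

    root=60 is middle C; octaves controls the playable range.
    """
    octave = random.randrange(octaves)
    degree = random.choice(scale)
    return root + 12 * octave + degree

# Hypothetical trigger: fire a note whenever the controller is moving
# fast enough to count as a "wave" (the threshold is made up, tuned by feel).
WAVE_SPEED_THRESHOLD = 1.0  # meters per second

def on_controller_update(speed):
    """Return a note to play, or None if the controller is basically still."""
    if speed > WAVE_SPEED_THRESHOLD:
        return random_note_in_scale()
    return None
```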

On Tuesday the sound programming team cracked the problem they had been having with SuperCollider. After five days of vetting other sound engines, it had become clear that none of them would offer the range or richness of tone that SuperCollider could. The team solved some of the core issues, but determined that our biggest problem had been the sound card on our VR computer, the machine our ambisonic speaker system plugs into. We feel lucky to have our favorite sound engine back on board. Yay SuperCollider.
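
As context for readers who haven’t used it: the standard way to drive SuperCollider from another program is OSC messages sent to its server, scsynth, which listens on UDP port 57110 by default. Below is a minimal Python sketch using the python-osc package; the “wave” SynthDef name and its freq/amp controls are placeholders we invented, not our actual patch:

```python
from pythonosc.udp_client import SimpleUDPClient

# scsynth (the SuperCollider server) listens on UDP port 57110 by default.
client = SimpleUDPClient("127.0.0.1", 57110)

def play_note(freq, amp=0.3):
    """Spawn one synth node on the server.

    Assumes a SynthDef named "wave" has already been loaded; the name
    and its freq/amp controls are hypothetical placeholders.
    """
    # /s_new arguments: defName, nodeID (-1 = let the server assign one),
    # addAction (0 = add to head), target group (0 = root), then control pairs.
    client.send_message("/s_new", ["wave", -1, 0, 0, "freq", freq, "amp", amp])

play_note(261.63)  # middle C
```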

On Wednesday we were lucky enough to be invited to visit Music Everywhere, a past ETC project that went on to become a startup and just won the HoloLens challenge. We visited them at their AlphaLab office in Pittsburgh, where they talked about their process and what their project had looked like at quarters and halves. They recommended helpful faculty to connect with outside of the ETC, we traded information about audio engines, and we got to see what the project looks like now. It was helpful to receive advice from a group that had just been through the same set of challenges, and we hope to have them review and critique our work as the semester progresses.

As the week wrapped up we fit in one more playtest, this one focused on platforms. Jesse Schell had lent us his handcrafted Vive Tracker gloves, which use conductive tape on the fingers and thumb as inputs, basically buttons. We played with these as an option and demoed them next to the Leap Motion, Vive Trackers, and Vive controllers. We wanted to see if we could isolate what felt important in an interaction as simple as note creation. People of course loved the Leap Motion, and the Vive controllers were a close second because of their familiarity. We realized the pinching motions made possible by the gloves were a little abstract; you would need instructions before you could learn the interactions. These discoveries prompted us to settle on the Vive controllers for the time being: a familiar, mostly friendly platform that will allow us to focus on developing our interactions.
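
Since the tape contacts behave like momentary buttons, the software side of “pinch to create a note” mostly reduces to edge detection with a little debouncing. Here is a rough sketch of that idea; the read function and the debounce window are placeholders, since we never looked inside the gloves’ own code:

```python
import time

DEBOUNCE_SECONDS = 0.03  # ignore contact chatter shorter than ~30 ms (invented value)

class PinchButton:
    """Treat a finger-to-thumb tape contact as a momentary button.

    read_contact is a stand-in for however the glove hardware actually
    reports the circuit closing; that layer is hypothetical here.
    """
    def __init__(self, read_contact):
        self.read_contact = read_contact
        self.state = False
        self.last_change = 0.0

    def pressed(self):
        """Return True once per new pinch (rising edge, debounced)."""
        now = time.monotonic()
        raw = self.read_contact()
        if raw != self.state and (now - self.last_change) > DEBOUNCE_SECONDS:
            self.state = raw
            self.last_change = now
            return raw  # True only on the press edge, never while held
        return False
```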

Looking forward, we are interested in exploring rhythm in more depth. The team wants to see if we can track movement patterns and isolate motions that are repeated. Repeated motions could be translated into the tempo of the song, or they could lay down a beat. This is an idea that we hope to incorporate into our next playtest, so it became homework for the weekend. Concepts that deal with rhythm will likely be the target of our playtests in week 7.
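
One well-known way to isolate repeated motions is autocorrelation on a speed signal: a motion you repeat shows up as a peak at the lag matching its period, and that period converts directly to a tempo. Here is a rough sketch of the approach; the sample rate, BPM bounds, and periodicity cutoff are all values we made up for illustration:

```python
import numpy as np

def estimate_tempo(speeds, sample_rate=90.0, min_bpm=40, max_bpm=200):
    """Guess a tempo from a buffer of recent hand-speed samples.

    speeds: controller speed magnitudes, one per tracking frame.
    sample_rate: tracking frames per second (90 Hz is typical for the Vive).
    Returns an estimated BPM, or None if no clear repetition is found.
    """
    x = np.asarray(speeds, dtype=float)
    x = x - x.mean()
    # Autocorrelation: a repeated motion produces a peak at its period.
    corr = np.correlate(x, x, mode="full")[len(x) - 1:]
    lo = int(sample_rate * 60.0 / max_bpm)  # shortest lag we accept
    hi = int(sample_rate * 60.0 / min_bpm)  # longest lag we accept
    hi = min(hi, len(corr) - 1)
    if lo >= hi or corr[0] <= 0:
        return None
    lag = lo + int(np.argmax(corr[lo:hi]))
    if corr[lag] < 0.5 * corr[0]:  # arbitrary "is it really periodic?" cutoff
        return None
    return 60.0 * sample_rate / lag
```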