This week began with an exploration into rhythm. We had hoped to use guests' movement to set a tempo for the world's music, but most of our concepts did not pan out. Our initial idea was to use the Fast Fourier Transform (FFT) to analyze peaks in guests' body movement and convert those into rhythm. What we discovered was that FFT works well at recognizing long, sustained patterns of motion: swirling your hand in a figure-eight is a perfect match. The quick, sharp, percussive movements that we think of as analogous to rhythm were not such a good fit. We are still interested in using Fourier transforms to supplement future interactions, but they seem to be a poor match for rhythm.
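To illustrate why, here is a minimal pure-Python sketch (a naive DFT on synthetic signals; a real implementation would use an FFT library and actual tracked motion data): a sustained swirl concentrates its energy in one frequency bin, while sparse percussive spikes smear energy across the whole spectrum, leaving no single peak to read a tempo from. The signals, sample rate, and "peak-to-mean ratio" measure are all illustrative assumptions, not our actual pipeline.

```python
import cmath
import math

def dft_magnitudes(signal):
    """Naive DFT; returns the magnitude of each frequency bin
    up to the Nyquist bin (O(n^2), fine for a tiny demo)."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(n // 2)]

# Sustained, periodic motion (e.g. a figure-eight swirl at 2 Hz,
# sampled at 30 Hz for 2 seconds): energy piles into one bin.
rate, secs = 30, 2
swirl = [math.sin(2 * math.pi * 2 * t / rate) for t in range(rate * secs)]

# Sharp percussive "hits": a few isolated spikes; energy spreads
# across many bins instead of forming one dominant peak.
hits = [1.0 if t % 20 == 0 else 0.0 for t in range(rate * secs)]

def peak_ratio(mags):
    """Peak-to-mean ratio over the non-DC bins: high when energy is
    concentrated in one frequency, low when it is smeared out."""
    body = mags[1:]  # skip the DC (zero-frequency) bin
    return max(body) / (sum(body) / len(body))

print(peak_ratio(dft_magnitudes(swirl)))  # large: one dominant bin
print(peak_ratio(dft_magnitudes(hits)))   # small: energy smeared
```

This is the mismatch in miniature: a classifier thresholding on something like `peak_ratio` can easily spot the figure-eight, but the spike train gives it nothing distinctive to latch onto.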
Another route we examined was mapping simple striking motions to percussive sounds. We tried detecting "hits" with both the hand-held controllers and trackers strapped to the guests' feet. We found that the striking motions were rarely satisfying. Without a surface (or drum) to strike, the responding beats felt like they were being played at almost arbitrary times. Further, the beat detection on the feet, and sometimes the hands, had trouble differentiating the reverberation of a hit from a new hit. While there are solutions to these problems, they were not ready for our scheduled playtest.
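One common fix for the reverberation problem is a refractory (cooldown) window after each detected hit, so the ringing of a strike cannot retrigger the detector. A minimal sketch, assuming a stream of acceleration magnitudes and hypothetical `threshold`/`refractory` values (tuning these is exactly the part that was not playtest-ready):

```python
def detect_hits(accel, threshold=2.0, refractory=10):
    """Detect hits as threshold crossings in acceleration magnitude,
    ignoring samples inside a refractory window after each hit so a
    strike's rebound/ringing is not counted as a second hit."""
    hits, cooldown = [], 0
    for i, a in enumerate(accel):
        if cooldown > 0:
            cooldown -= 1
            continue
        if a >= threshold:
            hits.append(i)
            cooldown = refractory
    return hits

# A strike followed by ringing (indices 2-3 are still above the
# threshold); without the refractory window they would fire extra hits.
signal = [0.1, 3.5, 2.8, 2.2, 0.4, 0.2, 0.1, 0.0, 0.1, 0.0,
          0.1, 0.2, 4.1, 1.9, 0.3]
print(detect_hits(signal))  # → [1, 12]
```

The trade-off is latency versus false positives: a longer refractory window swallows more ringing but also swallows genuine fast double-hits, which is part of why this felt unsatisfying for drumming in mid-air.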
When Wednesday came, we had a first-rate playtesting opportunity available to us. Shirley Saldamarco was bringing in a group of students from main campus for one of her classes, a great chance to get fresh input. But we didn't have much exciting new material to playtest. After considering the primary questions we were still struggling with, we decided we wanted to understand repetition of motion and freedom of movement in VR space. So, we sneakily crafted a playtest in which we placed guests in VR with controllers and some pretty hand-trail effects. Then our sound designers would pick up a MIDI controller and begin twisting knobs, feeling out the performance of the playtester with their "resonance filter" synths.
We were able to collect both numerical and observational data that gave us some pretty helpful insights. We took video recordings of each session, and we discovered that people were okay with the abstractness of this playtest. They never noticed the man (or men) behind the curtain. They never questioned whether or not they were crafting the music themselves. In fact, they felt this world was more responsive than the older worlds we showed them that actually did give them musical control.
Another insight was how freely people moved, gesturally, in the space. The hand-trails alone seem to encourage people to be graceful. The data behind these movements was tracked and collected, and Figo was interested in reviewing it as it related to his Fourier Transform project. He noticed that people frequently did try making the patterned movements we would need in order to use FFT as an input. He was also able to get some specific information on the kinds of thresholds we might want to set and on when feedback would optimally need to occur.
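As a rough illustration of turning that recorded data into a threshold, here is one hypothetical approach (not necessarily what Figo did): score each sliding window of recorded hand motion with some periodicity measure, then pick the cutoff that the most clearly patterned windows exceed. The percentile rule and the sample scores below are invented for illustration.

```python
def suggest_threshold(window_scores, percentile=0.75):
    """Pick a detection threshold from playtest data: the score at the
    given percentile, so only the most clearly periodic windows of
    motion would trigger the FFT-driven interaction.
    (Hypothetical rule; the real cutoff is a design decision.)"""
    ordered = sorted(window_scores)
    idx = int(percentile * (len(ordered) - 1))
    return ordered[idx]

# Made-up periodicity scores for sliding windows of one session's
# hand-movement data (higher = more clearly patterned motion).
scores = [1.2, 1.5, 2.0, 2.2, 3.1, 4.8, 6.0, 9.5]
print(suggest_threshold(scores))  # → 4.8
```

Setting the threshold from observed sessions rather than by hand means the interaction fires for motion guests actually produce, instead of motion we imagined they would.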
After the playtest we fleshed out a final round of interactions to prototype before settling down to create our final halves demo. This time, each member of our group picked an exercise they were interested in exploring. It was a programming-heavy weekend, as every member of the team, artists included, worked to bring our ideas to life in Unity.