UX Exploration – Animated Hand Revisited

During playtesting, we tried to identify which types of keyboard overlay indicators most reliably led students to play the correct notes. More on that later…

Playing back the playtest videos, we also noticed that students would often play through the demo using only one or two fingers. Highlighting notes might show which notes to play, but it wasn't teaching how to play them. [We had anticipated this, but the playtests underscored the shortcoming.]

Student playtesting

The team brainstormed possible solutions, looking for the best way to use AR to teach proper finger numbers and hand technique.

A first approach, which will likely return in future overlay displays, was to include finger numbers in the overlay indicators. The initial stage was to design the visual overlay and write an interface for it in the lesson builder. Then, in our notation software, Sibelius, we wrote finger numbers into a test score. An investigation of the resulting MusicXML export showed the finger numbers within the "words" tags, which we then factored into the Score Reader's parsing logic.
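As a rough illustration of that parsing step, here is a minimal sketch in Python. It assumes, as in our test score, that Sibelius exports each finger number as a `<words>` element inside a `<direction>` that immediately precedes the note it annotates; real exports can vary in layout, and the actual Score Reader code is not shown here.

```python
import xml.etree.ElementTree as ET

FINGERS = {"1", "2", "3", "4", "5"}

def extract_fingerings(musicxml_path):
    """Pair each finger-number <words> direction with the note that follows it."""
    tree = ET.parse(musicxml_path)
    fingerings = []
    for measure in tree.iter("measure"):
        pending = None
        for child in measure:
            if child.tag == "direction":
                words = child.find("direction-type/words")
                if words is not None and (words.text or "").strip() in FINGERS:
                    pending = int(words.text.strip())
            elif child.tag == "note" and pending is not None:
                step = child.findtext("pitch/step", default="?")
                octave = child.findtext("pitch/octave", default="?")
                fingerings.append((measure.get("number"), step + octave, pending))
                pending = None
    return fingerings
```

The result is a list of (measure, pitch, finger) tuples, which is all the overlay indicators need to display the correct finger next to each note.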

Finger numbers in overlay, score and XML

Our next brainstorming session sought to leverage earlier experiments using an animated hand to demonstrate proper hand technique. Attempts to capture hand data had so far been unfruitful. Using a Leap Motion controller at the piano keyboard, we tried recording a single hand playing. Unfortunately, the Leap Motion would lose track of the hand whenever a finger came into contact with a solid surface, such as the piano keys.
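For reference, the capture side looked roughly like the sketch below, written against the classic Leap Motion v2 Python bindings (the `Leap` module). Our actual recording code differed, so treat the names here as illustrative.

```python
import Leap  # classic Leap Motion v2 Python bindings

class HandRecorder(Leap.Listener):
    """Accumulate per-frame fingertip positions for later playback."""

    def __init__(self):
        Leap.Listener.__init__(self)
        self.frames = []

    def on_frame(self, controller):
        frame = controller.frame()
        for hand in frame.hands:
            # Five fingertips per hand: (finger type, x, y, z) in millimeters
            tips = [(finger.type, finger.tip_position.x,
                     finger.tip_position.y, finger.tip_position.z)
                    for finger in hand.fingers]
            self.frames.append({"timestamp": frame.timestamp,
                                "is_left": hand.is_left,
                                "tips": tips})

recorder = HandRecorder()
controller = Leap.Controller()
controller.add_listener(recorder)
# ... record the performance, then:
# controller.remove_listener(recorder)
```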

More discussion led to new attempts at simulated playing above the Leap Motion without touching the keys. This proved too inaccurate: the spacing of the keys was difficult to approximate without a visual reference.

What if we could see a virtual piano on a screen while “playing” above the Leap Motion?

This was a good step forward, but still produced highly inaccurate hand recordings.

A quick experiment provided a significant improvement, yet brought a new problem to consider. Instead of displaying a virtual piano keyboard on the screen, we created a virtual reality environment using an Oculus Rift in combination with the Leap Motion setup. The Leap Motion was now mounted on the front of the Oculus, its hand-tracking orientation flipped, and the virtual piano keyboard was placed within the VR scene.
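In the v2 SDK, switching to this head-mounted configuration is essentially a one-line policy change. A sketch, assuming the same Python bindings as above and the v2 `set_policy` call:

```python
import Leap

controller = Leap.Controller()
# Tell the tracking service the sensor is mounted on an HMD facing outward,
# so hand data is interpreted with the flipped orientation.
controller.set_policy(Leap.Controller.POLICY_OPTIMIZE_HMD)
```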

Far more accurate hand recordings resulted. We were surprised it had made such a difference.

The new problem was that being in the VR environment caused serious motion sickness, limiting recording sessions to only a few minutes. Each session was typically followed by two or three hours of discomfort, often fully resolved only by sleep.

Frame placed above Leap Motion device

But somehow, within the VR space, we were able to target specific keys more accurately, although continuing to hover over the Leap Motion (instead of playing on a real surface) would still require very precise performances.

Our programmers continued to experiment and came up with a physical interface to address the hovering problem. By attaching a piece of clear plastic to a wooden frame, we could have the hand recordings tapped out on the plastic: performers got a real surface to strike, while the sensor below could still see the fingers through the transparent material.

We will continue to iterate (especially to get beyond the motion sickness issue), but we’ve established a proof of concept.

To add this content to our lesson builder, a new node was developed to handle the Leap Motion data.
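The node itself is internal to our lesson builder, but conceptually it wraps a recorded stream of hand frames and answers time-based lookups during playback. A hypothetical sketch follows; every name in it (HandDataNode, frame_at, the JSON layout) is illustrative, not the actual interface.

```python
import json

class HandDataNode:
    """Hypothetical lesson-builder node for recorded Leap Motion hand data.

    Loads a recording saved by the capture step, then returns the frame
    nearest a given playback time so the animated hand can be rendered.
    """

    def __init__(self, recording_path):
        with open(recording_path) as fh:
            # Frames as saved by the recorder: {"timestamp": ..., "tips": [...]}
            self.frames = json.load(fh)
        self.start = self.frames[0]["timestamp"]

    def frame_at(self, elapsed_us):
        """Return the recorded frame closest to `elapsed_us` into the recording."""
        target = self.start + elapsed_us
        return min(self.frames, key=lambda f: abs(f["timestamp"] - target))
```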

Lesson builder hand node
