UX Exploration – Animated Hand
An area our team would like to explore is the use of 3D-modeled hands, rendered in AR, to demonstrate piano concepts to the guest: proper hand positioning, posture, correct note selection, fingering, intervals, articulation, and so on.
The first technical question is how to accurately capture and generate the hand motion so that it has optimal learning value.
We began by experimenting with hand motion capture using an Oculus Rift with an attached Leap Motion device. This approach ran into two problems. First, the Leap Motion had difficulty identifying finger positions whenever the fingers interacted with a surface, so the data stream was interrupted each time an individual finger struck the desk or a simulated piano key. Second, the combination of a low frame rate (due to the limited capacity of our development workstations) and attempting to play piano fingerings with no surface at all caused serious disorientation and motion sickness.
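To make the dropout problem concrete, here is a minimal Python sketch, using entirely hypothetical sample data rather than the actual Leap Motion SDK, that scans a stream of fingertip samples and flags the gaps where tracking was lost (in our sessions these gaps tended to coincide with a finger touching the surface):

```python
def find_tracking_gaps(samples):
    """Return (start, end) index ranges where fingertip tracking dropped out.

    `samples` is a list of fingertip heights (e.g. y in mm), with None
    wherever the tracker lost the finger. The data here is a made-up
    stand-in for frames from a Leap Motion capture stream.
    """
    gaps = []
    start = None
    for i, s in enumerate(samples):
        if s is None and start is None:
            start = i                    # a dropout begins
        elif s is not None and start is not None:
            gaps.append((start, i - 1))  # the dropout ends
            start = None
    if start is not None:                # stream ended mid-dropout
        gaps.append((start, len(samples) - 1))
    return gaps

# Low fingertip heights (finger near the surface) coincide with lost frames:
stream = [120, 80, 40, None, None, 35, 90, 130, None, 60]
print(find_tracking_gaps(stream))  # [(3, 4), (8, 8)]
```

Counting and timing these gaps gives us a rough quality metric for comparing capture setups.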
Our next approach was to try a Microsoft Kinect, another commercially available motion-tracking device. However, while the Kinect works well in the X and Y planes, the Z-axis is not necessarily its strong suit.
We reached out to a group at CMU that has more advanced motion capture equipment, but specialized hand capture was not part of their equipment set.
Some additional internal brainstorming sessions led us to a new concept, which we will explore in the week(s) ahead. It returns to the Oculus / Leap Motion setup, this time employing a simulation of physical keys, but with a far smaller footprint so as to minimize the loss of data from the Leap Motion. Our plan is to use long construction nails (or similar) affixed to a solid surface, spaced to match the spacing of piano keys. We will then gently and slowly "play" a sequence on these smaller surfaces to capture and test.
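As a back-of-envelope check for the nail layout, a small Python sketch; the ~23.5 mm white-key width is the commonly cited figure for a full-size keyboard, an assumption we would verify against an actual instrument before drilling:

```python
WHITE_KEY_WIDTH_MM = 23.5  # typical full-size white-key width (assumption)

def nail_positions(num_keys, key_width_mm=WHITE_KEY_WIDTH_MM):
    """Center-to-center x offsets (mm) for nails standing in for white keys."""
    return [round(i * key_width_mm, 1) for i in range(num_keys)]

# One octave of white keys (C through B):
print(nail_positions(7))  # [0.0, 23.5, 47.0, 70.5, 94.0, 117.5, 141.0]
```

Matching the real key pitch matters here: the whole point of the rig is that the captured fingerings transfer directly to an actual keyboard.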