Prototype 1a – 2D Geometry Demo


To create an interactive model of AR marker vertices that demonstrates properties of 2D shapes.


Originally Microsoft HoloLens; shifted to Oculus Rift + ZED Mini + Leap Motion

Early Design

Development Process

I. HoloLens

We started building this demo on the Microsoft HoloLens. HoloLens provides three types of interaction: Tap, Bloom, and Gaze. Bloom is reserved for system use, however, so we built our interactions using Tap and Gaze.

The first prototype features a triangle with three movable vertices. To move a vertex, the guest first taps it to enable movement, gazes to drag it, and finally taps the re-positioned vertex to lock it back in place. We wanted to see if this would feel fluid in action, but gesture-recognition issues made the interactions rough and cumbersome. This eventually led to the platform shift discussed in the Introduction.
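The tap-gaze-tap flow above amounts to a two-state toggle per vertex. A minimal sketch of that state machine, in Python for illustration (the actual prototype was built in Unity; the `Vertex` class and event names here are hypothetical):

```python
# Hypothetical sketch of the tap-to-unlock / tap-to-lock vertex interaction,
# with the gaze cursor driving the vertex position while unlocked.

LOCKED, MOVING = "locked", "moving"

class Vertex:
    def __init__(self, position):
        self.position = position
        self.state = LOCKED

    def on_tap(self):
        # First tap unlocks the vertex; a second tap locks it in place.
        self.state = MOVING if self.state == LOCKED else LOCKED

    def on_gaze(self, gaze_point):
        # While unlocked, the vertex follows the gaze cursor.
        if self.state == MOVING:
            self.position = gaze_point

v = Vertex((0.0, 0.0))
v.on_tap()             # tap: enable movement
v.on_gaze((1.0, 2.0))  # gaze: drag the vertex
v.on_tap()             # tap again: lock back in place
v.on_gaze((9.0, 9.0))  # ignored while locked
print(v.position)      # (1.0, 2.0)
```

Even in sketch form this shows why the interaction felt cumbersome: every drag costs three discrete gesture recognitions, and a single missed tap leaves the vertex in the wrong state.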

When we tried to instantiate polygons with more than three sides, we found that spawning vertices from individual markers resulted in vertices that did not share a common plane. To get around this, each vertex spawned after the third was anchored to the plane determined by the first three vertices.
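Anchoring a later vertex to the plane of the first three is a point-onto-plane projection. A self-contained sketch of the math (plain Python here for illustration; in Unity this would use `Vector3` operations):

```python
import math

def sub(u, v):
    return (u[0] - v[0], u[1] - v[1], u[2] - v[2])

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def project_onto_plane(p, a, b, c):
    """Project point p onto the plane through vertices a, b, c."""
    n = cross(sub(b, a), sub(c, a))          # plane normal
    mag = math.sqrt(dot(n, n))
    n = (n[0] / mag, n[1] / mag, n[2] / mag)
    d = dot(sub(p, a), n)                    # signed distance to plane
    return (p[0] - d * n[0], p[1] - d * n[1], p[2] - d * n[2])

a, b, c = (0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)
p = (0.5, 0.5, 0.7)   # fourth marker detected slightly off-plane
print(project_onto_plane(p, a, b, c))  # (0.5, 0.5, 0.0)
```

Each vertex spawned after the third is snapped this way, so tracking noise in the marker's depth estimate cannot warp the polygon out of 2D.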

II. Pass-through VR

Using the Leap Motion for interactions meant that the vertices could be dragged and dropped anywhere in space using natural hand movements.

At this point, we investigated image tracking on the new platform. Directing effort to image-tracking R&D reduced the scope of this prototype to a few basic features, independent of the interactive marker feature.

A primary need is to display angle measurements as the user moves the vertices. The Leap Motion SDK has a responsive flip panel that appears when the user flips their palm. We used that panel to display the interior angle measures and their constant sum.
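The values shown on the panel are the polygon's interior angles and their sum, which stays fixed at (n − 2) × 180° however the vertices are dragged. A sketch of that computation for a convex polygon (plain Python for illustration; the prototype computed this in Unity):

```python
import math

def interior_angles(vertices):
    """Interior angles, in degrees, of a convex polygon given in order."""
    n = len(vertices)
    angles = []
    for i in range(n):
        cx, cy = vertices[i]
        px, py = vertices[i - 1]          # previous vertex (wraps around)
        nx, ny = vertices[(i + 1) % n]    # next vertex
        v1 = (px - cx, py - cy)
        v2 = (nx - cx, ny - cy)
        cosang = (v1[0] * v2[0] + v1[1] * v2[1]) / (
            math.hypot(*v1) * math.hypot(*v2))
        # Clamp to guard against floating-point drift outside [-1, 1].
        angles.append(math.degrees(math.acos(max(-1.0, min(1.0, cosang)))))
    return angles

tri = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]   # 3-4-5 right triangle
print([round(a, 1) for a in interior_angles(tri)])  # [90.0, 36.9, 53.1]
print(round(sum(interior_angles(tri)), 1))          # 180.0
```

Recomputing this each frame and watching the sum stay constant as individual angles change is exactly the invariant the demo is meant to make tangible.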


The palm-flip interaction itself is engaging, but in its current state it requires users to hold a hand up for an extended period. It also forces users to interact one-handed for as long as they want the panel displayed.

A panel like this may instead be best suited for:

  1. A quick check on data, akin to checking a wristwatch for time
  2. An inventory, from which you can pull out objects to use

It would also be preferable to have the angle measures overlaid directly on the polygon for a more direct correlation.

One positive implication of this demo concerns the role of embodied cognition in math education. Previous research has suggested that physical embodiment of mathematical concepts can be a powerful tool for developing mental models. Experiences in this vein might show how AR can further that research.