Since our project is developed in Unity, the Tap API can only work on Android and iOS. Most Android and iOS devices are mobile devices whose primary input interface is a touch screen.

The touch screen is one of TAP's biggest competitors because it is so convenient on the mobile devices everyone uses today. It has strengths that TAP cannot match, such as positional accuracy and swipe-and-hold gestures, among others.

Tap is still under development and has many limitations in real use cases. These limitations make it hard to view Tap as a substitute for the touch screen interface, so a better approach is to combine Tap with the touch screen to enable richer use scenarios.

Prototype Purpose

This prototype focuses on using Tap as a complementary device added to the original touch screen interface. The ultimate goal is to reduce the limitations of both Tap and the touch screen through this combination.

Limitations of Tap (compared with the touch screen):

  1. TAP is not always accurate.
  2. TAP has no position information.
  3. TAP can only detect sudden touch-down events (no stationary or moved state).
  4. TAP has a noticeable latency problem (about 180 ms).

Limitation of the touch screen (compared with Tap):

The touch screen does not distinguish fingers (no finger identity).

From these limitations, it can be seen that most of them can be solved by the other interface: the touch screen provides position information on the screen and can report which state a finger is in, while Tap can distinguish which finger is being used, which enables effective use cases. It is like adding a new layer on top of the traditional touch screen, multiplying the ways it can be used.

Design Details

The design of this prototype can be divided into two phases.

Phase 1: Distinctive Position

The first phase combines Tap and the touch screen to provide distinctive positions. In other words, each touch on the screen is assigned a finger-index attribute, and each Tap input is assigned a position attribute. In the illustrative design, each touch on the screen generates a colored circle, with the color corresponding to a certain finger index; the same color appears whenever the same finger touches the screen.
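The Phase 1 idea can be sketched as a fixed mapping from finger index to color, so the same finger always produces the same colored circle. The finger indices and color names below are illustrative assumptions, not the actual demo values:

```python
# Illustrative sketch of Phase 1: each finger index maps to a fixed color,
# so the same finger always produces the same colored circle on screen.
# Indices and colors here are assumptions, not the actual demo values.

FINGER_COLORS = {
    0: "red",     # thumb
    1: "green",   # index finger
    2: "blue",    # middle finger
    3: "yellow",  # ring finger
    4: "purple",  # pinky
}

def circle_for_touch(finger_index, position):
    """Return the (color, position) of the circle to draw for a touch
    that Tap identified as coming from finger_index."""
    return FINGER_COLORS[finger_index], position
```

The same finger index always yields the same color, regardless of where the touch lands on the screen.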

Phase 2: Distinctive Touch State

The second phase extends Tap's use cases with the touch screen's touch-state information, which adds swipe and hold capability to Tap. In the illustrative design, the demo is a simple dye puzzle in which the player must use different fingers on the screen to solve the puzzle.

Together, the two phases demonstrate the feasibility of this approach.

Implementation Details

Since Tap reports which finger is down, combining it with the robust touch functionality of a touch screen should not be difficult. However, given Tap's roughly 180 ms latency, the challenge in combining them is synchronizing the Tap input with the touch input.

To achieve this, we maintain a stack of touch records. As expected, the touch input is detected first; we record every touch input along with its state tag ("Began", "Moved", "Stationary", "Ended"). When a Tap input arrives, we map it to a touch input by walking through the most recent touch inputs within a plausible time window (Tap is not always accurate, so the number of touch inputs may not equal the number of Tap inputs). In this way, the two input events are combined.
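The synchronization above can be sketched as follows: touch events are buffered as they arrive, and a later Tap event is matched to the most recent unmatched "Began" touch within a time window that covers Tap's latency. This is a transposition of the Unity logic into Python; the names and the window size are illustrative assumptions:

```python
# Sketch of the Tap/touch synchronization: buffer touch events, then match
# each Tap event to the latest unmatched "Began" touch within a time window
# that covers Tap's ~180 ms latency. Names and window size are assumptions.

from collections import deque

MATCH_WINDOW = 0.250  # seconds to look back; must exceed Tap's ~180 ms latency

class TouchRecord:
    def __init__(self, touch_id, state, position, time):
        self.touch_id = touch_id      # platform touch id
        self.state = state            # "Began", "Moved", "Stationary", "Ended"
        self.position = position
        self.time = time
        self.finger_index = None      # filled in once a Tap event is matched

touch_buffer = deque(maxlen=64)       # recent touch events, newest last

def on_touch(touch_id, state, position, time):
    """Record every touch event as it arrives (touch is detected first)."""
    touch_buffer.append(TouchRecord(touch_id, state, position, time))

def on_tap(finger_index, tap_time):
    """Map a Tap event to the latest unmatched 'Began' touch in the window.
    Tap can misfire, so there may be no candidate; return None in that case."""
    for record in reversed(touch_buffer):
        if (record.state == "Began"
                and record.finger_index is None
                and tap_time - record.time <= MATCH_WINDOW):
            record.finger_index = finger_index
            return record
    return None
```

Searching newest-first keeps the mapping tolerant of missed or spurious Tap events: an unmatched touch simply stays unlabeled rather than breaking the pairing.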

We packaged this together with the Tap plugin; the link to download the unitypackage is here. Usage of the package can be seen from the example included in it. You just need to override the four functions in "TouchTapTest.cs" to implement the behavior you want.

  • OnTouchTapBegan(TouchTap touchtap)
  • OnTouchTapStay(TouchTap touchtap)
  • OnTouchTapMove(TouchTap touchtap)
  • OnTouchTapEnd(TouchTap touchtap)

TouchTap is a class that encapsulates the combined Touch and Tap input.
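The override pattern above can be sketched in Python as a base handler class with four hooks. The real package exposes these as C# methods in "TouchTapTest.cs"; the fields carried by TouchTap below are assumptions about its contents:

```python
# Sketch of the handler-override pattern, transposed from the C# package.
# The TouchTap fields (finger_index, position, state) are assumptions.

class TouchTap:
    """Combined touch + Tap event: screen position, touch state, finger index."""
    def __init__(self, finger_index, position, state):
        self.finger_index = finger_index
        self.position = position
        self.state = state

class TouchTapHandler:
    # Override any of these four hooks to implement your own behavior.
    def on_touch_tap_began(self, touchtap): pass
    def on_touch_tap_stay(self, touchtap): pass
    def on_touch_tap_move(self, touchtap): pass
    def on_touch_tap_end(self, touchtap): pass

class Logger(TouchTapHandler):
    """Example override: log which finger began and ended each touch."""
    def __init__(self):
        self.events = []
    def on_touch_tap_began(self, touchtap):
        self.events.append(("began", touchtap.finger_index))
    def on_touch_tap_end(self, touchtap):
        self.events.append(("end", touchtap.finger_index))
```

Hooks you do not override fall through to the no-op defaults, mirroring how only the needed functions are overridden in the C# example.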

Lessons Learned


  1. The biggest problem is that TAP only reports the finger combination; when more than one finger taps at a time, it does not report the order of the fingers. Currently we compare only the x-axis values of the touches, assuming the fingers' positions are ordered along the x-axis.
  2. TAP is also sometimes inaccurate, which means the mapping can be asymmetric rather than a one-to-one mapping.
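The x-axis workaround in lesson 1 can be sketched as follows: when Tap reports a chord (a set of fingers with no ordering), finger indices are assigned to the simultaneous touches by sorting the touches left to right, assuming the hand is oriented so lower-numbered fingers sit at smaller x values. The function name and shapes are illustrative:

```python
# Sketch of the x-axis workaround: Tap reports which fingers tapped but not
# in what order, so we pair sorted finger indices with touches sorted by x,
# assuming the hand is oriented left-to-right. Names are illustrative.

def assign_fingers(tapped_fingers, touch_positions):
    """tapped_fingers: set of finger indices Tap reported, e.g. {1, 2}.
    touch_positions: list of (x, y) for the simultaneous touches.
    Returns a list of (finger_index, position) pairs."""
    ordered_fingers = sorted(tapped_fingers)
    ordered_touches = sorted(touch_positions, key=lambda p: p[0])
    # Tap can misfire, so the counts may differ; pair up what we can.
    return list(zip(ordered_fingers, ordered_touches))
```

This assumption breaks if the fingers cross or the hand is rotated, which is exactly why the missing finger order is listed as the biggest open problem.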

Aside from these problems, Touch + Tap works as expected. It does add another layer to the original touch screen input interface and thereby opens up more usage possibilities. However, given the problems above, accuracy and latency still weigh heavily on any use case that demands efficiency.

Future Possibilities

Piano tutorials or other fingering use cases

Painting tools or other efficiency-focused use cases