Halves are done.
For the past week, not too much happened. We spent most of the time preparing the Halves presentation. In the presentation, we first introduced the device and its limitations to the faculty, then walked through all 11 prototypes we have made. After that, we presented our playtesting results and justified our choice for the second half of the semester: we will go for an audio-based game. I will explain that choice a little below.
This picture is probably not the best group portrait ever, but it is the only picture of our team we could find.
To begin with, I would like to share some playtesting results that are quite interesting. For each prototype, we designed a corresponding questionnaire. Given that our mid-term research aimed at exploring the breadth of Tap's performance across different application scenarios, our questionnaires also focused on different areas. It is difficult to report all the playtest feedback in a short space, but from this feedback and our observations of playtesters, we summarized some findings that help drive our project forward. From our observations, we made a few interesting discoveries. The index finger is the most used finger, as we expected, which means that when we design interactions we should weigh its use most heavily. The multiplayer exertion game turned out to be the one playtesters got most immersed in, which makes Tap very promising as a device for recreational party games. A music application is what we thought of first, and playtesters also found it promising and enjoyable.
Besides, we also drew some important conclusions from the feedback. Intuitiveness was a feature almost all playtesters valued, which requires our team to be more careful with interaction choices. Most of our prototypes failed to provide clear, accessible feedback after a tap, which is a serious problem since Tap also suffers from noticeable latency. Many playtesters recommended that we implement more natural interactions and finger combinations, which we all think is a good idea. The playtest findings gave us user requirements, along with some of the behavioral data we need, and they will guide our roadmap going forward.
Based on all of our demos, we found some directions especially promising: collaborative education, music or performance tutorials, multiplayer party games, and fully audio-based games.
For collaborative education, we think we could pair Tap with large touchscreens like Promethean boards for teaching in primary schools or kindergartens; the improvement is that the teacher can now know who is touching the board. One problem is that the cost of Tap is still too high for this purpose alone. It is unlikely that a school would buy Tap only for this. Party games face the same problem: it is unlikely that everyone would own a Tap device, and it is also unlikely that one person would buy several of them. Between music or performance tutorials and an audio-based game, for now we have decided to go in the direction of the audio-based game. Of course, we are still open to input over the next two weeks.
Next week is CMU’s spring break (the blog will not be updated that week), during which SXSW will be happening, and the week after is GDC week. Since most of the team members will go to GDC or SXSW, we will not have core hours for the next two weeks, but we will all work remotely if necessary. In the meantime, we will all do some research on audio-based games, and we will sync up after the break.
The Tap R&D team has also released an API for the Windows platform, although we may not develop a Windows version of the game. We also made a small improvement to our Tap API wrapper so that it aligns better with Unity coding conventions: users can now call TapInput.GetTap(Finger.xxx) to get tap input.
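As a rough illustration of what such a Unity-style wrapper could look like, here is a minimal sketch. The `Finger` enum values, the bitmask encoding, and the `OnTapped`/`LateUpdate` hooks are all assumptions for illustration, not the actual Tap SDK API:

```csharp
// Hypothetical sketch of a Unity-style wrapper around the Tap SDK.
// The Finger values, bitmask layout, and SDK callback are assumptions.
public enum Finger { Thumb = 0, Index = 1, Middle = 2, Ring = 3, Pinky = 4 }

public static class TapInput
{
    // Bitmask of fingers tapped this frame (bit 0 = Thumb, ..., bit 4 = Pinky).
    static int currentTapMask;

    // Intended to be wired to the Tap SDK's tap callback.
    public static void OnTapped(int tapMask)
    {
        currentTapMask |= tapMask;
    }

    // Mirrors Unity's Input.GetKey style: true if this finger tapped this frame.
    public static bool GetTap(Finger finger)
    {
        return (currentTapMask & (1 << (int)finger)) != 0;
    }

    // Call once at the end of each frame to clear stored taps.
    public static void LateUpdate()
    {
        currentTapMask = 0;
    }
}
```

In game code this reads like familiar Unity input polling, e.g. `if (TapInput.GetTap(Finger.Index)) { /* react to an index-finger tap */ }`, which is the consistency with Unity conventions the change was after.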