Week 6: Getting Things Together

The next few weeks for our team are going to involve merging all of our current work. Over the past weeks, we finalized our tech and art designs, and everyone on the team has been working on their piece separately until now.

Making Ako interesting

We decided that Ako would have a personality that sets him apart from a regular automated robotic assistant. This meant he should be able to do things that excite players, and he should have characteristics that make him approachable and easy to interact with; in short, something that makes players want to interact with him.

Therefore, our artist came up with a couple of props and costumes that we could use.

Ako who loves burgers

Ako who is a Magician
A basic Ako model with different kinds of clothing

This gave us plenty of options to choose from and got us thinking about the fun things Ako could do if he took on these roles.

We considered these designs and ended up choosing the Ako who loves burgers. In our world, Ako is a regular teenager who loves to eat but is also a huge Set fanatic who knows the game inside out. He is also learning to perform tricks that he likes to show off now and then.

Finally, we decided to combine the Hippo and his clothing into one model, as that proved more convenient and predictable to rig.

Testing the features of Dialogflow

If you remember last week's project architecture, there was a decision tree component that drives the game's state machine. This decision tree is the integral merging component that brings together the game logic, AR, and speech input and output.

Therefore, we spent some time making careful decisions about how it would be implemented. This was important because the component routes the flow of the game, so it must be robust and dynamic. It should also be easy to modify and update as more elements are added to the project.

Basically, the role of this component can be summarized in the following diagram:

After some research, we concluded that this could be managed by the Dialogflow service.

Dialogflow is driven by something called Intents. Intents are the inputs to Dialogflow; each one is matched against the questions a player might ask.
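
To make this concrete, here is a minimal sketch of how such an Intent might be registered through Dialogflow's Python client library. The intent name "check-set", the training phrases, and the placeholder reply are our own illustrative choices, and the snippet assumes the google-cloud-dialogflow package and a configured Google Cloud project:

```python
from google.cloud import dialogflow

def create_check_set_intent(project_id: str):
    """Register a hypothetical 'check-set' Intent with a few training phrases."""
    intents_client = dialogflow.IntentsClient()
    parent = dialogflow.AgentsClient.agent_path(project_id)

    # Questions a player might ask; Dialogflow generalizes beyond these samples.
    phrases = ["Is this a Set?", "Can you check if this is a Set?", "Is this correct?"]
    training_phrases = [
        dialogflow.Intent.TrainingPhrase(
            parts=[dialogflow.Intent.TrainingPhrase.Part(text=p)]
        )
        for p in phrases
    ]

    # A placeholder static reply; in the project, the response would instead
    # depend on the current game state.
    message = dialogflow.Intent.Message(
        text=dialogflow.Intent.Message.Text(text=["Let me check that for you!"])
    )

    intent = dialogflow.Intent(
        display_name="check-set",
        training_phrases=training_phrases,
        messages=[message],
    )
    return intents_client.create_intent(request={"parent": parent, "intent": intent})
```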

Dialogflow also maintains state through Contexts. Contexts are attached to Intents and become active when their Intent is matched. These Contexts can also be passed as inputs to other Intents if they contain information or data required by another state. Each Context also produces a response, which is passed on to the Unity game engine.

For example:

INTENT:
Input question => "Is this a Set?", "Can you check if this is a Set?", "Is this correct?"
Identify the intent from the question => Intent 1 or Intent 2?

CONTEXT:
Choose the context from the selected intent => State 1 or State 2?

RESPONSE:
Output the corresponding response.
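
At runtime, the game only needs to send the player's transcribed question to Dialogflow and read back the matched Intent, the active Contexts, and the response text. Below is a minimal sketch of that round trip using the Python client; our game will actually talk to Dialogflow from Unity, so treat this as an illustration of the request/response shape rather than our real integration (ask_dialogflow is a hypothetical helper):

```python
from google.cloud import dialogflow

def ask_dialogflow(project_id: str, session_id: str, question: str):
    """Send one transcribed player question and unpack what comes back."""
    session_client = dialogflow.SessionsClient()
    # Reusing the same session_id across questions lets Dialogflow carry
    # Contexts from one turn to the next.
    session = session_client.session_path(project_id, session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=question, language_code="en-US")
    )
    response = session_client.detect_intent(
        request={"session": session, "query_input": query_input}
    )

    result = response.query_result
    intent_name = result.intent.display_name              # which Intent matched
    contexts = [c.name for c in result.output_contexts]   # active state markers
    reply = result.fulfillment_text                       # text for Ako to speak
    return intent_name, contexts, reply
```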

State Machine Logic

It is necessary to have state machine logic in our experience. The game states are driven by the responses coming back from Dialogflow, so it made sense to use the Dialogflow service itself as the state machine.
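
As a rough illustration of the pattern (sketched in Python for brevity, with made-up state and intent names; the actual game logic lives in Unity/C#), the game keeps a current state and advances it based on which Intent Dialogflow matched:

```python
# Hypothetical game states keyed by (current state, matched Dialogflow intent).
# The real transition table will grow as more pieces of the project merge in.
TRANSITIONS = {
    ("TUTORIAL", "check-set"): "CHECKING_SET",
    ("CHECKING_SET", "set-valid"): "CELEBRATE",
    ("CHECKING_SET", "set-invalid"): "EXPLAIN_RULE",
    ("EXPLAIN_RULE", "ask-hint"): "GIVE_HINT",
}

def next_state(current: str, matched_intent: str) -> str:
    """Advance the game state from a Dialogflow match; stay put if unmapped."""
    return TRANSITIONS.get((current, matched_intent), current)

# Example: the player asks "Is this a Set?" while in the tutorial.
print(next_state("TUTORIAL", "check-set"))  # -> CHECKING_SET
```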

Brainstorming Visual User Experience

Based on our Quarters feedback, we wanted to add more visuals to the experience. We were also working out how to finish the final phases of our tech integration, and the team wanted a clear view of what the user experience would be like in order to design the interface.

Therefore, we prototyped some UI elements that we could use in our project. Here is an example of Ako teaching players how to identify a Set:

Ako showing features
Rule explanation feature

Next week, we plan to test Dialogflow as our state machine, combining the AR game states with Ako's speech responses.