Dev Blog: Week 2

WEEK 2:

PRODUCTION:

We dedicated this week to further conceptualizing and researching our final deliverable. As a team, we brainstormed and listed different gameplay types, mechanics, and ideas to incorporate into our product, with a good portion related specifically to AR and learning.

I familiarized the team with methods of agile development: the basic structure, format, and purpose of SCRUM and its components, before running them through hypothetical examples of user stories and the process from backlog to burndown. We constructed a physical backlog and SCRUM board to use in the coming weeks.

The team put together a concept board breaking down the project’s basic description, inspirations, experiences, needs, and goals. Trisha, our lead designer, ran the team through Sabrina Culyba’s transformational framework and conducted a collaborative exercise that broke down each step in relation to our project needs.

We spent the rest of the week preparing for an end-of-week client meeting and an industry expert meeting. Our client meeting was geared towards better understanding the specifics of our deliverable (i.e. target demographic, platform, client needs) and helped us further develop the project’s constraints. We had the incredible fortune of having two regional BotsIQ participants and multi-year champions come in and discuss their experiences with the program, the bot-building process, bot driving, and the basic components of a combat bot.

Next week we will dedicate time to further research, with the ultimate goal of being in a position to start building and enter an early pre-production phase by the end of the week. We will also meet with three local teachers and educators who currently participate in the program to gain insight into the educators’ perspective on it.

 

DESIGN:

This week, we decided to do a team exercise using Sabrina Culyba’s transformational framework. Each of us answered eight questions individually, and we met the very next day to discuss them.
The goal was to find out what each of us thought about those eight aspects of the framework and make sure we were all on the same page.

This is the first iteration of our document: here

This exercise helped us better define our goals, the player transformations our client expects, and our measures of success. Our next step is to operationalise abstract terms like “fun” and “engaging” so that we have a consistent way of defining and measuring them.


 

UI/UX:

This week, we spent some time organizing all the information from the first client meeting to shape the outline of our project. We brainstormed together and generated ideas around platforms, gameplay, and the final deliverable. The whole team reached a consensus that AR is a great approach to achieving the client’s goal.

We generated several directions the project could take, but we also wanted to define a scope for the project, so we had our second meeting on Friday. During the meeting, we asked the questions we had gathered over the week to explore what our client really wants to achieve. Afterwards, we had a clearer goal from the client: attracting people to join the program. With that in mind, our earlier ideas about replayability and realistic physics simulation are no longer under consideration.

Students are the most important stakeholders in this project. We also met with experts in this field, Joe and Bill, who brought their winning bots on Friday. Talking with champions of the NRL competition was really helpful for getting a better sense of the bot-building process.

 

ART:

This week, we kept working with our client and instructors to pin down the project’s final goals and requirements. After several meetings, we decided to focus on creating a fun and engaging experience that attracts middle school students to join the NRL program and introduces it to teachers and people in manufacturing. We still needed to define what makes a game genuinely fun. The client wants our project to showcase the NRL program well, so to meet the client’s minimum requirement we should at least include a bot-prototyping process with one test in our game. In addition, we met two NRL competition experts, Joe and Don, who taught us a lot about building bots and showed us real bots in motion, which was really helpful.

We have not started building art assets this week, but we settled on a low-poly art style for the game.

 

PROGRAMMING:

This week we tried to find ways to combine AR with the project. We realized it’s important to identify features that can only be done with AR rather than with other technologies:

  1. AR overlays the virtual onto the real world, so we can connect the virtual and the physical. Manufacturing and building usually require physical interaction, and with AR we can make the experience much closer to the real thing than a purely screen-based 3D app could.
  2. It helps kids to have an intuitive sense of the robots’ scale, and AR gives them the chance to see and measure things at real-world size.
  3. The latest ARKit offers attractive features like object scanning. One application we thought of is letting kids do some paper work in the real world and then use AR to scan it into the game for later testing. It simplifies the building process while still giving kids some hands-on practice (a rough sketch of this idea follows below).
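
As a very rough illustration of the third idea, here is a minimal Swift sketch of what recognizing a pre-scanned physical object could look like with ARKit 2 (iOS 12 and later). The "BotParts" reference-object group, the class name, and the placeholder marker are assumptions for illustration, not decisions we have made.

    import ARKit
    import SceneKit
    import UIKit

    // Sketch: detect a physical part that was scanned ahead of time with
    // ARObjectScanningConfiguration and overlay a simple virtual marker on it.
    // "BotParts" is a hypothetical asset-catalog group of ARReferenceObjects.
    class BotScanViewController: UIViewController, ARSCNViewDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            sceneView.delegate = self
            view.addSubview(sceneView)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            let configuration = ARWorldTrackingConfiguration()
            // Reference objects are produced beforehand and bundled in the asset catalog.
            if let scannedObjects = ARReferenceObject.referenceObjects(inGroupNamed: "BotParts",
                                                                       bundle: nil) {
                configuration.detectionObjects = scannedObjects
            }
            sceneView.session.run(configuration)
        }

        // Called when ARKit recognizes one of the scanned objects in the camera feed.
        func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
            guard let objectAnchor = anchor as? ARObjectAnchor else { return }
            // Overlay a translucent box matching the object's extent; in the real app
            // this would be the corresponding virtual bot component.
            let extent = objectAnchor.referenceObject.extent
            let box = SCNBox(width: CGFloat(extent.x), height: CGFloat(extent.y),
                             length: CGFloat(extent.z), chamferRadius: 0)
            box.firstMaterial?.diffuse.contents = UIColor.cyan.withAlphaComponent(0.4)
            node.addChildNode(SCNNode(geometry: box))
        }
    }

The actual scanning and testing flow will depend on the design and prototyping work planned for the coming weeks.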

BotLab: Week 1 Development Blog

Hello and welcome to BotLab’s Dev. Blog!

The following is meant to serve as an online reference for our development process and is broken down by team members’ individual roles.

Production

I met the team on Monday, and after setting up the room, we discussed preferred roles and what we each wanted to get out of the project. Fortunately, we have a well-balanced team with no overlap in intended roles. The team breakdown is as follows:

Producer: Jehan Sandhu (me!)

Programmer: Guanghao Yang

3D Artist: Kangyan Li

UI/UX + 2D Art: Meng (Nicole) Wan

Designer + Programmer: Trisha Surve

We casually met our faculty instructors, Mike C. and John D., and with their input soon began blocking out a rough schedule for our core hours. We also began to prepare for our first face-to-face client meeting on Friday by researching organizations and experiences that dealt with bot building, systems learning, and noteworthy games and mechanics.

Following Mike’s advice, we met John Balash earlier in the week to learn more about our client (they have a professional rapport), how best to prepare for our Friday meeting, and the types of questions to ask the client. Additionally, John will be assisting us with recruiting middle school children for playtesting.

We requested equipment and platforms midway through the week (four iPads and four Android tablets, to use ARKit or ARCore accordingly) and decided on the team name BotLab. I also reached out to the client via e-mail to set the basic agenda for Friday’s meeting. The latter part of the week was spent preparing a short presentation and materials for our client meeting.

We met our client, Bill, on Friday (check out the pictures below!). After a brief period of team introductions, we began an open discussion about the project. Our client was very warm and open-minded, explaining what the organization wanted while also noting that he wanted us to leverage our individual skills for the team. Additionally, we were able to lock down some constraints: what we make has to be portable and easy to demo on accessible hardware.

Our first client meeting!

Next week we will meet Tuesday to fill out a RACI chart and we will begin daily SCRUM meetings. We’re all very excited to be working with the National Robotics League.

Programming

We did some research on what kinds of technology we might use. For now, we think AR could be a good fit for our project.

The most popular AR platforms are ARKit and ARCore.

For ARKit, it’s the technology Apple introduced at WWDC 17. Its key highlights include:

   1. Tracking: the ability to accurately track the device’s position in the real world. Using Visual Inertial Odometry (VIO), ARKit combines camera tracking with motion-sensor data to record the device’s position in real time. Additionally, no tracking cards are required.

   2. Scene Understanding and Light Estimation: with ARKit, iPhones and iPads become aware of their surroundings. It can identify surfaces in the real-world environment, like floors, tables, walls, and ceilings (also referred to as ‘plane detection’), and it can estimate the ambient lighting of the scene.

   3. Rendering: it provides easy integration with SpriteKit, SceneKit, and Metal, with added support for Unity and Unreal Engine.

It’s only available on iOS 11 and later.
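
To give a sense of how lightweight this setup is, here is a minimal Swift sketch (the class name is a placeholder, not project code) that starts a world-tracking session with horizontal plane detection and light estimation and logs each detected surface:

    import ARKit
    import UIKit

    // Sketch: ARKit world tracking with plane detection and light estimation.
    class PlaneDetectionViewController: UIViewController, ARSessionDelegate {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
            sceneView.session.delegate = self

            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = [.horizontal]   // detect floors, tables, etc.
            configuration.isLightEstimationEnabled = true  // estimate ambient lighting
            sceneView.session.run(configuration)
        }

        // Each detected surface arrives as an ARPlaneAnchor.
        func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
            for case let plane as ARPlaneAnchor in anchors {
                print("Detected a horizontal plane with extent \(plane.extent)")
            }
        }
    }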

Another choice is ARCore, Google’s answer to Apple’s ARKit and the AR framework for Android devices. Compared to ARKit, it emphasizes the following:

   1. Motion Tracking: ARCore combines IMU sensor data with feature points from the surrounding space to determine both the position and orientation of the device as it moves.

   2. Environmental Understanding: ARCore detects horizontal surfaces using the same feature points it uses for motion tracking.

   3. Light Estimation: ARCore estimates the ambient lighting of the scene, so virtual objects can be rendered to match the real-world light and look accurate in real time.

   4. User Interaction: with the ‘hit-testing’ feature, ARCore takes a tap in the device’s camera view, casts a ray into the scene, and finds where it intersects detected planes or feature points.

   5. Anchoring Objects: to accurately place a virtual object, ARCore defines an anchor, which lets it keep tracking the object’s position over time as its understanding of the environment improves.

The SDK preview currently supports Android 7.0 Nougat devices: the Google Pixel, Pixel XL, and Samsung Galaxy S8.

We don’t think it’s a good idea to target head-mounted displays like Magic Leap or Microsoft HoloLens, as they require extra equipment and aren’t convenient to carry everywhere.

For engine middleware, we would like to pick Unity, both because we have used it a lot and because it provides good support for easily integrating almost every AR technology.

Design

Role:

I will be working as a Game Designer on this project and will also be leveraging my technical skills to help with gameplay programming.

Responsibility for client meeting:

For our first client meeting, I was responsible for researching games that incorporate “Stealth Learning” for kids. We had to make sure that we were not only looking at existing robotics games, but also at past ETC projects that have cleverly demonstrated this type of learning. I ended up analysing Water Bears and BattleBots: Beyond the BattleBox. It was particularly interesting to understand the design perspectives and principles upon which those games were built. As a team, we had to get as familiar as we could with designing and developing a game for this space. We also had a meeting with John Balash, who gave us really good advice on preparing for our meeting with the client, as he knew the client beforehand. Based on John’s advice, we made a list of things that the two games did right and another list identifying their shortcomings, so we could have a discussion around this topic in case it came up during the meeting.

Important questions:

Mike Christel suggested that we go through Sabrina Culyba’s Transformational Framework draft to identify the set of questions we needed to ask our client for us to proceed.

These were some of the questions we focussed on:

  • What is the high-level purpose of the game? What is the goal of this experience?

The high-level purpose of this game is to get students interested in the process of building a bot: prototyping, learning from mistakes, and iterating continuously.

  • What are the existing barriers that stand in the way of achieving that goal?

The materials used are expensive and do not allow for much trial and error when building.

  • What do you want the players to know/learn after playing the game?

The players should get familiar with the pipeline of building and testing a robot.

  • Resources that we need to consider for insight and feedback in our domain.

Link to the website: www.gonrl.com. Our client will also be connecting us with people from the industry and with students already part of this process.

  • Key concepts: What is the critical content that the game should embody?

A good feedback loop to help students identify their mistakes, find solutions and learn from them.

  • Assessment plan: How do you intend to measure success or assess that the game is effective?

Still needs to be discussed in detail.

Target Audience:

We were able to identify our target audience, children in 5th through 7th grade, along with the stakeholders: teachers and manufacturers.

Target and intended impact:

  • Students (5th–7th graders): interest, engagement, fun, excitement, enthusiasm
  • Teachers: demystify the learning tool and make them less intimidated by it
  • Manufacturing companies: make them understand the importance of this program

 

Deliverable:

A polished product by the end of the semester that can be shown in conferences and to stakeholders.

Art

We researched several robotics projects before meeting our client. One of them is Mindstorms, produced by LEGO. This product provides a simple robot-building pipeline for kids: kids can build a robot in all kinds of shapes with its sensors and motors, and they can also use LEGO’s programming app to program the behavior of the robot they built.

During the client meeting, the client shared a lot of information about what the NRL does and what they want to achieve. The goal of the project we are going to make is to attract kids aged 10 to 14 to play with our app (or game), learn basic manufacturing skills, and become interested in joining the robot competition held by the NRL. Moreover, we should also interest teachers and people in the manufacturing industry in using the game to teach, or to connect with kids who love manufacturing. The experience we are going to create should include building and testing a virtual robot. Kids can learn that different materials suit different situations and how to choose materials to make their robot better on a limited budget. Finally, kids will get feedback on their virtual robot about what is working and what is not.

On the effects and art side, the behaviors and animations of the virtual robot should be grounded in reality, which means it will not be able to fly or shoot bullets.

UI/UX Design


It was the first week of the semester, and our team members formally met each other and our team advisors. I will take care of UX design on this project.

To prepare for the first meeting with the client, our team did some research on our client, the National Robotics League, and related resources, and presented it to the client in a slide deck.

We gathered lots of touch points during the meeting. Our client has a really clear goal: he wants to use our project to attract more companies to join them as partners, and more students who are interested in manufacturing. This also made our target audience clear. In their vision, our project should get kids excited, while teachers in STEM programs understand its higher-level purpose. It should exist as an extension of the instructors’ class discussion, helping kids gain manufacturing skills. It’s a really great start to our project.

For the presentation, we looked forward to framing the project’s problem with them at our first client meeting on Friday. As the UI/UX designer on the team, I designed a slide theme based on the colors of the client’s logo, to give them a sense of customization. We will keep using this theme in future presentations to keep the project consistent.