The Endless Sprint: Weeks 13, 14, & 15

Jehan – Production:

This post covers development for weeks 13, 14, and 15. 
The past few weeks have flown by… time for a quick recap!

You wouldn't believe how far our experience has come over the past couple of weeks. We've refined the player experience to be more engaging and intuitive (dare I say "fun"?). We've been working on visual polish, the interface, combat balancing, unified soundscapes, even new weapon mechanics, shaders, and… particle FX!

We have also been in close communication with our client, the National Robotics League, preparing everything for the end-of-semester handoff. Their parent organization, the National Tooling and Machining Association, recently expanded its media development team and has received certification as an app developer on the App Store. We have been in correspondence with this team for the past few weeks, as they will be receiving our deliverable to run further QA and optimization before taking our beta build to TestFlight, and eventually the App Store.

Professors get a hands-on chance to demo our experience and provide their feedback during Softs.

Softs, also known as Soft Opening, is a day set aside for ETC faculty to demo the various projects and provide "last minute" feedback to the teams. Faculty tried our experience in groups of 2-3 over the course of the day and shared their constructive critiques, as well as their thoughts on what we should fix in the precious time we have left.

After Softs, we prepared for the 2018 ETC Fall Festival, an evening devoted to showcasing ETC project work and select experiences made by first-year students, attended by some 400 people: friends, family, and visitors from industry. After feeling like we had been in nonstop crunch mode for the last few weeks, the evening was especially rewarding: guests smiled in enjoyment while they played, people lined up to have a go, and a few younger kids didn't want to put it down!

Next Week:

Regardless, there's a ton more work to do! From here, we need to prepare for our final presentation and the faculty walkthroughs, and most importantly, prepare our final handoff for our client!

Nicole – UI & UX:

Over these weeks, we did our best to merge all of the separate parts together and fine-tune the details from beginning to end. Before Halves, the showcase and the battle were the two most significant parts of our experience for the team, and everyone worked hard on testing and implementing those two. Before Softs, however, we needed to take care of the parts that are less challenging but still important to us and the client, like the title screen, NRL branding integration, and a pause menu during battle, all of which make the whole experience smoother.

As for my own part, the interface work may look like a collection of tiny things, but it matters a lot. I spent a lot of time with the programmers, who helped me replace all of the placeholder images with high-fidelity ones. The biggest problem surfaced by the Softs feedback was that our game states were ambiguous, so we added a state bar on the left side of the screen to identify the current state. The menu inside the showcase phase was another big problem at that time: the programmers used a carousel plugin to implement it, but it looked awful on screen.

We spent a lot of time finding a layout that could actually be implemented while still making sense on screen. We now have the final iteration of the component description UI, showing the different values of the four design features. Beyond everything mentioned above, I also updated the rest of the UI elements to keep the style as consistent as I could.

Kang – Art:

This week, to make the battle clearer and juicier, we iterated and redesigned several times. Visually, we added a blue ring under the player's bot so it is easy to tell which bot the player controls. Watching videos of real NRL bot battles, we noticed that the sparks and the damage caused by metal-on-metal impacts are what get people excited, so we brought those elements into our experience. When two bots crash into each other, a spark effect plays, and components drop off once they take enough damage. How easily a component breaks depends on its material.

In addition, to make the battle more diverse and the weapons more distinct, we set up two different control schemes. With the Beater Bar and the Spinner, the player only needs to hold the attack button to deal continuous damage. With the Rammer, the player holds the button to charge, then releases it to dash forward and cause impact damage.
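To make the distinction concrete, here is a minimal Unity C# sketch of how the two input styles might be wired up. The class, field names, and tuning values are illustrative assumptions, not our actual project code.

```csharp
using UnityEngine;

// Illustrative sketch of the two weapon input styles described above.
[RequireComponent(typeof(Rigidbody))]
public class WeaponInput : MonoBehaviour
{
    public enum WeaponType { Spinner, Rammer }

    public WeaponType weaponType = WeaponType.Spinner;
    public float spinnerDamagePerSecond = 5f;   // placeholder tuning values
    public float maxChargeTime = 2f;
    public float dashForce = 3f;

    private float chargeTimer;
    private Rigidbody body;

    void Awake() { body = GetComponent<Rigidbody>(); }

    // Called every frame with the current state of the on-screen attack button.
    public void UpdateAttack(bool buttonHeld, bool buttonReleased)
    {
        if (weaponType == WeaponType.Spinner)
        {
            // Beater bar / spinner: damage is applied continuously while held.
            if (buttonHeld)
                ApplyContactDamage(spinnerDamagePerSecond * Time.deltaTime);
        }
        else // Rammer
        {
            // Rammer: hold to charge, release to dash forward and deal impact damage.
            if (buttonHeld)
                chargeTimer = Mathf.Min(chargeTimer + Time.deltaTime, maxChargeTime);
            else if (buttonReleased && chargeTimer > 0f)
            {
                float charge = chargeTimer / maxChargeTime;   // 0..1
                body.AddForce(transform.forward * dashForce * charge, ForceMode.Impulse);
                chargeTimer = 0f;
            }
        }
    }

    void ApplyContactDamage(float amount)
    {
        // Damage application to touching components would live here.
    }
}
```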

Guanghao – Programming:

We have now started iterating on the battle. We first came up with several physics approaches for the component-based bot. The difficulty lay in making the parts behave like real physical objects while still keeping the dramatic smash moments in our game. We modified Unity's existing physics settings and finally arrived at values we are happy with.

With the physics done, we were finally able to tweak the opponent AI. To keep the system stable and avoid unexpected behavior from the opponent, we kept the transitions between states as simple as possible. We also added some randomness to the FSM so that the enemy looks smart but remains beatable.
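For readers curious what a "simple FSM with a little randomness" can look like, here is a hedged sketch; the state names, probabilities, and ranges are placeholders, not the values we shipped.

```csharp
using UnityEngine;

// Minimal sketch of an opponent FSM with a touch of randomness.
public class OpponentAI : MonoBehaviour
{
    enum State { Idle, Chase, Attack, Retreat }

    [Range(0f, 1f)] public float unpredictability = 0.2f; // chance per second of backing off
    public float attackRange = 1.5f;
    public Transform player;

    State state = State.Idle;

    void Update()
    {
        float distance = Vector3.Distance(transform.position, player.position);

        switch (state)
        {
            case State.Idle:
                state = State.Chase;
                break;

            case State.Chase:
                if (distance < attackRange)
                    state = State.Attack;
                // Occasionally back off even when attacking is possible,
                // so the bot doesn't feel perfectly mechanical.
                else if (Random.value < unpredictability * Time.deltaTime)
                    state = State.Retreat;
                break;

            case State.Attack:
                if (distance > attackRange)
                    state = State.Chase;
                break;

            case State.Retreat:
                if (Random.value < Time.deltaTime) // retreat for roughly a second
                    state = State.Chase;
                break;
        }

        // Movement and attack behaviour for the current state would be driven here.
    }
}
```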

Development: Weeks 10, 11, & 12

Production:

This post covers development for weeks 10 to 12.

Oh man— everybody is feeling the heat!

As the semester rapidly approaches its end, the team has been working feverishly to juice up what we’ve built. We have put great effort into refining the experience, spending time fine tuning the interface, sounds, physics, visuals, and the stability of our product. Unfortunately (and speaking strictly as a producer), we have Thanksgiving break, which effectively turns a week of valuable development time into more like… 2 and a half days?  

That being said– Guanghao and Trisha selflessly spent the break working on integration, implementation, and bug-fixing (thanks you two!).

Our next major milestone is Softs, by which point we will have stitched together all the different stages of our experience into one end-to-end whole. In preparation for this milestone, we came up with a Softs-specific priority list based on the input of our faculty instructors:

To date, neither our client, the National Robotics League, nor their parent organization, the National Tooling and Machining Association, has had much of a digital presence beyond their respective websites.

Part of why they were interested in our work is that they want to diversify their methods of outreach and engagement with new audiences, in our case with an app, and this means obtaining Apple Developer certification. Trisha, our Designer/Programmer, put together a docket outlining this process for our client, which I passed along at the beginning of Week 11, and we have since been in communication with their web development team.

Outside of production, I have iterated on and come up with the final version of the onboarding script, component & material descriptions, and other in-game copy, including win/loss text and boss descriptions. You can check out some of my work, attached as a PDF:

Additionally, I corresponded with our external SFX/music talent to grab one last batch of sounds to integrate.

Design:
Most of the time was spent on transferring the prototype work into final content. There was an overlap between design and programming this week.

We also spent time creating technical documentation for building an iOS app. This is one of our deliverables to our client.


UI & UX:

In week 12, we spent most of our time implementing all of the prototypes we had built before playtest day. After receiving valuable feedback from our target demographic, we made a lot of design decisions about the showcase and the battle.

Before the break, I worked closely with the programmers to implement all of the UI I had. I mainly focused on the showcase interface this week: I finished the second iteration of the showcase UI design and exported all of the elements for the programmers to implement.

Art:

Before Thanksgiving break, we had nearly finished all of the 3D models for our final build, as well as the textures for every component we made. We have one body frame, two types of armor, three types of weapons, and two kinds of wheels, each with textures for the different materials. After that, we started refining the textures to make them more realistic.

Programming:

We spent a lot of time on showcase integration. We now support switching and replacing components. The whole showcase animation is script-controlled, which means we can play the animation for any component, existing or future. We also implemented a carousel UI that lets the player choose the component they like in an intuitive way.
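As an illustration of what "script-controlled" means here, below is a rough sketch of a data-driven showcase controller; the class and field names are hypothetical, and the real implementation differs in detail.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a data-driven showcase: any component registered here
// can be highlighted or swapped without writing new animation code per part.
[System.Serializable]
public class BotComponentEntry
{
    public string id;              // e.g. "weapon_spinner"
    public Transform mountPoint;   // where the part attaches on the bot
    public GameObject prefab;      // visual prefab for the part
}

public class ShowcaseController : MonoBehaviour
{
    public List<BotComponentEntry> components = new List<BotComponentEntry>();
    public float highlightScale = 1.2f;

    private readonly Dictionary<string, GameObject> spawned =
        new Dictionary<string, GameObject>();

    void Start()
    {
        foreach (var entry in components)
            spawned[entry.id] = Instantiate(entry.prefab, entry.mountPoint);
    }

    // Replace a component with another prefab (e.g. driven by the carousel UI).
    public void Swap(string id, GameObject newPrefab)
    {
        if (!spawned.TryGetValue(id, out var old)) return;
        var mount = old.transform.parent;
        Destroy(old);
        spawned[id] = Instantiate(newPrefab, mount);
    }

    // Highlight one component; works for any registered part, existing or future.
    public void Highlight(string id)
    {
        foreach (var kvp in spawned)
            kvp.Value.transform.localScale =
                kvp.Key == id ? Vector3.one * highlightScale : Vector3.one;
    }
}
```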

Week 9


Jehan – Production:

We're officially done with Halves! Our presentation went off without a hitch and we received some great feedback from the faculty and student body of the ETC. Based on this feedback, we've realized we could be clearer about a couple of things in particular: why augmented reality is the ideal platform for our product, and our plan for delivery. I had a chance to discuss both of these points in our meeting with Bill after the presentation, and we have since worked out the details and a plan moving forward. I'll break down some of the reasoning:

Mobile AR is the ideal platform because…

  • Cost/safety solution to showcase the NRL:
    • Real-life bots, and all of the equipment, arena space, and safety precautions needed to accommodate them, are INCREDIBLY expensive.
    • Our experience enables the NRL to showcase their brand in a safe, cost-effective, and scalable manner: all of the fun and excitement of the NRL without any of the cost and danger.
  • Promotional tool:
    • Our project enables the NRL to promote themselves through a new interactive medium that did not previously exist.
    • Bill is especially excited to be able to demo a battle to kids and manufacturers at youth/business conferences.
  • Unique demo experience:
    • There are few alternative interactive ways to demo the NRL at conferences and across the nation.

Our plan for distribution is…

Our finished deliverable will be released to TestFlight, after which Bill will hire a third party to push it out on the App Store. The NRL doesn't yet have a presence in the App Store, so part of this will involve the NRL being registered as a developer with the help of that hired party. We will have to coordinate with Bill to refer him to a viable and cost-effective third party, which I am sure we can find within the ETC.

 

Trisha – Prototyping and Design:

 

Collaboration with the UI Designer:

This week’s prototype and testing was based on the showcase part of our experience.

 

[VIDEO]

 

As you can see from the video, it is a three-step process:

  1. Display names of individual components.
  2. Open up the bot to see the individual components and insides of the bot.
  3. Display descriptions for these individual components.

 

This was playtested on Sunday (10/29/2018)

Playtester Name: Harrison

Gender: Male

Age: 11 years old

Comments:

  • Thought it was really cool to see the bot laid out in front of him
  • Wanted to be able to modify the bot he was battling with in the showcase mode (“I wish I could add components to my bot”)
  • Really wanted to choose the colors of the bot and name it.
  • When asked, said Height was comfortable – not an issue
  • Said there was too much text on the screen

 

Based on the playtest and the design team’s observations, we are going to make a different version of this prototype with the following changes:

  1. Description information will only be displayed when you’re closer to the component.
  2. Height of the bot needs to be adjustable.

 

Collaboration with the Programmer:

Robot Arena was one of the games we studied during our research phase. One thing we thought we could improve on from that game's battle portion was the feedback system.

 

[GIF]

 

As you can see from the gif above, their current feedback system shows an explosion, and then the component simply vanishes.

We want to incorporate what actually happens in an arena: the component falls off and becomes a hindrance as it lies on the arena floor.

 

In order to have this in the game, we created the following system:

 

Parameter values for Bot 1

 

Every component has a strong and a weak point.

 

For example, let’s take a look at the shield.

 

Strong:

  • The shield is what protects the wheels. Each shield has a health bar depending on the material it's made out of.
  • Once the shield falls off, the wheels are exposed.
  • If the wheels (which have their own health bar) get damaged, your bot is immobilized and you lose the match.

 

Weak:

  • Depending on the material, the bot’s weight will vary.
  • A heavier bot will move slower and a lighter bot will move faster.

 

Once this system has been coded, we can start testing the values and tweaking them to create a balanced battle.
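As a rough illustration of the parameter system described above, here is a hedged Unity C# sketch; the material list, numbers, and class names are placeholders for whatever comes out of balancing.

```csharp
using UnityEngine;

// Sketch of per-component parameters: health, material, and weight.
public enum BotMaterial { Aluminum, Steel, Titanium }   // placeholder material set

public class BotComponent : MonoBehaviour
{
    public BotMaterial material = BotMaterial.Aluminum;
    public float maxHealth = 100f;
    public float weight = 3f;   // contributes to the bot's total weight

    private float health;

    void Awake() { health = maxHealth; }

    public void TakeDamage(float amount)
    {
        health -= amount;
        if (health <= 0f)
            DetachFromBot();   // e.g. the shield falls off, exposing the wheels
    }

    void DetachFromBot()
    {
        transform.SetParent(null);                       // leave it lying on the arena floor
        var rb = gameObject.AddComponent<Rigidbody>();   // so it tumbles away physically
        rb.mass = weight;
    }
}

public class BotStats : MonoBehaviour
{
    public float baseSpeed = 4f;
    public float weightLimit = 15f;   // NRL weight class

    // Heavier bots move slower, lighter bots move faster.
    public float MoveSpeed(float totalWeight)
    {
        return baseSpeed * (weightLimit / Mathf.Max(totalWeight, 1f));
    }
}
```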



Nicole – UI & UX:


 

Kang – Art:


 

Guanghao – Programming:

This week we nailed down the basic code framework. We are going to use a component-based system to structure our code, for two reasons. First, we can create more combinations from different components instead of being limited to whole bots, which lets children more or less create their own bot. Second, we can reuse the code for different purposes: the same code drives both the "tutorial battle" and the "free battle" without many changes. That also gives us a more robust code base that is easier to debug.
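A minimal sketch of that component-based idea is below; the interface and class names are illustrative, not our actual framework.

```csharp
using System.Collections.Generic;
using UnityEngine;

// A bot is just a container of parts that share a common interface, so the
// same code can serve both the tutorial battle and the free battle.
public interface IBotPart
{
    void OnAttach(BotRoot bot);   // called when the part is added to a bot
    void Tick(float deltaTime);   // per-frame behaviour (spin weapon, drive wheels, ...)
}

public class BotRoot : MonoBehaviour
{
    private readonly List<IBotPart> parts = new List<IBotPart>();

    public void AddPart(IBotPart part)
    {
        parts.Add(part);
        part.OnAttach(this);
    }

    void Update()
    {
        foreach (var part in parts)
            part.Tick(Time.deltaTime);
    }
}
```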

Week 8


Jehan – Producer

This week was spent refining our design and playtesting. The team has made great progress in our builds and prototypes; we are getting faster and more efficient with our prototyping. To date, our experience can be broken down into two major portions: an exciting battle portion, where the user controls their own bot's movement and weapon and attacks a basic combat robot AI… and a "Showcase" portion, in which the user can get a better sense of the inner workings of their combat robot. We had the chance to playtest an early version of the showcase for the first time with our target demographic last weekend, and it was received well.

Much of the later portion of the week was spent preparing for halves, a mid-semester review of our work and findings geared towards ETC faculty. We’ve put together a presentation outlining our overall game structure, iterative process, past and future milestones, and our plan for our delivery. Our client, Bill, will be attending our presentation on Monday of next week, and will give us his feedback on our progress.

I can’t believe we’re at the end of week 8, time is flying! We’re all looking forward to getting done with Halves… especially since we’ve worked through the entire weekend.

 

Trisha – Designer:

From last week's paper playtest of the resource management system, we had the following important findings:

  • Self-discovery: Kids enjoyed figuring out a solution on their own and were excited when they discovered why the solution worked.
  • Juggling the weight limitation while solving a problem got frustrating for the kids.
  • When solving the problem together, the shy/quiet kids participated less and the loudest one tended to dominate.

 

We have decided to split the problem-solving and weight limitations into two different phases.

Phase 1: Find which bot would not get flipped and learn why

Phase 2: Even though your solution is correct, flipping too many times causes damage to internal components. Upgrade to stronger armor while making sure that the bot does not weigh more than 15 lbs.

 

This week, we performed rapid iterations on our prototype and integrated sound for our playtest sessions.

 

Playtest sessions: 14th, 17th and 19th of October

Playtesters Gender: 6 Males + 6 Females

Playtester Age: 10-16 years

Iterations:

 

  • Virtual on Physical

 

Problem:

Players did not realise that the virtual components were rendered onto the physical world.

Changes:

We made the arena transparent.

Results:

Players started pointing out that, after a while, it felt like the bot existed in the same space as them.

 

  • Arena too Big

 

Problem:

Originally, the arena felt like it was floating well above the ground. Players would stand in the middle of it and rotate in place to find their bot.

Changes:

Lowering the arena further was not an option, as ARKit doesn't allow placement below 0 along the y-axis, so we changed the arena's dimensions instead. The arena scale was reduced from (1, 1, 1) to roughly half, with the height reduced even further, ending at (0.5, 0.3, 0.5).

Result:

Because of the change in the arena's length and breadth, players no longer stood inside of it and stopped rotating around in place struggling to find their bot.

The larger reduction in height made the arena feel a lot lower than before.

 

  • To get used to the control system before the battle, players were asked to drive the bot around in an empty arena.

 

Problem:

Players were bored and weren’t developing any skills due to mindless driving.

Changes:

Added cubes to the arena that changed color.

Result:

Players started driving much more intentionally (turning and proceeding toward a cube to turn it green). A minimal sketch of the cube behavior follows below.
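Here is a rough sketch of how such a practice cube could work in Unity; it assumes the bot carries a collider tagged "Bot", which is an assumption for illustration.

```csharp
using UnityEngine;

// Sketch of a driving-practice target: the cube turns green when the bot touches it.
// The cube's collider should be marked "Is Trigger", and the bot needs a Rigidbody.
[RequireComponent(typeof(Collider))]
public class PracticeCube : MonoBehaviour
{
    public Color hitColor = Color.green;

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Bot"))
            GetComponent<Renderer>().material.color = hitColor;
    }
}
```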

Next week, we are planning to focus on and test the UI for the showcase part of our game.

Kang – Art:


Guanghao – Programming:

We are working hard on a prototype that includes the whole game flow. We first anchor the world in AR and select the bot we want, then enter the battle with the bot we chose. The first battle starts with a movement test in which you need to trigger all of the sandbags before you encounter a boss enemy.

Nicole – UIUX:


Week 7


 

Production:

This week, we coordinated with our out-of-house sound designer (a colleague at the ETC) and catalogued the portions of our experience that require specific SFX. We don't have a sound designer on the team, so I will be responsible for coordinating with him. I outlined the table below as a reference for him:

 

Design:

This week, we tested out the resource management system for the redesign phase of our game.

Our redesign phase will accommodate the scenario in which the player’s bot is flipped and the player has to find ways to get around it.

What a redesign session would look like in the game:

  1. Players will enter a battle.
  2. They will be playing against an AI bot (a flipper).
  3. A good few seconds (20-30 seconds, depending on what players prefer) into the battle, the AI bot will flip the player, leaving the player's bot immobilized (it lands on its back and cannot move anymore, similar to the scenario 3 gif).
  4. The player loses this battle, as their bot cannot move anymore.
  5. Players then get the opportunity to go back in time and make changes to their bot to avoid being flipped onto their back and immobilized.
  6. Players will be provided a range of options (self-righting mechanism, heavier chassis, etc.).
  7. These options will include at least two acceptable solutions; the rest will be answers that are technically right but not acceptable, as well as outright wrong answers. (Example: players may suggest increasing the weight of the bot, currently 14 lbs, to avoid getting flipped. Despite possibly being correct, this answer goes against the 15 lb weight restriction.)
  8. Players will choose an option and then be shown the outcome of their choice, either as an animation or by battling the AI bot again.
  9. To avoid monotony, some options, like increasing weight, will not take the player to the outcome phase (the battle or the animation); instead, players will be told the outcome through immediate visual/textual feedback after choosing that option.
  10. If the outcome is that their bot doesn't get flipped, they have successfully redesigned their bot.

 

Note: The redesign phase is not about fixing or repairing your bot before the next battle. It's about rethinking the design to solve a particular problem. Think of it as, "If I could go back in time, what would I do differently to prevent this from happening?" It's geared towards encouraging players to solve challenges their bot faced in past battles. We want players to get into a problem-solving mindset and to understand that the NRL focuses on learning from a particular bot's past shortcomings and "iterating" on its existing design.

 

We are currently making a list of right and wrong options that will be provided to the player at step 6. Based on Table 1, we made a list of options for when a bot is flipped and immobilized. Each option has its associated problems listed next to it, which can be presented to players when they choose it to let them know why it was a wrong choice. We would like to know if we are on the right track in terms of categorising these problems as right/wrong.

 

So the problem statement is:

Your bot gets flipped, lands on its back, becomes immobilized, and you lose the match. How will you redesign it so that you can either recover from being flipped or avoid getting flipped in the first place?

For now, the right solutions include:

  1. Self-righting mechanism
  2. Invertible bot
  3. Weapons that also self-right

The most obvious wrong solution is: increasing the weight (goes beyond competition weight limit)

This is what the distribution for our redesign phase would look like:

Based on some of our playtest feedback, it went through several changes in terms of balance and the way the information was presented to the players.

The final iteration looked like this:

This iteration will be tested with school kids (10-14 years of age) to test the following things:

  1. What is their thought process when solving this problem?
  2. Do they solve this any differently when paired with a buddy?
  3. Which weapon did they want to equip their bot with and why?
  4. Their reasoning behind the solution they provide.
  5. Is the moment of self-discovery fun?

 

We will also playtest the current bot movement system in AR with the kids this weekend. Based on my past experience, I added a developer mode to our build to tweak all the juice parameters (force, friction, rotation). This will help us lock down which parameter values feel "fun" to the kids.
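For context, a developer tuning panel along these lines can be as simple as the sketch below; the parameter names mirror the ones mentioned above, but the ranges and the toggle gesture are assumptions.

```csharp
using UnityEngine;

// Sketch of a "developer mode": expose the feel parameters at runtime so they
// can be tuned on-device during a playtest. Ranges are placeholders.
public class DevTuningPanel : MonoBehaviour
{
    public bool showPanel = false;

    // The movement script would read these instead of hard-coded constants.
    public float driveForce = 10f;
    public float friction = 1f;
    public float turnSpeed = 90f;

    void Update()
    {
        // Toggle with a three-finger tap so testers can't open it by accident.
        if (Input.touchCount == 3 && Input.GetTouch(2).phase == TouchPhase.Began)
            showPanel = !showPanel;
    }

    void OnGUI()
    {
        if (!showPanel) return;

        GUILayout.BeginArea(new Rect(10, 10, 320, 220), GUI.skin.box);
        GUILayout.Label("Dev tuning");
        driveForce = Slider("Force", driveForce, 0f, 50f);
        friction   = Slider("Friction", friction, 0f, 5f);
        turnSpeed  = Slider("Rotation", turnSpeed, 0f, 360f);
        GUILayout.EndArea();
    }

    float Slider(string label, float value, float min, float max)
    {
        GUILayout.Label(label + ": " + value.ToString("0.00"));
        return GUILayout.HorizontalSlider(value, min, max);
    }
}
```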

 

The design team and the programmer sat down together to work out a better working process, one that involves much more transparency between the two disciplines.

This meeting helped us better understand each other and develop a terminology that both disciplines will be able to use in the future. All ideas will now follow this format and the programmer will create a framework that will accommodate these ideas.

 

Programming:

We made a new version of the controls in which the bot's movement is physics-based, which gives us much smoother rotation. Although it isn't quite as easy to control, it also gives kids a chance to chase their bot.
We also added a dummy AI that simply moves to randomly chosen target points. It gives us a sense of how we can interact with an enemy.
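A minimal sketch of such a dummy opponent is shown below; the arena size, speeds, and class name are placeholder assumptions.

```csharp
using UnityEngine;

// Dummy AI sketch: pick a random point in the arena, drive to it, repeat.
public class DummyOpponent : MonoBehaviour
{
    public float arenaHalfSize = 0.5f;   // assumes the arena is centred on the origin
    public float moveSpeed = 0.3f;
    public float arriveDistance = 0.05f;

    private Vector3 target;

    void Start() { PickNewTarget(); }

    void Update()
    {
        Vector3 toTarget = target - transform.position;
        toTarget.y = 0f;

        if (toTarget.magnitude < arriveDistance)
        {
            PickNewTarget();
            return;
        }

        // Turn toward the target and move forward.
        transform.rotation = Quaternion.Slerp(
            transform.rotation, Quaternion.LookRotation(toTarget), 5f * Time.deltaTime);
        transform.position += transform.forward * moveSpeed * Time.deltaTime;
    }

    void PickNewTarget()
    {
        target = new Vector3(
            Random.Range(-arenaHalfSize, arenaHalfSize),
            transform.position.y,
            Random.Range(-arenaHalfSize, arenaHalfSize));
    }
}
```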

Week 6

 

Production:

The team made great progress with mapping out and narrowing down the specifics of scripted case scenarios in the battle portion of the game. Programming included real time drop shadows and basic animations for the player’s bot, which went a long way towards visually grounding the bot to the floor through the eyes of the user, and was well received.

As we add new features and refine/balance the movement, control, and physics systems, playtesters spend more and more time with the iPad before they decide they’re done with it. More than ever, I am seeing the real potential for us to make something great.

The most limiting constraint that we face is time… 14 total weeks of development, six of which have passed already. Moving forward and for next week, we will be challenged to scope back our idea in a manner that retains its core identity and purpose.

 

Programming:

This week we are still prototyping. For Wednesday, we put together a demo that uses the new version of the controls. We reduced the number of on-screen controllers from two to one, which frees up enough space for the attack UI. With the new controls, the bot moves and rotates in a more intuitive way: it always moves relative to the direction the player is facing. We made this change because, with the previous controls, players usually got lost whenever the camera's facing direction changed. We switched the movement coordinate system from global to camera-local, so the bot always moves the way you expect from where you stand. We let the EA visitors try both versions, and everyone who tried them preferred the new one, which suggests we are on the right track.
On Friday, we had another updated version that includes wheel animation and a real-time shadow. The wheel animation is based on real physics: if you turn right, the right wheel rotates backward while the left one rotates forward. The shadow enhances the feeling that the bot is in the real world.
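To make the "global to local" change concrete, here is a hedged sketch of camera-relative driving; the field names are illustrative, and the real controller also handles the physics and wheel animation described above.

```csharp
using UnityEngine;

// Camera-relative control sketch: joystick "up" always means "away from the
// player", regardless of how the bot is currently oriented.
public class BotDriver : MonoBehaviour
{
    public Transform arCamera;     // the ARKit-driven camera
    public float moveSpeed = 0.3f;
    public float turnSpeed = 180f;

    // input is the joystick value: x = left/right, y = forward/back, each in [-1, 1]
    public void Drive(Vector2 input)
    {
        // Flatten the camera's forward/right onto the ground plane.
        Vector3 forward = arCamera.forward; forward.y = 0f; forward.Normalize();
        Vector3 right   = arCamera.right;   right.y = 0f;   right.Normalize();

        Vector3 moveDir = forward * input.y + right * input.x;
        if (moveDir.sqrMagnitude < 0.01f) return;

        // Rotate toward the desired direction, then move along it.
        Quaternion targetRot = Quaternion.LookRotation(moveDir);
        transform.rotation = Quaternion.RotateTowards(
            transform.rotation, targetRot, turnSpeed * Time.deltaTime);
        transform.position += moveDir.normalized * moveSpeed * Time.deltaTime;
    }
}
```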

 

Design:

As a team, we decided to focus on the “Re-Design” (Phase 2) part of our game to be playtested in week 6.

 

The goal of this playtest was to figure out:

  • how players react to problems presented to them
  • how creative their solutions can get
  • their understanding of a bot’s weaknesses and strengths
  • which bot they would like to play as

 

As a part of our playtest, we presented players with 3 scenarios:

  1. How would you redesign the robot with the hammer to make it more stable?
  2. How would you redesign the silver and red robot to deal damage more effectively?
  3. How would you change the two-wheeled robot so that you could recover after being flipped?

For each of these scenarios, we further asked them the following questions:

  • For every problem stated, is it something worth fixing?
  • What are each bot's weaknesses, according to you? Is this something you would want to make an effort to fix?
  • Which do you prefer, offence or defence?
  • Which bot would you like to play as?

 

This playtest was conducted in two separate ways:

  1. We had half of our playtesters fill out the survey at home, to understand how much of the problem playtesters could comprehend on their own.
  2. We had the rest engage in an open discussion with us, to encourage them to talk about feedback that we were not necessarily looking for or didn't foresee.

 

Our findings:

  1. We were able to narrow down some of the most common but incorrect solutions for the first two scenarios.
  2. We found that players were thinking harder by the time they reached the third scenario. This could be because of:
    1. their practice with the first two scenarios
    2. their ability to better understand the battle after going through two scenarios
    3. the problem being interesting and easy enough to encourage players to find creative solutions
    4. a much easier understanding of the problem, without having to explain what was clearly wrong with an immobilized bot
  3. Most of our players preferred an offensive approach when choosing the type of bot they would like to play as. This was mostly based on how much they liked the bot's weapon. For example, players did not realise how powerful a drum bot would be, because its impact is not visually clear.
    1. We need a feedback system that highlights the impact and makes it clearer how effective or damaging the weapon is.
    2. A passive weapon can be a bad decision, as players consistently preferred weapons that they could trigger/activate at will.
  4. We need to choose a scenario where it is easy for a player to grasp what the current problem is. For example, players did not know what was wrong with the hammer bot unless it was explicitly pointed out, whereas a scenario like getting immobilized conveys the problem without further explanation.
  5. Players loved the horizontal spinner, a weapon that will be introduced during customization after phase 2.

 

Our next playtest will be based on scenario 3 (being immobilized). We will be presenting players with options that they can choose from to fix this problem. Each option will have a weight and a cost associated with it.

This playtest will help us in the following ways:

  • further tune/balance weight vs. cost
  • get an idea of how hard or easy it is for players to figure out the right solution to the problem
  • decide how many right vs. wrong solutions should exist
  • learn which aspects the players liked/hated while doing this exercise
  • figure out how to make this scenario more interesting

 

Art:

This week, we finished the 3D model concepts for our final project. We made two battlebots based on the shapes of two famous real bots from the show.

After the art and design meeting, we figured out the basic constraint relationships among the different weapons. On the art side, we will start building weapon components for the game, such as the hammer and the flipper. So next week we will start modeling several weapons, and we will also begin texturing some models in Substance.

 

UI/UX Design:

This week, we had our first formal playtest of the 'Redesign' part, which helped the designers on the team ideate more options for redesigning bots and also let us playtest the prototypes we made over the last few weeks. We got lots of valuable input from people inside and outside the ETC building. I collected the feedback from the prototype playtest and delivered it to our programmer. For now, we are focused on solving the problems with the control system.

 

As part of the design work, we reviewed the game design from last week and modified it a little to fit our time and resource constraints and the client's feedback. We also came up with the idea of a bot gallery at the end of the whole experience, to satisfy the client's request from the last client meeting that we get manufacturing more involved in the overall process.

 

Trisha and I have a clearer division of design work after the mid-week faculty meeting. Once the team decides on our basic flow, we will take responsibility for different parts: I will take care of the user-facing design, and Trisha will take care of the game and level design.

 

To help the team better understand our game flow, I made wireframes of the battle phase. I believe visual references help keep our team on the same page and help us move forward. I'm still experimenting with and comparing different battle phase layouts, using critique from the people around me.

 

Week 5

Production: 

We spent much of this week preparing for quarters walkarounds from the ETC faculty, integrating feedback, and iterating on our design. For walkarounds, each session with two or three faculty members was only 15 minutes, but we were able to introduce our topic and project goals and get some valuable feedback on our AR prototype and the second major iteration of our design. We met with our client later in the week to get his input on the new structure, which he was receptive to.

Over the past two weeks, the team has shifted from the physical Scrum board to Trello to manage work and tasks, accompanied by brief dailies and role-specific meetings. I plan on adapting our backlog to the new design over the next week and rescheduling the start of our first sprint for next Wednesday. My intention is that by maintaining the Scrum board and a simple burndown chart, the team will be able to easily visualize and interpret trends so that we can adjust our workflow for the next sprint cycle.

Design:

This week, we got a prototype out for testing our movement phase. The demo involves putting wheels on your bot and then placing it in physical space to drive it around. This demo was a sample of what our first idea would feel like in terms of switching between the non-AR and AR parts of our game. After trying it out with faculty during quarters, we got some really good feedback on our prototype. We were a little skeptical about driving a bot in AR because of the limitations of the iPad's screen size, but we found that it didn't feel all that awkward and was actually pretty smooth to drive.

This week, we also started expanding the battle part of our game to test our second idea, which has much more controlled gameplay. To scope better, we have modified the idea so that we don't have to actually teach kids how to build a bot from scratch. Based on the feedback we received during quarters, we have decided to divide the process into three parts, each highlighting an aspect of bot building: the repair, redesign, and strategy involved in bot building and battling.

A flowchart for the idea is as follows:

Based on the flowchart, we are going to expand each aspect, and we are going to have the player experience a battle that resembles a past championship battle.

Repair: Teaches the player about material replacement and durability.

Redesign: Includes presenting the player with a problem in the arena and providing a set of options from which the player chooses one to pursue.

Strategy: Includes the strategy involved during an actual battle in which neither bot gets defeated. Each bot earns points based on where it attacks its opponent, and a tally decides who wins.

Programming:

This week we are making another prototype that combines two phases of our game: building and battling. For the building part, we built tools in a non-AR environment: you move two wheels around and place them in the right spots on both sides. Once that's done, you can drop the bot you just created into the scene. Although it's still a simple and rough prototype, it at least gives us a sense of the whole game flow.

One of the difficulties we encountered was keeping objects persistent between the two scenes. And since the object we create has a rigidbody, Unity's physics system was unstable at some points. We later fixed the problem by adding constraints on two rotation axes. It isn't a perfect solution, but at least we ended up with something testable.
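The two fixes boil down to something like the sketch below: keep the assembled bot alive across the scene load, and freeze two rotation axes on its rigidbody. The class name is hypothetical.

```csharp
using UnityEngine;

// Sketch: persist the assembled bot between the building and battle scenes,
// and constrain rotation so the physics stays stable.
public class PersistentBot : MonoBehaviour
{
    void Awake()
    {
        // Survive the scene load from the building scene into the AR battle scene.
        DontDestroyOnLoad(gameObject);

        var rb = GetComponent<Rigidbody>();
        if (rb != null)
        {
            // Only allow yaw; freezing pitch and roll avoids the instability we saw.
            rb.constraints = RigidbodyConstraints.FreezeRotationX |
                             RigidbodyConstraints.FreezeRotationZ;
        }
    }
}
```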

 

Art:

This week, we changed our game design to a new one that focuses more on AR control and battle. In the new game flow, players choose a bot to battle with and try to fix its broken parts during the battle. In the process, players learn how to place the different components of a bot.

For prototyping, we finished a rough full bot with wheels and a weapon for testing. In the prototype, we test controlling the bot in AR using a single button to handle turning and moving forward. Next week, if the prototype works well, we will move on to more 3D modeling as well as animations and textures.

 

Week 4


Production:

The past week was spent researching our target demographic, meeting with industry experts (educators and teachers from high schools participating in the NRL program), prototyping, finalizing branding, getting faculty input, locking design, and planning each sprint through the remainder of the semester. Additionally, I spent some time populating the project backlog and ranking priorities based on the design we have been developing.

Programming:

This week, we are building a prototype based on ARKit 2. At first, we had a lot of problems setting up the build environment. Because ARKit 2 was only released in June and even Unity's official site doesn't have a final plugin or package for it, we suffered from its frequent updates. The good news is that once we finally got it working, we were satisfied with ARKit 2's stability, and it seems we can do even more with it.

In this prototype, we focused on making our "bot" move around in the real world. We use two analog sticks to control the bot: the left one controls moving forward or backward, and the right one turns left or right. We made an arena in our project room using black tape, asked others to drive the bot along the tracks, and almost everyone could do it.

Design:

For week 4, it was necessary to get ready for our meeting with the school kids from the Cornell School District. In preparation, we created a set of questions to ask based on videos we showed them about bot building. The package created for focus testing was as follows:

6th and 7th Graders Package

 

  • What do you and your friends like to do in your free time?

 

  • Do you play games?
  • If yes, when do you play games?
  • If yes, what do you generally play on? PC? iPad? iPhone?
  • If not, what do you generally like doing after getting home? Are you new to games?

 

  • What kind of games do you (and your friends) like to play?
  • What classes do you find fun at school?

 

Video

Show them Combat video.

Show them Building video.

 

  • Raise your hand when you see something you like.
  • What parts of the video did you find exciting?

 

Post Video

 

  • What parts of building the robot looked exciting?

 

BASIC LAYOUT:

  1. Open with the battle bots video.
  2. Separate into two groups
  3. Ask questions: What parts of the video did you find exciting?
  4. Raise your hand when you see something you like. Show them the second video.
  5. Ask questions: What parts of the video did you find exciting?
  6. Ask general questions.

We got pretty good information about the things we were specifically looking for and also ended up with information we weren't quite expecting.

One of the most important things we learned from the visit was that students were very much excited about the building process, not just the battling part of the competition. We realised that students liked building a bot that solves a specific problem, and we understood how important problem solving would be as a part of our game.

We also had industry experts visit us during the same week. After we showed them what AR could do, they seemed really enthusiastic about the technology we were working with and gave us tons of ideas on which parts of the bot-building process could be enhanced by AR.

 

Art:

In weeks 3 and 4, we focused on settling on our rough final design. We structured our game flow as building and testing across different levels. We will let our players learn how to combine several components in the right places on a bot and then test the bot.

In addition, on the art side, we made 3D models of some battlebot components for future development, like the wheels, motors, and batteries. As a next step, we will put these assets into our prototype for testing, and we will model an entire bot for the driving prototype.

 

UX/UI Design:

This week was really intense. During the last faculty meeting, Mike and John expressed their concerns about our quarters. We had spent a lot of time trying to nail down our design and did not have much of a prototype to show at that point, so our major focus this week was hands-on work. After the last faculty meeting, we finally nailed down our design solution. To meet the needs of our two main stakeholders, we divided the whole experience into two parts: building and testing.

On Thursday, we went to the Cornell middle school to do our target audience research. We had two sessions, with both 7th and 8th graders. The kids had lots of valuable input for our project.

On Friday, we were glad to host the experts from the program. They offered lots of important insights, for example on the question the faculty always ask us: 'Why AR?'

A design plan on a blueprint looks thin and abstract. If we can create the 3D components in the AR world, where they can be compared with real objects, it would really help kids get a better feel for the specific components they designed during the design process.

Week 3

Production:

The week was spent on further research, fleshing out our design, and setting up agile Scrum, burndown charts, Trello, and the project backlog. We researched existing games in the area and put together a list of questions for next week's survey with our target demographic. Additionally, the design team has settled on a basic structure for our experience, consisting of periods of building and designing followed by battle sequences in AR. Next week we will meet our target demographic (middle schoolers) at a local southwestern PA school to learn more about them and their interests.

Programming:

This week, we researched what we can do with AR and found some tech demos provided by Unity. We are setting up the environment for AR development, including the Unity package and the Mac build environment. We had some problems at first when we tried to build the examples on an iPad; we later realized that only newer iPads support AR, and the older ones cannot run AR apps. So we identified all of the devices that support ARKit 2 and made a request for them.

We had some discussions on how to use these AR techniques to enrich our gameplay and make it more fun. One idea I came up with was using the scanning feature in ARKit 2 to scan a shape drawn by the kids and let them extrude it and adjust its scale; it's a simplified metaphor for the real building process. But it has its own problems: the shape has to be simple enough, which usually means it's meaningless to the final bot. In the end, we decided that having AR scan an image and pop out a preset component would be a better choice for us.
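As an illustration of "scan an image and pop a preset component", here is a sketch written against Unity's AR Foundation image-tracking API for clarity; the project was on the 2018 Unity ARKit plugin, so the actual calls differed, but the flow is the same: when a known reference image is detected, spawn the matching component prefab.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative sketch: map detected reference images to component prefabs.
public class ImageToComponent : MonoBehaviour
{
    [System.Serializable]
    public class Mapping
    {
        public string referenceImageName;   // name in the reference image library
        public GameObject componentPrefab;  // e.g. a wheel or weapon model
    }

    public ARTrackedImageManager imageManager;
    public List<Mapping> mappings = new List<Mapping>();

    void OnEnable()  { imageManager.trackedImagesChanged += OnChanged; }
    void OnDisable() { imageManager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // Spawn the preset component at the pose of the scanned image.
            var match = mappings.Find(
                m => m.referenceImageName == trackedImage.referenceImage.name);
            if (match != null)
                Instantiate(match.componentPrefab,
                            trackedImage.transform.position,
                            trackedImage.transform.rotation);
        }
    }
}
```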

Design:

For week three, we started focusing more on the different kinds of applications that exist in AR and which mechanics we can use as inspiration for our game. We found a couple of good games on the App Store and started playing around with them. We made a list of some of the good decisions they had made and some not-so-good decisions we should avoid. By the end of the week, we were able to finalise an idea for the game based on an iterative-development learning approach.

UI/UX Design:

This week, I focused on researching existing AR work in the App Store, trying to draw inspiration from how they handle interaction in an AR world. The designers on the team brainstormed several ideas and tried to nail down a concrete one by pulling it back and forth.

For branding, I finished our logo and poster. After getting critique from John and Ricardo, I made a second iteration.

This is the first iteration of our idea. We have divided the game into two phases, building and testing. While the team is confident about implementing the testing part in AR, we are still trying to decide whether the building part of our game should be in AR or in a regular (non-AR) view on the iPad.

Dev Blog: Week 2


PRODUCTION:

We dedicated this week to further conceptualizing and researching our final deliverable. As a team, we brainstormed and listed different gameplay types, mechanics, and ideas to incorporate into our product, with a good portion related specifically to AR and learning.

I familiarized the team with methods of agile development: the basic structure, format, and purpose of SCRUM and its components before running them through hypothetical case examples of user stories and the process from backlog to burndown. We constructed a physical backlog and SCRUM board to be used in the coming weeks.

The team put together a concept board breaking down the project's basic description, inspirations, experiences, needs, and goals. Trisha, our lead designer, ran the team through Sabrina Culyba's transformational framework and conducted a collaborative team exercise that broke down each step in relation to our project needs.

We spent the rest of the week preparing for an end-of-week client meeting and an industry expert meeting. Our client meeting was geared towards better understanding the specifics of our deliverable (i.e. target demographic, platform, client needs) and helped us further develop the project's constraints. We had the incredible fortune of having two regional BotsIQ participants/multi-year champions come in and discuss their experiences with the program, the bot-building process, bot driving, and the basic components of a combat bot.

Next week we will dedicate time to further research, with the ultimate goal of being in a position to start building and to enter an early pre-production phase by the end of the week. We will also be meeting with three local teachers and educators who currently participate in the program to gain insight into the educators' perspective on the program.

 

DESIGN:

This week, we decided to do a team exercise using Sabrina Culyba’s transformational framework. Each of us answered 8 questions individually and met the very next day to have a discussion.
This exercise was to find out what each of us thought about 8 aspects of the framework and make sure we were all on the same page.

This is the first iteration of our document: here

This exercise helped us better define our goals, the player transformations expected by our client, and our measures of success. Our next step is to operationalise abstract terms like "fun" and "engaging" so that we have a consistent way of defining and measuring them.


 

UI/UX:

This week, we spent some time organizing all of the information from the first client meeting to shape the outline of our project. We brainstormed together and generated all kinds of ideas about platform, gameplay, the final deliverable, and so on. The whole team reached a consensus that AR is a great approach for reaching the client's goal.

We generated several directions the project could go in; at the same time, we wanted to scope the project, so we had our second client meeting on Friday. During the meeting, we asked the questions we had generated over the week to explore what our client really wants to achieve. After the meeting, the client gave us a clearer goal for this project: attracting people to join the program. With that in mind, our earlier ideas about replayability and realistic physical simulation are off the table for now.

Students are the most important stakeholders in this project. We also met with two experts in the field, Joe and Bill, who brought their winning bots on Friday. Talking with NRL competition champions really helped us get a better sense of the bot-building process.

 

ART:

This week, we kept working out the final goals and requirements of our project with our client and instructors. After several meetings, we decided to focus on creating a fun and engaging experience that attracts middle school students to join the NRL program and introduces it to teachers and people in manufacturing. But we still needed to define what makes a game really fun. The client wants our project to showcase the NRL program well; to meet the client's minimum requirement, we should at least include a bot prototyping process with one test in our game. In addition, we met two NRL competition experts, Joe and Don. We learned a lot about building bots from them and saw real bots moving, which was really helpful.

We have not started building art assets this week, but we settled on a low-poly art style for the game.

 

PROGRAMMING:

This week we are trying to find ways to combine AR with the project. We realized that it's important for us to find features that can only be done with AR and not with any other technology.

  1. AR overlays the virtual onto the real world, so we can use it to connect the virtual and the real. Manufacturing and building usually require physical interaction from the user; with AR, we can make the experience much closer to the real thing than a purely 3D app would be.
  2. It's good for kids to have an intuitive way to sense the scale of the robots, and AR gives them a chance to gauge sizes against the real world.
  3. The latest ARKit has some attractive features like object scanning. One application we thought of is letting kids do some prep work in the real world and then scan it into the game for later testing. That simplifies the building process while still letting kids practice in some form.