Week 6 Dev Blog

In Week 6, we continued working on the linear VR experience. Here are some screenshots of our latest progress:

Step Controller for level management

The picture above shows the step controller we use for level management.
We use a StepManager system as the underlying structure that manages the linear level system. Every step has several parameters, along with events that are triggered when the step starts and when it ends. When a step is created, we assign its start and end events and store the step and its following step in two dictionaries. We also leave an AddStep interface so that levels can be stored in and added from a .json file.
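Below is a minimal sketch of that structure. The class and member names (Step, StepManager, AddStep, OnStepStart, OnStepEnd) follow the description above but are otherwise our assumptions; the project's actual code may be organized differently.

```csharp
// Hypothetical sketch of the step structure described above (names are assumptions).
using System;
using System.Collections.Generic;

public class Step
{
    public int Id;
    public string GlowObjects;     // comma-separated object names read from the .json file
    public string OutlineObjects;  // comma-separated object names read from the .json file
    public event Action OnStepStart;   // triggered when the step starts
    public event Action OnStepEnd;     // triggered when the step ends

    public void RaiseStart() => OnStepStart?.Invoke();
    public void RaiseEnd()   => OnStepEnd?.Invoke();
}

public partial class StepManager
{
    // One dictionary stores each step by id; the other stores the id of the step that follows it.
    private readonly Dictionary<int, Step> steps = new Dictionary<int, Step>();
    private readonly Dictionary<int, int> nextStepId = new Dictionary<int, int>();

    // Interface used while loading levels from the .json file.
    public void AddStep(Step step, int followingStepId)
    {
        steps[step.Id] = step;
        nextStepId[step.Id] = followingStepId;
    }
}
```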

Interface for setting the current step

For every step, we need a structured method that is called with the step's ID. The method then searches the .json file for the objects that should glow or be outlined and calls specific functions to provide visual feedback. In the .json file, object names are stored as comma-separated strings, so in the SetStep function we parse those strings, find the objects in the scene by name, and call the relevant methods.

Set step function implementation

We also leave interfaces both for moving to the next step and for jumping to a specific step. In most cases, when the system receives the event that the guest has finished the current step, we call the function that moves to the next step. In other cases, we need the player to go back to specific steps and loop through several of them until the last requirement is finished; in that loop, we call a specific step instead of moving to the next one, passing the step index as the parameter.
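Continuing the sketch above, the step-setting and step-advancing interfaces might look roughly like this. GameObject.Find, the Trim calls, and the "EnableGlow"/"EnableOutline" messages are our placeholders for the project's actual lookup and visual-feedback code.

```csharp
// Hypothetical continuation of the StepManager sketch above.
using UnityEngine;

public partial class StepManager
{
    private int currentStepId;

    // Jump directly to a specific step, e.g. when looping back through earlier steps.
    public void SetStep(int stepId)
    {
        currentStepId = stepId;
        Step step = steps[stepId];

        // Object names are stored in the .json file as comma-separated strings.
        foreach (string name in step.GlowObjects.Split(','))
        {
            GameObject obj = GameObject.Find(name.Trim());
            // Placeholder call; the real project uses its own glow function.
            obj?.SendMessage("EnableGlow", SendMessageOptions.DontRequireReceiver);
        }
        foreach (string name in step.OutlineObjects.Split(','))
        {
            GameObject obj = GameObject.Find(name.Trim());
            // Placeholder call; the real project uses its own outline function.
            obj?.SendMessage("EnableOutline", SendMessageOptions.DontRequireReceiver);
        }

        step.RaiseStart();
    }

    // Move to the next step once the event arrives that the guest finished the current one.
    public void NextStep()
    {
        steps[currentStepId].RaiseEnd();
        SetStep(nextStepId[currentStepId]);
    }
}
```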

Above is our latest machine model. Before the half presentation, we need to add more detail to the outer door and build the right half, which contains the machine's monitor and the safety knots.

We also have a draft UI for the tablet that the student will hold in their left hand during the VR experience. Let’s take a peek:

Draft UI for Tablet

For Week 7, the last week before the half presentation, we have made a priority list of what we are going to do on our prototype:

a. The whole interaction experience, using the art assets we already have; we will use placeholders for the rest.

b. Instructions implemented on the tablet.

c. A tag on each object when the controller hovers over it.

d. Prioritized visual feedback.

e. Sound effects.

Our whole team is excited about what we are going to present at the half presentation.

Week 5 Dev Blog

On Monday, we discussed the feedback we got from the Quarter Sit-down last Friday. Together, we looked up similar past ETC projects and related companies online, including Labster, M-lab, and MediSIM, and discussed what we could learn from them and what we should try to avoid in our design. We separated the tutorial content into three layers: the procedure that students need to do, the procedure that the lab technician needs to do, and any other information. We prioritized the content and tasks according to the project’s needs:

  1. Things students must touch in VR
  2. The less important but cool content to implement in VR
  3. The overall system working on PC

We agreed that we need to keep in mind which parts are the most important and which steps are the hardest for students to understand. These are the places where illustration and animation could really help.

After prioritizing our tasks, we were much clearer about what we needed to do this week: the programmers would finish the most important interactions from steps 5-8 in VR, the designers would research more on VR instruction display, and the artist would continue working on the model of the chamber.

We worked hard for the demo on Friday and here are some highlights of our progress:

On Friday, our clients Sandra, Nick, and Todd came, and we presented our progress and demo to them. It was the first time we showed our VR demo. Our clients were surprised by how efficient and effective training in VR could be and were very excited to see our progress in the future. Here are some photos of their visit:

With this encouragement and support, the Hot Metal team is working hard and looking forward to our next demo!

Week 4 Dev Blog

On Monday, Todd, the lab technician, and Nick, a Ph.D. student at the NextManufacturing Center, gave us a tour of the lab. We got to know the procedure much better by actually seeing the operation in person. We also took a lot of photos for future reference. Here are some highlights of the tour:

AM machine
Lab Tour

After the tour, we rearranged the content we had and made sure everyone understood each part well. With the pictures we took, our artist modeled the chamber in more detail.

Chamber Model

On the art side, we have also purchased some art assets, including screws, a lab environment, and work coat packages.

On the programming side, our programmers continued working on the interactions in Virtual Reality. Here is a screenshot of the machine’s door-open function:

Door Open Function
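Since the screenshot itself does not show the code, here is only a hypothetical illustration of what a simple hinged-door script in Unity can look like; it is not the team's actual implementation, and all names and values are assumptions.

```csharp
// Hypothetical door-opening sketch (not the project's actual code).
using UnityEngine;

public class DoorOpener : MonoBehaviour
{
    [SerializeField] private float openAngle = 90f;   // how far the door swings open, in degrees
    [SerializeField] private float openSpeed = 120f;  // rotation speed in degrees per second

    private bool isOpening;
    private float rotatedSoFar;

    // Would be called by the interaction system, e.g. when the handle is grabbed and pulled.
    public void Open() => isOpening = true;

    private void Update()
    {
        if (!isOpening || rotatedSoFar >= openAngle) return;

        float delta = Mathf.Min(openSpeed * Time.deltaTime, openAngle - rotatedSoFar);
        transform.Rotate(Vector3.up, delta);   // assumes the object's pivot sits on the hinge
        rotatedSoFar += delta;
    }
}
```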

On Wednesday, we presented our semester-long plan to the faculty at the quarter walk-around and got a lot of feedback from them. We modified our plan to be VR-focused: we plan to finish the first VR prototype before the half presentation and will design the PC part after playtesting the VR prototype. We are excited to see the progress we are going to make in the next three weeks before the half presentation.

Quarter Presentation

Week 3 Dev Blog

To prepare for the quarter walk-around next week, we decorated our room with patent posters ordered online. We started using a scrum board to keep everyone clear about each other’s progress. We also designed the poster, half-sheet, and project website.

Project Poster

We also met with Jesse Schell and received a lot of suggestions from him about how to get students to pay attention to the PC training. Jesse also gave us some advice on how to display text well in Virtual Reality. We will definitely consider his suggestions in our design.

Our programmers continued developing the communication between CTAT and Unity. They also built a Level Management System using a Level Manager-Level-Goal structure, along with an interaction system based on grabbing and raycasting.
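As a rough illustration of what a Level Manager-Level-Goal hierarchy can look like (the class layout and completion logic below are our assumptions, not the project's actual code):

```csharp
// Illustrative Level Manager-Level-Goal sketch: a level is complete when all of its
// goals are complete, and the manager then advances to the next level.
using System.Collections.Generic;
using System.Linq;

public class Goal
{
    public string Description;
    public bool IsCompleted;
}

public class Level
{
    public string Name;
    public List<Goal> Goals = new List<Goal>();
    public bool IsCompleted => Goals.All(g => g.IsCompleted);
}

public class LevelManager
{
    private readonly List<Level> levels = new List<Level>();
    private int currentIndex;

    public Level CurrentLevel => levels[currentIndex];

    public void AddLevel(Level level) => levels.Add(level);

    // Called by the grab/raycast interaction system whenever a goal is completed.
    public void NotifyGoalCompleted()
    {
        if (CurrentLevel.IsCompleted && currentIndex < levels.Count - 1)
            currentIndex++;
    }
}
```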

VR Simulation

Our artist began to model the chamber of the machine. Here is the first peek:

AM machine chamber

On Friday, we had our regular faculty meeting with Sandra, Bruce, Lu, and Nick. We were excited to learn that we would have another lab tour next Monday with Nick and the lab technician, Todd, where we would receive the same training that current students get for the AM machine. We clarified a few more details about the operation and tried to keep our vision and expectations aligned with our client’s.

Lastly, here is our team photo:

Team Photo

Week 2 Dev Blog

During our second week, our programmers figured out the integration between Unity and CTAT and started prototyping the basic interactions of operating the AM machine in Virtual Reality. Our artist designed our logo, shown below.

Project Logo

We also set our project description:
“Project Hot Metal is an educational project aiming to teach engineering students how to operate a metal Additive Manufacturing (AM) machine. Spanning both PC and VR platforms, the goal of the project is to simulate the operation of a metal AM machine and provide instructional support for engineering students. Students will not only have detailed instructions and assessments on the PC platform but also gain practical experience operating the machine in the virtual world. Project Hot Metal will also integrate the prototype with a machine-based tutor created with CTAT (the Cognitive Tutor Authoring Tools), developed by the Human-Computer Interaction Institute (HCII) at Carnegie Mellon University.”

We also talked about our semester-long plan. We planned to deliver our first VR prototype before the half presentation while working on the PC and VR platforms simultaneously.

Here is the detailed design plan:

Current Design Plan
Composition Box

Week 1 Dev Blog

As the new semester started, our team could not wait to meet up with each other.

On the first day of school, we went out for a team dinner. We introduced ourselves to each other and had a great time.

Team Dinner – Jan 14th, 2019

Before our client meeting on Friday, we sat together and did some research on our client and their work. The NextManufacturing Center is a CMU research center for additive manufacturing (AM), commonly known as 3D printing. Our client, Sandra DeVincent Wolf, the Executive Director of NextManufacturing, is looking for a prototype that could provide instructional support for AM machine training in Virtual Reality. We looked through the client’s website and talked about our expertise and preferred positions on the team. We also listed some potential questions to ask during our first client meeting.

On Friday, Jan 18th, we met with our clients on campus. Sandra gave us a brief introduction to the lab, the fundamentals of 3D metal printing, and her expectations for the project. Bruce McLaren, an associate research professor at the Human-Computer Interaction Institute, and his student, Lu Sun, also gave an introduction to CTAT (the Cognitive Tutor Authoring Tools) and explained how they wanted our prototype to integrate with this tutoring software. They want to extend our training software in the future to collect and analyze students’ data for research purposes. After the presentation, Sandra, along with her student, Nick, and lab technician, Todd, gave us a lab tour and explained the machine on-site. We had a really great time during our first meeting and were very impressed by how high-tech the machine is and how much space it takes to store all the metal powder.

First Client Meeting – Jan 18th, 2019
Additive Manufacturing (AM) machine

After our first client meeting, we set our roles in the project and designed the responsibility assignment matrix together.

RACI Chart