20 Feb

Dev Blog – Week 6

This week, the team made a demo video for the client to give him an idea of what the team has accomplished since he left.

Programmers modified the current marker recognition system so that the device can determine its position more precisely by combining readings from multiple markers.
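
The blog does not say how the markers are combined, so the snippet below is only a sketch of one plausible approach in C#: average the device position implied by each visible marker, weighting closer markers more heavily since they track more reliably. The `MarkerObservation` type and `FusePosition` helper are hypothetical names for illustration, not part of the actual system.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical per-marker observation: the device position implied by one
// detected marker, plus that marker's distance from the camera.
public struct MarkerObservation
{
    public Vector3 EstimatedDevicePosition;
    public float Distance; // metres from camera to marker
}

public static class MarkerFusion
{
    // Distance-weighted average: closer markers appear larger in the camera
    // image and track more reliably, so they get a larger weight.
    public static Vector3 FusePosition(IList<MarkerObservation> observations)
    {
        Vector3 weightedSum = Vector3.zero;
        float totalWeight = 0f;

        foreach (MarkerObservation obs in observations)
        {
            float weight = 1f / Mathf.Max(obs.Distance, 0.01f);
            weightedSum += obs.EstimatedDevicePosition * weight;
            totalWeight += weight;
        }

        // No markers visible this frame: fall back to the origin.
        return totalWeight > 0f ? weightedSum / totalWeight : Vector3.zero;
    }
}
```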

Artists started a new “storyboard” to clarify the player interactions and the corresponding implementation work, so that programmers can evaluate the difficulty of each step.

On Friday, the team planned to talk with the client and ask for input on how important each player interaction is, so the steps can be prioritized and potential drawbacks considered.

Also, the team was featured on Hot Tech: Devices Revolutionizing The Electronics World.

On Friday, the team attended the Playtest to Refine workshop and answered the following questions:

  1. AFFECT. How does it make you feel? How do you feel at the beginning of the experience, the middle, the end?

  2. THEME. What’s the big idea? (i.e. “forbidden love,” “transformation,” “reduce, reuse, recycle”) List all of the themes of the experience.

  3. ENVIRONMENT. Describe the physical world of the experience. This might include sound and touch.

  4. CHARACTER. Describe the characters, avatars, or game objects (or interface elements)

  5. MECHANIC. List the game mechanics that the experience uses. What actions can the guest perform?

  6. RELATIONSHIP. Describe the relationship the experience supports between characters and/or players.

  7. EVENTS. What are the major events of the experience? How are they sequenced and paced? What triggers the event? What does the event trigger?

13 Feb

Dev Blog – Week 5

This week was time for the 1/4 walk-arounds and 1/4 sit-downs.

On Tuesday, a reporter came to the team, tried the current demo, and interviewed the instructor. They then had a discussion about the project in general, which led to a deeper discussion about the unique value Augmented Reality brings to the project.

Abdominal examination is not limited to examining the abdomen; it also includes examining the eyes, tongue, skin, arms, nails, and many other parts of the patient. It is very hard and expensive to physically build different “modules” for all these symptoms, but with Augmented Reality it is much easier to simulate the symptoms digitally, and it takes far less time to make new scenarios and new “modules”.

The teacher will also be able to explain a student's performance to other students, since everyone shares the view of the student performing the examination.

On Wednesday, the faculty came to the project room and the team presented the current progress of the project as well as the biggest challenges and risks. Most of the faculty understood the idea behind Augmented Reality and Medical Simulation, but some of them raised several concerns:

  1. The team has built abstraction layers around device function calls so that the project can be transferred to HoloLens easily, but some faculty members are still concerned that the team won't have enough time for the migration and for rendering optimization (texture sizes, mesh polygon counts, etc.). A rough sketch of this layering appears after this list.
  2. The team has already made internal organ models based on online references, but the client wants them to be more realistic, which is very hard to model. Some faculty members suggested using Physically Based Rendering and will have a more detailed discussion on Friday.
  3. The team mentioned that it would be hard to conduct playtests because the target demographic is medical students and teachers. Some faculty members said they have potential contacts the team could utilize.
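
As a rough illustration of the abstraction layers mentioned in item 1, all headset-specific calls can go through one interface, so a HoloLens-backed implementation could later replace the Meta-backed one without touching the rest of the code. The `IArDevice` interface and `MetaDevice` class below are hypothetical names for illustration only, not the team's actual code.

```csharp
using UnityEngine;

// Hypothetical abstraction layer: game code only ever talks to this interface,
// never to SDK-specific APIs.
public interface IArDevice
{
    Vector3 HeadPosition { get; }
    Quaternion HeadRotation { get; }
    bool TryGetMarkerPose(int markerId, out Vector3 position, out Quaternion rotation);
}

// Meta-backed implementation; the actual SDK calls are stubbed out here.
public class MetaDevice : IArDevice
{
    public Vector3 HeadPosition { get { return Vector3.zero; } }            // would query the Meta SDK
    public Quaternion HeadRotation { get { return Quaternion.identity; } }  // would query the Meta SDK

    public bool TryGetMarkerPose(int markerId, out Vector3 position, out Quaternion rotation)
    {
        position = Vector3.zero;          // would query the Meta marker tracker
        rotation = Quaternion.identity;
        return false;                     // "marker not found" in this stub
    }
}

// A HoloLens-backed class implementing the same interface could later be
// dropped in without touching the rest of the project.
```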

Overall, the 1/4 walk-arounds and 1/4 sit-downs went pretty smoothly.

05 Feb

Dev Blog – Week 4

This week, Erle Lim, the client, came to Pittsburgh from Singapore to discuss the finer details of the project with the team.

They came up with a “storyboard” of the examination sequence.

They also wrote several typical scenarios for abdominal examination and summarized them in a document for future use.

The team decided to build a graphical interface that lets teachers add new scenarios and modify existing ones, making the application easily expandable.
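
One way such an interface could stay expandable is to treat each scenario as plain data. The sketch below is an assumption about what that data might look like (the `ExamScenario` fields are illustrative, not the team's actual schema), serialized with Unity's built-in JsonUtility so no extra packages are needed:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical data-driven scenario description that the teacher-facing
// interface could create or edit, then save as JSON.
[Serializable]
public class ExamScenario
{
    public string scenarioName;        // e.g. a typical abdominal case
    public string patientDescription;  // symptoms on eyes, tongue, skin, etc.
    public List<string> examinationSteps = new List<string>();
}

public static class ScenarioStorage
{
    // Unity's built-in JsonUtility keeps the example dependency-free.
    public static string Save(ExamScenario scenario)
    {
        return JsonUtility.ToJson(scenario, true); // pretty-printed JSON
    }

    public static ExamScenario Load(string json)
    {
        return JsonUtility.FromJson<ExamScenario>(json);
    }
}
```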

Discussion with client

The client also suggested letting the student manipulate MRI images while examining Abe, to give them a more direct, better understanding of the attitude (internal arrangement) of the human abdomen.

They also thought it would be very useful to generate a report on the student's performance, so that the student gets direct feedback on how well they followed the steps and the teacher can build up a database of student records.
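
As a sketch of what such a report might record (the `PerformanceReport` fields below are hypothetical, chosen only to illustrate the idea), each session could log which steps were performed and which were missed, which also gives the teacher something simple to store per student:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical record of one student's run through a scenario.
[Serializable]
public class PerformanceReport
{
    public string studentName;
    public string scenarioName;
    public List<string> stepsPerformedInOrder = new List<string>();
    public List<string> stepsMissed = new List<string>();

    // Simple summary metric: fraction of expected steps that were performed.
    public float CompletionScore(int expectedStepCount)
    {
        if (expectedStepCount <= 0) return 0f;
        return (float)stepsPerformedInOrder.Count / expectedStepCount;
    }
}
```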

The team also decided to customize the superimposed images so that Abe better represents a range of ethnicities: instead of appearing only Chinese, he can also be rendered as Caucasian, Black, Latino, and so on.

29 Jan

Dev Blog – Week 3

This week, programmers experimented more with the Meta headset and found out several important things:
1. The Meta display simply mirrors what is shown on the desktop monitor, so extra work is needed before instructors can make comments and talk to other students.
2. We need to move the camera instead of the virtual objects to simulate the movement of the headset/player (see the sketch below); this has recently been a hot topic in the Meta Community as a known difficulty.
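
A minimal Unity-style sketch of the idea in point 2, with the head-pose getters stubbed out as placeholders for whatever the headset SDK actually provides: the virtual scene stays fixed in world space and only the camera transform is driven by the tracked head pose each frame.

```csharp
using UnityEngine;

// Attach to the Unity camera: the virtual scene stays fixed in world space,
// and only the camera is moved from the tracked head pose each frame.
public class HeadTrackedCamera : MonoBehaviour
{
    void LateUpdate()
    {
        // Placeholders; real values would come from the headset SDK.
        Vector3 headPosition = GetTrackedHeadPosition();
        Quaternion headRotation = GetTrackedHeadRotation();

        transform.position = headPosition;
        transform.rotation = headRotation;
    }

    Vector3 GetTrackedHeadPosition() { return Vector3.zero; }            // stub
    Quaternion GetTrackedHeadRotation() { return Quaternion.identity; }  // stub
}
```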

We also discussed and designed the first draft of the interaction map.

Artists kept refining the branding work and worked with programmers to create several UI experiments.

Over the weekend, several members of the team participated in the Global Game Jam.

22 Jan

Dev Blog – Week 2

This week, programmers experimented more with the Meta headset and found that it has a built-in pattern recognition module we can use to indicate where digital objects should be placed in 3D space; a minimal usage sketch follows the list below. However, the module is limited in several ways:

1. There are only 12 pre-defined markers, and it is not documented how to make new ones.
2. When a marker is far from the device (so it appears smaller in the camera image), tracking is not very stable.
3. The latency between the device and the computer is very noticeable.
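
For context, here is a minimal sketch of how marker-driven placement could look in Unity; the `TryGetMarkerPose` method is a hypothetical stand-in for the Meta SDK call, which is not reproduced here. Whenever the given marker is detected, the virtual object snaps to the reported pose:

```csharp
using UnityEngine;

// Keeps a virtual object snapped to a tracked marker. The tracking query below
// is a stand-in for whatever the headset SDK actually exposes.
public class MarkerAnchor : MonoBehaviour
{
    public int markerId = 0; // one of the pre-defined marker IDs

    void Update()
    {
        Vector3 position;
        Quaternion rotation;
        if (TryGetMarkerPose(markerId, out position, out rotation))
        {
            transform.position = position;
            transform.rotation = rotation;
        }
        // If the marker is lost (e.g. too far away), keep the last known pose.
    }

    // Placeholder for the SDK call; always reports "not found" in this sketch.
    bool TryGetMarkerPose(int id, out Vector3 position, out Quaternion rotation)
    {
        position = Vector3.zero;
        rotation = Quaternion.identity;
        return false;
    }
}
```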

In the meantime, artists created several variations of branding art, including posters, logos, the 1/2 sheet, and theming for the project room.

The team also discussed the project as a whole.

Background: (Why we need Abe and this project)
Medical simulation is having a transformative impact on health care, especially on experiential learning. Manikins for teaching heart and lung examinations have been used for decades, but there was no manikin for teaching abdominal examination until Abe came out.

Problem: (What is the new problem)
Teaching and learning palpation is a real pedagogical challenge as it requires the right combination of knowledge, skills, and attitude.
1. Mental: medical students need to be aware that patients may not be able or willing to describe their feelings precisely, may not feel comfortable being touched, and so on.
2. Physical: the arrangement of the parts of the body or figure, i.e. posture.

Solution: (How the team will solve the problem)
This project superimposes interactive 3D holograms on the physical Abe manikin, along with the corresponding text, videos, and details of the viscera.

Next week, the team will meet with the client to discuss all the new questions that have come up since the first meeting, and will try to solve the technical problems above.

15 Jan

Dev Blog – Week 1

This week, the team read through and discussed the material sent by the client and the instructors. The team also met with the client on Friday morning to better understand the client's expectations and requirements.

The programmers also experimented with the new AR device, the Meta headset, and were able to set up a demo project in Unity. Artists discussed theming and branding art.

On Friday afternoon, the team participated in the playtest workshop and developed more detailed thoughts about the project.

Next week, the team plans to make new prototypes to test the advantages and limitations of the device, and to produce the first draft of the branding art.