
Week 5: Everybody Was Kung Fu Fighting

The challenge we decided to tackle for this week was to create a multiplayer game that required movement around the display.

We decided to make a multiplayer fighting game, like Mortal Kombat or Super Smash Bros. However, we would be fighting cute, cuddly bears.

We decided to use an Android app as our input, since we could build our own controllers, use the various sensors already on the device, and determine rotation using the accelerometer and magnetic field sensors.
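For reference, the standard Android approach fuses those two sensors through SensorManager; here is a minimal Kotlin sketch (not our actual controller code) of extracting the phone’s rotation:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Fuses accelerometer + magnetic field readings into a device orientation.
class OrientationListener : SensorEventListener {
    private val gravity = FloatArray(3)
    private val geomagnetic = FloatArray(3)
    private val rotationMatrix = FloatArray(9)
    private val orientation = FloatArray(3)

    override fun onSensorChanged(event: SensorEvent?) {
        event ?: return
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> event.values.copyInto(gravity)
            Sensor.TYPE_MAGNETIC_FIELD -> event.values.copyInto(geomagnetic)
        }
        // Combine both readings into a rotation matrix, then extract
        // azimuth/pitch/roll (in radians) to use as controller state.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, gravity, geomagnetic)) {
            SensorManager.getOrientation(rotationMatrix, orientation)
            val azimuthRadians = orientation[0] // rotation around the vertical axis
            // send azimuthRadians (plus pitch/roll if needed) to the game here
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit
}
```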

We added the movements seen in the UI above, as well as a Block action (triggered by moving the phone quickly upwards) and a Special Attack, which you can use when the progress bar at the bottom is completely full (you fill it by yelling into the phone’s mic).
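To illustrate how those triggers could be detected, here is a hedged Kotlin sketch; the thresholds and meter increment are made-up values that would need tuning through playtesting:

```kotlin
// Hypothetical gesture detector; constants are illustrative, not tuned values.
class ControllerGestures(
    private val blockThreshold: Float = 12f, // m/s^2 of upward acceleration
    private val yellThreshold: Int = 20_000, // mic amplitude (0..32767)
    private val meterPerYell: Float = 0.05f,
) {
    var specialMeter = 0f // 0.0 .. 1.0, drives the progress bar
        private set

    /** Returns true when a fast upward flick should trigger a Block. */
    fun onLinearAcceleration(upwards: Float): Boolean = upwards > blockThreshold

    /** Feed mic amplitude samples; loud yells fill the Special meter. */
    fun onMicAmplitude(amplitude: Int) {
        if (amplitude > yellThreshold) {
            specialMeter = (specialMeter + meterPerYell).coerceAtMost(1f)
        }
    }

    fun specialReady(): Boolean = specialMeter >= 1f

    fun consumeSpecial() { specialMeter = 0f }
}
```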

Connecting the Android device to the game was surprisingly easy, and we will probably use this interaction again if another prototype calls for it.
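As a rough idea of how simple such a link can be, here is a hypothetical Kotlin sketch using plain UDP over the local network; the message format and address are assumptions, not our actual networking code:

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress

// Hypothetical wire format: one comma-separated line per input event.
// (On Android, run this off the main thread.)
class ControllerLink(host: String, private val port: Int) {
    private val socket = DatagramSocket()
    private val address: InetAddress = InetAddress.getByName(host)

    fun send(action: String, azimuth: Float) {
        val payload = "$action,$azimuth".toByteArray()
        socket.send(DatagramPacket(payload, payload.size, address, port))
    }
}

// Usage: ControllerLink("192.168.1.42", 9000).send("BLOCK", 1.57f)
```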

The game ended up being pretty fun, even just using the emulator on our desktop computer. We can’t wait to see if it would be just as fun on the Voxon device!

Learnings:

  • Use the VOLUME FRAME LIMIT as a game mechanic
    • Players can jump out of frame, but their position can still be implied
    • No mesh is needed for the floor of the battle arena
  • We need to test on the actual device to see whether players move around it as expected.
  • Using dynamic camera movements within the display as the players move around the arena (see the sketch after this list):
    • makes gameplay more dynamic
    • makes the area of gameplay, the ring or arena, feel bigger
    • allows zooming in on characters and animations to show details that would be impossible to see with a static camera
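To make that concrete, here is a hypothetical Kotlin framing helper; the function and constants are illustrative, not our implementation:

```kotlin
import kotlin.math.max
import kotlin.math.sqrt

data class Vec3(val x: Float, val y: Float, val z: Float)

// Hypothetical framing helper: center the view between the two fighters and
// zoom out as they separate, so both stay inside the display volume.
fun frameFighters(a: Vec3, b: Vec3, volumeSize: Float, margin: Float = 1.5f): Pair<Vec3, Float> {
    val center = Vec3((a.x + b.x) / 2f, (a.y + b.y) / 2f, (a.z + b.z) / 2f)
    val dx = a.x - b.x
    val dy = a.y - b.y
    val dz = a.z - b.z
    val separation = sqrt(dx * dx + dy * dy + dz * dz)
    // Never zoom past 1:1; shrink the scale when the fighters spread apart.
    val zoom = volumeSize / max(separation * margin, volumeSize)
    return center to zoom
}
```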


Week 4: Honey I Shrunk Myself!!

This week was all about how to get the person in VR in our asymmetrical co-op from last week to see the people around the display. The previous prototype was more about getting the two groups of players talking, but only one could see the other. What if they could BOTH see each other?

The previous week we had an idea: put a 360 camera on top of the display, run that feed into the VR skybox, and BAM! Better than FaceTime, right? So we decided to try it.

since there is a size discrepancy, it would feel very much like the movie

Implementing the 360 camera technology ended up being pretty straightforward. We were able to have the person in VR see everyone surrounding the camera (including themselves if they were in view of the camera – it was very trippy).
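In the engine this sampling usually happens in a shader, but the underlying math is simple. Assuming the 360 feed is an equirectangular frame, each view direction maps to a longitude/latitude texture coordinate, as in this Kotlin sketch:

```kotlin
import kotlin.math.PI
import kotlin.math.asin
import kotlin.math.atan2
import kotlin.math.sqrt

// Map a unit view direction to (u, v) texture coordinates in an
// equirectangular 360 frame: longitude -> u, latitude -> v.
fun directionToEquirectUV(x: Float, y: Float, z: Float): Pair<Float, Float> {
    val len = sqrt(x * x + y * y + z * z)
    val (nx, ny, nz) = Triple(x / len, y / len, z / len)
    val u = (atan2(nz, nx) / (2 * PI) + 0.5).toFloat() // longitude in [0, 1)
    val v = (asin(ny) / PI + 0.5).toFloat()            // latitude in [0, 1]
    return u to v
}
```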

The next step was to implement it with our previous Mission Control prototype, to add that extra communication element to help the puzzle go faster. The ‘Mission Control’ side (the Voxon display) would be able to physically point at different objects or paths that they wanted the VR player to take, rather than worry about giving vague directions.

view from the VR perspective

This worked out really well, and was a great morale boost for our team, but there are a few questions we still have for when we actually get the machine, as well as our learnings.

Questions:

  • What is the best way to represent the person who is in VR, in the Voxon display?
  • Is having the full 360 camera view the ideal view, or should parts of it be obscured by, say, walls with windows to see through?
  • How noticeable is the small lag between people moving around the display and the camera feed updating the skybox?
  • Would post-processing the 360 stream with distortion or filter effects add to the experience?

Learnings:

  • We can use physical props to obscure the bottom part of the feed; otherwise the virtual assets won’t mesh with the room environment
    • Or we can theme it narratively, such that the tiny VR person is surrounded by fruit flies or is climbing a small plant, with the 360 camera placed inside a real clay planter
  • Use windows instead of lights on the cylinder to obscure PARTS of the 360 footage, revealing the room only through the openings
  • Body language in the 360 feed is super important for communication between players in an asymmetric format
    • Can we add a Kinect to map their movements to an animation on the display?


Week 3: Spaceship Mission

One of the key strengths of the Voxon display is the ability for multiple people to stand around it and see the same thing from different angles. It also inherently looks futuristic so our minds immediately went to Spaceship!

we wanted to create a sort of mission control experience

We have also been interested in trying an asymmetrical co-op with a VR headset, forcing communication between the two sides, as the two technologies provide different perspectives.

How to merge these two ideas? The Voxon display would be Mission Control: its players would see the overall map, the bigger picture. The person wearing the VR headset would be the person actually on the mission. They could see the details of the space, but would need help from Mission Control on the bigger picture.

We designed the map puzzle below for the two experiences to work together: the VR player needs to go through different portals to reach different floors, grab the key, and get to the end.

The VR experience was designed to provide a sense of being on a spaceship, with dim lighting so the player would need the help of those using the Voxon display.

Left: VR, Right: Voxon Emulator

Our learnings from this prototype:

Using Asymmetrical Co-op:

  • It’s a cool and novel experience, but it isn’t very replayable, since once the puzzle is solved there isn’t much else to do
    • Could there be a more dynamic way for them to interact that could be played more than once and still be interesting?
  • It is a cool concept to be able to see someone inside the display, but what if the person inside could see the people outside the display?
    • Use a 360 camera to capture the players around the Voxon and paste them into the skybox of the VR
    • This allows the VR user to interact with the Voxon users, and forces the Voxon users to move around the VX1 display in order to point to where things are

VR Prototype:

  • We need a light source on the helmet or VR headset to give a limited view while dimming the surrounding ambient light
  • This gives more immediate direction to the VR player, but still renders the pathways as “concrete,” so it connects better to the Voxon display

Important Input Limitation:

  • Switching between colors and angles in the Emulator requires having the emulator screen in the foreground, so input will ONLY go to either the Emulator display or the VR environment
    • Therefore, the Voxon button could not be used as an input into the VR world
    • We need to test this with the actual Voxon display
  • Think about how the input mechanics of the Voxon display interface with the input mechanics of the digital game/experience
  • Inputs can only go to one or the other, but they can change the visual information, which is usable for asymmetry or co-op
    • e.g., playing a Switch game on a TV and using the TV remote to turn up the volume
  • It is best to switch the textures inside the Unity file to pair with the Voxon display’s limitations

5.1 Sound Learnings

  • Ambient soundscapes from within the digital experience add a LOT to the immersion of the experience
  • If we use surround sound, tethered to the player or the displayed environment so that it reacts dynamically and provides audio cues on what to do or where to go, then immersion will be crazy! (See the sketch below.)
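As a toy model of such dynamic cues, this hypothetical Kotlin helper pans and attenuates a sound based on where its source sits relative to the player; real spatial audio middleware would handle this far more thoroughly:

```kotlin
import kotlin.math.sqrt

// Hypothetical cue model: pan follows the horizontal direction to the source,
// gain falls off with distance, so players can locate objectives by ear.
data class AudioCue(val pan: Float, val gain: Float) // pan in [-1, 1], gain in [0, 1]

fun cueFor(listenerX: Float, listenerZ: Float, sourceX: Float, sourceZ: Float): AudioCue {
    val dx = sourceX - listenerX
    val dz = sourceZ - listenerZ
    val dist = sqrt(dx * dx + dz * dz)
    val pan = if (dist > 0f) dx / dist else 0f // left/right relative to listener
    val gain = 1f / (1f + dist)                // simple inverse falloff
    return AudioCue(pan, gain)
}
```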

Week 2: The Boat

The second week of classes was a time for us to explore different kinds of processes that we could use for the rest of the semester. We had a few hiccups: since the machine hadn’t arrived yet, we had to figure out how to use the Voxon Emulator (seen in the video) to recreate a similar type of experience to what we would see on the machine.

We had two goals for this first demo, our gold spike: 1) use an interesting interaction with the emulator, and 2) learn how our art assets would work on this type of display. Without the machine we can’t say for sure how everything will work, but we can use this demo to make some educated guesses.

For the first goal, we decided on finding a good interaction with the Leap Motion. This is a smaller screen that people will stand around, which made us think that interactions that feel ‘god-like’ will keep users engaged. We created a simple hand-wave motion detector that moves the boat as if you were controlling the actual wind or water. We also tested tracking your hand in water to make the experience feel more fluid.

testing how the feel of water changes the experience
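The mapping behind the hand-wave interaction is straightforward. Here is a hypothetical Kotlin sketch; PalmSample stands in for the Leap Motion SDK’s palm velocity, which we don’t reproduce here:

```kotlin
import kotlin.math.sqrt

// Hypothetical palm sample; real values would come from the Leap Motion SDK.
data class PalmSample(val vx: Float, val vy: Float, val vz: Float)

// Map hand-wave velocity to a force on the boat, with a dead zone so the
// boat only reacts to deliberate, wind-like sweeps rather than jitter.
fun boatForce(palm: PalmSample, deadZone: Float = 50f, scale: Float = 0.01f): Pair<Float, Float> {
    val speed = sqrt(palm.vx * palm.vx + palm.vz * palm.vz)
    if (speed < deadZone) return 0f to 0f
    return palm.vx * scale to palm.vz * scale
}
```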

The second goal provided us a lot of design insights on what could work well and what we think wouldn’t.

  • We found that black is the best color to use for occlusion, while other colors remain mostly transparent.
  • There isn’t a light source in the display, so all shadows need to be baked beforehand.
  • Cartoonish characters and objects seem to work better, as they are more recognizable at a small size than very detailed objects. A character has to have large body movements.
  • We started with the boat bigger in the beginning so the user can clearly see it’s a boat, then zoomed out so that the image is still in the user’s mind.
  • Increasing the ambient sound volume helps a lot with expanding the space and immersion, as it situates you within the environment of the display.

We ran into some design challenges that we still need to find workarounds for in the coming weeks:

  • The visuals are GPU-intensive, so operations like playing 2D video, changing textures more frequently than usual, or swapping models are harder
  • The frame rate will drop and lag, so we will need to balance the rate we want, and processing cost, against art, story, and design
  • The display is very small; how can we make the immersion feel larger than life?


Week 1: Voltech

Voltech is a student pitch project team at Carnegie Mellon’s Entertainment Technology Center (ETC).

We are researching the design strengths and weaknesses of a 3-D volumetric display as a platform for entertainment technology, primarily games and interactive storytelling.

We are looking into not only what can be done well on this type of display, but also what can only be done using this type of display.

These blog posts are where we will talk about the prototypes and experiments we work on each week, as well as the learnings we find along the way.

Our first week has been relatively slow, as we are all getting back from break and gearing up for the semester. We have downloaded the Voxon emulator on our computers and have gotten basic demos going.