
Week 7: Spring Break

Time for a relaxing break!

Just kidding!! Spring Break isn’t for another two weeks! But we did have a spring break… inside the machine. A special laser-cut spring made by Voxon for the machine broke, and we don’t have an immediate replacement.

Thankfully, Voxon was very nice about the whole ordeal and offered to expedite-ship us a new one (with a few extras, just in case) ASAP.

Unfortunately, it’s coming all the way from Australia, so it will take a few days to get here.

Luckily, we had five weeks of practice creating content for the machine without actually having it; this time, we’ve seen it in person and know a little better what works and what doesn’t.

With Halves next week, we’re taking everything we’ve learned so far and polishing our demos so they’re ready for our presentation!


Week 6: It Arrives!

Feb 18, 2019. The day the Voxon finally arrived. We were so excited!

What better way to test the machine than to fire it up? The Voxon comes preloaded with demos that we spent hours playing with.

The machine has some specific lighting needs, as you can easily see the reflection of our Christmas lights on the dome.
We finished off a very exciting week by putting our very own logo in the Voxon!

This first week we focused on learning as much about the machine as we could, mostly using the preloaded demos. Here are some of our learnings:

  • Cross sections are interesting; we could build an entire experience around zooming in and moving around the image itself (like a 3D Where’s Waldo)
  • Flicker is a definite issue:
    • Brighter colors make it more noticeable
    • Darker reds and blues make it less noticeable
  • Space Mouse is incredibly unintuitive
  • Lag gets higher with denser content
  • UI sucks
  • Hypnotic colorful psychedelic demos are the coolest
    • Particle effects and patterns
  • Stacking, jumping, and platforms are a cool mechanic given the height
  • Full bright multicolor is much better than monochrome for visual pop and identification
  • The platform is kind of limiting… good to know its constraints when considering it as a potential BVW or visual story platform

Week 5: Everybody Was Kung Fu Fighting

The challenge we decided to tackle for this week was to create a multiplayer game that required movement around the display.

We decided to make a multiplayer fighting game, like Mortal Kombat or Super Smash Bros. However, the fighters would be cute, cuddly bears.

We decided our input would be an Android app: we could build our own controllers, use the different sensors already on the device, and determine rotation using the accelerometer and magnetic field sensors.
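
For the curious, the standard Android pattern for this is to fuse the accelerometer and magnetometer readings through SensorManager. Here’s a minimal sketch; the class name and wiring are illustrative, not our exact implementation:

```kotlin
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative listener: register it for both TYPE_ACCELEROMETER and
// TYPE_MAGNETIC_FIELD, then read the fused orientation angles.
class PhoneOrientation : SensorEventListener {
    private val accel = FloatArray(3)
    private val magnet = FloatArray(3)
    private val rotationMatrix = FloatArray(9)
    private val orientation = FloatArray(3) // azimuth, pitch, roll in radians

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_ACCELEROMETER -> event.values.copyInto(accel)
            Sensor.TYPE_MAGNETIC_FIELD -> event.values.copyInto(magnet)
        }
        // Fuse the two readings into a rotation matrix, then Euler angles.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
            SensorManager.getOrientation(rotationMatrix, orientation)
        }
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) = Unit

    // Compass heading of the phone, usable as the player's facing direction.
    val azimuthDegrees: Float
        get() = Math.toDegrees(orientation[0].toDouble()).toFloat()
}
```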

We added the movements seen in the UI above, as well as a Block action (triggered by moving the phone quickly upwards) and a Special Attack, which you can use when the progress bar at the bottom is completely full (you fill it by yelling into the phone’s mic).
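
Here’s a rough sketch of how those two triggers could work; the thresholds and meter gain are made-up placeholders, not our actual tuning:

```kotlin
import kotlin.math.abs

// Placeholder thresholds, not our actual tuning values.
const val BLOCK_ACCEL_THRESHOLD = 15f      // m/s^2 along the phone's y-axis
const val YELL_AMPLITUDE_THRESHOLD = 20000 // 16-bit PCM peak that counts as a yell

// Block: fires when the phone is moved quickly upwards, i.e. the
// y-axis acceleration spikes past the threshold.
fun isBlockGesture(accel: FloatArray): Boolean =
    accel[1] > BLOCK_ACCEL_THRESHOLD

// Special meter: grow the 0..1 progress bar from the mic's peak amplitude,
// so sustained yelling fills it up.
fun fillSpecialMeter(current: Float, pcmBuffer: ShortArray): Float {
    val peak = pcmBuffer.maxOf { abs(it.toInt()) }
    val gain = if (peak > YELL_AMPLITUDE_THRESHOLD) 0.05f else 0f
    return (current + gain).coerceAtMost(1f)
}
```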

Connecting the Android device to the game was surprisingly easy, and we will probably reuse this interaction if another prototype calls for it.
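
One simple way to do this kind of link (not necessarily what our prototype used) is a tiny UDP socket that streams compact input messages from the phone to the game PC; the host, port, and message format below are placeholders:

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress

// Minimal one-way controller link over UDP (placeholder design).
class ControllerLink(host: String, private val port: Int) {
    private val socket = DatagramSocket()
    private val address = InetAddress.getByName(host)

    // Send one input event, e.g. "P1 PUNCH" or "P1 ROT 173.2".
    // On Android, call this off the main thread.
    fun send(message: String) {
        val bytes = message.toByteArray()
        socket.send(DatagramPacket(bytes, bytes.size, address, port))
    }
}

// Usage (placeholder address): ControllerLink("192.168.0.42", 9000).send("P1 BLOCK")
```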

The game ended up being pretty fun, even just using the emulator on our desktop computer. We can’t wait to see if it would be just as fun on the Voxon device!

Learnings:

  • Use the VOLUME FRAME LIMIT as a game mechanic
    • Players can jump out of frame, but their position can still be implied
    • There is no mesh for the floor of the battle arena
  • We need to test on the device to see whether players actually move around it as expected.
  • Using dynamic camera movements within the display as the players move around the arena (a rough sketch of the math follows this list):
    • makes gameplay more dynamic
    • makes the area of gameplay, the ring or arena, feel bigger
    • allows zooming in on characters and animations to show details that would be impossible to see with a static camera
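
For the curious, here is a rough, engine-agnostic sketch of that camera logic; the Vec3 type and all constants are illustrative, not our prototype’s actual values:

```kotlin
import kotlin.math.sqrt

// Minimal vector type for the sketch.
data class Vec3(val x: Float, val y: Float, val z: Float) {
    operator fun plus(o: Vec3) = Vec3(x + o.x, y + o.y, z + o.z)
    operator fun minus(o: Vec3) = Vec3(x - o.x, y - o.y, z - o.z)
    operator fun times(s: Float) = Vec3(x * s, y * s, z * s)
    fun length() = sqrt(x * x + y * y + z * z)
}

// Camera looks at the midpoint between the two fighters.
fun cameraTarget(p1: Vec3, p2: Vec3): Vec3 = (p1 + p2) * 0.5f

// Zoom is 1.0 when the fighters are close and shrinks toward minZoom as
// they spread apart, so both always stay inside the display volume.
// The 10f divisor is an arbitrary arena-width scale.
fun cameraZoom(p1: Vec3, p2: Vec3, minZoom: Float = 0.4f): Float {
    val spread = (p1 - p2).length()
    return (1f - spread / 10f).coerceIn(minZoom, 1f)
}
```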


Week 4: Honey I Shrunk Myself!!

This week was all about how to get the person in VR in our asymmetrical co-op from last week to see the people around the display. The previous prototype was more about getting the two groups of players talking, but only one could see the other. What if they could BOTH see each other?

The previous week we had an idea: put a 360 camera on top of the display, run that feed into the VR skybox, and BAM! Better than FaceTime, right? So we decided to try it.

since there is a size discrepancy, it would feel very much like the movie

Implementing the 360 camera technology ended up being pretty straightforward. We were able to have the person in VR see everyone surrounding the camera (including themselves if they were in view of the camera – it was very trippy).
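
The core of that skybox mapping is the standard equirectangular projection: every view direction lands on a UV coordinate in the 360 frame. Here’s a tiny sketch of the math, assuming a normalized direction vector (this is the textbook formula, not our actual shader):

```kotlin
import kotlin.math.PI
import kotlin.math.asin
import kotlin.math.atan2

// Map a normalized view direction (x, y, z) to UV coordinates in an
// equirectangular 360 frame, both in the range 0..1.
fun directionToEquirectUV(x: Float, y: Float, z: Float): Pair<Float, Float> {
    val u = (atan2(z.toDouble(), x.toDouble()) / (2 * PI) + 0.5).toFloat()
    val v = (asin(y.toDouble()) / PI + 0.5).toFloat()
    return u to v
}
```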

The next step was to implement it with our previous Mission Control prototype, to add that extra communication element to help the puzzle go faster. The ‘Mission Control’ side (the Voxon display) would be able to physically point at different objects or paths that they wanted the VR player to take, rather than worry about giving vague directions.

view from the VR perspective

This worked out really well and was a great morale boost for our team, but we still have a few questions for when we actually get the machine, along with our learnings.

Questions:

  • What is the best way to represent the person who is in VR, in the Voxon display?
  • Is having the full 360 camera view the ideal view, or should parts of it be obscured by, say, walls with windows to see through?
  • How noticeable is the small lag in the camera updating the skybox when people are moving around the display?
  • Would post-processing the 360 stream for distortion or filter effects add to the experience?

Learnings:

  • We can use physical props to obscure the bottom part of the feed; otherwise, the virtual assets won’t mesh with the room environment
    • Or we can theme it narratively, such that the tiny VR person is surrounded by fruit flies or is climbing a small plant, with the 360 camera placed inside a real clay planter
  • Use windows instead of lights on the cylinder to obscure PARTS of the 360 footage, revealing it only through the openings
  • Body language in the 360 feed is super important for communication between the players in an asymmetric format
    • Can we add a Kinect to map their movements to an animation on the display?


Week 3: Spaceship Mission

One of the key strengths of the Voxon display is the ability for multiple people to stand around it and see the same thing from different angles. It also inherently looks futuristic, so our minds immediately went to Spaceship!

we wanted to create a sort of mission control experience

We had also been interested in trying an asymmetrical co-op with a VR headset, forcing communication between the two sides, since the two technologies provide different perspectives.

How to merge these two ideas? The Voxon display would be like Mission Control: its players would see the overall map, the bigger picture. The person wearing the VR headset would be the person actually on the mission. They could see the details of the space, but would need help from Mission Control with the bigger picture.

We designed the map puzzle below so the two experiences would have to work together: the player needs to go through different portals to reach different floors, grab the key, and get to the end.

The VR experience was designed to provide a sense of being on a spaceship, with dim lighting so the player would need the help of those using the Voxon display.

Left: VR, Right: Voxon Emulator

Our learnings from this prototype:

Using Asymmetrical Co-op:

  • A cool and novel experience, but not very replayable, since once the puzzle is solved there isn’t much else to do
    • Could there be a more dynamic way for them to interact that could be played more than once and still be interesting?
  • It is a cool concept to be able to see someone inside the display, but what if the person inside could see the people outside the display?
    • Use a 360 camera to capture the players around the Voxon and paste the feed into the skybox of the VR world
    • This lets the VR user interact with the Voxon user, and forces the Voxon user to move around the VX1 display in order to point at where things are

VR Prototype:

  • We need a light source from the helmet or VR headset to give a limited view while dimming the surrounding ambient light
  • This gives more immediate direction to the VR player but still renders the pathways as “concrete,” so it connects better to the Voxon display

Important Input Limitation:

  • Switching between colors and angles in the Emulator requires having the emulator window in the foreground, so input will ONLY go to either the Emulator display or the VR environment
    • Therefore, the Voxon’s button could not be used as an input into the VR world
    • We need to test this with the actual Voxon display
  • Think about how the input mechanics of the Voxon display interface with the input mechanics of the digital game/experience
  • Inputs can only go to one or the other, but they can still change the visual information, which is usable for asymmetry or co-op
    • E.g., playing a Switch game on a TV while using the TV remote to turn up the volume
  • It’s best to switch the textures inside the Unity file to pair with the Voxon display’s limitations

5.1 Sound Learnings

  • Ambient soundscapes from within the digital experience add a LOT to the immersion of the experience
  • If we use surround sound, with sources tethered to the player or the displayed environment so that they react dynamically and provide audio cues about what to do or where to go, the immersion will be crazy!