#6: Brainstorming Character Controls

  • Ideation:
    • Brain Board:
      • How to build this off of the initial character controller setup mapped to a traditional Mario-style gamepad (and the track-switch button)
        • Also how to fit this mechanic narratively into story
      • Playtest with a single gear ‘axon’ and a single conductive LEGO ‘axon’
        • Buy and test conductive paint for wear-and-tear
        • See which is more fun, or whether we can combine the two concepts into one full mechanic
      • What types of behavior can we affect?
        • 5 senses and movement
        • Just movement
        • Just senses (visual, auditory, haptic, olfactory)
        • Jumping (height, double jumps, etc.)
        • Different power-ups (like mushrooms/stars – size, speed, shape, form, color, power, jump height, etc.)
      • Brain board should enable live LEGO interactive play
      • Is there still a role for a person to control the dog?
        • Just simulation or direct control?
      • Playtest to see whether the game structure overwhelms the LEGO building or is just right
    • Visual effects for projection only (not affecting gameplay), perhaps for things like block-shape hints and a stud light show

We talked to the client further about the design developments and found that they really liked the idea of a leaderboard or online saving. One thing to consider: since this installation will sit in the library for an extended period, how can we use the history and length of the gameplay to memorialize community building? They also reminded us about the physical cart design, which should be inviting and implicitly give visitors permission to play.

Art Progress

Our artists have created models of Scotty and the students for our game. We realized that the black version of Scotty would not display well when projected, so we created a white version of Scotty and gave the students white hoodies. Here you can also see a LEGO-style student and a non-LEGO-style student.

Tech Progress

Our programmers have succeeded in correcting the depth data and voxelizing it! It’s running extremely slowly, but here you can see the steps in the detection system: we start from the Kinect depth image, then obtain a perspective-corrected view of the LEGO baseboard. Next we correct the depth (between the second and fourth images you can see, from the lightness of the color, that the bottom-left corner is no longer considered “nearer”), and finally we downsample the image with a nearest-point filter to obtain the pixelized height map used for voxel generation. A rough sketch of this pipeline appears below.
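
To make the pipeline concrete, here is a minimal sketch of the idea, assuming OpenCV and NumPy. The function name, the 32x32 stud grid, the warp size, and the pre-captured empty-board reference frame are hypothetical stand-ins, not our actual implementation.

```python
import numpy as np
import cv2

def baseboard_height_map(depth_mm, board_corners_px, empty_board_mm,
                         studs=(32, 32), warp_size=(320, 320)):
    """Turn a Kinect depth frame into a per-stud LEGO height map (sketch).

    depth_mm        : HxW uint16 depth image in millimeters
    board_corners_px: 4x2 float32 pixel coords of the baseboard corners
                      (TL, TR, BR, BL) in the depth image
    empty_board_mm  : warp_size-shaped depth image of the *empty* baseboard,
                      already warped, used as the depth-correction reference
    studs           : stud resolution of the baseboard (hypothetical 32x32)
    """
    # 1. Perspective correction: map the four board corners to a square.
    dst = np.float32([[0, 0], [warp_size[0] - 1, 0],
                      [warp_size[0] - 1, warp_size[1] - 1],
                      [0, warp_size[1] - 1]])
    H = cv2.getPerspectiveTransform(np.float32(board_corners_px), dst)
    warped = cv2.warpPerspective(depth_mm.astype(np.float32), H, warp_size)

    # 2. Depth correction: subtract the empty-board reference so a flat
    #    board reads ~0 everywhere (corners no longer appear "nearer").
    height = empty_board_mm.astype(np.float32) - warped
    height = np.clip(height, 0, None)  # bricks rise above the board

    # 3. Downsample with nearest-point filtering to one sample per stud.
    height_map = cv2.resize(height, studs, interpolation=cv2.INTER_NEAREST)

    # 4. Quantize to brick units (a standard brick is ~9.6 mm tall)
    #    so the voxel generator receives integer heights per stud.
    return np.rint(height_map / 9.6).astype(np.int32)
```

Quantizing to integer brick heights at the end means the voxel generator only has to look at a small stud-resolution grid each frame, which is where most of the per-frame cost savings would likely come from.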

Previous post #5: Rapid Prototyping Sprint
Next post #7: Paper Prototyping and Playtesting