A large amount of software runs collaboratively in real time to make our robots "tick." Because IPC (discussed below) is networked, we can easily offload processing onto multiple computers to harness greater computing power. Quasi uses the combined force of three 3 GHz Pentium 4-based PCs to put on the show.

Inter-process Communication
Developed at Carnegie Mellon's Robotics Institute, IPC provides a subscription model that lets applications pass messages to each other via a central server process. As long as applications can reach the central server over a TCP/IP network, they can communicate. Each application registers with the central server and specifies which types of messages it publishes and which types it listens for. Any message passed to the central server is immediately forwarded to every process subscribed to its type. This model makes the system easy to expand: applications can be distributed among many physical computers, or multiple copies can be run to drive multiple outputs.
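The subscription model can be sketched in miniature. The Python analogue below is purely illustrative and is not the CMU IPC C API; the class and message-type names are invented for the example.

```python
class CentralServer:
    """Minimal publish/subscribe hub (illustrative sketch, not the real
    CMU IPC API). Applications register handlers for message types; any
    published message is forwarded to every subscriber of that type."""

    def __init__(self):
        self.subscribers = {}  # message type -> list of handler callbacks

    def register(self, msg_type, handler):
        """An application declares which message type it listens for."""
        self.subscribers.setdefault(msg_type, []).append(handler)

    def publish(self, msg_type, payload):
        """Forward the message to every process subscribed to its type."""
        for handler in self.subscribers.get(msg_type, []):
            handler(payload)

server = CentralServer()
received = []
server.register("SENSOR_IR", received.append)   # a listening application
server.publish("SENSOR_IR", {"distance_cm": 42})  # a publishing application
```

In the real system, each handler would live in a separate process reached over TCP/IP; the central server's forwarding role is the same.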

Character State Control System
The backend decision mechanism for a character running on the Interbots Platform depends on a set of system inputs and a network of related states and transition rules. Inputs are split into internal "factors" and external "facts." On the backend, this behavioral model is stored in a database format. This behavioral database is accessed in two very different ways by two separate applications, the Behavior Authoring Tool (a content creation tool) and the Character State Control System. The Behavior Authoring Tool (BAT) is the interface through which a content creator develops personalities and behaviors for a character. The Character State Control System (CSCS) is the decision making software that controls the character when it is performing. Effectively, BAT is an offline pre-programming tool while CSCS is an online runtime execution tool.

To fully understand BAT, CSCS, and the general theory behind the autonomous behavior available with the Interbots Platform, the terms "fact" and "factor" need further explanation. Facts represent knowledge acquired from the character's external environment, typically an input such as sound, video, button presses, or IR readings. Facts can be "direct" (e.g. a distance reading from an IR sensor), or they can be interpreted indirectly from an input. An example of an indirect fact might be "Someone is standing in front of me," determined from the direct IR sensor reading. Factors represent the character's internal environment, including the character's emotional state. For example, the character Quasi has five emotions or motivations (happiness, aggression, loneliness, cleanliness, and attention), each represented on a numerical scale. Factors are the only elements of persistent state stored in the behavioral database.
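The fact/factor split might be represented as follows. The field and key names here are illustrative, not the platform's actual schema; only the five factor names come from the text.

```python
from dataclasses import dataclass, field

@dataclass
class CharacterState:
    """Sketch of the fact/factor split (field and key names are illustrative)."""
    facts: dict = field(default_factory=dict)    # external knowledge; not persisted
    factors: dict = field(default_factory=dict)  # internal state; persisted in the DB

state = CharacterState(
    factors={"happiness": 80, "aggression": 10, "loneliness": 5,
             "cleanliness": 60, "attention": 30})

# A direct fact straight from a sensor...
state.facts["ir_distance_cm"] = 45
# ...and an indirect fact interpreted from it.
state.facts["person_present"] = state.facts["ir_distance_cm"] < 100
```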

CSCS handles the changing of character state at run time based on a character's behavioral database. An Interbots Platform character's behavioral database is, at the highest level, a set of "superstates," which can be thought of as categories of behaviors. The original Quasi personality included Idle, Sleep, Greeting, Farewell, and GiveCandy superstates. An Interbots character always has a default superstate, the "Idle" state. Each superstate (except Idle) has a set of "entry requirements," or fact and factor requirements that must be true for the character to enter that superstate. If the character's state does not match any superstate's entry requirements, CSCS defaults to the Idle state.

Superstates also have "stay requirements," or requirements that must remain true for a character to keep progressing through that superstate. If at any point the character's state fails to meet the stay requirements (or if the character completes a superstate), CSCS tries to find the next appropriate superstate. Superstates are not connected to each other in any explicit way. Whenever CSCS changes superstates, it considers all possible superstates, finds those whose entry requirements match the character's current state, and then randomly picks one from those with the highest priority setting.
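The superstate-selection rule described above can be sketched as follows. The data structure and field names are assumptions for illustration; the match-then-pick-among-highest-priority logic is from the text.

```python
import random

def select_superstate(superstates, state, idle):
    """Among superstates whose entry requirements match the current
    facts/factors, pick randomly from the highest-priority candidates;
    fall back to the Idle superstate when none match."""
    candidates = [s for s in superstates if s["entry_req"](state)]
    if not candidates:
        return idle
    top = max(s["priority"] for s in candidates)
    return random.choice([s for s in candidates if s["priority"] == top])

# Illustrative behavioral database: two superstates plus the Idle default.
idle = {"name": "Idle", "priority": 0, "entry_req": lambda s: True}
greeting = {"name": "Greeting", "priority": 2,
            "entry_req": lambda s: s.get("person_present", False)}
sleep = {"name": "Sleep", "priority": 1,
         "entry_req": lambda s: s.get("hour", 12) >= 22}

chosen = select_superstate([greeting, sleep], {"person_present": True}, idle)
```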

Within each superstate is a network of substates, which can be thought of as a traditional finite state machine. Substates are explicitly connected via "transitions," and the starting substate of each superstate is marked as the "entry substate." At each substate two important things happen. First, adjustments are made to the character's internal factors. Second, "actions" take place. Actions are sequences of output that the character executes, including events such as character animations, lighting changes, sounds, candy dispensing, displaying something on a video screen, or even explicitly jumping to a different superstate. Each substate contains a list of these actions, each of which contains a timeline of events. One action is selected from the list, and CSCS sends the matching event messages to the output applications through IPC. After a substate is completed, CSCS considers all transitions leading from that substate; if there are none, CSCS exits the current superstate. Transitions can be weighted to control their probability of being executed, and they have transition requirements much like superstates have entry requirements.
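Weighted transition selection at the end of a substate might look like the following sketch; the field names are assumptions, while the weight-and-requirement behavior is from the text.

```python
import random

def next_transition(transitions, state):
    """Pick among transitions whose requirements hold, weighted by their
    probability weights. Returning None means no transition applies,
    in which case CSCS exits the current superstate."""
    eligible = [t for t in transitions if t["req"](state)]
    if not eligible:
        return None
    weights = [t["weight"] for t in eligible]
    return random.choices(eligible, weights=weights, k=1)[0]

# Two illustrative transitions out of a substate.
wave = {"to": "WaveSubstate", "weight": 3, "req": lambda s: True}
bow = {"to": "BowSubstate", "weight": 1,
       "req": lambda s: s.get("happiness", 0) > 50}

picked = next_transition([wave, bow], {"happiness": 10})  # only "wave" is eligible
```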

Real-time Show Control System
The Real-time Show Control System (RSCS) talks to all output hardware: mainly the robot control system, lighting, and the candy dispenser. The control chain starts with a Color Kinetics SmartJack connected via USB, which generates the DMX signal that all of the hardware listens to. DMX is a popular protocol, similar to MIDI, used in clubs, theaters, and theme parks. Originally intended to control lighting, it has since expanded to control motors, special effects, and even fireworks. A single DMX line supports up to 512 channels of device control; we use 72 for Quasi and his additional hardware. RSCS reads the animation files exported from Maya and plays them back, blending them in real time.
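The channel layout can be illustrated with a small frame-packing sketch. The function and names below are hypothetical; only the 512-channel universe size and the 72-channel figure come from the text.

```python
def build_dmx_frame(channel_values, universe_size=512):
    """Pack per-device channel levels (1-based channel -> 0-255 level)
    into a single DMX universe frame. Illustrative sketch of how a show
    controller might assemble the data it sends down the DMX line."""
    frame = bytearray(universe_size)  # unused channels default to level 0
    for channel, level in channel_values.items():
        if not 1 <= channel <= universe_size:
            raise ValueError(f"DMX channel {channel} out of range")
        frame[channel - 1] = max(0, min(255, level))  # clamp to one byte
    return bytes(frame)

# Hypothetical assignment: channel 1 a lighting dimmer, channel 72 the
# last of Quasi's 72 channels.
frame = build_dmx_frame({1: 255, 72: 128})
```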

A recent addition alongside RSCS is the Virtual Show Control System (VSCS), which provides a virtual means of previewing the interactive experience in its current state. VSCS provides the same functionality as RSCS, except that the end recipient is not a physical character but a virtual 3D animated character running in the open-source Panda3D simulation engine. This gives developers a simple way to test interactions quickly without a physical character; content creators can devise interactions even while the physical figure is being constructed or is out of commission.

cVision Face Tracking
The Interbots Platform includes a face tracking tool called "cVision," built on Intel's OpenCV computer vision library. We believe the ability to detect faces is a crucial form of input for an interactive character designed to interface with humans. In the Interbots Platform, cVision's face information is the most important factor in determining the "point of highest interest" for the character. This point then influences where the character "looks," i.e., its gaze target. For Quasi, the gaze target is allowed to override control of his neck, allowing him to appear to be looking directly at the face of a guest. The variation in movement induced by this method also lends an element of realism and believability to the character's motion.
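As a sketch of how a detected face might become a gaze target: the mapping below (largest face wins, linear mapping of the face center to normalized pan/tilt offsets) is an assumption for illustration, not the platform's actual algorithm.

```python
def gaze_target(faces, frame_w, frame_h):
    """Given detected face rectangles (x, y, w, h) in pixel coordinates,
    pick the largest face as the point of highest interest and map its
    center to normalized pan/tilt offsets in [-1, 1], where (0, 0) is
    the center of the camera frame. Illustrative sketch only."""
    if not faces:
        return None  # no face: gaze control falls back to the animation
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest by area
    cx, cy = x + w / 2, y + h / 2
    pan = (cx / frame_w) * 2 - 1
    tilt = (cy / frame_h) * 2 - 1
    return pan, tilt
```

In a real pipeline the rectangles would come from an OpenCV face detector running on the camera feed, and the pan/tilt offsets would be scaled to the neck's range of motion.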

iButton Reader
The iButton Reader application interfaces with a low-cost iButton reader module over RS-232. Residents in the building carry iButton tags on their key chains; when a tag is pressed to the reader on Quasi's kiosk, it allows him to identify specific people and greet them accordingly. iButtons provide a unique identification number that can be correlated to a user database. Their greatest benefit is the low cost and ruggedness of the tag and reader, which is critical when people carry one with them at all times. We are also considering fingerprint readers and active RFID systems to identify people in a more intuitive fashion.

MIDI Reader
In order for Quasi to perceive his environment, he must be able to read values from his sensors. The MIDI Reader application listens to values streaming in from a custom circuit board that translates analog voltages to MIDI. It then translates the MIDI signals into IPC messages and passes them to CSCS. While the custom board is the only device attached to the control chain in the current Quasi implementation, MIDI is widely used in art and music, so reading these values means the system could potentially let Quasi jam along with a human playing a musical instrument, or even synthesize his own music on any device that speaks MIDI.
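The MIDI-to-IPC translation step might look like this sketch. The 0xB0 control-change status byte and the 0-127 data range are standard MIDI; the IPC message fields are invented for illustration, not the platform's actual message format.

```python
def midi_to_ipc(status, data1, data2):
    """Translate one 3-byte MIDI message into a hypothetical IPC sensor
    message. Control-change messages (status 0xBn) are treated as sensor
    readings: data1 is the sensor number, data2 the 7-bit value."""
    if status & 0xF0 == 0xB0:  # MIDI control change
        return {"type": "SENSOR_VALUE",
                "sensor": data1,
                "value": data2 / 127.0,      # normalize 0-127 to 0.0-1.0
                "midi_channel": status & 0x0F}
    return None  # other MIDI messages are ignored in this sketch
```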

Babble
The Babble application is a simple implementation of the open-source Sphinx speech recognition package developed at Carnegie Mellon University, enhanced with the capability to pass messages over IPC. Babble listens for and captures chunks of sound louder than the base ambient noise, then analyzes the captured waveforms to identify phonemes. The identified phonemes are matched against a database of predefined words, and any matches are passed to CSCS over IPC. This system works fairly well for simple voice responses to Quasi's questions.
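The phoneme-to-word matching step could be sketched naively as follows. This is an illustrative matcher, not Sphinx's actual decoding algorithm, and the lexicon contents are invented.

```python
def match_words(phonemes, lexicon):
    """Match a recognized phoneme sequence against a predefined word list.
    Naive sketch: a word matches when its phoneme sequence appears
    contiguously in the recognized sequence."""
    hits = []
    for word, word_phones in lexicon.items():
        n = len(word_phones)
        for i in range(len(phonemes) - n + 1):
            if phonemes[i:i + n] == word_phones:
                hits.append(word)  # would be sent to CSCS over IPC
                break
    return hits

# Hypothetical lexicon of simple voice responses.
lexicon = {"yes": ["Y", "EH", "S"], "no": ["N", "OW"]}
```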

ipcXtra for Director
The ipcXtra is a plugin for Macromedia Director that allows it to communicate with the IPC message-passing system. As a result, we can leverage the power of Director to author rich interactive media content with two-way communication to CSCS.