UI Research at the Carnegie Science Center

Over this semester we have been doing a good amount of research on concepts and user interfaces that can help us create a better final product. Our idea is to create a flexible, dynamic, and intuitive interface that exports animations to HERB. The interface we are trying to develop gives the operator the ability to compose actions before a performance as well as tweak them in real time.
There are several projects with similar concepts in terms of robot manipulation, and one of them is located at the Carnegie Science Center in Pittsburgh.
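As a rough illustration of the compose-then-export idea, here is a minimal sketch of the kind of data model we have in mind: a performance is just a list of timed actions that can be serialized and handed off to the robot side. The action names and the JSON export format below are assumptions for illustration, not HERB's actual interface.

```python
# Minimal sketch of a composed performance as a list of timed actions.
# Field names and the JSON layout are illustrative assumptions only.
import json
from dataclasses import dataclass, asdict
from typing import List


@dataclass
class Action:
    name: str        # e.g. "wave_left_arm" (hypothetical action name)
    start: float     # seconds from the beginning of the performance
    duration: float  # seconds


@dataclass
class Performance:
    title: str
    actions: List[Action]

    def to_json(self) -> str:
        # Serialize the composition so it can be exported to the robot side.
        return json.dumps({
            "title": self.title,
            "actions": [asdict(a) for a in self.actions],
        }, indent=2)


show = Performance("demo", [Action("wave_left_arm", 0.0, 2.5),
                            Action("nod_head", 1.0, 1.5)])
print(show.to_json())
```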

A couple of weeks ago we had the opportunity to visit this installation and gained some valuable insights into the interface and the setup of this interesting robot. Below are some pictures we took during our visit.

[Photo]
The robot is a human-sized puppet with a fully expressive face/head and arms/hands.

[Photo]
There is a special live interface that lets you manipulate the robot's movements in real time. One of its main features is the ability to puppeteer the head while watching live video streamed from the head itself.
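For context, this is roughly what a real-time puppeteering loop like that boils down to: read the operator's input, map it to a head pose, and send it at a fixed rate. The read_head_input and send_head_pose helpers below are hypothetical placeholders, not the installation's actual API.

```python
# Rough sketch of a fixed-rate puppeteering loop; the input and output
# helpers are placeholders for whatever device and robot API are used.
import time

RATE_HZ = 30  # assumed update rate


def read_head_input():
    # Placeholder: would return normalized (-1..1) pan/tilt from an input device.
    return 0.0, 0.0


def send_head_pose(pan_deg, tilt_deg):
    # Placeholder: would send the pose to the robot's head controller.
    pass


def puppeteer_loop(max_pan=45.0, max_tilt=30.0):
    period = 1.0 / RATE_HZ
    while True:
        pan_in, tilt_in = read_head_input()
        # Scale normalized input to the head's joint limits before sending.
        send_head_pose(pan_in * max_pan, tilt_in * max_tilt)
        time.sleep(period)
```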

[Photo]
There is a library of pre-composed performances. It seems that originally there was a feature that let people save compositions to the library slots. This is the type of feature we are implementing in our own interface.
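As a sketch of how we might implement those save slots ourselves, each slot can simply be a named file holding a serialized composition. The on-disk layout and slot naming below are assumptions for illustration.

```python
# Minimal sketch of library save slots: one JSON file per slot.
import json
from pathlib import Path

LIBRARY_DIR = Path("library")  # assumed location for saved compositions


def save_slot(slot: int, composition: dict) -> None:
    # Write the composition into the chosen slot, creating the library if needed.
    LIBRARY_DIR.mkdir(exist_ok=True)
    (LIBRARY_DIR / f"slot_{slot}.json").write_text(json.dumps(composition))


def load_slot(slot: int) -> dict:
    # Read a previously saved composition back from its slot.
    return json.loads((LIBRARY_DIR / f"slot_{slot}.json").read_text())
```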

[Photo]
The main composing tool is based on timeline functionality similar to what you find in After Effects and Premiere.
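To make the idea concrete, here is a rough sketch of what such a timeline boils down to as a data structure: clips placed at start times, plus a query for whatever is active at the current playback position. The names are illustrative only, not the installation's implementation.

```python
# Sketch of a timeline: clips at start times, queried by playback time.
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class Clip:
    animation: str  # name of a library animation
    start: float    # seconds
    duration: float


class Timeline:
    def __init__(self) -> None:
        self.clips: List[Clip] = []

    def add(self, clip: Clip) -> None:
        # Keep clips ordered by start time so playback can scan them in order.
        self.clips.append(clip)
        self.clips.sort(key=lambda c: c.start)

    def active_at(self, t: float) -> Optional[Clip]:
        # Return the clip playing at time t, if any.
        for clip in self.clips:
            if clip.start <= t < clip.start + clip.duration:
                return clip
        return None
```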

[Photo]
There is a set of animations and particular actions that can be dragged and dropped into the timeline.
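Dropping an animation onto the timeline then just means converting the drop position in pixels into a time and inserting a clip there. The sketch below reuses the Timeline and Clip types from the timeline sketch above; the pixels-per-second scale and default duration are assumptions.

```python
# Sketch of a drop handler: map a pixel position to a timeline time,
# then insert the dropped library animation as a clip at that time.
def drop_animation(timeline: "Timeline", animation: str, drop_x_px: float,
                   pixels_per_second: float = 50.0,
                   default_duration: float = 2.0) -> None:
    start_time = drop_x_px / pixels_per_second
    timeline.add(Clip(animation, start_time, default_duration))

# Example: dropping "wave_left_arm" 150 px from the left edge places it at 3 s.
```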