Animation System Design

The straightforward way to design an animation system for HERB is a fixed pose-to-pose design, as illustrated in the following image.

Pose-to-pose transition graph

 

The disadvantages of this approach are:

a. It is a fixed graph, so replaying a sequence means passing back through a set of unwanted poses. The graph would also have to be huge to support every line of the play, since everything is hand animated, and a graph that large is hard for the operator to parse visually.

b. If the director wants to see a different version of an animation or pose, for example something faster or slower, this design provides no way to do that.

c. It forces the director to adapt to the technology rather than taking a more user-centric approach.

We looked for another solution that could address at least some of these problems, starting with flexibility. Based on our research (specifically the perception research I posted about earlier and, further, The Effect of Posture and Dynamics on the Perception of Emotion), we came up with the idea of parameterized animations. The workflow is to hand animate a gesture or action and then use a Blender script I have written to generate parameterized versions of that main action.

The two parameters we chose based on the research paper are ‘openness’ and ‘intensity.’ Intensity is essentially the speed of the animation; this follows the paper’s finding that altering the speed of a motion alters the intensity of the perceived emotion. For example:

 

More Intense

 

Less Intense

We define openness as the amplitude of the joint angles, or the distance of the limbs from the body. This is also based on the paper’s finding that blending a motion with a more neutral motion, and thus altering the pose, changes the perceived emotion. Here is an animated example of this concept as well.

 

Less Open

More Open
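The generation step can be sketched roughly as follows. This is a simplified, hypothetical version of the idea, not the actual Blender script: real keyframes would live on the robot's joint F-curves, but here they are plain (time, joint_angles) pairs.

```python
# Hypothetical sketch of the parameterization step (not the actual
# Blender script). Keyframes are (time, joint_angles) pairs.

def scale_intensity(keyframes, speed):
    """Slow an animation down by stretching keyframe times.
    speed < 1.0 yields a slower, less intense version."""
    return [(t / speed, angles) for t, angles in keyframes]

def scale_openness(keyframes, openness, neutral):
    """Blend joint angles toward a neutral pose.
    openness = 1.0 keeps the original; 0.0 collapses to neutral."""
    return [
        (t, [n + openness * (a - n) for a, n in zip(angles, neutral)])
        for t, angles in keyframes
    ]

# Example: a one-joint gesture, blended halfway toward a neutral
# angle of 0.0 and slowed to half speed.
gesture = [(0.0, [0.0]), (1.0, [1.2]), (2.0, [0.4])]
softer = scale_openness(gesture, 0.5, neutral=[0.0])
slower = scale_intensity(softer, 0.5)
```

Because we hand animate the extreme version, every generated variant only ever shrinks amplitudes and stretches times, which is what lets us validate limits once at the top end.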

Now we define an animation set as the main original gesture plus n1 less open versions, together with n2 slower versions of each of these (n1 + 1) animations, for a total of (n1 + 1)(n2 + 1) animations including the original. We hand animate the most extreme version, at the upper limit of the robot’s physical capabilities, i.e. the fastest and most open joint angles, so that only that version has to be tested for collisions and velocity limits.
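As a worked example of that count, the full grid of openness and speed levels, including the original, has (n1 + 1) * (n2 + 1) entries (the parameter values below are arbitrary):

```python
from itertools import product

# Each animation in the set is identified by an (openness, speed)
# level pair; level 0 on each axis is the hand-animated original.
def animation_set(n1, n2):
    openness_levels = range(n1 + 1)  # original + n1 less open versions
    speed_levels = range(n2 + 1)     # original + n2 slower versions
    return list(product(openness_levels, speed_levels))

variants = animation_set(n1=2, n2=3)  # (2 + 1) * (3 + 1) = 12 animations
```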

Ideally, any animation would be able to lead into any other animation, providing maximum flexibility and avoiding the closed-graph problems, but this is not technically feasible because:

a. It would require us to programmatically generate every transition between n animation sets, which is an n² problem.

b. There is no guarantee these transitions would not cause velocity faults or collisions, so each would have to be hand checked.

c. A large database of animations would have to be stored, much of which might never be useful, since some transitions may never occur.

So we came up with a middle-ground solution, which is not perfect, but a step toward eliminating at least some of these problems. We design a pose-to-pose graph, but each pose also has parameterized versions generated as explained above. To avoid exploding the graph with too many transitions, we add hidden transitions from each parameterized version of a pose back to the original. This gives us some flexibility while still working within the technical limitations.

Proposed: Pose-to-pose with hidden transitions
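In code, the proposed structure might look like this. The names and representation are illustrative only, not our actual implementation: the operator sees only the hand-authored edges between base poses, while every parameterized variant carries a hidden edge back to its base pose so it can rejoin the graph without an authored transition of its own.

```python
# Illustrative sketch of the pose graph with hidden transitions.

def build_graph(base_edges, variants):
    """base_edges: pairs of base-pose names with authored transitions.
    variants: maps a variant name to the base pose it was generated from."""
    edges = {"visible": list(base_edges), "hidden": []}
    for variant, base in variants.items():
        # Hidden edge: the variant can always return to its base pose,
        # so only base-to-base transitions need hand authoring.
        edges["hidden"].append((variant, base))
    return edges

graph = build_graph(
    base_edges=[("wave", "nod"), ("nod", "shrug")],
    variants={"wave_soft": "wave", "wave_slow": "wave"},
)
```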

 

By Rachina Ahuja (Animator)

Playtest: People guess HERB’s emotions.

Part of our research and development process is to gather data from potential audience members and users in order to iterate on our designs. There are several ways to achieve this, and playtesting, or user testing, is one of the most common. For our project we ran a playtest in which we collected data from a diverse pool of participants who tried to guess which emotion HERB was portraying based on a set of simulator animations.

For this playtest we created a questionnaire based on Robert Plutchik’s theory of basic emotions, which describes eight basic human emotions that scale into other sub-emotions. Those emotions are:

• Fear
• Anger
• Sadness
• Disgust
• Joy
• Trust
• Anticipation
• Surprise

We asked the participants to watch an animation and pick the one of these eight basic emotions that best matched their guess.
We then asked them to rate, on a scale from 1 to 5, how intensely they perceived that emotion.


After playtesting with a total of 53 participants, we gathered a lot of useful data and insights.

In particular, some animations performed better than others. This kind of data is very useful because it tells us which movements and trajectories reliably represent a single type of emotion. In other words, when most participants agreed that an animation conveyed a certain emotion, for instance joy, we could conclude that the animation did in fact represent that emotion, and we can therefore use it as a high-fidelity reference for designing related sub-emotions.
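Scoring that kind of agreement is straightforward: for each animation, find the most common emotion picked and the fraction of participants who picked it. The sketch below is illustrative only; the response data is made up, not our actual playtest results.

```python
from collections import Counter

def agreement(responses):
    """responses: the emotion labels participants picked for one animation.
    Returns (modal emotion, fraction of participants who chose it)."""
    counts = Counter(responses)
    emotion, votes = counts.most_common(1)[0]
    return emotion, votes / len(responses)

# Made-up responses for one animation: 3 of 5 participants saw joy.
picks = ["joy", "joy", "surprise", "joy", "trust"]
emotion, share = agreement(picks)
```

A high agreement fraction is what lets us treat an animation as a reliable reference for a given emotion.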

 

Joy

 

Playtest and user test sessions are an essential part of a design process, especially when the design is intended for a particular audience. On Bowtie we are using playtesting to iterate on the design and to confirm that we are getting the desired results. We highly recommend playtesting and user testing for any design process.

Rehearsals 2

We keep advancing in our development process, and rehearsals are one of the most important parts of this project. A couple of weeks ago we had our second series of rehearsals, and this time we were able to test the play with HERB itself. From this rehearsal we gained a huge number of insights and a lot of feedback that we are using to iterate on several elements of the UI design and the animation pipeline.

Here is the video of the rehearsal, in which we went over the “Sure Thing” script with the actress, HERB, and the Bowtie team.

Enjoy the video!

 

Openness and Intensity: A concept for the development of robot animations.

The animation pipeline is one of the most crucial elements of our development process. The animations are the main input from which the robot generates body movements with emotional intention, and at the same time they must be dynamically integrated into the interface we are building. To achieve highly polished but flexible animations, we had to come up with a concept that fits these constraints. After several discussions and further research, we decided that all our animations would be governed by two main attributes: openness and intensity.

Openness

The concept of openness refers to the different ranges of trajectories within a specific joint. For instance, imagine a scale from 1 to 10 mapped to the openness of your own hand: 1 would be a value at which your hand is completely closed (a fist), and 10 a value at which your hand is completely open (an open palm).

We can see the concept illustrated in the following image.


 

Intensity

The concept of intensity refers to the velocity of the trajectories along a chain of joints. For instance, imagine the same scale we used in the openness example, but now the values modify not the position but the speed of certain joint movements. In this case a 1 would be a very slow, mild movement and a 10 a very fast, aggressive movement.

We can see the concept illustrated in the following image, which shows a representation of the different curves / keyframe values.

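Both scales ultimately map onto concrete multipliers applied to the animation. A minimal way to do that is linear interpolation between a chosen minimum and maximum; the ranges below are hypothetical (in practice they would come from the robot's joint and velocity limits):

```python
# Hypothetical mapping from a 1-10 scale to a multiplier in [lo, hi].
def scale_to_multiplier(value, lo, hi):
    """Map a 1-10 scale value linearly onto the range [lo, hi]."""
    if not 1 <= value <= 10:
        raise ValueError("scale value must be in 1..10")
    return lo + (value - 1) * (hi - lo) / 9

# Illustrative ranges only: fully open keeps full amplitude,
# the lowest intensity plays the motion at a tenth of full speed.
openness = scale_to_multiplier(10, lo=0.2, hi=1.0)
intensity = scale_to_multiplier(1, lo=0.1, hi=1.0)
```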

 

In conclusion, when working on projects that require animations to be dynamically pushed to physical animated objects, it is very important to create high-level concepts that can be scaled down and applied to each specific animation.

Disrupting a playwright using a robot as an actor.

One of the goals of Bowtie, beyond transforming HERB into an actor, is to transform the stage itself and find new angles on this kind of performance. When something like a robot takes the role of the main actor, several things change. One of the most relevant is the way the other actors interact with this new playful object. The way the director guides the performance also changes completely: he or she must address different problems in how attention is allocated and how the performance flows. Here is a diagram showing the normal process with human actors:


As we can see, the process is very natural. There is direct feedback and communication between the actors and the director. Of course, a bad actor or a bad director can be detrimental to the flow of the performance, but even in the most extreme cases, verbal communication and verbal cues provide a good framework for the performance to develop. In contrast, here is the diagram showing how the process works with a robot and the setup we are trying to build:

 


 

Adding something as complex as a robot to a theatrical performance will always be a big challenge. However, Bowtie is designing a flexible interface, as well as several dynamic animations, that can make the communication among the actors, the director, and the robot operator much better. The ultimate goal of Bowtie is to contribute material that future research can build on to develop better human-robot interaction.

Architecture

A big part of the Bowtie development process relies on technology. As you may know, our main goal with this project is to create a flexible and robust interface that lets an operator push pre-composed animations to HERB, in order to have him perform the role of an actor.
In building this interface we have to touch and intersect several technologies.
Some of these are the Robot Operating System (HERB’s core OS), HerbPy (HERB’s Python API), OpenRAVE (motion planning), Blender (animation software), and others. Bowtie is building the interface on several features and intersections of these technologies, alongside the hardware that includes the robot itself.

Below you can find a graphic of how those technologies are layered for the development of our interface.

As always, we’re open for feedback and comments.


Observation of Human Expression: Emotional Reference.

Part of our research on Bowtie is trying to understand and interpret the best possible emotions for a robot, in order to create the most natural human-robot interaction. As part of this process we carried out an observation of human expression to use as emotional reference for designing our animations. We also wanted to gauge a general audience’s level of understanding of emotions and human expressiveness. We put together a video with the footage from this observation.

As always we’re open for your comments and feedback.

 

UI Research at the Carnegie Science Center

Over this semester we have been doing a good amount of research on concepts and user interfaces that can help us create a better final product. Our idea is to create a flexible, dynamic, and intuitive interface that exports animations to HERB. The interface we are developing gives the operator the ability to compose actions before the play as well as to tweak them in real time.
There are several projects with similar concepts in terms of robot manipulation, and one of them is located at the Carnegie Science Center in Downtown Pittsburgh.

A couple of weeks ago we had the opportunity to visit this installation and gathered some valuable insights about the interface and the setup of this interesting robot. Below are some pictures we took during our visit.

 

The robot is a human-size puppet with a fully expressive face/head and arms/hands.

 

There is a live interface that lets you manipulate the robot’s movements in real time. One of its main features is the ability to puppeteer the head while watching live video streamed from the head itself.

 

There is a library of pre-composed performances. It seems that originally there was a feature that let visitors save compositions into the library slots. We are implementing this type of feature in our own interface.

 

The main composing tool is based on a timeline, similar to the ones found in After Effects and Premiere.

 

There is a subset of animations and particular actions that can be dragged and dropped onto the timeline.

Rehearsals!

Last week we had a rehearsal session of the play that HERB is going to perform on April 30th. At this meeting we also had the pleasure of talking with the director, Sam French, and showing him our progress on the UI mockups, which are a pretty close image of what we may have at the end of development. We received his insights as director and are going to implement some of his ideas in the UI development ahead of us.

If you want to check the rehearsal please go to the videos below.

Best,

Team Bowtie

 

 

 

Hello world! Hello Bowtie!

This is the first entry of our blog! We are very happy to release our website/blog and show the world the progress of our project. If you’re interested in the project please don’t hesitate to contact us.

Meanwhile here is a video featuring our beloved aspiring actor: HERB

We hope you enjoy it.

 

Best,

Team Bowtie