Ripieno

Animation Studio

Quarters is a unique opportunity for projects at the ETC to demonstrate progress and intention, and for teams to get feedback from multidisciplinary faculty. We were able to present and walk through our story using a timed animatic, and provide supporting concept art to give a full picture of our goals. In turn, we received strong advice on how to address challenges in expressive storytelling, render time, and effective playtesting, all of which informed our plans going into this week.

Character Modeling & Rigging

Our character modeler and rigger have been working together this week to get our India doll “dancing” and our Manny model rigged for hand-manipulated animation (rather than motion capture). This is all in anticipation of the next few weeks, when we will be putting our dancers into the motion capture studio.

Manny is now fully rigged and ready to hit the dance floor.

One thing brought to our attention during critique was that, at the distance the dolls currently sit from the viewer, it is impossible to see the essential details we want viewers to see. It is important to us that the material from which the dolls are made be visible enough to inform the audience’s understanding of them. In this instance, India is going to be made of papier-mâché, and while her skin currently has stippling, it is not visible from a distance. While this seemed like a simple fix – just increase the visibility – we ran into some technological issues…

Well… at least you can see the papier-mâché now…

Environment Layout

Now that our environmental concepts are done, faculty asked us to think about scale and distance. Our instructors suggested that we determine Manny’s appropriate size in relation to the viewer, and then measure all objects and distances in “Mannys.”

With that in mind, our concept artist created a scale and placement map for our environment modeler. All objects (including the camera height) are sized in comparison to Manny and represented as unique, basic placeholder shapes.

Manny is like the “smoot” of our generation
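
For anyone curious what measuring in “Mannys” actually looks like, here is a minimal sketch of the idea. The Manny height and every measurement below are hypothetical stand-ins, not our real scene values:

```python
# Minimal sketch: laying out placeholders measured in "Mannys".
# MANNY_HEIGHT_M and all entries below are hypothetical examples.

MANNY_HEIGHT_M = 0.30  # hypothetical: one "Manny" expressed in meters

# Placement map entries: name -> (distance from viewer, height), both in Mannys
placement_in_mannys = {
    "camera": (0.0, 2.0),
    "india_doll": (1.5, 1.0),
    "table_edge": (2.0, 0.0),
    "back_wall": (6.0, 0.0),
}

def mannys_to_meters(value_in_mannys: float) -> float:
    """Convert a measurement expressed in Mannys to meters."""
    return value_in_mannys * MANNY_HEIGHT_M

for name, (dist, height) in placement_in_mannys.items():
    print(f"{name}: {mannys_to_meters(dist):.2f} m away, "
          f"{mannys_to_meters(height):.2f} m high")
```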

This design was then mimicked in our environment model. From there it was a case of iterate, iterate, iterate: we would render an image, put our concept artist in the headset, and ask whether it looked the way she had imagined it. Once it met her specifications, we ran it by the rest of the team to gauge comfort and visibility.

Check out that precise placement!

Doing this level of iterative testing showed us the importance of the distance to the wall versus the distance to the edge of the table. It also revealed the point of proximity at which people become extremely uncomfortable with Manny and the cultural dolls.

Getting Sound Involved

How many software applications does it take to screw in a lightbulb? Probably fewer than it took to integrate our sound.

Once all the objects were appropriately placed and confirmed as comfortable by outside playtesters, we were able to pin down the actual physical distance of each object – something our sound designer really needed. It turns out that spatialized sound requires knowing the literal physical distance of each object from the viewer. With that in hand, our sound designer could begin testing audio integration in our environment.
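
To make that concrete, the “physical distance” in question is just the straight-line distance from the viewer’s head to each placed object. A minimal sketch with made-up positions (none of these numbers reflect our real layout):

```python
import math

# Minimal sketch: deriving viewer-to-object distances for spatialization.
# All positions are hypothetical examples, in meters, x/y/z with y as height.

viewer = (0.0, 1.2, 0.0)  # hypothetical seated head position

object_positions = {
    "india_doll": (0.3, 0.9, 1.5),
    "manny": (-0.2, 0.9, 1.2),
    "back_wall": (0.0, 1.2, 3.0),
}

def distance(a, b):
    """Straight-line distance between two 3D points."""
    return math.sqrt(sum((ax - bx) ** 2 for ax, bx in zip(a, b)))

for name, pos in object_positions.items():
    print(f"{name}: {distance(viewer, pos):.2f} m from the viewer")
```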

This, it turns out, is not as easy as dragging audio into Premiere. He had to create an entirely new and complex pipeline to get everything to fit:

We are using Facebook 360 to control spatialized sound positioning, but the video player associated with it can only handle up to 2 megabytes of film, rendering it unusable for our purposes. GoPro VR, on the other hand, was perfect for our video needs, but required a patch that was not designed for Facebook 360. Our sound designer therefore had to re-map it to work with Facebook 360 audio.

And that was just one of the technical struggles he faced. But on the bright side, it’s working now!

Going Forward

After a number of meetings this week, our story is locked in and our environment is awaiting approval. Over the weekend we will get our first scratch track for choreography (for India’s dance, to be specific), and will set deadlines for the completion of the rest of the music and choreography by Monday.

Our film is thirsty for playtesting, and come Tuesday we will aim to create tests for viewer attention in our environment and for emotional comprehension of faceless, mo-capped animation.

Basically, the wheels are finally starting to roll on this train!

See you next week!

The Ripieno Team Flamingo, Concerto


mschoell | mschoell@andrew.cmu.edu
