Week 5

TheatAR Team Dinner at Stagioni

This week the TheatAR team presented its progress on Project Neverland to the ETC faculty. As part of this “quarters” presentation, the faculty got a broad understanding of what our project is trying to achieve and what our major challenges are. Additionally, it was a chance for us to reflect on the progress we’ve made on the project and to make sure we’re on target to complete everything over the next 11 weeks or so.

The faculty wants to make sure that we’re thinking about all the possible obstacles we might encounter in a project as complex and with as many moving pieces as ours. And it seems that while we don’t have answers to all of those questions, we have generally been thinking about the right things.

We’ve decided to move forward with the Microsoft HoloLens as our hardware platform. Despite its clear limitations (namely an unfortunately small field of view), and a pervasive lust for new technology (looking at you, Magic Leap One), the HoloLens as a more “mature” platform proved to be the most reliable tech to ensure that we can actually complete this project—at least for now. There is always the chance that we could switch to one of the alternatives in a few weeks, but we’ve settled on the HoloLens for myriad reasons, including:

  1. As a platform that’s been around for a couple of years now, it has a treasure trove of documentation so that when we run into trouble, solutions are relatively easy to figure out.
  2. Unlike the competition, it supports networking without having to resort to zany, messy, and hacky solutions. Since we intend for as many people as possible to sit in the audience and wear a headset watching the same show, this is essential.
  3. It appears to have the best tracking. This was one of the main reasons we had to reject mobile AR, like Apple’s ARKit (although we also strongly prefer the experience of wearing a headset and watching a live performance as opposed to looking through a phone screen).
  4. It’s stable and reliable (the Magic Leap One has been running into some issues here).

With our hardware locked, we can move forward and begin to combine the programming with early animation tests to see what Tinker Bell looks like zipping around the stage and starting to interact with actors. We’re in the process of casting our human actors as well to play Peter and Wendy, and talking about our set design for the nursery scene that we’ll be staging.

One of the major (and exciting) challenges we’ll be running into next is figuring out how to cue the actors to the location of a character that will be invisible to them but can be seen by the audience. This is much more easily accomplished in film, where a director can shoot as many takes as they desire, but in a live performance the stakes are much higher.

Week 4

This week was the culmination of a lot of research and extensive prototyping, the results of which have allowed us to definitively move forward on a story/script. Team TheatAR is therefore proud to announce that we will be staging a scene from J.M. Barrie’s classic Peter Pan, featuring an AR-animated Tinker Bell! The team is referring to the entire process of successfully staging this scene with its myriad technical challenges as Project Neverland, and here is our glorious promotional artwork*:

*artwork not final

So why Tinker Bell? After spending time testing augmented objects interacting with a physical space, the team observed that a flying character is well-suited to this kind of project: a character that mainly floats and hovers is significantly more believable than one that predominantly walks on the ground. Additionally, Tinker Bell is relatively small and can fit comfortably even within the narrowest field of view (a limitation that plagues most of the available hardware). With those parameters in mind, Tinker Bell emerged as the leading candidate. Staging a scene from Peter Pan allows us to show the magic of an animated character having real presence on the stage, interacting with the physical set as well as the human actors that appear in the scene.

In virtually all previous productions of Peter Pan, the character of Tinker Bell has been represented by a little darting light effect that moves around the stage. While this was serviceable in 1904, when Barrie’s play first premiered, our approach aims to let this character generate real emotion: we’ll be able to see her facial expressions and body language, much as one would with an animated character in film. This allows Tinker Bell to become an actual, meaningful character as opposed to an amorphous lighting trick.

We’ll be presenting our findings and progress thus far to the ETC faculty next Wednesday, but the results of our hardware tests point in favor of using one of the headsets as opposed to mobile AR for a couple of key reasons:

1) While mobile AR is capable of producing high-fidelity images on screen, the AR objects/characters cannot be counted on to remain exactly where they’re placed; an untenable drift occurs when the device is moved around. Because we require our character to move to precise locations on our stage, we can tolerate only a very minimal amount of positional drift.

2) Holding up a phone or a tablet to watch a live theater experience is unsatisfying as an audience member. Mobile AR certainly has its applications in live performance (perhaps ideally when an audience member can move around a space and use the device as a “magic eye” to see things that can’t be seen by the naked eye), but for a seated performance where the action occurs at a fixed distance from the audience, holding a device in one’s hands creates an aesthetic distance between the audience and performers that feels akin to watching a movie, even though it’s technically live.

The existing AR headsets have their drawbacks (the limited field-of-view is certainly the most damning), but for our particular needs and as of this writing, they appear to be the superior option.

Week 3

Testing things out!

The approach to creating a project like this has proven to be atypical of the way a work of theatre is normally created. Generally, a theatre production is dictated by the script, and the design/technical elements are then determined from there.

But we’re exploring uncharted territory, with a technical focus, and thus technology is driving our decision-making process. What is the available hardware capable of? What types of effects can it produce successfully? We’ve been spending this week putting the AR hardware through its paces. The results of these tests will help us decide which story/characters we can most believably bring to life on stage with AR.

The team put together a list of all conceivable interactions that an animated character could have with both a human actor and a physical set. Our programmers are then taking a handful of these interactions and creating simple prototypes to determine their viability.

By next week, at the 1/4-way-through mark of this project, we should have more definitive answers about which platforms can accomplish which interactions/effects, and be able to make an informed decision about how to move forward with story and casting.

Week 2

The TheatAR team.

As we’ve met with our project advisers and other members of the ETC faculty, and talked with industry experts and fellow students, there has been a single recurring response to our project and ambitions: “Wow, that sounds amazing, but really hard! Good luck!”

And it’s true! As we’ve been learning from our experimentation with some of the major available augmented reality hardware (Microsoft HoloLens, Magic Leap, Meta 2, and Apple’s ARKit), none of these devices was really designed with what we are trying to achieve in mind. They all do some pretty amazing things, but the pioneers of AR clearly imagined interacting with a world that was no more than a couple of feet in front of your face. As we are trying to create an experience that allows an audience to sit in a theater and view a performance on stage from a moderate distance, the limitations of what the current generation of the tech can achieve are quickly coming into focus (unlike some of the images inside the headsets we are testing—ZING!).

Still, it’s been exciting to play around with the latest and greatest hardware. Hot off the factory production line is the Magic Leap One: Creator Edition, the source of endless speculation and nay-saying among developers in the AR space.

It’s Alive! Well…not really, since it’s an inorganic piece of tech equipment. But you know what I was going for.

We’ve only had the chance to play with this thing for a few minutes as of this writing, but our early impressions are positive. We’ll be putting it through its paces over the next few days to determine its value for our strange, particular needs. In the meantime, it’s been fun to watch a T-Rex fight some medieval knights on top of a table.

We’ll see if she’s still smiling in a few days.

While the tech team has been furiously prototyping with all of the hardware listed above, the others have been working in parallel to identify the story we want to adapt to the stage for this format. There have been a lot of fun ideas tossed around, and the team is beginning to rally around one we want to start working on. There will be more on that front soon, but I’ll leave you this week with some photos of our theater space that we’ll be using for this project. For now, let’s just say that our show selection will be taking this unusual space into account. See you next week!