Post-Mortem

Introduction

Augur was a client project for Lockheed Martin Corporation, advised by Jesse Schell.  The objective of the project was to explore ways to improve the design of predictive artificial intelligence.  The team was free to approach this objective in whatever manner best suited its interests and abilities.

After some consideration, the team decided to forgo the programming-heavy machine learning approach favored in current AI research, choosing instead to crowdsource data from a series of rapid prototypes.  This data allowed Augur to build AI that could predict player activity within those prototypes based on a variety of factors.

A key part of this approach was Amazon’s Mechanical Turk service, which allows users to post jobs for other users to complete.  Augur used Mechanical Turk to collect thousands of data sets from the prototypes at relatively low cost.
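As an illustration of how such a job might be posted programmatically, the sketch below uses the modern boto3 MTurk client; the task URL, reward, and other parameters here are hypothetical, and the team may well have used Mechanical Turk’s web interface or an earlier API instead:

    import boto3

    # Sandbox endpoint for testing; the production endpoint omits "sandbox".
    # (Hypothetical parameters; not the team's actual setup.)
    mturk = boto3.client(
        "mturk",
        region_name="us-east-1",
        endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
    )

    # An ExternalQuestion points workers at a web-hosted prototype.
    question_xml = """
    <ExternalQuestion xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2006-07-14/ExternalQuestion.xsd">
      <ExternalURL>https://example.com/maze-prototype</ExternalURL>
      <FrameHeight>600</FrameHeight>
    </ExternalQuestion>
    """

    hit = mturk.create_hit(
        Title="Play a short maze game (about 5 minutes)",
        Description="Navigate a simple maze; your choices are recorded for research.",
        Reward="0.25",                     # USD per completed assignment
        MaxAssignments=1000,               # number of workers who may complete it
        LifetimeInSeconds=7 * 24 * 3600,   # how long the job stays listed
        AssignmentDurationInSeconds=1800,  # time limit per worker
        Question=question_xml,
    )
    print("HIT created:", hit["HIT"]["HITId"])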

The primary goal for the project was to design simple predictive AI algorithms that could demonstrate a better-than-random success rate, with the deliverables consisting primarily of documentation of the team’s methods.  The AI themselves were designed for contained situations that do not necessarily correspond to situations important to Lockheed Martin, but they serve as proof of concept for the methods used to produce them.
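To illustrate what “better than random” means in practice, a predictor only has to beat the uniform baseline on the collected data.  The sketch below is a deliberately simple frequency-based predictor over invented records; it is illustrative, not the team’s actual algorithm:

    import random
    from collections import Counter, defaultdict

    # Invented records: (handedness, behavior at a maze intersection).
    observations = [
        ("left", "turned"), ("left", "turned"), ("left", "straight"),
        ("right", "straight"), ("right", "turned"), ("right", "straight"),
        # ...thousands more rows, as collected via Mechanical Turk...
    ]

    # Tally each group's behavior.
    counts = defaultdict(Counter)
    for handedness, behavior in observations:
        counts[handedness][behavior] += 1

    def predict(handedness):
        """Predict the group's most frequent behavior; guess randomly if unseen."""
        group = counts.get(handedness)
        if not group:
            return random.choice(["turned", "straight"])
        return group.most_common(1)[0][0]

    # Any systematic bias in the data lets this beat the 50% random baseline.
    print(predict("left"), predict("right"))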

The Augur team consisted of Brandon Perdue, producer; Russell Mester, artist; Chenyang Xia, programmer; Nora Bastida, programmer; Jitesh Mulchandani, programmer; and Evan Brown, hired artist.

What Went Well

The team hit the ground running early in the semester with preliminary ideas, which helped avoid the indecision that sometimes plagues such open-ended projects.  This initial phase revolved around building the maze prototype and experimenting with Mechanical Turk itself.

Mechanical Turk, in particular, proved to be a significant part of the project because of its novelty and potential usefulness both to Lockheed Martin and the ETC.  The team did not expect this outcome, but as the useful data collected via Mechanical Turk demonstrates, the service has significant applications for the kinds of projects that both parties undertake.

Furthermore, upon analysis, data from all prototypes yielded results that were not always intuitive.  For instance, based on the maze prototype data, left-handed people are more likely than right-handed people to turn at an intersection in a maze-like situation where they have only a limited view.  (There was no significant correlation with which direction players would turn.)  Though these findings are largely trivial, they demonstrate the usefulness of data sets collected via Mechanical Turk or other crowdsourcing resources.
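A finding like the handedness effect would typically be checked with a simple test of independence before being reported.  A sketch using scipy follows; the counts are fabricated for illustration and do not reflect the project’s real data:

    from scipy.stats import chi2_contingency

    # Contingency table of handedness versus behavior at an intersection.
    # These counts are invented; the project's actual numbers differed.
    #            turned  went straight
    table = [
        [120, 80],    # left-handed players
        [300, 340],   # right-handed players
    ]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"chi2={chi2:.2f}, p={p_value:.4f}")
    # A small p-value suggests turning behavior is not independent of handedness.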

The team also found ways to present data graphically and explain the significance of their findings to those outside the project.  After some initial difficulties, the team found their rhythm for presenting the technical details of the project in easily explicable terms.

What Went Less Well

Especially in the project’s earlier stages, the vagueness of the project goals left the team with a lack of focus.  Although the team managed to decide on directions without wasting significant time, there were parts of the semester when clearer goals would have been highly beneficial.

Perhaps because of this, the team’s early presentations were weak.  Those who did not know much about the project came away with little new information at best, and confusion at worst.  During the first half of the semester, it was clear that outsiders often were not sure what the team was doing.

Efforts to acquire existing data from EA and Schell Games were also under way as part of the project, but both fell through (at least within the time frame that would have been useful to Augur): the EA data became locked in contract negotiations between EA and CMU legal, while the Schell Games team was (understandably) consumed by other commitments.  Fortunately, Augur had other directions to pursue, so these setbacks, while disappointing, did not prove too problematic.

Lessons Learned

Perhaps the most pragmatic lesson of Augur, at least for the team, was to set a clear goal, even if (and perhaps especially when) the client does not have one in mind.  This was a risk from the beginning, and though the team dealt with it reasonably, there was still less structure, at times, than might have been helpful.  If semester projects were indefinitely long this might not matter as much, but given the strict schedule and deadlines, clearly defined milestones are crucial.

Secondly, a little data visualization goes a long way.  Not only can the right graph or custom 3D visualization make data easier for researchers to crunch, but it can also make the data clearer to those outside the project, from visitors to stakeholders.  For better or worse, crafting data into impressive graphics makes what they reflect seem more important and the project itself more credible.  (This is not entirely surprising; in many ways, it is a generalized reflection of the “CSI effect”.)
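As a small example of the kind of chart that helped, the matplotlib sketch below plots the same hypothetical handedness numbers used earlier; the data is invented:

    import matplotlib.pyplot as plt

    # Hypothetical turn rates per group, as might come out of the maze data.
    groups = ["Left-handed", "Right-handed"]
    turn_rate = [0.60, 0.47]

    fig, ax = plt.subplots()
    ax.bar(groups, turn_rate)
    ax.set_ylabel("Proportion who turned at an intersection")
    ax.set_ylim(0, 1)
    ax.set_title("Turning behavior by handedness (illustrative data)")
    fig.tight_layout()
    fig.savefig("turn_rates.png")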

Conclusion

Most of all, Augur discovered just how easily a large number of users can be reached via the Internet for work and research.  The usefulness of crowdsourcing as a resource can hardly be overstated, and the tools for accessing it already exist and continue to evolve.  This potential will likely only increase in the coming years.