A lot has been accomplished this week. Front end and back end are building two sides of the same bridge, and while they haven’t exactly met in the middle yet, they are a stone’s throw away from each other.
Elodie and Glen have the server set up and are working on the database. Elodie focused on the image server; a big feature of our final product is the ability to take pictures of objects and add them to the database, as well as to recall and display those images. Glen has built the rest of the database, and both are now defining the protocols the front end will use to query and the back end will use to answer.
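To make that exchange concrete, here is a minimal sketch of what a query/answer round trip might look like as JSON messages. All field names, the toy database, and the image URL are illustrative assumptions for this post, not the team's actual protocol.

```python
import json

def build_query(item_name):
    """Front-end side: serialize a connection query (hypothetical format)."""
    return json.dumps({"type": "connections", "item": item_name})

def answer_query(message, database):
    """Back-end side: look the item up and serialize the answer,
    including the image server's URL for the item's picture."""
    query = json.loads(message)
    record = database.get(query["item"], {})
    return json.dumps({
        "item": query["item"],
        "connections": record.get("connections", []),
        "image_url": record.get("image_url"),
    })

# Toy stand-in for the real database.
db = {"saw": {"connections": ["wood"], "image_url": "/images/saw.jpg"}}
reply = json.loads(answer_query(build_query("saw"), db))
# reply["connections"] is ["wood"]; reply["image_url"] is "/images/saw.jpg"
```

The point of agreeing on a format like this early is that Xin's prototype and Glen's database can be built and tested independently, then joined once both sides speak the same messages.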
Xin has completed a working prototype of the app. It showcases the galleries, the workspace, and the ability to ask questions about single items and to display the connections associated with those items. Jeremy continues working with NGUI, a powerful Unity UI plugin. In a way, our two front-end programmers are also building a bridge: Xin is creating the guts of the app from the ground up, and Jeremy is learning the tools we will use to make the final version clear and beautiful.
Allison continues with concept work for the final look of the app; she will be the keystone between Xin’s prototype and Jeremy’s final interface. She also conducted photo tests at Makeshop. Using an iPad camera and natural lighting, she took pictures of artifacts in situ to make sure user-generated photos would be usable. The results were outstanding; the best photos were those taken in Makeshop’s natural light with their wood-grained tables as a background. These aesthetically pleasing photos have the added benefits of being clear to younger viewers and easy to take with little setup.
Chris has been working this week on the “2 item” connection interaction. In the app, if a user asks for connections to a single item, they are shown other materials, tools, and processes directly related to that item. However, the crew at Makeshop is also interested in showing a bridge of connections between two seemingly unassociated items. For instance, if a child asked how a saw and cloth might be connected, the app would show that a saw connects to wood, which connects to a chair made by sawing wood, which connects to the cloth used to upholster the chair. Ideally, this interaction would drive users toward thinking about the non-obvious connections between items. However, in playtesting, we have not seen any children grasp this interaction or see any utility behind it. Chris suspects they get lost in the convoluted steps between items; he has been working all week on a way to make this interaction clearer and more useful to children, but so far remains stumped.
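Under the hood, the bridge between two items is really a shortest-path search over the graph of item connections. Here is a minimal sketch, assuming the connections are stored as a simple adjacency map; the toy graph below just recreates the saw-to-cloth chain described above.

```python
from collections import deque

# Toy graph of item connections matching the example in the post.
connections = {
    "saw": ["wood"],
    "wood": ["saw", "chair"],
    "chair": ["wood", "cloth"],
    "cloth": ["chair"],
}

def bridge(start, goal):
    """Breadth-first search for the shortest chain of connections
    between two items; returns the chain as a list, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in connections.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no chain of connections exists

# bridge("saw", "cloth") returns ["saw", "wood", "chair", "cloth"]
```

One takeaway from framing it this way: breadth-first search always returns the shortest available chain, so if children are getting lost, the problem is likely in how the intermediate steps are presented rather than in how many of them there are.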
Next week, we’ll drop in the keystones that connect our different bridges and put a real working version of our app into children’s hands.