It’s been a couple of weeks since the WebWise conference in Baltimore, but the information hasn’t gone stale yet. It’s still cutting edge and thought-provoking because most of the projects discussed are still under development and it’s unclear whether they’ll succeed. However, they do provide a glimpse of current and potentially future practices in the use of technology at historic sites. The webcast videos are now available except for LeVar Burton’s presentation, which is waiting for broadcast permission. Heritage Preservation will be using portions of the webcasts to create webinars that will be available later this summer, including “Crowdsourcing in Public History” and “Oral History in the Digital Age.”
The first panel session, History Places and Spaces: Learning and Participation on the Move, was moderated by Nancy Proctor and looked at the ways that mobile technologies (that is, technologies that are aware of your location) can provide new and better experiences for visitors.
Rob Stein of the Indianapolis Museum of Art opened the session by stating that the field already recognizes that location should be a required element in basic content, but the challenge is the proliferation of software and hardware platforms. Every digital platform (website, cellphone, tablet) seems to require its own format and special treatment. To overcome this, Stein suggested we rethink how we manage content to avoid instability and chaos. As a case study, he outlined TAP, a set of tools that will allow you to create and deliver mobile tour applications in a museum or site setting. The Indianapolis Museum of Art is leading the effort in collaboration with museums, vendors, and content experts across the US, such as the Minnesota Historical Society, Guide By Cell, the National Air and Space Museum, and AdLib Systems. The goal is a standardized database holding information, images, audio, and location in a consistent format that can be easily shared or presented on a variety of third-party platforms. TAP is still under development, but an example of a project based on it is the Balboa Park Walking Tour for the iPhone. Also be aware that Stein is developing an easier, not an easy, set of tools; most of us will find this daunting. Nevertheless, his presentation pointed out several issues we all need to be aware of as we consider online projects:
- Have a standardized database for all of your assets (information, images, audio, and location) in a consistent format that can be easily shared or presented on a variety of platforms. [I’ve noticed a tendency to create separate databases for different collections, such as photographs, objects, archives, archaeological artifacts, and oral histories, so try to maintain a single database if possible.]
- Managing content is crucial, but so is storytelling. Moving from a linear format to a branching one can quickly become confusing, and the narrative arc is lost. Consider how interpretation should be organized to allow free-choice learning within a meaningful structure. For example, to interpret a large art collection online, do you start with the earliest works, go left to right, or arrange by geography or theme?
- Consider the possibility of “generative storytelling,” that is, the computer develops custom interpretation based upon what we’ve seen, what it knows about us, or where we’ve been. If this intrigues you, check out Mark Riedl at the Georgia Institute of Technology on intelligent narrative computing and his Fabulist computer architecture for generating stories automatically.
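The first bullet’s idea of a single consistent record for all assets can be sketched in code. This is only an illustration of the principle, not the actual TAP schema: the class name, field names, and sample stop below are all invented for the example.

```python
from dataclasses import asdict, dataclass, field
from typing import List, Optional
import json

# Hypothetical standardized record for one tour stop, bundling the four
# asset types Stein names: information, images, audio, and location.
@dataclass
class TourStop:
    title: str
    description: str                 # interpretive text ("information")
    latitude: float                  # location in decimal degrees
    longitude: float
    image_urls: List[str] = field(default_factory=list)
    audio_url: Optional[str] = None

    def to_json(self) -> str:
        # One consistent serialization lets third-party platforms
        # (web, phone, tablet) all consume the same record.
        return json.dumps(asdict(self))

# Illustrative stop; the values are made up.
stop = TourStop(
    title="Lily Pond",
    description="The pond anchors the east garden.",
    latitude=39.2904,
    longitude=-76.6122,
    audio_url="https://example.org/audio/lily-pond.mp3",
)
print(stop.to_json())
```

The payoff of a single schema like this is that presenting on a new platform means writing one new renderer, not migrating a separate database.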
Jason Casden of the North Carolina State University Libraries discussed their desire to do more with their unique historical materials than just digitize them, so they created WolfWalk, an interactive web-based application that allows users to learn about the history of the campus while walking around with an iPhone or iPad. By tracking the user’s location, the tour automatically changes its content to show the appropriate images and text for that spot. They created both a web-based and a native application for the 2010 launch, and the results have been counterintuitive:
- Both the native and web-based apps have the same features, but the web app was easier to develop.
- The native application is more popular (5500 downloads, 500 image requests daily) than the web-based version (426 from iOS), but usage seems to be declining. Perhaps this is because the native app is much easier to find than the web version. MobileHistorical.org by Cleveland State University may offer a solution.
- The map is the least popular way to access information. Most frequently used are places, decades, and themes.
- User demographics are unknown; however, the top image requests are athletics, 1970s, 1980s, 2000s, student life, 1960s, people, long gone, 1950s, 1940s. Combined with peak use on the weekends, this suggests that the primary users are visiting alumni, not current or prospective students.
Among the major challenges for this campus walking tour: it requires a robust wireless connection (the program can get hung up as it’s handed off between routers), and delivering high-quality images can compromise the user experience (memory limitations, bandwidth limitations, and the need for an image server that can recognize the capacity of an iPad vs. an iPhone).
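The core of the location-tracking behavior described for WolfWalk can be reduced to a simple idea: take the visitor’s coordinates and pick the nearest tour stop. Here is a minimal sketch of that lookup; the stop names and coordinates are invented for illustration, and a real app would also need a minimum-distance threshold and live GPS updates.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_stop(lat, lon, stops):
    """Return the tour stop closest to the visitor's position."""
    return min(stops, key=lambda s: haversine_m(lat, lon, s["lat"], s["lon"]))

# Illustrative campus landmarks (coordinates approximate, for the example only).
stops = [
    {"name": "Memorial Belltower", "lat": 35.7860, "lon": -78.6634},
    {"name": "D. H. Hill Library", "lat": 35.7874, "lon": -78.6696},
]

# A visitor standing a few meters from the Belltower gets its content.
print(nearest_stop(35.7862, -78.6640, stops)["name"])
```

Because the lookup runs on every position update, this is also where the hand-off problems above bite: if the wireless connection drops mid-update, the content freezes at the previous stop.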
The session closed with Halsey Burgund, a sound artist and musician who describes his efforts as “creating musical scores from participants’ spoken words that continuously evolve in real-time.” Okay, that’s vague, and it doesn’t really make sense until you watch the video demonstration, such as the project he completed for the deCordova Sculpture Park near Boston. Imagine walking around a large public garden while listening to your iPhone through your earphones. The iPhone plays music continuously like a soundtrack, but the music changes depending on where you are. If you’re near water, it might be wind chimes. If you walk towards the woods, the tinkling chimes may fade and you hear the rising hum of a cello. Along with these musical instruments, you hear the voices of previous visitors, short snippets describing what they see or how they feel. You, too, can add your experience by recording a message, which will immediately be heard by others who stand in your spot. Halsey calls this project Scapes, a “location-sensitive audio composition” that requires visitors to move around to hear the presentation and encourages exploration. He didn’t want to create an app that blinds users to the place (turn it on and just listen to headphones); he wanted a continuous immersive experience (the role of the music) with occasional surprises (the voices) that augments the place. Visitor participation is a requirement, and it’s prompted by broad questions and the contributions of earlier visitors. Although this can be nerve-wracking (what if someone says something obscene?), he hasn’t had to edit even one of the nearly 900 recordings collected over six months. He’s currently working on a project for the Smithsonian that will incorporate the National Mall, and he is building Roundware.org, an open source platform to collect, store, organize, tag, and present audio content.
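The chimes-fading-into-cello effect implies a simple underlying technique: each sound layer is anchored to a spot, and its volume rises as the listener approaches. The sketch below shows that idea with a linear falloff; the layers, positions, radius, and fade curve are all illustrative assumptions, not Roundware’s actual mixing algorithm.

```python
import math

def layer_gain(listener, anchor, radius_m=80.0):
    """Linear fade: full volume at the anchor, silent beyond radius_m.

    listener and anchor are (x, y) positions in meters on a local grid.
    """
    dist = math.hypot(listener[0] - anchor[0], listener[1] - anchor[1])
    return max(0.0, 1.0 - dist / radius_m)

# Two hypothetical layers: chimes anchored near the water,
# a cello anchored 100 m away toward the woods.
layers = {
    "wind_chimes": (0.0, 0.0),
    "cello": (100.0, 0.0),
}

# Walking from the water toward the woods, the chimes fade out
# as the cello fades in.
for x in (0.0, 50.0, 100.0):
    gains = {name: round(layer_gain((x, 0.0), pos), 2)
             for name, pos in layers.items()}
    print(gains)
```

Visitor recordings could be mixed in the same way, each anchored where it was recorded, which is how a message left in one spot would be heard only by someone standing there.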
Indeed, the Smithsonian has already used Roundware to create “Stories from Main Street,” an iPhone app that collects and plays stories from America’s small towns and rural communities.