Last week, we ran an experimental interactive treasure hunt as part of the 20th Internet Librarian International conference in London.
We also gave a talk, "Conversations in the library: using technology to create new ways to communicate and collaborate with users", which you can read here.
The treasure hunt idea started as an extension of our #ILI2017 extra workshop - where we talked about and demonstrated a range of interactive technologies, particularly proximity technologies. This time, we wanted to create something that could involve more participants and help spark conversation around library spaces and the impact of digital services and emerging technologies.
After some conversations with previous workshop participants and ILI organisers, we submitted an idea to run a treasure hunt (or, at that stage, possibly a scavenger hunt) for the 20th anniversary event, but the idea evolved quite a lot during development.
The story for the treasure hunt also evolved during development: it was initially storyboarded with Twine, but then became a chatbot to enable real-time interactions with participants.
As it was the 20th anniversary of Internet Librarian International (the library innovation conference), and as we were inspired by publications such as Broad Band by Claire L. Evans and projects like Neocities, we wanted to tell a story about the continuing evolution of the web. At some point it made sense to intertwine the story of Through the Looking-Glass, and What Alice Found There by Lewis Carroll - there are a lot of parallels, and remixing a public domain title was also part of the appeal.
So, from this crazy mix of ideas and influences emerged the prototype of an experimental treasure hunt experience.
The treasure hunt story clues were revealed by Alice the Chatbot, programmed to answer simple questions and to offer extra hints when needed. Participants could join in by texting "hi" to the treasure hunt mobile number or saying "hi" in the browser-based chat window. Alice was a character bot that people could interact with in real time to hear more of the story and receive the clues for the hunt.
Using a chatbot written in RiveScript meant we could provide multiple alternative answers and prompts and create a unique interaction for each participant. But it also meant supporting different formatting across a number of versions.
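For the curious, this is roughly how those alternatives look in a RiveScript brain - a minimal, made-up sketch rather than the actual hunt script:

```rivescript
// Multiple "-" replies under one "+" trigger: RiveScript picks one
// at random, so each participant can get a slightly different response.
+ hello
- Hello! Ready for your next clue?
- Why, hello there. Shall we carry on with the hunt?

// Square brackets mark optional words in the trigger.
+ [give me a] hint
- Have you tried talking to the flowers?
```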
Some initial edits were needed for the SMS version of the chatbot, but most people accessed it via the browser. The SMS chatbot delivered messages using the Twilio API, which brought additional complexities around message delivery timing and other formatting tweaks to improve the experience.
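One of those tweaks is segmentation: a single SMS tops out at 160 GSM-7 characters, so longer chatbot replies need splitting before being handed to the Twilio API. A minimal Python sketch of the idea (the function and the numbering suffix are our own illustration, not the hunt's actual code):

```python
import textwrap

SMS_LIMIT = 160  # single-segment GSM-7 limit; carriers split longer bodies

def to_sms_segments(reply: str, limit: int = SMS_LIMIT) -> list[str]:
    """Split a chatbot reply into SMS-sized chunks, numbered so
    participants can reassemble them if delivery order varies."""
    if len(reply) <= limit:
        return [reply]
    # Leave room for a " (1/3)"-style suffix on each chunk.
    chunks = textwrap.wrap(reply, width=limit - 8)
    total = len(chunks)
    return [f"{c} ({i}/{total})" for i, c in enumerate(chunks, 1)]
```

Each segment would then be sent as its own message, ideally with a short delay between sends so they arrive in order.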
We’re excited about the potential of chatbots to provide experiences and support services in libraries, something that we will be writing about in more detail soon.
The installations were intentionally kept quite simple for practical reasons (we didn't have access to the space beforehand, the installations had to be transportable, and many ran on batteries) as well as conceptual ones. The aim was to demonstrate new approaches to common spaces and inspire conversation and ideas about how else these technologies could be used in libraries, not to show off our amazing technical prowess (though we're often happy to do that too).
Some of the smaller, less visible interactions, like kiosk-mode apps and conductive surfaces, may appeal to many libraries, while Internet of Things devices and visualisations may appeal to others. But the main thing was to think about how we could use interactivity to engage users in new ways.
The first installation was a visual novel, Alice's Diary.
Interactive fiction and visual novel tools are often used in games development, but they also demonstrate how decision-tree-style structures can drive visual displays in the library, such as digital signage and orientation. After prototyping with Twine and AXMA StoryMaker, the final version was developed in Monogatari, as we wanted a framework that supported multimedia and responsive design. The text was a combination of quotes from Through the Looking-Glass and an original story about the early days of the open web. The graphics were mostly remixed public domain images and animated GIFs, some of which were created using Image Remixer.
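Under the hood, all of these tools boil down to the same decision-tree shape: passages of text linked by choices. A minimal Python sketch of that structure (the passage names and text are made up for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class Passage:
    """One screen of a visual novel: some text plus the choices
    leading out of it (choice label -> id of the next passage)."""
    text: str
    choices: dict[str, str] = field(default_factory=dict)

# A tiny branching story - the same shape that a Twine or Monogatari
# script compiles down to.
STORY = {
    "start": Passage("Alice opens her diary...",
                     {"Read the first entry": "entry1",
                      "Close the diary": "end"}),
    "entry1": Passage("The web was young then...", {"Continue": "end"}),
    "end": Passage("To be continued."),
}

def advance(passage_id: str, choice: str) -> str:
    """Follow a choice from one passage to the next."""
    return STORY[passage_id].choices[choice]
```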
The background visualisations were created using particles.js, which is a lot of fun for screen displays.
The second installation was the Talking Flowers, created with help from the amazing artist Renata Fernandez. The painting included conductive ink, a Bare Conductive Touch Board and Circuit Playground Express (CPX) boards.
Touching the conductive parts of the painting played audio recordings inspired by the Talking Flowers in Through the Looking-Glass.
The main idea of this installation was to show how any surface in the library space could become interactive and engaging - we're not just limited to screens.
The third installation in the treasure hunt featured marker-based Augmented Reality content that could be accessed in the browser. This used AR.js and its A-Frame integration (aframe-ar) to show a simple 3D animation without users needing to download an app or use anything beyond the camera on their smartphones. It required WebRTC support, so there were access issues on some mobile browsers.
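The heart of a marker-based AR.js scene is only a few lines of A-Frame markup. A minimal sketch of the pattern (the script URLs, marker and red box are illustrative, not the hunt's actual scene - check the current AR.js docs for up-to-date builds):

```html
<!-- Point the phone camera at the standard "hiro" marker and a 3D box
     appears on it. Everything runs in the mobile browser: no app install. -->
<script src="https://aframe.io/releases/1.0.4/aframe.min.js"></script>
<script src="https://raw.githack.com/AR-js-org/AR.js/master/aframe/build/aframe-ar.js"></script>
<a-scene embedded arjs>
  <a-marker preset="hiro">
    <a-box position="0 0.5 0" material="color: red;"></a-box>
  </a-marker>
  <a-entity camera></a-entity>
</a-scene>
```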
Mixed reality is a fast-developing area and with so many different devices and closed environments, we’re excited by the potential of browser-based technologies that are available without specialist equipment.
We also used an AR-Code - a QR code that linked users straight to the AR content, which was hosted on Glitch.
The next installation was a big internet button, designed to demonstrate ways that the physical space of the library can interact with online services. In this case, it was created using a Particle Photon button. Pressing the button automatically tweeted the next part of the story, which was stored in a Google Spreadsheet. The TweetBot was created in Node.js and Express and hosted on Glitch, and the Twitter display ran in kiosk mode using Fully Kiosk Browser.
We’re really interested in how the Internet of Things can be used in non-consumerist ways and influenced by participatory Smart Cities development. Also, how social networks like Twitter can be used to disseminate and remix collections which we talked about at the conference and we’ll also likely be talking about more in a future post
The final installation was a relatively simple video streaming installation viewable through a two-way mirror. Because cats run the internet (or they used to, before Facebook).
The video loop was created using the Fully Video Kiosk app to run the tablet in kiosk mode. We’re always testing Kiosk apps as it’s one of those things that’s so useful but many of the available options...aren't great.
After answering the final clue, participants were directed back to Treasure Hunt HQ. They could also re-do the treasure hunt, as the content is a bit different each time you play.
This was very much a first iteration - though we've run live installations at workshops and other events before, this is the first time we've used a real-time story to navigate through a series of clues like this.
The story was kept short so that people could take part without it eating into networking and other parts of the conference. Progress could be saved over the two days of the conference, though most people completed it in a single go over a coffee or lunch break. In the next iteration we'll include more details about location, as the installations were spread across the venue, which made it harder to navigate in short bursts of participation.
We've kept every comment and bit of feedback we received for future versions.
We’re excited about the potential of live conference experiences, as well as other use cases for chatbots in libraries (and we’re already developing further prototypes in this area).
Huge thanks to the Internet Librarian International team for supporting our crazy adventure, and to all the participants who gave something new a chance.
And there are more photos of the treasure hunt on our Instagram, if you want to check that out.