Live Web Midterm // Interactive Storytelling

For my Live Web midterm, I used the Google Maps API, socket.io, Node.js, headtrackr.js, and PeerJS to build a new tool for people to share stories and experiences with each other.  Right now, two users log on – one to the “sender” page, and one to the “receiver” page.  The sender inputs a location in the text box and submits it.  On submission, the location is geocoded by Google, and the returned JSON object is parsed to get the lat/lon, which is used to set the Street View panorama.  The sender page also runs a head tracker, which…tracks the sender’s head.  Values from the head tracker are mapped to the heading and pitch of Street View, creating the effect of looking around within the panorama.  There is also live audio streaming between the two browser windows.
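The sender-page flow above can be sketched roughly as follows. The range constants in the head-tracker mapping and the function names are my own illustrative assumptions, not the project's actual code; the geocoding call uses the standard Google Maps JavaScript API `Geocoder`.

```javascript
// Map a value from one range onto another (head position -> point of view).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) * (outMax - outMin)) / (inMax - inMin);
}

// Convert raw head-tracker coordinates into a Street View point of view.
// The input/output ranges here are assumptions, chosen for illustration.
function headToPov(x, y) {
  return {
    heading: mapRange(x, -20, 20, -90, 90), // look left/right
    pitch: mapRange(y, -20, 20, -45, 45)    // look up/down
  };
}

// Geocode the submitted location and recenter the Street View panorama.
// Assumes the Google Maps JS API is loaded and `panorama` is a
// google.maps.StreetViewPanorama created elsewhere on the page.
function showLocation(address, panorama) {
  var geocoder = new google.maps.Geocoder();
  geocoder.geocode({ address: address }, function (results, status) {
    if (status === 'OK' && results.length > 0) {
      // Parse the lat/lon out of the returned JSON and move the panorama.
      panorama.setPosition(results[0].geometry.location);
    }
  });
}
```

With headtrackr.js, `headToPov` would be called from the head-tracking event listener, and the resulting POV applied with `panorama.setPov(...)` on each update.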


I see this as a tool to enhance people’s storytelling across long distances – instead of just talking about a place, or looking at a picture, they can log on and actually look around the location they are talking about while talking about it.   In the future, I think annotations (either temporary drawings, or longer-term markers left by people) could add another meaningful layer of interactivity.  There is a short video below, along with a code snippet.  The full code is on GitHub.
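On the server side, keeping the sender and receiver pages in sync mostly reduces to rebroadcasting events from one socket to the other. This is a minimal sketch of that relay logic with hypothetical event names (`location`, `pov`) – the real event names are in the GitHub repo. In a socket.io server it would be wired up with `io.on('connection', wireRelay)`.

```javascript
// Events that should be forwarded from one connected page to the other(s).
// These names are illustrative assumptions, not the project's actual events.
var RELAYED_EVENTS = ['location', 'pov'];

// Wire one connected socket so its incoming events are rebroadcast to
// every other connected socket. `socket` is expected to expose socket.io's
// `on(event, handler)` and `broadcast.emit(event, data)`.
function wireRelay(socket) {
  RELAYED_EVENTS.forEach(function (event) {
    socket.on(event, function (data) {
      socket.broadcast.emit(event, data);
    });
  });
}
```

Because the server never interprets the payloads, the same relay works for the geocoded location string and for the head-tracker POV updates; the live audio goes browser-to-browser over PeerJS rather than through this server.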

midtermDoc from John Farrell on Vimeo.
