Instead of creating content that gets pushed from one screen to another, brands are creating storylines that simultaneously play out across multiple devices. The strategic opportunity lies in creating content that leverages features native to each device and explores the interaction between multiple devices.
In today’s fragmented media landscape, a multidevice strategy can create additive experiences for consumers. Second-screen experiences gave us our first glimpse of content created for multiple devices, pushing supplementary bits of content to the smaller screens sitting in our laps. For the most part, these second screens have played a supporting role, offering extra information or input mechanisms for viewer feedback. American Idol streamlined this process by partnering with Google, integrating the show’s voting feature directly into the search giant’s nearly ubiquitous search results.
Moving forward, “second screens” will likely shift in priority, pulling some emphasis away from the big screen and redirecting it toward hand-held devices. Univision, the Spanish-language media company, for instance, is enhancing its shows’ plot lines by letting viewers choose to “follow” specific characters. Viewers then receive text messages and small pieces of information in line with the plot. This additional content is written by each show’s screenwriters and can weave sponsored products into the conversation or deliver coupons through the app.
Disney released an iPad app in conjunction with the re-release of The Little Mermaid. Screened in 16 theaters across the U.S., “Second Screen Live” lets viewers sync their iPads with the film to interact with on-screen action, play games against fellow moviegoers, and join in sing-alongs. The app makes the audience feel like an integral part of the story rather than a spectator merely watching it unfold.
Rather than serving as a secondary destination, The Little Mermaid app hints at how brands can leverage the native features of mobile devices to enhance the plot of a movie or TV show. Google’s “Spotlight Stories” demonstrates how features native to the phone, like tilting and flipping, can add value to mobile storytelling. As Moto X users move their screens, the story changes to match their perspective, which, in the words of Motorola’s Advanced Research and Technologies group, explores “what it means to have an experiential device.” Similarly, the Nike SB App leverages a phone’s accelerometer to let users view skating tricks in multi-angle videos. The app also helps skaters around the world connect with and compete against each other. In less than a year, the app has been downloaded more than 340,000 times, skateboarders have uploaded more than 43,000 videos of their own tricks, and over 8,500 games of S.K.A.T.E. have been played.
Other brands are exploring how multiple devices can interact and sync to create new ways for consumers to experience a story. Arcade Fire released “Just a Reflektor,” an interactive film that lets viewers sync their smartphones to their computers and control the video’s visual effects—not by moving a mouse on a screen, but by moving a phone or tablet through the physical space around them. Arcade Fire won the Webby and People’s Voice Award for “Just a Reflektor” in the NetArt Websites category and a Webby in the Best Music category.