New York Times R&D Group: Newspaper 2.0 from Nieman Journalism Lab on Vimeo.
Suw and I visited the New York Times R&D lab last August when we were in New York. It was an impromptu visit. A friend, Jason Brush at Schematic, put us in touch with Nick Bilton after seeing from our Twitter status updates that we were in New York. (Yet another example of how useful Twitter is.) Nick was kind enough to work around our hectic schedule, and Suw and I were both happy to be able to fit the visit in before we had to dash for the airport. Nick showed us his table of devices, including the One Laptop per Child, various e-book readers and the odd netbook.
The Nieman Journalism Lab at Harvard University is running an excellent series of interviews with Nick. It’s definitely worth watching the videos or reading the transcripts.
Nick not only showed us their collection of devices, which helps people at the Times understand how their audience might view the site, listen to their podcasts and watch their video, he also showed us some of their projects. One that really impressed us was a print-on-demand, customised version of the newspaper. However, this isn't your father's PDF-to-print. It was much more advanced, marrying effortless personalisation to a forward-looking mobile strategy.

The system works like this: users have a card, similar to the Oyster cards used on the London Underground, that is linked to their account at the NYTimes. Based on the stories you read on the site, it learns what your interests are, adding personalisation without the cumbersome box-ticking that has caused most first-generation customisation services to fail. Research shows that people say they want customised services, but they will rarely jump through the hoops of ticking boxes to tell news sites what they want to read. This is not only customisation; it also changes with users' habits instead of being a static set of preferences.

After the user swipes the card, they are presented with the top three sections of the site based on their reading habits. They can choose a version with the top story in full from each of those sections, or a digest of those sections, similar to an RSS feed view. After each story, there is also a QR code, or semacode. Your mobile phone camera translates these QR codes into URLs, taking you to the full story in your phone's web browser.
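To make the idea concrete, here is a minimal sketch of the implicit personalisation described above: sections are ranked by how often a user actually reads them, with no preference boxes to tick, and each printed story gets a URL that a QR code would encode. All names, the URL scheme and the sample data are my own illustrations, not the Times' actual system.

```python
from collections import Counter

# Hypothetical reading history: the section of each story a user has read.
reading_history = [
    "Technology", "World", "Technology", "Sports",
    "Technology", "World", "Business", "World", "Sports",
]

def top_sections(history, n=3):
    """Rank sections by how often the user reads them -- implicit
    personalisation that adapts as habits change, no box-ticking."""
    return [section for section, _ in Counter(history).most_common(n)]

def story_qr_url(story_id):
    """The URL a QR code printed after each story would encode (illustrative)."""
    return f"https://www.nytimes.com/story/{story_id}"

print(top_sections(reading_history))  # ['Technology', 'World', 'Sports']
```

The point of the sketch is that the "preference model" is just the reading log itself: swipe the card, count the log, print the top three sections.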
Nick also showed us something that the R&D team first came up with at a Hack Day in London: the idea of content following a reader throughout the day. They built some of those ideas into shifd.com, which is a working site if you want to have a play.
The thinking behind shifd.com is the realisation that as we go through our days we shift from device to device, from form factor to form factor. Content that is relevant or accessible on one platform might not be appropriate on another. A reader might begin a story on their computer before leaving for work and want to continue reading it on their mobile phone on the train. They might not want to watch a video associated with that story until they get home, so they can mark it for later viewing on their flat-screen TV. This is the kind of user-centred thinking needed to adapt to news consumption as it is, instead of asking readers to modify their behaviour to fit our platforms and business models.
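The core of that shifting idea can be sketched as a small queue: mark a piece of content on one device, then pick up only the items appropriate to whatever device you are holding. This is just my illustration in the spirit of shifd.com; the class, field names and URLs are assumptions, not the actual service's API.

```python
from dataclasses import dataclass, field

@dataclass
class ShiftQueue:
    """Toy model of content following a reader across devices."""
    items: list = field(default_factory=list)

    def shift(self, url, kind, target_device):
        # Mark a story or video to be continued on another device.
        self.items.append({"url": url, "kind": kind, "target": target_device})

    def queue_for(self, device):
        # On pickup, show only the content marked for this form factor.
        return [item for item in self.items if item["target"] == device]

q = ShiftQueue()
q.shift("https://example.com/story", "article", "phone")  # finish on the train
q.shift("https://example.com/video", "video", "tv")       # watch at home
print([item["kind"] for item in q.queue_for("tv")])  # ['video']
```

The design choice worth noticing is that the queue is keyed by the reader's next device, not by the publisher's channel: the content adapts to the reader's day rather than the other way round.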
Nick and the rest of the team at the New York Times R&D lab are doing some great work that I hope drives thinking in the rest of the industry. I think it’s also an opportunity for cross-disciplinary academic research. How do we surround our audience with our content, delivering relevant information to the relevant devices as they move through their day? That’s a service I’d pay for.