Real-time search: The web at the speed of life

This is the presentation that I gave this week at the Nordic Supersearch 2010 conference in Oslo, organised by the Norwegian Institute of Journalism. In the presentation, I looked at the crush of information that people are dealing with: the 5 exabytes of information that Eric Schmidt of Google says we’re creating every two days.

I think search-based filters such as Google Realtime are only part of the answer. Many of the first-generation real-time search engines help filter the firehose of updates being pumped into Facebook and Twitter, but it is often difficult to understand the provenance of the information you’re looking at. More interestingly, I think we are now seeing new and better ways to filter for relevant information beyond the search box. Search has been the way people find information that is interesting and relevant, but I think real-time activity is providing new ways to deliver richer relevance.

I also agree with Mahendra Palsule that we’re moving from a numbers game to the challenge of delivering relevant information to audiences. In a lot of ways, simply driving traffic to a news site is not working. Often, as traffic increases, loyalty metrics decrease: bounce rates go up (the percentage of visitors who spend less than five seconds on your site), time on site goes down, and the number of single-page visits increases. It doesn’t have to be that way, but it is too often the case. For news organisations and other content producers, we need to find ways to increase loyalty and real engagement with our content and our journalists. I believe social media can increase engagement, and I also believe that finding better ways to deliver relevant content to audiences is key.

Google’s method of delivering relevance in the past was to determine the authority of content on the web by looking at the links to that content, but now we’re seeing other ways to filter for relevance. When you look at how these services filter content, we’re actually tapping into collective attention, whether that of our social networks or of networks of influence, as in the case of lists of influential Twitter users. In addition to attention, we’re also starting to see location-based networks filter based not only on what is happening in real time but also on what we’re doing in real space. We can deliver targeted advertising based on location, and for news organisations, there are huge opportunities to deliver highly targeted content.

Lastly, I think we’re finding new ways to capture mass activity by means of visualisation. Never before have we been able to tell a story in real-time as we can now. I gave the examples of the New York Times Twitter visualisation during the Super Bowl and also the UK Snow map.

I really do believe that with more content choices than the human brain can possibly cope with, intelligent filters delivering relevant information and services to people will be a huge opportunity. I think it’s one of the biggest challenges for news organisations: in the battle for attention, we have to constantly focus on relevance or become irrelevant. Certainly, any editor worth his or her salt knows (or thinks he or she knows) what his or her audience wants, but there are technology companies developing services that can help deliver a highly specialised stream of relevant information to people. As with so many issues in the 21st century, it won’t be technology or editorial strategies alone that will deliver relevance or sustainable businesses for news organisations; it will be the effective use of both.
