-
Kevin: Fredrikstad Blad in Norway (a town where I've done journalism training at the Norwegian Institute of Journalism) decided last year to dedicate half of its resources, both editorial and commercial, to the web, and it has doubled its online revenue. It's been difficult: the Mecom-owned newspaper is part of a larger effort by the group to cut costs and raise revenues, and the staff had already been reduced by 20% before this effort began.
links for 2010-09-21
-
Kevin: An article from 2009 that says that a glut of advertising space was one of the things depressing online display advertising during the recession. I wonder if things have changed as the recession eases. Tameka Kee wrote: "Even if the economy rebounds in 2009, it doesn’t look like the situation will improve because premium and mid-tier publishers are just creating too much content. When you add in the continuous stream of lower-quality user-generated content and social media inventory, the Journal says: 'The Web is likely heading for a shakeout on a scale unseen since the dot.com bust.'" I agree that a shake-out will happen, but like all such inflections, it's a matter of timing.
-
Kevin: Mathew Ingram at GigaOm has an interesting look at the failure of journalism startup NewsTilt: "One of the more glaringly obvious flaws in the company’s makeup is what appears to be a lack of interest in the problem NewsTilt was trying to solve. While the company had an idea of what it wanted to do for journalists — namely, to provide a platform for them to find an audience and theoretically build some kind of business around their content, similar to what True/Slant tried to do before being acquired by Forbes — neither of the founders had any background in journalism. Worse than that, Biggar admits that neither had much passion for the idea either; the startup evolved out of a plan to develop a better commenting system for newspapers."
Social Media Forum: My thoughts on the future of context
Next week, I'll be giving the keynote at the Social Media FORUM in Hamburg, and I've been asked to speak about the future of context. Bjoern Negelmann asked me a few questions via email about the subject, and he's kindly allowed me to cross-post the interview from the Social Web World blog.
1) Kevin, as an expert on new digital media strategies, you will be giving a talk on the "future of context" at the upcoming Social Media FORUM on Sept 28. Can you give three keywords that describe what we can expect from your talk?
Relevance, insight, value
2) Is "context" the key to turning around the misguided internet strategies of media companies? And if so, what is the explanation?
First I should say that, as much as everyone in the industry wishes otherwise, there are no silver bullets: no single solution will solve the problems that media companies are facing. The iPad won't save us. Paywalls won't save us. And simply finding ways to increase context won't, on its own, save us.
That being said, most current digital media strategies are fundamentally flawed. They are mostly based on the premise that the internet is just another distribution medium like radio, television and print, and they assume a media landscape of scarcity rather than one of abundance. These outdated assumptions are rooted in the era of mass media. In the 20th Century mass media model, when audiences had just a few sources of information and entertainment, success meant building the biggest audience possible and using paid content and advertising to make loads of money.
As Edward Roussel of the Telegraph said, the link between rising audience and higher returns held true until the spring of 2008. Then something happened. Yes, it was partly due to the recession, but it was also due to an oversupply of online advertising space. As Paid Content says, premium and mid-tier publishers are creating too much content, creating a surplus of inventory to run ads against. As in any market, if supply outstrips demand, prices come under downward pressure.
There are exceptions. With the online advertising recovery, The Daily Mail in the UK has been able to outgrow the competition and translate that into commercial success. Big still sometimes wins. There are still lucrative verticals such as business in which returns have stood up or actually grown during the recession. The Wall Street Journal, The Economist and The Financial Times are all enjoying success, partly due to increasing interest in business and finance due to the recession. However, most other publishers find themselves under severe pressure.
To change our fortunes, we first need to question the assumptions underlying 20th Century media business models. Until the 1980s, both audiences and advertisers had fewer choices and media owners could charge monopoly rents for advertising. But when the multi-channel world, whether broadcast or online, arrived, the media’s first reaction was to create more channels and content to try to take advantage of increased distribution opportunities. We’re now seeing the limits of such an approach as the law of diminishing returns takes hold.
Context is about adding value to content in ways that benefit audiences and advertisers. It makes it easier for audiences to find and make sense of relevant content, and it helps advertisers by providing opportunities for more highly targeted advertising. Adding context, rather than simply creating more content, means recognising that content is no longer scarce; audiences' time and attention are.
3) But this strategy means allocating resources to producing context. Isn't this at odds with the recent strategies of media companies, which are cutting costs because of the "lousy pennies" of online advertising?
While media companies, especially newspapers, have been cutting staff to cut costs, they have also been creating more content. Digital production techniques make this possible but, again, we're starting to reach the limits of that strategy. Basically, we have an oversupply of content driving an oversupply of digital advertising space, and markets have only one way of responding to a surplus: prices plummet.
The market is already flooded, and the last thing we need is more content. A study commissioned by the Associated Press (PDF) found that young audiences were switching off because they were lost in a deluge of episodic updates. The key conclusion was: "The subjects were overloaded with facts and updates and were having trouble moving more deeply into the background and resolution of news stories." In essence, the news industry is acting against its own economic interest by producing more content and exacerbating the problem of information overload. It's like trying to save a drowning man by giving him a glass of water.
We need a much more focused approach. Allocating resources to producing context around existing content while making strategic choices about what not to produce will create opportunities by adding value and creating differentiated products. Yes, we live in a world of flow, with constant streaming updates, but mining that flow for context and value-added information will be where sustainable business models are.
4) So if the weight shifts to "context", what are the formats and examples of this strategy?
Thomson Reuters has a service called Calais. It analyses thousands of mainstream media and non-traditional sources of information every day, and it powers services such as Zemanta, which lets bloggers and traditional journalists easily add context to articles in the form of images and links. As a platform, Thomson Reuters can sell Calais to enterprises to make sense of the data and information they create, but it's also a tool the company itself uses to algorithmically find meaning in the flow of information from traditional and non-traditional news organisations, e.g. spotting new companies to watch before they show up on the traditional news radar.
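To make this concrete, here is a minimal sketch of what calling an entity-extraction service like Calais looks like. The endpoint URL, header names and response fields are assumptions based on the public OpenCalais documentation of the time, so treat this as an illustration rather than a working client:

```python
import requests

CALAIS_URL = "http://api.opencalais.com/tag/rs/enrich"  # assumed endpoint
API_KEY = "YOUR_CALAIS_KEY"

def extract_entities(text):
    """POST raw text to Calais, return (name, type) pairs for detected entities."""
    response = requests.post(
        CALAIS_URL,
        data=text.encode("utf-8"),
        headers={
            "x-calais-licenseID": API_KEY,  # assumed header name
            "Content-Type": "text/raw",
            "Accept": "application/json",
        },
    )
    response.raise_for_status()
    payload = response.json()
    # Each top-level key describes an entity, relation or topic;
    # keep just the detected entities.
    return [
        (item.get("name"), item.get("_type"))
        for item in payload.values()
        if isinstance(item, dict) and item.get("_typeGroup") == "entities"
    ]

print(extract_entities("Barack Obama met BP executives in Washington."))
```

Services like Zemanta then match those extracted people, companies and places against archives of images, articles and links to suggest context as a journalist writes.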
One of my favourite examples right now is the Sunlight Foundation's Poligraft. Using public information about political contributions and a service like Calais, it reveals details about donors and major campaign contributions to members of Congress, quickly adding a layer of context to any story involving political leaders.
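The underlying mechanic is a simple join: once you know which politicians a story mentions, you can look them up in a contributions dataset. Here is a toy sketch with invented data; Poligraft itself draws on the Sunlight Foundation's real datasets and an entity extractor like Calais:

```python
# Invented campaign-finance data, purely for illustration.
CONTRIBUTIONS = {
    "Jane Doe": [("Acme Corp PAC", 50000), ("Widgets Inc", 12000)],
}

def annotate(story):
    """Return a context note for each known politician the story mentions."""
    notes = []
    for name, donors in CONTRIBUTIONS.items():
        if name in story:
            summary = ", ".join(f"{donor} (${amount:,})" for donor, amount in donors)
            notes.append(f"{name} -- major contributors: {summary}")
    return notes

story = "Rep. Jane Doe voted against the widget tariff on Tuesday."
print(annotate(story))
# ['Jane Doe -- major contributors: Acme Corp PAC ($50,000), Widgets Inc ($12,000)']
```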
The Guardian is achieving some great things with its Datablog and Datastore. Data is a key part of many stories that journalists write every day, but in the past the only thing we did with those numbers was highlight a few. Now the Datablog not only allows everyone to see the full set of numbers; by hosting them on Google Docs for others to download, it lets people with skills in data visualisation present those numbers in new and creative ways. The Guardian has a Flickr group where it highlights their work.
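Because the datasets sit in ordinary Google spreadsheets, pulling them into an analysis tool takes only a couple of lines. A minimal sketch, assuming a public sheet and the standard CSV export URL (the sheet key is a placeholder):

```python
import pandas as pd

SHEET_KEY = "YOUR_SHEET_KEY"  # placeholder; each Datablog post links its dataset
url = f"https://docs.google.com/spreadsheets/d/{SHEET_KEY}/export?format=csv"

df = pd.read_csv(url)   # public sheets need no authentication for CSV export
print(df.describe())    # quick numerical summary before visualising
```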
The BBC provided another great example during the World Cup this year. They called it dynamic semantic publishing, and it took the official FIFA statistics and dynamically created a rich store of information about players, teams and groups. Not only was it a rich presentation of the facts around the World Cup, it also helped the audience discover BBC coverage of their favourite teams and players.
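A rough sketch of the principle using rdflib and a made-up vocabulary. The BBC's actual ontology and pipeline were far richer, but the core idea is to store the facts as triples and assemble pages by querying them:

```python
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/worldcup/")  # stand-in vocabulary
g = Graph()

# Model the statistics as triples: a player plays for a team,
# and a team belongs to a group.
g.add((EX.messi, RDF.type, EX.Player))
g.add((EX.messi, EX.playsFor, EX.argentina))
g.add((EX.argentina, RDF.type, EX.Team))
g.add((EX.argentina, EX.inGroup, EX.groupB))

# A page about Group B can now be assembled dynamically by querying
# for everything linked to it.
query = "SELECT ?player WHERE { ?player ex:playsFor ?team . ?team ex:inGroup ex:groupB }"
for row in g.query(query, initNs={"ex": EX}):
    print(row.player)
```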
5) If you look ahead to the future – what kind of media companies will be able to adapt to this strategy?
The kind of companies that have been able to adapt to this strategy are ones that see beyond traditional containers of content. For news, they realise that the written story is no longer the atomic unit, the indivisible unit, of journalism. There is data and context within the story, context that can be linked and used to draw connections between seemingly unrelated events in our increasingly complex world. Context is not just about adding value to pieces of content; it also makes content easier to organise and gives audiences new ways to find and discover what is relevant and interesting to them.
links for 2010-09-18
-
Kevin: Some great examples of practical data and mapping applications for the 2010 Afghan elections. TileMill allows different sets of data to be overlaid on top of each other, such as fraud measurements over maps of election-related violence. Excellent examples of getting more information out of mapping and datasets.
-
Kevin: Scott Rosenberg has an excellent guide to how to dig into a website and find out who owns it. It's a great primer for web journalists in some basic investigative techniques, so you know who's behind what you're reading online.
-
Kevin: Frederic Lardinois questions the numbers behind McDonald's claims that Foursquare helped it increase foot traffic by 33%. The numbers don't add up: check-ins increased by 33%, but most likely not foot traffic. Frankly, the check-ins might have started from a very low base, accounting for the double-digit increase.
links for 2010-09-17
-
Kevin: Domino's UK says that its social media campaign using Facebook and Foursquare was key in driving profits 29% higher. On Foursquare, they have what they call a superfans programme. CEO Chris Moore said that the "web-based activities offer a dual benefit of driving pizza sales online and building customer loyalty."
-
Kevin: With a spend of only $1,000, McDonald's in the US says that a Foursquare campaign was able to increase foot traffic to its stores by 33%. They didn't measure increased sales, only traffic. However, there are other examples that did measure sales, such as Domino's UK, which attributed a 29% increase in profits to its social media and Foursquare campaign.
links for 2010-09-15
-
Kevin: Dean Starkman writes about the dramatic increase in content production in the US at the same time as the journalism industry has cut 15,000 jobs. He writes about the dramatic rise in output across newspapers, television and the web. More needs to be written about this; I think Starkman has written an important piece. The question I have is whether companies will have the courage to take a hard look at their output and take measured steps to reverse the forward creep that values quantity over quality. Speed is good, but it shouldn't be the only thing driving journalism in the digital era.
links for 2010-09-13
-
Kevin: An excellent look at the law concerning internet news aggregators, and at the different types of aggregators.
links for 2010-09-08
-
Kevin: Google is definitely starting to find some clever ways to drive HTML5 uptake. One way is by helping to develop interesting interactivity using the emerging web standard. In this case, they have worked with Arcade Fire to develop a video that pulls in your location and "mashes up the film with Google Maps and Street View". The video also allows you to write a postcard to "your younger self". This is driving new levels of real interactivity, and it will be a great time for storytellers.
-
Kevin: Mathew Ingram looks at investments in companies trying to build a business in the Twitter ecosystem. As Twitter itself seeks a business model, it's making it more difficult for other companies to build their businesses off of providing Twitter services. It's a fine line. Twitter wouldn't have experienced such growth if it hadn't been for the ecosystem that developed around it, but Twitter also needs to find a sustainable business model or the heart of the ecosystem will die.
-
Kevin: Sarah Perez writes at ReadWriteWeb: "According to ABI Research's Neil Strother, check-in apps may raise privacy concerns among some users today, but those issues can be overcome by offering consumers deals, discounts and rewards. The "value-exchange" of receiving these rewards will be high enough that consumers won't mind giving up privacy in order to take advantage of the benefits."
-
Kevin: The BBC uses Ushahidi's new cloud-based service, Crowdmap, to map reports surrounding the London tube strike in September 2010. It should have had a filter not just for the form of transport affected, but also for whether a report described a problem or an alternative route to avoid congestion or suspended service.
The social side of citizen science
I spent last Thursday and Friday at the Citizen Cyberscience Summit, listening to a series of presentations about how the public are collaborating with scientists to achieve together what neither group can do alone. It was a fascinating couple of days, illustrating the vast variety of projects currently running or in the pipeline. We've all heard of SETI@home, but there are now projects across a diverse set of disciplines, from botany to history, astronomy, meteorology, particle physics, seismology and beyond.
What was notable, however, was that the majority of the projects were about volunteers donating CPU cycles rather than brain cycles. Where communities were mentioned it was generally in passing, and when community tools were mentioned they were almost invariably forums/bulletin boards.
I had hoped to hear more from the different projects about community churn, retention tactics, development tactics, social tools and other such things, but was not totally surprised to see that most presentations focused on the science instead. There was a discussion session scheduled for Friday evening to talk some of these issues through, but I sadly couldn't stay for it. Nevertheless, I think the social and community aspects should have been discussed throughout the two days.
It is obvious that there is tremendous overlap of interests between the citizen science community and the social collaboration community, and there are lessons both parties could learn from each other. I’d love to see some sort of round-table organised that brought the two communities together to discuss some of the issues that citizen science faces. In lieu of that, here are a few ideas to hopefully get an online discussion going.
The forum is not the only tool
I don’t think it’s a surprise that those projects which do have a community component tend towards having a forum of some sort. They’ve been around for ages and for many people they are the default discussion tool. However, we’ve come a long way since the forum was invented and there are many social tools that are more suited to certain types of tasks.
Wikis, for example, are much better for collecting static (or slowly evolving) information such as help pages. Blogs are good for ongoing updates and discussion around them. UserVoice is great for gathering feedback on your website or software. A community is a multi-faceted thing so often needs more than just one tool.
Facebook is not a panacea
During lunch on Friday I did get to talk to some of the other attendees about social media. Facebook, of course, came up. Whilst Facebook is a massive social network, one has to be very careful how one uses it, otherwise it can be a massive waste of time. Facebook Causes, for example, was said by the Washington Post to have raised money for only a tiny percentage of the nonprofits that used it. I myself have seen how Facebook encourages 'clicktivism' – the aimless joining of a group or cause that isn't followed up by any meaningful action.
Facebook as a platform, however, is a more interesting proposition. Facebook Connect allows users to log in to your site using Facebook and lets your site post updates for the user to their wall. And Facebook apps may allow citizen science to be done actually on Facebook rather than requiring users to go to another site. In this way, Facebook shows promise, but starting a group or a page and hoping that people will just go off and recruit users to your project is unlikely to be successful.
Twitter is a network of networks
Where Facebook is sitting in the kitchen being introspective over a can of cider, Twitter is the extrovert at the party. Although Facebook has more users (~500m), Twitter is now at ~150m users and growing at 300k per day. More to the point, however, Twitter is easy to use, more open, and Tweets that go viral really do go viral because it’s not just your network you’re reaching, but a network of networks. The potential value for recruitment and retention is huge, if you do it right.
Design apps to be social from the beginning
If you're creating software for users to download and run, think about how you could make that software social. The social aspects of your project don't need to be managed exclusively on a separate website or in third-party software. If it makes sense for what you are doing, build in sociability.
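For instance, a volunteer-computing client could let users share milestones with their networks straight from the app. A minimal sketch using the tweepy library (the credentials and message are placeholders, and Twitter's API details have changed over time, so check the current documentation):

```python
import tweepy

# Placeholder credentials from the project's registered Twitter application.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

def share_milestone(work_units):
    """Post a milestone to the volunteer's own network, from inside the app."""
    api.update_status(
        f"I've just crunched my {work_units}th work unit for this project!"
    )

share_milestone(1000)
```

The design point is that the share action lives in the client itself, rather than on a separate website the volunteer may never visit.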
Most of these tools are free
I’m guessing that most citizen science projects have little funding. Where social media is concerned, the good news is that the vast majority of key tools are free. The not-so-good news is that you do need to understand how to use them, which could take some investment in terms of training and consulting, and you need time to maintain your online presence. A good consultant will help you understand how to work social media into your work life so that it doesn’t become a drain on resources, but you must have some time to commit to it.
This is where JISC and other funding bodies could really help: by allocating specific funds to raising awareness of social tools in the science community, providing training, ensuring that projects can afford to work with outside social media consultants, and even by helping project leaders understand how to find a good social media consultant (sadly, there are lots of carpetbaggers).
The opportunity afforded to citizen science by social media is enormous, regardless of whether a project is focused on CPU time or more human-scale tasks. Now let’s start talking about how to realise that potential!
Real-time search: The web at the speed of life
This is the presentation that I gave this week at the Nordic Supersearch 2010 conference in Oslo, organised by the Norwegian Institute of Journalism. In the presentation, I looked at the crush of information that people are dealing with: the five exabytes of information that Eric Schmidt of Google says we're creating every two days.
I think search-based filters such as Google Realtime are only part of the answer. Many of the first-generation real-time search engines help filter the firehose of updates being pumped into Facebook and Twitter, but it's often difficult to understand the provenance of the information you're looking at. More interestingly, I think we are now seeing new and better ways to filter for relevant information beyond the search box. Search has been the way for people to find information that is interesting and relevant, but I think real-time activity is providing new ways to deliver richer relevance.
I also agree with Mahendra Palsule that we're moving from a numbers game to the challenge of delivering relevant information to audiences. In a lot of ways, simply driving traffic to a news site is not working. Often, as traffic increases, loyalty metrics decrease. Bounce rates go up. (Bounce rates are the percentage of visitors who spend less than 5 seconds on your site.) Time on site goes down. The number of single-page visits increases. It doesn't have to be that way, but it is too often the case. For news organisations and other content producers, we need to find ways to increase loyalty and real engagement with our content and our journalists. I believe more social media can increase engagement, and I also believe that finding better ways to deliver relevant content to audiences is key.
Google's method of delivering relevance in the past was to determine the authority of content on the web by looking at the links to that content, but now we're seeing other ways to filter for relevance. When you look at how services such as paper.li filter content, we're actually tapping into the collective attention of our social networks or, in the case of lists of influential Twitter users, networks of influence. In addition to attention, we're also starting to see location-based networks filter based not only on what is happening in real-time but also on what we're doing in real-space. We can deliver targeted advertising based on location and, for news organisations, there are huge opportunities to deliver highly targeted content.
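To illustrate, a relevance filter of this kind only needs to combine a handful of signals. Here is a toy scorer mixing attention from your own network with physical proximity; the data structures, weights and distance conversion are invented for the example:

```python
import math

def relevance(item, user):
    """Score an item by who in the user's network shared it and how close it is."""
    # Attention: how many people the user follows have shared this item.
    attention = len(set(item["shared_by"]) & set(user["follows"]))

    # Proximity: decay with rough distance in km (1 degree ~ 111 km).
    distance = math.hypot(item["lat"] - user["lat"],
                          item["lon"] - user["lon"]) * 111
    proximity = 1.0 / (1.0 + distance)

    return 0.7 * attention + 0.3 * proximity  # arbitrary weights

story = {"shared_by": ["ana", "bob", "cara"], "lat": 51.50, "lon": -0.12}
reader = {"follows": ["bob", "cara", "dan"], "lat": 51.51, "lon": -0.13}
print(relevance(story, reader))  # higher score = more relevant to this reader
```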
Lastly, I think we’re finding new ways to capture mass activity by means of visualisation. Never before have we been able to tell a story in real-time as we can now. I gave the examples of the New York Times Twitter visualisation during the Super Bowl and also the UK Snow map.
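The UK Snow map works because the reports follow a loose convention: a tweet like "#uksnow EH3 8/10" carries a rough location and a snowfall score. A simplified sketch of turning that stream into mappable data points (the real conventions and postcode formats varied more than this regex allows):

```python
import re

# Match tweets of the (simplified) form "#uksnow EH3 8/10".
UKSNOW = re.compile(r"#uksnow\s+([A-Z]{1,2}\d{1,2})\s+(\d{1,2})/10", re.I)

def parse_report(tweet):
    """Extract (postcode district, score) from a snow-report tweet, if present."""
    match = UKSNOW.search(tweet)
    if match:
        return match.group(1).upper(), int(match.group(2))
    return None

print(parse_report("Proper blizzard out there! #uksnow EH3 8/10"))
# ('EH3', 8) -- geocode the postcode district and you have a live point to map
```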
I really do believe that, with more content choices than the human brain can possibly cope with, intelligent filters delivering relevant information and services to people will be a huge opportunity. One of the biggest challenges for news organisations is that, in the battle for attention, we have to stay constantly focused on relevance or become irrelevant. Certainly, any editor worth his or her salt knows (or thinks he or she knows) what the audience wants, but technology companies are developing services that can deliver a highly specialised stream of relevant information to people. As with so many issues in the 21st Century, it won't be technology or editorial strategies alone that deliver relevance or sustainable businesses for news organisations; it will be the effective use of both.