Two projects to watch: Ben Franklin Project and TBD.com

TBD.com's Near You zip code-based news filter

At 4:28 am in Washington DC, a new news site, TBD.com, launched, and it is definitely one worth watching. Why? They have assembled an all-star staff, brimming with passion. The general manager for the project is Jim Brady, the former executive editor and vice president of Washingtonpost.Newsweek Interactive. Steve Buttry, the site’s head of community engagement, has a long history in traditional journalism, training and innovation. (For any journalist struggling to come to terms with the unrequited love you feel for the business, read this post by Mimi Johnson, Steve’s wife, written as he left the newspaper business to go all digital at TBD.) They have some great staff whom I have ‘met’ via Twitter, including networked journalists Daniel Victor and Jeff Sonderman.

When he was hired, Jeff described his job as a community host this way:

developing ways to work with bloggers and users to generate, share and discuss content.

He described TBD.com like this:

Our goal is to build an online news site for the DC metro area, and do it taking full advantage of how the web works — with partnership not competition, users not readers, conversation not dictation, linking not duplicating.

A glance at Twitter this morning shows that Jeff and Steve are very busy on their first full day as hosts for the new news service.

Digitally native at launch

The site is clean and clear, easy to navigate, with a lot of excellent touches. TBD.com launched with an Android app and is awaiting approval for its iPhone application. Their zip (post) code news filter, which surfaces content not only from TBD but also from bloggers in the area, is excellent. I lived in Washington from 1998 until 2005 as the Washington correspondent of BBCNews.com, so I know the city well. I typed in my old home zip code, 20010, and got news about Mount Pleasant, including from a blog called The 42 Bus, named after the bus that I used to take to work every day. Their live traffic information is a template for how city sites should add value with such bread-and-butter news. You can quickly pull up a map showing traffic choke points in the area. They even have a tool to plot your best travel route. The traffic tools are pulled from existing services, but the value is in the package.
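To go back to the Near You filter for a moment: mechanically, a feature like this is a lookup from a reader’s zip code to stories tagged with that code, whoever produced them. Here is a minimal sketch of the idea in Python; this is my illustration, not TBD.com’s actual implementation, and the stories, sources and tags are hypothetical.

```python
# Minimal sketch of a "Near You"-style zip code filter.
# Hypothetical data; not TBD.com's actual implementation.
from dataclasses import dataclass, field

@dataclass
class Story:
    title: str
    source: str                           # a staff desk or a local blog
    zip_codes: list[str] = field(default_factory=list)

STORIES = [
    Story("Road closure on Mount Pleasant Street", "TBD", ["20010"]),
    Story("New mural along the 42 Bus route", "The 42 Bus", ["20010"]),
    Story("Council budget vote", "TBD", ["20001", "20002"]),
]

def near_you(zip_code: str, stories: list[Story]) -> list[Story]:
    """Return staff and blogger stories tagged with the reader's zip code."""
    return [s for s in stories if zip_code in s.zip_codes]

for story in near_you("20010", STORIES):
    print(f"{story.source}: {story.title}")
```

The editorial value is in the tagging, deciding which zip codes a story actually serves; the lookup itself is trivial.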

They had a launch event last week at which they explained their networked journalism strategy. Steve Myers at the Poynter Institute reported that half of the links at TBD.com would point to external sources, a much higher proportion than at most sites, and that at launch 127 local bloggers had joined the network. He had this quote from Steve Buttry about their linking strategy:

“If we’re competing on the same story, we’ll do our story and we’ll link to yours,” said Steve Buttry, director of community engagement for the site. If another source owns a big story, “we’ll play you at the top of the home page and we’ll cover something else with our staff resources.”

Wow. Personally, I think that this is smart. With resources declining at most news organisations, they have to be much more strategic about how they use their staff. They need to focus on the value that they add. Jeff Jarvis says: “Cover what you do best and link to the rest”, and this is one of the highest-profile tests of that strategy.

Ken Doctor, the brilliant news industry analyst at Newsonomics, has 10 reasons to watch TBD.com. Harvard’s Nieman Journalism Lab has another six reasons why it is watching the launch. Of Ken’s list, I’ll highlight two. First, bucking the trend of many new high-profile news projects in the US, this is a for-profit business. Second, Ken’s seventh point is huge:

7) It’s got a big established sales force to get it going. Both TV stations’ salespeople [come] with accounts — and relationships. So TBD is an extension of that sales activity, not a start-up ad sell, which bedevils many other start-ups.

The other thing that TBD.com has going for it is the commitment of someone who has already seen some success with new models: Robert Allbritton. A few years ago, he launched Politico.com, bringing in two high-profile veterans from the Washington Post to compete not only with their newspaper but also with specialist political outlets like Roll Call. Politico has managed to create a successful print-web product, “not profitable every quarter but says it’s turning a profit for any given six months,” Allbritton told paidContent.org. What is more important, though, is his commitment to his ventures. He’s got the money and the will to support projects past the short term.

“The first year of Politico was pretty ugly in terms of revenue,” he admitted. “You’ve got to have some staying power for these things to work.”

The Ben Franklin Project

The other project that I’m watching is John Paton’s Ben Franklin Project at the Journal Register Company. What is it?

The Journal Register Company’s Ben Franklin Project is an opportunity to re-imagine the newsgathering process with the focus on Digital First and Print Last. Using only free tools found on the Internet, the project will – from assigning to editing – create, publish and distribute news content on both the web and in print.

Succinctly, this company is looking to disrupt its own business. Instead of attacking costs by cutting more staff, they are looking to cut costs by eliminating the expense of their own production systems, using free tools. It’s not something that every organisation could do, and with 18 daily newspapers and 150 non-daily local publications, this is not a tiny organisation; the scale shows the ambition of the project.

In practice, the organisation set the goal for all 18 of its newspapers to publish online and in print using free online and open-source tools, such as the Scribus desktop publishing application. They are also pursuing the same kind of community engagement and networked journalism strategy that is at the heart of TBD.com.

On 4 July 2010, Independence Day in the US, they published their 18 daily newspapers and websites using only free tools and crowdsourced journalism. Jon Cooper, Vice President of Content at the Journal Register Company, wrote:

Today — July 4, 2010 — marks not only Journal Register Company’s independence from the costly proprietary systems that have long restricted newspapers and news companies alike. Today also marks the start of a revolution. Today marks the beginning of a new path for media companies whose employees are willing to shape their own future.

This is just part of Paton’s turnaround strategy for the Journal Register Company. Despite 2010 proving to be another tough year for the US economy (especially in some of the areas the company covers), Paton has just announced that the company is 15% ahead of its revenue goals. He said:

Our goal is to pay out an extra week’s pay this year to all employees for hitting our annual target of $40 Million.

That is an amazing investment in journalists and an incentive for them to embrace the disruptive change he is advocating. It is heartening to see journalists engaged in and benefiting from change in the industry.

With all the talk about innovation in journalism, it is rare to see projects launch with such clear ambitions. After a lot of talk in the industry, we’ll now see what is possible.

links for 2010-08-09

  • Kevin: A comparison of different local news strategies in the Boston area. They look at AOL’s Patch, part of the internet provider’s efforts to remake itself as a digital content company; each Patch site has one full-time editor who also writes and shoots video. They compare this with the local newspaper chain GateHouse, whose Wicked Local sites benefit from coverage by its more than 100 community newspapers. Both Patch and Wicked Local also combine aggregation, highlighting local bloggers’ content. It’s too early to declare one model the winner, but it is worth knowing the different models in play.

APIs helping journalism “scale up”

A couple of days ago, I quoted AOL CEO Tim Armstrong on developing tools to help journalists “scale up” what they do. In a post on Poynter’s E-Media Tidbits, Megan Garber has highlighted a good practical example of what I meant.

One way that computers and other technology can help journalists work more efficiently is by cutting down on or eliminating frequent, repetitive tasks. Derek Willis at the New York Times talks about APIs, which he describes as “just a Web application delivering data”. Derek says:

The flexibility and convenience that the APIs provide make it easier to cut down on repetitive manual work and bring new ideas to fruition. Other news organizations can do the same.

Derek also points out how savvy use of data is not just good for data visualisations and infographics; it is also an excellent resource for the New York Times’ own journalists.

So if you have a big local election coming up, having an API for candidate summary data makes it easier to do a quick-and-dirty internal site for reporters and editors to browse, but also gives graphics folks a way to pull in the latest data without having to ask for a spreadsheet.

And as he said, the biggest consumer of New York Times APIs is the New York Times itself.
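To make that concrete, here is a hedged sketch of the pattern Derek describes: a small script that pulls candidate summaries from an election API for a quick internal view. The endpoint, field names and response shape are hypothetical illustrations, not the New York Times’ actual API.

```python
# Sketch of pulling candidate summary data from a (hypothetical) election API
# instead of waiting for someone to email a spreadsheet.
import json
import urllib.request

API_URL = "https://api.example-newsroom.org/elections/2010/candidates.json"  # hypothetical

def fetch_candidates(url: str = API_URL) -> list[dict]:
    """Fetch the latest candidate summaries as a list of dicts."""
    with urllib.request.urlopen(url) as response:
        return json.load(response)["candidates"]

def newsroom_summary(candidates: list[dict]) -> None:
    """Quick-and-dirty console view for reporters and editors."""
    for c in sorted(candidates, key=lambda c: c["total_raised"], reverse=True):
        print(f"{c['name']:<25} {c['party']:<12} ${c['total_raised']:,.0f} raised")

if __name__ == "__main__":
    newsroom_summary(fetch_candidates())
```

The same endpoint could feed an internal browsing site for reporters and the graphics desk’s visualisations alike, which is exactly the dual benefit Derek describes.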

Projects such as building an API can be quite large (although new companies, as well as organisations like the Sunlight Foundation in the US and MySociety in the UK, have great public service APIs and data projects), but the benefits to audiences, designers, developers and journalists alike make the time and effort easier to justify.

Opportunities from the data deluge

There are huge opportunities for journalism in data. However, taking advantage of these opportunities will require not only a major rethinking of the editorial and commercial strategies that underpin current journalism organisations but also a major retooling. Apart from a few business news organisations such as Dow Jones, The Economist and Thomson Reuters, there really aren’t that many general interest news organisations that have this competency. Most smaller organisations won’t be able to afford it at an individual level, but that leaves room for a number of companies to provide services for this space.

Neil Perkin outlines the challenge and the opportunity in a wonderful column that he’s cross-posted from Marketing Week. (Tip of the blogging hat to Adam Tinworth, who flagged this up on Twitter and on his blog.) In our advanced information economies, we’re generating exabytes of data. While we’re just getting used to terabyte disk drives, this is an exabyte:

1 EB = 1,000,000,000,000,000,000 B = 10^18 bytes = 1 billion gigabytes = 1 million terabytes

To put this in perspective, I’ll use an oft-quoted practical example from Caltech researcher Roy Williams. All the words ever spoken by human beings could be stored in about 5 exabytes. Neil quotes Google CEO Eric Schmidt to show the challenge (and opportunity) that the data deluge is creating:

Between the dawn of civilisation and 2003, five exabytes of information were created. In the last two days, five exabytes of information have been created, and that rate is accelerating.
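To give those figures some texture, here is a quick back-of-the-envelope calculation using only the numbers in Schmidt’s quote, taken as stated rather than verified:

```python
# Back-of-the-envelope arithmetic on the figures quoted above.
EXABYTE = 10**18  # bytes

historic_total = 5 * EXABYTE   # dawn of civilisation to 2003, per Schmidt
two_day_total = 5 * EXABYTE    # created every two days now, per Schmidt

seconds_in_two_days = 2 * 24 * 60 * 60
rate = two_day_total / seconds_in_two_days   # bytes per second

print(f"{rate / 10**12:.0f} terabytes per second")  # prints: 29 terabytes per second
```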

All the words spoken since the dawn of language fitting in 5 exabytes, the same amount of information now created every two days, helps illustrate the acceleration of information creation. Those mind-melting numbers wash over most people, especially in our arithmophobic societies. However, there is a huge opportunity here, which Neil puts this way:

The upside of the data explosion is that the more of it there is, the better digital based services can get at delivering personal value.

And journalists can and definitely should play a role in helping make sense of this. However, we’re going to have to overcome not only the tyranny of chronology but also the tyranny of narrative, especially narratives that privilege anecdote over data. Too often, to sell stories, we focus on outliers because they shock, not because they are in any way representative of reality.

From a process point of view, journalists are going to need to start getting smarter about data. I think data crunching services will be one way that journalism organisations can subsidise the public service mission that they fulfil, but as I have said, it’s a capacity that will need to be built up.

Helping journalists ‘scale up what they do’

It’s not just raw data-crunching that needs to improve; we’re also starting to see a lot of early semantic tools that will help more traditional narrative-driven journalists do their jobs. In talking about how he wanted to help journalists at AOL overcome their technophobia, CEO Tim Armstrong explained why these tools were necessary: journalists have not been included in corporate technology upgrades (and often not in the creation of tools for their work). Armstrong said at a conference in June:

Journalists I met were often the only people in the room who never had access to a lot of info, except what they already knew.

It’s not technology for technology’s sake but tools to open up more information and help journalists make sense of it. Other industries have long implemented data tools to help people do their jobs, but such tools are rare in journalism (outside of computer-assisted reporting and database journalism circles). Armstrong said:

You can pretty much go to any professional industry, and there’s some piece of data system that helps people scale what they do.

Journalists are being asked to do more with less as cuts go deep in newsrooms, and we’re going to have to work smarter because I know that some journalists are now working at breaking point.

There have been times in the last few years when I tested the limits of my endurance. Last summer, filling in behind my colleague Jemima Kiss, I was working from 7 am until 11 pm five days a week, and then usually five or six hours at the weekend. I could do it because it was a limited 10-week assignment, but even for those 10 weeks it cut into the time I had with my wife and negatively affected my health.

I’m doing a lot of thinking about services that can help journalists deal with masses of information and also help audiences more easily put stories into context. We’re going to need new tools and techniques for this new period in the age of information. The opportunities are there. Linked data and tools to analyse, sort and contextualise will lead to a new revolution in news and information services. Several companies are already in this space, but we’re just at the beginning of this revolution. We live in exciting times.

links for 2010-08-03

  • Kevin: Josh Benton at Nieman Lab has said that the current crisis in journalism is actually an opportunity to rethink its grammar, and Megan Garber has an excellent post on a good first step: Rethink the news cycle.

    “Because we choose, essentially, topic over time as journalism’s core ordering principle, we don’t generally think about time as an order unto itself. Newness, and nowness, become our default settings, and our default objectives. The ‘tyranny of recency,’ Thompson calls it.”

    Or, put another way in this post, the tyranny of the news peg. This is a very thoughtful post pointing to possibilities for rethinking journalism.

Learning from a failed journalism project

I want to applaud Jen Lee Reeves, who wrote at PBS’ MediaShift blog about the mistakes she made on a journalism project for the 2008 elections in the US. It is a brave thing to do, and her courage flags up a number of mistakes that are common to journalism projects, including a few that I have made myself.

She is an associate professor at the Missouri School of Journalism and a new media director at the university-owned NBC affiliate, KOMU-TV. For the elections, she had an ambitious idea to bring together the coverage of several different outlets “to make it easier for news consumers to learn about their candidates leading up to election day”. She completed the project during a fellowship at the Reynolds Journalism Institute at the University of Missouri.

For the 2006 mid-term elections in the US, she had done something similar, but the site had been hand-coded. (I’m assuming what she means is that there was no content management system.) She realised that this would be too cumbersome, so in 2008 she opted for a “hand-built” site created by students with her oversight. Technically, she was moving in the right direction. The site took in RSS feeds from the participating news organisations, and web managers simply had to tag the content so that it appeared in relation to the right candidate and election (there is a rough sketch of this pattern after her quotes below). However, while the site was easier for the news organisations to use, it still wasn’t clear enough for the audience. She said:

Unfortunately, our site was not simple. It was not clean and it was hand built by students with my oversight. It did not have a welcoming user experience. It did not encourage participation. I had a vision, but I lacked the technical ability to create a user-friendly site. I figured the content would rule and people would come to it. Not a great assumption.

Back in 2008, I still had old-school thoughts in my head. I thought media could lead the masses by informing voters who were hungry for details about candidates. I thought a project’s content was more important than user experience. I thought I knew what I was talking about.
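The aggregation mechanics she describes are worth making concrete: pull in RSS feeds from the partner outlets and route each entry to the right candidate page. Below is a minimal sketch using the feedparser library; the feed URLs and candidate names are hypothetical, and where her web managers tagged items by hand, this sketch naively auto-tags by headline keyword.

```python
# Minimal sketch of aggregating partner RSS feeds and grouping entries by
# candidate. Hypothetical feeds and names; not the project's actual code.
import feedparser  # pip install feedparser

PARTNER_FEEDS = [
    "https://example-paper.com/politics/rss",
    "https://example-tv-station.com/elections/rss",
]

CANDIDATES = ["Smith", "Jones"]  # hypothetical names on the ballot

def tag_entries(feed_urls: list[str], candidates: list[str]) -> dict[str, list]:
    """Group feed entries under each candidate named in the headline."""
    tagged = {name: [] for name in candidates}
    for url in feed_urls:
        for entry in feedparser.parse(url).entries:
            for name in candidates:
                if name.lower() in entry.get("title", "").lower():
                    tagged[name].append((entry.title, entry.link))
    return tagged

for candidate, items in tag_entries(PARTNER_FEEDS, CANDIDATES).items():
    print(f"{candidate}: {len(items)} stories")
```

As her post makes clear, though, the plumbing was never the hard part; the user experience was.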

She goes on to list the assumptions that she had about the audience, assumptions which proved false and which she believes doomed the project to failure. Go to her post and read them. She is grateful that she had the opportunity to experiment and make mistakes during her fellowship, an opportunity that she says she wouldn’t have had while running a newsroom.

If we’re paralysed by fear of failure, we’ll never do anything new. It’s not failure that we should fear but rather the inability to learn from our mistakes. For big projects like this, it’s really important to have a proper debrief. Free services on the web can bring down the cost of experimentation, and by testing what works and what doesn’t, we can not only learn from our mistakes but also make sure that we take best practices to our next project.