Focus on editorial ideas, then find the right tool

My esteemed colleague and comrade in digital arms, Jemima Kiss, tweeted this very astute observation about Twitter and how news organisations use the micro-blogging service:

jemimakiss: Common mistakes news orgs make with Twitter 1) That it’s all about Twitter, rather than how people are actually using Twitter and..

jemimakiss 2) They get fixed on using a tool, like Twitter, rather than working out what they want to do & finding the best tool for it. That is all.

She’s spot on when it comes to Twitter. There is a tendency for organisations to rush with the herd to a new social media service or site without thinking about what, editorially, they are trying to achieve. I’ve seen the same thing happen with blogs and Facebook. After blogs entered the mainstream, some journalists demanded their own. Why did they want a blog? They saw it as a back door to having a column. They had always wanted an opinion column because it was a sign of status, and as we all know, blogs are just opinion (sarcasm noted). A typical conversation in the industry might go like this:

Editor: How often are you planning on updating your blog?

Aspiring columnist: Oh, once a week should do.

Editor: Were you planning on linking to anything?

Aspiring columnist: Why would I do that? This is my column, er, I mean blog.

Editor: Are you going to take part in the conversation and respond to comments?

Aspiring columnist: No, of course not. I’m far too busy for that kind of thing.

Editor: So why do you want a blog instead of a column in the newspaper?

Aspiring columnist: *silence*

That’s not to say that the journalist wouldn’t get their own column, er, I mean blog, thus continuing traditional media’s focus on celebrity over interactivity. Some journalists make incredibly good bloggers, but when a blog is used simply to replicate what is possible in print, it is an editorial waste.

Functionally, there might not be a great difference between a column-with-comments and a blog, but editorially, there is a huge difference.

  • Bloggers post frequently.
  • Bloggers take part in the conversation and respond to comments and questions.
  • Bloggers link to the conversation on other sites.

Blogs take part in a distributed conversation in ways that columns rarely do, whereas columns – even ones with comments – provide a relatively closed, introspective conversation.

Jemima has flagged up how much the same is happening with Twitter. This all comes down to understanding how social media differs from traditional uni-directional publishing and broadcasting and thinking about the editorial concept and the unique opportunities for engagement.

NUJ training chair at centre of blog storm

Over the weekend, I was tempted to write about the blog dust-up between Chris Wheal, chair of the National Union of Journalists training committee, and Adam Tinworth, the head of blog development at Reed Business International, on Adam’s personal blog, but I decided to let Suw fight her corner in the comments. However, I have written up a post looking at the debate, with interviews with Chris and Adam, over at The Guardian’s media blog Organ Grinder. Adam’s post kicked off a great debate about a range of issues, and I agree with him when he says that this kind of debate needs to happen out in the open.

I also agree with Adam that this isn’t a print versus online debate. It’s not a bloggers versus journalists debate (thankfully). This is a new intramural debate amongst digital journalists. We’re now at the point where there are journalists who have been working online for a decade or more. This is a debate between digital journalists who have embraced social media (I’d count myself in that camp) and those who see it as a threat to traditional journalism values.

Leveraging a print poster on the web

FlowingData highlighted this data project from WallStats showing how US tax money was spent. The US government, being the sprawling beast that it is, has an incredibly complex budget, and this visualisation not only makes it accessible but pulls the reader into exploring it.

It has to be good. It even had the American queen of home decorating and entertaining, Martha Stewart, talking about it. What I also love is that by using Zoomorama, they have leveraged a printed poster online, simply but quite effectively.

NUJ and Adam Tinworth’s ‘effing’ blog

This one is just too good to relegate to my daily Delicious links. Adam Tinworth not only calls out someone at the National Union of Journalists for a passing reference to his ‘effing blog’, he also shows the power of a digital journalist. He looked through his referrals, a log of links to his blog, and then did a reverse DNS lookup to find out where the referral came from. As Adam says:

Ah, yes. The NUJ’s e-mail system. Well, thanks folks. Nice to know that my union, which I have been a member of for the last 15 years thinks that the journalistic field in which I work – blogging – is “effing blogs”.
I wonder who LindaK is, and if she enjoyed the post?

Way to go, Adam, for showing them what digital journalism looks like.

UPDATE: Apologies for not linking to Adam’s blog when I first posted this. Thanks Adam for calling me out.

BeebCamp: Eric Ulken: Building the data desk at the LATimes

A fun example of structured data from the LATimes, which showed the popularity of dog names in LA County by postcode.


This is from one of the sessions at BeebCamp2, a BarCamp-like event for BBC staff with some external folks like Suw, me, Charlie Beckett and others. Charlie has a great post on a discussion he led about user-generated content and what it adds to news, video games and also Twitter and Radio 4.

Eric Ulken was the editor of interactive technology at the LATimes, and one of the bridges between the technology and editorial sides of the newsroom.

His key points about news organisations:

  • We collect a lot of data but don’t use it. (We always thought that was a shame. We had a computer-assisted reporting team at the LATimes; wouldn’t it be nice if we used that data?)
  • What online readers want from us is bigger than ‘news’ in the traditional sense.
  • We need to be an information source.

They did a homicide map, which plotted all of the murders in LA County in a year and accompanied a blog that reported on each of those murders.

The project was well received, and they decided to develop a data desk, bringing together the computer-assisted reporting unit, investigative reporters, the interactive technology team and the graphics team. They all sat together in the newsroom, and a lot of synergies were created. The Times had 10 to 15 investigative reporters on different desks from different disciplines.

Ten bits of advice:

  1. Find the believers.
  2. Get buy-in from above
  3. Set some priorities
  4. Go off the reservation (We had a real problem with our IT department. They had their priorities and we had ours. We invested in a server system using Django.)
  5. Templatize. Never do anything once. Do things you can reuse.
  6. Do breaking news. There is data in breaking news. After the train crash, they did a database of the victims, adding information as it became available. The database was up within 24 hours of the crash, because they had built most of the pieces for previous applications. (There was a question about accuracy. Eric said the information was being gathered but wasn’t structured; it was edited by a line manager.)
  7. Develop new skills. They sent people out to workshops. They hired a Django developer who was also a journalist, and he taught Django to others in the office.
  8. Cohabitate (marriage is optional). The investigative reporters and computer-assisted reporters still reported to the pre-existing managers, but by being together, they saw possibilities for collaboration without reworking the organisation.
  9. Integrate.
  10. Give back. They worked to give back to the newspaper.

They used Javascript to embed these projects in other parts of the site. They created the train crash and homicide datasets themselves, but they have also used publicly available data in their projects. He showed their California schools guide. Apart from the standard data analysis available from state and national educational agencies, they also created a diversity rank that showed the relative diversity of the schools. They did do some reporting on the data: in analysing the schools data, they found discrepancies in reporting about the performance of the schools.

In a slightly more humorous example, he showed dog names and breeds by postcode.

UPDATE: Eric has added some more details in comments below, and you can follow Eric’s work and follow his thoughts on his site.

BarCamp NewsInnovation UK

This idea has been rolling around in many heads for a long time. Chris Vallance (where is that new blog, mister?), Philip Trippenbach, Suw and I have been talking about this for months. My autumn was occupied with the US elections and recovering from them, but Suw marshalled on. Our basic idea was to get past talking about the future of journalism and just do it; we felt it was (long past) time to move things along. We also wanted to spread the future a little more evenly by bringing other journalists in on the process: not turning every journalist into a programmer, but helping them understand the art of the possible in digital journalism. And this is about the future of journalism, whether you’re a journalist, a programmer or anyone with ideas and an interest.

We had a lot of enthusiasm, but we never quite got around to doing anything about it. Now it looks like some of our number back in the US have gone out and done it. Introducing BarCamp NewsInnovation. The goal:

The idea is to get energetic, tech-savvy, open-minded individuals who embrace the chaos in the media industry because the ability to do really cool things still exist. We also need find those people outside of our industry who love to consume news and information and are great thinkers and innovators.

Ok, let’s try this again. As I’ve shown up to this point, I’m terrible at organising anything. Let’s do this. BarCamp NewsInnovation UK. Let’s think outside the box (London). Let’s just get on with it.


US law and comments on websites

David Ardia, on legal liability for comments online from Nieman Journalism Lab on Vimeo.

David Ardia, director of the Citizen Media Law Project at Harvard, talks about CDA 230, the section of the Communications Decency Act that provides some protection to people who run web sites.

Joshua Benton from the Nieman Journalism Lab at Harvard University says:

I wish every managing editor in the country could see this 20-minute video. I’ve heard so many misconceptions over the years about news organizations’ legal ability to police, manage, or otherwise edit the comments left on their web sites. They say “the lawyers” tell them they can’t edit out an obscenity or remove a rude or abusive post without bringing massive legal liability upon themselves — and that the only solutions are to either have a Wild West, anything-goes comments policy or to not have comments in the first place.

That’s not true, and hasn’t been true since 1996.

Android, e-ink and live news displays

Android Meets E Ink from MOTO Development Group on Vimeo.

MOTO Development Group is showing off a proof of concept with Google’s Android running on an e-ink display. With Amazon’s Kindle showing some signs of success, it looks like e-readers might finally be reaching a tipping point in terms of adoption. What I find interesting in both the Kindle and this proof of concept is the delivery of content wirelessly. We’re starting to see experimentation in the form factor of these devices. We’re not just talking about laptops, netbooks and mobile phones.

With the cost of printing the New York Times reportedly about twice the cost of sending every subscriber a free Kindle, there might come a point where wireless delivery to an electronic reading device makes economic sense. This is very speculative and very much out in front of the market and most consumers, but as Nicholas Carlson points out:

What we’re trying to say is that as a technology for delivering the news, newsprint isn’t just expensive and inefficient; it’s laughably so.

Print is always cast in terms of habit. The argument is that people prefer the tactile experience of the printed page and the easily browsable format, but with the economics of print news delivery becoming financially untenable, it’s worth seeing what options are available and what options are developing.

Guardian election road trip review: Geo-tagging



With the inauguration of Barack Obama as president of the United States now well behind us, I thought I’d take (a long overdue) look back at the road trip that I took during the US elections for The Guardian and talk about some of the things we tried in terms of innovations in coverage and what I learned from it.

This is the third trip that I’ve taken for the US elections. In 2000, I travelled with BBC Washington correspondent Tom Carver. Webcasts were the thing of the day, and we took a portable satellite dish and a DV video camera to webcast live or as-live (recorded but treated as live), answering questions on topics suggested by our global audience. In 2004, I took another trip with BBC News Online colleague Richard Greene. That trip was my introduction to blogging, and it set the path for my career for the last five years.

The common thread through all of these trips has been an attempt to engage the audience in new ways and field test new digital journalism techniques. Over a series of posts I’ll talk about some of the things that we did for US Election trip 2008.

Geotagging

As I mentioned last summer, one of the things that I wanted to try was geo-tagging. I was inspired by the GPS and geo-tagging function in my Nokia N82 to add this to our coverage. The camera in the N82 is stellar. With a 5 megapixel sensor and a brilliant Xenon flash, it is one of the best features in the phone. (I’d be interested in seeing what the new N85 has to offer, apart from the OLED screen. ZDNet has a review.) I’m going to focus on geo-tagging in this post and talk more about mobile newsgathering with the N82 and other smartphones in another post.

As good as the camera is on the N82, I knew that there would be times when I needed Suw’s Nikon D70, a proper D-SLR with interchangeable lenses. But how to add the geo-data? Dan Chung, award-winning photographer and digital innovator at the Guardian, and I had played around with a geo-tagging device from Sony, the GPS-CS1.

A geo-tagger at its most basic has a GPS radio and some memory. It records your location either every so often or after you move a certain distance. It isn’t physically connected to the D-SLR in any way, but it does require you to sync the clock in your D-SLR with the clock in the geo-tagger. To add the geo-data to your photos, you import the photos to your computer along with the GPS logs from the geo-tagger. Software then compares the time each photo was taken with the GPS logs and merges the geo-data into the photos’ EXIF data. Newer high-end cameras such as the D200 have GPS add-on units (the GP-1), and point-and-shoot cameras like the P6000 have integrated GPS.
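The time-matching step that such software performs is simple enough to sketch in a few lines of Python. This is only an illustration of the idea, not the vendor software; the function name, tolerance window and coordinates are my own inventions:

```python
from bisect import bisect_left
from datetime import datetime, timedelta

def nearest_fix(gps_log, photo_time, tolerance=timedelta(minutes=5)):
    """Return (lat, lon) of the GPS fix closest in time to the photo,
    or None if no fix falls within the tolerance window.
    gps_log is a list of (datetime, lat, lon) tuples sorted by time."""
    times = [t for t, _, _ in gps_log]
    i = bisect_left(times, photo_time)
    # The nearest fix is either just before or just after the photo time.
    candidates = gps_log[max(i - 1, 0):i + 1]
    best = min(candidates, key=lambda fix: abs(fix[0] - photo_time))
    if abs(best[0] - photo_time) > tolerance:
        return None
    return best[1], best[2]

# Made-up log: two fixes five minutes apart.
log = [
    (datetime(2008, 10, 1, 12, 0), 36.17, -115.14),
    (datetime(2008, 10, 1, 12, 5), 36.20, -115.12),
]
print(nearest_fix(log, datetime(2008, 10, 1, 12, 1)))  # → (36.17, -115.14)
```

This is also why syncing the camera clock matters: a clock that is a few minutes off shifts every photo onto the wrong stretch of the GPS track.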

Dan had me test the Sony geo-tagger a couple of years ago, and I wasn’t that impressed. It didn’t acquire the satellites very quickly, and Sony didn’t officially support non-Sony cameras. Although the accuracy wasn’t brilliant, the idea is sound.

I looked around and settled on the GiSTEQ CD110BT. It has a sensitive MTK chipset with 51-channel tracking, and I found the accuracy to be frighteningly good. The GPS track plotted on Google Earth actually shows when I changed lanes in my rental car. The Sony could take minutes to acquire the satellites, but from a cold start, the GiSTEQ usually got a lock in less than a minute. A synthesised voice says “Satellites fixed” when it’s got a lock. To conserve power, it will shut itself off but wake when moved or vibrated. I carried it around my neck on a lanyard or in the pocket of my camera bag when I was out and about. A supplied light adhesive patch kept it on my dashboard while driving. The unit also comes with both mains (AC) and car chargers.

That’s the good. The bad is that while GiSTEQ says CD110BT will work on PCs and Macs, mine didn’t out of the box. It required a firmware update to work with a Mac, and the firmware updater only works on PCs and didn’t like Windows XP running on Parallels virtualisation software. Fortunately, my friend Andy Carvin at NPR gave me five minutes on his PC to update the firmware, but even after that, I had difficulty getting the device to consistently download data. GiSTEQ has since released a new update that they say fixes this. I downloaded some GPS logs tonight without a hitch.

I’d like to try the Amod AGL3080 (review in Wired), which is touted as a driverless geo-tagger. It simply mounts as an external drive on Mac or PC, and all you need to do is copy the data from it. It uses a highly accurate SiRF III chipset. Unlike the GiSTEQ which is charged via the USB cable, the Amod runs on three AAA batteries. Kevin Jaako has a thorough review of it on his blog.

The software that comes with the GiSTEQ promises a lot and delivers most of it without too much fuss. It’s actually rebranded software from JetPhoto, and as the company says on its site, you don’t actually need a specialised geo-tagger. There are several Garmin or Magellan GPS units that will work with it. The software also works quite nicely with the N82, instantly recognising that the photos already have geo-data embedded in the files. If the geo-data is off, the software has a nice interface to relocate and update the geo-data. It also has a built-in Flickr uploader, although it could be a bit more intuitive and work more seamlessly with Flickr title and description fields.

But I didn’t just geo-tag my photos. I also geo-tagged my tweets using Twibble, a geo-aware Twitter app for Nokia S60 phones. Twibble integrates seamlessly with the GPS on the N82, and it also allows you to upload pictures you’ve taken with the phone directly to TwitPic. We used all of this to great effect for Guardian Travel’s first TwitTrip with Benji Lanyado. It is pretty heavy on the battery, but I had a power inverter in the car so everything was fully charged all the time. It was also a bonus to have Nokia and Google Maps on the phone for navigation.

I also geo-tagged all of my blog posts. I either took the geo-data from a tweet or a photo, or, if I didn’t have any geo-data handy, I used sites like Geo-tag.de or Tinygeocoder.com to generate geo-data from an address.

Visualising the trip

Thanks to a quick bit of Python scripting by Guardian colleague Simon Willison, I have a KML file for all of the 2,059 photos that I took over the more than 4,000 miles of the trip. One of the reasons I wanted to geo-tag pictures, posts and tweets was that, while I know most of these towns, I wanted to give a global audience a sense of place.
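I haven’t seen Simon’s script, but generating basic KML from a list of geo-tagged photos takes only a few lines. A hypothetical sketch (the function name, photo names and coordinates are invented for illustration):

```python
def photos_to_kml(photos):
    """Build a minimal KML document from (name, lat, lon) tuples.
    Note that KML puts longitude before latitude in <coordinates>."""
    placemarks = "\n".join(
        "  <Placemark><name>{}</name>"
        "<Point><coordinates>{},{}</coordinates></Point></Placemark>".format(
            name, lon, lat)
        for name, lat, lon in photos)
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
            + placemarks + "\n</Document>\n</kml>")

# Invented example photo; a real run would feed in all the trip's EXIF data.
kml = photos_to_kml([("Des Moines diner", 41.59, -93.62)])
print(kml)
```

Open the resulting file in Google Earth or Google Maps and each photo appears as a placemark along the route.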

But apart from easily visualising the trip, why all the fuss? Adding geo-data to content is one of those fundamental enabling technological steps; it opens up a world of possibilities. Geo-tagging your content allows users to subscribe to it based on location. Geo-tag your movie and restaurant reviews, and you can start leveraging emerging location-based services on mobile phones. With Google Maps on mobile and other mapping services, news organisations could provide real-time, location-based information. Geo-data lets users navigate your content by location rather than by more traditional navigation methods.
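Subscribing to content by location is, at bottom, a radius query over geo-tagged items. A minimal sketch of the idea, using the haversine great-circle formula (the story data and function names are invented for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def stories_near(stories, lat, lon, radius_km):
    """Filter (headline, lat, lon) items to those within radius_km."""
    return [s for s in stories
            if haversine_km(lat, lon, s[1], s[2]) <= radius_km]

# Invented geo-tagged stories: one in central London, one in Liverpool.
stories = [
    ("Council vote", 51.5074, -0.1278),
    ("Harbour repairs", 53.4084, -2.9916),
]
print(stories_near(stories, 51.5, -0.12, 25))  # only the London story
```

A real site would run this sort of query in the database rather than in application code, but the principle is the same: once the geo-data is in the content, “news within 25km of me” becomes a one-line filter.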

Some companies are already dipping their toes into geo-data. Associated Press stories hosted on Google News have a small inset Google Map based on the location information in the dateline. New York Times stories appear on Google Earth. But datelines are imprecise because they are city-based; with more accurate geo-data you can do much more. You can see the possibilities of mapped information on Everyblock.com.

But to get from most news sites to Everyblock, you’ve got to put in the foundational work, both on the technical side and in the journalistic workflow. Having said that, it’s not rocket science. It might seem like a lot of work up front, but once that work is done, geo-data provides many opportunities, some of which could provide new revenue streams.