Guardian election road trip review: Geo-tagging



With the inauguration of Barack Obama as president of the United States now well behind us, I thought I’d take a (long overdue) look back at the road trip that I took during the US elections for The Guardian and talk about some of the things we tried in terms of innovations in coverage and what I learned from it.
This is the third trip that I’ve taken for the US elections. In 2000, I took a trip with BBC Washington correspondent Tom Carver. Webcasts were the thing of the day, and we took a portable satellite dish and a DV video camera to webcast live or as-live (recorded but treated as live). We answered a range of questions covering topics suggested by our global audience. In 2004, I took another trip with BBC News Online colleague Richard Greene. The trip was my introduction to blogging, and it set the path for my career for the last five years.

The common thread through all of these trips has been an attempt to engage the audience in new ways and field test new digital journalism techniques. Over a series of posts I’ll talk about some of the things that we did for US Election trip 2008.

Geotagging

As I mentioned last summer, one of the things that I wanted to try was geo-tagging. I was inspired by the GPS and geo-tagging function in my Nokia N82 to add this to our coverage. The camera in the N82 is stellar. With a 5 megapixel sensor and a brilliant Xenon flash, it is one of the best features in the phone. (I’d be interested in seeing what the new N85 has to offer, apart from the OLED screen. ZDNet has a review.) I’m going to focus on geo-tagging in this post and talk more about mobile newsgathering with the N82 and other smartphones in another post.
As good as the camera is on the N82, I knew that there would be times when I needed Suw’s Nikon D70, a proper D-SLR with interchangeable lenses. But how to add the geo-data? Dan Chung, award-winning photographer and digital innovator at the Guardian, and I had played around with a geo-tagging device from Sony, the GPS-CS1.

A geo-tagger at its most basic has a GPS radio and some memory. It records your location either every so often or after you move a certain distance. It’s not physically connected to the D-SLR in any way, but it does require you to sync the clock from the geo-tagger with the clock in your D-SLR. To add the geo-data to your photos, all you have to do is import the photos to your computer and import the GPS logs from your geo-tagger. Software then compares the time that the photo was taken with your GPS logs and merges the geo-data into the EXIF files of the photos. Newer high-end cameras such as the D200 have GPS add-on units (the GP-1), and point-and-shoot cameras like the P6000 have integrated GPS.
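The matching step is simple enough to sketch in a few lines of Python. This is my own illustration of the idea, not the vendor's actual software; the track points and photo time are made up:

```python
from datetime import datetime

def nearest_fix(track, photo_time):
    """Return the (timestamp, lat, lon) track point closest in time
    to the moment the photo was taken."""
    return min(track, key=lambda fix: abs((fix[0] - photo_time).total_seconds()))

# A toy GPS log: one fix per minute while driving.
track = [
    (datetime(2008, 10, 20, 14, 0), 41.8781, -87.6298),
    (datetime(2008, 10, 20, 14, 1), 41.8790, -87.6310),
    (datetime(2008, 10, 20, 14, 2), 41.8805, -87.6322),
]

photo_taken = datetime(2008, 10, 20, 14, 1, 40)
fix = nearest_fix(track, photo_taken)
# fix[1] and fix[2] are the latitude and longitude to write into the
# photo's EXIF GPS fields, e.g. with exiftool or a library like piexif.
```

This is also why the clock-sync step matters: any offset between the camera's clock and the logger's shifts every photo to the wrong point on the track.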

Dan had me test the Sony geo-tagger a couple of years ago, and I wasn’t that impressed. It didn’t acquire the satellites very quickly, and Sony didn’t officially support non-Sony cameras. And although the accuracy wasn’t brilliant, the idea was sound.
I looked around and settled on the GiSTEQ CD110BT. It has a sensitive MTK chipset with 51-channel tracking, and I found the accuracy to be frighteningly good. The GPS track plotted on Google Earth actually shows when I changed lanes in my rental car. The Sony could take minutes to acquire the satellites, but from a cold start, the GiSTEQ usually got a lock in less than a minute. A synthesised voice says “Satellites fixed” when it’s got a lock. To conserve power, it will shut itself off but wake when moved or vibrated. I carried it around my neck on a lanyard or in the pocket of my camera bag when I was out and about. A supplied light adhesive patch kept it on my dashboard while driving. The unit also comes with both mains (AC) and car chargers.

That’s the good. The bad is that while GiSTEQ says the CD110BT will work on PCs and Macs, mine didn’t out of the box. It required a firmware update to work with a Mac, and the firmware updater only works on PCs and didn’t like Windows XP running on Parallels virtualisation software. Fortunately, my friend Andy Carvin at NPR gave me five minutes on his PC to update the firmware, but even after that, I had difficulty getting the device to download data consistently. GiSTEQ has since released a new update that they say fixes this, and I downloaded some GPS logs tonight without a hitch.

I’d like to try the Amod AGL3080 (review in Wired), which is touted as a driverless geo-tagger. It simply mounts as an external drive on Mac or PC, and all you need to do is copy the data from it. It uses a highly accurate SiRF III chipset. Unlike the GiSTEQ, which is charged via the USB cable, the Amod runs on three AAA batteries. Kevin Jaako has a thorough review of it on his blog.
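Because a driverless logger just exposes its raw log files, what you copy off it is typically plain NMEA 0183 sentences that you can parse yourself. A minimal sketch for the $GPRMC sentence (my own illustration; field layouts can vary slightly by firmware):

```python
def parse_gprmc(sentence):
    """Extract (lat, lon) in decimal degrees from a $GPRMC NMEA sentence,
    or return None if the sentence isn't a valid fix."""
    fields = sentence.split(',')
    if fields[0] != '$GPRMC' or fields[2] != 'A':  # 'A' means a valid fix
        return None

    def to_decimal(value, hemisphere):
        # NMEA packs coordinates as [d]ddmm.mmmm: degrees then minutes.
        degrees = int(float(value) / 100)
        minutes = float(value) - degrees * 100
        decimal = degrees + minutes / 60
        return -decimal if hemisphere in ('S', 'W') else decimal

    lat = to_decimal(fields[3], fields[4])
    lon = to_decimal(fields[5], fields[6])
    return lat, lon

# A commonly cited example sentence: 48°07.038'N, 011°31.000'E
print(parse_gprmc('$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A'))
```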

The software that comes with the GiSTEQ promises a lot and delivers most of it without too much fuss. It’s actually rebranded software from JetPhoto, and as the company says on its site, you don’t actually need a specialised geo-tagger. There are several Garmin or Magellan GPS units that will work with it. The software also works quite nicely with the N82, instantly recognising that the photos already have geo-data embedded in the files. If the geo-data is off, the software has a nice interface to relocate and update the geo-data. It also has a built-in Flickr uploader, although it could be a bit more intuitive and work more seamlessly with Flickr title and description fields.
But I didn’t just geo-tag my photos. I also geo-tagged my tweets using Twibble, a geo-aware Twitter app for Nokia S60 phones. Twibble integrates seamlessly with the GPS on the N82, and it also allows you to upload pictures you’ve taken with the phone directly to TwitPic. We recently used all of this to great effect for Guardian Travel’s first TwitTrip with Benji Lanyado. It is pretty heavy on the battery, but I had a power inverter in the car so everything was fully charged all the time. It was also a bonus to have Nokia Maps and Google Maps on the phone for navigation.
I also geo-tagged all of my blog posts. I either took the geo-data from a Tweet or a photo, or if I didn’t have any geo-data handy, I used sites like Geo-tag.de or Tinygeocoder.com to generate geo-data from an address.
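Under the hood, geo-tagging a blog post usually comes down to a pair of meta tags in the page head: the “geo.position” and ICBM conventions that geo-aware aggregators look for. Generating them from a coordinate pair is trivial; a sketch (the coordinates here are just an example):

```python
def geo_meta_tags(lat, lon, placename=None):
    """Build the conventional geo meta tags for a blog post's <head>."""
    tags = [
        f'<meta name="geo.position" content="{lat};{lon}">',
        f'<meta name="ICBM" content="{lat}, {lon}">',
    ]
    if placename:
        tags.append(f'<meta name="geo.placename" content="{placename}">')
    return '\n'.join(tags)

print(geo_meta_tags(39.7392, -104.9903, 'Denver, Colorado'))
```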
Visualising the trip
Thanks to a quick bit of Python scripting by Guardian colleague Simon Willison, I have a KML file for all of the 2,059 photos that I took over the more than 4,000 miles of the trip. One of the reasons that I wanted to geo-tag pictures, posts and tweets was that while I know most of these towns, I wanted to give a global audience a sense of place.
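A script along those lines needn’t be complicated. This sketch (my own illustration, not Simon’s actual code) turns a list of geo-tagged photos into a minimal KML file of Placemarks; note that KML wants coordinates in lon,lat order:

```python
def photos_to_kml(photos):
    """photos: list of (title, lat, lon) tuples -> KML document string."""
    placemarks = []
    for title, lat, lon in photos:
        placemarks.append(
            '  <Placemark>\n'
            f'    <name>{title}</name>\n'
            # KML coordinate order is longitude,latitude,altitude.
            f'    <Point><coordinates>{lon},{lat},0</coordinates></Point>\n'
            '  </Placemark>'
        )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        '<Document>\n' + '\n'.join(placemarks) + '\n</Document>\n</kml>'
    )

# Two made-up stops from a road trip:
photos = [
    ('Rally in Denver', 39.7392, -104.9903),
    ('Diner in Kansas', 38.5000, -98.0000),
]
print(photos_to_kml(photos))
```

The resulting file opens directly in Google Earth or can be overlaid on Google Maps.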

But apart from easily visualising the trip, why all the fuss to do this? Adding geo-data to content is one of those fundamental enabling technological steps, and it opens up a world of possibilities. Geo-tagging your content allows users to subscribe to it based on location. Geo-tag your movie and restaurant reviews, and you can start leveraging emerging location-based services on mobile phones. With Google Maps on mobile and other mapping services, news organisations could provide real-time, location-based information. Geo-data also allows users to navigate your content by location instead of more traditional navigation methods.
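The “subscribe by location” idea is, at bottom, a radius query over geo-tagged items. A sketch using the haversine formula, with made-up review data:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def items_near(items, lat, lon, radius_km):
    """Filter geo-tagged content items to those within radius_km of a point."""
    return [item for item in items
            if haversine_km(item['lat'], item['lon'], lat, lon) <= radius_km]

reviews = [
    {'title': 'Restaurant review: St Louis', 'lat': 38.627, 'lon': -90.199},
    {'title': 'Film review: Chicago', 'lat': 41.878, 'lon': -87.630},
]
# Everything within 50 km of downtown St Louis:
print(items_near(reviews, 38.63, -90.20, 50))
```

A real feed would run this filter over the geo-data in each item’s EXIF or post metadata and serve the matches as RSS for the reader’s chosen location.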
Some companies are already dipping their toes into geo-data. Associated Press stories hosted on Google News have a small inset Google Map based on the location information in the dateline. New York Times stories appear on Google Earth. But datelines are imprecise because they are city-based; when you pull up more accurate data, you can do much more. You can see the possibilities of mapped information on Everyblock.com.
But to get from most news sites to Everyblock, you’ve got to put in the foundational work both on the technical side and the journalistic workflow. Having said that, it’s not rocket science. It might seem a lot of work up front, but once the work is done, geo-data provides many opportunities, some of which could provide new revenue streams.

links for 2009-02-11

links for 2009-02-10

  • Kevin: Clayton Christensen and the disruptive-innovation crew from Harvard — who developed the NewspaperNext program with the American Press Institute — struggle to get us to understand how and why simple, low-end, inadequate, "junk" products and services so often topple the big guys.
  • Kevin: This is a very interesting piece that raises a lot of questions about columnists and abuse. I'd really like there to be a clearer differentiation between columnists and journalists. I think this piece slightly blurs the lines between the two. As far as I can tell, this piece is about columnists and a former Gawker blogger. But maybe I'm holding too closely to the US separation of columns and reporting. The comments are very much worth reading. The one thing I would say is that columnists are often shocked by the tone of the 'debate'. However, if you read the columns, they don't set the stage for a debate but rather seem written solely to provoke a reaction. Again, read the comments if you're running a comment site. They make some reasonable and very valid points.
  • Kevin: A good analysis by the folks at Bivings of the top 10 best US newspaper sites (from the top 100 newspapers in the States by circulation.) There are some good mentions in the comments from sites that don't fit those criteria. Check out the WikiJax feature at Jacksonville. It's an interesting innovation. I wonder if we wouldn't increase newspaper usage if we explained our features better. Of course, the best features and web sites explain themselves.
  • Kevin: Let the British media iPhone app rush begin. The one thing to note in this release is how ITN will enable offline video access.
  • Kevin: BitTorrent site Pirate Bay has just released a Google Maps mashup showing their worldwide user base. Janko Roettgers has some good analysis of the numbers. It's just a snapshot in time. It's also interesting to see where BitTorrent, or at least Pirate Bay, isn't used widely such as Africa and the Middle East.
  • Kevin: Some great sources of where to follow the Australian Bushfires via social media. I've been using a search based filter on Tweetdeck to follow the fires.
  • Kevin: I usually find mobile trend watchers in denial about the industry. They focus on handset manufacturers and ignore the speed bump/impregnable road block that the carriers are. But this trend list for '09 seems reasonable. I think we're finally seeing some movement in terms of Location Based Services. Apps are finally breaking the on-deck stranglehold carriers used to have in terms of mobile data. Definitely worth a look.

links for 2009-02-09

How to tell if your social media consultant is a lemon

Dave Fleet has a great blog post about how to pick a social media marketing consultant, after a blog post by Ike Pigott calling into question the knowledge of the new flock of “social media consultants” who seem to have crawled out of the woodwork over the last six months.

[W]e have a glut of people selling their expertise on how you should handle “the Twitter community” who have zero experience using the service the way most people do. They hopped on board the Consultancy Express, went straight to the head of the line, and now want to tell you how to talk to people at all of the stops they skipped.

Like Dave and Ike, I have reservations about the way that it seems to have suddenly become fashionable to be a “social media consultant”. As Dave says:

I don’t know about you, but I’m sick of seeing people sign up for Twitter, follow ten thousand people (many of whom follow back) to build a substantial following, then start spouting advice as though followers equals expertise. Some of them are experts, for sure. Others, however, seem to have little beyond a big mouth to back their words up.

Almost as annoying, but just as dangerous, are the hordes of traditional practitioners that have realized they need to include social media in their pitches nowadays, but have no experience whatsoever using those tools.

I have been wanting to write a post like this for months now, but had been holding off because I was a bit worried that I’d end up sounding as if I was criticising people simply for being new. We all have to start somewhere, after all, but social media is experiential, which means that if you haven’t experienced it, you really don’t know what you’re talking about.

That said, I lost a job to a guy who had giant red flashing text on his blog, and that was two or three years ago. (Funnily, not only did they tell me “he’s a blog expert recommended by one of our directors”, they also told me “we’ll get back to you if we ever need any help with social media.” D’oh.) So experience alone doesn’t guarantee that you’re going to get good advice, because there are some people around who have been successfully spouting crap for years.

Dave offers up these questions to help you winnow out the wheat from the chaff when looking for a social media marketing consultant:

1. Can you give me an example of social media work you’ve completed for a client recently?
2. How do you go about pitching bloggers?
3. How do you monitor what people are saying about you?
4. Where can I find you online?
5. Can you (ghost) write my blog for me?
6. How do you measure results?
7. How would you define social media?
8. Can you just pretend to be me online?

Now, some of these work just as well if you’re looking for an expert to help with internal communications and collaboration, but I’d like to offer up my own list.

So, what do you ask a social media business consultant?

How long have you been using social tools? A good consultant will have been using social tools for quite a while, probably a year or two longer than they’ve actually been a consultant. If someone has only been blogging for six months or a year, you might want to look much more closely at their experience, and make a decision as to whether you want to take a risk on them. They may be a natural, but they’re probably winging it.

Equally, do not believe anyone who says they’ve been blogging forever. Blogs themselves are only ten years old. When I started consulting five years ago, I had only a handful of peers, and they are all very well known now. Any unknown who says they’ve been consulting for more than six years is probably fibbing.

[Update: It’s been pointed out that this section was a bit fuzzy, so I’ve clarified what I mean by “doing it”! And yes, I know hand-coded blog-like websites have been around longer than ten years, but what makes blogging different from a website is the lightweight CMS that underpins it, and both LiveJournal and Blogger started in ’99.]

What was the first social tool you used? Most consultants who’ve been doing this for any length of time probably started off with a personal blog, because that was all that was around in those days. If they started off on Facebook, run away very quickly. If they started on Twitter, carefully examine their other experience.

What tools do you use on a regular basis? They should have at least one blog, a Twitter (or similar) account, and some sort of social network account. If they list every damn thing under the sun, it means that either they have no clients and therefore a lot of time to kill, or they are playing buzzword bingo with you. Realistically, it’s hard to go deep on more than three tools and a lot of the really important stuff is learnt only through focused engagement.

What sort of clients do you have? Expect a broad range of clients in many different sectors, and expect company sizes to range from tiny to multinational. Ask what type of engagements they were, and you should get similarly broad descriptions, from one hour presentations on upwards. Any consultant worth their salt has done a lot of work with very unsure clients who don’t want to spend too much money, because that’s just how the market has been (and still is).

Have you ever had a project that didn’t work out the way you anticipated? If the answer to this is not “Yes”, be suspicious. Good consultants have had to experiment because there isn’t a definitive guide to running social software projects. We know a lot more about what sort of things work now than we used to, but every new client has a new culture, and every new culture throws up new and sometimes surprising problems. Rarely do things go as planned, and you want someone who can think on their feet and adapt to changing circumstances.

What presentations have you given? This is a slightly nuanced question to ask, because not all knowledgeable people speak at conferences, but the more experienced someone is, the more likely they are to have done some speaking. Maybe it will be at conferences of their peers, or maybe it will be at small specialist meetings, or maybe it’s even a lunchtime talk for a business. I’m not really sure that barcamps count – they’re a great place for learning how to present, but they don’t necessarily indicate anything other than a desire to stand up in front of people and speak.

How do you measure success and recognise failure? The correct answer isn’t a stream of jargon about statistics and metrics, but instead should cover understanding the situation as it is before the new software is installed, having clear project goals, and critically examining what can be measured and what it might mean. There is no simple answer to this question, and if they suggest complicated metrics like “edits per page view per person”, then they’re not really thinking things through enough.

Of course, you should thoroughly Google any consultant before you contact them. You should easily be able to find:

  • A professional site or LinkedIn/Xing (etc.) profile
  • A blog, professional or personal
  • A Twitter or other micro-conversation account
  • Articles and blog posts that quote them
  • Their name on conference speaker rosters
  • Audio and/or video of talks they’ve given

Take the time to read through what other people say about them. Do they seem to be respected by their peers? Are they personable online? Can you build a sense of how much experience they have? What do they reveal about themselves as a person?

I wouldn’t worry about the age-old “Have they done work similar to the project I have in mind?” question, because to be honest, every project is a little bit different and what works perfectly for one company might not work in another, for cultural reasons.

Equally, don’t worry if they haven’t worked in your sector – social tools are cross-sector, and good consultants can work successfully in any industry. I hate to say it, but your industry is unlikely to be so different that it genuinely takes specialist knowledge to work in. After all, we’re talking mainly about human qualities, such as openness, trust, or transparency, and these exist everywhere. (Also, anyone who tries to flog you sector-specific tools is probably talking out of their arse.)

Red flags
There are some things that should make you immediately wary, however they are couched.

Promising the earth. Social media projects are neither fast nor easy, because they are centred not around technology but around behavioural change, and that takes time. Any consultant who promises a ‘quick win’ is promising something they can’t deliver.

‘Facebookitis’. Consultants whose only focus is Facebook are to be avoided. Facebook is great at what it does, which is help people organise their social lives and throw virtual sheep at each other. Internal business social networks are most useful only when they are designed to fulfil the needs of the user, which are likely to be different to those of the average Facebook user.

Too much focus on technology. Having the right tools is important, but it’s only 20% of the solution. The rest is about understanding and communicating with people about how these tools will make genuine improvements to their work life. If all the consultant talks about is tech, they’re not right for you.

Too much focus on launch. We are (or should be) long past the idea that all the hard work is done prior to a project launch, but this is especially true with social media projects. Getting things up and running is only the beginning – the hard work comes when you start focusing on adoption and long-term usage.

Hard questions to ask yourself
Before you start looking for help, there are some questions you should be asking yourself. If you can’t say “Yes” to these questions, perhaps you’re not ready to get a consultant of any sort in yet.

Are you in it for the long haul? As I’ve said, social media projects take time, and there’s no such thing as a quick win. If you’re not really interested in ongoing change, don’t run the project.

Are you capable of accepting hard truths? A good consultant won’t shy away from hard truths. They may have to tell you that your wonderful idea won’t work. Are you ready to hear that?

Are you willing to spend money on your people? I’ll say it again. Tech is only 20% of the problem – the rest is people. If you’re not willing to spend significant time and money working on understanding your people’s individual needs and helping them learn how these tools will help, don’t go ahead with the project. You can’t just throw mud against the wall and see what sticks – we know that doesn’t work, so don’t pretend it will.

Are you willing to eat your own dogfood? You want to get other people to use these tools, but do you?

It’s turned into a bit of a long post, and I hope that it’s been useful. Personally, I relish the idea that maybe one day I’ll turn up to a first meeting with a client, and they’ll have printed this post out and will proceed to ask me the questions I’m proposing you ask your consultant. Am I willing to eat my own dogfood? Oh yes!!

links for 2009-02-06

DEN: Eric Ulken: Beyond the story-centric model of the universe

After appearing virtually at a few Digital Editors Network events at the University of Central Lancashire in Preston, I finally made the trip to appear in person. I really enjoyed Alison Gow talking about live blogging the credit crunch for several Trinity Mirror sites using CoverItLive.

Eric Ulken, formerly LATimes.com’s editor of interactive technology, spoke about an issue dear to my heart: moving beyond the story as the centre of the journalism universe. One of the reasons that I chose to be a digital journalist is that I think it brings together the strengths of print, audio and video while also adding some new story-telling methods such as data and visualisations. Eric talked about the projects he worked on at the Times to explore new ways of telling stories.

Eric started off by talking about the history of news articles.

The story article so far

  • born 17th Century
  • served us well for about 400 years
  • lots of words (800-1000 words on average)
  • unstructured, grey and often boring.

“What else is there in the toolbox?” he asked.

Some examples (Eric suffered the dreaded no-internet, links-in-the-presentation problem, so I’m a little link-light on this; you can see examples that Eric has worked on in his portfolio):

  • text tricks – lists, tables, timelines. (Eric mentioned Dipity as one way to easily create a timeline, but said it was “not quite there”. He also mentioned MIT’s Simile project, which has ‘graduated’ and is now hosted on Google Code. Licensed under a BSD licence, it’s easily something more news organisations could use.) Other text formats include the Q&A and what he called the “Q&no”, eg the New York Times tech blog, which put up questions for Steve Jobs before MacWorld. His Steve-ness never answers them, but it lays out the agenda.
  • blogs are the new articles
  • photo galleries as lists, timelines
  • stand-alone UGC
  • video: short-form, packages
  • mapping, charts, data visualisation
  • database applications visualisation.

I think this is really important for journalists to understand now. They have to be thinking about telling stories in other formats than just the story. Journalist-programmer ninja Adrian Holovaty has outlined a number of ways that stories can be re-imagined and enhanced with structured data. News has to move on from the point where the smallest divisible element of news is the article. News organisations are adding semantic information such as tags, as we have at the Guardian.

But beyond that, we have to think of other ways to present information and tell stories. As more journalists shift from being focused solely on the print platform to multi-platform journalism, one of the most pressing needs is to raise awareness of these alternate story-telling elements. Journalists outside of the development departments and computer-assisted reporting units need to gather the data around a story. It needs to become an integral part of newsgathering. If a department inside your organisation is responsible for gathering this data, its data library needs to be accessible and easily searchable by journalists. If it sounds daunting, especially for small shops, then use Google Docs as an interim solution. This is also an area ripe with opportunities for cooperation between universities and news organisations.

Eric gave one example of this non-story-centric model for news. “We did a three-way mashup”, he said. They brought together the computer-assisted reporting team, the graphics team and Eric’s team.

They worked with a reporter on the City desk who wanted to chronicle every homicide in LA County; in 2007, there were 800 murders. She did the reporting in a blog format. It might not have been the best format, but it was easy to set up, and she started building up a repository of information. “I was begging people to get the tech resources to build a database,” Eric said. “We built a database on top of the blog. We took data from the County Coroner. We took gender, race and age and put it in a database which was cross-linked to the blog. We added a map. You could filter based on age or race on the map.” The result was two things: it was a way to look at the data in aggregate, and a way to drill down through the interface to the individual record. They took public data, original reporting and contributions from users.

“One of the things that is challenging is getting the IT side to understand what it is actually that you do,” he said. There are probably more tech people who are interested in journalism than there are journalists who are able and willing to learn the intricacies of programming.

When the floor was opened to questions, I wasn’t surprised that this one came up.

Question: Could the LATimes get rid of the print and remain profitable?
Answer: No. Revenue from online roughly covers the cost of newsroom salaries, but not the benefits, and not the ad staff. I don’t think he was saying that the LATimes had figured it out; he had been saying that for some time before he said it publicly, and it was for morale. He was saying that it is not inconceivable for the website to pay its way in the future.
“There is a point where this cycle ends of cutting staff and cutting newshole,” he said.

UPDATE: And you can see the presentation on SlideShare:

links for 2009-02-05