Participatory media: Encouraging people to ‘level up’

Derek Powazek has an interesting analysis of Wait Wait... Don’t Tell Me!, the quirky quiz show on US public radio (NPR), and looks at the lessons the show provides for developing participatory media projects. What I like about this post is that he’s looking at a relatively traditional media format, the radio quiz show, through a different lens: not radio, but social media and gaming. I also like how, right from the start, he redefines the term “crowdsourcing”:

For my purposes, it means collaborating with the people who used to be the silent audience to make something better than you could make alone.

I’m going to focus on two of his points and let you read the rest of the post to get the full monty. I couldn’t agree more with his second point about structuring input: you have to give crowds a goal, something to aim for.

Too many crowdsourced projects create a blank canvas with the rather utopian view that the crowd will create a masterpiece. It just doesn’t work like that. You’ll most likely get obscene graffiti rather than a Van Gogh, because not many people engage with something when it isn’t clear what they are engaging with. A vacuum encourages vandals. They assume that no one is looking after your particular corner of the internet and, if you’re lucky, will merely try to sell Viagra. I still hear Field of Dreams strategies at conferences, a “build it and they will come” ethos that was discredited by anyone with credibility a decade ago. (If someone espouses such a strategy and dresses it up with a lot of buzzwords to impress, run away. They really are just snake oil salesmen.)

I think Derek makes another good point when he says “Encourage the audience to level up”. Again, this is taking a concept from gaming and applying it to participatory media. Most people still passively consume media (although many more are now sharing and recommending it). This pattern is often referred to as the user-generated content pyramid or the 1-9-90 rule (although it might be changing). A participatory media project or service should give “new users a clear path, limited tools, and an awareness of what those on the next level can do”, Derek says.

Too often, media outlets create crowdsourced projects that are akin to bad dates: they are all about us. The focus is on what we, the media, get out of it, not on what the people participating get out of it. As I’ve been saying for several years now, if user-generated content plays actually provide value to users, not just to media outlets, then more people will participate. Creating levels for users, with clear benefits as they contribute more, is one solid strategy for achieving greater participation and better results.

Al Jazeera Unplugged: Juliana Rotich of Ushahidi

This is a live blog. It may contain grammatical errors, but I have tried to be as true to the essence of the comments as possible.

Juliana Rotich spoke about Ushahidi, the crowdsourced crisis reporting platform. I’ve written about Ushahidi before, and I wrote about Swift River last year. During a rapidly developing event, how do you manage that torrent of information? Juliana said you have to create an ‘information slider’ to help evaluate information. How do you separate signal from noise, wheat from chaff? They wanted to know how to deal with a “hot flash” event:

It was that crisis that started two members of the Ushahidi dev community (Chris Blow and Kaushal Jhalla) thinking about what needs to be done when you have massive amounts of information flying around. We’re at that point where the barriers for any ordinary person sharing valuable tactical and strategic information openly is at hand. How do you ferret the good data from the bad?

What if we listened to the crowd? Not just to what is popular; what’s popular might not be pertinent.

What if we listened to victims?

What about creating a crisis dashboard? They showed us how to use Tweetdeck to curate information. Information can be filtered by the crowd or by algorithms. Swift River is an “aggregator with entity extraction”. By pulling together relevant feeds, they can parse content, creating a rich database of people, places and organisations in real time. They can create a taxonomy to deal with the data. Swift can help determine the authority of sources with algorithms. The location data can help them figure out what is happening where.
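To make the “aggregator with entity extraction” idea concrete, here’s a minimal sketch: pull items from feeds, extract named entities, and index which sources mention each one. The item structure and the naive capitalised-word extractor are my own assumptions for illustration, not Swift River’s actual implementation.

```python
import re
from collections import defaultdict

def extract_entities(text):
    """Very naive entity extraction: runs of capitalised words."""
    return re.findall(r"\b(?:[A-Z][a-z]+(?:\s[A-Z][a-z]+)*)\b", text)

def build_index(feed_items):
    """Map each extracted entity to the sources that mention it."""
    index = defaultdict(list)
    for item in feed_items:
        for entity in extract_entities(item["text"]):
            index[entity].append(item["source"])
    return index

# Hypothetical feed items pulled from an RSS bundle
items = [
    {"source": "twitter:#mumbai", "text": "Explosions reported near Taj Mahal Palace"},
    {"source": "blog:eyewitness", "text": "Taj Mahal Palace area cordoned off by police"},
]

index = build_index(items)
print(index["Taj Mahal Palace"])  # both sources mention this entity
```

A real system would use a proper named-entity recogniser and geocode the locations, but even this toy version shows how a database of people, places and organisations can fall out of a stream of feeds.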

They are trying to save time, identify and rate trusted sources, surface relevant content (suppress noise) and curate it all.

Jon Gosier of Appfrica, whom I met last year, has been helping to move the Swift River project forward. I’ve been meaning to write about this for a while; Swift recently released a web service. This is definitely a project to watch.


Ushahidi and Swift River: Crowdsourcing innovations from Africa

For all the promise of user-generated content and contributions, one of the biggest challenges for journalism organisations is that such projects can quickly become victims of their own success. As contributions increase, there comes a point when you simply can’t evaluate or verify them all.

One of the most interesting projects in 2008 in terms of crowdsourcing was Ushahidi. Meaning “testimony” in Swahili, the platform was first developed to help citizen journalists in Kenya gather reports of violence in the wake of the contested election of late 2007. Out of that first project, it’s now been used to crowdsource information, often during elections or crises, around the world.

[Video: What is Ushahidi? from Ushahidi on Vimeo]

Faced with the challenge of gathering information during a chaotic event like the attacks in Mumbai in November 2008, members of the Ushahidi developer community discussed how to handle what they called a “hot flash event”:

It was that crisis that started two members of the Ushahidi dev community (Chris Blow and Kaushal Jhalla) thinking about what needs to be done when you have massive amounts of information flying around. We’re at that point where the barriers for any ordinary person sharing valuable tactical and strategic information openly is at hand. How do you ferret the good data from the bad?

They focused on the first three hours of a crisis. Any working journalist knows that during fast-moving news events false information is often reported as fact before being challenged. How do you increase the number of sources while maintaining accuracy, and also sift through all of that information to find what is most relevant and important?

Enter Swift River. The project is an “attempt to use both machine algorithms and crowdsourcing to verify incoming streams of information”. Scanning the project description, the Swift River application appears to allow people to create a bundle of RSS feeds, whether those feeds are users or hashtags on Twitter, blogs or mainstream media sources. Whoever creates the RSS bundle is the administrator and can add or delete sources. Users, referred to as sweepers, can then tag information or choose the bits of information in those feeds that they ‘believe’. (I might quibble with the language: belief isn’t verification.) The links are analysed, and the “veracity of links is computed”.
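As a rough sketch of how “computing veracity” might blend the two signals the project describes, the sweepers’ belief tags and an algorithmic source-authority score, consider something like this. The weights, field names and the linear blend are entirely my assumptions, not Swift River’s published method.

```python
def veracity_score(item, sweeper_weight=0.6, algo_weight=0.4):
    """Blend the share of sweepers who tagged an item as believable
    with an algorithmic source-authority score (both in [0, 1])."""
    if item["votes_total"] == 0:
        crowd = 0.0  # no sweeper input yet, rely on the algorithm alone
    else:
        crowd = item["votes_believe"] / item["votes_total"]
    return sweeper_weight * crowd + algo_weight * item["source_authority"]

# A hypothetical report: 8 of 10 sweepers believe it, middling source authority
report = {"votes_believe": 8, "votes_total": 10, "source_authority": 0.5}
print(round(veracity_score(report), 2))  # 0.6*0.8 + 0.4*0.5 = 0.68
```

The interesting design question is where the weights come from: tilt too far toward the crowd and popularity masquerades as truth; tilt too far toward the algorithm and eyewitnesses with no track record get drowned out.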

It’s a fascinating idea and a project that I will be watching. While Ushahidi is designed to crowdsource information and reports from people, Swift River is designed to ‘crowdsource the filter’ for reports across the several networks on the internet. For those of you interested, the project code is made available under the open-source MIT Licence.

One of the things that I really like about this project is that it’s drawing on talent and ideas from around the world, including some dynamic people I’ve had the good fortune to meet. Last year, when I was back in the US for the elections, I met Dave Troy of Twittervision fame, who helped develop Twitter Vote Report, an application to crowdsource reports of voting problems during the elections. The project gained a lot of support, including from MTV’s Rock the Vote and National Public Radio. He has released the code for the Twitter Vote Report application on GitHub.

To help organise the Swift River project for Ushahidi, they have enlisted African tech investor Jon Gosier of Appfrica Labs in Uganda, which is loosely modelled on Paul Graham’s Y Combinator. I interviewed Jon at TEDGlobal in Oxford this summer about a mobile phone search service in Uganda. He’s a Senior TED Fellow.

There are a lot of very interesting elements in this project. First off, they have highlighted a major issue with crowdsourced reporting: Current filters and methods of verification struggle as the amount of information increases. The issue is especially problematic in the chaotic hours after an event like the attacks in Mumbai.

I’m curious to see if there is a reputation system built into it. As they say, this works based on the participation of experts and non-experts. How do you gauge the expertise of a sweeper? And I don’t mean to imply as a journalist that I think that journalists are ‘experts’ by default. For instance, I know a lot about US politics but consider myself a novice when it comes to British politics.
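One way a reputation system could gauge sweepers, and this is purely hypothetical on my part, is to weight each sweeper by their track record: endorsements that are later verified raise their standing, endorsements that are debunked lower it.

```python
def update_reputation(reputation, verified, step=0.1):
    """Nudge reputation toward 1.0 after a verified call and toward 0.0
    after a debunked one, keeping the value clamped to [0, 1]."""
    target = 1.0 if verified else 0.0
    new = reputation + step * (target - reputation)
    return max(0.0, min(1.0, new))

# A new sweeper starts at a neutral 0.5 and builds a track record
rep = 0.5
for outcome in [True, True, False, True]:
    rep = update_reputation(rep, outcome)
print(round(rep, 3))  # 0.582
```

A scheme like this sidesteps the expert/non-expert question: the novice on British politics and the veteran on US politics each earn their weight per topic from outcomes, not credentials.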

It’s great to see people tackling these thorny issues and testing them in real world situations. I wonder if this type of filtering can also be used to surface and filter information for ongoing news stories and not just crises and breaking news. Filters are increasingly important as the volume of information increases. Building better filters is a noble and much needed task.
