links for 2007-02-28

links for 2007-02-27

UK start-ups: They are out there

I’ve been having conversations lately with a few people about British start-ups. As Tom Coates noted, it is a conversation we’ve been having for quite a while now, but rather than pontificate, I thought I’d do another one of my list blog posts. Who are the British start-ups? And what do they do? I’ll be editing this post as I go along to reflect new info, but here’s my starter for ten:

“Ning is the fast and free way to create custom Social Websites!”

“word-of-mouth community where people can remember, share and discover great places”

“etribes is used by thousands of people like you who want a simple, secure personal website.”

“Web Widgets. Snipperoo is for collecting and using them without hacking code. Add widgets to your account and they appear on your site. It’s like magic! And it’s free.”

“Webjam is a flexible tool that allows you to manage multiple pages, on your own or with people you invite, with just one account.”

“Blog instantly by speaking your entry into your mobile phone. Simply call your Speak-a-Blog TM number and speak your post. SpinVox converts it to text and posts the entry live to your blog, within minutes.”

“The social music revolution.”

“The marketplace where people meet to lend and borrow money.”

“Email large files easily and securely”

I just know I’ve forgotten some, so tell me… where are the other UK tech start-ups? And which ones do you rate? Equally, I feel pretty confident of the provenance of these start-ups, although it’s not always clear, so please correct me if I’ve got it wrong.

Open publishing – A few questions left

This week is my turn to work with the students on De Montfort’s Online MA in Creative Writing and New Media, which I am very much looking forward to. But first, an apology: I had promised to put together a video lecture, but it turns out that video is a lot harder than it looks. I spent most of the weekend struggling with the technology, only to end up at 1am this morning with a video which was both too long and rubbish. I’ve thus concluded that I need to acquire a few new skills before I start making rash promises about video – I hope you’ll forgive me, but I honestly think those are 30 minutes of your life that you can do better things with.

Everything I would have said in the video has already been published, however, in the Open Publishing category of this blog:

But I’m left with a few questions.

  • What are the numbers? How have Penguin, Tor and Baen seen sales develop over the life of an open book? Do they have any information that would allow a comparison between downloads and sales?
  • Does open publishing prolong the shelf-life of a book?
  • Is success genre-specific, and focused on internet-literate readers such as science fiction fans and readers of technical books?
  • Do authors who open publish earn more overall? Do they get more requests to speak, or write for magazines or newspapers? Do they get other paid gigs alongside their writing?
  • Will the model work when we don’t need paper at all? Is open publishing a blip, viable only during the period within which ebooks are non-interchangeable with paper books?
  • Do ebook downloaders buy more books overall?
  • What’s the relationship between audiobooks and ebooks?

There is, obviously, a lot more to say about open publishing and my curiosity is very much piqued by what I’ve read and written so far. I look forward to delving into the topic even more and look forward to everyone’s questions and comments.


Open publishing – The opposite of open is DRM

It’s difficult to have a discussion about open publishing without also considering digital rights management (DRM), the software that attempts to control what people do with digitally distributed content. For many publishers, the thought of publishing books under a Creative Commons licence is anathema, and yet they don’t want to pass up the opportunity to distribute their material digitally online. Instead of experimenting with open publishing, they try to find a middle way, and frequently they think that middle way is to use DRM to lock up their ebooks and audiobooks.

As you can tell from my tone, I’m none too keen on DRM. It’s something I’ve done a lot of work on with the Open Rights Group, where I was until recently Executive Director. Rather than rehash all the arguments here as to why I believe DRM is bad, I’m going to give you a nice list of links:

The problem with DRM is that it’s a fundamentally flawed technology which erodes our rights and favours contract law over copyright law. It prevents users exercising their fair dealing rights (called fair use in the US), restricts access to those with disabilities, and does nothing to benefit the consumer.

I have been surprised by the relish with which some publishers approach DRM, but in looking for a middle way they’ve ended up down a cul-de-sac.


Rethinking video, rethinking journalism, rethinking priorities

I love blogs for the distributed conversation that they engender, and one of the discussions over the last few weeks has been about online video and how it is fundamentally different from television. There has long been a post in the back of my head that newspapers should focus on creating video and not recreating television.

Paul Bradshaw beat me to this post in calling for newspapers to stop trying to make television – it’s video. He makes some excellent points on how the grammar of TV does not translate directly to the web. For instance, on the web, why have an anchor pass to a video reporter?

My view is that TV shovelware not only translates poorly online, but adopting television production methods cedes the competitive economic advantage that newspapers now have over television. The argument for a 24-hour live broadcast television news operation is economically and journalistically dubious. Rocketboom’s daily downloads equal or outstrip the viewership for many cable news channel programmes. But I wonder how much more is spent per cable news programme versus Rocketboom’s production costs? OK, that analogy isn’t completely fair, but on-demand video divorced from television’s high overhead will begin to pressure rolling news channels. That is where the opportunity exists for newspapers and other non-traditional sources of video, not in jumping from one threatened business model to another.

Paul Mason, business reporter for the BBC’s Newsnight, actually read out an obituary for rolling news. Paul wrote:

In addition, the limitations of rolling news as a news medium are beginning to block its ability to set the pace in terms of design. When it first started, the bosses consoled themselves for the low viewing figures with the promise that, once viewers saw what they were missing – all those dramatic sound stings, breaking news straps, crawling text, blinking arrows and massive sets – they would be drawn to this visual feast. Today the feast is to be found online – and it is not just visual. It is the immersive experience of interaction in real time with real people that compels users to stay online for hours – whether on eBay or World of Warcraft.

Note, both Paul and I make a distinction between 24-hour live broadcast television and 24-hour newsgathering. I found Paul’s arguments really compelling, not least because he knows the business, but also because he was saying that the workflow and grammar developed for 24-hour rolling news operations didn’t necessarily provide compelling material for 24-hour on-demand news operations.

Adrian Monck has a great post based on a piece he wrote for the BBC College of Journalism. Check out the bullet points, Monck’s Maxims. I really took note of this line:

So, a quick review of video online tells you newspaper guys are still in charge of newspapers, and TV and radio people at the BBC control the commissioning strings for the content that ends up online.

Ah, the commissioning budget and old lines of editorial control. The bottom line is that as economic priorities shift to online, commissioning priorities for original journalism also have to shift in that direction. That’s a long term process. In the near term, media companies have to radically revamp their development process, but that is another blog post. Suffice to say, new media development cycles have to become incremental, iterative and measured in months, not in years.

But in this video discussion, it was great to see my former colleague Alf Hermida’s (new, at least new to me) blog post push this discussion a little further and call for some thinking outside the TV news box.

What I find surprising is that the industry is still having this discussion. It reflects how people in broadcasting and print have failed to realise that the internet is a new medium. It shows the deep lack of understanding of digital journalism and its potential.

Rethinking how we do video online is a start. But we need to rethink journalism for an interactive and participatory age.

Andy Dickinson thought that Alf was calling for a focus on journalism and not the medium. I may be respectfully disagreeing with Andy here, but what I took away from Alf’s post was that the industry needs to rethink journalism in light of interactivity and participation. Then again, I might just be misreading Andy’s post, because it sounds like a line I’ve heard over the years, that journalism is journalism no matter the medium, and that is a position I have always disagreed with.

Regardless, I think Alf is spot on in calling for a rethink of journalism that considers the opportunities of digital journalism and multimedia storytelling. These days, I focus on the interactive and participatory possibilities. That still escapes most broadcasters and publishers. They don’t really understand the social dynamics and psychology of social media because, for the most part, they don’t understand how media can be social.

In the end, I think the opportunity for video exists not in replicating television, but in:

  • Taking advantage of the disruptive economic potential in pro-sumer video production, not in trying to replicate TV production methods.
  • Developing a workflow that supports on-demand video not rolling television news.
  • Developing an editorial voice and grammar that works in an online, on-demand world, not one that apes CNN and other rolling news channels.


Open publishing – Collaborative writing

It’s not just publishing that is becoming an open process, but also writing. The advent of wikis and blogs allows people to collaborate on creative works with complete strangers, regardless of geographic divides. The idea seems a bit strange to creative writers used to what is most frequently a solitary pursuit, but for certain types of writing it can work very well. Opening your work up for proof-reading and criticism right from the beginning can be an emotionally difficult task for some, but bringing together a number of experts to work on a book and provide feedback can result in a much better end product.

Some types of writing are clearly good for collaborative writing – technical books, such as books about computer programming, or factual books with a lot of fine detail benefit from the insight and expertise of more than one person. One such example is The Django Book, written by Adrian Holovaty and Jacob Kaplan-Moss. Here’s a very quick tour of their site:

Clive Thompson did something similar when writing a feature on radical transparency for Wired. He published his initial ideas about what the feature should cover, and asked his readers for their input. They gave him information and links to use in his research; discussed the implications of his ideas on secrecy, transparency and the hivemind; and helped him shape his feature with views from around the world.

And a project that De Montfort students might already be aware of is the Million Penguins wiki, a joint Penguin/De Montfort project attempting to bring strangers together to write a novel. Rather than using a blog and comments to solicit feedback, this wiki allows people to write and edit the novel directly. Unlike The Django Book or Radical Transparency, which are examples of factual writing where people can pool their expertise on a given subject, A Million Penguins is an experiment to see if people can write fiction together.

The problem with writing fiction is that it’s not just a series of scenes put into a logical order, it has to have an internal structure of its own, and that usually comes from one person’s imagination, or collaboration between a small number of people (frequently two). It’s also difficult for a group of strangers to write with a consistent voice, to avoid cliché, and to develop working plots, sub-plots, themes and motifs. But A Million Penguins is an experiment to see if people can self-organise, and to see how parallel storylines develop as individuals and small groups pick up a concept and run with it in different directions.

It reminds me somewhat of the email role-playing games (RPGs) that I’ve been a part of in the past, where people come together, each creating a character, and weave a story together email by email. Sometimes email RPGs work really well – when you have a cohesive group who respect each other’s contributions, not only is it a lot of fun but the story that unfolds is creative and interesting. But it only takes one person being difficult to turn a fun RPG into something tedious and annoying, and I fear that the same is true – possibly more true – of a wiki novel. I guess we’ll have to wait and see.

Wikis can also be used for non-fiction, just as blogs can be. Justin Patten is currently writing a book called Blogging and Other Social Media: Technology and Law, and is using a wiki to open up the writing process to other social media experts. Again, I think it’s slightly easier to write a non-fiction book on a wiki than a novel, but either way it’s a non-trivial task.

One issue that springs to mind is: how do you deal with someone posting content that infringes another person’s copyright? It’s not feasible to double-check every passage added to the wiki by every user, particularly if your wiki takes off and you have a lot of contributors. It could be troublesome if such a passage was not picked up until the book was in print, potentially forcing all copies to be pulped if legal action was taken.

The answer, I think, is not just to trust your contributors in general, but also to encourage them to add references if they spot a passage they recognise as being quoted from another source. Then inclusion of infringing text – whether innocent or malicious – could be picked up fairly early in the process. Of course, there are no guarantees, but we’ll have to wait and see if this sort of concern is even valid.

One final method that I’ve used a lot for writing up collaborative conference notes is simultaneous note taking, using software like SubEthaEdit (on the Mac). SubEthaEdit allows multiple people to edit the same document at the same time – so you can see people typing, letter by letter. It’s an amazing tool for real-time collaboration, and I’d love to experiment with writing something substantive with it. Certainly it’d be a fun tool for co-writing a novel, so long as your collaborators are in the right time zone!

But this openness isn’t suitable for everyone or every project. Sometimes, the joy of writing is sitting, on your own, somewhere quiet, and just working through your own thoughts, figuring out what you really mean, getting your own words out of your head and into a medium where they can eventually be shared – when you are ready. Much of writing for me is about self-expression, and that’s something that’s never going to go away, no matter how much technology provides me with the tools and opportunity to collaborate. That’s not a rejection of collaboration, but recognition of the fact that I like to put my self into my writing, and no one else can do that for me. Neither way of writing is right or wrong, it’s just horses for courses.


Six Apart spins like a Whirling Dervish

I’ve refrained from blogging about Six Apart lately, because I have nothing positive to say about them or their products right now, but I’m afraid I can’t let their latest marketing email pass without calling bullshit.

I have spent the best part of the last four or five months listening to various friends struggling on a daily basis to keep Movable Type up and running. In fact, if you’re a regular reader then you’ll have experienced for yourself some of the problems that Corante have had with MT: the slowness, the failed page loads, the inability to post comments and, at one point, Strange’s total absence. I know of at least four large commercial installations of MT that have struggled – and, at times, failed – because Movable Type simply did not scale. (Although the new Rebuild Queue has helped.) I have personal friends who have had significant problems with MT, even though their sites are relatively small. And I have consoled more than one developer as MT saps their will to live, with significant bugs in 6A’s code being found and, eventually, fixed.

(Note: I am not going to name names, other than Corante’s – you knew about that anyway. Businesses in particular seem to be very wary of admitting when they are having software problems, but I am talking about household names both in the UK and the US who are having problems, and not small ones.)

With all this in mind, I find it totally disingenuous of 6A when they write:

We talk a lot about helping bloggers succeed with Movable Type, and that requires us to also focus on an important rule: Failure Is Not An Option.

You see, one thing Movable Type users often have in common is that, whether they’re writing a personal parenting blog for friends and family, or they’re publishing their opinion on case law for a law blog, they just can’t accept downtime on their blog. Fortunately, Movable Type was designed from day one to be super-reliable, standing up to the heaviest traffic load, even if you get linked to be a huge website.

This is nothing more than marketing department spin. MT is not super-reliable. If it was, then I wouldn’t keep hearing of yet another blogger who has abandoned MT, or another company that’s fighting to keep its MT installation going.

Six Apart is talking about MT as if it’s only used by individual bloggers, and that the only problem is when you get linked to by a big site. But whilst there are plenty of individual bloggers who are having problems, there are also businesses who have paid good money for a commercial MT licence and are now finding MT to be a liability. And it’s not necessarily a big link that’s causing the problem, but fundamental flaws in the way that MT deals with spam and comments, and other bugs in the code that frankly should have been picked up years ago.

The spam problem, as I understand it, is that MT doesn’t differentiate between a spam hit and a proper comment until it has hit the database. It does the same amount of work in both cases, and the only difference is where that comment eventually turns up: on your blog or in the junk folder. So if your blog is hammered by spammers, the database does the same amount of work as it would do if it were hammered by real commenters. Of course, a spambot can hit your database with more comments more quickly than a human being can, and that alone can bring a blog down.

I heard of one case where, every time a comment was made, it caused 250MB of data to be transferred between servers. Scale that up to hundreds or thousands of spam comments, and suddenly you have the kind of load that can melt a server.
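To make the point concrete, here is a rough sketch, in Python, of the two approaches described above. All the names here are mine, not Movable Type’s actual code, and the spam filter is a deliberately silly stand-in; the point is only where the classification happens relative to the database work.

```python
# Hypothetical sketch contrasting classify-after-persist (the behaviour
# described above) with classify-before-persist. Names are illustrative.

def handle_comment_mt_style(comment, db, is_spam):
    """Classify after persisting: every hit, spam or not, costs database work.
    Classification only decides whether the comment shows up on the blog
    or in the junk folder."""
    db.append(comment)              # full database write happens first
    comment["junk"] = is_spam(comment)

def handle_comment_filter_first(comment, db, is_spam):
    """Classify before persisting: obvious spam never touches the database."""
    if is_spam(comment):
        return False                # rejected cheaply, no database work
    db.append(comment)
    return True

# A deliberately naive filter, standing in for a real spam classifier.
is_spam = lambda c: "viagra" in c["body"]

# Simulate a spambot flood: 1000 spam hits and one genuine comment.
flood = [{"body": "buy viagra now"} for _ in range(1000)]
flood.append({"body": "nice post!"})

db_mt, db_ff = [], []
for c in flood:
    handle_comment_mt_style(dict(c), db_mt, is_spam)
for c in flood:
    handle_comment_filter_first(dict(c), db_ff, is_spam)

print(len(db_mt))  # 1001 database writes, junk and all
print(len(db_ff))  # 1 database write
```

The spambot does the same damage in both cases in terms of incoming requests, but in the first model every request becomes database load, which is exactly why a flood of spam can bring a blog down.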

So no. MT is not super-reliable, and it cannot stand up to the heaviest traffic loads.

Six Apart go on:

How does it work? Well, unlike most blogging tools, Movable Type supports two different ways of publishing your pages — it can look in your database and choose which posts to display each time someone visits your site, or it can just generate a regular HTML web page that gets displayed without having to touch your database. That’s what we’re talking about when we say Movable Type supports “static” or “dynamic” publishing — static publishing doesn’t talk to your database every time someone visits your blog, and it’s the default in Movable Type. We let you choose between both so you can set the right balance of performance and scalability. (Static publishing takes longer for you as an author, but less time for your readers — so if you’ve ever waiting for your site to “rebuild”, you can take some consolation in the fact that your readers will have less of a lag when they visit you.)

Aaah yes, the rebuild. They talk as if this is a good thing. The trouble with rebuild is it’s really not very efficient, and frequent comments cause superfluous tasks to be queued for the rebuild, so you end up wasting a lot of server capacity. God knows the number of times I’ve sat there, waiting for a blog to rebuild… and waiting, and waiting, and waiting.
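For those who haven’t watched a rebuild grind away, here is a minimal sketch of why a rebuild queue helps. This is my own illustration, not Six Apart’s code: naive rebuilding re-renders a page once per comment, while a queue collapses repeated requests for the same page into a single pending task.

```python
# Illustrative sketch: naive per-comment rebuilds vs a deduplicating queue.

naive_rebuilds = 0

def naive_on_comment(page):
    """Rebuild the page immediately, every time a comment arrives."""
    global naive_rebuilds
    naive_rebuilds += 1

pending = set()  # a queue that deduplicates: at most one entry per dirty page

def queued_on_comment(page):
    """Just mark the page dirty; marking an already-dirty page is free."""
    pending.add(page)

def drain_queue():
    """One rebuild per dirty page, however many comments arrived meanwhile."""
    count = len(pending)
    pending.clear()
    return count

# 500 comments (spam or otherwise) land on the same post between passes.
for _ in range(500):
    naive_on_comment("post-42.html")
    queued_on_comment("post-42.html")

print(naive_rebuilds)   # 500 rebuilds the naive way
print(drain_queue())    # 1 rebuild via the queue
```

That gap is the superfluous work I’m complaining about, and it grows with comment (and spam) volume, not with readership.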

If you have the very latest version of MT, you have Rebuild Queue, but if you don’t then it doesn’t matter whether your site is static or dynamic, the problem is total comment load, including both spam and valid comments.

Most other blogging tools don’t do rebuilds the way MT does, and I can’t think of another tool that I use that suffers as much from bugs and downtime as MT. Doing things differently doesn’t mean you’re doing them right and everyone else is doing them wrong.


Now, if you have a huge farm of servers and lots of technical staff, you can make dynamic publishing work at very high traffic volumes, too. In fact, our LiveJournal team here at Six Apart invented a lot of the open source technology that makes that work — the people behind sites like Facebook and Digg and Wikipedia and our own Vox use it, too. But if you’re running on a regular web server at a standard hosting company, they’re going to get kind of annoyed if your blog is hitting the database thousands of times just because you wrote a popular post.

Most commercial installations don’t have big server farms, nor do they have lots of technical staff. Yet even if you do chuck a few extra blades and a couple of developers at the problem, it’s still difficult to make MT work in either mode, static or dynamic, if you’re being hammered by spammers. Again, writing popular posts isn’t the problem. Serving pages isn’t the problem. Comments are the problem.

Now, it’s very easy to blame the spammers, but the sad fact is that spammers aren’t going to go away, and tools have to be built to withstand their onslaughts. MT isn’t. It didn’t matter how many servers you threw at MT 3.2x, comment spam could still kill them.

Oh, and just to nitpick… all that lovely open source stuff from LiveJournal? Well, let’s remember that minor point of fact that 6A bought LJ for its open source goodies. No sneakily trying to claim credit for LJ, please.

You might’ve seen this effect already — ever check out a link that’s been promoted on a big site like Digg or Slashdot and been faced with a “database connection error” when you visit the blog that got Dugg? Well, Movable Type is designed to prevent you from ever having to face that problem.

I feel like a broken record. Spam, guys, spam. Not the Slashdot Effect. (For the record, I’ve noticed that the Slashdot Effect is nowhere near as strong as it used to be anyway.)

For more tips on how to make sure your blog is performing as reliably as possible, our community’s put together some resources:

* MT Wiki
* Performance tuning Movable Type
* Enabling FastCGI
* Movable Type System Architectures

MT was always a tool that you needed to have a reasonable amount of expertise to install. Then they made it a bit easier, so you didn’t need to have quite the developer chops that you used to. Now you need to be a developer again to make the damn thing work. Make up your minds, 6A. Either MT is a developer tool or a consumer tool – you can’t keep wavering between the two.

And of course, we haven’t yet achieved this goal of making blogs failure-proof. Some of the steps for making a Movable Type blog bulletproof are too obscure or confusing. So we want to collect your feedback on the questions and concerns you have about the reliability of your Movable Type site — if you’ve ever missed out on some page views or potential readers because your blog wasn’t reachable, let us know or briefly summarize your story on this Movable Type wiki page.

OK, so 6A haven’t achieved their goal of making blogs failure-proof. Why, then, spend five paragraphs claiming they had?

If they want to understand where the problems are, they should start offering some support instead of expecting the people they’ve let down to put the time and effort into writing it all up for them on their wiki. I know of people who have paid good money for MT who have had to fight to get 6A’s attention for support – 6A have complained when people ‘don’t use the ticketing system’ when the ticketing system was in fact broken. Hell, I even know of companies that have had to fight to pay them for a licence to use their software as per their terms and conditions. What sort of way is that to run a business?

Give proper support to the people whose MT blogs are failing, and you’ll soon gather all the stories you need to figure out what’s screwed up with MT. Instead of asking us to put the effort in, why don’t you, for a change?

We’ll start blogging about the reliability stories we’ve heard, both where MT has held up under pressure as well as where MT didn’t do what you’d expect, and how to fix it. Until then, you can help by pointing us at examples of Blog Failure, whether it’s on Movable Type or not, and we can all work together to help solve the problem.

Frankly 6A’s marketing department should be given, at the very least, a strong talking to for this email and especially the first and last paragraphs. Why should we do your work? It’s not the consumer’s job to figure out what’s wrong with your software – that’s your job, and if you provided decent support you’d have most of the answers by now anyway.

MT 3.34, released on 17 Jan 2007, has helped a few of my friends and contacts, but they are still having to do significant work to get all the plug-ins installed and working efficiently. Spam is still a problem. FastCGI gives a perceived speed increase, but frankly is a bit like faking it.

And whilst Rebuild Queue helps, it comes too late for many individual users and large MT installations. In commercial settings, MT’s damaged reputation has rubbed off not just on other third-party blog-related tools, but also on those evangelists who championed blogs in the first place, obscuring blogs’ benefits with serious performance issues that blot everything else out. It also makes it much harder to sell other Web 2.0 applications because of the fear that they too won’t scale.

The truth is that 6A have dropped the ball. They abandoned MT and their users, and their lack of support and updates has caused significant problems for even those people who are paying to use the software. Instead of keeping on top of MT and ensuring that it can cope with a rapidly changing environment and increasingly sophisticated spammers, they’ve spent the last two years focused on Vox.

Personally, I find it hard to have faith in Six Apart’s commitment to developing, improving and supporting Movable Type, which is why I now advise clients to avoid it at all costs.


FOWA 07: Round up and impressions

So the notes are up, the dust has settled, and I’ve recovered from all the excitement. Time to think about the Future of Web Apps and give my opinions, as so nicely requested by Alan Patrick. I don’t usually have time to both take notes and think about my reactions to what’s being said – the notes are just my way of processing what I’m hearing. If I don’t take notes, I tend to fall asleep, so it’s a sort of conference survival mechanism thing, really. Anyway…

Firstly, I have to thank Ryan, Gillian and Lisa for letting me in to cover the conference. We had agreed that I would write short summaries of each session for their live coverage page, but problems with the wifi meant that it was really difficult for me to get the summaries to Lisa who was posting them live, so we really only managed to get a few of the talks from the first day posted. I feel a bit bad about that, but I hope that the comprehensive notes I have posted here will do instead.

The wifi was a real problem this year. Last year, they’d organised great wifi, but this year, despite spending good money on it, one of their suppliers failed and there was no wifi except BTOpenzone, which on the first day crumbled under the weight. It was a bit better on the second day, but still not all that reliable. I really hope that Carson get their money back, and compensation, from whichever supplier screwed up. Wifi at conferences is really important, and it’s something that significantly changes attendees’ opinion of a conference, so I feel for Ryan, having spent so much on it only to have it die.

Now, on to the content. The tone of this year’s conference was very different to last year’s, in my opinion. It was much more business- and vendor-led, with fewer of the sort of inspirational talks that we had last year, from people like David Heinemeier Hansson, Tom Coates, or Cal Henderson. I think that made it a little flatter this year, with fewer ‘wow’ moments.

The highlight, for me, was without a doubt Simon Willison talking about OpenID. Simon’s good at being excited about things – he has an energy and enthusiasm which is totally contagious, and by the time he finished his talk I immediately wanted to run off and set myself up an OpenID server.

Stef Magdalinski and Richard Moross from Moo were also great. Stef has a great style as a speaker, a nice wry humour that I very much appreciated after some of the dry sponsor slots. Plus I love Moo. They produce the best business cards I’ve ever seen, and every time I give one out, people notice it, notice the quality and the unusual dimensions, and they immediately love them. Which reminds me, I really must get some more done. So, as a fan of the product it was great to hear more about Moo, and surprising to hear that they are based in London. For some reason, I thought they were based in San Francisco! Just goes to prove, yet again, that the best start-ups don’t always come from America.

Tara Hunt’s presentation on community was very interesting. I think I know a bit about community but Tara had lots of interesting stuff to say and said it well. My only criticism was that she tried to cram a bit too much in, and so she went a little bit too fast for me to keep up.

Notable product pitches came from Simon Wardley from Zimki, who should get some sort of special award for effective use of photos of kittens with guns, and Stefan Fountain from Soocial, whose presentation was funny and inspiring all in one.

So, what didn’t I like? Well, I don’t like boring presentations from people who could only talk about how great their own company is. Werner Vogels from Amazon totally wasted a good opportunity to talk about on-demand resourcing in a useful and interesting way, instead choosing to bang on about how great S3 and EC2 are as if that was all we needed to know. He had the beginnings of a really good talk about push- and pull-mode resourcing, and could have given us a really useful insight into how S3 and EC2 actually work, but chose the patronising ‘Look how great we are! These people use our service! Our service is great!’ route instead. When someone asked “How do you ensure that the data you host on S3 isn’t lost?”, he totally refused to answer and basically just told us to trust them. Sorry, but you can’t demand trust, you have to earn it.

Here’s a general tip for people representing their company at a conference. Remove every single superlative from your presentation. I don’t want to see you saying that your product is ‘the best’ or ‘most this’ or ‘incomparable’ – I won’t believe you anyway. You can do more to enhance your company’s reputation by giving an interesting talk that’s only tangentially related to your products or services than you can by blathering on about how great you are.

As for sponsor talks, well, frankly, sponsors should never be allowed anywhere near the stage unless they have something genuinely interesting to say.

Barring a few boring talks, I came away from FOWA 07 feeling pretty good – I enjoyed myself and had some good conversations. It wasn’t as inspirational as last year, and I think I’d prefer next year’s to go back to a one-day format but be much more rigorous about who gets invited to speak than have two days with sponsor chaff clogging things up.

Of course, I’m secretly hoping that next year I’ll be able to come up with a relevant talk to submit myself, but I guess that depends on how much Ruby on Rails I get my head round in the meantime. But either way, I’ll certainly be hoping to attend again.


FOWA: That’s all the notes published!

OK, so that’s the lot! All my FOWA notes are now up. I’ll do a round-up tomorrow and actually talk about how I thought the conference went, but meantime I hope you find my notes useful. Please remember, though, that these notes were taken live and I can’t vouch for their accuracy, both because I am fallible and because I am only reporting what speakers said and not fact-checking them.