Sapir-Whorf Hypothesis. Connection between thought and language: if you haven’t got a word for it, you can’t think it; and if you don’t perceive it as a concept, you won’t invent a word for it. For example: Dutch ‘gezellig’ [or Welsh ‘hiraeth’].
The Deeper Meaning of Liff: A dictionary of things there aren’t any words for yet but there ought to be.
For example, Peoria (n.): the fear of peeling too few potatoes.
Web examples: AJAX, blog, microformats, Web 2.0. These are words that let us talk about things; they create the concept for us so we can discuss it, even though the thing itself existed before. They also signal the success of work that has gone on in the past.
There’s little in AJAX that wasn’t there from the start. Blogs have really been around since 1995.
What needs a name? Think about concepts that need names (which the Sapir-Whorf Hypothesis doesn’t allow us to do).
E.g. the sort of website like CSS Zen Garden, where the HTML has been completely separated from the CSS. Another example is using SVG to render data.
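A minimal sketch of that separation (my illustration, not from the talk; the stylesheet filenames are hypothetical): the same HTML document can be restyled by swapping the CSS alone, CSS Zen Garden style.

    <!-- One HTML document; the visual design lives entirely in the CSS. -->
    <!-- Swapping zen.css for another stylesheet changes the look without touching the markup. -->
    <!DOCTYPE html>
    <html>
      <head>
        <title>Zen Garden style separation</title>
        <link rel="stylesheet" href="zen.css">
        <link rel="alternate stylesheet" href="other.css" title="Alternative design">
      </head>
      <body>
        <h1>The same content, any number of designs</h1>
        <p>No presentational markup in here at all.</p>
      </body>
    </html>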
Other things that need to be Whorfed in the future:
– layering semantics over viewable content, as microformats and RDFa do, making the semantic web more palatable for the web author (see the sketch after this list).
– webapps using declarative markup.
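A rough sketch of what “layering semantics over viewable content” looks like in practice, using hCard-style microformat class names (the names and values here are illustrative, not from the talk):

    <!-- An ordinary, human-readable contact block... -->
    <!-- ...with hCard microformat class names layered on so machines can read it too. -->
    <div class="vcard">
      <span class="fn">Jane Example</span>,
      <span class="org">Example Organisation</span>,
      <a class="url" href="http://example.org/">example.org</a>
    </div>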
Moore’s law and an exponential world. Computers are very powerful now. His new computer is a dual-core, which means his computer is twice as idle as it was before. Why aren’t we making better use of this power?
A declarative approach puts the work in the computer, not on the human’s shoulders.
Software versions are not so much of an issue these days, but devices are: lots and lots of devices. Also diversity of users. We are all visually impaired at some point or another, especially with tiny fonts on PowerPoint slides, so designing for accessibility is designing for our future selves. It’s essential.
Google is a blind user: it sees what a blind user sees. If your site is accessible, Google will see more too.
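One way to read that: the markup is all a screen reader, or Google, ever receives, so the accessibility hooks have to be in the markup itself. A small hedged example (my own, not from the talk):

    <!-- Text alternatives and real structure are what a blind user, and Google, actually get. -->
    <img src="sales-chart.png" alt="Sales rose steadily from January to June">
    <label for="email">Email address</label>
    <input id="email" type="text" name="email">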
Want ease of use, device independence, accessibility.
Bugs increase with complexity: a program that is 10 times longer has roughly 32 times the bugs (bug counts growing with about the 1.5 power of length, since 10^1.5 ≈ 32). But most code in most programs has nothing to do with what the program is actually meant to achieve.
Declarative programming cuts this out. JavaScript, for example, becomes fragile once it gets too long; a declarative approach could replace much of it and make the computer do the hard work without cluttering up the code. It removes the administrative details you don’t want to mess about with anyway, so letting the computer handle them eliminates a lot of that code. The declarative markup is then the only part produced by the human.
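As an illustration of the idea (using modern HTML attributes of my choosing, not something the talk itself shows): validation that would otherwise be a pile of imperative JavaScript can be stated declaratively and left to the browser.

    <!-- Declarative: the author states what must hold; the browser does the checking. -->
    <!-- The imperative alternative is a submit handler that walks the fields, tests each one,
         builds error messages and wires up the events itself. -->
    <form action="/register" method="post">
      <label for="age">Age</label>
      <input id="age" name="age" type="number" min="18" max="120" required>
      <button type="submit">Register</button>
    </form>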