Does anyone remember The Birdie Song? The original version was released in the 1960s, but in the 1980s a UK band called The Tweets got to No. 2 in the charts with an instrumental version accompanied by a silly dance. I use the term ‘dance’ loosely. In the same way that you might describe a bacon double cheeseburger with chilli sauce, caper mayo and a side order of onion rings as ‘nutritious’.
Both the Birdie melody and its footwork burned themselves into the collective global psyche. (As a special treat I have included a link to the Indonesian version by Warkop, who built a whole comedy routine around it, at the end of this post.) Huge numbers of people hated The Birdie Song, but an equally frightening number are compelled to hum the first few bars under their breath in moments of crisis. Go to a wedding and sooner or later Aunty Ethel and your strange cousin will loosen their clothing and start teaching the moves to anyone who dares come within striking distance of the dance floor. By 9.30 the same evening every inch of available floorspace is given over to synchronised chicken dancing.
All of which brings me to the subject of this post: Twitter.
Okay, at first glance this may seem like a gratuitous segue based on a tenuous ornithological resonance. But Twitter and The Birdie Song connect on a much deeper level. People get very hot under the collar about this particular branch of social media (as they did with The Birdie Song). It’s a love it or loathe it kind of thing. For every Aunty Ethel desperate to teach you the Twitter moves there’s an Uncle Alfred spitting tacks about collective navel gazing.
Until a couple of weeks ago I was in Uncle Alfred’s camp. I had bigger social media fish to fry. I was interested in ‘communities’, ‘platforms’, you know, ‘big stuff’. So what if Stephen Fry could describe dolphins undulating in 140 characters or fewer. Twitter was witter. I took words seriously.
But if you’re going to get under the skin of social media you can’t leave anything out. I sidled up to Twitter the same way I approached wasabi and pickled ginger when I first discovered sushi. You had to poke at the condiments just to prove you knew what you were doing. Take a little dip, decide you don’t like it (can’t see what it adds) and then get back to the raw fish and soy sauce. (Okay, a serious amount of mixed metaphors going on here, but keep up with me.)
But Twitter is a very interesting phenomenon. There are layers to it. Dismiss it as ‘geeks meet airheads’ at your peril. Like The Birdie Song, it’s predicated on some simple basic steps. First the question: ‘What are you doing?’ Then the answer, as brief as you can make it. You can teach someone The Birdie Song dance in about 10 minutes. You can start to Twitter in a similar amount of time.
I am rapidly coming to the conclusion that Twitter is the first pure-blood content progeny of the online age. It is adapted for skimming and dwell times that you can count in nanoseconds. Even the line length is perfect for screen reading, although whether that’s by design or luck, I don’t know.
Websites, although they’re getting better and better, are still caught up in their offline heritage. Websites may embrace interactive media, real-time chat and online transactional interfaces, but every now and then they drop their aitches and start sounding like printed brochures. Blogging has shifted control more firmly into the hands of users, but blogs are still predicated on offline values. Phenomena such as Facebook, Bebo and YouTube have further societised the internet; but they are, simply, highly accessible online manifestations of yearbooks, youth clubs and the weirder hinterlands of televisual entertainment respectively.
Twitter is an online baby. For a start, your ‘standing’ on Twitter has everything to do with how many people follow your Tweets (posts). You can’t throw money at it in order to get noticed. And people only seem to follow what engages them. There’s no brand loyalty here. I’ve come across big business Twitter accounts with two followers, while mums in Maryland can number followers in the thousands.
Secondly, you’re only as good as your last Tweet. And if your last Tweet was more than a few hours ago, chances are it has already been submerged by newer, fresher perspectives. Twitter has taken internet ‘currency’ to a new level. When people visit the internet they want to find information that is relevant now. Yesterday’s news is so very, very yesterday. That doesn’t mean there’s no room on the internet for historic/archive content (if presented usefully), but there’s no excuse for not being up to date as well, particularly as publishing to the web is being made easier by a plethora of content management systems.
And like The Birdie Song, Twitter is all about collective impact. It doesn’t matter that Aunty Ethel is always half a beat behind the rest of the dancers, or that your strange cousin has added a couple of unique moves to the bit where you all turn round; Twitter is a collective. It’s thousands of voices threading in and out of each other on a single platform.
Twitter also exposes the associative nature of internet information connectivity. Thanks to hyperlinking, the internet mimics and facilitates the associative workings of the human brain, allowing us to move from one piece of information to another, propelled by what we’re thinking of doing. It’s this hyperlinking that allows us to get from, say, checking the cost of flights to Malaga this summer to tracking down the right kind of rice for a great paella recipe.
Twitter is highly associative. My experience is that although each Twitter post is officially provoked by the question ‘What are you doing?’, often the question people choose to answer is ‘What am I thinking about?’ or ‘What has got me thinking?’ Twitterers point to other Twitterers’ Tweets, a significant number of which are crafted around a stimulating thought, or which act as signposts to useful information on other websites. (TinyURLs and Twitter – a marriage made in heaven.)
All of which has got me thinking – what next? I’m no Darwin scholar, but it seems like every time there’s an evolutionary leap it spawns a period of extraordinary fertility. Get the structure right and Mother Nature pops out a huge number of permutations. Then it’s just down to the survival of the fittest.
I’m sure there’ll be Twitter derivatives, but the big question is what else can evolve around user value, equal access, immediacy, succinctness, ease of publication, associative linking and associative thinking? Answers in 140 characters… or more.