What’s wrong with bad language?

(Here’s a bit of a post-Christmas rant. Sorry for the length!)

Complaints like the following are not uncommon on the internet and elsewhere:

— “Why can’t people write/speak Proper English?”
— “Write proper English if you want others to understand you!”
— “People who don’t use Proper English are lazy, stupid, or both.”
— “English is ruined by sloppy pronunciation and bad grammar!”

Taken at face value, complaints like the above seem perfectly reasonable. If you want to get a message through, then you need to communicate in a way that is understood. What could be more reasonable than that? There are a few underlying assumptions here that need a fair bit of consideration before we can address that issue. For instance, what exactly is Proper English? And by implication, what is Improper English? But let us start with a few basics.

Spoken vs written language

We normally communicate by either speaking or writing. Speech is our primary mode of communication. We learnt to speak before we learnt to read/write, both as individuals and as a species. Without speech there would be no writing. The reverse is not true.

Speech and writing are separate, albeit related, code systems. Even though there is a considerable overlap in lexicon and grammar, they nonetheless function according to their own separate sets of rules and constraints. Hence we need to keep them clearly separated.

In speech, for instance, we have a whole range of extra-linguistic phenomena that help us interpret what people say, such as intonation, stress, loudness, facial expressions, body posture, finger pointing, and so on. Even the clothes a speaker wears and the dirt under his/her fingernails can be used as clues when decoding a message. In a written text, all those speech-typical signals are lacking. On the other hand, written texts have their own distinguishing features, e.g. font type, text formatting (such as bold print, italics, underlining, font colour, etc.), paragraph organisation, headlines, info boxes, illustrations, paper quality, and so on and so forth. In short, spoken and written messages are constructed differently, and consequently we also decode them differently.

However, there are differences also where you might hypothetically expect similarities. For instance, the spelling of English does not match the pronunciation. While there are 6 letters signifying vowels in written English (a, e, i, o, u, y), there are more than 10 actual vowel sounds (excl. diphthongs) in any variety of spoken English. Some spelling/pronunciation mismatches are archaisms, testifying to old pronunciations. This does not, however, mean that they are not functional or useful. While the words right/write/rite are kept apart in writing, they are pronounced identically in most forms of English. This actually makes reading easier: since the words are visually distinct, they are not likely to be confused and thus their meanings are quicker to access. Moreover, the different spellings maintain a visual consistency among a set of related words, in this case write/wrote/written (all contain wr-t) and rite/ritual (both contain rit). Compare also anxious and anxiety, which are pronounced something like angkshious and angziety, respectively, but are kept visually related by their anx- parts. This maintains consistency on the word/grammar level at the expense of congruency on the spelling/pronunciation level. It’s an unavoidable trade-off, but a very useful one for readers.

Vocabulary and grammar differ between written and spoken language. We commonly use more elaborate sentence constructions in writing than we do in speech, and while we write £5, we use a different word order in speech, namely five pounds (instead of pounds five).

Speech is immediate and momentary, while writing is planned and lasting. We make up, edit and correct our spoken utterances on the spot, while the listener is hearing them. Written messages are often edited and revised before they are read by their intended recipients. Hence readers of written texts are less likely to witness any corrections and edits. In speech, however, these occur naturally during the actual communication. Thus it is only normal to expect more errors and mistakes in speech. This has to be taken into account when evaluating either speech or writing.

Change and variation are natural and ever-present

Human languages are heterogeneous and in constant flux. They display variation in both time and space. They always have and they always will. This is a necessary property of any living language. When a language ceases to change, it dies. It becomes non-functional and no longer serves its purpose as a useful tool of communication.

Language cannot be a closed and rigid system of rules for a variety of reasons. It has to be adaptable and flexible, simply because we need to be able to talk about new things, or even old things in new ways. Such things as metaphor, simile, analogy, sarcasm, and so on, constantly change and enrich our languages with new forms and new constructions. This is what poets and authors do all the time. Well, at least the interesting ones. But it’s not only poets who do it. Other people do it, too. Some people do it more deliberately and innovatively than others, but we all do it to some extent; perhaps not with as impressive results as Shakespeare did, but still, we do it.

It is not only the use of words and phrases that changes. Every aspect of language changes, pronunciation and grammar included. The reasons can be many, ranging from the biological/physiological workings of our speech apparatus to intentional idiosyncrasies used to mark one’s own identity or make a joke. Some changes are due to errors and mistakes, no doubt. But whatever their reasons, some of them catch on and spread, either because they are considered more prestigious (e.g. if a famous person uses them), or because they are felt to be better in some way (e.g. when the shorter mobile is used instead of the compound mobile phone). Clearly we cannot dismiss changes in principle. They happen, sometimes for reasons known to us, but most commonly for reasons that are unknown. This, of course, leaves the field open for people to make up all kinds of speculative and bizarre reasons, such as “laziness” and “stupidity”.

Different individuals do have different language skills, habits and preferences. I know a different set of words than you do. I’m comfortable with a different way of expressing myself than you. These differences exist, but such differences do not unproblematically translate into values of good or bad.

More than just messages

Language is not only used to convey messages. With language, we also signal our identities and group belongings. This is an important function of language, especially spoken language. We imitate those we want to identify with. With language, we create boundaries towards people who do not speak like us, or whom we do not want to sound like, for whatever reasons. For instance, kids do not want to sound like their grandparents. Nor do grandparents want to sound like thirteen-year-olds. Scots do not want to sound like Londoners, and vice versa. Middle-class people do not want to sound like chavs, and so they make sure they don’t speak like them. In their opinion, they of course speak “proper” English while the so-called chavs speak “bad” English.

This segregating function of language exists in all speech communities, and the causes are largely social. People naturally form groups, both temporary and long-lasting ones, large and small. These can be professional groups like lawyers, electricians and clergy, or social groups like family circles, chess clubs, buddies, street gangs, and so on. These groups create and maintain in-group specific behaviours, be it dress codes, handshakes, in-jokes, whatever, including particular linguistic behaviours such as specific forms of salutation and other fixed phrases, common technical terminology, peculiar pronunciations of certain words, and whatever else. These in-group peculiarities, furthermore, are part of what defines any given group. They reinforce the group’s identity and signal that identity towards outsiders.

What, who, how, where and when?

We constantly adapt our language use according to a variety of factors, such as what we say (the topic), to whom we are saying it (the intended receiver), how we say it (the medium), as well as when and where we say it (the context/milieu). For instance, if there is a lot of background noise we may choose to scream. If there are a lot of other people around, we may instead choose to whisper. We speak differently to our loved ones than we do to our bosses. When we communicate we usually do so with an intended audience in mind. It can be a single individual (e.g. in face-to-face dialogues) or it can be a large non-specific mass of people (e.g. when giving a national speech on radio). Thus we adjust our speech accordingly. That’s why “Sup?” is a perfectly valid formulation in one context, while “How do you do?” might be considered better in another. This is also the reason why there is no one single correct way of communicating. There are in fact innumerable correct ways.

Sometimes one can hear complaints from adults who have overheard adolescents speak unintelligible English on a bus, or in some other public place. This kind of complaint has little rational basis. The needs of successful communication have evidently been adequately met, as the adolescents obviously understand each other. They have no intention, nor any obligation, to make themselves understood by outsiders. Their use of “unintelligible” language is no indication that the adolescents in question cannot speak “proper” English. In all likelihood, they can. They are simply sensitive to the fact that they are intending their communication for each other, not those around them. Their choice of language behaviour signals “This is who we are” and “This conversation is not meant for you but my mate(s)”.

Standard English

When people talk about Proper English, it is usually some form of written English they mean. In particular, they mean the kind of English taught in schools, which is commonly referred to as Standard English or the Queen’s English.

Sometimes you hear people say things like “Common rules (of Standard English) are necessary to secure good communication”. However, this is looking at it the wrong way around. It is our need to communicate that has created and continues to maintain whatever rules there are. Some of these rules (generalisations) have been “discovered” by scholars and subsequently printed in books (grammars), which others then have come to treat as indisputable dogma. It is important to remember here that writers sometimes get it wrong, and not every stipulated grammar rule is a valid rule. Nor can a grammar ever be complete. And because languages constantly change and adapt, every published grammar is instantly obsolete. Think of a grammar as a photograph capturing a single moment in time. And not only that: no matter how much of the landscape it captures, there will always be something outside the frame, or beyond the horizon.

Standard English is not a natural language. It is an artificial construct existing only in written form. Even though it is a written language, many people try to emulate it in their speech, especially in formal situations. It is not the case that dialectal forms of English are deviations or even variations of Standard English. If anything, it is the other way round. Standard English is to spoken English dialects as the poodle is to wild dog species. More specifically, Standard English is an artificial variation of Midlands-based dialects, just like the poodle is an artificially created variation of a once-domesticated wolf. What we today recognise as Standard English has been deliberately engineered and promoted by the social and academic elite over the past 500 years or so.

It is some idealised form of this artificial, written Standard English that people usually have in mind when they complain about other people’s Englishes, be they written or spoken. And this is the basis of their irrationality. Everyday English is not the same as Standard English, nor should it be, and anyone expecting it to be is by definition wrong, even foolish.

So what are people complaining about?

There are seemingly no limits to what people can complain about when it comes to language. However, some complaints have been repeated so often that they have become unquestioned clichés rather than observations based on any rational thinking. The ironic thing is that many of the things that “language snobs” complain about aren’t even errors to begin with. Their complaints have been refuted many times by linguists, but the internet in particular is rife with the same age-old complaints, including such dear things as split infinitives, double negations, the word “like”, saying bigger than me instead of bigger than I, writing could of instead of could have, and many others. Let’s have a look at some of these.

Split infinitives

Complaining about split infinitives seems to be a favourite. However, there is nothing wrong with them. They are fully permissible in English grammar, and they have been used by many generations of speakers and writers. Sometimes sentences become clearer with split infinitives than without them. Compare the following three versions:
 — He prepared silently to accompany her
 — He prepared to silently accompany her
 — He prepared to accompany her silently
In the first sentence, silently modifies prepared. In the second, it modifies accompany. The third sentence is ambiguous. If you want to make it clear that it is the accompanying that is done silently, rather than the preparation, then you choose the second sentence, the one with the split infinitive. That is not grammatically flawed. It is stylistically good.

Double negations

Double negations are another favourite gripe among language snobs. They are sometimes claimed to cancel each other out. Thus He didn’t say nothing can allegedly be misinterpreted as He said something. This is just plain wrong. No one is likely to construe such a meaning unless they intentionally try to. The double negation is indeed redundantly marked (i.e. it is pleonastic), but this is quite a common phenomenon in languages. It can be used for emphasis or simply to make sure that the negation is heard. This is valuable in speech if not in writing. There is certainly nothing ungrammatical about it. There are even constructions in which we expect double negations to occur, as in neither … nor, in which negation is doubly (redundantly) marked.

It is even possible to argue that those who use double negations are more attuned to communicative needs than those who don’t use them. In Old English, the common negative structure was something like ic ne lufie (lit. I not love). The negative particle ne was frequently destressed in speech, for which reason it was strengthened with another negation marker, noht (nothing), giving rise to the Middle English construction ic ne seye not (lit. I not say nothing). This doubly-negated construction ensured that the negation wasn’t lost in transmission. (When the original negation ne later disappeared altogether, the newer negation, noht, was brought forward with the help of the auxiliary do giving rise to the Modern English construction I do not know.) In modern times, we see the same process happening again. Negative particles are frequently destressed in constructions like I don’t have it, as opposed to I do not have it. This creates a natural need to strengthen the destressed negation with an additional negation, as in I don’t have nothing. Thus the second negation is not there to cancel the first one out. It’s there to make sure the negation is heard. That’s not being ungrammatical or unidiomatic. It’s being sensitive to communicative needs.

(As a side-note, there are cases where we use double negations with the seemingly intended purpose of having them cancel each other out. But this seems to be possible only when the negations apply to the same word, as in not uncommon, in which the negations are not and un-. However, in such cases the resulting semantics does not equal the simple positive root, in this case common, so that semantically speaking, there is more going on here than a mere cancelling out.)

I or me

Whether to use me or I after than depends on how one interprets than. Is it a conjunction or a preposition? Both interpretations are permissible in English. Many words can function as both prepositions and conjunctions. In a somewhat simplified way, you can say that it depends on what comes afterwards. Is it a noun phrase or is it a verb phrase? In he did it before me, before is a preposition since what follows is a single pronoun. In he did it before I did, it is a conjunction since what follows is a verb phrase. This double-functionality is an integral part of English grammar (spoken as well as written), and there is nothing wrong with it. Hence he is bigger than me (preposition) is just as correct as he is bigger than I am (conjunction).

The curse of the Bishop

As already indicated, the above-mentioned complaints aren’t valid complaints at all. They have nonetheless been prevalent for many generations, despite having been refuted many, many times. They seem to originate from a set of (pseudo)rules established by Bishop Lowth in an influential book on English grammar which he published in 1762. They are likely based on Bishop Lowth’s own, idiosyncratic aversions to other people’s English, which just happened to differ from his own. Due to his social prestige and position in the church, his opinions came to be propagated by generations of English teachers, no doubt well-meaning, but misguided and wrong nonetheless.

Words like like

A more recent annoyance is directed towards the use of like in phrases such as it’s only, like, an hour and he was, like, stupid or something. There seems to be no end to the stigma attached to those who use like like this. Needless to say, the complaints are emotional rather than rational. The word like is multifunctional in most people’s English. It can appear as a verb, as in I like it, an adjective, of like mind, a conjunction, he eats like there is no tomorrow, or a preposition, she walks like a duck. For many people, these (and perhaps a handful of others) are the only accepted uses of like. Like in so many other cases, this, too, rests on the misapplication of ideal written standards to the spoken language. In spoken English, like has a further function seldom noted by grammar books (which are based on written language). It can also be a so-called discourse particle/marker, a functional category found in all languages. Indeed, discourse particles form an important and seemingly essential part of any spoken language. In English, words and phrases like like, well, I mean, you know, as well as others, can all be used as discourse particles fulfilling a variety of important functions. For instance, like can be used as:
 — an approximation marker, as in it was, like, two hours ago (emphasising that something is an estimate),
 — a mitigation marker, he was, like, stupid (lessening the impact of the accusation of being stupid),
 — a quotation marker, e.g. so he was like, “Ooh, my brain hurts” (framing reported speech, i.e. signalling that something is being quoted),
as well as other things. Discourse particles are seldom, if ever, meaningless tics. They only appear superfluous and meaningless to those whose ideal language is some form of Standard Written English and/or to those who don’t pay attention to what those little particles actually do.

Could of

A frequently ridiculed construction in written English, especially on the internet, is the use of of where you would normally expect a have, as in must of, should of, etc. Worth noting is that this particular mistake is always self-correcting, so even though it can be construed as a grammatical error, it does not create any confusion with regard to the message conveyed. However, it is also worth noting that it is not the grave error many people think it is. The grammatical template Auxiliary Verb + Preposition + Main Verb is already acceptable in English. It is used in, for instance, I ought to go home, in which ought to go is grammatically analogous to should of done, i.e. Auxiliary + Preposition + Main Verb. There are even hints of a consistent division of labour here. While to is followed by infinitive forms, of is followed by past participle forms. There is clearly more at play here than it being a simple error.

Commonly mixed homophones

Other oft-noted mistakes/errors concern the mixing of it’s/its, they’re/their/there, we’re/were/where, you’re/your, hear/here, and others. Admittedly, these are genuine errors, but they are always self-correcting. It is very difficult to come up with contexts where they’d create any serious misunderstandings. They may be eye-catching, but can hardly be considered detrimental to communication as such. Some might even argue that since they aren’t distinguished in speech at all, being pronounced identically, we might as well dispense with the distinctions entirely and use a single form for each.

Language is self-regulating and optimal

Any given speech community will always regulate its language behaviour so as to be an optimal tool for the easy flow of information (make yourself understood within your group) as well as a tool for signalling an individual speaker’s identity (tell your surroundings who you are and who you are not). The need to successfully convey messages favours similarities and common rules, while the need to mark one’s identity favours differences and idiosyncrasies.

Since language is also context-dependent (who says what to whom, how, when and where), what counts as optimal varies from situation to situation. Sometimes being only able to haggle over prices at a flea market is enough to qualify as functionally optimal. Sometimes it’s enough only to be able to hurl insults across a border. In other situations, such as parliamentary debates or when romancing a loved one, more elaborate language behaviour is called for. The situation, the participants and their individual motives dictate how communication occurs and what forms it takes.

The point here is that many factors have to be taken into account when assessing what is or is not appropriate and/or functional language. Using the sole reference frame of Standard (Written) English simply won’t do. It is deeply ignorant of what language is and how it works. When a perceived error keeps being repeated over and over again, generation after generation, then there is always more to it than it being a simple “error”. Sloppiness, laziness and stupidity are never the answers, provided you’re interested in understanding language as opposed to merely denouncing what breaks a perceived idealised dogma.

The language people use in their daily lives could do with a lot less respect for normative standards. Correcting is completely superfluous in ordinary, everyday language use, mainly because linguistic errors are typically self-correcting, be they grammatical errors, typos or mispronunciations. If they weren’t, they wouldn’t be spotted. And when they do create genuine ambiguities, it is always better to ask for clarification than to try to correct them. Communicatively speaking, that is a much more productive solution.

I think people pay way too much attention to spelling in their writing. There’s no harm whatsoever in allowing a freer, more liberated spelling in ordinary language use, especially in personal letters, emails, internet forums, etc. There are admittedly some contexts where a normative spelling is preferable, and where correcting (genuine) errors is a good thing. This mainly concerns educational settings where language is either being taught as a subject or forms part of the curricular activities (e.g. essay writing). Official texts, regulations and legal contracts may also benefit from fixed spellings, but newspapers and prose publishers have hardly any reason to abide by dogmatic spelling conventions.

Linguistic errors (real or imagined) are no more harmful to the English language, or even to communication in general, than picking up the wrong fork at a fancy dinner party. That is, it may jar the (over)sensibilities of some snobs, but ultimately has no effect beyond that. To me, language snobs (prescriptivists, purists, the Grammar Police) are like extreme creationists. Instead of observing and understanding what language really is, they choose to believe in some sort of mythic ideal (Standard/Proper English) which they use as a holy dogma, especially when judging and denouncing the behaviour of other people. They then vilify whatever behaviour they perceive to break the rules of their interpretation of this revered dogma.

Virtually every complaint about “bad language” is nothing more than just another stick to beat other people over the head with in order to feel superior. It is a behaviour not very different from school-yard bullies who point a finger at the kid who dresses differently or talks with a lisp. Unfortunately, many complaints (be they valid or not) seem to rest on an underlying rationale that goes something like “Yes, I can understand what you’re saying, but I don’t want to, so it’s your fault!”

That’s a shame.


References

Aitchison, Jean. 1991. Language change: progress or decay? 2nd edition. Cambridge University Press.
Andersson, Lars-Gunnar & Peter Trudgill. 1992. Bad language. London: Penguin Books.
Brook, G.L. 1978. English dialects. 3rd edition. London: Andre Deutsch.
Coates, Richard. 1989. A solution to the ‘must of’ problem. In: York papers in linguistics, v. 14, p. 159-167.
Crystal, David. 2008. Txtng: the gr8 db8. Oxford University Press.
D’Arcy, Alexandra. 2007. Like and language ideology: disentangling fact from fiction. In: American speech, v. 82, p. 386-419.
Foster, Brian. 1970. The changing English language. Harmondsworth: Penguin Books.
Janson, Tore. 2002. Speak: a short history of languages. Oxford University Press.
Jespersen, Otto. 1933. Essentials of English grammar. London: George Allen & Unwin.
Jespersen, Otto. 1938. Growth and structure of the English language. 9th edition. Oxford: Basil Blackwell.
Mattson, Jenny. 2009. The subtitling of discourse particles: a corpus-based study of well, you know, I mean, and like, and their Swedish translations in ten American films. PhD dissertation. University of Gothenburg.
Todd, Loreto & Ian Hancock. 1990. International English usage. London: Routledge.


Links

David Crystal’s blog
Language log
Urban dictionary
World Wide Words

Is illegal copying of software theft?

If I take a DVD with a piece of software on it, without the owner’s approval, then everyone would agree that it is an act of theft. But if I only copy the disc’s content, or download it over the internet, and thus do not deprive the owner of any physical object, is it still theft?

The most basic understanding of theft would be when you take possession of someone’s physical property without that someone’s approval, be it a car, a wallet, or some jewels. Prototypically theft involves touchable things, but it can also extend to nonphysical things. For instance, you can illegally empty someone else’s bank account without actually moving any physical coins or money around. It would still be theft, even though no physical objects are involved. In both cases, you would have deprived the rightful owner of something of value.

But can the concept of ‘theft’ be extended to illegal copying and downloading of software? Clearly you have not deprived the owner of anything physical or even digital. You have merely copied it. The software itself is still there in its original place, so how can it be theft?

If I steal a physical DVD with software on it, it is not the disc itself I want. It’s the content of the disc that I want. If I illegally copy or download it, I’m after the same thing. The fact that I’m not taking the software with its physical container/carrier seems irrelevant to me. I have illegally transferred something into my possession that doesn’t belong to me. I have thereby also unduly benefited from someone else’s property. I have infringed on the legal owner’s right to control its distribution. Does this amount to theft? Instinctively I would say yes, it does.

Now, I can understand if people object to this. It’s common to treat words and their meanings as fixed points in the universe. If you have a fixed concept of the word ‘theft’, and try to apply that to illegal copying/downloading, then you would naturally conclude that illegal copying is not an act of theft because you’re not depriving the owner of the thing you’re making a copy of.

But words and meanings are not fixtures. Nor should they be treated as such. The world around us changes all the time, and so we must constantly re-negotiate our vocabulary to match it. Otherwise our language would eventually be useless.

The meaning of ‘theft’ relies on (at least) three concepts, namely, property, ownership and possession, as well as on how those concepts are transferred between keeper and taker. When the idea of theft was originally thought up (an occasion now long lost to history), there were no digital products around. Now there are. I can have ownership and possession of a physical thing like a car, and I can have ownership and possession of a digital product like a piece of software or a digital recording.

A piece of software cannot normally change hands in a physical sense; it can only be copied. That is, while you can transfer the ownership of software, you cannot physically transfer the property itself. You can copy it and then delete the original, but unless you transfer the software’s physical carrier/container, the software by itself cannot be handed over.

If the concept of ‘theft’ depends necessarily on the illegal transfer of the property itself, it should by implication never be possible to steal digital products. To me, there’s something wrong, and obsolete, about that. In principle, anything that can be possessed can also be stolen. It really isn’t that much of a stretch to re-think the idea of ‘theft’ to include illegal copying/downloading. We need to focus on theft as an act of illegally taking possession of a property, and only that. The physical transfer of the property itself does not have to be involved.

I should perhaps emphasise that I’m not talking about the legal definition of ‘theft’ here. I’m trying to understand a colloquial usage of the word ‘theft’, in particular my own. And to be quite frank, I’m not even sure that I’m all that categorical about it. Perhaps we do need a new word for this. I guess my only point is that it’s at least not impossible to think of illegal copying/downloading as an act of theft.

Animated logos and other stuff on DVDs

Like most people, I am grateful for all the magnificent DVD releases we have seen over the years. It is absolutely wonderful to have home access to all kinds of great movies, TV shows and music concerts, and most of them in very good quality, too. I have a lot of appreciation for the people and companies involved in the making of those DVDs, which often come with a whole bunch of interesting, high-quality extras.

Having said that, I have one issue with them. It concerns most DVDs on sale, and it’s all those stupid things you have to sit through to get to the good parts. Sometimes the DVD starts with a copyright notice that locks up your DVD player. At other times, it happens when you press “play”. Although I can understand why they think they must include it, in reality it’s utterly pointless. Those who pirate DVDs are already aware of the fact that they are doing something illegal. They don’t need to be told that. It might as well say “Don’t feed this to your cat”.

But my gripe is not with the copyright notice. It’s all the other idiotic crap I have to sit through. Sometimes it can get pretty crowded at the front of a DVD. First there are animated logos for all the production companies involved (usually several). Then there’s a short movie telling me how bad stealing is, followed by trailers for other DVDs*, after which I often get some kind of animation leading to the DVD menu. If you’re really unlucky, you’ll get the DVD from Hell which contains all of them.

OK, so why not just skip them? Unfortunately, very often this useless stuff cannot be skipped or even fast-forwarded. Somewhat luckily, with some DVDs, you can pop one in, go out for a cup of coffee or a beer, let it run all the annoying stuff on its own, and when you return, the menu is nicely displayed on the screen. But quite often you cannot do even this. Quite a number of DVDs stop at various points, forcing you to choose “enter” and “language”, so you’re stuck sitting there after all. On more than one occasion, I’ve actively avoided DVDs I know have these forced stops.

It’s perhaps OK to sit through it once, or even twice, but enough is enough. It’s like I’m being punished for being a paying customer. Indeed, all those forced logos, messages, trailers, and the annoying stops do nothing but create bad blood between the DVD makers and their customers.

(* = I have to point out that I actually appreciate the trailers, but only if I can choose to watch them in my own time. I don’t want to be forced to watch them. My living room is not a cinema. Watching movies on a DVD is not the same social event as “going to the movies”, where the trailers function more naturally as a warm-up to the main event.)

Does science need theories anymore?

I just read a somewhat flawed paper at the Edge entitled “The End of Theory”, written by Chris Anderson.

In short, the paper argues that we will no longer need theories in science, because we have Google. We can now use computers to look at massive amounts of data and have them detect patterns for us. From this, Anderson draws the slightly irrational conclusion that we need no theories.

Says Anderson:

Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot …

There’s no reason to cling to our old ways. It’s time to ask: What can science learn from Google?

Anderson’s somewhat fallacious claim is that we don’t need scientific theories or models, since Google will give us all our answers anyway. However, what Anderson is talking about is nothing new. He is merely describing the first step in a long-established scientific method called induction, or “data-to-explanation”. In its modern form it has been around since at least Francis Bacon (late 16th century), who is sometimes referred to as the father of scientific induction.

Amputating the inductive method by removing the explanation part (the model, theory) is not the way to go, as then we would effectively be entering a stage of scientific stagnation. It would be a job half-done (to some degree even pointless) for a scientific endeavour to collect data and establish patterns and not try to explain why the patterns are there. The explanation part is essential if we want to understand *why* the patterns exist, and for that we need models. The models need not be established beforehand, of course (even though analysing data without some prior theory is virtually impossible). Finding patterns in data can be, and often is, a perfectly valid impetus for developing (new) explanatory models.

What Google, and the like, does is offer us new methods of handling much larger amounts of data than has been possible before. With Google, we can find new, previously undetected patterns, some of which our existing theories cannot predict. These, in turn, will create a need for new explanations and new theories. Hence it is more likely that Google will foster even more theories and models, not fewer.

There are no ethnic conflicts in Africa

Why is it that people persist in labelling African conflicts “ethnic”?

Virtually any African conflict is immediately labelled “ethnic” (or worse, “tribal”) without any deeper consideration of the actualities involved. The rioting in Kenya following the elections in late 2007 was referred to as ethnic. The Rwandan Genocide is still often referred to as an ethnic conflict, even though there were no ethnic groups involved. But hey, it’s Africans. And they’re fighting. Ergo it must be ethnic/tribal.

It is true, of course, that some people invoke ethnicity as some sort of justification when making decisions (political or otherwise), or when applying simplistic (often post-hoc) explanations to otherwise complex social issues. Politicians are usually guilty of the former, journalists of the latter.

But the fact that there are ethnic groups in Africa (like anywhere else in the world, incl. Europe) does not justify labelling African conflicts ethnic. Ethnicity is/was not a causal (or even relevant) factor in Rwanda, nor in Kenya, nor in Sudan, nor anywhere else in Africa. When it comes to conflicts, ethnicity is, and always has been, a pseudo-issue.

Conflicts in Africa, just like everywhere else in the world, have their roots in issues dealing with access to power and resources, not people’s ethnicities. It would be much more apt to use labels like “political” or even “social” instead, in as much as any single label can be valid.

Labelling African conflicts ethnic is not only wrong. It is evil. It perpetuates harmful misconceptions about African conflicts in particular, and Africa and Africans in general. It makes most people think of African conflicts as if they are some sort of spontaneous, inexplicable outbursts of violence born out of age-old “ethnic rivalries”. This image stems from old colonial desires to impose “proper” law and order in Africa. Fighting is just something they do down there. They can’t help it. It’s part of their culture. Why else would people refer to the conflicts as “ethnic”?

By comparison, the different sides in the Northern Ireland conflict divide themselves (largely) along religious lines. You have the Protestants on one side, and the Catholics on the other. Does that make the Northern Ireland conflict a religious conflict? Is religion a key factor in the conflict? Is religion a causal factor? Does the Northern Ireland conflict become more understandable if we think of it as a religious conflict? No, of course not. There are religious components involved, sure, but religion is not causing the conflict, nor is it a particularly important factor. It would be simplistic and irresponsible to reduce the Northern Ireland conflict to a matter of religion.

Similarly, African conflicts cannot be reduced to a matter of ethnicity. Ethnicity is not a causal factor. It’s intellectually irresponsible to single out ethnicity as an all-important factor in African conflicts. It’s not. It’s about politics, power, resources. Referring to African conflicts as ethnic is just as inaccurate and irresponsible as labelling the Northern Ireland conflict religious.