The use of fraud in conflict

“Force, and Fraud, are in warre the two Cardinall vertues.” When Thomas Hobbes wrote that, he was not thinking of traditional moral virtues (which he thought only exist in society, after individuals have agreed to observe rules). Rather, he was referring to the strengths or skills that are most effective in achieving goals. In a conflict, each side can overwhelm with force, or undermine with deception. And in any real conflict each side always uses both.

It seems to me that the power of fraud has been greatly underestimated in recent times, despite playing an increasingly important strategic role in determining the outcome of conflicts. Of course, deception has always been used in war, from the Trojan horse to the “man who never was” (i.e. the dead body carrying fake invasion plans dropped overboard for Nazis to find). And propaganda has long played a role in persuasion, so much so that the word ‘propaganda’ is often used as a synonym for misinformation.

But the rise of mass media news sources (TV, internet, etc.) in the late twentieth century opened new vistas and opportunities in the old art of wrong-footing the enemy. In particular, it has enabled parties to a conflict to harness the widespread politically correct assumption that “the people” can do no wrong, because “the system” is always at fault.

In recent years, “playing the victim” has become one of the most effective weapons in the ancient arsenal of fraud. A typical example might involve a confrontation between protesters and police, which can be understood in game-theoretic terms. The “game” for police is to beat up protesters, and then to lie about it. The “game” for protesters is to get beaten up by the police, and then to make a great public display of their victimhood for the media. Please note that both sides hope to gain from their respective “game”, as often happens — and is often overlooked — in conflict situations. Many conflicts resemble a private arrangement between sadist and masochist, but underneath the cooperation is the serious purpose of defeating the enemy. As in a round of poker, only one player can win, but the other players stay in the game as long as staying in looks advantageous to them.
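
To make the game-theoretic reading concrete, here is a minimal sketch of a “game” in which each side does best by playing its part in the confrontation. The strategy names and every payoff number are illustrative assumptions of mine, chosen only to show how a clash can be the stable outcome; nothing here is claimed in the passage above.

```python
# A purely illustrative payoff table for the confrontation described above.
# The strategies and the numbers are invented assumptions, not real data.

POLICE = ["crack down", "hold back"]
PROTESTERS = ["seek confrontation", "disperse quietly"]

# payoffs[(police_move, protester_move)] = (police_payoff, protester_payoff)
payoffs = {
    ("crack down", "seek confrontation"): (3, 3),  # force used, victimhood televised
    ("crack down", "disperse quietly"):   (2, 0),
    ("hold back",  "seek confrontation"): (1, 1),
    ("hold back",  "disperse quietly"):   (2, 0),
}

def pure_nash_equilibria():
    """Outcomes from which neither side gains by unilaterally changing strategy."""
    results = []
    for p in POLICE:
        for q in PROTESTERS:
            police_pay, prot_pay = payoffs[(p, q)]
            police_content = all(payoffs[(alt, q)][0] <= police_pay for alt in POLICE)
            protesters_content = all(payoffs[(p, alt)][1] <= prot_pay for alt in PROTESTERS)
            if police_content and protesters_content:
                results.append((p, q))
    return results

print(pure_nash_equilibria())  # [('crack down', 'seek confrontation')]
```

With these made-up numbers, the only stable outcome is the one in which the police crack down and the protesters court the crackdown: exactly the “private arrangement” described above.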

I have used the example of police-protester brutality because we all see this sort of thing in the media quite often. And please be clear that I regard police brutality as a completely unacceptable use of force, and police denial of brutality as a completely unacceptable use of fraud. But I also regard the deliberate attempt to make a show of one’s own victimhood as an unacceptable use of fraud.

On a larger scale, beyond the everyday spectacle of conflict between police and protesters, there is conflict between nations and ethnic groups. Here too each side typically tries to present itself as the underdog, as long as there is strategic gain in doing so. In general, such gains are possible as long as the conflict takes place under the eyes and auspices of a larger “community” of observers, such as the international media or bodies like the United Nations. Where there are no such observers or agents, no such gains are to be made. I would argue that this explains why there was remarkably little victim-stancing during the Second World War, despite the fact that there were so many actual victims.

With the advent of so many new direct forms of communication, I wonder if the tide is turning. In the present century, the increasing use of peer-to-peer rather than top-down news feeds (via Twitter and the like) can “exclude the middleman” of mass media news sources. Eventually, this may do the great good of making news reporters and cynics of every one of us.

“All they that take the sword shall perish with the sword.” And they that live by the lie shall be disbelieved, because they play fast and loose with their own credibility.

Liberalism and conservatism: we need both

Properly speaking, a liberal is someone who regards freedom of the individual as the main — or only — social good. In other words, liberals think the purpose of government is to protect the freedom of individuals. Liberals understand freedom “negatively”, as the absence of external obstacles to doing whatever you want. This concept of freedom was famously expressed by Hobbes: “a free man is he that in those things which by his strength and wit he is able to do is not hindered to do what he hath the will to do”. In other words, according to the liberal tradition of Hobbes, you’re free as long as nothing external is stopping you doing what you want to do. According to the contrasting “positive” concept of freedom, being truly free means something like “wanting the right things”, if that makes any sense. To find out more, try reading Rousseau.

Since thinking freely, thinking aloud, gaining knowledge, creating and enjoying art, observing a religion, and so on are so vital to human life, liberals strongly support freedom of thought and expression. Anyone hostile to the expression of “impure” thoughts cannot be a liberal in the proper sense of the word. Limiting the expression of such thoughts or enforcing rules against “offensive” speech are the activities of anti-liberal puritans rather than liberals.

Typically, liberals think “freedom for the pike is death for the minnows”, so protecting the freedom of minnows entails putting limits on the freedom of pikes. This can emerge in fiscal policy, for example, when liberals demand heavy taxes of richer individuals to provide free education or health care for poorer individuals. So liberals tend to be “left wing”, although clear-thinking liberals do not regard equality as valuable in itself. Ironing out some extreme inequalities is more the unintended by-product of enhancing the freedom of poorer individuals, which only comes at the cost of limiting some freedoms of richer individuals. This is only justifiable as long as there is an overall enhancement of freedom.

So much for liberalism. What about conservatism? Properly speaking, a conservative is someone who takes a cautious approach to political change. Conservatives do not oppose change per se: in Burke’s words, “a state without the means of some change is without the means of its conservation.” Conservatives often welcome newer arrangements, but they always strive to keep what we know works tolerably well (or even tolerably badly, as long as it’s tolerable). Conservatives are above all sceptical pragmatists rather than convinced radicals. They don’t think anyone knows enough about society to justify root-and-branch reform of the sort attempted in political revolutions. These revolutions are usually bloody, and the changes they bring are rarely for the better, although the people who bring them about always write about them in glowing, euphemistic terms.

It is a pity that so much political discourse assumes liberalism and conservatism are opposed to each other. Liberals and conservatives are often supposed to be rivals, or even enemies. But really, they just emphasize one or the other of two essential components of thought. The necessity of both components is perhaps most obvious in the growth of scientific knowledge. On the one hand, we need new ideas and playful thinking; on the other hand, we cannot accept ideas that contradict what we believe already. So science proceeds by means of both a free-thinking liberalism and a cautious conservatism. Science needs sometimes-nutty experimentation and wild theorizing, as well as careful observation, testing, and the judicious application of logic. The latter essentially involves accepting a new idea only if it fits with what we know works already. This insistence on a good “fit” gives us a preference for modest rather than extravagant theory, simple rather than complicated theory, and so on.

Interestingly, this conservative principle in science is an essential ingredient in the lead-up to a scientific revolution. Over the course of a theory’s growth, anomalies build up, like pressures between tectonic plates, to the point where “something’s gotta give” and an intellectual earthquake occurs. Those pressures can only build up where change is resisted — in Thomas Kuhn’s terminology, where a “paradigm” is clung to, often with grim, partisan determination. In the end, change cannot be avoided any longer, and it is dramatic and precipitous. Scientific revolutions are often bad-tempered, but they are never literally bloody like political revolutions, and they nearly always do a lot of good, because they involve the adoption of completely new and better ways of looking at things. No one loses their life in a scientific revolution, although some lose their jobs.

One of the most important aspects of conservatism is its rejection of “foundations” or basic principles. Conservatism is thus opposed to radicalism rather than to liberalism, radicalism being the view that the goodness or badness of a political system depends on a small set of basic principles. Typically, political radicals think “the system” is irredeemably flawed because it has some basic injustice built into its foundations. The cure, radicals suppose, is a revolution that will change everything from the bottom up, starting at the foundations and going all the way to the top of the edifice. Alas, human nature being what it is, radical attempts to change everything usually end in “meet the new boss / same as the old boss”.

The dominant conservative metaphor for knowledge (political or otherwise) is not that of a building resting on a foundation of basic principles, but rather of a web anchored at various points to the real world. These points inevitably change as circumstances change, and the web itself is an “organic”, evolving thing. The repairs are ongoing, piecemeal, and have to be made without “stepping off” the web. The role of circumstances is important here. Conservatives appeal to particular circumstances rather than to general principles when trying to work out what is for the best. As Burke put it: “Circumstances (which with some gentlemen pass for nothing) give in reality to every political principle its distinguishing colour and discriminating effect. The circumstances are what render every civil and political scheme beneficial or noxious to mankind.”

There seems to have been a remarkable mutual admiration and intellectual agreement between Burke and Adam Smith. Both had a wide liberal streak, as do almost all present-day libertarians. However, something intrigues me in present-day libertarianism, which often appeals to basic principles. Although present-day libertarians often describe themselves as “conservative”, I suspect that many of them are more radical than conservative. The urge to change the entire system from the bottom up isn’t just found among “anti-capitalist” socialists, but also among some “pro-capitalist” anarchists!

Are skeptics nutters?

The sudden rise of a new Republican US presidential candidate — Michele Bachmann, “the thinking man’s Sarah Palin” — has prompted a predictable wave of claims that she is a nutter. She must be a nutter, apparently, because she has very strange views about homosexuality, she claims to have fully reared 23 foster children, and she believes in neither evolution nor manmade global warming.

Well, yes, I agree she is a nutter. Her wonky homophobic attitudes and wonky fantasies about being a Mia-Farrow-type “hypermom” point to that. But skepticism about evolution and skepticism about manmade global warming are very different. The first is driven by religious belief, and depends on appeals to authority. The second is driven by deeper doubts about methodology, and depends on a rejection of authority.

Although no one had a full theory of evolution till Darwin described the mechanism of natural selection, most intelligent non-religious thinkers have accepted for centuries that some form of evolution is the only reasonable alternative to the biblical account of creation. And the latter doesn’t really count as an explanation of life and diversity, because the life forms that God is supposed to have created must have existed as ideas in His mind before He made them physical realities. The trouble is, that leaves us with the more intractable problem of explaining the contents of God’s mind, which has to be every bit as complicated and mysterious as the things it designed. So, far from explaining something complicated and mysterious in terms of something less complicated and less mysterious, we instead have to appeal to something even more complicated and more mysterious. That is no explanation! It is so obviously not an explanation that the only obvious alternative — some sort of evolutionary theory — has been the “default” position for centuries.

Darwin’s theory of evolution fits the general pattern of scientific discovery. Something stands in need of explanation, and someone comes up with some hypotheses that explain it. Together, these hypotheses have various observational consequences — in other words, they imply that we will be able to see various things happening — which indeed are actually observed to happen. The theory of evolution departs slightly from the general pattern of science by relying less on testing and more on explanation. But testing and explanation are logically quite similar. In the first case, something has not been seen yet, and we have no reason to expect it. But the theory predicts it, and if it is indeed seen as predicted, the theory passes the test and gives us a reason where formerly we had none. In the second case, something has already been seen, but we don’t understand it — in other words, it surprises us. The theory gives us a reason not to be surprised by it. In both cases, we start off without reasons, and hypotheses supply us with the reasons we were without. In both cases, guesses enable us to “fill in the gaps”.

A lot of Americans say they don’t believe in evolution. Having met a fair number of American students, I’m not sure we should take all such claims at face value. A tongue-in-cheek contrarianism is quite common in America, at least in American universities. Students know their teachers and the mainstream media think they are “supposed to” believe in evolution, and setting themselves against all that adds a touch of contrarian “spice” to their self-image. So we mustn’t think America is wall-to-wall with people who reject evolution. Having said that, of course many do sincerely reject it, and those who do are generally very religious. They take the word of the scriptures as authoritative. That is an appeal to authority.

The theory of catastrophic manmade global warming is quite different from Darwin’s theory of evolution. It doesn’t itself explain anything, but instead takes as its starting point an uncontroversial part of mainstream physics (i.e. the greenhouse effect). Using computer modeling, it makes a one-off prediction that we are all headed for catastrophe. That is prediction, but it isn’t prediction used as a means of testing the theory. In fact this theory pretty much dispenses with explanation and testing altogether, substituting instead the common but mistaken idea that empirical knowledge rests on a “foundation” of “data”, as if observation comes first and theoretical knowledge is “based” on it. But that is not the way of science. It is not even the way of most everyday knowledge of the world. Observation suggests hypotheses, and tests hypotheses, but it just cannot imply hypotheses.

So why should anyone believe this body of supposed knowledge that so signally fails to fit the general pattern of scientific discovery? — Because we are “supposed to”. The authorities tell us to. These authorities include pillars of the establishment such as Her Majesty’s government, Her Majesty’s loyal opposition, the BBC, the Archbishop of Canterbury, the Guardian, and the rest of them. But above all we are supposed to believe “the scientists”.

Examine their methodology, and you will see that they are just people who call themselves scientists.

Science and guessing

The embarrassing lack of success of German epidemiologists in locating the source of a deadly strain of E. coli bacteria seems to be the latest in an ever-advancing, ever-widening front of “scientific” failures. Most of these failures stem from downright bad methodology. (Perhaps this doesn’t apply to German epidemiology, but we’ll see…) This methodology is so bad that I don’t think we should count the disciplines that employ it as sciences at all. These are not “scientific” failures so much as pseudoscientific comeuppances.

At the mention of bad scientific methodology, many will now expect me to start complaining about a lack of “rigour”, or too great a reliance on “mere speculation”, a lamentable “lack of proof”, an insufficiency of the “data” needed to “ground” theory, and all that sort of thing.

Well, those expectations are completely misplaced. It’s exactly the opposite. If you, dear reader, were expecting all those familiar-sounding complaints, you are probably guilty of the same assumptions that underlie the bad methodology I mean to criticize.

The main mistake is to think that science is wonderful because it’s certain. That could not be more wrong. We never have certainty, and science is wonderful not because it’s even remotely certain, but because it’s penetrating. It draws back the curtain on the nature of reality by giving us explanations, telling us what the real world is made of, and how it works. Understandably enough, science buys these powers of penetration at a price: it’s often very unsure. It has to be unsure, because it’s guesswork. And it has to be guesswork, because scientific theories describe things we can’t see directly, such as electrons, force fields, and viruses. The best we can do is guess at the nature of things we can’t see directly, and then indirectly check our guesses by looking at what we can see directly.

The method of genuine science is thus guesswork combined with testing. Our guesses are tested by checking to see whether the observable things the guesses say should occur are indeed observed to actually occur. The fancy word for a guess is ‘hypothesis’, and the logic of the testing of a hypothesis is well understood. A hypothesis passes a test if one of its observational consequences (i.e. something it predicts will be observed) turns out to be true. It fails a test if one of its predictions turns out to be false. The more tests a hypothesis passes, the better reason we have to think it’s true, but it never becomes anything better than an educated guess.
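
As a bare-bones illustration of that logic, here is a sketch in which a hypothesis is just something that makes an observational prediction, and a test is a comparison of that prediction with what is actually observed. The example hypothesis and the numbers are invented purely for illustration.

```python
# A bare-bones sketch of the guess-and-test logic described above.
# The example hypothesis and the observations are invented for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Hypothesis:
    name: str
    predict: Callable[[dict], bool]  # what the guess says we should observe

def test(hypothesis: Hypothesis, observation: dict) -> str:
    """Pass if the predicted observation is borne out, fail if it is not.
    Passing never proves the guess; it only gives us a reason to keep it."""
    return "passes" if hypothesis.predict(observation) else "fails"

# Illustrative guess: the new fertiliser raises crop yield.
fertiliser_guess = Hypothesis(
    name="fertiliser raises yield",
    predict=lambda obs: obs["treated_yield"] > obs["untreated_yield"],
)

print(test(fertiliser_guess, {"treated_yield": 5.2, "untreated_yield": 4.1}))  # passes
print(test(fertiliser_guess, {"treated_yield": 3.9, "untreated_yield": 4.1}))  # fails
```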

People who yearn for certainty don’t like that. It makes them uncomfortable, because they think there shouldn’t be any guesswork in science, and they don’t like the idea that scientists are “to blame” for guessing wrongly. In their flight from uncertainty and culpability, these people wrongly suppose that science follows a rigorous methodology in which “data” are collected first, and then hypotheses are somehow distilled afterwards from them. In other words, they assume we start off with observations, and by diligently following a plodding, painting-by-numbers sort of method, we arrive at theory. Hence their use of phrases like “theory is supported by data” or “hypotheses rest upon observations” — meaning that theories/hypotheses are implied by data. But that is wrong. In genuine science, theories/hypotheses are tested against actual observations because they imply observations that can be made, rather than being implied by observations that have already been made. These people have things backwards. Backasswards.

Underlying this error is a more widespread philosophical error called “foundationalism”. Foundationalism is the idea that empirical knowledge rests on (i.e. is implied by) a “foundation” of more secure beliefs, in much the same way as mathematical theorems rest on (i.e. are implied by) axioms. Typically, these foundational beliefs are supposed to be beliefs about the qualities of our own conscious experience. It’s too long a story to relate here, but foundationalism is just plain wrong.

In many supposedly “scientific” disciplines — climate science, for example — “data” are collected in the hope that they can be shown to imply theory. Where there are no actual data, “proxy data” (often called “proxies”) are cooked up (i.e. dishonestly conjured up) in the hope that these will work instead to “fill in the holes” of the imagined “foundation” for the theory. The climate record of the past — consisting mostly of these fake “proxies” — is supposed to tell us how the future climate will unfold. This is like listening to the first movement of a symphony in the hope that it will tell you how the second movement will go. Actually, it’s worse than that: it’s like listening to a classroom of Alfred E. Neumans playing their recorders, in the hope that it will enable you to write the second movement of Mozart’s Clarinet Concerto.

You can tell how hopelessly confused this methodology is from claims that scientists are “90% certain” of this or that, as if the theory told them so. But no scientific theory ever tells us how much we ought to believe anything.

Or again, in medicine — where patients, doctors and researchers are especially uneasy about uncertainty and fearful of guessing — there is a thriving industry of statistical “studies”, which are conducted to record correlations, which are then used as a basis for extrapolation. But this is again wholly misguided, as the endless succession of contradictory “studies” reported in newspapers illustrates.

What epidemiologists should be doing — and I suspect have not been doing, but we’ll see, won’t we? — is guessing what might be the source of recent E. coli infections. Then they should test each guess to eliminate it from their inquiries, if it can be eliminated. If a guess can’t be eliminated like that, in effect it passes a test, and it begins to look more like a “suspect”. To avoid guesswork by diligently noting correlations and extrapolating is to write a recipe for more death.
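
A toy sketch of that eliminative procedure follows. The candidate sources, the patients and what they ate are all invented for illustration, and this is not a claim about what any real investigation did; the point is only the shape of the method: guess, then try to eliminate.

```python
# A toy sketch of guess-and-eliminate inquiry: propose candidate sources,
# then try to knock each guess out against the available evidence.
# The candidates, patients and foods below are invented for illustration.

candidate_sources = ["cucumbers", "bean sprouts", "ground beef"]

cases = [
    {"patient": "A", "foods_eaten": {"bean sprouts", "cucumbers"}},
    {"patient": "B", "foods_eaten": {"bean sprouts", "ground beef"}},
    {"patient": "C", "foods_eaten": {"bean sprouts"}},
]

def survives_elimination(source, cases):
    """A guess is eliminated if some infected patient never ate the suspect food."""
    return all(source in case["foods_eaten"] for case in cases)

suspects = [s for s in candidate_sources if survives_elimination(s, cases)]
print(suspects)  # ['bean sprouts'] -- the guess that has so far passed every test
```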

“Intellectual property”: time to move on

It’s time to leave the concept of “intellectual property” behind, and substitute a much more robust concept of “authorship”.

It’s easy to understand why we have the urge to treat ideas as property: we are a property-owning animal. This has contributed to our evolutionary success in all sorts of ways. For example, we can eat a much wider range of food than other animals, because we are able to cook it, but we can only cook what we can put down for a few minutes while we fire up the barbie. Chimpanzees cannot do that, because chimpanzees run off with anything left unattended. They don’t understand or respect the concept of property.

Our property-owning nature really comes into its own when we exchange things. In the old days, people exchanged bits of physical stuff like gold. But as time passes and technology advances, we increasingly exchange “information”, i.e. physical patterns (i.e. what ideas are made of) rather than stuff. The important difference between them is that patterns can be reproduced, often at negligible cost, but stuff can’t be reproduced (except in science fiction).

If one aspect of property is buying and selling, the other aspect is theft. When something is stolen, the thief gains what the owner loses, pretty much. For example, if I have a bicycle and you steal it, then you end up with a bicycle and I end up without one. But that is where patterns/ideas really differ from exchangeable stuff.  Most people who have a pattern/idea want to share it rather than keep it to themselves. When a pattern/idea is adopted and properly acknowledged, its author is usually proud to have his genius recognized. Even if he is the victim of unauthorized use of his pattern/idea, what he loses and what the perpetrator gains usually have very different values. Sometimes the victim loses more, sometimes less; sometimes a “victim” can even gain from the so-called “theft” (for example, through piracy his software might be more widely used, thereby increasing legitimate commercial usage; or his reputation as an artist might expand).

I think we need updated laws that protect authorship of patterns/ideas instead of “ownership” of such things. That way, the link between an author and his work can be protected properly, and we can understand the link as gradually weakening over successive generations or repeated reproduction.

We have to change our way of thinking, because publishing is increasingly becoming a matter of “making patterns available” rather than “selling stuff”. And computer piracy (of software, music, books, etc.) is increasingly becoming an unavoidable fact of life. We must be able to take effective steps against serious commercial piracy, as well as distinguishing it from minor non-commercial piracy that if anything helps to promote ideas. Home taping did not “kill music”, as consumers were warned in the 1980s, and the present-day equivalent probably isn’t going to kill anything either.

It will take time to adapt, but we may as well get started by admitting that big conceptual changes are on the way. These changes are not wholly unprecedented: the invention of moveable type, public libraries, etc. were steps in this inevitable direction, which met resistance in earlier times.

So instead of treating a pattern/idea as a special sort of property, and its author as the owner of property, I suggest instead that we should treat the author as a source. Usually, the pattern/idea that we have hitherto treated as “stolen” from an “owner” is in effect plagiarized from the source. Plagiarism is a failure to acknowledge authorship rather than failure to respect ownership. The losses an author suffers are quite different from those of simple theft: his reputation suffers, or at least it is not enhanced as it would otherwise be; and he is deprived of a profit that he might not have made in the first place. But that is merely “potential” profit (as well as being comparatively small). If, as a matter of fact, he would not have made that profit, then he simply does not lose it. (This is a tricky sort of fact because it has to be expressed as a “counterfactual conditional”.)

Recent technological advances have made it much easier to detect plagiarism. But interestingly, those who stand to lose or gain most through the detection of plagiarism — academics — seem largely to prefer the familiar old conceptual framework of ownership. One wonders whether a much larger proportion of academics are guilty of plagiarism than is generally thought!

The supposed “ownership” of patterns/ideas harms education in all sorts of ways. Students lose most in the current system, which seems designed to protect the careers and reputations of their (sometimes plagiarist) teachers. For example, academics routinely re-cycle their own writing (or someone else’s writing) and then, under the guise of “owning” the copyright to what they have written, in effect prevent people from reading it and judging it on its own merits (or demerits). Often, it goes into a journal that only a university library can afford to “buy”, and then only a handful of people read it or discuss its contents. Next, the “owner” of these supposedly “original ideas” (translation: unchecked ideas) presents an academic selection committee with all the “real estate” he has produced over the course of his career, as if this “real estate” were original thought.

That sort of thing has to end. It’s dishonest. People who want to be read can nowadays make their writing easily available to anyone who wants to read it. And they can make their authorship plain, and have it protected in an appropriate way. We are really very lucky to live in an age in which that has become a reality, at last.

Holy crap

Four horrible habits of religious thinking:

  1. The habit of thinking that culpability is inherited (the doctrine of “original sin”).
  2. The habit of thinking that right/wrong is essentially a legal matter (so that morality is determined by a “higher law”).
  3. The habit of thinking that consequences count for nothing — even very bad, foreseen consequences count for nothing — because good intentions are all that matter (the doctrine of “double effect”).
  4. The habit of treating various items as bearing such enormous symbolic significance that they are regarded as holy or blasphemous (“idolatry”).

The first of my four horrible habits is the doctrine of original sin. This is the idea that we are culpable for what our ancestors did, rather than for what we do ourselves. Now I just expressed that thought using the pronoun ‘we’, but of course it usually gets the spin that “they” are blameworthy (and “we” are blame-free). We all know how this has worked over the centuries to the great detriment of Jews in Europe. But it isn’t limited to Jews: if you are descended from British imperialists, say, you are deemed guilty of the sins of British imperialism, even if you yourself are not the slightest bit sympathetic to British imperialism.

The blame game works both ways, and it can easily turn into an “exoneration game”: if you are lucky enough to be descended from victims of colonialism, say, your inherited victimhood stands you in good stead, and any colonialist urges of your own are exonerated. Heck, it isn’t “colonialism” at all, but the “recognition of an historical right”! And so it goes, mutatis mutandis, for nearly every traditional sore spot in Europe and elsewhere: “we” are “nationalists”, “they” are “colonialists”. Four legs good, two legs bad.

This doctrine inevitably leads to racism, or at least to an unhealthy fixation on ethnicity and ethnic purity. Why? — If culpability is inherited, then the culpable ones are all related to each other by descent. They might even have their own “race” (often a made-up race). And then there is ethnic “purity”, with degrees of culpability determined by relatedness. This explains the fascist obsession with endlessly fine-grained categories and sub-categories of imagined ethnicity (the “Aryan”, the “Jewish intellectual”, the “street Jew”, etc.).

This habit of thought is so obviously backward and grossly unjust that I will not bother to criticize it; it is the core of fascist thinking.

The second of my horrible habits is the assumption that morality is essentially a legal matter, in which doing the right thing means following a “higher law”. The classic example of such law is the Ten Commandments, but there are more sophisticated versions. For example, Kant’s “categorical imperative” tells us that we only do the right thing when we follow a rule that we could want everyone to follow. Rules again. When morality is assumed to be constituted by rules like that, inevitably it begins to be understood as law like the laws written in statute books, only “higher”. For example, if a war is unjustifiable, we are told not that it is immoral, but that it is illegal. Thus moral behaviour becomes an exercise in obedience, i.e. compliance with the correct “higher” law; and moral deliberation is reduced to the study of legal orthodoxy (and the apportioning of blame for heterodoxy).

Present-day appeals to “human rights” (or “natural rights”) can be traced to this assumption that morality is a matter of following rules by obeying the “higher” law. The concept of a right is essentially legalistic, because rights are created by duties, and duties are done when rules are complied with.

This second habit merges into a third bad habit: assuming that the consequences of action count for nothing, because good intentions count for everything. According to this way of thinking, obedience is the highest thing anyone can aspire to — and obedience is a characteristic of what agents intend (rather than of what actually follows when we act). Hence the focus of attention falls on best (i.e. “virtuous”) intentions, rather than best consequences. This meshes perfectly with the traditional religious understanding of the mind as a disembodied “soul”. Souls don’t really belong to the material world, where consequences unfold, so instead of acting for the best — i.e. for the best consequences — each soul must endeavour to keep itself “free of blemishes” by always having the right intentions. Few people nowadays believe that blemish-free souls are rewarded with an eternity in paradise, of course, but the habitual idea that virtue is all about maintaining a blemish-free soul lingers on. And on.

An extreme — or extremist — version of this doctrine says that even very bad consequences of action that have been foreseen can be ignored, as long as the good consequences are what is principally intended. (This convenient division between good and bad consequences for the purpose of waiving all responsibility for the bad is called the doctrine of “double effect”.) For example, dropping a bomb on — or setting a bomb off in — a crowded public place is OK, this idea goes, as long as the main objective is something “higher”, such as self-defence or self-sacrifice. This provides a convenient rationalization of the deliberate targeting of innocent non-combatants in war (or whatever program of racial extermination is conveniently labelled as “war”).

The fourth and final horrible habit is that of “thinking in symbols”. Primitive religions tend to revolve around a range of idols and taboos, i.e. everyday things which are thought to be inhabited by good or evil “spirits”. A similar idea survives in less primitive religions, which treat particular objects not quite as being literally inhabited by spirits, but instead as having a mighty, mystical symbolic significance. Obvious examples are the “host” in Christian communion, various sacred scriptures (or condemned texts), iconic images of saints (or demons), and so on.

People who profess to be “non-religious” often retain this idea, and treat various flags, books, songs, iconic images of martyrs or “our glorious leader”, even entire fields of technology as having this symbolic significance. In grand symbolic gestures they burn flags and dummies. The gesture is everything, from parading around with giant papier-mâché heads that caricature political foes to treating the hated symbol as downright blasphemous, as something that must be rejected or ostentatiously destroyed — usually by being burned, usually amid a welter of equally symbolic banners representing the opposite “good” side. Thus ordinary empirical judgement and balance are thrown to the winds.

So much for my four horrible habits of thought. It seems to me that all of them have a clearly religious inspiration. Resistance to them has traditionally been considered heretical. For example, “Pelagians” are among the better-known heretics who rejected original sin.

And yet, despite the religious origins of these habits, not all that many religious believers nowadays exhibit them. Generally, the ones who have these habits are those who think they have moved beyond religion, and whose vigilance against their own susceptibility to religious dogma and religious habits of thought is therefore low. They have rejected belief in God, but have neglected to deal with all the extra baggage that came with God, and with human nature.

I find it remarkable and disappointing how many people smugly assume they have left religion behind, when in fact these four habits remain unshakably ingrained in their thinking.