The Samovar

On not taking a stand

A problem for progressives is that intellectuals are more (small-c) conservative than you might expect. The reason is that intellectuals are frightened of making mistakes, and of being caught making them, because of their engagement in intellectual arguments and their investment in the outcomes. Combine this fear of making mistakes with an understanding that they rarely have sufficient facts to determine the truth accurately, and the effect is that intellectuals will often not take a stand on important issues. The problem is that this leaves the field clear for those who will take a stand, or leaves the decision to those who make decisions by default, i.e. the powerful. In other words, the intellectual refusal to take a stand on issues where they are not certain amounts to a de facto stand in favour of established authorities and ways of doing things.

The danger of this passivity in response to issues that are too difficult to be sure of is that it leaves you open to a form of propaganda, or public relations strategy, that has been well known and exploited for a long time.

“Doubt is our product,” proclaimed an internal tobacco industry document in 1969. “Spread doubt over strong scientific evidence and the public won’t know what to believe.”

It is very easy to spread doubt, and often extremely difficult to prove a positive claim beyond all possible doubt.

The counter-argument says that not taking a stand is reasonable because we don’t know, and if we don’t know we shouldn’t take a stand. This seems reasonable, but in the case of issues where a political decision needs to be taken, we have to think about the effects of both possible positions we could take. On the one hand, if we refuse to take a stand we’ll never be subject to the criticism that we expressed a wrong opinion, but we’ll be letting the political decision be taken by those with established authority or power. On the other hand, if we take a stand we might be able to have some input into the political decisions, but we run the risk of being proven wrong at some point in the future.

Being proven wrong, though, is not in itself a political problem; it’s a personal problem. So, thinking of our actions as political rather than personal, we shouldn’t worry about the upset it will cause us if we turn out to be wrong. Rather, we should refuse to take a stand only if, by recklessly taking a stand without sufficient information, we might increase the chance of a harmful political decision. Going further, in most cases there is no neutral position: the lack of a position is equivalent to a position in favour of the status quo. Let’s take a look at some examples.

As quoted above, the tobacco industry’s tactic was to create doubt about the link between smoking and cancer. Although the overwhelming majority of the evidence supported the link, some evidence was found that did not. What are the effects of taking a stand in favour of or against the link? On the one hand, taking no stand means not pushing for regulation of the tobacco industry, allowing it to do whatever it wants, at the risk of many more cases of lung cancer. On the other hand, taking a stand means pushing for such regulation and reducing the number of people who smoke, at the risk of having cut into the tobacco industry’s profits for no reason if the link had turned out to be false. Given that the evidence was very much supportive of the link, the cost-benefit analysis is clear: one should take a stand in support of it.

A similar tactic is being used today by those who want to deny a link between human industrial activity and global warming. Again, the overwhelming majority of evidence is in favour of the link, but some evidence goes against it. Not taking a stand means letting things go on as they are at the moment, and putting the entire world at risk. Taking a stand, however, is also costly: it means potentially cutting back on emissions and consequently on economic growth, which also has an effect on millions of lives. In both the cases above, there is no neutral position: you have to take a position, and either position you take has potentially dire consequences if you’re wrong. In this case, not taking a position means taking a position in favour of the status quo, i.e. gambling that there is no link and that there will be no effect.

There are many more examples of this situation that crop up all the time. I have come across it in debates with people about industrial action, about alternatives to capitalism, and many more. I want to finish though with an example that is playing out right now and is causing a great deal of tension within the left: the case of the rape allegations against Julian Assange.

History has shown, including quite recent history, that governments are willing to smear their opponents with allegations of sexual misconduct. Given this, it was no huge surprise when allegations of rape against Julian Assange surfaced after the release of documents embarrassing to several governments. On the face of it, this would seem a prime case for ignoring the allegations. However, there is a problem: the downplaying or denial of rape accusations is itself a serious wrong, and indeed many articles arguing that we should ignore these allegations have been of a decidedly misogynist character. The response of some (not all) feminists against such articles is then quite reasonable, but unfortunately I feel it misses an important point about the way smears work: mud sticks.

Consider, as above, our options on taking a stand in favour of or against Assange. If we take a stand in favour of him, we support those who would release important documents revealing the way governments behave in secret, but we risk supporting someone who may be a rapist. If we take a stand against him, we show that the left will not stand by those who attempt to take on those in power, and we let those in power disarm us by means of rape accusations. Yes, of course it’s the case that he might have done it, but there is no neutral stance here: not supporting him is equivalent to saying that you will not stand by anyone who is attacked by governments in this way. And that’s true even if it turns out he did it.

Let’s take the logic a step further: what happens if we support him, and it turns out he did it? Our supporting him is a form of political support. The case against him is a legal one, and will hopefully proceed based on the quality of the evidence, regardless of our political support. If he did it, and the evidence is sufficient, he’ll be found guilty. We’ll feel bad for supporting him, but in terms of justice no harm will have been done. On the other hand, if we don’t support him, then even if he is subsequently shown to be innocent, the damage will have been done.

Given the history of sexual smears, then, we surely must support Assange in this case. And it’s important to say that this attitude is not based on poring over the smattering of details of the case that have been leaked; it does not require accusing the women involved of being CIA agents (it’s possible that they were, but even if the allegations are part of a smear campaign it doesn’t follow that the women acted in bad faith, and we don’t need to take a position on that one way or the other); and it does not require an opinion on what Sweden’s sex crime laws should be.

Finally, the argument of this post, that intellectuals are more prone to propaganda because they are afraid to hold a position that might be wrong, should be compared to an earlier article in which I argued that intellectuals are prone to propaganda because they think they understand things better than they do, and oversimplify them. On the face of it that looks contradictory, but I think it’s not: these are two types of error that are not in conflict. On the one hand, we can make the mistake of overconfidence in what we think we know and our understanding of it; on the other hand, we can refuse to take a political stand in cases where we could be proved wrong. The two can coexist: different people can make the two different types of mistake, and a single person can make them in different situations. It’s even possible to make both mistakes simultaneously: for example, someone who reads a climate change denial article, understands the valid point it makes, and then declares that there is uncertainty about global warming. They are making the mistake of overconfidence in their understanding of global warming (they’ve read only the one article and lack the expertise or breadth to weigh its point against all the other evidence), and at the same time refusing to take a political position one way or the other, thereby implicitly supporting the status quo.

Looking for the simple explanation

Intellectuals are more prone to propaganda than others.

That’s one of the claims of Jacques Ellul in his book Propaganda, and the one that got me quite excited about it. His explanation is that intellectuals want to have an opinion on every subject, they follow current events carefully, and because they’re intelligent they think they can understand what is going on. This makes them more prone to propaganda because there isn’t enough time to form an informed opinion on every subject, because following current events carefully means being led by the news agenda and investing energy in comprehending things from within that given framework, and because intelligence alone is not enough to understand complex events, which require huge breadth of knowledge and experience.

I find this idea very interesting, but there’s another aspect that I want to focus on, which is that intellectuals want to try to understand the world by simplifying it. They want to reduce complex ideas to simple models of them, and to understand them by doing so. This ties in with Ellul’s claim because if you have a simple model of the world that you think explains everything, it’s very hard to give it up. You end up reinterpreting events and facts to fit the theory rather than the much more onerous and difficult prospect of giving up the theory, which would require you to rethink the whole way you look at the world. One of Ellul’s points is that one of the two main functions of propaganda – what he calls integration propaganda – is to intensify currently existing ways of looking at the world and to turn them into actions. Integration propaganda must work better on someone who has a strong personal incentive not to give up his already existing simplified model of the world.

What I would like to understand though, is why people who seem to be intelligent, caring, and even kind, can be capable of believing things that are quite mad, and have consequences which are morally horrific. The obvious example is Nazism, but there are many less dramatic examples. Some people are not upset, for example, by the sight of a homeless person freezing on the streets during the winter.

I can see two sorts of explanation for this, at the emotional level and the rational level. I’m going to come back to the emotional part in a future post, but roughly speaking it’s something like cognitive dissonance. It’s too hard to live in the world if you are to have an emotional reaction to everything that is horrible, and so we have a strong incentive to try to see the world in a way that makes unpleasant things inevitable or out of our control. The other type of explanation is that we want to try to understand things by simplifying them, but that the world is too messy and complicated for this to really work, so we end up making the facts fit the theory.

The neoconservative economist believes that free markets are always efficient, so he sees the creation of new markets as the solution to all the world’s problems. The Marxist sees everything in terms of a dialectical process and class conflict. Both see the pain and suffering that follow from acting on these theories as necessary, and so are not shocked by them. The theist believes absolutely in the teachings of their religion, and so cannot see the human suffering that those beliefs can entail: consider the Catholic who opposes the use of condoms in Africa. The vehement atheist, on the other hand, sees that belief in God is wrong and so blames religion for all the world’s problems, blinding himself to the political or economic causes of many of them. Consequently, he can end up supporting incredibly bloody war and torture on a scale that dwarfs the Crusades, as in the case of Christopher Hitchens and Sam Harris.

The intellectual is particularly prone to this sort of thinking, because reductionism is our intellectual and cultural heritage, something intellectuals are totally immersed in. Reducing complex situations to simple models through mathematics, physics, etc. has enabled us to make enormous leaps forward in our understanding and control of the physical world. But there is no successful reductionist model of politics or of human problems. Attempting to find reductionist explanations of politics or human behaviour is a reasonable scientific endeavour, albeit so far an unsuccessful one. But believing that we are already in a position to understand people or politics so simplistically, and – worse – acting on those beliefs, is a gross intellectual error (even if an understandable one). Empiricism is incredibly hard: even trained scientists, working in much more concrete fields than politics or human behaviour, find it very difficult to separate a good scientific explanation of a phenomenon from a confusion.

We cannot wait for an empirical scientific understanding of politics. We have to try to understand the world now, and make decisions and actions based on that understanding. I think it is important that we recognise that we cannot be over-reliant on reductionist models to guide our thinking on these matters, but that leaves a huge open problem of what we can rely on. My feeling is we can use these models of the world, but we need to bear in mind that none of them have a very wide scope, and that all of them are likely to be wrong in fairly major ways. In the end, we need to rely on our essentially human judgement rather than our theories as the final arbiter of our political thinking. That doesn’t mean abandoning reason and logic, it means being committed to pragmatically training our judgement and trying to make decisions as best as possible within the limits of our ability to reason about the world. It means attempting to imperfectly understand complex situations as they are rather than perfectly understanding over-simplifications of them. It means attending to the details rather than trying to find a theory that enables us to ignore them.

This is, of course, incredibly difficult. One strategy that may make it more tractable is to hold multiple, overlapping, weakly held principles for understanding the world, rather than a smaller number of strongly held principles that attempt to explain everything in one grand scheme. Combined with this is the strategy of holding multiple views on a given situation, with varying degrees of conviction, rather than a single one. These views can even be, in fact probably must be, mutually contradictory. Again, this is not to abandon reason in favour of accepting contradiction, but to bear in mind that there are alternative views on a given situation rather than putting the alternatives out of mind. This may, of course, not be the best strategy. It would have been a bad strategy in the long term for understanding physics, for example. The test of whether it is a good strategy is whether it helps us get a better understanding of things from an empirical and pragmatic point of view. The main point is not that this strategy is necessarily the best one, but that the reductionist strategy is consistently leading us into error.

I’ll end with an example of this method applied to a reasonably contemporary political problem, the US invasion of Iraq. Before the invasion happened, there was huge debate about whether or not it was a good thing, or could be a good thing. Perhaps, regardless of the US’ reasons for wanting the war, it could have been a good thing for the Iraqi people. Well absolutely. It could, despite the hundreds of thousands of casualties, still be a good thing in the long run, although that seems a very remote possibility now. The reason I opposed it was not because I could foresee these hundreds of thousands of deaths – in fact that vastly exceeded my worst imaginings of how bad it could be – but that everything about the proposed war was dubious. The US and the UK governments lied to us repeatedly and their motives were clearly not either disarmament or helping the Iraqi people. Mostly, I felt that whether or not the war had a positive outcome would depend on the way in which it was conducted, and given that the principal agent in that clearly didn’t have the interests of disarmament or the Iraqi people at heart, I couldn’t believe that they would conduct it well. My opposition to the war was not based on predictions about what would happen, it wasn’t based on the illegality of the war according to international law (which wouldn’t concern me greatly if the war had really been a huge success for the Iraqi people), it was made in ignorance of what the US’ real motives were in the war. And yet, I believe, despite all that uncertainty and ignorance on my part, my judgement was essentially correct, and that subsequent events have shown that to be the case. You can read what I wrote about it in February 2003 here.

p.s. I’m not sure that I would recommend Ellul’s book. I haven’t finished it yet, but it appears to be rather self-contradictory from chapter to chapter and even occasionally from paragraph to paragraph.

p.p.s. When you’re reading a book about propaganda on the train, it’s weird how suddenly when you look up from it you realise that everyone around you is reading propaganda: the Economist telling you how great capitalism is; the glossy magazine telling women they have to look like these incredibly thin models; everything stuffed full of adverts, advertorials and PR-driven stories.