The Samovar

On not taking a stand

A problem for progressives is that intellectuals are more (small-c) conservative than you might expect. The reason is that intellectuals are frightened of making mistakes, and of being caught making them, a fear that comes from their engagement in intellectual arguments and their investment in the outcomes. Combine this fear of making mistakes with an understanding that in most cases they lack sufficient facts to make an accurate determination of the truth, and the effect is that intellectuals will often not take a stand on important issues. The problem is that this leaves the field clear for those who will take a stand, or leaves the decision to those who make decisions by default, i.e. the powerful. In other words, the intellectual refusal to take a stand on issues where they are not certain amounts to a de facto stand in favour of established authorities and established ways of doing things.

The danger of this passivity in the face of issues that are too difficult to be sure of is that it leaves you open to a form of propaganda, or public relations strategy, that has been well known and exploited for a long time.

"Doubt is our product," proclaimed an internal tobacco industry document in 1969. "Spread doubt over strong scientific evidence and the public won't know what to believe."

It is very easy to spread doubt, and often extremely difficult to prove a positive claim beyond all possible doubt.

The counter-argument says that not taking a stand is reasonable because we don’t know, and if we don’t know we shouldn’t take a stand. This seems reasonable, but in the case of issues where a political decision needs to be taken, we have to think about the effects of both possible positions we could take. On the one hand, if we refuse to take a stand we’ll never be subject to the criticism that we expressed a wrong opinion, but we’ll be letting the political decision be taken by those with established authority or power. On the other hand, if we take a stand we might be able to have some input into the political decisions, but we run the risk of being proven wrong at some point in the future.

Being proven wrong, though, is not in itself a political problem; it's a personal problem. So, thinking of our actions as political rather than personal, we shouldn't worry about the upset it will cause us if we turn out to be wrong. Rather, we should only refuse to take a stand if, by recklessly taking one without sufficient information, we might increase the chance of a harmful political decision. Going further than this, in most cases there is no neutral position: the lack of a position is equivalent to a position in favour of the status quo. Let's take a look at some examples.

As quoted above, the tobacco industry's tactic was to create doubt about the link between smoking and cancer. Although the overwhelming weight of evidence supported this link, some evidence was found that did not. What are the effects of taking a stand for or against the link? On the one hand, taking no stand means not pushing for regulation of the tobacco industry, allowing it to do whatever it wants, at the risk of many more cases of lung cancer. On the other hand, taking a stand means pushing for such regulation and reducing the number of people who smoke, at the cost, if the link turned out to be false, of needlessly cutting into the tobacco industry's profits. Given that the evidence so strongly supported the link, the cost-benefit analysis is clear: one should take a stand in support of it.

A similar tactic is being used today by those who want to deny a link between human industrial activity and global warming. Again, the overwhelming majority of evidence is in favour of the link, but some evidence goes against it. Not taking a stand means letting things go on as they are at the moment, and putting the entire world at risk. Taking a stand, however, is also costly: it means potentially cutting back on emissions and consequently on economic growth, which also has an effect on millions of lives. In both the cases above, there is no neutral position: you have to take a position, and either position you take has potentially dire consequences if you’re wrong. In this case, not taking a position means taking a position in favour of the status quo, i.e. gambling that there is no link and that there will be no effect.

There are many more examples of this situation that crop up all the time. I have come across it in debates with people about industrial action, about alternatives to capitalism, and many more. I want to finish though with an example that is playing out right now and is causing a great deal of tension within the left: the case of the rape allegations against Julian Assange.

History, including quite recent history, has shown that governments are willing to smear their opponents with allegations of sexual misconduct. Given this, it was no huge surprise when allegations of rape against Julian Assange surfaced after the release of documents embarrassing to several governments. On the face of it, this would seem a prime case for ignoring the allegations. However, there is a problem: the downplaying or denial of accusations of rape is itself a huge problem. And indeed, many articles arguing that we should ignore these allegations have been of a decidedly misogynist character. The response of some (not all) feminists to such articles is therefore quite reasonable, but it unfortunately misses, I feel, an important point about the way smears work: mud sticks.

Consider, as above, our options on taking a stand for or against Assange. If we take a stand in favour of him, we support those who would release important documents revealing the way governments behave in secret, but we risk supporting someone who may be a rapist. If we take a stand against him, we show that the left will not stand by those who attempt to take on the powerful, and we let those in power disarm us with accusations of rape. Yes, of course it's the case that he might have done it, but there is no neutral stance here: not supporting him is equivalent to saying that you will not stand by anyone who is attacked by governments in this way. And that's true even if it turns out he did it.

Let's take the logic a step further: what happens if we support him, and it turns out he did it? Our supporting him is a form of political support. The case against him is a legal one, and will hopefully proceed on the quality of the evidence, regardless of our political support. If he did it, and the evidence is sufficient, he'll be found guilty. We'll feel bad for having supported him, but in terms of justice no harm will have been done. On the other hand, if we don't support him, then even if he is subsequently shown to be innocent, the damage will have been done.

Given the history of sexual smears, then, we surely must support Assange in this case. And it's important to say what this stand is and is not. It is not based on poring over the smattering of details of the case that have been leaked; it is not to accuse the women involved of being CIA agents (it's possible that they were, but even if the allegations are part of a smear campaign it doesn't follow that the women have acted in bad faith, and we don't need to take a position on that one way or the other); and it is not to have an opinion on what Sweden's sex crime laws should be.

Finally, the argument of this post, that intellectuals are prone to propaganda because they are afraid to hold a position that might be wrong, should be compared with an earlier article in which I argued that intellectuals are prone to propaganda because they think they understand things better than they do, and oversimplify them. On the face of it this looks contradictory, but I think it isn't: these are two types of error that are not in conflict. On the one hand, we can be overconfident in what we think we know and how well we understand it; on the other, we can refuse to take a political stand in cases where we might be proved wrong. The two can coexist: different people can make the two different mistakes, and a single person can make them in different situations. It's even possible to make both mistakes simultaneously. Consider someone who reads a climate change denial article that makes a valid point, understands it, and then declares that there is uncertainty about global warming. They are both overconfident in their understanding of global warming (they've read only the one article and lack the expertise and breadth to weigh its point against all the other evidence) and refusing to take a political position one way or the other, thereby implicitly supporting the status quo.