[Note for bibliographic reference: Melberg, Hans O. (1996), Backward Induction, the
false counterfactual and terrorism, http://www.oocities.org/hmelberg/papers/960321b.htm]
Backward Induction, the false counterfactual and terrorism
by Hans O. Melberg
Two incidents
Recently I read about two incidents which made me think about the problem of the false
counterfactual in game theory. In the first incident, there was an Israeli who claimed he
was not afraid to use bus number 18, despite the fact that bus number 18 had been bombed
by terrorists. In fact, precisely because it had been bombed, he was not afraid. As he
said: "They will never bomb the same bus twice."
The second incident was the strange story of a plane carrying weapons that was forced
down in India. The Indian government claimed the plane had been sent by the Pakistani
intelligence service to create problems for India. Pakistan, however, claimed that the
Indian secret service was behind the whole operation, its purpose being to destroy the
reputation of the Pakistani intelligence service.
The interesting point about these incidents is that they illustrate a common problem: as
soon as we have established a likely conclusion (a belief), we should change our minds. If
we believe it is unlikely that the terrorists will bomb the same bus twice, then it may
become more likely that they will do so. The reason is that the terrorists may take our
belief into account when choosing their target, deliberately trying to choose what we do
not think they will choose. Of course we should try to account for this when we form our
beliefs, but:
1. Very often we don't (for psychological reasons).
2. Even if we did, the terrorists could take this fact into account, and there is no end
to the regression (except the limit to our empirical capacity to think "deep").
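The regression in point 2 can be made concrete with a small sketch. The example is hypothetical (two made-up targets, not from any actual data): at each level of reasoning the terrorist strikes where the public currently feels safe, the public then thinks one level deeper and revises its belief, and the predicted target flips forever instead of settling.

```python
# Illustrative sketch (hypothetical example): the belief regression over
# two targets, "bus18" and "other". At each level the terrorist attacks
# the target the public currently believes is safe, and the public,
# reasoning one level deeper, revises its belief in response.

def terrorist_choice(believed_safe):
    # The terrorist strikes exactly where the public feels safe.
    return believed_safe

def public_revision(expected_attack):
    # Expecting that attack, the public now feels safe at the other target.
    return "other" if expected_attack == "bus18" else "bus18"

belief = "bus18"  # level 0: "they will never bomb the same bus twice"
history = []
for level in range(6):
    attack = terrorist_choice(belief)
    history.append(attack)
    belief = public_revision(attack)

print(history)  # the predicted target alternates at every level
```

Each extra level of "they know that we know" simply reverses the prediction, which is why only an empirical limit on how deep people actually think could stop the process.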
The Indian incident shows something similar: as soon as I believe Pakistan was behind it,
I catch myself thinking that if I believe it was Pakistan, then it must be India, since
India would only mount such an operation if it believed that most people would conclude
it was Pakistan! This line of reasoning may require the assumption that I take my
original belief as a sign that others hold the same belief ("my (original) belief" as a
sign of "most people's" belief).
The problem seems to be that my belief is taken as an indication that a different
belief is right. The reasoning may run something like this: I believe x. But my belief in
x is a signal that the belief is wrong. In a sense it is like the person who always
believes he is wrong. Sometimes he may say: I believe x, but I am always wrong, so y is
probably right!
What is the significance of this?
In a previous article I argued that prediction in the social sciences is very
difficult because of the information required. The above observation underscores that
argument: to predict whether people will travel with Bus 18 or not, or whether Bus 18
will become a terrorist target, we need to know the beliefs people hold about the beliefs
of other people (and so on in an endless regression).
It also indicates some of the limitations of game theory. In some situations there may
be no solution to a game; at least there is no theoretical solution. We might try to
build an empirical theory, based on psychology, about how deeply people think, in order
to construct a game-theoretical solution.
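One way to see why the regression never closes is that the underlying situation resembles a matching-pennies game: the terrorist wants to pick the target the commuter uses, the commuter wants to avoid it. A small sketch (with made-up payoffs, purely for illustration) checks every pure strategy pair and finds that in each one, somebody would want to switch, so no pure-strategy equilibrium exists. (A randomizing, mixed-strategy equilibrium does exist in such games, but it yields no determinate prediction of the kind the regression seeks.)

```python
# Sketch with hypothetical payoffs: (terrorist payoff, commuter payoff)
# for each pure strategy pair. The terrorist gains by matching the
# commuter's choice of target; the commuter gains by avoiding it.
targets = ["bus18", "other"]
payoff = {
    ("bus18", "bus18"): (1, -1),   # terrorist hits the bus the commuter uses
    ("bus18", "other"): (-1, 1),
    ("other", "bus18"): (-1, 1),
    ("other", "other"): (1, -1),
}

def stable(t_choice, c_choice):
    # A pure strategy pair is stable (a Nash equilibrium) only if neither
    # player could gain by unilaterally switching targets.
    t_now, c_now = payoff[(t_choice, c_choice)]
    t_best = max(payoff[(t, c_choice)][0] for t in targets)
    c_best = max(payoff[(t_choice, c)][1] for c in targets)
    return t_now == t_best and c_now == c_best

equilibria = [(t, c) for t in targets for c in targets if stable(t, c)]
print(equilibria)  # [] -- no pure strategy pair survives reflection
```

Whatever pair of choices we fix, one side regrets its choice and would deviate, which is exactly the endless "they expect that we expect" loop described above.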
The last point is connected to the introduction: the problem of the false
counterfactual in backward induction. These incidents may be (I am not completely sure)
real-life examples of that problem.
For more about the problem of backward induction and the false counterfactual, see J.
Elster (1989), The Cement of Society, pp. 4-8, 43 and 74, and his article in Acta
Sociologica. See also the debate between Aumann and Binmore in Games and Economic
Behavior (1996).