[Note for bibliographic reference: Melberg, Hans Olav (1996), The information required for perfect prediction, http://www.oocities.org/hmelberg/papers/950313.htm]




The information required for perfect prediction

ABSTRACT
This paper identifies and comments upon eleven categories of information needed to make accurate predictions in the social sciences. Most of the examples are from the history of the Soviet Union.

The question
Imagine you are in a research institution in the 1960s trying to predict the situation in the Soviet Union by the year 2000. The question is then: What kind of information do you need in order to make a perfect prediction? Or, more generally, what kind of information do you need in order to make predictions about the future?

Minimal definition: Perfect prediction
A perfect prediction would include correct answers to the following fundamental questions:
- What will be the size of the USSR's Gross Domestic Product? (and the derivative question about the rate and direction of change of the GDP)
- What kind of political system will be in place? (one-party state? democracy? presidential? parliamentary? other?)
- Will the Empire remain united?

A perfect prediction should answer not only these questions, and a host of others. It should also tell us how the new situation will be brought about and when the various events that lead up to the predicted situation will take place.


Required information

1. Aims (of the rulers and other individuals)
Political systems change as the result of human actions. Humans act because they try to achieve certain ends.1 Hence, we need to know the aims of individuals in order to predict their actions.

First of all we need to know the aims of those who are in power. These aims come at various levels of generality. At the most general level one might say the aim of Khrushchev was the establishment of Communism. At a lower level of generality, Khrushchev wanted to beat the USA in the race to the Moon. Hence, we must not only know the aims of the political leaders, but also which aims they give the highest priority.

It is not enough to know the aims of the rulers, since the actions of single individuals may also matter a great deal. In what way would the history of the USA be different if J. F. Kennedy had not been shot? Would the situation in Poland have been different if one parliamentary representative had not overslept the vote of no confidence in the government, which the opposition won by one vote, thereby causing a new election? Clearly individual acts make a difference in the course of history, though we may argue about the degree of their significance.

How do we get to know the aims of individuals? Since it is impossible to get inside their heads, we must infer their aims from their actions (and/or words). In the words of economists, we are using the theory of revealed preferences. One problem with revealed preference theory is that people often perform strategic acts of the type "one step back, two steps forward." For example, Khrushchev supported the military and heavy industry against Malenkov when Malenkov wanted to make cuts in these sectors in order to give more emphasis to consumer goods. However, Khrushchev's support for the military and heavy industry should not be interpreted, as simple revealed preference theory would have it, as evidence that Khrushchev really wanted to support these sectors and not the light (consumer) industry. The real reason behind Khrushchev's action was the power struggle between Malenkov and himself. He aligned himself with the strongest forces at the time in order to eliminate his opponent. Having done so, he was free to pursue his real intention - which was the same as Malenkov's - to give more emphasis to the consumer industry.2 This illustrates how difficult it is to infer preferences from actions.

In recent history the debate about whether Gorbachev was a "real" democrat or a shrewd Leninist tactician further illustrates the problem of reading intentions from actions.3 If he was a real democrat, why was he so reluctant to do away with the constitutional provision about the leading role of the Party (Article 6)? Why did he, for a long time, insist on "socialist pluralism" instead of a real multi-party system? His actions in this area, and others, led some people to believe that Gorbachev was no real democrat. On the other hand, those who believed he was a real democrat said that Gorbachev had to be careful in order not to lose his position as General Secretary. Although he wanted to do away with Article 6, he did not do so (or say so) because the rest of the Politburo would then have sacked him - as they had done with Khrushchev in October 1964. Once again we see the large practical problems in inferring aims from actions. We may also note that this problem has very real policy implications. If you believed Gorbachev was a real democrat, you would probably be in favour of economic aid to the Soviet Union. But if Gorbachev was a shrewd tactician - trying to fool the West into a break from the arms race in order to build Communism - he should not be supported.

2. Beliefs I: Situational beliefs (Beliefs about current situation)

Before I try to predict what an individual will do, I need to know his current estimates of the factual situation. For example, the prediction that Gorbachev would initiate large-scale economic reforms was built on the belief that Gorbachev believed the real economic situation was bad. If I have wrong information about these situational beliefs, I may produce wrong predictions. S. Bialer predicted in 1983, according to A. Nove, that "the odds are overwhelmingly against" fundamental economic reform. He built this prediction on the premise that the Soviet Union was "a basically stable state" and that there was no "systemic crisis" or economic crisis.4 In contrast, Marshall I. Goldman argued that the Soviet economy was in a crisis in his book USSR in Crisis: The Failure of an Economic System, also from 1983. If people start from such different premises, and if each believes his belief is shared by the Soviet leadership, it is easy to see how they arrive at different (and wrong) predictions by holding incorrect beliefs about the beliefs of others.5

3. Beliefs II: Beliefs about the aims of others
The actions of individuals depend upon what the individual thinks about the aims of other individuals. If I think you are after my wife, my actions toward you will be very different from a situation in which I believed you were not after my wife. Or, on a larger scale, the Western policy of containment against the USSR only made sense if the West believed that the USSR was trying to spread its ideology to the rest of the world. If you believed, as some people did, that the Cold War was the result of misunderstandings, containment would simply make things worse by confirming the fears on both sides. (In my opinion this was a very wrong belief.)

4. Beliefs III: Causal beliefs (Beliefs about causal mechanisms)
Before we can use our knowledge of aims to predict the actions of individuals, we need to know how they believe they can achieve their aims. Khrushchev may have had Communism as his aim, but how did he think he could build it? Assume he defined Communism as a life in material abundance and that he believed the way to achieve this was industrialization. From this we may deduce certain actions, such as an emphasis on heavy industry. However, we are always led into new circles. What did he mean by industrialization and how did he think he could achieve it? The answer to this, in turn, gives rise to a new question. Maybe this regression will end at some point, but it is clear that we need detailed information at many levels before we can predict the actions that follow from the belief-end relationship.

5. Beliefs IV: Strategic beliefs (Beliefs about beliefs)
In order to predict the actions of individuals it is not enough to know their aims and their technical beliefs about how to achieve those aims. We also need to know the beliefs of the individual about the beliefs of other individuals. An example may illustrate the real-life importance of strategic beliefs: You are trying to predict the outcome of the Cuban missile crisis of October 1962. In order to predict the outcome - that the Soviet ships turned away - you needed to know that Kennedy believed that Khrushchev believed that Kennedy was serious when he threatened to use force against the ships. In this perspective the blockade and the turning away of the ships was a logical outcome (though probably not the uniquely logical outcome).

The strategic beliefs can be sub-divided into the following categories:
a) Beliefs about other individuals' situational beliefs ("I believe that you believe the economy is bad")
b) Beliefs about other individuals' beliefs about your aims ("I believe he believes I am after his wife")
c) Beliefs about other individuals' causal beliefs ("I believe that you believe that decentralization is the best means to solve the economic problems")
d) Beliefs about other individuals' strategic beliefs ("I believe that you believe that I believe that the economy is bad")

How likely is it that we know all these beliefs?

The beliefs of category d) can be extended, i.e. we have beliefs about beliefs about beliefs. If this regression continues infinitely, we cannot determine the strategic beliefs and hence prediction is theoretically impossible. To solve this problem we once again need some psychological theory of how deep people think. Empirically, even experts such as chess players seem unable to go deeper than about seven levels. (For example, you may be hiding behind a curtain listening to a conversation. However, the people you are listening to may know that you are listening. And you, in turn, may know that they know that you are listening. This is three levels of depth.) Both in theory and in practice it seems impossible to know all these beliefs.

Another problem is the absence of good reasons on which to base your beliefs. For example, you observe two people (A and B) who play the game "stone, paper, scissors." In order to predict which strategy A will use, you must know what A thinks B thinks A will play. This may sound complicated, but we do these calculations every day. In plain language we think like this: "He thinks I will play stone (and hence he'll play paper), therefore it is best for me to play scissors." However, there may be no way of forming a belief about what the other person believes. What reason do you have for believing that the other will play stone? If you can find no rational reason why the other person should play one of the strategies, what do you predict? You could end up with a probabilistic prediction - that each strategy is played with probability 1/3, as if he throws an imaginary die in his head before he chooses. However, because it is probabilistic, this is not a perfect prediction. Furthermore, it is unstable in the sense that even a very small doubt about whether the other person really plays each strategy with probability 1/3 will lead you to choose a pure rather than a mixed (i.e. probabilistic) strategy.
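The instability of the probabilistic prediction can be made concrete in a small computational sketch. This is my own illustration, not part of the original argument: it computes the expected payoff of each pure strategy against a believed mixture, showing that at the uniform belief every strategy is equally good, while the slightest tilt in the belief singles out one pure strategy as strictly best.

```python
# Best responses in "stone, paper, scissors", given a belief about the
# opponent's mixed strategy. A sketch for illustration only.
MOVES = ["stone", "paper", "scissors"]
# PAYOFF[mine][theirs]: 1 = win, 0 = draw, -1 = loss
PAYOFF = {
    "stone":    {"stone": 0, "paper": -1, "scissors": 1},
    "paper":    {"stone": 1, "paper": 0, "scissors": -1},
    "scissors": {"stone": -1, "paper": 1, "scissors": 0},
}

def expected_payoffs(belief):
    """Expected payoff of each pure strategy against a believed mixture."""
    return {m: sum(p * PAYOFF[m][o] for o, p in belief.items()) for m in MOVES}

# Against the uniform belief every pure strategy is equally good (payoff 0)...
uniform = {m: 1 / 3 for m in MOVES}
print(expected_payoffs(uniform))

# ...but a tiny doubt - "he is slightly more likely to play stone" -
# makes one pure strategy (paper) strictly best.
tilted = {"stone": 1/3 + 0.02, "paper": 1/3 - 0.01, "scissors": 1/3 - 0.01}
ep = expected_payoffs(tilted)
print(max(ep, key=ep.get))  # paper
```

The mixed 1/3 prediction thus survives only at a knife's edge: any perturbation of the believed probabilities collapses the best response to a single pure strategy.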

One might argue that there are good psychological reasons behind some strategic beliefs. The theory of focal points argues that some strategies are, psychologically speaking, more likely than others. For example, suppose a person promises you and your friend $10 000 if you independently choose to write down the same tourist attraction in Paris. Most people would immediately write down the Eiffel Tower. Each does so based on the belief that they both believe that the other will write "The Eiffel Tower." This theory may solve some of the problems of indeterminate strategic beliefs, but it does not reduce our information problem, because our knowledge of the psychological mechanisms determining focal points is also very incomplete.

6. Information about breakdowns between decision and execution
With the information from 1-5, I may predict that a certain course of action will be decided upon. However, this does not mean that the decision will be executed, since weakness of will or accidents may prevent the execution. Sometimes we simply press the wrong button by accident. Sometimes we want to go to the dentist, but we are too weak to do so. In order to make perfect predictions we need to know when weakness of will or accidents will prevent the outcome predicted by aims and beliefs.

We could also add another belief category, namely beliefs about our own or other individuals' weakness of will. There is a typical scene in many movies which illustrates this point, such as the following scene in a MacGyver movie I was half-watching when I wrote this article: The bad guy had taken a number of hostages. One of the "bad" characters challenged one of the hostages to kill him with a knife. Although we had previously been given to understand that the hostage wanted to kill the "bad guy", we now saw that the "bad" character acted on the (correct) belief that the hostage was too "weak" to carry out the desired action.

7. Real situation
Individual acts change a situation, but we also need to know what the factual situation was in the first place in order to know the new situation created by the act. I may know that you are going to perform a certain act which will cost you $5, but if I don't know how much money you had initially, I cannot predict how much you will have after the act.

In the same way we would need to know the current size of the economy before we can even try to predict its future size. This, of course, is no small requirement. As we now know, our estimates of the Soviet economy were very wrong. For example, the CIA estimated that the Soviet GDP was about 60% of the American GDP.6 Today we know this was an overestimate.

8. Organizational decisions
Organizations, states and other collective entities do not have the capacity to act or think. It is thus a truism that only individuals act (excluding acts of nature). Yet individuals operate within a context which it is important to know before we make predictions. For example, a Parliament has certain rules for decision-making. These rules shape individual acts by affecting the strategic nature of the environment. Consider the following rule: the Parliament is only allowed a "constructive" vote of no confidence (that is, it cannot bring the government down unless it has agreed on an alternative ruling coalition). This will affect the decisions made by the members of the Parliament (in the sense that the aims of the governing parties will get more emphasis than those of the opposition parties). Similar rules, such as the right of the government to dissolve the Parliament and hold new elections, will also affect the acts of individuals. We thus need to know about the mechanisms of organizational decision-making before we can make predictions about the future.

In practice this is a very difficult task indeed. We need a theory of coalition formation and bargaining, combined with a deep knowledge of how institutional rules affect these. For example, it is not enough to know the preferences and beliefs of individuals; we also need to know the order in which the proposals are voted on. This particular problem is illustrated by the Voter's Paradox, which is as follows:

 
			Phil	Susan	Neil
	Library		1	3	2
	Police		2	1	3
	Hospital	3	2	1

The numbers indicate the order of priorities for each person, i.e. Phil wants a new Library rather than a new Police station, but he also wants a new Police station more than a new Hospital. The question is what decision we get if these three people are to decide which institution should get a new building. What is your prediction?

Let's assume the voting procedure is such that they first have to choose between Library and Police. A majority will choose the Library (both Phil and Neil want a new library more than they want a new police station). In the next round they choose between Library and Hospital. In this vote Hospital will win, because both Susan and Neil think a new hospital is better than a new library. The end decision, after all the alternatives have been considered, is to build a new Hospital.

Let us now change the order in which we vote on the alternatives. We'll start by choosing between Hospital and Police. As we can see, Police will win because a majority (Phil and Susan) wants a new police station rather than a new hospital. In the next vote, between Police and Library, Library will win (supported by Phil and Neil). Hence, a new voting order has produced a different end decision, even though the aims of the individuals have not changed: to build a new Library.
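The agenda dependence described above is mechanical enough to verify with a short sketch (my own illustration; the helper names are hypothetical). It runs the three preference orderings from the table through both voting agendas:

```python
# The Voter's Paradox from the table above: identical preferences,
# two voting orders, two different collective decisions.
PREFS = {  # lower number = higher priority, as in the table
    "Phil":  {"Library": 1, "Police": 2, "Hospital": 3},
    "Susan": {"Library": 3, "Police": 1, "Hospital": 2},
    "Neil":  {"Library": 2, "Police": 3, "Hospital": 1},
}

def majority(a, b):
    """Pairwise majority vote between options a and b."""
    votes_for_a = sum(1 for ranking in PREFS.values() if ranking[a] < ranking[b])
    return a if votes_for_a > len(PREFS) / 2 else b

def run_agenda(agenda):
    """Vote the options against each other in the given order; the winner
    of each round meets the next challenger."""
    winner = agenda[0]
    for challenger in agenda[1:]:
        winner = majority(winner, challenger)
    return winner

print(run_agenda(["Library", "Police", "Hospital"]))  # Hospital
print(run_agenda(["Hospital", "Police", "Library"]))  # Library
```

The same three rankings yield Hospital under one agenda and Library under the other, so whoever controls the voting order controls the outcome.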

As if the above complications were not enough, we must also consider the possibility of strategic coalitions (log-rolling), lying (Phil could pretend that he wanted a new Hospital instead of a new Police station as his second-best option; depending on the number of voters and the preference structure this could change the outcome of the vote) and the stability of the outcome (a small party may switch sides when it is pivotal in order to get a majority; it does so because, when it is pivotal, the larger parties will always try to outdo each other in their promises to the little party, and this side-switching is difficult to predict). It thus seems almost impossible to gather the information described in this category.

9. Real causal connections
In the unlikely case that we have all the information described in 1-8, we are still left with one major obstacle to prediction: What are the aggregate consequences of all the isolated acts that we have predicted? An example may illustrate the need to consider the acts in aggregate: Imagine that, for some reason, a firm is experiencing a fall in profit. In order to stay competitive, the management decides to cut the wages of its workforce. The consequence is to increase the profit of the firm. Now imagine that all firms cut their wages. What are the consequences? That the profit of all firms increases? No! When all the firms cut their wages, the firms will also experience a loss in sales, because the consumers spend less (their wages were cut). This may mean that the aggregate consequence of all the isolated actions is to decrease profit. This is one example of why we need a theory of the causal connections between isolated individual acts and aggregate social consequences.
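A deliberately crude toy economy can make this fallacy of composition concrete (my own construction, not from the essay; all numbers are arbitrary). Workers spend their whole wage and sales are split evenly among the firms, so a lone wage cut raises that firm's profit while an economy-wide cut does not:

```python
# Toy economy: N identical firms, workers spend all their wages,
# plus some demand that does not come from wages.
N = 3
AUTONOMOUS_DEMAND = 60  # arbitrary non-wage spending

def profits(wages):
    """Each firm's profit when total spending is split evenly among firms."""
    revenue_per_firm = (sum(wages) + AUTONOMOUS_DEMAND) / N
    return [revenue_per_firm - w for w in wages]

base    = profits([100, 100, 100])  # every firm earns 20
one_cut = profits([80, 100, 100])   # firm 0 alone cuts wages: its profit rises
all_cut = profits([80, 80, 80])     # all firms imitate: lost sales exactly
                                    # offset the wage savings, nobody gains
print(base[0], one_cut[0], all_cut[0])
```

In this minimal sketch the individual gain merely evaporates in the aggregate; with stronger demand feedbacks (for example, workers cutting consumption by more than their lost wages out of precaution), aggregate profits could fall outright, as the text suggests.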

The example above illustrated one possible causal mechanism, but in order to make perfect predictions we need a complete theory of the net effect of all causal connections in society. It goes almost without saying that we do not have such a theory. We do not agree on the causes of inflation, unemployment, crime, family breakdown or many other social phenomena.

10. Acts of nature
On the 26th of April 1986 the nuclear plant at Chernobyl in the Soviet Union exploded. The explosion had a large impact. Politically it increased the people's distrust of the Communist Party. Ecologically it was a major catastrophe, making 25% of the land of Belorussia unfit for agriculture. Socially it destroyed the lives of thousands of individuals, physically and psychologically. This is not the only act of Nature which has affected the course of history. What if the Spanish Armada of 1588 had not run into a storm? What if there had been a mild winter in the Soviet Union when Stalin was fighting Hitler (1941/2)? If we were to predict the future perfectly, we would need to know about these events and their effects. Once again it goes almost without saying that we do not have a theory which allows us to predict acts of Nature ten years (or indeed ten days!) ahead.

11. How aims and beliefs change as the result of the new situation
Let us say that we are able to predict all acts and aggregate consequences one step ahead. To make predictions over several steps, however, we need a theory of how the transformation from the old to the new situation changes the aims, the beliefs and the possibilities of the individuals. In other words, we need a theory of preference formation, a theory of belief formation (learning) and a theory of political possibilities (hysteresis). Only once we have these may we predict more than one step into the future.


Counterarguments
There are several possible counterarguments which will be examined in a later essay. Among the arguments addressed are:

- That I have used a micro- instead of a macro-approach

That the above list builds on a micro-approach to social change and that a macro-approach which focuses on aggregate variables is a better, more reliable and less demanding basis for prediction. For example, we may simply extrapolate the growth rate of the economy into the future to predict the future size of an economy.

- That I have exaggerated the seriousness of incomplete information

One might argue that although we do not have all the information required, sometimes we have enough information to make reliable predictions. It is simply too much to require that we should have complete information and produce perfect predictions.

- That I have focused on the wrong type of prediction

I might not be able to predict that a person will do something, but I might predict that if he does an act, then certain consequences will follow. For example, if Gorbachev tries to create something in between a full market economy and a centrally planned economy (for example by decentralizing decision-making to enterprises without freeing prices), this will create an even worse economic situation than before, because the nature of the two systems is such that they are inherently incompatible.


Conclusion
Summary
This article has had a narrow focus: to examine the information required to make very good predictions. As I have argued, the information requirements are severe indeed: We need to know aims, beliefs of many different types and the current factual situation. We then need a reliable theory of each of the following: weakness of will, accidents, bargaining/coalition formation, causal connections in society, acts of nature, preference formation and belief formation. Some of this information is theoretically impossible to get, some is practically impossible, and some can be obtained only through indirect and sometimes fallible means.

Implication: Explain but not predict?
Although the focus of this article has simply been on showing the information required to make predictions, I cannot resist the temptation to elaborate on a major implication: that we may still explain although we cannot predict. The reason is that we have much more information after an event than before. For example, we may at one point in time have different beliefs about the size and growth rate of the Soviet economy, or about the aims of Gorbachev. These differences lead to different predictions about the future. However, as time passes we gather more information. Some new actions by Gorbachev may finally make his real intentions less muddled. Similarly, time will give better information about the real size of the economy. Hence, the information required for explanation is more easily available than the information required for prediction. However, seeing how academics disagree on historical explanations should make us wary of claiming too much explanatory power, even after the events. Maybe predictions are third-decimal arguments while explanations are second-decimal arguments?



ENDNOTES

1 I am, of course, aware that the rational-choice conception of man is not always correct. Sometimes our actions are governed by norms or emotions more than by rational calculation. However, as a general rule I believe that we act selfishly in order to achieve some goal. Furthermore, one might argue that acting according to norms is a way of achieving a goal, namely the fulfilment of the norm. As for emotional action, it is covered by my discussion under weakness of will/accidents. Hence, the argued information requirement does not build on an exclusively (thin) rational conception of man.

2 Malia (1994), p. 328

3 Malia (1994), p. 479

4 Nove, A. (1983), p. 836

5 I do not claim to have a strong point against Bialer here. I have not read the original source and the statement was made years before Gorbachev came to power. However, with the benefit of hindsight it does seem wrong to assert that the economy was not in a state of crisis and that the system was stable - even in 1983.

6 Lipset, S. M. and Bence G. (1993), p. 177


Texts used or cited in the article
Not all of these references are worth reading. If you are seriously interested in the topic, I would recommend both of Malia's works and the article by Lipset and Bence. I would also recommend the book by Jon Elster and the article by T. Kuran. Lastly, I should admit my intellectual debt to the books of Jon Elster; many of the ideas used in this paper are picked up from reading his books.

Amalrik, Andrei (1970), Will the Soviet Union Survive Until 1984?, New York: Harper and Row

Bell, Daniel (1958), Ten theories in search of reality: The prediction of Soviet behavior, World Politics 10: 358-

Bergson, A. and Levine, H. S. (1983), eds., The Soviet Economy: Toward the Year 2000, Allen and Unwin

Collins, Randall (1986), The future decline of the Russian empire, (ch. 8 in Weberian Sociological Theory, Cambridge University Press 1986)

Elster, Jon (1989), Nuts and Bolts for the Social Sciences, Cambridge: Cambridge University Press

Goldman, M. I. (1983), U.S.S.R. in Crisis: The Failure of an Economic System, Norton

Holmes, Leslie (1993), The End of Communist Power: Anti-Corruption Campaigns and Legitimation Crisis, Cambridge: Polity Press

Janos, Andrew C. (1991), Social science, Communism, and the dynamics of political change, World Politics 44: 81-112

Kuran, Timur, (1991), Now out of never: The element of surprise in the East European Revolution of 1989, World Politics 44:7-48

Lipset, Seymour Martin and Bence, Gyorgy (1993), Anticipations of the failure of Communism, Theory and Society 23: 169-210

Little, Daniel (1993), On the scope and limits of generalizations in the social sciences, Synthese 97:183-297

Malia, Martin (1990), To the Stalin Mausoleum, Daedalus, p. 295-344

Malia, Martin (1994), The Soviet Tragedy: A History of Socialism in Russia, 1917- 1991, New York: The Free Press

Mount, Ferdinand (1992), ed., Communism, London: Harvill (HarperCollins Publishers)

Månson, Per (1994), Minervas uggla och Sovjetunionens fall, Sociologisk Forskning 4: 3-32

Nove, Alec (1983), Estimating the future, The Times Literary Supplement, Aug. 5, p. 836

Pipes, Richard (1994), Communism: The Vanished Specter, Oslo: Scandinavian University Press

Rosenberg, Alex (1993), Scientific innovation and the limits of social scientific prediction, Synthese 97, p. 161-182

Skirbekk, Sigurd and Bakke, Per (1995), Sosiologi og forutsigelser: Er framtidsforskning umulig?, Sosiologisk tidsskrift 3:231-239