Are Elster's arguments on the impossibility of collecting an optimal amount of information correct?

By Hans O. Melberg

(The second draft of chapter three of my thesis: How much information should you collect before making a decision? See www.oocities.org/hmelberg/papers/papers.htm for more.)

 

Comments to: hansom@online.no

University of Oslo

March, 1999.

1 Introduction
2 My arguments
3 What are Elster's arguments?
3.1 A list of quotations
3.2 Reflecting on the quotations
3.3 S.G. Winter and Elster's argument
4 Are the two arguments consistent?
5 Evaluating the infinite regress argument?
5.1 Elster's argument: A visualization
5.2 The first counterargument
5.3 Resurrecting the problem
5.4 The death of the resurrected argument: One weak and one strong counterargument
5.5 Sub-conclusion
6 Elster on the problems of estimation
6.1 Introduction
6.2 What is the argument?
6.2.1 The textual basis for the distinctions on implications
6.3 Are the arguments valid?
6.3.1 On the existence of probability estimates
6.3.2 Weak probabilities and the argument for randomization
6.3.3 The relevance of biased probabilities
6.4 Conclusion: Elster on the problem of estimation
7 Conclusion

References

 

1. Introduction

Is it true that the problem of information collection cannot be solved rationally? To answer this question, I shall start by examining Elster's arguments as he has presented them in different papers. From this it will emerge that the real problem he points to is not infinite regress, but the impossibility of forming reliable expectations for reasons other than infinite regress. Even this argument, however, is not without flaws, since it relies heavily on the notion of probability as relative frequency.

     

2. My arguments

My tentative arguments are as follows:

  1. There has been a shift in Elster's writings about the impossibility of collecting an optimal amount of information, away from an emphasis on the logical problem of infinite regress and towards the empirical problems of estimating the net value of information.
  2. Elster was (is?) probably wrong in focusing on the problem of infinite regress in the collection of information as an important problem in rational choice theory. The argument is, moreover, in conflict with his later admission that it is sometimes possible to collect an optimal amount of information.
  3. There are several potential problems with Elster's treatment of the problem of estimation: First, it relies heavily on the classical relative frequency view of probability (this view may be wrong and/or in contradiction to his arguments about the importance of remembering the subjective character of rationality). Second, the problem is one of degrees (and this leads to the problem of how to distinguish between situations in which we have / can find reliable probabilities). Third, there seems to be an inconsistency between Elster's writings on the problems of estimation and his arguments about the importance of judgment.

 

3. What are Elster's arguments?

3.1 A list of quotations

To enable the reader to follow the discussion, I have made the following table which summarizes Elster's writings on the impossibility of collecting an optimal amount of information.

       

Table 1. Elster on the impossibility of collecting an optimal amount of information.
(Columns: year; source; key pages; is "infinite regress" mentioned?; reference to Winter?; quotation.)

1978. Logic and Society (book), pp. 162 (173). Infinite regress mentioned: Yes. Reference to Winter: Yes.
One might argue that "... satisfaction emerges as a variety of maximization once the costs of acquiring and evaluating information are taken into account. [176] Winter, then, in a surprisingly ignored paper, argued that this retort creates an infinite regress, for how do you solve the problem of finding the optimal amount of information? The 'choice of a profit maximizing information structure requires information, and it is not apparent how the aspiring profit maximizer acquires this information, or what guarantees that he does not pay an excessive price for it'." p. 162 (quoting Winter 1964)

1979. Ulysses and the Sirens (book), pp. 58-60, 135. Infinite regress mentioned: Yes. Reference to Winter: Yes.
"Take the case of a multinational firm that decides not to enter the forward exchange market because the information costs of the operation would exceed the benefits. Then we shall have to ask how the firm decided how much information to acquire before taking the decision not to acquire the information needed in the forward exchange market. Unless one could prove (and I do not see how one could prove) that the deviation from the 'real' optimum converges to zero or at any rate rapidly becomes smaller for each new level in the hierarchy of information structures, this argument not only has the implication that in every decision there must be a cut-off point where calculation stops and you simply have to make an unsupported choice, but also that this point might as well be as close to the action as possible. Why, indeed, seek for precision in the second decimal if you are uncertain about the first?" p. 59

1982. "Rationality" (a chapter in Fløistad (1982), A survey of contemporary philosophy), pp. 112-113. Infinite regress mentioned: Yes. Reference to Winter: Yes.
Some "argue that firms are profit maximizers because otherwise they go bankrupt. The argument is particularly powerful because it is backed (Winter [6b] by an infinite regress argument against the very possibility of planned profit-maximizing. The argument, briefly, is this. In order to maximize, say, profits, you need information. As information is costly, it would be inefficient to gather all the potentially available information. One should rather settle for the optimal amount of information. But this means that the original maximization problem has not been solved, only replaced by a new one, that immediately raises the same problem." p. 112 (sic.)

1983. "The crisis in economic theory" (review of Nelson and Winter (1982) in London Review of Books), pp. 5, 6. Infinite regress mentioned: Yes. Reference to Winter: Yes.
"The Nelson-Winter attack on optimality is therefore a two-pronged one. The argument from satisficing is that firms cannot optimise ex ante, since they do not have and cannot get the information that would be required. Specifically, they would need an optimal amount information, but this leads to a new optimisation problem and hence into an infinite regress. On the other hand, we cannot expect firms to maximise ex post, since the elimination of the unfit does not operate with the same, speed and accuracy as it does in natural selection. Taken together, these two arguments strike at the root of neo-classical orthodoxy." p. 6

1983. Explaining Technical Change (book), pp. 139-140. Infinite regress mentioned: Yes. Reference to Winter: Yes.
"One of his [S. Winter] contributions is of particular interest and importance: the demonstration that the neoclassical notion of maximizing involves an infinite regress and should be replaced by that of satisficing. The argument appears to me unassailable, yet it is not universally accepted among economists, no doubt because it does not lead to uniquely defined behavioral postulates." p. 139

1983. Sour Grapes (book), pp. 17-18. Infinite regress mentioned: Yes. Reference to Winter: Yes.
"The demand for an optimal amount of evidence immediately leads to an infinite regress" p. 18

1985. "The nature and scope of rational choice explanations" (book chapter), p. 69. Infinite regress mentioned: No. Reference to Winter: Yes.
"In most cases it will be equally irrational to spend no time on collecting evidence and to spend most of one's time doing so. In between there is some optimal amount of time that should be spent on information-gathering. This, however, is true only in the objective sense that an observer who knew everything about the situation could assess the value of gathering information and find the point at which the marginal value of information equals marginal costs. But of course the agent who is groping towards a decision does not have the information needed to make an optimal decision with respect to information-collecting.[23] He knows, from first principles, that information is costly and that there is a trade-off between collecting information and using it, but he does not know what that trade-off is." (Elster 1985, p. 69)

1986. "Introduction" to the edited book Rational Choice, pp. 14, 19. Infinite regress mentioned: No. Reference to Winter: No.
"It is not possible, however, to give general optimality criteria for the gathering of information." p. 14
"The non-existence of an optimal amount of evidence arises ... from our inability to assess the expected marginal value of information." p. 19

1989. Solomonic Judgements (book), pp. 15-16. Infinite regress mentioned: No. Reference to Winter: No.
"Sometimes it is impossible to estimate the marginal cost and benefits of information. Consider a general in the midst of battle who does not know the exact disposition of the enemy troops. The value of more information, while potentially great, cannot be ascertained. Determining the expected value would require a highly implausible ability to form numerical estimates concerning the possible enemy positions." p. 16

1989. Nuts and Bolts (book), pp. 35-38. Infinite regress mentioned: No. Reference to Winter: Yes (bibliographical essay).
"Deciding how much evidence to collect can be tricky. If the situation is highly stereotyped, as medical diagnosis is we know pretty well the costs and benefits of additional information. In situations that are unique, novel and urgent, like fighting a battle or helping the victim in a car accident, both costs and benefits are highly uncertain ..." p. 35

1993. "Some unresolved problems in the theory of rational behaviour" (article in Acta Sociologica), pp. 182-183. Infinite regress mentioned: No. Reference to Winter: No.
"Suppose that I am about to choose between going to law school or to a school of forestry - a choice not simply of career but of life style. I am attracted to both professions, but I cannot rank and compare them. If I had tried both for a lifetime, I might have been able to make an informed choice between them. As it is, I know too little about them to make a rational decision." p. 182

       

3.2 Reflecting on the quotations

First of all, the table indicates that there has been a shift in Elster's emphasis. From 1978 to 1983 the argument against the possibility of collecting an optimal amount of information was based on "the infinite regress argument." After the important article from 1985, the focus turned to the empirical problems of estimating the value of information when we are in novel situations and when the environment is fast changing. I shall label the first argument "the infinite regress problem" and the second "problems of estimation." Both arguments are used by Elster to argue that it is impossible to collect an optimal amount of information.

       

3.3 S.G. Winter and Elster's argument

    The infinite regress argument is clearly inspired by S.G. Winter. The key quotation is:

    The "choice of a profit maximizing information structure itself requires information, and it is not apparent how the aspiring profit maximizer acquires this information, or what guarantees that he does not pay an excessive price for it "(Winter 1964, quoted from Elster 1983, p. 139-140.)

Three things should be noted about this quotation. First, Winter does not use the term "infinite regress" in the quotation, nor does he do so in any of the articles I have read (Winter 1964, 1971, 1975). In fact, in Winter's article from 1975 the problem is said to be "self-reference" not "infinite regress." Second, the focus is on how somebody can acquire information about the value of more information. The term "not apparent" indicates some reservation about whether the argument is purely logical (it is impossible) or empirical (it is not apparent). Third, there is something odd about the last sentence ("what guarantees that he does not pay an excessive price for it"). It seems to me that rationality does not demand that we never pay more than the true value of something. The question is whether we were justified in believing that the information was worth the costs when the decision was made. It may turn out that the information was less valuable than we believed, but - as Elster argues in Sour Grapes (p. 15-19) - it is possible to make rational mistakes.

     

4. Are the two arguments consistent?

There is, at the very least, a tension between Elster's argument on the infinite regress problem and his argument on the estimation problem. When discussing the estimation problem, Elster admits that it is sometimes possible to choose what approximates the optimal amount of information (see, e.g., 1985, p. 70). This is a problem because it is inconsistent to argue that it is logically impossible to collect an optimal amount of information and at the same time argue that the problem is sometimes solved empirically. To what extent is this a problem in Elster's writings?

First of all, I am hesitant about using the label "contradiction," because Elster himself does not explicitly write that the problem of infinite regress represents a "logical problem" in the theory of optimization. On the other hand, consider the following quotations:

    "The demand for an optimal amount of evidence immediately leads to an infinite regress." (SG, 1983, p. 18)

    "... firms cannot optimise ex ante, since they do not have and cannot get the information that would be required. Specifically, they would need an optimal amount information, but this leads to a new optimisation problem and hence into an infinite regress." (Crisis, 1983 Article, my emphasis)

    "One of his [S.G. Winter] contributions is of particular interest and importance: the demonstration that the neoclassical notion of maximizing involves an infinite regress and should be replaced by that of satisficing." (ETC, 1983, p. 139)

As for the strength of these arguments, Elster writes that the argument "appears to me unassailable" (ETC, 1983, p. 139). He also thinks that S.G. Winter has provided a sketch of an "impossibility theorem" (US, p. 59) and that the infinite regress problem represents an argument "against the very possibility of planned profit-maximizing" (1982, p. 112, my emphasis). The tendency of the argument seems to be that it is logically impossible to choose an optimal amount of information. To the extent this is true, Elster's early argument is in tension with his later emphasis on the empirical nature of the problem, i.e. that it is usually impossible to form a reliable estimate of the value of information.

     

5. Evaluating the infinite regress argument?

5.1 Elster's argument: A visualization

The infinite regress problem is presented as follows by Elster in his 1982 article:

      In order to maximize, say, profits, you need information. As information is costly, it would be inefficient to gather all the potentially available information. One should rather settle for the optimal amount of information. But this means that the original maximization problem has not been solved, only replaced by a new one, that immediately raises the same problem (p. 112)

      Visualized the argument may look like this:

      Figure 1
      
      1. Collect an optimal amount of information (the first maximization problem)
      2. To do (1) we must first collect information about how much information it would be optimal to collect (the second maximization problem).
       3. To do (2) we have to collect an optimal amount of information about how much information we should collect before we decide how much information to collect (the third maximization problem).
      ...
      ...
      and so on forever
      

       

       

Since the chain goes on forever, the argument is that the original problem has no rational solution. My question is this: Is it really true that we have to collect information before we decide how much information to collect? Is this not to demand that the agent should always know something that he does not know? Maybe this is precisely the point, that Elster and Winter believe rational choice theory cannot yield a determinate solution because it demands the impossible.
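To make the structure of Figure 1 concrete, here is a minimal sketch (my own illustration, not Elster's or Winter's formalism) of the regress: each maximization problem can only be solved by first solving the one above it, so a naive solver never terminates unless a cut-off is imposed.

```python
def optimal_information(level, max_depth=None):
    """Hypothetical solver for the level-th maximization problem in Figure 1.

    Level 1: how much information to collect for the original decision.
    Level n: how much information to collect in order to solve level n - 1.
    Without a cut-off (max_depth) the recursion never bottoms out.
    """
    if max_depth is not None and level > max_depth:
        # The cut-off point where "calculation stops and you simply have to
        # make an unsupported choice" (Ulysses and the Sirens, p. 59).
        return "unsupported guess"
    # Solving this level requires first solving the next level up.
    answer_above = optimal_information(level + 1, max_depth)
    return f"amount chosen at level {level}, given: {answer_above}"

# With a cut-off the chain terminates; without one it recurses forever
# (in practice, a RecursionError).
print(optimal_information(1, max_depth=4))
```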

       

5.2 The first counterargument

Imagine the following reply: At any point in time you simply have to base your decision on what you know at that time. This includes the decision about how much information to gather. Previous experience in making decisions and gathering information may give you some basis for estimating how much information to collect (or it may not, but this is an empirical question). In any case, the rational decision is simply to choose the alternative - act or collect more information - that has the highest expected utility given your beliefs at time zero. The situation could be visualized as in Figure 2. At time t=0 you want to make a rational decision about what to do, that is, either to "act now" or to "collect more information."

Figure 2 [decision tree: at t=0 the agent chooses between "act" and "collect more information"; each "collect" node branches into the same two options at the next point in time]

       

When the problem is visualized in this way, the infinite regress problem is simply that the branching could go on forever. This, in turn, means that it may be impossible to work out the expected utility of collecting more information, and/or that the value of collecting more information may always be greater than that of "act". In practice, however, there is little reason to expect an infinite regress problem in the collection of information. Many decisions simply cannot be postponed forever, i.e. in the words of Holly Smith (1987) the decision is non-deferral. In fact, as she also notes, all decisions are non-deferral since all humans eventually die. As long as this is the case, it seems rational to me simply to start at 0 and base your choice of whether to collect more information on your beliefs about the net value of collecting more information. We avoid the infinite regress since it would not be rational to include the options after your death (or after the time limit) in the calculation.

Second, even if the decision could be postponed forever, the benefits of collecting more information might decrease, and as such the problem has a solution in the limit. Of course, the real question is not only whether the problem has a solution in the limit, but whether it is possible for the agent to know this and the precise trade-off that enables him to make the rational choice about whether to collect more information or not. It seems to me that the answer is simply to base your decision on the best possible beliefs about the value of more information at time t=0. Should I collect more information? Yes, if my beliefs (based on all my past experiences up to t=0) tell me that more information has a higher expected utility than acting now. Is it logically possible to estimate the expected net value of more information? Yes, but there may be large practical problems - as Elster points out. In theory there is no problem: you simply estimate the value of information based on historical experience. Of course, this is easier said than done, but sometimes you may compare with similar situations in the past (the classical view of probability), sometimes you may use a theory developed from past data to predict the value of more information, and, finally, some would argue that it is rational to base your decision simply on your subjective beliefs regarding the value of more information.
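The core of this reply can be put as a simple expected-utility comparison. The numbers below are purely illustrative (mine, not Elster's): at t=0 the agent compares acting now with acting after collecting more information, using whatever beliefs he happens to hold at t=0.

```python
# Illustrative beliefs held at t = 0 about a single decision (assumed numbers).
p_success_now = 0.60        # believed chance of a good outcome if I act now
p_success_informed = 0.75   # believed chance after collecting more information
payoff_good, payoff_bad = 100.0, 0.0
cost_of_information = 10.0  # time and money spent collecting

eu_act_now = p_success_now * payoff_good + (1 - p_success_now) * payoff_bad
eu_collect = (p_success_informed * payoff_good
              + (1 - p_success_informed) * payoff_bad
              - cost_of_information)

# The rational choice at t = 0 is simply the branch with the higher expected
# utility given the beliefs available at t = 0; no further regress is needed.
print("act now:", round(eu_act_now, 2))   # about 60
print("collect:", round(eu_collect, 2))   # about 65
print("decision:", "collect more information" if eu_collect > eu_act_now else "act now")
```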

So, the only way the infinite regress in the collection of information could get off the ground (if the visualization in Figure 1 is correct) would be if we have an agent who is immortal (or acting on behalf of something which is immortal), the decision can be postponed forever, and the value of information does not eventually converge. A problem based on these assumptions does not appear very significant in the real world.

       

5.3 Resurrecting the problem

There is, however, a third twist to the argument. Consider the visualization presented in Figure 3. [This interpretation is inspired by, but not equal to, Lipman (1991).] Here the problem at t=0 is that the set of possible actions is infinite. The choice is not only between "act now" and "collect more information", since "collect more information" is really a general category which includes an infinite set of alternatives. Hence, at t=0, one option is to act right away; another is to collect information directly relevant to the problem; a third option is to collect information about how much information you should collect. Fourth, you may collect information about how much information to collect before you decide how much information you are going to collect. And so we could go on forever.

      Figure 3
      
      Possible alternatives at time 0:
      1. Act
      2. Collect information
      3. Collect information about how much information you should collect
      4. Collect information about how much information you should collect to decide how much information to collect.
      ...
      ...
      

       

      If the problem is visualized in this way it is less obvious that the non-deferral of decisions can solve the problem. Among all the feasible alternatives at t=0 we want to choose the one that has the highest expected utility. If the set of feasible actions is not well defined (i.e. it is infinite), then we do not know for certain whether some alternative "far down" would be of higher expected value.

       

5.4 The death of the resurrected argument: One weak and one strong counterargument

One possible counterargument could be that it is not feasible (given limitations in the human mind) to go very deep. For instance, Lipman's (1991: 1112) "solution" of the infinite regress involves a restriction on the feasible set of computations which, in his words, "can be viewed as a restriction on the complexity of the calculation the agent can carry out." Most people are not able to go beyond three or at most four levels. Even experts in strategic thinking cannot go further than about seven. I need to think more about this, because information about information about information may not be comparable to "I know that you know that I know." Imagine the case of buying a house. Most people want to collect information about the house. Some also collect information about what kind of information (and how much?) they should collect (e.g. books about how to collect information before buying a house). We could easily imagine information about this information, e.g. a magazine that reviews several books about how to collect information (but can we find information about how many books to read before determining how much information to collect?). This is three levels deep and it is still not too difficult to imagine. Maybe information about information about information is easier to imagine than "I know that he knows that I know"? On the other hand, we seldom find that the regress goes beyond three or four levels (empirically speaking). This might indicate either that this information is not so valuable, or that it is difficult to utilize it given our cognitive limitations (which in turn makes the information less valuable). In any case - the argument could be - being a perfectly calculating smart robot with no limitations is not the definition of rationality. Rationality is doing the best we can within the set of feasible options. How convincing is this argument?

I am unsure, but I do not think we should give the argument about limited human cognitive abilities much weight in the current context. The key question in this paper is whether it is logically possible to collect an optimal amount of information. "Logically possible" may be interpreted to mean "is it feasible given unavoidable constraints?" The reason I am reluctant to use "limited human cognitive abilities" as an argument against the infinite regress argument is that we may overcome at least some of our cognitive weaknesses (i.e. they are not unavoidable). And, as Savage (1967) argued, we want to use the theory of rationality to police our own decisions - as a tool to find the best possible decision. There is a question of degrees here, but including human limitations makes it too easy to label actions rational, reduces the usefulness of the theory as a guide, and - finally - it seems to me a case of mislabeling to argue that human cognitive weaknesses represent a logical problem. In sum, I do not want to use this argument against the infinite regress problem.

There is, however, a second possible argument against the infinite regress in Figure 3. It is the same argument that was used to "solve" the problem as visualized in Figure 2. That is, if the decision is non-deferral, the set of relevant alternatives is also constrained. True, one could always choose to collect information at some very deep level at time t=0, but as long as we know that time is limited, the value of doing so is zero, since after collecting this information we have to go through all the other levels before we finally make a decision. After collecting information about how much information to collect, we have to go out and collect the information (though the deeper level might tell us to collect zero at the more immediate level). Since this process is time-consuming, time constraints limit the depth of the feasible set that needs to be considered.

       

5.5 Sub-conclusion

    I have so far tried to understand Elster's argument on the impossibility of collecting an optimal amount of information because of the infinite regress problem. My conclusion is that the argument fails to point out a significant problem in rational choice theory. Empirically the conditions under which it may arise are very restrictive. And I do not think it constitutes a logical proof against the very possibility of choosing an optimal amount of information. I want to note, however, that I have only discussed this problem in the context of how much information to gather. There are at least two other categories of infinite regress problems that I have not discussed. That is, first, to decide how to decide. And, second, to form beliefs using a fixed set of information (for instance, the problems involved in reasoning like "I know that you know that I know ..."). There may be significant problems here for rational choice theory, but that was not the topic of the section above. Assuming these two problems have been solved, I asked whether there was a problem of infinite regress in the collection of information and my conclusion was negative.

     

     

6. Elster on the problems of estimation

6.1 Introduction

The argument that there is no logical infinite regress problem making it impossible to collect an optimal amount of information does not imply that doing so is empirically possible. For instance, when we are in a unique situation we cannot determine the value of information from historical experience of similar situations, and hence there is (on the classical view of probability) no rational basis for estimating the value of information. I have labeled these problems the estimation problem, and I have characterized it as Elster's second main argument against the possibility of collecting an optimal amount of information. As argued in chapters one and two, there is a shift towards this line of argument after 1985. In that article, and later, Elster does not use the term "infinite regress" and he does not quote S.G. Winter. Instead, the argument focuses on the problems involved in the formation of probabilities. The purpose of this section is to examine this view more closely.

       

6.2 What is the argument?

Elster's general position is that "beliefs are indeterminate when the evidence is insufficient to justify a judgment about the likelihood of the various outcomes of action. This can happen in two main ways: through uncertainty, especially about the future, and through strategic action" (Nuts and Bolts, p. 33). More specifically, the following two quotations illustrate some of the causes of the problem according to Elster:

      Deciding how much evidence to collect can be tricky. If the situation is highly stereotyped, as medical diagnosis is we know pretty well the costs and benefits of additional information. In situations that are unique, novel and urgent, like fighting a battle or helping the victim in a car accident, both costs and benefits are highly uncertain ... (Nuts and Bolts, p. 35, my emphasis)

      In many everyday decisions, however, not to speak of military or business decisions, a combination of factors conspire to pull the lower and upper bounds [on how much information it would be rational to collect] apart from one another. The situation is novel, so that past experience is of limited help. It is changing rapidly, so that information runs the risk of becoming obsolete. If the decision is urgent and important, one may expect both the benefits and the opportunity costs of information-collecting to be high, but this is not to say that one can estimate the relevant marginal equalities. (Elster 1985, p. 70, my emphasis)

      To impose some order on the following discussion, I want to make a distinction between three types of probability, three types of problems and three types of implications.

      On probability, we may follow Elster (ETC, p. 195-199) and distinguish between the following concepts of probability according to their source: objective probabilities (using relative frequency as source), theoretical probability (the source of the estimate is a theory such as a weather prediction), and subjective probability (degrees of belief as measured by willingness to make bets on the belief).

As for the three problems, I want to make a conceptual distinction between non-existent probabilities, weak (but unbiased) probabilities and biased probabilities. Elster seems to argue that both non-existent and weak probabilities represent indeterminacy (see the first quotation, NB p. 33), but I believe it is important to distinguish between the two, since the question in this chapter is whether it is impossible to form beliefs about the value of information.

Finally, I want to separate the following three implications related to the arguments about probabilities. First, the advice that uncertainty makes it rational to use the maximin strategy. Second, that uncertainty implies that it would be intellectually honest to use a strategy of randomization. Third, that uncertainty implies that we should not seek more information, since it is wasteful to spend resources learning the second decimal when we cannot know the first.

      Table 2: An overview of Elster's arguments about the problem of estimation and their implications

Probability concept | Problem | Cause | Implication (a) | Justification | Example
Objective | Non-existent probabilities | Brute and strategic uncertainty | Maximin (b) | Arrow and Hurwicz proof (best end result?) | Choice between fossil, nuclear and hydroelectric energy
Objective/Subjective | Weak probabilities | Brute and strategic uncertainty | Randomization/Maximin | Intellectual honesty | Choice of career (forester or lawyer)
Subjective | Biased probabilities | Hot and cold cognitive mechanisms | Randomization? | Better end result | Investment choices?

(a) Implication in all cases: do not waste time seeking information when such information is impossible to find or only weakly significant.

      (b) Assuming we know the best/worst possible outcome.

       

6.2.1 The textual basis for the distinctions on implications

In Explaining Technical Change, Elster (1983, p. 185) argues that "there are two forms of uncertainty [risk and ignorance] that differ profoundly in their implications for action. [...]. To this analytical distinction there corresponds a distinction between two criteria for rational choice, which may be roughly expressed as 'maximize expected utility' and 'maximize minimal utility'." More specifically, the argument is that the choice between fossil, nuclear and hydroelectric energy sources should be determined not by trying to assign numerical probabilities to the outcomes, but by selecting the alternative which has the best worst consequence (maximin). To justify this principle, Elster appeals to a paper by Arrow and Hurwicz (1972). Hence, one implication of the impossibility of estimating probabilities - Elster claims - is that we should use maximin instead of maximizing expected utility (see also ETC p. 76).
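The difference between the two criteria can be seen on a toy payoff matrix. The numbers below are invented for illustration (they are not Elster's, nor Arrow and Hurwicz's): maximin looks only at each alternative's worst outcome, while expected utility needs exactly the numerical probabilities that Elster claims we may be unable to supply.

```python
# Invented payoffs for three energy sources under three states of the world.
payoffs = {
    "fossil":        [50, 40, 30],
    "nuclear":       [90, 70, -100],   # catastrophic worst case
    "hydroelectric": [70, 50, 20],
}

# Maximin: ignore probabilities, pick the alternative with the best worst outcome.
maximin_choice = max(payoffs, key=lambda a: min(payoffs[a]))

# Expected utility: requires numerical probabilities over the states.
probs = [0.5, 0.3, 0.2]
eu_choice = max(payoffs, key=lambda a: sum(p * x for p, x in zip(probs, payoffs[a])))

print("maximin picks:", maximin_choice)      # fossil (worst case 30)
print("expected utility picks:", eu_choice)  # hydroelectric (EU = 54)
```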

      In a different context, the argument is that intellectual honesty implies that we should use a strategy of randomization when we are in situations of ignorance:

      In my ignorance about the first decimal - whether my life will go better as a lawyer or as a forester - I look to the second decimal. Perhaps I opt for law school because that will make it easier for me to visit my parents. This way of deciding is as good as any - but it is not one that can be underwritten by rational choice as superior to, say, just tossing a coin. (SJ, p. 10)

The idea is followed up in a chapter discussing rules about child custody after a divorce, in which Elster argues that it may be better to toss a coin than to make an impossible attempt to determine which of the parents will be best for the child.

      A third implication of uncertainty, according to Elster, is that it is wasteful to collect a lot of information: "it is often more rational to admit ignorance than to strive for numerical quasi-precision in the measurement of belief" (US, 128).

In sum, Elster presents a number of arguments about our inability to form reliable estimates and the implications of this inability. Probabilities can be non-existent, weak or biased, and this implies that it may be rational to use maximin and/or randomization instead of maximization of expected utility, and that it is irrational to collect information about the second decimal in a problem when the first decimal is unknown. The arguments are summarized in Table 2 above.

       

6.3 Are the arguments valid?

To further demonstrate what Elster labels an irrational preference for symmetry, I have chosen to discuss the validity of Elster's arguments under three headings. First, how strong is the argument about the non-existence of probabilities (which involves a discussion of subjective and objective probability)? Second, how sound is the argument that randomization is preferable (since it is more honest) in situations of weak probabilities? Third, what is the relevance of biased probabilities to the indeterminacy of rational choice? Within these three headings I want to discuss both the validity of the arguments in isolation and their consistency with Elster's other arguments.

6.3.1 On the existence of probability estimates

The principle of maximization of expected utility presupposes that the agent has, or can form, probabilities about the possible consequences of an action. Hence, if it can be shown that these probabilities do not exist, it follows that MEU cannot be used in that situation. This means, as Elster argues, that uniqueness, novelty and fast-changing environments are problematic for expected utility theory, because we cannot use previous experience of similar situations to estimate the relevant probabilities. One possible counterargument is that Elster's arguments about uniqueness and the non-existence of probabilities are heavily dependent on the classical view of probability as relative frequency. If, for instance, we use the concept of theoretical probability, it seems perfectly possible to get reasonable estimates even from unique combinations of weather observations. Another, and in this context more significant, counterargument is that probabilities should be interpreted as measures of subjective uncertainty, in which case it is perfectly possible to speak about probability even in unique situations.

         

Subjective probabilities

Elster, of course, is aware of this alternative view of probability, but he argues against the use of subjective probabilities. The arguments are (rather crudely) summarized in the following list:

  1. It denies the possibility of genuine uncertainty (SG, p. 19-20)
  2. It leads to logical inconsistencies.
  3. "It presupposes that we are somehow able to integrate our various fragments of knowledge and arrive at a stable and intersubjectively valid conclusion" (ETC, p. 199)

On (1) and (2)

Does subjective probability deny genuine uncertainty? Bayesians argue that it is always possible to translate my uncertainty into probability statements about the world that can be acted upon. You simply elicit the subjective probabilities by forcing a person to choose between a given set of alternatives. For instance, suppose you had to choose between the following alternatives (A vs. B; the example is built on US, p. 129):

A: If you correctly guess the twenty-first decimal of pi you get $100, if you are wrong you get nothing.

B: If you draw a red ball from an urn of p per cent red balls and 100 - p per cent blue balls you get $100; if you draw a blue ball you get nothing.

If the person prefers A to B, one might infer that the person's subjective probability of being able to guess the decimal is higher than p per cent. One might then increase the percentage of red balls in alternative B and force the agent once again to choose between A and B. If we continue this process we will eventually come to a point where the agent prefers B to A (or end up with the conclusion that the agent is certain that he can estimate the twenty-first decimal of pi).
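The elicitation procedure just described can be written out as a search over the urn composition p. The sketch below is only an illustration of the procedure as described; the agent is a placeholder who behaves as if his subjective probability of guessing the decimal were 0.1 (an assumed figure).

```python
def prefers_bet_on_pi(p_red):
    """Placeholder agent: prefers A (betting on the decimal of pi) to B (the urn)
    whenever the urn's share of red balls is below his implicit probability of 0.1."""
    implicit_subjective_probability = 0.1   # assumed, for illustration only
    return p_red < implicit_subjective_probability

def elicit_probability(agent, tol=1e-4):
    """Search for the urn composition at which the agent switches from A to B;
    that switching point is read off as his subjective probability."""
    low, high = 0.0, 1.0
    while high - low > tol:
        mid = (low + high) / 2
        if agent(mid):   # still prefers A: his probability exceeds mid
            low = mid
        else:            # prefers B: his probability is below mid
            high = mid
    return (low + high) / 2

print(round(elicit_probability(prefers_bet_on_pi), 3))  # about 0.1
```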

I am not convinced by this argument for the non-existence of genuine uncertainty. First, it seems to deny (by assumption) the very question we want to examine; we do not allow the agent to respond "I don't know!" Second, it assumes that the answer reveals what we want it to reveal, since the inference from the choice to the agent's subjective uncertainty is only valid if the agent really tries to maximize his expected utility when faced with the two alternatives. If the agent instead simply selects his answer at random (or using some other criterion), then the inference from his answer to his subjective probability is not valid.

A Bayesian might argue that the problem could easily be solved by saying that total ignorance ("I don't know" in the example above) can simply be translated into the probability statement that "all outcomes are equally likely to happen." I find this an attractive proposal, but it is both conceptually and logically problematic. Conceptually, as Iversen (1984, p. 61) admits, "saying that each value is equally likely is to say something about the parameter and represents one step up from complete ignorance." As for the logical problem, imagine that you have to guess the value of X, and all you know is that X is somewhere between 0 and 5, inclusive (the example is from Iversen 1984, p. 61). If you use the assumption that complete ignorance means that all outcomes between 0 and 5 are equally likely, then the probability that X is less than 2.5 is 0.5:

P(X < 2.5) = 0.5

But, if you are ignorant about the value of X, you are also ignorant about the value of X². The possible range of X² is from 0 to 25 (since X goes from 0 to 5). This means that the probability that X² is less than 12.5 should be 0.5 (being ignorant about the value of X², we simply say that all outcomes between 0 and 25 are equally likely). In other words:

P(X² < 12.5) = 0.5

By taking the square root of both sides of the inequality above, we get:

P(X < 3.54) = 0.5

But this clearly contradicts the first statement that P(X < 2.5) = 0.5.
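The contradiction can also be checked numerically. The small simulation below (my own illustration) draws X uniformly on [0, 5] and shows that X² is then far from uniform on [0, 25]; the two "ignorance" distributions cannot both hold at once.

```python
import random

random.seed(0)
n = 100_000
xs = [random.uniform(0.0, 5.0) for _ in range(n)]

# If "complete ignorance about X" means X is uniform on [0, 5]...
p_x_below_2_5 = sum(x < 2.5 for x in xs) / n          # about 0.5

# ...then X squared is NOT uniform on [0, 25]:
p_x2_below_12_5 = sum(x * x < 12.5 for x in xs) / n   # about 0.71, not 0.5

print(p_x_below_2_5, p_x2_below_12_5)
```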

I am not sure how to respond to this problem. It certainly shows that complete ignorance is not the same as a uniform probability distribution. It does not show, however, that complete ignorance is something that really exists. The inconsistency is simply caused by the different specifications of the possible outcomes. One might "solve" the problem by arguing that the specification of possible outcomes also belongs to the subjective realm. That is, we must simply use the states we believe are possible in the calculation, and the proof that this is inconsistent with the results using a different set of possible states (more states) is not relevant (or does not prove irrationality). I cannot be blamed for not using a set of outcomes I believed did not exist (given that this belief itself was rational). I am slightly more worried about the conceptual step (going from "I don't know" to a probability distribution), but I am less willing than Elster to dismiss the argument that "insufficient reason" justifies a uniform distribution.

On (3)

The final argument is that subjective probabilities are not intersubjectively valid. I am unsure about what this means, but one interpretation might be that people given the same information might come up with different probabilities, and it sounds wrong to argue that both are equally valid as a basis for calculating what you should do. (The underlying argument seems to be that "two different estimates cannot both be equally rational since there is only one truth.") A Bayesian could make several responses. First, Bayesian and classical estimates may converge over time even if people have different initial priors (people starting with different beliefs about the proportion of red and blue balls in an urn will revise their beliefs, using Bayes' rule, as they are allowed to see the colour of selected balls).
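The convergence point can be illustrated with the urn example. In the standard Beta-Binomial model (the priors and data below are invented for illustration), two observers who start from quite different priors over the share of red balls end up with nearly identical estimates after seeing the same long sequence of draws.

```python
import random

random.seed(1)
true_share_red = 0.3
draws = [random.random() < true_share_red for _ in range(1000)]
reds, blues = sum(draws), len(draws) - sum(draws)

# Two observers with different prior beliefs about the share of red balls,
# each expressed as a Beta(a, b) distribution.
priors = {"optimist about red": (8, 2), "sceptic about red": (1, 9)}

for name, (a, b) in priors.items():
    # Bayes' rule for the Beta-Binomial model: add observed successes and failures.
    posterior_mean = (a + reds) / (a + b + reds + blues)
    print(f"{name}: prior mean {a / (a + b):.2f} -> posterior mean {posterior_mean:.3f}")

# Both posterior means end up close to the observed frequency (about 0.3),
# despite the very different starting points.
```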

Second, given differences in background knowledge, it is perfectly possible that two rational people come up with different probability estimates. People will differ in their background knowledge because they have encountered different information in their lives, and this is reflected in their prior beliefs. Rational updating based on the same new information may then result in two different beliefs, but neither need be more rational than the other (one is certainly closer to the truth than the other, but that is not the point; beliefs do not have to be true to be rational).

I believe that this second point also reveals a tension in Elster's argument. He demands that probabilities should be intersubjectively valid, but he also insists that rationality is a subjective notion. Consider the following quotation:

It is not possible, however, to give general optimality criteria for the gathering of information. One frequently made proposal - to collect information up to the point where the expected value of more evidence equals marginal cost of collecting it - fails because it does not respect the subjective character of rational choice (RC, p. 14, my emphasis)

The argument here is that an outside observer might be able to assess the value of information, but this does not help the person who tries to act rationally as long as he cannot estimate the value of information. The information has to be available to the person who is making the decisions. This is true, but it also suggests that probability is an inherently subjective notion. As argued, different persons have different information, and as such it is possible that they both rationally estimate probabilities that differ. To demand that probabilities be intersubjectively valid (if one by this means that everybody should arrive at the same estimate) is to impose an objective standard on something that is inherently subjective. [On reflection I am not sure that this is what Elster means by the phrase "intersubjectively valid."]

A third reply to the argument that subjective probabilities are not "intersubjectively valid" is that objective probabilities are no more intersubjectively valid than subjective probabilities. This is because there is no neutral criterion that determines which cases are "similar enough" to be used as a basis for calculating the objective probability. Some might argue that it was impossible to estimate the probability that the USSR would collapse (no similar events to use as a basis for calculation); others might argue that history provided cases of "similar empires" that could be used to work out the probability of collapse. Or, to use an example from Elster: "The doctor carrying out a medical diagnosis finds himself many times in the same situation" while "most people are unemployed only once, or, if more than once, under widely differing circumstance." (SJ, 16, emphasis in the original). For this argument to be "intersubjectively valid" we need a criterion of "sameness" and "different circumstances", and there is no such neutral criterion.

 

Risk dominates uncertainty and vice versa

Even if we grant the (dubious) point that only objective probabilities are valid as inputs in the decision-making process, Elster himself presents an argument that reduces the importance of uncertainty (ETC, p. 202). The argument is that risk dominates uncertainty when the two interact multiplicatively. For instance, assume you want to know the probability of successful use of stolen plutonium. For this to occur, three things must happen: first, somebody must try to steal the plutonium (assume the probability of this is p1), the break-in must be successful (p2), and they must manage to construct a bomb using the plutonium (p3). A safety expert worried about this may then multiply the three probabilities to get an estimate of how likely the "successful theft" scenario is: p1 * p2 * p3. As long as one of these is measurable, there is some basis for bounding the overall probability (the overall probability cannot be higher than any of its individual components). While this argument may reduce the problem of genuine ignorance, we should also be aware that uncertainty dominates risk when the two interact additively. This gives uncertainty, once again, an important role.
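A small calculation (with invented numbers) shows why a single measurable link is enough to bound the product: since every factor lies between 0 and 1, the overall probability can never exceed any one component.

```python
# Invented example: only p2 (a successful break-in) is taken to be measurable.
p2 = 0.05

# Whatever the unmeasurable factors p1 and p3 are, they lie in [0, 1],
# so p1 * p2 * p3 <= p2: the measurable link alone caps the scenario.
for p1 in (0.1, 0.5, 1.0):
    for p3 in (0.2, 1.0):
        assert p1 * p2 * p3 <= p2

print("the overall probability of successful use of stolen plutonium is at most", p2)
```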

           

Sub-conclusion on the existence of genuine uncertainty

I hope to have shown that Elster's argument about the non-existence of probability depends quite heavily on the classical view of probability as relative frequency. I also hope to have shown that the argument in favour of this view, and against the subjective view, is (at least) open to discussion. Beyond this I have no strong conclusions on whether the non-existence of probabilities is a serious problem. I tend to believe (rather weakly) that there is often some aspect of the problem that allows us to make some inferences about probabilities. For instance, in the problem about pi mentioned above, I would certainly choose A as long as the percentage of red balls was below 10, since there are only ten possible digits to choose from. In many cases it also seems reasonable to translate "I don't know" into "all alternatives are equally likely." Yet, I am also aware of the problems with the other proposals, and this is the reason for my guarded conclusion.

 

6.3.2 Weak probabilities and the argument for randomization

First of all, we must ask in what sense probabilities are weak. Since I want to distinguish between bias and weakness, I shall reserve the label weak for beliefs that are unbiased. Conceptually the distinction is important (although it is more difficult in practice!). For instance, we may form a belief about the colour composition of the balls in an urn based on a sample of three. This belief is not very strong, but - if the proper statistical formulas are applied - it is not biased.

As mentioned, Elster argues that some beliefs are too weak to justify inclusion in a rational calculation of net expected utility (and that for this reason we should refrain from choosing actions based on such calculations):

        In my ignorance about the first decimal - whether my life will go better as a lawyer or as a forester - I look to the second decimal. Perhaps I opt for law school because that will make it easier for me to visit my parents. This way of deciding is as good as any - but it is not one that can be underwritten by rational choice as superior to, say, just tossing a coin. (SJ, p. 10)

        I think the argument is weak. Assume you have to choose between the following two alternatives:

A: 10 000 USD with an estimated probability of 50.01 per cent (and 0 with probability 49.99 per cent)

B: 10 000 USD with an estimated probability of 49.99 per cent (and 0 with probability 50.01 per cent)

It seems to me that I would choose A even if I knew that the variance in my estimated probability was high. True, I have no strong preference between the alternatives, but why toss coins as long as I have an option that gives a higher expected payoff? Elster might reply that this choice is an example of hyperrationality ("defined as the failure to recognize the failure of rational choice theory to yield unique prescriptions or predictions," SJ, p. 17). I agree that it would be irrational to spend much time and money trying to estimate the second decimal if we were ignorant about the first in the case above, but that is not the question. We do not ask whether it is profitable to collect more information, but which choice you should make for a given set of information.

One might argue that the difference is small in the example above, but the true comparison is not simply between the difference in probability; it is the difference in expected utility when the probabilities are multiplied by the payoffs. In the case above the difference in expected value is USD 2; with a payoff of one million dollars the same difference in probability would be worth USD 200, which is not a negligible sum. The larger the payoff, the more significant the small difference in probability becomes. This argument also seems to reveal a tension in Elster's view: in the quotations at the beginning of this chapter he argues that both factors (weak probabilities and large payoffs or "importance") pull in the direction of coin-tossing, but it seems to me that the factors (at least in my example) pull in separate directions.
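The arithmetic behind this point can be checked directly; the calculation below simply restates the example and scales the payoff to show how the gap grows.

```python
p_a, p_b = 0.5001, 0.4999   # the two estimated probabilities from the example

for payoff in (10_000, 1_000_000):
    gap = (p_a - p_b) * payoff
    print(f"payoff {payoff:>9,} USD: expected-value gap = {gap:.0f} USD")
# payoff    10,000 USD: expected-value gap = 2 USD
# payoff 1,000,000 USD: expected-value gap = 200 USD
```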

        There is, however, an even more serious problem with Elster's suggestion. In the real world we will encounter many choices in which we may rely on probabilities of varying reliability. Sometimes we are very uncertain, sometimes we are more certain. Let us compare the following two rules for choosing what to do (decision-guides):

A: If your beliefs are very weak, you should (or, more weakly: might as well) toss a coin to decide the matter; if your beliefs are reliable, you should choose the alternative with the highest expected utility (Elster's strategy).

        B: Choose the action with the highest expected utility both in situations with weak and strong beliefs. (Bayesian strategy)

First of all, the fact that we have to make many choices means that the many small differences become large in aggregate. As a Bayesian says in response to the question of why we should choose B:

"... life is full of uncertainties - in a given week, you may buy insurance, bet on a football game, make a guess on an exam question, and so forth. As you add up the uncertainties of the events, the law of large numbers come into play, and the expected value determine your long-run gains" (Gelman 1998, p. 168).

Another problem with Elster's decision-rule is that before we make a decision we have to determine whether the situation is one of "enough" certainty to choose the action that maximizes expected utility, or whether we are so uncertain that we should randomize (or use something else, like maximin). Where is the limit, and is it not costly to examine the circumstances in this way every time we have to make a decision? Of course, we could go all the way and say that all our knowledge is always so weak that we should always toss coins. In this way we could avoid the choice problem that arises when using Elster's strategy. Sometimes Elster is attracted to this argument, but at other times he seems to want to have the cake and eat it too. For instance, he is sympathetic to Descartes when he claims that our knowledge is limited in a way that can be compared to being lost in a forest. Yet, when discussing child custody after a divorce he does not want to go all the way and argue that it might as well always be decided using randomization. In some "obvious" cases the court should not toss a coin. But then the court first has to examine whether the case is obvious, and this process is costly in the same way (though maybe not to the same extent) that a trial about child custody would be. In short, either decision-rule A has a problem in terms of deciding when to toss a coin, or one has to believe that we are so lost that we might as well always toss coins.
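The aggregation argument can be illustrated with a small simulation, entirely my own construction: over many independent binary choices whose probabilities are only slightly asymmetric, always picking the option with the higher estimated probability beats coin-tossing on average, just as the law-of-large-numbers reply suggests.

```python
import random

random.seed(2)
n_decisions = 100_000
payoff = 100.0

total_rule_a = 0.0   # rule A: toss a coin whenever beliefs are weak (here: always)
total_rule_b = 0.0   # rule B: always pick the option with the higher estimated probability

for _ in range(n_decisions):
    # Each decision: two options whose true success probabilities differ slightly.
    p_better = 0.5 + random.uniform(0.0, 0.02)
    p_worse = 1.0 - p_better

    chosen_b = p_better                                        # rule B's pick
    chosen_a = p_better if random.random() < 0.5 else p_worse  # rule A's coin toss

    total_rule_b += payoff * (random.random() < chosen_b)
    total_rule_a += payoff * (random.random() < chosen_a)

print("coin-tossing rule:", round(total_rule_a))
print("max-EU rule:      ", round(total_rule_b))
# The max-EU rule ends up ahead, and the gap keeps growing with the number of decisions.
```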

         

6.3.3 The relevance of biased probabilities

When discussing subjective beliefs (and beliefs in general), Elster often presents convincing arguments to the effect that beliefs are often formed by hot mechanisms (beliefs influenced by what you want to be the case) and cold cognitive mechanisms (wrong beliefs even when you do not have any strong preferences about the truth). The argument is also used when discussing the problems involved in collecting an optimal amount of information. For instance, he argues that the elicitation of subjective beliefs is subject to a mechanism called anchoring: if we start from a low probability (few red balls) in the example of eliciting subjective probabilities, the agent is more likely to end up with a low "subjective probability" than if we start from a high probability and go down (many red balls). In short, the procedure for measuring the belief affects the belief we find! Surely this is a sign that these subjective probabilities are unreliable and should not be used as inputs in decision-making.
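A crude toy model of the anchoring mechanism (my own construction, not taken from Elster or from the psychological literature) is an agent who adjusts only part of the way from the starting value of the elicitation procedure towards his underlying belief; the elicited value then depends on where the procedure starts.

```python
def elicited_probability(start, underlying_belief=0.3, adjustment=0.6):
    """Toy anchoring model: the agent moves only a fraction of the distance
    from the starting (anchor) value towards his underlying belief."""
    return start + adjustment * (underlying_belief - start)

# Starting the elicitation from a low versus a high share of red balls
# yields different "measured" subjective probabilities for the same agent.
print(round(elicited_probability(start=0.05), 2))  # about 0.20
print(round(elicited_probability(start=0.90), 2))  # about 0.54
```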

Although I find the topic of hot and cold belief-formation both interesting and important, it is not relevant in the present context. The main question in this paper is whether the principle of rationality yields a determinate answer, not whether people's actual behaviour conforms to the standards of rationality.

There is, however, room for a final comment about Elster's arguments that applies to all the previous situations and to the recommendation that agents should use maximin or randomization in situations of great uncertainty. It seems to me that this prescription (toss coins when you are very unsure) is itself subject to the problem it is meant to avoid. Since Elster argues that we sometimes have reliable probabilities, it follows that we have to decide whether to use maximin/randomization or to maximize expected utility. If the argument against the use of expected utility is that we tend to deceive ourselves, so that we cannot rely on our subjective probabilities, then one might also suspect that the agent deceives himself when making the choice about which procedure to use. To say that we sometimes should use maximin because we are biased is not very helpful if the same bias makes us exaggerate the reliability of the probabilities so that we will not choose maximin. This is another instance of the problem already mentioned, which arises when you do not go all the way and say that we should always use the maximin strategy.

 

6.4 Conclusion: Elster on the problem of estimation

Sometimes Elster argues that some people have good judgement (see e.g. SG, p. 16, ETC p. 87). It seems to me that this implicitly reveals that it is often possible to form rational beliefs about the value of information. If we really lived in a world in which we were lost in the forest, there would be no judgement - only good and bad luck. I am still unsure about the extent to which we are inherently "lost" (i.e. apart from our limited abilities), but I do think this section has demonstrated some weaknesses in the argument that the problems of estimation imply that it is often impossible to form rational estimates of the value of information.

 

7. Conclusion

Rational choice theory can be attacked for many reasons. However, after reviewing Elster's arguments, I do not think the claim that an infinite regress in the collection of information makes rational decisions logically impossible is a significant objection. As for the problems of estimation, I agree that these are significant, but they do not prove the impossibility of making a rational choice, and I am uncertain about the implications that follow (i.e. Elster's recommendation of randomization and maximin). Finally, I believe some of Elster's arguments on the issue are, if not contradictory, then at least in tension with each other.

 

References

(for all the chapters in the thesis, not just this one)

Akerlof, George and Janet Yellen (1985): The macro-economic consequences of near-rational rule-of-thumb behavior, Quarterly Journal of Economics, ??.

Backhouse, Roger E. (1995): Interpreting Macroeconomics: Explorations in the History of Macroeconomic Thought, London and New York: Routledge.

Baker, G. L. and J. P. Gollub (1990): Chaotic dynamics: An introduction, Cambridge: Cambridge University Press.

Baumol, William J. and Richard E. Quandt (1964): Rules of thumb and optimally imperfect decisions, American Economic Review, ??, 23-46.

Baumol, William J. and Jess Benhabib (1989): Chaos: Significance, mechanisms, and economic applications, Journal of Economic Perspectives, 3, 77-105.

Becker, Gary S. (1986/1976): "The Economic Approach to Human Behavior." In Jon Elster, (ed.), Rational Choice, Oxford: Basil Blackwell, 108-122.

Becker, Gary S. (1993): The Economic Way of Looking at Behavior, Journal of Political Economy, 101, 385-409.

Bikhchandani, Sushil, David Hirshleifer and Ivo Welch (1992): A theory of fads, fashion, custom, and cultural change as informational cascades, Journal of Political Economy, 100, 992-1026.

Bikhchandani, Sushil, David Hirshleifer and Ivo Welch (1998): Learning from the behaviour of others: Conformity, fads, and informational cascades, Journal of Economic Perspectives, 12, 151-170.

Binder, Michael and M. Hashem Pesaran (1998): Decision making in the presence of heterogeneous information and social interaction, International Economic Review, 39, 1027-1052.

Boland, L. A. (1981): On the futility of criticizing the neoclassical maximization hypothesis, American Economic Review, 71, 1031-6

Caskey, John (1985): Modeling the formation of price expectations: A Bayesian approach, American Economic Review, 75, 768-776.

Chavas, Jean-paul (1993): On the demand for information, Economic Modelling, 10, 398-407. (sure 407?)

Coats, A. W. (1976): "Economics and Psychology: The death of a research programme." In Spiro J. Latsis, (ed.), Methods and appraisal in economics, Cambridge: Cambridge University Press, 43-65.

Colander, David C. (1993): The macrofoundations of micro, Eastern Economic Journal, 19, 447-457.

Coleman, James S. (1984): Introducing social structure into economic analysis, American Economic Review (papers and proceedings), 74, 84-88.

Collard, David A. (1983): Pigou on expectations and the cycle, Economic Journal, 93, 411-414.

Collard, David A. (1996): Pigou and modern business cycle theory, Economic Journal, 106, 912-924.

Davidson, Paul (1982-83): Rational expectations: a fallacious foundation for studying crucial decision-making processes, Journal of Post Keynesian Economics, 5, 182-198?

Davidson, Paul (1991): Is probability theory relevant for uncertainty? A Post Keynesian Perspective, Journal of Economic Perspectives, 5, ??

Demsetz, Harold (1997): The primacy of economics: An explanation of the comparative success of economics in the social sciences, Economic Inquiry, 35, 1-11.

Dow, Sheila C. (1997): Mainstream economic methodology, Cambridge Journal of Economics, 21, 73-93.

Dow, Alexander and Sheila C. Dow (1985): "Animal Spirits and Rationality." In T. Lawson and H. Pesaran, (eds), Keynes' Economics: Methodological Issues. Armonk: NY: M. E. Sharpe Inc., 46-65.

Earl, Peter E. (1990): Economics and Psychology: A Survey, Economic Journal, 100, 718-755.

Elster, Jon (1982): "Rationality." In Guttorm Fløistad (ed.): Contemporary Philosophy. A new survey, The Hague, Boston, London: Martinus Nijhoff Publishers, 111-131.

Elster, Jon (1983): The crisis in economic theory (Review of R. Nelson and S. Winter (1982), An Evolutionary Theory of Economic Change, and J. Roemer (1982), A General Theory of Exploitation and Class), London Review of Books, 4 (9), 5-7.

Elster, Jon (1983): Explaining Technical Change, Cambridge: Cambridge University Press

Elster, Jon (1983). Sour Grapes: Studies in the subversion of rationality. Cambridge, England: Cambridge University Press.

Elster, Jon (1984, revised edition, first, 1979): Ulysses and the Sirens: Studies in rationality and irrationality, Cambridge: Cambridge University Press.

Elster, Jon (1985): "The nature and scope of rational-choice explanations." In Ernest LePore and Brian P. McLaughlin, (eds.), Actions and Events: Perspectives on the philosophy of Donald Davidson, Oxford: Blackwell, 60-72.

Elster, Jon (1989): Nuts and Bolts for the Social Sciences, Cambridge: Cambridge University Press.

Elster, Jon (1989): Solomonic Judgements: Studies in the limitations of rationality, Cambridge: Cambridge University Press.

Elster, Jon (1993): Some Unresolved problems in the theory of rational behaviour, Acta Sociologica, 36, 179-190.

Elster, Jon (1998): "A Plea for Mechanisms." In Peter Hedström and Richard Swedberg, (eds.), Social Mechanisms, Cambridge: Cambridge University Press.

Elster Jon (ed.) (1986): Rational Choice, Oxford: Basil Blackwell.

Fisher, Franklin M. (19??): "Adjustment processes and stability." In ??? The New Palgrave, 26-29.

Fishman, George S. (1964??): Price behavior under alternative forms of price expectations, Quarterly Journal of Economics, ??, 281-298.

Forget, Evelyn L. (1990): John Stuart Mill's business cycle, History of Political Economy, 22, 629-642.

Friedman, M. (1953): "The methodology of positive economics." In Friedman, M. (ed.), Essays in Positive Economics. Chicago: University of Chicago Press (in Hollis ...)

Frydman, Roman and Edmund S. Phelps (eds.) (1982): Individual forecasting and aggregate outcomes: "Rational expectations examined", Cambridge, New York and Sydney: Cambridge University Press.

Garber, Peter M. (1990): Famous First Bubbles, Journal of Economic Perspectives, 4, 35-54.

Gelman, Andrew (1998): Some Class-Participation Demonstrations for Decision Theory and Bayesian Statistics, The American Statistician, 52 (2), 167-174.

Gilbert, Christopher L. (1986), Professor Hendry's Econometric Methodology, Oxford Bulletin of Economics and Statistics, 48: 283-307.

Grandmont, Jean-Michel (1998): Expectations formation and the stability of large socio-economic systems, Econometrica, 66, 741-781.

Grossman, Sanford and Joseph Stiglitz (1980): The impossibility of informationally efficient markets, American Economic Review, 393-408.

Hacking, Ian (1967): Slightly more realistic personal probability, Philosophy of Science, 34, 311-325.

Hacking, Ian (1990): "Probability." In J. Eatwell, M. Milgate and P. Newman (eds.), The New Palgrave: Utility and Probability. London: W.W. Norton, 163-177.

Hahn, F.H. (1973): On the notion of equilibrium in economics, Cambridge: Cambridge University Press.

Hampton, Jean (1994): The failure of expected utility theory as a theory of reason, Economics and Philosophy, 10, 195-242.

Hardin, Russell (19??): "Determinacy and Rational Choice." In Reinhard Selten (ed.): Rational Interaction: Essays in honor of John C. Harsanyi, ??:??, 191-200.

Hardin, Russell (1995): One for All: The Logic of Group Conflict, New Jersey: Princeton University Press.

Hausman, Daniel M. (1992): The Inexact and Separate Science of Economics, Cambridge: Cambridge University Press.

Hausman, Daniel M. (1998): Problems with realism in Economics, Economics and Philosophy, 14, 185-213.

Haussman, John P. (1992): Market efficiency and inefficiency in rational expectations equilibria, Journal of Economic Dynamics and Control, 16, 655-680.

Hargreaves Heap, Shaun, Martin Hollis, Bruce Lyons, Robert Sugden and Albert Weale (1992): The Theory of Choice: A Critical Guide. Oxford: Blackwell.

Heap, Shaun Hargreaves (1992): "Rationality." In Hargreaves Heap et al. (eds.), The Theory of Choice: A Critical Guide. Oxford: Blackwell.

High, Jack (19??): Knowledge, maximizing, and conjecture: a critical analysis of search theory, Journal of Post Keynesian Economics, ??, 252-264.

Hirshleifer, Jack (1985): The expanding domain of economics, American Economic Review, 75, 53-68.

Hirschman, A. O. (1984), Against Parsimony: Three ways of complicating some categories of economic discourse, American Economic Review, 74 (papers and proceedings), 89-96

Hodgson, Geoffrey M. (1994): Optimization and evolution: Winter's critique of Friedman revisited, Cambridge Journal of Economics, 18, 413-430.

Hodgson, Geoffrey M. (1998): The ubiquity of habits and rules, Cambridge Journal of Economics, 21, 663-684.

Iversen, Gudmund R. (1984): Bayesian statistical inference, Beverly Hills, CA: SAGE Publications Inc. (Series: Quantitative Applications in the Social Sciences, no. 43 (07-043)).

Kaish, Stanley (1986): "Behavioral economics in the theory of the business cycle." In Benjamin Gilad and Stanley Kaish, (eds.), Handbook of behavioral economics (volume B), Greenwich, Connecticut: JAI Press Inc., 31-49

Kahneman, Daniel; Slovic, Paul; Tversky, Amos (1982): Judgment under Uncertainty: Heuristics and biases, Cambridge: Cambridge University Press.

Kaldor, Nicholas (1972): On the irrelevance of equilibrium economics, Economic Journal, 82, 1237-1255.

Kamien, Morton I., Yari Tauman and Shmuel Zamir (1990): On the value of information in strategic conflict, Games and Economic Behaviour, 2, 129-153.

Kelsey, David and John Quiggin (1992): Theories of choice under ignorance and uncertainty, Journal of Economic Surveys, 6, 133-153.

Keynes, John M. (1936), The General Theory of Employment, Interest, and Money, New York: Harcourt, Brace & World. (My version, reprint from 1997, Amherst, New York: Prometheus Books.)

Kirman, Alan (1989): The intrinsic limits of modern economic theory: The emperor has no clothes, Economic Journal, 99, 126-139.

Kirman, Alan P. (1992): Whom or what does the representative agent represent?, Journal of Economic Perspectives, 6, 117-136.

Kirschenbaum, Susan S. (1992): Influence of experience on information-gathering strategies, Journal of Applied Psychology, 77, 343-352.

Koppl, Roger (1991): Retrospectives: Animal Spirits, Journal of Economic Perspectives, 5, 203-210.

Kreps, David M. (19??): Economics - The current position, Daedalus, ??, 59-85.

Larson, Bernt (1968): Bayesian Strategies and Human Information Seeking, Lund: Lund CWK Gleerup (Lund studies in psychology and education).

Lawson, Tony (1988): Probability and Uncertainty in economic analysis, Journal of Post Keynesian Economics, 11, 38-65.

Lewis, Alain A. (1992): On Turing degrees of Walrasian models and a general impossibility result in the theory of decision-making, Mathematical Social Sciences, 2?, 141-171.

Lipman, B. (1991): How to decide how to decide how to...: Modeling limited rationality, Econometrica 59, 1105-1125.

Matthews, R. C. O. (1984): Animal spirits, Proceedings of the British Academy, 70, 209-229.

Mill, John S. (1826): Paper currency and commercial distress. Reprinted in The Collected Works of J.S. Mill, edited by J.M. Robson, Toronto, 4: 71-123.

Mirowski, Philip (1991): The when, the how and the why of mathematical expression in the history of economic analysis, Journal of Economic Perspectives, 5, 145-157.

Mongin, Philippe and Bernard Walliser (1986): "Infinite regression in the optimizing theory of decision." In Bertrand E. Munier, (ed.), Risk, decision and rationality, Dordrecht: D. Reidel Publishing Company, 435-457.

Mullineux, Andy and WenSheng Peng (19??): Nonlinear business cycle modelling, Journal of Economic Surveys, 7, 41-83.

Munier, Bertrand R. (198?): "A guide to decision-making under uncertainty." In Bertrand R. Munier (ed.), Risk, Uncertainty and Rationality ...

Nelson, Richard R. and Sidney G. Winter (1964): A case study in the economics of information and coordination: The weather forecasting system, Quarterly Journal of Economics, 78, 420-441.

Nelson, Richard R. and Sidney G. Winter (1982): An Evolutionary Theory of Economic Change, Cambridge, MA: The Belknap Press of Harvard University Press.

Nozick, Robert (1993): The Nature of Rationality, Princeton: Princeton University Press.

Pigou, A. C. (1927) Industrial Fluctuations, London: Macmillan.

Plosser, Charles I. (1989): Understanding Real Business Cycles, Journal of Economic Perspectives, 3, 51-77.

Rabin, Matthew (1998): Psychology and Economics, Journal of Economic Literature, 36, 11-46.

Radner, Roy (1996): Bounded rationality, indeterminacy, and the theory of the firm, Economic Journal 106, 1360-1373.

Rosser, J. Barkley Jr. (19??): "Chaos theory and rationality in economics." In L. Douglas Kiel and Euel Elliot (eds.), Chaos Theory in the Social Sciences: Foundations and Applications, Ann Arbor: University of Michigan Press, 199-213.

Rothschild, M. (1974): Searching for the lowest price when the distribution of prices is unknown, Journal of Political Economy, 82, 689-711. (reprinted in ..)

Sargent, Thomas J. (1991): Equilibrium with signal extraction from endogenous variables, Journal of Economic Dynamics and Control, 15, 245-273.

Savage, Leonard J. (1967): Difficulties in the theory of personal probability, Philosophy of Science, 34, 305-310.

Scheinkman, José A. (1990): Nonlinearities in economic dynamics, Economic Journal, 100, 33-48.

Schelling, Thomas C. (1978): Micromotives and Macrobehaviour, New York, N.Y.: W. W. Norton & Company.

Schoemaker, Paul J. (1982): The expected utility model: its variants, purposes, evidence and limitations, Journal of Economic Literature, 20, 529-563.

Schmidt, Frank L. and John E. Hunter (1998): The validity and utility of selection methods in personnel psychology: practical and theoretical implications of 85 years of research findings, Psychological Bulletin, 124, 262-274.

Sen, A. K. (1982): "Rational Fools: a critique of the behavioural foundations of economic theory." In Sen, A. K. (ed.), Choice, Welfare and Measurement, Oxford: Blackwell (also in Beyond Self-Interest?)

Shapiro, Ian and Donald P. Green (1994): Pathologies of Rational Choice Theory: A Critique of Applications in Political Science, London/New Haven: Yale University Press.

Shulman, Steven (1997): What's so rational about rational expectations? Hyperrationality and the logical limits to neoclassicism, Journal of Post Keynesian Economics, 20, 135-148.

Simon, Herbert A. (1979): Rational Decision-making in Business Organizations, American Economic Review, 69, 493-513.

Simon, Herbert A. (198?): "Behavioural economics." In ???: (eds.), The New Palgrave ...221-225.

Smedslund, Jan (1990) "A critique of Tversky and Kahneman's distinction between fallacy and misunderstanding," Scandinavian Journal of Psychology, 31, 110-120

Smith, Holly (1987): "Deciding how to decide: Is there a regress problem?" In Michael Bacharach and Susan Hurley, (eds.), Foundations of decision theory, Oxford: Blackwell, 194-219.

Solow, Robert M. (19??): How did economics get that way and what way did it get?, Daedalus, ??, 39-58.

Solow, Robert M. (1980): On theories of unemployment, American Economic Review, 70, 1-11.

Stigler, George J. (1961): The Economics of Information, Journal of Political Economy, 69, 213-225.

Stigler, George J. (1984): Economics - The Imperial Science, Scandinavian Journal of Economics, 86, 301-313.

Taylor, Michael (1995) "When Rationality Fails." In Jeffrey Friedman (ed.) The Rational Choice Controversy. New Haven and London: Yale University Press

Vassilakis, Spyros (1992): Some economic applications of Scott domains, Mathematical Social Sciences, 2?, 173-208.

Winter, Sidney G. (1964): Economic "natural selection" and the theory of the firm, Yale Economic Essays, 4, 225-272.

Winter, Sidney G. (1971): Satisficing, selection, and the innovating remnant, Quarterly Journal of Economics, 85, 237-261.

Winter, Sidney G. (1986): "The research program of the behavioral theory of the firm: orthodox critique and evolutionary perspective." In Benjamin Gilad and Stanley Kaish, (eds.), Handbook of behavioral economics (volume A), Greenwich, Connecticut: JAI Press Inc., 151-188.

Winter, Sidney G. (1987): "Comments on Arrow and Lucas." In Robin M. Hogarth and Melvin W. Reder (eds.), Rational Choice: The contrast between economics and psychology, Chicago and London: The University of Chicago Press, 243-250.

Zarnowitz, V. (1985): Recent work on business cycles in historical perspective, Journal of Economic Literature, 23, 523-580.

Zeckhauser, Richard (1987): "Comments: Behavioral versus Rational Economics: What you see is what you conquer." In Robin M. Hogarth and Melvin W. Reder (eds.), Rational Choice: The contrast between economics and psychology, Chicago and London: The University of Chicago Press, 251-265.