Hope springs eternal: learning and the stability of cooperation in short horizon repeated games

Publication type:
Article
Author(s):
Conlon, JR
Affiliation:
University of Mississippi
Journal:
JOURNAL OF ECONOMIC THEORY
ISSN:
0022-0531
DOI:
10.1016/S0022-0531(03)00073-5
Publication date:
2003
Pages:
35-65
Keywords:
Rational learning; Speed of learning; Finitely repeated prisoners' dilemmas; Reputation; Cooperation; Recurrent games
Abstract:
This paper considers learning rates in finitely repeated prisoners' dilemmas. If players think their opponents might be relatively cooperative (e.g., tit-for-tat or grim types), they will cooperate in finitely repeated prisoners' dilemmas (see Kreps et al., J. Econom. Theory 27 (1982) 245). However, if there are actually no cooperative types, players will eventually learn this and cooperation will break down. This paper shows that this learning is extremely slow, so it will take an extremely long time for cooperation to break down. Thus, suppose the world is either good or bad. The probability of a grim type is delta > 0 if the world is good, and zero if the world is bad. Successive generations pair up to play finitely repeated prisoners' dilemmas. Players observe play in previous generations and use Bayes' rule to update their prior, pi, that the world is good. We show that, if the world is really bad, then pi falls by O(delta^2 log(1/delta)) per generation on average. Thus, if delta is small, there is less cooperation when the world is good, but cooperation may become more stable. For a representative 19-period repeated prisoners' dilemma, beliefs fall one percentage point on average after a thousand generations. To derive these learning rates, we must refine existing results on the sensitivity of repeated games to Kreps et al. (1982)-type perturbations. Specifically, we show cooperation is possible in perturbed prisoners' dilemmas repeated O(log(1/delta)) times. This improves significantly on the O(1/delta) results in previous work. The paper thus provides two new reasons why cooperation tends to be stable, even in short-horizon repeated games. (C) 2003 Elsevier Science (USA). All rights reserved.
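
The updating step described in the abstract can be illustrated with a small, self-contained sketch. The Python snippet below is not the paper's model: the value of delta, the initial prior pi, and the per-generation likelihoods are hypothetical placeholders, with the likelihood gap simply set to delta^2 * log(1/delta) to mimic the headline rate. It only shows how Bayes' rule updates the belief that the world is good, and why a likelihood gap of that order makes the belief decline very slowly per generation.

    import math

    def bayes_update(pi, lik_good, lik_bad):
        # One application of Bayes' rule to the prior that the world is good.
        return pi * lik_good / (pi * lik_good + (1.0 - pi) * lik_bad)

    delta = 0.01   # hypothetical probability of a grim type in the good world
    pi = 0.5       # hypothetical initial belief that the world is good

    # Hypothetical likelihoods: the observed history is assumed slightly less
    # likely in the good world, by a margin of order delta^2 * log(1/delta),
    # matching the order of the learning rate stated in the abstract.
    gap = delta ** 2 * math.log(1.0 / delta)
    lik_good, lik_bad = 1.0 - gap, 1.0

    for generation in range(1000):
        pi = bayes_update(pi, lik_good, lik_bad)

    # The per-generation decline is roughly pi * (1 - pi) * gap,
    # which is tiny when delta is small.
    print(f"belief after 1000 generations: {pi:.4f}")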