The Evolution of Cooperation
interpret the associate's actions is non-trivial (e.g. recognizing the degree
of cooperation shown)<ref>
{{cite journal |last1=Prechelt |first1=Lutz |title=INCA: A multi-choice model of cooperation under restricted communication |journal=Biosystems |date=1996 |volume=37 |issue=1–2 |pages=127–134 |doi=10.1016/0303-2647(95)01549-3 |bibcode=1996BiSys..37..127P |ref=INCA |url=https://publikationen.bibliothek.kit.edu/159797/1178 |url-access=subscription }}</ref>
== Subsequent work ==
Axelrod considers his subsequent book, ''[[The Complexity of Cooperation]]'',<ref>
{{Harvnb|Axelrod|1997}}.</ref>
to be a sequel to ''The Evolution of Cooperation''. Other work on the evolution of cooperation has expanded to cover [[prosocial behavior]] generally,<ref>
{{Harvnb|Boyd|2006}}; {{Harvnb|Bowles|2006}}.</ref>
and in religion,<ref>
{{Harvnb|Axelrod|Dion|1988}};
{{Harvnb|Hoffman|2000}} categorizes and summarizes over 50 studies</ref>
and the use of other games such as the [[Public goods game|Public Goods]] and [[Ultimatum game|Ultimatum]] games to explore deep-seated notions of fairness and fair play.<ref>
{{Harvnb|Nowak|Page|Sigmund|2000}}; {{Harvnb|Sigmund|Fehr|Nowak|2002}}.</ref>
It has also been used to challenge the rational and self-regarding "[[economic man]]" model of economics,<ref>
{{Harvnb|Camerer|Fehr|2006}}.</ref>
and as a basis for replacing Darwinian [[sexual selection]] theory with a theory of [[social selection]].<ref>
{{Harvnb|Roughgarden|Oishi|Akcay|2006}}.</ref>
When an IPD tournament introduces noise (errors or misunderstandings), TFT strategies can get trapped into a long string of retaliatory defections, thereby depressing their score. TFT also tolerates "ALL C" (always cooperate) strategies, which then give an opening to exploiters.<ref>
{{Harvtxt|Axelrod|1984|pp=136–138}} has some interesting comments on the need to suppress universal cooperators. See also a similar theme in Piers Anthony's novel ''[[Macroscope (novel by Piers Anthony)|Macroscope]]''.</ref>
In 1992 [[Martin Nowak]] and Karl Sigmund demonstrated a strategy called Pavlov (or "win–stay, lose–shift") that does better in these circumstances.<ref>
{{Harvnb|Nowak|Sigmund|1992}}; see also {{Harvnb|Milinski|1993}}.</ref>
Pavlov looks at its own prior move as well as the other player's move. If the payoff was R or P (see "Prisoner's Dilemma", above) it cooperates; if S or T it defects.
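The Pavlov rule can be sketched in a few lines. This is an illustrative sketch, not code from the book or from the Nowak–Sigmund paper; it uses the standard Prisoner's Dilemma payoffs T=5, R=3, P=1, S=0:

```python
# Illustrative sketch of Pavlov ("win-stay, lose-shift"); payoffs to the row
# player: T=5 (temptation), R=3 (reward), P=1 (punishment), S=0 (sucker).
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def pavlov(my_last, other_last):
    """Cooperate after payoffs R or P; defect after S or T (cooperate on move 1)."""
    if my_last is None:
        return "C"
    return "C" if PAYOFF[(my_last, other_last)] in (3, 1) else "D"
```

Note that cooperating after R and defecting after T amounts to "staying" with a move that just scored well, while cooperating after P and defecting after S amounts to "shifting" after a poor score, hence "win–stay, lose–shift". In particular, after mutual defection (payoff P) Pavlov returns to cooperation, which lets two Pavlov players recover from a noisy defection.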
In a 2006 paper Nowak listed five mechanisms by which [[natural selection]] can lead to cooperation.<ref>
{{Harvnb|Nowak|2006}}.</ref>
In addition to kin selection and direct reciprocity, he shows that:
In small populations or groups there is the possibility that indirect reciprocity (reputation) can interact with direct reciprocity (e.g. tit for tat) with neither strategy dominating the other.<ref>Phelps, S., Nevarez, G. & Howes, A., 2009. The effect of group size and frequency of encounter on the evolution of cooperation. In LNCS, Volume 5778, ECAL 2009, Advances in Artificial Life: Darwin meets Von Neumann. Budapest: Springer, pp. 37–44. [https://doi.org/10.1007%2F978-3-642-21314-4_5].</ref> The interactions between these strategies can give rise to dynamic [[social networks]] which exhibit some of the properties observed in empirical networks.<ref>{{cite journal | last1 = Phelps | first1 = S | year = 2012 | title = Emergence of social networks via direct and indirect reciprocity | url = http://repository.essex.ac.uk/3900/1/trust.pdf | journal = Autonomous Agents and Multi-Agent Systems | doi = 10.1007/s10458-012-9207-8 | s2cid = 1337854 }}</ref> If network structure and choices in the Prisoner's dilemma co-evolve, then cooperation can survive. In the resulting networks, cooperators will be more centrally located than defectors, who will tend to be in the periphery of the network.{{sfn|Fosco|Mengel|2011}}
In "The [[Coevolution]] of [[Parochial altruism|Parochial Altruism]] and War", Jung-Kyoo Choi and Samuel Bowles summarize:
<!-- Possible issue here: should quoted material retain em-dashes, in accord with the editing style of the original paper, or use spaced en-dashes, for consistency with this article? I haven't found any style guidelines on this (yet), but am inclined to retain the style of the original. Note that Wiki style says to avoid using more than one pair of em-dashes in a sentence ([[MOS:MDASH]]) – does this mean the quote has to be rewritten? It makes more sense that the Wiki rule should not be required of quotes from other sources. -->
Consideration of the mechanisms through which learning from the social environment occurs is pivotal in studies of evolution. In the context of this discussion, learning rules, specifically conformism and payoff-dependent imitation, are not arbitrarily predetermined but are biologically selected. Behavioral strategies, which include cooperation, defection, and cooperation coupled with punishment, are chosen in alignment with the agent's prevailing learning rule. Simulations of the model under conditions approximating those experienced by early hominids reveal that conformism can evolve even when individuals are solely faced with a cooperative dilemma, contrary to previous assertions. Moreover, the incorporation of conformists significantly amplifies the group size within which cooperation can be sustained. These model results demonstrate robustness, maintaining validity even under conditions of high migration rates and infrequent intergroup conflicts.<ref>{{Cite journal |last1=Guzmán |first1=R. A. |last2=Rodríguez-Sickert |first2=C. |last3=Rowthorn |first3=R. |year=2007 |title=When in Rome, do as the Romans do: the coevolution of altruistic punishment, conformist learning, and cooperation |journal=Evolution and Human Behavior |volume=28 |issue=2 |pages=112–117 |doi=10.1016/j.evolhumbehav.2006.08.002 |bibcode=2007EHumB..28..112A |url=https://mpra.ub.uni-muenchen.de/2037/1/MPRA_paper_2037.pdf }}</ref>
Neither Choi & Bowles nor Guzmán, Rodríguez-Sickert and Rowthorn claim that humans have actually evolved in this way, but that computer simulations show how war could be promoted by the interaction of these behaviors. A crucial [[open research]] question, then, is how realistic the assumptions are on which these simulation models are based.<ref>{{Harvnb|Rusch|2014}}.</ref>
== Software ==
Several software packages have been created to run prisoner's dilemma simulations and tournaments, some of which have available source code.
* The source code for the second tournament run by Robert Axelrod (written by Axelrod and many contributors in [[Fortran]]) is available online.<ref>{{cite web | title=Complexity of Cooperation Chapter 2 | url=http://www-personal.umich.edu/~axe/research/Software/CC/CC2.html }}</ref>
* PRISON,<ref>https://web.archive.org/web/19991010053242/http://www.lifl.fr/IPD/ipd.frame.html</ref> a library written in [[Java (programming language)|Java]], last updated in 1999
* Axelrod-Python,<ref>{{cite web | title=Axelrod-Python/Axelrod: V4.12.0 | website=[[GitHub]] | url=https://github.com/Axelrod-Python/Axelrod }}</ref> written in [[Python (programming language)|Python]]
== Recommended reading ==
| first1 = Richard
| last1 = Dawkins
| orig-date = 1976
| year = 1989
| title = The Selfish Gene
}}
* {{citation|first1=Karl |last1=Sigmund |first2=Ernst |last2=Fehr |first3=Martin A. |last3=Nowak |date=January 2002 |title=The Economics of Fair Play |volume=286 |issue=1 |magazine=Scientific American |pages=82–87 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/SciAm02.pdf |archive-url=https://web.archive.org/web/20110518024042/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/SciAm02.pdf |archive-date=18 May 2011 |bibcode=2002SciAm.286a..82S |doi=10.1038/scientificamerican0102-82 |pmid=11799620 |ref=none}}
* {{citation
| first1 = Robert L. |last1 = Trivers
|archive-date = 7 February 2009
|archive-url = https://web.archive.org/web/20090207042004/http://www.santafe.edu/~bowles/GroupCompetition
}}
* {{citation
|first1 = Samuel
|archive-date = 19 February 2005
|archive-url = https://web.archive.org/web/20050219024556/http://www.santafe.edu/~jkchoi/jtb223_2.pdf
}}
* {{citation
|first1 = Robert
|archive-date = 17 October 2018
|archive-url = https://web.archive.org/web/20181017140723/http://xcelab.net/rm/wp-content/uploads/2008/10/boyd-evolution-human-cooperation-review.pdf
}}
* {{citation
* {{citation
| first1 = Charles |last1 = Darwin
| orig-date = 1859
| year = 1964
| title = On the Origin of Species
| first1 = Richard
| last1 = Dawkins
| orig-date = 1976
| year = 1989
| title = The Selfish Gene
| url = http://www.marxists.org/subject/science/essays/kropotkin.htm
}}
* {{citation |first1=Özgür |last1=Gürerk <!-- Ozgur Gurerk --> |first2=Bernd |last2=Irlenbusch |first3=Bettina |last3=Rockenbach |date=7 April 2006 |title=The Competitive Advantage of Sanctioning Institutions |journal=Science |volume=312 |issue=5770 |pages=108–11 |doi=10.1126/science.1123633 |url=http://www.lrz.de/~u516262/webserver/webdata/guererketal2006_sanctioningmechanisms_science.pdf |bibcode=2006Sci...312..108G |archive-url=https://web.archive.org/web/20110719060124/http://www.lrz.de/~u516262/webserver/webdata/guererketal2006_sanctioningmechanisms_science.pdf |archive-date=19 July 2011 |pmid=16601192 |s2cid=40038573 }}
* {{citation
| year = 1963
| bibcode = 1963ANat...97..354H | s2cid = 84216415 | url = http://westgroup.biology.ed.ac.uk/teach/social/Hamilton_63.pdf
}}{{dead link|date=March 2016}}
* {{citation|first1=William D. |last1=Hamilton |year=1964 |title=The Genetical Evolution of Social Behavior |journal=Journal of Theoretical Biology |volume=7 |issue=1 |pages=1–16, 17–52 |doi=10.1016/0022-5193(64)90038-4 |url=http://lis.epfl.ch/~markus/References/Hamilton64a.pdf |pmid=5875341 |bibcode=1964JThBi...7....1H |archive-url=https://web.archive.org/web/20091229084043/http://lis.epfl.ch/~markus/References/Hamilton64a.pdf |archive-date=29 December 2009 }}
* {{citation|first1=Garrett |last1=Hardin |date=13 December 1968 |title=The Tragedy of the Commons |journal=Science |volume=162 |pages=1243–1248 |doi=10.1126/science.162.3859.1243 |pmid=5699198 |issue=3859 |bibcode=1968Sci...162.1243H }}
* {{citation|ref=CITEREFHauertothers2007 |first1=Christoph |last1=Hauert |first2=Arne |last2=Traulsen |first3=Hannelore |last3=Brandt |first4=Martin A. |last4=Nowak |first5=Karl |last5=Sigmund |date=29 June 2007 |title=Via Freedom to Coercion: The Emergence of Costly Punishment |journal=Science |volume=316 |issue=5833 |pages=1905–07 |doi=10.1126/science.1141588 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/science07.pdf |bibcode=2007Sci...316.1905H |archive-url=https://web.archive.org/web/20110518015457/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/science07.pdf |archive-date=18 May 2011 |pmid=17600218 |pmc=2430058}}
* {{citation|first1=Joseph |last1=Henrich |date=7 April 2006 |title=Cooperation, Punishment, and the Evolution of Human Institutions |journal=Science |volume=312 |issue=5770 |pages=60–61 |doi=10.1126/science.1126398 |pmid=16601179 |s2cid=39232348 |url=https://www.sfu.ca/~wchane/sa304articles/Henrich.pdf |archive-url=https://web.archive.org/web/20110629014507/http://www.sfu.ca/~wchane/sa304articles/Henrich.pdf |archive-date=29 June 2011 }}
* {{citation
| ref = CITEREFHenrichothers2006
| first1 = Phillip M. |last1 = Morse
| first2 = George E. |last2 = Kimball
| year = 1951 |isbn=978-0-262-13005-9
| title = Methods of Operations Research |publisher=The MIT Press
}}
| bibcode = 2008Sci...322...58N | pmid = 18832637 | s2cid = 28514
}}
* {{citation|first1=Martin A |last1=Nowak |date=8 December 2006 |title=Five Rules for the Evolution of Cooperation |journal=Science |volume=314 |issue=5805 |pages=1560–63 |doi=10.1126/science.1133755 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nowak_Science06.pdf |pmid=17158317 |bibcode=2006Sci...314.1560N |pmc=3279745 |archive-url=https://web.archive.org/web/20110518021052/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nowak_Science06.pdf |archive-date=18 May 2011 }}
* {{citation|first1=Martin A |last1=Nowak |first2=Karen M. |last2=Page |first3=Karl |last3=Sigmund |date=8 September 2000 |title=Fairness Versus Reason in the Ultimatum Game |journal=Science |volume=289 |pages=1773–75 |doi=10.1126/science.289.5485.1773 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Science00.pdf |issue=5485 |pmid=10976075 |bibcode=2000Sci...289.1773N |archive-url=https://web.archive.org/web/20110719204742/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Science00.pdf |archive-date=19 July 2011 }}
* {{citation|first1=Martin A. |last1=Nowak |first2=Karl |last2=Sigmund |date=16 January 1992 |title=Tit For Tat in Heterogenous Populations |journal=Nature |volume=355 |pages=250–253 |doi=10.1038/315250a0 |pmid=3889654 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nature92b.pdf |issue=6016 |bibcode=1985Natur.315..250T |s2cid=4344991 |archive-url=https://web.archive.org/web/20110616192929/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nature92b.pdf |archive-date=16 June 2011 }}
* {{citation|first1=Martin A. |last1=Nowak |first2=Karl |last2=Sigmund |date=1 July 1993 |title=A strategy of win-stay, lose-shift that outperforms tit for tat in Prisoner's Dilemma |journal=Nature |volume=364 |pages=56–58 |doi=10.1038/364056a0 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nature93.pdf |issue=6432 |bibcode=1993Natur.364...56N |archive-url=https://web.archive.org/web/20080704060932/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/Nature93.pdf |archive-date=4 July 2008 |pmid=8316296 |s2cid=4238908 }}
* {{citation
| year = 1992
| hdl-access = free
}}
* {{citation|first1=Joan |last1=Roughgarden |first2=Meeko |last2=Oishi |author2-link=Meeko Oishi |first3=Erol |last3=Akcay |date=17 February 2006 |title=Reproductive Social Behavior: Cooperative Games to Replace Sexual Selection |journal=Science |volume=311 |issue=5763 |pages=965–69 |doi=10.1126/science.1110105 |url=http://www.ecfs.org/projects/pchurch/AT%20BIOLOGY/PAPERS/ReplacingDarwinsSexualSelection.pdf |pmid=16484485 |bibcode=2006Sci...311..965R |s2cid=32364112 |archive-url=https://web.archive.org/web/20110721215055/http://www.ecfs.org/projects/pchurch/AT%20BIOLOGY/PAPERS/ReplacingDarwinsSexualSelection.pdf |archive-date=21 July 2011 }}
* {{citation
| first1 = Jean Jacques |last1 = Rousseau
| title = The Evolutionary Interplay of Intergroup Conflict and Altruism in Humans: A Review of Parochial Altruism Theory and Prospects for its Extension
| journal = Proceedings of the Royal Society B: Biological Sciences
| volume = 281 | issue = 1794 | article-number = 20141539
| doi = 10.1098/rspb.2014.1539
| pmid = 25253457
| bibcode = 2007Sci...318..598S
}}
* {{citation|first1=Karl |last1=Sigmund |first2=Ernst |last2=Fehr |first3=Martin A. |last3=Nowak |date=January 2002 |title=The Economics of Fair Play |volume=286 |issue=1 |magazine=Scientific American |pages=82–87 |url=http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/SciAm02.pdf |archive-url=https://web.archive.org/web/20110518024042/http://www.ped.fas.harvard.edu/people/faculty/publications_nowak/SciAm02.pdf |archive-date=18 May 2011 |bibcode=2002SciAm.286a..82S |doi=10.1038/scientificamerican0102-82 |pmid=11799620 }}
* {{citation
| first1 = Alexander |last1 = Stewart
| first2 = Oskar |last2 = Morgenstern
| title = Theory of Games and Economic Behavior
| journal = Nature | volume = 157 | issue = 3981 | page = 172 | publisher = Princeton Univ. Press
| bibcode = 1946Natur.157..172R | doi = 10.1038/157172a0 | s2cid = 29754824 }}
* {{citation
* {{citation
| first1 = Edward O. |last1 = Wilson
| orig-date = 1975
| year = 2000
| edition = 25th Anniversary
''The Evolution of Cooperation'' is a 1984 book written by political scientist Robert Axelrod[1] that expands upon a paper of the same name written by Axelrod and evolutionary biologist W. D. Hamilton. The article's summary addresses the issue in terms of "cooperation in organisms, whether bacteria or primates".
The book details a theory on the emergence of cooperation between individuals, drawing from game theory and evolutionary biology. Since 2006, reprints of the book have included a foreword by Richard Dawkins and have been marketed as a revised edition.
The book investigates how cooperation can emerge and persist, as explained by the application of game theory, and gives a more detailed account of the evolution of cooperation than traditional game theory offers. Academic literature on forms of cooperation that are not easily explained by traditional game theory, especially in the light of evolutionary biology, largely took its modern form as a result of Axelrod and Hamilton's influential 1981 paper and the subsequent book.
Background: Axelrod's tournaments
Axelrod initially solicited strategies from other game theorists to compete in the first tournament. Each strategy was paired with each other strategy for 200 iterations of a Prisoner's Dilemma game and scored on the total points accumulated through the tournament. The winner was a very simple strategy submitted by Anatol Rapoport called "tit for tat" (TFT) that cooperates on the first move, and subsequently echoes (reciprocates) what the other player did on the previous move. The results of the first tournament were analyzed and published, and a second tournament was held to see if anyone could find a better strategy. TFT won again. Axelrod analyzed the results and made some interesting discoveries about the nature of cooperation, which he describes in his book.[2]
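The mechanics of such a tournament are simple to sketch. The following is a minimal illustration, not Axelrod's original Fortran code (the function and strategy names here are invented), of TFT in a 200-round round-robin using the standard Prisoner's Dilemma payoffs T=5, R=3, P=1, S=0:

```python
from itertools import combinations_with_replacement

# Payoffs for (my move, their move): C = cooperate, D = defect.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_history, their_history):
    """Cooperate on the first move; thereafter echo the opponent's last move."""
    return 'C' if not their_history else their_history[-1]

def always_defect(my_history, their_history):
    return 'D'

def play_match(strat_a, strat_b, rounds=200):
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a = strat_a(hist_a, hist_b)
        move_b = strat_b(hist_b, hist_a)
        pay_a, pay_b = PAYOFF[(move_a, move_b)]
        score_a += pay_a
        score_b += pay_b
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# Round-robin over all pairings, including each strategy against its twin,
# as in Axelrod's tournament; totals accumulate each strategy's points.
strategies = {'TFT': tit_for_tat, 'ALL D': always_defect}
totals = {name: 0 for name in strategies}
for (name_a, strat_a), (name_b, strat_b) in combinations_with_replacement(
        sorted(strategies.items()), 2):
    score_a, score_b = play_match(strat_a, strat_b)
    totals[name_a] += score_a
    totals[name_b] += score_b
```

Note the pattern visible even in this two-strategy toy: ALL D beats TFT head-to-head (204 to 199 points), yet TFT's mutual cooperation with its twin gives it the higher tournament total.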
In both actual tournaments and various replays, the best-performing strategies were nice:[3] that is, they were never the first to defect. Many of the competitors went to great lengths to gain an advantage over the "nice" (and usually simpler) strategies, but to no avail: tricky strategies fighting for a few points generally could not do as well as nice strategies working together. TFT (and other "nice" strategies generally) "won, not by doing better than the other player, but by eliciting cooperation [and] by promoting the mutual interest rather than by exploiting the other's weakness."[4]
Being "nice" can be beneficial, but it can also lead to being suckered. To obtain the benefit – or avoid exploitation – it is necessary to be provocable and forgiving. When the other player defects, a nice strategy must immediately be provoked into retaliatory defection.[5] The same goes for forgiveness: return to cooperation as soon as the other player does. Overdoing the punishment risks escalation, and can lead to an "unending echo of alternating defections" that depresses the scores of both players.[6]
Most of the games that game theory had heretofore investigated were "zero-sum" – that is, the total rewards are fixed, and a player does well only at the expense of other players. But real life is not zero-sum; our best prospects are usually in cooperative efforts. In fact, TFT can never score higher than its partner; at best it can only do as well. Yet it won the tournaments by consistently scoring a strong second place with a variety of partners.[7] Axelrod summarizes this as "don't be envious";[8] in other words, don't strive for a payoff greater than the other player's.[9]
In any iterated Prisoner's Dilemma (IPD) game, there is a certain maximum score each player can get by always cooperating. But some strategies try to get a little more with an occasional defection (exploitation). This can work against strategies that are less provocable or more forgiving than TFT, but generally such strategies do poorly. "A common problem with these rules is that they used complex methods of making inferences about the other player [strategy] – and these inferences were wrong."[10] Against TFT one can do no better than to simply cooperate.[11] Axelrod calls this "clarity". Or: "don't be too clever".[12]
The success of any strategy depends on the nature of the particular strategies it encounters, which in turn depends on the composition of the overall population. To better model the effects of reproductive success, Axelrod also ran an "ecological" tournament, in which the prevalence of each type of strategy in each round was determined by that strategy's success in the previous round. The competition in each round becomes stronger as weaker performers are reduced and eliminated. The results were striking: a handful of strategies – all "nice" – came to dominate the field.[13] In a sea of non-nice strategies the "nice" strategies – provided they were also provocable – did well enough with each other to offset the occasional exploitation. As cooperation became general, the non-provocable strategies were exploited and eventually eliminated, whereupon the exploitive (non-cooperating) strategies were out-performed by the cooperative strategies.
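The ecological replay described above can be sketched with a simple proportional-fitness update. The update rule and the per-round payoff numbers below are illustrative assumptions, not Axelrod's published figures:

```python
# Each generation, a strategy's population share grows in proportion to the
# average score it earns against the current population mix.

def ecological_step(shares, payoff):
    """shares: {name: fraction, summing to 1}; payoff[a][b]: a's average
    per-round score against b. Returns the next generation's shares."""
    fitness = {a: sum(shares[b] * payoff[a][b] for b in shares)
               for a in shares}
    mean = sum(shares[a] * fitness[a] for a in shares)
    return {a: shares[a] * fitness[a] / mean for a in shares}

# Illustrative per-round averages for a 200-round match (T=5, R=3, P=1, S=0):
# TFT pairs cooperate throughout; ALL D pairs defect throughout; in the
# mixed match ALL D exploits TFT exactly once (204 vs 199 total points).
payoff = {'TFT':   {'TFT': 3.0,  'ALL D': 0.995},
          'ALL D': {'TFT': 1.02, 'ALL D': 1.0}}

shares = {'TFT': 0.5, 'ALL D': 0.5}
for _ in range(50):
    shares = ecological_step(shares, payoff)
# Although ALL D beats TFT head-to-head, TFT's mutual cooperation
# lets it take over the population.
```

The design mirrors the text: nothing rewards beating one's partner; only the average score against the whole population matters, so the cooperating pair pulls ahead.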
In summary, success in an evolutionary "game" correlated with the following characteristics:
- Be nice: cooperate, never be the first to defect.
- Be provocable: return defection for defection, cooperation for cooperation.
- Don't be envious: focus on maximizing your own 'score', as opposed to ensuring your score is higher than your 'partner's'.
- Don't be too clever: or, don't try to be tricky. Clarity is essential for others to cooperate with you.
Foundation of reciprocal cooperation
The lessons described above apply in environments that support cooperation, but whether cooperation is supported at all depends crucially on the probability (called ω [omega]) that the players will meet again,[14] also called the discount parameter or, figuratively, the shadow of the future. When ω is low – that is, the players have a negligible chance of meeting again – each interaction is effectively a single-shot Prisoner's Dilemma game, and one might as well defect in all cases (a strategy called "ALL D"), because even if one cooperates there is no way to keep the other player from exploiting that. But in the iterated PD the value of repeated cooperative interactions can become greater than the benefit/risk of a single exploitation (which is all that a strategy like TFT will tolerate).
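The threshold on ω can be made concrete with a back-of-envelope calculation. This is only a sketch using the standard payoffs; Axelrod's full stability condition also considers alternating defection, omitted here:

```python
# Standard Prisoner's Dilemma payoffs.
T, R, P, S = 5, 3, 1, 0

def value_cooperate(w):
    # Always cooperating with a TFT partner: R every round,
    # each future round weighted by continuation probability w.
    return R / (1 - w)

def value_all_d_vs_tft(w):
    # Defecting every round against TFT: the temptation payoff T once,
    # then mutual punishment P for as long as the game continues.
    return T + w * P / (1 - w)

# The two values cross at w = (T - R) / (T - P) = 0.5 for these payoffs.
# Below that, the shadow of the future is too short and defection pays;
# above it, sustained cooperation pays more.
print(value_cooperate(0.4) < value_all_d_vs_tft(0.4))  # prints True
print(value_cooperate(0.7) > value_all_d_vs_tft(0.7))  # prints True
```
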
Curiously, neither rationality nor deliberate choice is necessary, nor even trust or consciousness,[15] as long as there is a pattern that benefits both players (e.g., increases fitness) and some probability of future interaction. Often the initial mutual cooperation is not even intentional, but having "discovered" a beneficial pattern, both parties respond to it by continuing the conditions that maintain it.
This implies two requirements for the players, aside from whatever strategy they may adopt. First, they must be able to recognize other players, to avoid exploitation by cheaters. Second, they must be able to track their previous history with any given player, in order to be responsive to that player's strategy.[16]
Even when the discount parameter ω is high enough to permit reciprocal cooperation, there is still a question of whether and how cooperation might start. One of Axelrod's findings is that when the existing population never offers cooperation nor reciprocates it – the case of ALL D – then no nice strategy can get established by isolated individuals; cooperation is strictly a sucker bet. (The "futility of isolated revolt".[17]) But another finding of great significance is that clusters of nice strategies can get established. Even a small group of individuals with nice strategies and infrequent interactions can do well enough in those interactions to make up for the low level of exploitation by non-nice strategies.[18]
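The cluster arithmetic can be illustrated numerically. In this sketch the continuation probability w = 0.9 and the parameter p, the fraction of a newcomer's interactions that are with fellow cluster members, are illustrative choices:

```python
# Discounted per-match values with payoffs T=5, R=3, P=1, S=0 and
# continuation probability w. A TFT newcomer meets fellow cluster members
# with probability p and resident ALL D players otherwise.
T, R, P, S, w = 5, 3, 1, 0, 0.9

tft_vs_tft   = R / (1 - w)          # mutual cooperation forever
tft_vs_alld  = S + w * P / (1 - w)  # suckered once, then mutual defection
alld_vs_alld = P / (1 - w)          # what residents earn among themselves

def cluster_payoff(p):
    return p * tft_vs_tft + (1 - p) * tft_vs_alld

# Solving cluster_payoff(p) > alld_vs_alld gives p > 1/21: with these
# numbers, a cluster needs only about 5% of its interactions to be
# internal in order to out-earn the ALL D residents.
```
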
Cooperation becomes more complicated, however, as soon as more realistic models are assumed that, for instance, offer more than two choices of action, provide the possibility of gradual cooperation, make actions constrain future actions (path dependence), or in which interpreting the associate's actions is non-trivial (e.g. recognizing the degree of cooperation shown).[19]
Subsequent work
In 1984 Axelrod estimated that there were "hundreds of articles on the Prisoner's Dilemma cited in Psychological Abstracts",[20] and estimated that citations to The Evolution of Cooperation alone were "growing at the rate of over 300 per year".[21] To fully review this literature is infeasible. What follows are therefore only a few selected highlights.
Axelrod considers his subsequent book, The Complexity of Cooperation,[22] to be a sequel to The Evolution of Cooperation. Other work on the evolution of cooperation has expanded to cover prosocial behavior generally,[23] and in religion,[24] other mechanisms for generating cooperation,[25] the IPD under different conditions and assumptions,[26] and the use of other games such as the Public Goods and Ultimatum games to explore deep-seated notions of fairness and fair play.[27] It has also been used to challenge the rational and self-regarding "economic man" model of economics,[28] and as a basis for replacing Darwinian sexual selection theory with a theory of social selection.[29]
Nice strategies are better able to invade if they have social structures or other means of increasing their interactions. Axelrod discusses this in chapter 8; in a later paper he, Rick Riolo, and Michael Cohen[30] use computer simulations to show cooperation rising among agents who have a negligible chance of future encounters but can recognize similarity of an arbitrary characteristic (such as a green beard). Other studies[31] have shown that the only Iterated Prisoner's Dilemma strategies that resist invasion in a well-mixed evolving population are generous strategies.
When an IPD tournament introduces noise (errors or misunderstandings), TFT strategies can get trapped into a long string of retaliatory defections, thereby depressing their score. TFT also tolerates "ALL C" (always cooperate) strategies, which then give an opening to exploiters.[32] In 1992 Martin Nowak and Karl Sigmund demonstrated a strategy called Pavlov (or "win–stay, lose–shift") that does better in these circumstances.[33] Pavlov looks at its own prior move as well as the other player's move. If the payoff was R or P (see "Prisoner's Dilemma", above) it cooperates; if S or T it defects.
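Pavlov's rule is easy to state in code. The sketch below (function names and the simple noise model are illustrative; Nowak and Sigmund's analysis is more general) lets Pavlov or TFT play a match in which each intended move is occasionally flipped by mistake:

```python
import random

PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def pavlov(my_history, their_history):
    """Win-stay, lose-shift: repeat the last move after a good payoff
    (R=3 or T=5), switch after a poor one (P=1 or S=0) -- equivalently,
    cooperate after R or P, defect after S or T."""
    if not my_history:
        return 'C'
    my_pay = PAYOFF[(my_history[-1], their_history[-1])][0]
    if my_pay >= 3:                       # R or T: stay
        return my_history[-1]
    return 'C' if my_history[-1] == 'D' else 'D'   # P or S: shift

def tit_for_tat(my_history, their_history):
    return 'C' if not their_history else their_history[-1]

def noisy_match(strat_a, strat_b, rounds=1000, noise=0.05, seed=0):
    rng = random.Random(seed)
    ha, hb, sa, sb = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = strat_a(ha, hb), strat_b(hb, ha)
        if rng.random() < noise:          # intended move mis-executed
            ma = 'D' if ma == 'C' else 'C'
        if rng.random() < noise:
            mb = 'D' if mb == 'C' else 'C'
        pa, pb = PAYOFF[(ma, mb)]
        sa, sb = sa + pa, sb + pb
        ha.append(ma)
        hb.append(mb)
    return sa, sb
```

After a single erroneous defection, two Pavlov players restore mutual cooperation within two rounds, whereas two TFT players fall into the retaliatory echo described above, so a Pavlov pair outscores a TFT pair in a long noisy match.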
In a 2006 paper Nowak listed five mechanisms by which natural selection can lead to cooperation.[34] In addition to kin selection and direct reciprocity, he shows that:
- Indirect reciprocity is based on knowing the other player's reputation, which is the player's history with other players. Cooperation depends on a reliable history being projected from past partners to future partners.
- Network reciprocity relies on geographical or social factors to increase the interactions with nearer neighbors; it is essentially a virtual group.
- Group selection[35] assumes that groups with cooperators (even altruists) will be more successful as a whole, and this will tend to benefit all members.
The payoffs in the Prisoner's Dilemma game are fixed, but in real life defectors are often punished by cooperators. Where punishment is costly, there is a second-order dilemma amongst cooperators between those who pay the cost of enforcement and those who do not.[36] Other work has shown that individuals given a choice between joining a group that punishes free-riders and one that does not will initially prefer the sanction-free group, yet after several rounds they will join the sanctioning group, seeing that sanctions secure a better payoff.[37]
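The second-order dilemma can be seen in a toy payoff scheme. All numbers below are illustrative; the shared public benefit is equal for every group member and so is omitted from the comparison:

```python
def payoff(role, n_defectors, n_punishers):
    """Net private payoff (excluding the equally shared public benefit)
    for one member of a group, given the group's composition."""
    c = 1.0      # cost of cooperating (contributing)
    k = 0.3      # cost to punish one defector
    f = 1.0      # fine a defector pays per punisher
    total = 0.0
    if role in ('cooperator', 'punisher'):
        total -= c
    if role == 'punisher':
        total -= k * n_defectors    # enforcement is costly
    if role == 'defector':
        total -= f * n_punishers    # sanctions make defection costly
    return total

# With defectors present, non-punishing cooperators always outscore
# punishers, so punishment itself needs an evolutionary explanation:
print(payoff('cooperator', 2, 3) > payoff('punisher', 2, 3))  # prints True
```

At the same time, enough punishers make defection the worst role of the three, which matches the experimental finding that players migrate to the sanctioning group once its higher payoff becomes apparent.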
In small populations or groups there is the possibility that indirect reciprocity (reputation) can interact with direct reciprocity (e.g. tit for tat) with neither strategy dominating the other.[38] The interactions between these strategies can give rise to dynamic social networks which exhibit some of the properties observed in empirical networks.[39] If network structure and choices in the Prisoner's Dilemma co-evolve, then cooperation can survive. In the resulting networks cooperators will be more centrally located than defectors, who will tend to be in the periphery of the network.
In "The Coevolution of Parochial Altruism and War", Jung-Kyoo Choi and Samuel Bowles offer the following summary:
Altruism—benefiting fellow group members at a cost to oneself—and parochialism—hostility towards individuals not of one's own ethnic, racial, or other group—are common human behaviors. The intersection of the two—which we term "parochial altruism"—is puzzling from an evolutionary perspective because altruistic or parochial behavior reduces one's payoffs by comparison to what one would gain from eschewing these behaviors. But parochial altruism could have evolved if parochialism promoted intergroup hostilities and the combination of altruism and parochialism contributed to success in these conflicts.... [Neither] would have been viable singly, but by promoting group conflict they could have evolved jointly.[40]
Consideration of the mechanisms through which learning from the social environment occurs is pivotal in studies of evolution. In the context of this discussion, learning rules, specifically conformism and payoff-dependent imitation, are not arbitrarily predetermined but are biologically selected. Behavioral strategies, which include cooperation, defection, and cooperation coupled with punishment, are chosen in alignment with the agent's prevailing learning rule. Simulations of the model under conditions approximating those experienced by early hominids reveal that conformism can evolve even when individuals are solely faced with a cooperative dilemma, contrary to previous assertions. Moreover, the incorporation of conformists significantly amplifies the group size within which cooperation can be sustained. These model results demonstrate robustness, maintaining validity even under conditions of high migration rates and infrequent intergroup conflicts.[41]
Neither Choi and Bowles nor Guzmán, Rodríguez-Sickert and Rowthorn claim that humans have actually evolved in this way, but computer simulations show how war could be promoted by the interaction of these behaviors. A crucial open research question is thus how realistic the assumptions are on which these simulation models are based.[42]
Software
Several software packages have been created to run prisoner's dilemma simulations and tournaments, some of which have available source code.
- The source code for the second tournament run by Robert Axelrod (written by Axelrod and many contributors in Fortran) is available online.[43]
- PRISON,[44] a library written in Java, last updated in 1999
- Axelrod-Python,[45] written in Python
References
- ↑ Axelrod's book was summarized in Douglas Hofstadter's May 1983 "Metamagical Themas" column in Scientific American (reprinted in his book Metamagical Themas); see also Richard Dawkins's summary in the second edition of The Selfish Gene, ch. 12.
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes".
- ↑ Prechelt, Lutz (1996). "INCA: A multi-choice model of cooperation under restricted communication". Biosystems. 37 (1–2): 127–134. doi:10.1016/0303-2647(95)01549-3.
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes".; Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes".; Script error: No such module "Footnotes". categorizes and summarizes over 50 studies
- ↑ Script error: No such module "Footnotes".; Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes"..
- ↑ Stewart and Plotkin (2013)
- ↑ Script error: No such module "Footnotes". has some interesting comments on the need to suppress universal cooperators. See also a similar theme in Piers Anthony's novel Macroscope.
- ↑ Script error: No such module "Footnotes".; see also Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes".;
- ↑ Here group selection is not a form of evolution, which is problematical (see Script error: No such module "Footnotes"., ch. 7), but a mechanism for evolving cooperation.
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Footnotes".
- ↑ Phelps, S., Nevarez, G. & Howes, A., 2009. The effect of group size and frequency of encounter on the evolution of cooperation. In LNCS, Volume 5778, ECAL 2009, Advances in Artificial Life: Darwin meets Von Neumann. Budapest: Springer, pp. 37–44.
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Footnotes"..
- ↑ Script error: No such module "citation/CS1".
- ↑ https://web.archive.org/web/19991010053242/http://www.lifl.fr/IPD/ipd.frame.html
- ↑ Script error: No such module "citation/CS1".