{{Use American English|date=July 2023}}
[[Image:RobertFuddBewusstsein17Jh.png|thumb|17th century representation of consciousness by [[Robert Fludd]], an English [[Paracelsianism|Paracelsian]] physician]]
'''Consciousness''', at its simplest, is [[awareness]] of states or objects either internal to one's [[self]] or in one's external environment.<ref name="consciousness">{{cite dictionary|title=consciousness|dictionary=Merriam-Webster|access-date=June 4, 2012|url=http://www.merriam-webster.com/dictionary/consciousness}}</ref> However, its nature has led to millennia of explanations, analyses, and debate among philosophers, scientists, and theologians. Opinions differ about ''what'' exactly needs to be studied, or can even be considered consciousness. In some explanations, it is synonymous with [[mind]], and at other times, an aspect of it.
In the past, consciousness meant one's "inner life": the world of [[introspection]], private thought, [[imagination]], and [[volition (psychology)|volition]].<ref name="JJ90">{{cite book|last=Jaynes|first=Julian|author-link=Julian Jaynes|title=The Origin of Consciousness in the Breakdown of the Bicameral Mind|publisher=Houghton Mifflin|orig-date=1976|year=2000|isbn=0-618-05707-2}}</ref> Today, it often includes any kind of [[cognition]], [[experience]], feeling, or [[perception]]. It may be awareness, awareness of awareness, [[metacognition]], or [[self-awareness]], either continuously changing or not.<ref name="Rochat 2003 717–731">{{cite journal|last=Rochat|first=Philippe|title=Five levels of self-awareness as they unfold early in life|journal=Consciousness and Cognition|year=2003|volume=12|issue=4|pages=717–731|url=http://psychology.emory.edu/cognition/rochat/Five%20levels%20.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://psychology.emory.edu/cognition/rochat/Five%20levels%20.pdf|archive-date=2022-10-09|url-status=live|doi=10.1016/s1053-8100(03)00081-3|pmid=14656513|s2cid=10241157}}</ref><ref name="Guertin 2019 406-412">{{cite journal|author=P.A. Guertin|title=A novel concept introducing the idea of continuously changing levels of consciousness|journal=Journal of Consciousness Exploration & Research|year=2019|volume=10|issue=6|pages=406–412|url=https://jcer.com/index.php/jcj/article/view/829/825|access-date=2021-08-19|archive-date=2021-12-15|archive-url=https://web.archive.org/web/20211215112848/https://jcer.com/index.php/jcj/article/view/829/825|url-status=live}}</ref> There is also a medical definition that helps, for example, to discern "[[coma]]" from other states. The disparate range of research, notions, and speculations raises some curiosity about whether the right questions are being asked.<ref name="Hacker2012">{{cite journal|author-link=Peter Hacker|last=Hacker|first=P.M.S.|url=http://info.sjc.ox.ac.uk/scr/hacker/docs/ConsciousnessAChallenge.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://info.sjc.ox.ac.uk/scr/hacker/docs/ConsciousnessAChallenge.pdf|archive-date=2022-10-09|url-status=live|title=The Sad and Sorry History of Consciousness: being, among other things, a challenge to the "consciousness-studies community"|journal=Royal Institute of Philosophy|volume=supplementary volume 70|date=2012}}</ref>
Examples of the range of descriptions, definitions and explanations are: ordered distinction between self and environment, simple [[wakefulness]], one's sense of selfhood or [[soul]] explored by "looking within", being a metaphorical [[stream of consciousness (psychology)|"stream" of contents]], or being a [[mental state]], [[mental event]], or [[mental process]] of the [[brain]].
== Etymology ==
The words "conscious" and "consciousness" in the [[English language]] date to the 17th century, and the first recorded use of "conscious" as a simple adjective was applied figuratively to inanimate objects (''"the conscious Groves"'', 1643).<ref name="Barfield26">{{cite book|last=Barfield|first=Owen|title=History in English Words|date=1962|orig-date=1926|publisher=Faber and Faber Limited|location=London|edition=239 pgs. paper covered|author-link=Owen Barfield}}</ref>{{rp|p=175}} It derived from the [[Latin]] ''conscius'' (''con-'' "together" and [[wikt:scio|''scio'']] "to know") which meant "knowing with" or "having joint or common knowledge with another", especially as in sharing a secret.<ref>{{cite book|title=Studies in words|author=C. S. Lewis|year=1990|publisher=Cambridge University Press|chapter=Ch. 8: Conscience and conscious|isbn=978-0-521-39831-2|author-link=C. S. Lewis}}</ref> [[Thomas Hobbes]] in ''[[Leviathan (Hobbes book)|Leviathan]]'' (1651) wrote: "Where two, or more men, know of one and the same fact, they are said to be Conscious of it one to another".<ref>{{cite book|title=Leviathan: or, The Matter, Forme & Power of a Commonwealth, Ecclesiasticall and Civill|author=Thomas Hobbes|publisher=University Press|year=1904|url=https://archive.org/details/leviathan00hobbgoog|page=[https://archive.org/details/leviathan00hobbgoog/page/n62 39]|author-link=Thomas Hobbes}}</ref> There were also many occurrences in Latin writings of the phrase ''conscius sibi'', which translates literally as "knowing with oneself", or in other words "sharing knowledge with oneself about something". This phrase has the figurative sense of "knowing that one knows", which is something like the modern English word "conscious", but it was rendered into English as "conscious to oneself" or "conscious unto oneself". For example, [[Archbishop Ussher]] wrote in 1613 of "being so conscious unto myself of my great weakness".<ref>{{cite book|title=The whole works, Volume 2|author=[[James Ussher]], [[Charles Richard Elrington]]|page=417|year=1613|publisher=Hodges and Smith}}</ref>
The Latin ''[[:la:conscientia|conscientia]]'', literally "knowledge-with", first appears in Roman juridical texts by writers such as [[Cicero]]. It means a kind of shared knowledge with moral value, specifically what a witness knows of someone else's deeds.<ref>{{cite book|title=Dictionary of Untranslatables. A Philosophical Lexicon|author=Barbara Cassin|publisher=Princeton University Press|isbn=978-0-691-13870-1|year=2014|page=[https://archive.org/details/dictionaryofuntr0000unse/page/176 176]|url=https://archive.org/details/dictionaryofuntr0000unse/page/176}}</ref><ref>{{cite journal| author=G. Molenaar|title=Seneca's Use of the Term Conscientia|journal=Mnemosyne| volume=22|issue=2|year=1969|pages=170–180|doi=10.1163/156852569x00670}}</ref> Although [[René Descartes]] (1596–1650), writing in Latin, is generally taken to be the first philosopher to use ''conscientia'' in a way less like the traditional meaning and more like the way modern English speakers would use "conscience", his meaning is nowhere defined.<ref name="Hennig">{{cite journal| author=Boris Hennig|title=Cartesian Conscientia|journal=British Journal for the History of Philosophy|year=2007|volume=15|issue=3|pages=455–484|doi=10.1080/09608780701444915|s2cid=218603781}}</ref> In ''Search after Truth'' (''{{lang|la|Regulæ ad directionem ingenii ut et inquisitio veritatis per lumen naturale}}'', Amsterdam 1701) he wrote the word with a [[Gloss (annotation)|gloss]]: ''conscientiâ, vel interno testimonio'' (translatable as "conscience, or internal testimony").<ref>Charles Adam, [[Paul Tannery]] (eds.), ''Oeuvres de Descartes'' X, [https://archive.org/stream/oeuvresdedescar10desc#page/524/mode/2up 524] (1908).</ref><ref>{{cite book|pages=205–206|title=Consciousness: from perception to reflection in the history of philosophy|isbn=978-1-4020-6081-6|publisher=Springer|editor1=Sara Heinämaa|editor2=Vili Lähteenmäki|editor3=Pauliina Remes|year=2007}}</ref> It might mean the knowledge of the value of one's own thoughts.<ref name="Hennig" />
[[File:JohnLocke.png|thumb|upright|[[John Locke]], a 17th-century British [[Age of Enlightenment]] philosopher]]
The origin of the modern concept of consciousness is often attributed to [[John Locke]] who defined the word in his ''[[Essay Concerning Human Understanding]]'', published in 1690, as "the perception of what passes in a man's own mind".<ref>{{cite web|title=An Essay Concerning Human Understanding (Chapter XXVII)|last=Locke|first=John|publisher=University of Adelaide|location=Australia|url=https://ebooks.adelaide.edu.au/l/locke/john/l81u/B2.27.html|access-date=August 20, 2010|archive-date=May 8, 2018|archive-url=https://web.archive.org/web/20180508053707/https://ebooks.adelaide.edu.au/l/locke/john/l81u/B2.27.html}}</ref><ref>{{cite encyclopedia|url=https://www.britannica.com/EBchecked/topic/133274/consciousness|title=Science & Technology: consciousness|encyclopedia=Encyclopædia Britannica|access-date=August 20, 2010}}</ref> The essay strongly influenced 18th-century [[British philosophy]], and Locke's definition appeared in [[Samuel Johnson]]'s celebrated ''[[A Dictionary of the English Language|Dictionary]]'' (1755).<ref>{{cite book|title=A Dictionary of the English Language|author=Samuel Johnson|publisher=Knapton|year=1756|url=https://archive.org/details/dictionaryofengl01john|author-link=Samuel Johnson}}</ref>
The French term ''conscience'' is defined roughly like English "consciousness" in the 1753 volume of [[Diderot]] and [[d'Alembert]]'s [[Encyclopédie]] as "the opinion or internal feeling that we ourselves have from what we do".<ref>Jaucourt, Louis, chevalier de. "Consciousness." The Encyclopedia of Diderot & d'Alembert Collaborative Translation Project. Translated by Scott St. Louis. Ann Arbor: Michigan Publishing, University of Michigan Library, 2014. [http://hdl.handle.net/2027/spo.did2222.0002.986. Originally published as "Conscience," Encyclopédie ou Dictionnaire raisonné des sciences, des arts et des métiers], 3:902 (Paris, 1753).</ref>
== Problem of definition ==
Scholars are divided <!-- In the late 20th century, philosophers like [[David W. Hamlyn|Hamlyn]], [[Richard Rorty|Rorty]], and [[Kathy Wilkes|Wilkes]] have disagreed with [[Charles H. Kahn|Kahn]], [[W. F. R. Hardie|Hardie]], and [[Deborah Modrak|Modrak]] CLARIFICATION IS NEEDED: WHO THOUGHT WHAT? --> as to whether [[Aristotle]] had a concept of consciousness. He does not use any single word or terminology that is clearly similar to the [[phenomenon]] or [[concept]] defined by John Locke. Victor Caston contends that Aristotle did have a concept more clearly similar to [[perception]].<ref name="caston02">{{cite book|last1=Caston|first1=Victor|title=Mind|date=2002|publisher=Oxford University Press|page=751|url=http://ancphil.lsa.umich.edu/-/downloads/faculty/caston/aristotle-consciousness.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://ancphil.lsa.umich.edu/-/downloads/faculty/caston/aristotle-consciousness.pdf|archive-date=2022-10-09|url-status=live|chapter=Aristotle on Consciousness}}</ref>
#* ''concerned awareness;'' INTEREST, CONCERN—''often used with an attributive noun [e.g. class consciousness]''
# ''the state or activity that is characterized by sensation, emotion, volition, or thought; mind in the broadest possible sense; something in nature that is distinguished from the physical''
# ''the totality in psychology of sensations, perceptions, ideas, attitudes, and [[feelings]] of which an individual or a group is aware at any given time or within a particular time span—''
# ''waking life (as that to which one returns after sleep, trance, fever) wherein all one's mental powers have returned . . .''
# ''the part of mental life or psychic content in psychoanalysis that is immediately available to the ego—''
The ''[[Cambridge English Dictionary]]'' defines consciousness as "the state of being awake, thinking, and knowing what is happening around you", as well as "the state of understanding and realizing something".<ref>{{cite dictionary|url=https://dictionary.cambridge.org/dictionary/english/consciousness|entry=consciousness|encyclopedia=Cambridge English Dictionary|publisher=Cambridge University Press|access-date=2018-10-23|archive-date=2021-03-07|archive-url=https://web.archive.org/web/20210307210954/https://dictionary.cambridge.org/dictionary/english/consciousness|url-status=live|title=Consciousness}}</ref>
The ''[[Oxford Living Dictionary]]'' defines consciousness as "[t]he state of being aware of and responsive to one's surroundings", "[a] person's awareness or perception of something", and "[t]he fact of awareness by the mind of itself and the world".<ref>{{cite dictionary|url=https://en.oxforddictionaries.com/definition/consciousness|archive-url=https://web.archive.org/web/20160925102008/https://en.oxforddictionaries.com/definition/consciousness|archive-date=September 25, 2016|entry=consciousness|dictionary=Oxford Living Dictionary|publisher=Oxford University Press|title=Consciousness - definition of consciousness in English | Oxford Dictionaries}}</ref>
Philosophers have attempted to clarify technical distinctions by using a [[jargon]] of their own. The corresponding entry in the ''[[Routledge Encyclopedia of Philosophy]]'' (1998) reads:
;'''Consciousness''':Philosophers have used the term ''consciousness'' for four main topics: knowledge in general, intentionality, introspection (and the knowledge it specifically generates) and phenomenal experience... Something within one's mind is 'introspectively conscious' just in case one introspects it (or is poised to do so). Introspection is often thought to deliver one's primary knowledge of one's mental life. An experience or other mental entity is 'phenomenally conscious' just in case there is 'something it is like' for one to have it. The clearest examples are: perceptual experience, such as tastings and seeings; bodily-sensational experiences, such as those of pains, tickles and itches; imaginative experiences, such as those of one's own actions or perceptions; and streams of thought, as in the experience of thinking 'in words' or 'in images'. Introspection and phenomenality seem independent, or dissociable, although this is controversial.<ref name=Craig>{{cite encyclopedia|author=Edward Craig|encyclopedia=Routledge Encyclopedia of Philosophy|publisher=Routledge|entry=Consciousness|year=1998|isbn=978-0-415-18707-7|author-link=Edward Craig (philosopher)}}</ref>
=== Traditional metaphors for mind ===
During the early 19th century, the emerging field of [[geology]] inspired a popular [[metaphor]] that the mind likewise had hidden layers "which recorded the past of the individual".{{r|JJ76|p=3}} By 1875, most psychologists believed that "consciousness was but a small part of mental life",{{r|JJ76|p=3}} and this idea underlies the goal of [[Freudian psychology|Freudian therapy]], to expose the {{em|unconscious layer}} of the mind.
In 1892, [[William James]] noted that the "ambiguous word 'content' has been recently invented instead of 'object'" and that the metaphor of mind as a {{em|container}} seemed to minimize the dualistic problem of how "states of consciousness can {{em|know}}" things, or objects;{{r|WJames92|p=465}} by 1899 psychologists were busily studying the "contents of conscious experience by [[introspection]] and [[experiment]]".<ref name=Thomas67 />{{rp|365}} Another popular metaphor was James's doctrine of the [[stream of consciousness (psychology)|stream of consciousness]], with continuity, fringes, and transitions.{{r|WJames92|p=vii}}{{efn|From the introduction by [[Ralph Barton Perry]], 1948.}}
James discussed the difficulties of describing and studying psychological phenomena, recognizing that commonly used terminology was a necessary and acceptable starting point towards more precise, scientifically justified language. Prime examples were phrases like ''inner experience'' and ''personal consciousness'':
{{blockquote|The first and foremost concrete fact which every one will affirm to belong to his inner experience is the fact that {{em|consciousness of some sort goes on. 'States of mind' succeed each other in him}}. [...] But everyone knows what the terms mean [only] in a rough way; [...] When I say {{em|every 'state' or 'thought' is part of a personal consciousness}}, 'personal consciousness' is one of the terms in question. Its meaning we know so long as no one asks us to define it, but to give an accurate account of it is the most difficult of philosophic tasks. [...] The only states of consciousness that we naturally deal with are found in personal consciousnesses, minds, selves, concrete particular I's and you's.{{r|WJames92|pp=152–153}}}}
=== From introspection to awareness and experience ===
Prior to the 20th century, philosophers treated the phenomenon of consciousness as the "inner world [of] one's own mind", and [[introspection]] was the mind "attending to" itself,{{efn|From the ''Macmillan Encyclopedia of Philosophy'' (1967): "Locke's use of 'consciousness' was widely adopted in British philosophy. In the late nineteenth century the term 'introspection' began to be used. [[G. F. Stout]]'s definition is typical: "To introspect is to attend to the workings of one's own mind" [... (1899)]".<ref name=Landesman67>{{cite encyclopedia|last1=Landesman|first1=Charles Jr.|editor1-last=Edwards|editor1-first=Paul|encyclopedia=The Encyclopedia of Philosophy|contribution=Consciousness|volume= 2|date=1967|publisher=Macmillan, Inc.|pages=191–195|edition=Reprint 1972}}</ref>{{rp|191–192}}}} an activity seemingly distinct from that of perceiving the "outer world" and its physical phenomena. In 1892 [[William James]] noted the distinction along with doubts about the inward character of the mind:{{blockquote|'Things' have been doubted, but thoughts and feelings have never been doubted. The outer world, but never the inner world, has been denied. Everyone assumes that we have direct introspective acquaintance with our thinking activity as such, with our consciousness as something inward and contrasted with the outer objects which it knows. Yet I must confess that for my part I cannot feel sure of this conclusion. [...] It seems as if consciousness as an inner activity were rather a ''postulate'' than a sensibly given fact...<ref name=WJames92>{{cite book|last1=James|first1=William|title=Psychology|date=1948|orig-date=1892|publisher=Fine Editions Press, World Publishing Co.|location=Cleveland}}</ref>{{rp|467}}}}
By the 1960s, for many philosophers and psychologists who talked about consciousness, the word no longer meant the 'inner world' but an indefinite, large category called ''[[awareness]]'', as in the following example:<!-- Without an established definition or an obvious thing, event or experience that consciousness must refer to, a number of items are often presented as examples, as in the following claim: -->
{{blockquote|It is difficult for modern Western man to grasp that the Greeks really had no concept of consciousness in that they did not class together phenomena as varied as problem solving, remembering, imagining, perceiving, feeling pain, dreaming, and acting on the grounds that all these are manifestations of being aware or being conscious.<ref name="EPhil-Psyc" >{{cite encyclopedia|last1=Peters|first1=R. S.|last2=Mace|first2=C. A.|editor1-last=Edwards|editor1-first=Paul|encyclopedia=The Encyclopedia of Philosophy|contribution=Psychology|volume= 7|date=1967|publisher=Macmillan, Inc.|pages=1–27|edition=Reprint 1972}}</ref>{{rp|4}}}}
{{blockquote|'''Consciousness'''—The having of perceptions, thoughts, and [[feelings]]; awareness. The term is impossible to define except in terms that are unintelligible without a grasp of what consciousness means. Many fall into the trap of equating consciousness with [[self-consciousness]]—to be conscious it is only necessary to be aware of the external world. Consciousness is a fascinating but elusive phenomenon: it is impossible to specify what it is, what it does, or why it has evolved. Nothing worth reading has been written on it.<ref name=Sutherland>{{cite book|author=Stuart Sutherland|title=Macmillan Dictionary of Psychology|publisher=Macmillan|chapter=Consciousness|year=1989|isbn=978-0-333-38829-7|author-link=Stuart Sutherland}}</ref>}}
Using 'awareness', however, as a definition or synonym of consciousness is not a simple matter:
{{blockquote|text=If awareness of the environment . . . is the criterion of consciousness, then even the protozoans are conscious. If awareness of awareness is required, then it is doubtful whether the great apes and human infants are conscious.<ref name=Thomas67>{{cite encyclopedia|last= Thomas|first= Garth J.|encyclopedia= Encyclopædia Britannica|date=1967|volume=6|page=366|title= Consciousness}}</ref>}}
In 1974, philosopher [[Thomas Nagel]] used 'consciousness', 'conscious experience', 'subjective experience' and the 'subjective character of experience' as synonyms for something that "occurs at many levels of animal life ... [although] it is difficult to say in general what provides evidence of it."<ref name=NagelBat1>{{cite journal | last1 = Nagel | first1 = Thomas | year = 1974 | title = What Is It Like to Be a Bat? | journal = The Philosophical Review | volume = 83 | issue = 4| pages = 435–450 | doi=10.2307/2183914 | jstor=2183914 |url={{google books |plainurl=y |id=fBGPBRX3JsQC|page=165}}}}</ref> Nagel's terminology also included what has been described as "the standard 'what it's like' locution"<ref name=Levine10>Levine, Joseph (2010). Review of Uriah Kriegel, Subjective Consciousness: A Self-Representational Theory. ''Notre Dame Philosophical Reviews'' 2010 (3).</ref> in reference to the impenetrable [[subjectivity]] of any organism's [[experience]] which Nagel referred to as "inner life" without implying any kind of introspection. On Nagel's approach, [[Peter Hacker]] commented:{{r|Hacker2002|p=158}} "Consciousness, thus conceived, is extended to the whole domain of 'experience'—of 'Life' {{em|subjectively understood}}." He regarded this as a "novel analysis of consciousness"{{r|Hacker2012|p=14}} and has been particularly critical of Nagel's terminology and its philosophical consequences.{{r|Hacker2012}} In 2002 he attacked Nagel's 'what it's like' phrase as "malconstructed" and meaningless English—it sounds as if it asks for an analogy, but does not—and he called Nagel's approach logically "misconceived" as a definition of consciousness.<ref name=Hacker2002>{{cite journal |author-link= Peter Hacker |last= Hacker |first= P.M.S. |url=http://www.phps.at/texte/HackerP1.pdf |title= Is there anything it is like to be a bat? |journal= Philosophy |volume= 77 |date= 2002 |issue= 2 |pages= 157–174 |doi=10.1017/s0031819102000220}}</ref> In 2012 Hacker went further and asserted that Nagel had "laid the groundwork for ... forty years of fresh confusion about consciousness" and that "the contemporary philosophical conception of consciousness that is embraced by the 'consciousness studies community' is incoherent".{{r|Hacker2012|p=13-15}}
=== Influence on research ===
Many philosophers have argued that consciousness is a unitary concept that is understood by the majority of people despite the difficulty philosophers have had defining it.<ref name="Antony2001">{{cite journal|author=Michael V. Antony|year=2001|title=Is ''consciousness'' ambiguous?|journal=Journal of Consciousness Studies|volume=8|pages=19–44}}</ref> The term 'subjective experience', following Nagel, is ambiguous, as philosophers seem to differ from non-philosophers in their intuitions about its meaning.<ref>{{cite journal |author=Justin Sytsma |author2=Edouard Machery |title=Two conceptions of subjective experience |journal=Philosophical Studies |year=2010 |volume=151 |issue=2 |pages=299–327 |doi=10.1007/s11098-009-9439-x|s2cid=2444730 |url=http://philsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf |archive-url=https://ghostarchive.org/archive/20221009/http://philsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf |archive-date=2022-10-09 |url-status=live}}</ref> [[Max Velmans]] proposed that the "everyday understanding of consciousness" uncontroversially "refers to experience itself rather than any particular thing that we observe or experience" and he added that consciousness "is [therefore] exemplified by {{em|all}} the things that we observe or experience",{{r|Velmans2009|p=4}}<!--COMMENT: Velman's statement is a confusion of logical categories, an error of logical typing. Empirical science has discovered much about the processes of perception because there are 'objects' perceived by 'organs' of perception; but "experience itself" is an idea, a concept, an abstraction. Since 'experience' is not some 'thing' experienced, therefore (by definition) it cannot be empirically analyzed, reduced or compared. The 'things that we actually observe or experience' are examples of 'things' and 'kinds of things'; it's incorrect to call them 'examples of experience'! --> whether thoughts, feelings, or perceptions. [[Max Velmans|Velmans]] noted, however, as of 2009, that there was a deep level of "confusion and internal division"<ref name=Velmans2009>{{cite journal|author=Max Velmans|title=How to define consciousness—and how not to define consciousness|journal=Journal of Consciousness Studies|year=2009|volume=16|pages=139–156|author-link=Max Velmans}}</ref> among experts about the phenomenon of consciousness, because researchers lacked "a sufficiently well-specified use of the term...to agree that they are investigating the same thing".{{r|Velmans2009|p=3}} He argued additionally that "pre-existing theoretical commitments" to competing explanations of consciousness might be a source of bias.
Within the "modern consciousness studies" community the technical phrase 'phenomenal consciousness' is a common synonym for all forms of awareness, or simply '[[experience]]',{{r|Velmans2009|p=4|quote=In common usage, the term "consciousness" is often synonymous with "awareness", "conscious awareness", and "experience".}} without differentiating between inner and outer, or between higher and lower types. With advances in brain research, "the presence or absence of ''experienced phenomena''"{{r|Velmans2009|p=3}} of any kind underlies the work of those [[neuroscientist]]s who seek "to analyze the precise relation of [[Phenomenology (psychology)|conscious phenomenology]] to its associated information processing" in the brain.{{r|Velmans2009|p=10}} This [[neuroscience|neuroscientific]] goal is to find the "neural correlates of consciousness" (NCC). One criticism of this goal is that it begins with a theoretical commitment to the neurological origin of all "experienced phenomena" whether inner or outer.{{efn|"Investigating "how experience ensues from the brain", rather than exploring a factual claim, betrays a philosophical commitment".<ref name="Gomez2019">{{cite journal|last1=Gomez-Marin|first1=Alex|last2=Arnau|first2=Juan|title=The False Problem of Consciousness|journal=Behavior of Organisms Laboratory|date=2019|url=http://philsci-archive.pitt.edu/15699/1/MS_GomezMarin_Arnau.pdf}}</ref>}} Also, the fact that the easiest 'content of consciousness' to be so analyzed is "the experienced three-dimensional world (the phenomenal world) beyond the body surface"{{r|Velmans2009|p=4}} invites another criticism, that most consciousness research since the 1990s, perhaps because of bias, has focused on processes of [[perception|external perception]].<ref name="Frith2016">{{cite book|editor-last1=Engel|editor-first1=Andreas K.|last1=Frith|first1=Chris|last2=Metzinger|first2=Thomas|title= The Pragmatic Turn: Toward Action-Oriented Views in Cognitive Science|url=https://www.researchgate.net/publication/304657860|chapter=What's the Use of Consciousness? How the Stab of Conscience Made Us Really Conscious|pages=193–214|isbn= | Within the "modern consciousness studies" community the technical phrase 'phenomenal consciousness' is a common synonym for all forms of awareness, or simply '[[experience]]',{{r|Velmans2009|p=4|quote=In common usage, the term "consciousness" is often synonymous with "awareness", "conscious awareness", and "experience".}} without differentiating between inner and outer, or between higher and lower types. With advances in brain research, "the presence or absence of ''experienced phenomena''"{{r|Velmans2009|p=3}} of any kind underlies the work of those [[neuroscientist]]s who seek "to analyze the precise relation of [[Phenomenology (psychology)|conscious phenomenology]] to its associated information processing" in the brain.{{r|Velmans2009|p=10}} This [[neuroscience|neuroscientific]] goal is to find the "neural correlates of consciousness" (NCC). 
One criticism of this goal is that it begins with a theoretical commitment to the neurological origin of all "experienced phenomena" whether inner or outer.{{efn|"Investigating "how experience ensues from the brain", rather than exploring a factual claim, betrays a philosophical commitment".<ref name="Gomez2019">{{cite journal|last1=Gomez-Marin|first1=Alex|last2=Arnau|first2=Juan|title=The False Problem of Consciousness|journal=Behavior of Organisms Laboratory|date=2019|url=http://philsci-archive.pitt.edu/15699/1/MS_GomezMarin_Arnau.pdf}}</ref>}} Also, the fact that the easiest 'content of consciousness' to be so analyzed is "the experienced three-dimensional world (the phenomenal world) beyond the body surface"{{r|Velmans2009|p=4}} invites another criticism, that most consciousness research since the 1990s, perhaps because of bias, has focused on processes of [[perception|external perception]].<ref name="Frith2016">{{cite book|editor-last1=Engel|editor-first1=Andreas K.|last1=Frith|first1=Chris|last2=Metzinger|first2=Thomas|title= The Pragmatic Turn: Toward Action-Oriented Views in Cognitive Science|url=https://www.researchgate.net/publication/304657860|chapter=What's the Use of Consciousness? How the Stab of Conscience Made Us Really Conscious|pages=193–214|isbn= 978-0-262-03432-6|doi= 10.7551/mitpress/9780262034326.003.0012|date=March 2016|author-link1=Chris Frith|author-link2=Thomas Metzinger}}</ref> | ||
From a [[history of psychology]] perspective, [[Julian Jaynes]] rejected popular but "superficial views of consciousness"{{r|JJ90|p=447}} especially those which equate it with "that vaguest of terms, [[experience]]".<ref name=JJ76>{{cite book|last=Jaynes|first=Julian|date=1976|isbn=0-395-20729-0|author-link=Julian Jaynes|publisher=Houghton Mifflin|title=The Origin of Consciousness in the Breakdown of the Bicameral Mind|url=https://archive.org/details/originofconsciou0000unse|url-access=registration}}</ref>{{rp|8}} In 1976 he insisted that if not for [[introspection]], which for decades had been ignored or taken for granted rather than explained, there could be no "conception of what consciousness is"{{r|JJ76|p=18}} and in 1990, he reaffirmed the traditional idea of the phenomenon called 'consciousness', writing that "its [[denotation|denotative definition]] is, as it was for René Descartes, John Locke, and [[David Hume]], what is introspectable".{{r|JJ90|p=450}} Jaynes saw consciousness as an important but small part of human mentality, and he asserted: "there can be no progress in the science of consciousness until ... what is introspectable [is] sharply distinguished"{{r|JJ90|p=447}} from the {{em|unconscious}} processes of [[cognition]] such as [[perception]], reactive [[awareness]] and [[attention]], and automatic forms of [[learning]], [[problem-solving]], and [[decision-making]].{{r|JJ76|p=21-47}} | From a [[history of psychology]] perspective, [[Julian Jaynes]] rejected popular but "superficial views of consciousness"{{r|JJ90|p=447}} especially those which equate it with "that vaguest of terms, [[experience]]".<ref name=JJ76>{{cite book|last=Jaynes|first=Julian|date=1976|isbn=0-395-20729-0|author-link=Julian Jaynes|publisher=Houghton Mifflin|title=The Origin of Consciousness in the Breakdown of the Bicameral Mind|url=https://archive.org/details/originofconsciou0000unse|url-access=registration}}</ref>{{rp|8}} In 1976 he insisted that if not for [[introspection]], which for decades had been ignored or taken for granted rather than explained, there could be no "conception of what consciousness is"{{r|JJ76|p=18}} and in 1990, he reaffirmed the traditional idea of the phenomenon called 'consciousness', writing that "its [[denotation|denotative definition]] is, as it was for René Descartes, John Locke, and [[David Hume]], what is introspectable".{{r|JJ90|p=450}} Jaynes saw consciousness as an important but small part of human mentality, and he asserted: "there can be no progress in the science of consciousness until ... what is introspectable [is] sharply distinguished"{{r|JJ90|p=447}} from the {{em|unconscious}} processes of [[cognition]] such as [[perception]], reactive [[awareness]] and [[attention]], and automatic forms of [[learning]], [[problem-solving]], and [[decision-making]].{{r|JJ76|p=21-47}} | ||
<!-- Al Byrd, the author of Superhuman Creators, defines consciousness, for animals, humans and artificial agents, as the effect of integrating and filtering many different types of affordance awareness; that is, awareness of the action possibilities in an environment. According to this definition, all agents that can perceive and act on affordances are conscious to some extent. -->
=== Medical definition ===
In [[medicine]], a "level of consciousness" terminology is used to describe a patient's [[arousal]] and responsiveness, which can be seen as a continuum of states ranging from full alertness and [[Understanding|comprehension]], through disorientation, [[delirium]], loss of meaningful communication, and finally loss of movement in response to painful [[Stimulus (physiology)|stimuli]].<ref>{{cite book|first=Güven|last=Güzeldere|title=The Nature of Consciousness: Philosophical Debates|year=1997|editor-first=Ned|editor-last=Block|editor2-first=Owen|editor2-last=Flanagan|editor3-first=Güven|editor3-last=Güzeldere|pages=1–67|location=Cambridge, MA|publisher=MIT Press}}</ref> Issues of practical concern include how the level of consciousness can be assessed in severely ill, comatose, or anesthetized people, and how to treat conditions in which consciousness is impaired or disrupted.<ref>{{cite journal|title=Late recovery from the minimally conscious state: ethical and policy implications|first1=J. J.|last1=Fins|first2=N. D.|last2=Schiff|first3=K. M.|last3=Foley|journal=Neurology|year=2007|volume=68|pages=304–307|pmid=17242341|doi=10.1212/01.wnl.0000252376.43779.96|issue=4|s2cid=32561349}}</ref> The degree or level of consciousness is measured by standardized behavior observation scales such as the [[Glasgow Coma Scale]].
== Philosophy of mind ==
While historically philosophers have defended various views on consciousness, surveys indicate that [[physicalism]] is now the dominant position among contemporary philosophers of mind.<ref>{{cite web|url=https://philpapers.org/surveys/results.pl|title=PhilPapers Survey 2020|publisher=PhilPapers|access-date=2023-12-15}}</ref> Overviews of the field often combine historical perspectives (e.g., Descartes, Locke, [[Immanuel Kant|Kant]]) with an organization around key issues in contemporary debates; an alternative is to focus primarily on current philosophical stances and empirical findings.
=== Coherence of the concept ===
Philosophers differ from non-philosophers in their intuitions about what consciousness is.<ref>{{cite journal|author=Justin Sytsma|author2=Edouard Machery|title=Two conceptions of subjective experience|journal=Philosophical Studies|year=2010|volume=151|issue=2|pages=299–327|doi=10.1007/s11098-009-9439-x|s2cid=2444730|url=http://philsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://philsci-archive.pitt.edu/archive/00004888/01/Two_Conceptions_of_Subjective_Experience.pdf|archive-date=2022-10-09|url-status=live}}</ref> While most people have a strong intuition for the existence of what they refer to as consciousness,<ref name=Antony2001/> skeptics argue that this intuition is mistaken, either because the concept of consciousness is intrinsically incoherent, or because our intuitions about it are based in illusions. [[Gilbert Ryle]], for example, argued that the traditional understanding of consciousness depends on a [[mind–body dualism|Cartesian dualist]] outlook that improperly distinguishes between mind and body, or between mind and world. He proposed that we speak not of minds, bodies, and the world, but of entities, or identities, acting in the world. Thus, by speaking of "consciousness" we end up misleading ourselves by thinking that there is any sort of thing as consciousness separated from behavioral and linguistic understandings.<ref name=RyleConsciousness>{{cite book|title=The Concept of Mind|author=Gilbert Ryle|publisher=University of Chicago Press|date=2000 |orig-date=1949|pages=156–163|isbn=978-0-226-73296-1|title-link=The Concept of Mind|author-link=Gilbert Ryle}}</ref>
=== Types ===
[[Ned Block]] argues that discussions on consciousness often fail to properly distinguish ''[[phenomenal]] consciousness'' from ''access consciousness''. These terms had been used before Block, but he adopted the short forms P-consciousness and A-consciousness.<ref name=block>{{cite book|title=The Nature of Consciousness: Philosophical Debates|editor=N. Block|editor2=O. Flanagan|editor3=G. Guzeldere|chapter=On a confusion about a function of consciousness|author=Ned Block|pages=375–415|year=1998|isbn=978-0-262-52210-6|publisher=MIT Press|chapter-url=http://cogprints.org/231/1/199712004.html|author-link=Ned Block|access-date=2011-09-10|archive-date=2011-11-03|archive-url=https://web.archive.org/web/20111103034117/http://cogprints.org/231/1/199712004.html|url-status=live}} Pages 230 and 231 in [https://www.nedblock.us/papers/1995_Function.pdf the version on the author's own website].</ref> According to Block:
* P-consciousness is raw experience: it is moving, colored forms, sounds, sensations, emotions and feelings with our bodies and responses at the center. These experiences, considered independently of any impact on behavior, are called [[qualia]].
* A-consciousness is the phenomenon whereby information in our minds is accessible for verbal report, reasoning, and the control of behavior. So, when we [[perception|perceive]], information about what we perceive is access conscious; when we [[introspection|introspect]], information about our thoughts is access conscious; when we [[memory|remember]], information about the past is access conscious, and so on.
Block adds that P-consciousness does not allow of easy definition: he admits that he "cannot define P-consciousness in any remotely [[circular definition|noncircular]] way".<ref name=block />
Although some philosophers, such as [[Daniel Dennett]], have disputed the validity of this distinction,<ref name="D375">{{cite book|author=Daniel Dennett|year=2004|title=Consciousness Explained|page=375|publisher=Penguin|isbn=978-0-7139-9037-9|title-link=Consciousness Explained|author-link=Daniel Dennett}}</ref> others have broadly accepted it. [[David Chalmers]] has argued that A-consciousness can in principle be understood in mechanistic terms, but that understanding P-consciousness is much more challenging: he calls this the [[hard problem of consciousness]].<ref name=ChalmersHardProblem>{{cite journal|url=http://www.imprint.co.uk/chalmers.html|title=Facing up to the problem of consciousness|author=David Chalmers|journal=Journal of Consciousness Studies|volume=2|year=1995|pages=200–219|archive-url=https://web.archive.org/web/20050308163649/http://www.imprint.co.uk/chalmers.html|archive-date=2005-03-08|author-link=David Chalmers}}</ref>
Some philosophers believe that Block's two types of consciousness are not the end of the story. [[William Lycan]], for example, argued in his book ''Consciousness and Experience'' that at least eight clearly distinct types of consciousness can be identified (organism consciousness; control consciousness; consciousness ''of''; state/event consciousness; reportability; introspective consciousness; subjective consciousness; self-consciousness)—and that even this list omits several more obscure forms.<ref>{{cite book|author=William Lycan|title=Consciousness and Experience|pages=1–4|year=1996|publisher=MIT Press|isbn=978-0-262-12197-2|author-link=William Lycan}}</ref>
There is also debate over whether or not A-consciousness and P-consciousness always coexist or if they can exist separately. Although P-consciousness without A-consciousness is more widely accepted, there have been some hypothetical examples of A without P. Block, for instance, suggests the case of a "[[Philosophical zombie|zombie]]" that is computationally identical to a person but without any subjectivity. However, he remains somewhat skeptical, concluding "I don't know whether there are any actual cases of A-consciousness without P-consciousness, but I hope I have illustrated their conceptual possibility".<ref>{{cite journal|last= Block|first=Ned|year = 1995|title = How many concepts of consciousness?|url = https://pdfs.semanticscholar.org/6174/aff557977a75c5d76463871180f8d1befbbc.pdf|archive-url = https://web.archive.org/web/20200210172202/https://pdfs.semanticscholar.org/6174/aff557977a75c5d76463871180f8d1befbbc.pdf|archive-date = 2020-02-10|journal = Behavioral and Brain Sciences|volume = 18|issue = 2| pages = 272–284|doi=10.1017/s0140525x00038486| s2cid = 41023484}}</ref>
=== Distinguishing consciousness from its contents ===
[[Sam Harris]] observes: "At the level of your experience, you are not a body of cells, organelles, and atoms; you are consciousness and its ever-changing contents".<ref>Harris, S. (12 October 2011). The mystery of consciousness. ''Sam Harris.'' https://www.samharris.org/blog/the-mystery-of-consciousness {{Webarchive|url=https://web.archive.org/web/20230423061921/https://www.samharris.org/blog/the-mystery-of-consciousness|date=2023-04-23}}</ref> Seen in this way, consciousness is a subjectively experienced, ever-present field in which things (the contents of consciousness) come and go.
Christopher Tricker argues that this field of consciousness is symbolized by the mythical bird that opens the Daoist classic the [[Zhuangzi (book)|''Zhuangzi.'']] This bird's name is Of a Flock ([[Peng (mythology)|''peng'' 鵬]]), yet its back is countless thousands of miles across and its wings are like clouds arcing across the heavens. "Like Of a Flock, whose wings arc across the heavens, the wings of your consciousness span to the horizon. At the same time, the wings of every other being's consciousness span to the horizon. You are of a flock, one bird among kin."<ref>Tricker, C. (2022). [https://thecicadaandthebird.com The cicada and the bird. The usefulness of a useless philosophy. Chuang Tzu's ancient wisdom translated for modern life.] {{Webarchive|url=https://web.archive.org/web/20230421032929/https://thecicadaandthebird.com/|date=2023-04-21}} Page 52. [https://books.google.com/books?id=YnCaEAAAQBAJ (Google Books)] {{Webarchive|url=https://web.archive.org/web/20230608153319/https://books.google.com/books?id=YnCaEAAAQBAJ|date=2023-06-08}}</ref>
=== Mind–body problem ===
{{Main|Mind–body problem}}
The first influential philosopher to discuss this question specifically was Descartes, and the answer he gave is known as [[mind–body dualism]]. Descartes proposed that consciousness resides within an immaterial domain he called ''[[mental substance|res cogitans]]'' (the realm of thought), in contrast to the domain of material things, which he called ''[[res extensa]]'' (the realm of extension).<ref>{{cite book|title=Philosophy of Man: selected readings|last=Dy|first=Manuel B. Jr.|publisher=Goodwill Trading Co.|year=2001|isbn=978-971-12-0245-3|page=97}}</ref> He suggested that the interaction between these two domains occurs inside the brain, perhaps in a small midline structure called the [[pineal gland]].<ref name="S_pineal">{{cite web|title=Descartes and the Pineal Gland|publisher=Stanford University|date=November 5, 2008|url=http://plato.stanford.edu/entries/pineal-gland/|access-date=2025-02-07|archive-date=2019-12-16|archive-url=https://web.archive.org/web/20191216035157/https://plato.stanford.edu/entries/pineal-gland/|url-status=live}}</ref>
Although it is widely accepted that Descartes explained the problem cogently, few later philosophers have been happy with his solution, and his ideas about the pineal gland have especially been ridiculed.<ref name="S_pineal" /> However, no alternative solution has gained general acceptance. Proposed solutions can be divided broadly into two categories: [[mind–body dualism|dualist]] solutions that maintain Descartes's rigid distinction between the realm of consciousness and the realm of matter but give different answers for how the two realms relate to each other; and [[monism|monist]] solutions that maintain that there is really only one realm of being, of which consciousness and matter are both aspects. Each of these categories itself contains numerous variants. The two main types of dualism are [[substance dualism]] (which holds that the mind is formed of a distinct type of substance not governed by the laws of physics), and [[property dualism]] (which holds that the laws of physics are universally valid but cannot be used to explain the mind). The three main types of [[monism]] are physicalism (which holds that the mind is made out of matter), [[idealism]] (which holds that only thought or experience truly exists, and matter is merely an illusion), and [[neutral monism]] (which holds that both mind and matter are aspects of a distinct essence that is itself identical to neither of them). There are also, however, a large number of idiosyncratic theories that cannot cleanly be assigned to any of these schools of thought.<ref>{{cite book|author=William Jaworski|title=Philosophy of Mind: A Comprehensive Introduction|publisher=John Wiley and Sons|year=2011|isbn=978-1-4443-3367-1|pages=5–11}}</ref>
Since the dawn of Newtonian science with its vision of simple mechanical principles governing the entire universe, some philosophers have been tempted by the idea that consciousness could be explained in purely physical terms. The first influential writer to propose such an idea explicitly was [[Julien Offray de La Mettrie]], in his book ''[[Man a Machine]]'' (''L'homme machine''). His arguments, however, were very abstract.<ref name=LaMettrie>{{cite book| editor=Ann Thomson|author=Julien Offray de La Mettrie|title=Machine man and other writings|publisher=Cambridge University Press|year=1996|isbn=978-0-521-47849-6|author-link=Julien Offray de La Mettrie}}</ref> The most influential modern physical theories of consciousness are based on [[psychology]] and [[neuroscience]]. Theories proposed by neuroscientists such as [[Gerald Edelman]]<ref>{{cite book|title=Bright Air, Brilliant Fire: On the Matter of the Mind|author=Gerald Edelman|publisher=Basic Books|year=1993|isbn=978-0-465-00764-6|author-link=Gerald Edelman|url-access=registration|url=https://archive.org/details/brightairbrillia00gera}}</ref> and [[António Damásio|Antonio Damasio]],<ref name=DamasioFeeling>{{cite book|author=Antonio Damasio|year=1999|title=The Feeling of What Happens: Body and Emotion in the Making of Consciousness|location=New York|publisher=Harcourt Press|isbn=978-0-15-601075-7|author-link=Antonio Damasio|url=https://archive.org/details/feelingofwhathap00dama_0}}</ref> and by philosophers such as Daniel Dennett,<ref>{{cite book|author=Daniel Dennett|year=1991|title=Consciousness Explained|url=https://archive.org/details/consciousnessexp00denn|url-access=registration|location=Boston|publisher=Little, Brown and Company|isbn=978-0-316-18066-5|author-link=Daniel Dennett}}</ref> seek to explain consciousness in terms of neural events occurring within the brain. Many other neuroscientists, such as [[Christof Koch]],<ref name=KochQuest>{{cite book| author=Christof Koch|year=2004|title=The Quest for Consciousness|location=Englewood, CO|publisher=Roberts & Company|isbn=978-0-9747077-0-9|author-link=Christof Koch}}</ref> have explored the neural basis of consciousness without attempting to frame all-encompassing global theories. At the same time, [[computer scientist]]s working in the field of [[artificial intelligence]] have pursued the goal of creating digital computer programs that can [[Artificial consciousness|simulate or embody consciousness]].<ref>Ron Sun and Stan Franklin, Computational models of consciousness: A taxonomy and some examples. In: P.D. Zelazo, M. Moscovitch, and E. Thompson (eds.), ''The Cambridge Handbook of Consciousness'', pp. 151–174. Cambridge University Press, New York. 2007</ref>
A few theoretical physicists have argued that classical physics is intrinsically incapable of explaining the holistic aspects of consciousness, but that [[Quantum mechanics|quantum theory]] may provide the missing ingredients. Several theorists have therefore proposed [[quantum mind]] (QM) theories of consciousness.<ref name="Stanford_qm_cos">{{cite book|title=Quantum Approaches to Consciousness|publisher=Stanford University|date=December 25, 2011|url=http://plato.stanford.edu/entries/qt-consciousness/|access-date=December 25, 2011|archive-date=August 8, 2021|archive-url=https://web.archive.org/web/20210808080906/https://plato.stanford.edu/entries/qt-consciousness/|url-status=live}}</ref> Notable theories falling into this category include the [[holonomic brain theory]] of [[Karl H. Pribram|Karl Pribram]] and [[David Bohm]], and the [[Orch-OR|Orch-OR theory]] formulated by [[Stuart Hameroff]] and [[Roger Penrose]]. Some of these QM theories offer descriptions of phenomenal consciousness, as well as QM interpretations of access consciousness. None of the quantum mechanical theories have been confirmed by experiment. Recent publications by G. Guerreshi, J. Cai, S. Popescu, and H. Briegel<ref name="Cai2010">{{cite journal|doi=10.1103/PhysRevE.82.021921|pmid=20866851|last1=Cai|first1=J.|last2=Popescu|first2=S.|last3=Briegel|first3=H.|title=Persistent dynamic entanglement from classical motion: How bio-molecular machines can generate non-trivial quantum states|journal=Physical Review E|volume=82|issue=2|article-number=021921|arxiv=0809.4906|bibcode=2010PhRvE..82b1921C|year=2010|s2cid=23336691}}</ref> could falsify proposals such as those of Hameroff, which rely on [[quantum entanglement]] in protein.
At the present time many scientists and philosophers consider the arguments for an important role of quantum phenomena to be unconvincing.<ref>{{cite book|author=John Searle|year=1997|title=The Mystery of Consciousness|publisher=The New York Review of Books|pages=53–88|isbn=978-0-940322-06-6|author-link=John Searle}}</ref> Empirical evidence also weighs against the notion of quantum consciousness: an experiment on [[wave function collapse]] led by [[Catalina Curceanu]] in 2022 suggests that quantum consciousness, as proposed by [[Roger Penrose]] and [[Stuart Hameroff]], is highly implausible.<ref name="Curceanuetal">{{cite journal|last1=Derakhshani|first1=Maaneli|last2=Diósi|first2=Lajos|last3=Laubenstein|first3=Matthias|last4=Piscicchia|first4=Kristian|last5=Curceanu|first5=Catalina|title=At the crossroad of the search for spontaneous radiation and the Orch OR consciousness theory|journal=Physics of Life Reviews|date=September 2022|volume=42|pages=8–14|doi=10.1016/j.plrev.2022.05.004|pmid=35617922|bibcode=2022PhLRv..42....8D|url=https://www.sciencedirect.com/science/article/abs/pii/S1571064522000197|url-access=subscription}}</ref>
Apart from the general question of the [[Hard problem of consciousness|"hard problem" of consciousness]] (which is, roughly speaking, the question of how mental experience can arise from a physical basis<ref>{{cite book|title=The Consciousness Paradox: Consciousness, Concepts, and Higher-Order Thoughts|author= Rocco J. Gennaro|chapter-url=https://books.google.com/books?id=t-XgKMgzwk4C&pg=PA75|page=75|chapter=§4.4 The hard problem of consciousness|isbn=978-0-262-01660-5|year=2011|publisher=MIT Press}}</ref>), a more specialized question is how to square the subjective notion that we are in control of our decisions (at least in some small measure) with the customary view of causality that subsequent events are caused by prior events. The topic of [[free will]] is the philosophical and scientific examination of this conundrum.
=== Problem of other minds ===
{{Main|Problem of other minds}}
=== Qualia ===
{{Main|Qualia}}
The term "qualia" was introduced in philosophical literature by [[C. I. Lewis]]. The word is derived from Latin and means "of what sort". It is basically a quantity or property of something as perceived or experienced by an individual, like the scent of rose, the taste of wine, or the pain of a headache. They are difficult to articulate or describe. The philosopher and scientist [[Daniel Dennett]] describes them as "the way things seem to us", while philosopher and cognitive scientist [[David Chalmers]] expanded on qualia as the "[[hard problem of consciousness]]" in the 1990s. When qualia | The term "qualia" was introduced in philosophical literature by [[C. I. Lewis]]. The word is derived from Latin and means "of what sort". It is basically a quantity or property of something as perceived or experienced by an individual, like the scent of rose, the taste of wine, or the pain of a headache. They are difficult to articulate or describe. The philosopher and scientist [[Daniel Dennett]] describes them as "the way things seem to us", while philosopher and cognitive scientist [[David Chalmers]] expanded on qualia as the "[[hard problem of consciousness]]" in the 1990s. When qualia are experienced, activity is simulated in the brain, and these processes are called [[neural correlates of consciousness]] (NCCs). Many scientific studies have been done to attempt to link particular brain regions with emotions or experiences.<ref name=":0">{{Cite book |last1=Parsons |first1=Paul |title=50 Ideas You Really Need to Know: Science |last2=Dixon |first2=Gail |publisher=[[Quercus]] |year=2016 |isbn=978-1-78429-614-8 |location=London |pages=141–143 |language=en}}</ref><ref>Oxford English Dictionary, "qualia", 3rd ed., Oxford University Press, 2010. Accessed October 3, 2024. https://www.oed.com/search/dictionary/?scope=Entries&q=qualia.</ref><ref>{{Cite web |title=Qualia {{!}} Internet Encyclopedia of Philosophy |url=https://iep.utm.edu/qualia/#:~:text=The%20term%20%E2%80%9Cqualia%E2%80%9D%20(singular,properties%20of%20sense%2Ddata%20themselves. |access-date=4 October 2024 |website=Internet Encyclopedia of Philosophy}}</ref> | ||
Species which experience qualia are said to have [[sentience]], which is central to the [[animal rights movement]], because it includes the ability to experience pain and suffering.<ref name=":0" />
=== Personal identity ===
{{Main|Personal identity}}
An unsolved problem in the philosophy of consciousness is how it relates to the nature of personal identity.<ref>{{cite web |title=Personal Identity - Internet Encyclopedia of Philosophy |url=http://www.iep.utm.edu/person-i/ |url-status=live |archive-url=https://web.archive.org/web/20170903032724/http://www.iep.utm.edu/person-i/ |archive-date=3 September 2017 |access-date=24 January 2025 |website=www.iep.utm.edu}}</ref> This includes questions regarding whether someone is the "same person" from moment to moment. If that is the case, another question is what exactly the "identity carrier" is that makes a conscious being "the same" being from one moment to the next. The problem of determining personal identity also includes questions such as Benj Hellie's [[vertiginous question]], which can be summarized as "Why am I me and not someone else?".<ref>{{cite journal |last1=Hellie |first1=Benj |date=2014 |title=Against Egalitarianism |url=https://philpapers.org/rec/HELCFC |journal=Analysis |volume=73 |issue= 2|pages=304–320 |doi=10.1093/analys/ans101 }}</ref> The philosophical problems regarding the nature of personal identity have been extensively discussed by Thomas Nagel in his book ''[[The View from Nowhere]]''.
A common view of personal identity is that an individual has a continuous identity that persists from moment to moment, forming a line segment stretching across time from birth to death. In the case of an afterlife as described in Abrahamic religions, one's personal identity is believed to stretch infinitely into the future, forming a ray or line. This notion of identity is similar to the form of dualism advocated by René Descartes. However, some philosophers argue that this common notion of personal identity is unfounded. [[Daniel Kolak]] has argued extensively against it in his book ''I am You''.<ref>{{Cite book |last=Kolak |first=Daniel |url=https://digitalphysics.ru/pdf/Kaminskii_A_V/Kolak_I_Am_You.pdf |title=I Am You: The Metaphysical Foundations for Global Ethics |date=2007-11-03 |publisher=Springer Science & Business Media |isbn=978-1-4020-3014-7 |language=en |archive-url=https://web.archive.org/web/20240906163443/https://digitalphysics.ru/pdf/Kaminskii_A_V/Kolak_I_Am_You.pdf |archive-date=2024-09-06 |url-status=live}}</ref> Kolak refers to this linear notion of personal identity as "Closed individualism". Another view of personal identity according to Kolak is "Empty individualism", in which one's personal identity only exists for a single moment of time. However, Kolak himself advocates a view of personal identity called [[Open individualism]], in which all consciousness is in reality a single being and individual personal identity does not really exist at all. Another philosopher who has contested the notion of personal identity is [[Derek Parfit]]. In his book ''[[Reasons and Persons]]'',<ref>{{Cite book |last=Parfit |first=Derek |url=https://archive.org/details/trent_0116300637661/page/n5/mode/2up |title=Reasons and persons |date=1984 |isbn=0-19-824615-3 |location=Oxford [Oxfordshire] |publisher=Clarendon Press |oclc=9827659}}</ref> he describes a thought experiment known as the [[teletransportation paradox]]. In Buddhist philosophy, the concept of [[anattā]] refers to the idea that the self is an illusion.
Other philosophers have argued that Hellie's vertiginous question has a number of philosophical implications relating to the [[Metaphysics|metaphysical]] nature of consciousness. [[Christian List]] argues that the vertiginous question and the existence of first-personal facts is evidence against physicalism, and evidence against other third-personal metaphysical pictures, including standard versions of [[Mind–body dualism|dualism]].<ref>{{cite web |url=https://philpapers.org/rec/LISTFA |title=The first-personal argument against physicalism |last=List |first=Christian |date=2023 |access-date=3 September 2024}}</ref> List also argues that the vertiginous question implies a "quadrilemma" for theories of consciousness. He claims that at most three of the following metaphysical claims can be true: 'first-person [[Philosophical realism|realism]]', 'non-[[solipsism]]', 'non-fragmentation', and 'one world'—and at least one of these four must be false.<ref>{{cite web |url=https://philarchive.org/rec/LISAQF |title=A quadrilemma for theories of consciousness |last=List |first=Christian |date=2023 |publisher=The Philosophical Quarterly |access-date=24 January 2025}}</ref> List has proposed a model he calls the "many-worlds theory of consciousness" in order to reconcile the subjective nature of consciousness without lapsing into solipsism.<ref>{{cite web |url=https://philarchive.org/rec/LISTMT-2 |title=The many-worlds theory of consciousness |last=List |first=Christian |date=2023 |publisher=The Philosophical Quarterly |access-date=24 January 2025}}</ref> Vincent Conitzer argues that the nature of identity is connected to [[A series and B series]] theories of time, and that the A-theory being true implies that the "I" is metaphysically distinguished from other perspectives.<ref>{{cite arXiv|last=Conitzer|first=Vincent |date=30 Aug 2020|title=The Personalized A-Theory of Time and Perspective|eprint=2008.13207v1|class=physics.hist-ph}}</ref> Other philosophical theories regarding the metaphysical nature of the self are Caspar Hare's theories of [[perspectival realism]],<ref>{{cite journal |last=Hare |first=Caspar |date=September 2010 |title=Realism About Tense and Perspective |url=http://web.mit.edu/~casparh/www/Papers/CJHarePerspectivalRealism.pdf |journal=Philosophy Compass |volume=5 |issue=9 |pages=760–769 |doi=10.1111/j.1747-9991.2010.00325.x |hdl-access=free |hdl=1721.1/115229}}</ref> in which things within perceptual awareness have a defining intrinsic property that exists absolutely and not relative to anything, and [[egocentric presentism]], in which the experiences of other individuals are not ''present'' in the way that one's current perspective is.<ref name="JPhil">{{cite journal|last=Hare|first=Caspar|title=Self-Bias, Time-Bias, and the Metaphysics of Self and Time|journal=The Journal of Philosophy|date=July 2007|volume=104|issue=7|pages=350–373|doi=10.5840/jphil2007104717|url=http://web.mit.edu/~casparh/www/Papers/CJHareSelfBias2.pdf}}</ref><ref>{{cite book|last=Hare|first=Caspar|title=On Myself, and Other, Less Important Subjects|year=2009|publisher=Princeton University Press|isbn=978-0-691-13531-1|url=http://press.princeton.edu/titles/8921.html}}</ref>
== Scientific study ==
For many decades, consciousness as a research topic was avoided by the majority of mainstream scientists, because of a general feeling that a phenomenon defined in subjective terms could not properly be studied using objective experimental methods.<ref>{{cite book|author=Horst Hendriks-Jansen|title=Catching ourselves in the act: situated activity, interactive emergence, evolution, and human thought|year=1996|publisher=Massachusetts Institute of Technology|page=114|isbn=978-0-262-08246-4}}</ref> In 1975 [[George Mandler]] published an influential psychological study which distinguished between slow, serial, and limited conscious processes and fast, parallel, and extensive unconscious ones.<ref>Mandler, G. "Consciousness: Respectable, useful, and probably necessary". In R. Solso (Ed.), ''Information processing and cognition''. NJ: LEA.</ref> The Science and Religion Forum's<ref>{{Cite web|date=2021|title=Science and Religion Forum|url=https://www.srforum.org/about|url-status=live |archive-url=https://web.archive.org/web/20161103075415/http://srforum.org/about/|archive-date=2016-11-03}}</ref> 1984 annual conference, ''From Artificial Intelligence to Human Consciousness'', identified the nature of consciousness as a matter for investigation; [[Donald Michie]] was a keynote speaker. Starting in the 1980s, an expanding community of neuroscientists and psychologists have associated themselves with a field called ''Consciousness Studies'', giving rise to a stream of experimental work published in books,<ref>Mandler, G. ''Consciousness recovered: Psychological functions and origins of thought''. Philadelphia: John Benjamins. 2002.</ref> journals such as ''[[Consciousness and Cognition]]'', ''Frontiers in Consciousness Research'', ''[[Psyche (consciousness journal)|Psyche]]'', and the ''[[Journal of Consciousness Studies]]'', along with regular conferences organized by groups such as the [[Association for the Scientific Study of Consciousness]]<ref>{{cite book|title=Toward a Science of Consciousness III: The Third Tucson Discussions and Debates|author=Stuart Hameroff|author2=Alfred Kaszniak|author3-link=David Chalmers|author3=David Chalmers|chapter=Preface|isbn=978-0-262-58181-3|publisher=MIT Press|year=1999|pages=xix–xx|author-link=Stuart Hameroff}}</ref> and the [[Society for Consciousness Studies]].
Modern medical and psychological investigations into consciousness are based on psychological experiments (including, for example, the investigation of [[Priming (psychology)|priming]] effects using [[subliminal stimuli]]),<ref>Lucido, R. J. (2023). Testing the consciousness causing collapse interpretation of quantum mechanics using subliminal primes derived from random fluctuations in radioactive decay. Journal of Consciousness Exploration & Research, 14(3), 185-194. https://doi.org/10.13140/RG.2.2.20344.72969</ref> and on [[case studies]] of alterations in consciousness produced by trauma, illness, or drugs. Broadly viewed, scientific approaches are based on two core concepts. The first identifies the content of consciousness with the experiences that are reported by human subjects; the second makes use of the concept of consciousness that has been developed by neurologists and other medical professionals who deal with patients whose behavior is impaired. In either case, the ultimate goals are to develop techniques for assessing consciousness objectively in humans as well as other animals, and to understand the neural and psychological mechanisms that underlie it.<ref name=KochQuest/>
=== Measurement via verbal report ===
[[File:Necker cube.svg|thumb|upright|The [[Necker cube]], an ambiguous image]]
Contingency awareness is another approach to assessing awareness: it is basically the conscious understanding of one's actions and their effects on one's environment.<ref>{{Cite web |title=Contingency Awareness - TalkSense |url=https://talksense.weebly.com/contingency-awareness.html#:~:text=Contingency%20Awareness%20(often%20referred%20to,actions%20elicit%20in%20the%20environment. |access-date=8 October 2024 |website=Weebly}}</ref> It is recognized as a factor in self-recognition. The brain processes involved in contingency awareness and learning are believed to depend on an intact [[medial temporal lobe]] and on age. A study done in 2020 involving [[Transcranial direct-current stimulation|transcranial direct current stimulation]], [[magnetic resonance imaging]] (MRI), and eyeblink classical conditioning supported the idea that the [[Parietal lobe|parietal cortex]] serves as a substrate for contingency awareness and that age-related disruption of this region is sufficient to impair awareness.<ref>{{Cite journal |last1=Cheng |first1=Dominic T. |last2=Katzenelson |first2=Alyssa M. |last3=Faulkner |first3=Monica L. |last4=Disterhoft |first4=John F. |last5=Power |first5=John M. |last6=Desmond |first6=John E. |date=4 March 2020 |title=Contingency awareness, aging, and the parietal lobe |journal=Neurobiology of Aging |language=en |volume=91 |pages=125–135 |doi=10.1016/j.neurobiolaging.2020.02.024 |pmc=7953809 |pmid=32241582}}</ref>
=== Neural correlates ===
{{Main|Neural correlates of consciousness}}
[[File:Neural Correlates Of Consciousness.jpg|thumb|upright=1.6|{{center|Schema of the neural processes underlying consciousness, from [[Christof Koch]]}}]]
A major part of the scientific literature on consciousness consists of studies that examine the relationship between the experiences reported by subjects and the activity that simultaneously takes place in their brains—that is, studies of the [[neural correlates of consciousness]]. The hope is to find activity in a particular part of the brain, or a particular pattern of global brain activity, that is strongly predictive of conscious awareness. Several brain imaging techniques, such as [[EEG]] and [[fMRI]], have been used for physical measures of brain activity in these studies.<ref>{{cite book|author=Christof Koch|year=2004|title=The Quest for Consciousness|location=Englewood, CO|publisher=Roberts & Company|isbn=978-0-9747077-0-9|pages=16–19|author-link=Christof Koch}}</ref>
Another idea that has drawn attention for several decades is that consciousness is associated with high-frequency (gamma band) [[neural oscillations|oscillations in brain activity]]. This idea arose from proposals in the 1980s, by Christoph von der Malsburg and Wolf Singer, that gamma oscillations could solve the so-called [[binding problem]], by linking information represented in different parts of the brain into a unified experience.<ref>{{cite journal|title=Binding by synchrony|author=Wolf Singer|journal=[[Scholarpedia]]|volume=2|issue=12|page=1657|doi=10.4249/scholarpedia.1657|year=2007|bibcode=2007SchpJ...2.1657S|doi-access=free|s2cid=34682132}}</ref> [[Rodolfo Llinás]], for example, proposed that consciousness results from [[recurrent thalamo-cortical resonance]] where the specific thalamocortical systems (content) and the non-specific (centromedial thalamus) thalamocortical systems (context) interact in the [[gamma wave|gamma]] band frequency via synchronous oscillations.<ref>{{cite book|author=Rodolfo Llinás|year=2002|title=I of the vortex: from neurons to self|publisher=MIT Press|isbn=978-0-262-62163-2|author-link=Rodolfo Llinás|title-link=I of the vortex: from neurons to self}}</ref>
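The idea of "binding by synchrony" can be loosely illustrated with a toy model. The sketch below is purely pedagogical and is not drawn from any of the works cited above: it simulates two Kuramoto-style phase oscillators with natural frequencies near the gamma band and arbitrary, invented parameter values, showing that sufficient coupling makes their phases lock, which is the kind of synchronization these proposals invoke for linking activity in different brain areas.

<syntaxhighlight lang="python">
# Toy illustration only (not from the cited sources): two coupled phase oscillators
# near the gamma band. With zero coupling their phases drift apart; with sufficient
# coupling they phase-lock. All parameter values are arbitrary.
import numpy as np

def simulate(coupling_k, f1=40.0, f2=45.0, dt=1e-4, t_max=1.0):
    """Integrate the two coupled oscillators and return their wrapped phase difference."""
    steps = int(t_max / dt)
    theta = np.zeros((steps, 2))
    theta[0] = [0.0, np.pi / 2]                 # arbitrary initial phases
    omega = 2 * np.pi * np.array([f1, f2])      # natural frequencies in rad/s
    for t in range(1, steps):
        th1, th2 = theta[t - 1]
        dth1 = omega[0] + coupling_k * np.sin(th2 - th1)
        dth2 = omega[1] + coupling_k * np.sin(th1 - th2)
        theta[t] = [th1 + dth1 * dt, th2 + dth2 * dt]
    return np.angle(np.exp(1j * (theta[:, 0] - theta[:, 1])))

for k in (0.0, 50.0):
    diff = simulate(coupling_k=k)
    # Phase-locking value over the second half of the run:
    # close to 1 when the oscillators are synchronized, much smaller when they drift.
    plv = abs(np.mean(np.exp(1j * diff[len(diff) // 2:])))
    print(f"coupling K = {k:5.1f}   phase-locking value = {plv:.2f}")
</syntaxhighlight>

Running the sketch prints a small phase-locking value for the uncoupled case and a value near one for the coupled case; the actual proposals concern large neural populations and measured field potentials rather than two idealized oscillators.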
A number of studies have shown that activity in primary sensory areas of the brain is not sufficient to produce consciousness: it is possible for subjects to report a lack of awareness even when areas such as the [[primary visual cortex|primary visual cortex (V1)]] show clear electrical responses to a stimulus.<ref>Koch, ''The Quest for Consciousness'', pp. 105–116</ref> Higher brain areas are seen as more promising, especially the [[prefrontal cortex]], which is involved in a range of higher cognitive functions collectively known as [[executive functions]].<ref>{{Cite journal|last1=Baldauf|first1=D.|last2=Desimone|first2=R.|date=2014-04-25|title=Neural Mechanisms of Object-Based Attention|journal=Science|language=en|volume=344|issue=6182|pages=424–427|doi=10.1126/science.1247003|pmid=24763592|bibcode=2014Sci...344..424B|s2cid=34728448|issn=0036-8075|doi-access=free}}</ref> There is substantial evidence that a "top-down" flow of neural activity (i.e., activity propagating from the frontal cortex to sensory areas) is more predictive of conscious awareness than a "bottom-up" flow of activity.<ref>{{cite journal|title=A framework for consciousness|author=Francis Crick|author2=Christof Koch|year=2003|journal=Nature Neuroscience|volume=6|pages=119–126|pmid=12555104|url=http://papers.klab.caltech.edu/29/1/438.pdf|doi=10.1038/nn0203-119|issue=2|s2cid=13960489|archive-url=https://web.archive.org/web/20120522054447/http://papers.klab.caltech.edu/29/1/438.pdf|archive-date=2012-05-22|author2-link=Christof Koch|author-link=Francis Crick}}</ref> The prefrontal cortex is not the only candidate area, however: studies by [[Nikos Logothetis]] and his colleagues have shown, for example, that visually responsive neurons in parts of the [[temporal lobe]] reflect what is visually perceived when conflicting visual images are presented to different eyes (i.e., bistable percepts during binocular rivalry).<ref>Koch, ''The Quest for Consciousness'', pp. 269–286</ref> Furthermore, top-down feedback from higher to lower visual brain areas may be weaker or absent in the peripheral visual field, as suggested by some experimental data and theoretical arguments;<ref name="Zhaoping-2019">{{Cite journal|last=Zhaoping|first=Li|date=2019-10-01|title=A new framework for understanding vision from the perspective of the primary visual cortex|url=https://psyarxiv.com/ds34j/download|journal=Current Opinion in Neurobiology|series=Computational Neuroscience|volume=58|pages=1–10|doi=10.1016/j.conb.2019.06.001|pmid=31271931|s2cid=195806018|issn=0959-4388|access-date=2022-03-02|url-access=subscription}}</ref> nevertheless, humans can perceive visual inputs in the peripheral visual field arising from bottom-up V1 neural activities.<ref name="Zhaoping-2019" /><ref name="Zhaoping-2020">{{Cite journal|last=Zhaoping|first=Li|date=2020-07-30|title=The Flip Tilt Illusion: Visible in Peripheral Vision as Predicted by the Central-Peripheral Dichotomy|journal=i-Perception|volume=11|issue=4|article-number=2041669520938408|doi=10.1177/2041669520938408|issn=2041-6695|pmc=7401056|pmid=32782769}}</ref> Meanwhile, bottom-up V1 activities for the central visual fields can be vetoed, and thus made invisible to perception, by the top-down feedback, when these bottom-up signals are inconsistent with the brain's internal model of the visual world.<ref name="Zhaoping-2019" /><ref name="Zhaoping-2020" />
Modulation of neural responses may correlate with phenomenal experiences. In contrast to the raw electrical responses that do not correlate with consciousness, the modulation of these responses by other stimuli correlates surprisingly well with an important aspect of consciousness: namely with the phenomenal experience of stimulus intensity (brightness, contrast). In the research group of Danko Nikolić it has been shown that some of the changes in the subjectively perceived brightness correlated with the modulation of firing rates while others correlated with the modulation of neural synchrony.<ref>{{cite journal|author1=Biederlack J.|author2=Castelo-Branco M.|author3=Neuenschwander S.|author4=Wheeler D.W.|author5=Singer W.|author6=Nikolić D.|year=2006|title=Brightness induction: Rate enhancement and neuronal synchronization as complementary codes|journal=Neuron|volume=52|issue=6|pages=1073–1083|doi=10.1016/j.neuron.2006.11.012|pmid=17178409|s2cid=16732916|doi-access=free}}</ref> An fMRI investigation suggested that these findings were strictly limited to the primary visual areas.<ref>{{cite journal|author1=Williams Adrian L.|author2=Singh Krishna D.|author3=Smith Andrew T.|year=2003|title=Surround modulation measured with functional MRI in the human visual cortex|journal=Journal of Neurophysiology|volume=89|issue=1|pages=525–533|doi=10.1152/jn.00048.2002|pmid=12522199|citeseerx=10.1.1.137.1066}}</ref> This indicates that, in the primary visual areas, changes in firing rates and synchrony can be considered as neural correlates of qualia—at least for some types of qualia.
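The distinction between a rate code and a synchrony code can be illustrated with synthetic data. The sketch below is not the analysis from the cited studies; the spike trains and the simple coincidence measure are assumptions chosen for brevity. It compares two pairs of toy spike trains that have roughly the same firing rates but different amounts of spike synchrony.
<syntaxhighlight lang="python">
import numpy as np

# Toy sketch: two ways a pair of neurons can change their joint activity --
# by firing more (rate) or by firing at the same times (synchrony) --
# measured on synthetic spike trains.

rng = np.random.default_rng(0)
dt, duration = 0.001, 10.0                 # 1 ms bins, 10 s of "recording"
n_bins = int(duration / dt)

def poisson_train(rate_hz):
    """Independent Poisson spike train as a 0/1 array of 1 ms bins."""
    return (rng.random(n_bins) < rate_hz * dt).astype(int)

def synchrony_index(a, b):
    """Excess coincidences per second relative to chance for independent trains."""
    observed = np.sum(a * b) / duration
    expected = (a.sum() / duration) * (b.sum() / duration) * dt
    return observed - expected

# Condition 1: independent neurons at about 20 Hz (no synchrony beyond chance).
a, b = poisson_train(20), poisson_train(20)
# Condition 2: similar rates, but half of the spikes come from a shared source,
# so the two trains are partially synchronized.
shared = poisson_train(10)
c = np.clip(poisson_train(10) + shared, 0, 1)
d = np.clip(poisson_train(10) + shared, 0, 1)

for name, (x, y) in {"independent": (a, b), "partly shared": (c, d)}.items():
    rate = (x.sum() + y.sum()) / (2 * duration)
    print(f"{name:>13}: mean rate {rate:.1f} Hz, "
          f"excess coincidences {synchrony_index(x, y):.2f} per s")
</syntaxhighlight>
Both conditions print similar mean rates, but only the "partly shared" pair shows coincidences well above chance, i.e. the same rate with more synchrony.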
A study in 2016 looked at lesions in specific areas of the brainstem that were associated with [[coma]] and vegetative states. A small region of the rostral dorsolateral [[pontine tegmentum]] in the brainstem was suggested to drive consciousness through functional connectivity with two cortical regions, the left ventral [[anterior insular cortex]], and the pregenual [[anterior cingulate cortex]]. These three regions may work together as a triad to maintain consciousness.<ref>{{Cite journal|last1=Fischer|first1=David B.|last2=Boes|first2=Aaron D.|last3=Demertzi|first3=Athena|last4=Evrard|first4=Henry C.|last5=Laureys|first5=Steven|last6=Edlow|first6=Brian L.|last7=Liu|first7=Hesheng|last8=Saper|first8=Clifford B.|last9=Pascual-Leone|first9=Alvaro|last10=Fox|first10=Michael D.|last11=Geerling|first11=Joel C.|date=2016-12-06|title=A human brain network derived from coma-causing brainstem lesions|journal=Neurology|language=en|volume=87|issue=23|pages=2427–2434|doi=10.1212/WNL.0000000000003404|issn=0028-3878|pmid=27815400|pmc=5177681}}</ref>
[[Krista and Tatiana Hogan]] have a unique [[thalami]]c connection that may provide insight into the philosophical and neurological foundations of consciousness. It has been argued that there is no empirical test that can conclusively establish that, for some sensations, the twins share one token experience rather than two exactly matching token experiences. Yet background considerations about the way the brain has specific locations for conscious contents, combined with the evident overlapping pathways in the twins' brains, arguably imply that the twins share some conscious experiences. If this is true, then the twins may offer a proof of concept for how experiences in general could be shared between brains.<ref name="Cochrane">{{Cite journal |last=Cochrane |first=Tom |date=2021 |title=A case of shared consciousness |url=https://philpapers.org/rec/COCACO-6 |journal=Synthese |language=en |volume=199 |issue=1–2 |pages=1019–1037 |doi=10.1007/s11229-020-02753-6 |s2cid=255063719 |issn=0039-7857}}</ref><ref>{{cite journal |last1=Kang |first1=Shao-Pu |title=Shared consciousness and asymmetry |journal=Synthese |date=2022 |volume=200 |issue=413 |article-number=413 |doi=10.1007/s11229-022-03890-w |url=https://philarchive.org/rec/KANSCA-5}}</ref><ref>{{cite journal |last1=Roelofs |first1=Luke |last2=Sebo |first2=Jeff |title=Overlapping minds and the hedonic calculus |journal=Philosophical Studies |date=2024 |volume=181 |issue=6–7 |pages=1487–1506 |doi=10.1007/s11098-024-02167-x |doi-access=free}}</ref>
=== Academic definitions of consciousness === | |||
{{See also|Artificial_consciousness#Aspects_of_consciousness}} | |||
Clear definitions of consciousness in academic literature are rare; [[David Chalmers]] has called this task the [[hard problem of consciousness]]. However, academic definitions do exist, including those of Giulio Tononi's [[integrated information theory]], Craig McKenzie, and Axel Cleeremans and Luis Jiménez, the latter being a definition of learning that bears a remarkable similarity to both Tononi's and McKenzie's definitions. Both [[Bernard Baars]] and [[Igor Aleksander]] have worked out the [[Artificial_consciousness#Aspects_of_consciousness|aspects necessary for consciousness]].
Tononi's definition is as follows:<ref>{{Cite journal|last=Tononi|first=Giulio|date=2004-11-02|title=An information integration theory of consciousness|journal=BMC Neuroscience|volume=5|issue=1|article-number=42|doi=10.1186/1471-2202-5-42|issn=1471-2202|pmc=543470|pmid=15522121 |doi-access=free}}</ref> | |||
<blockquote> | |||
According to [[Integrated information theory]] (IIT), consciousness requires a grouping of elements within a system that have physical cause-effect power upon one another. This in turn implies that only [[Reentry (neural circuitry)|reentrant architecture]] consisting of feedback loops, whether neural or computational, will realize consciousness.
</blockquote> | |||
McKenzie's definition begins:<ref>{{Cite arXiv |last=McKenzie|first=Craig|date=2024-06-01|title=Consciousness defined: requirements for biological and artificial general intelligence|class=q-bio.NC |eprint=2406.01648}}</ref>
<blockquote> | |||
Consciousness is the capacity to generate desires and decisions about perceived or imagined realities by distinguishing self from non-self through the use of perception, memory and imagination. | |||
... | |||
</blockquote> | |||
According to Axel Cleeremans and Luis Jiménez, learning is defined as:<ref>{{Cite book|last1=Cleeremans|first1=Axel|last2=Jiménez|first2=Luis|date=2002|title=Implicit Learning and Consciousness: A Graded, Dynamic Perspective|publisher=Psychology Press|url=https://philpapers.org/rec/CLEILA}}</ref><blockquote>
a set of [[phylogenetic tree|phylogenetically]] advanced adaptation processes that critically depend on an evolved sensitivity to subjective experience so as to enable agents to afford flexible control over their actions in complex, unpredictable environments. | |||
</blockquote> | |||
This definition is notable for its similarity to the theatre analogy of [[global workspace theory]] (GWT).
=== Models === | |||
{{Main|Models of consciousness}} | |||
A wide range of empirical theories of consciousness have been proposed.<ref name="northoff-lamme-2020">{{cite journal|last1=Northoff|first1=Georg|last2=Lamme|first2=Victor|title=Neural signs and mechanisms of consciousness: Is there a potential convergence of theories of consciousness in sight?|journal=Neuroscience and Biobehavioral Reviews|date=2020|volume=118|pages=568–587|doi=10.1016/j.neubiorev.2020.07.019|pmid=32783969|s2cid=221084519}}</ref><ref name="seth-bayne-2022">{{cite journal|last1=Seth|first1=Anil K.|last2=Bayne|first2=Tim|title=Theories of consciousness|journal=Nature Reviews Neuroscience|date=2022|volume=23|issue=7|pages=439–452|doi=10.1038/s41583-022-00587-4|pmid=35505255|s2cid=242810797|url=http://sro.sussex.ac.uk/id/eprint/105030/1/SethBayne_NRN_accepted.pdf|access-date=2023-01-17|archive-date=2023-01-21|archive-url=https://web.archive.org/web/20230121221104/http://sro.sussex.ac.uk/id/eprint/105030/1/SethBayne_NRN_accepted.pdf|url-status=live}}</ref><ref name="doerig-et-al-2021">{{cite journal|last1=Doerig|first1=Adrian|last2=Schurger|first2=Aaron|last3=Herzog|first3=Michael H.|title=Hard criteria for empirical theories of consciousness|journal=Cognitive Neuroscience|date=2021|volume=12|issue=2|pages=41–62|doi=10.1080/17588928.2020.1772214|pmid=32663056|s2cid=220529998|doi-access=free|hdl=2066/228876|hdl-access=free}}</ref> Adrian Doerig and colleagues list 13 notable theories,<ref name="doerig-et-al-2021"/> while [[Anil Seth]] and Tim Bayne list 22.<ref name="seth-bayne-2022"/>
==== Global workspace theory ====
[[Global workspace theory]] (GWT) is a [[cognitive architecture]] and theory of consciousness proposed by the cognitive psychologist [[Bernard Baars]] in 1988. Baars explains the theory with the metaphor of a theater, with conscious processes represented by an illuminated stage. This theater integrates inputs from a variety of unconscious and otherwise autonomous networks in the brain and then broadcasts them to unconscious networks (represented in the metaphor by a broad, unlit "audience"). The theory has since been expanded upon by other scientists including cognitive neuroscientist [[Stanislas Dehaene]] and [[Lionel Naccache]].<ref name="baars-2005">{{cite book |last1=Baars |first1=Bernard J. |title=The Boundaries of Consciousness: Neurobiology and Neuropathology |year=2005 |isbn=978-0-444-51851-4 |series=Progress in Brain Research |volume=150 |pages=45–53 |chapter=Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience |citeseerx=10.1.1.456.2829 |doi=10.1016/S0079-6123(05)50004-9 |pmid=16186014}}</ref><ref name="dehaene-naccache">{{cite journal|last1=Dehaene|first1=Stanislas|last2=Naccache|first2=Lionel|title=Towards a cognitive neuroscience of consciousness: basic evidence and a workspace framework|journal=Cognition|date=2001|volume=79|issue=1–2|pages=1–37|url=http://zoo.cs.yale.edu/classes/cs671/12f/12f-papers/dehaene-consciousness.pdf|access-date=5 April 2019|doi=10.1016/S0010-0277(00)00123-2|pmid=11164022|s2cid=1762431|archive-date=13 July 2019|archive-url=https://web.archive.org/web/20190713125127/http://zoo.cs.yale.edu/classes/cs671/12f/12f-papers/dehaene-consciousness.pdf|url-status=live}}</ref>
See also the [[Dehaene–Changeux model]]. | |||
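The theater metaphor can be read as a simple architecture: many specialist processes run in parallel, one of them wins a competition for the workspace, and its content is then broadcast to all the others. The sketch below is a minimal illustration of that broadcast loop, not Baars's or Dehaene's actual model; the specialist names and the random bidding rule are assumptions made for brevity.
<syntaxhighlight lang="python">
import random

# Minimal sketch of the global-workspace "theater" idea: specialists compete
# for a single workspace, and the winner's content is broadcast to the rest.

class Specialist:
    def __init__(self, name):
        self.name = name
        self.inbox = []                       # broadcasts received so far

    def bid(self):
        # A real model would derive salience from the input; here it is random.
        return random.random()

    def receive(self, message):
        self.inbox.append(message)

specialists = [Specialist(n) for n in ("vision", "hearing", "memory", "planning")]

for step in range(3):
    bids = {s: s.bid() for s in specialists}
    winner = max(bids, key=bids.get)          # competition for the spotlight
    message = f"step {step}: content from {winner.name}"
    for s in specialists:                     # global broadcast to the "audience"
        if s is not winner:
            s.receive(message)
    print(message, "-> broadcast to", [s.name for s in specialists if s is not winner])
</syntaxhighlight>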
==== Integrated information theory ====
[[Integrated information theory]] (IIT), pioneered by neuroscientist [[Giulio Tononi]] in 2004, postulates that consciousness resides in the information being processed and arises once the information reaches a certain level of complexity. IIT proposes a 1:1 mapping between conscious states and precise, formal mathematical descriptions of those mental states. Proponents of this model suggest that it may provide a physical grounding for consciousness in neurons, as they provide the mechanism by which information is integrated. This also relates to the "[[hard problem of consciousness]]" proposed by [[David Chalmers]].<ref name="nature.com">{{Cite journal |last1=Tononi |first1=Giulio |last2=Boly |first2=Melanie |last3=Massimini |first3=Marcello |last4=Koch |first4=Christof |date=July 2016 |title=Integrated information theory: from consciousness to its physical substrate |url=https://www.nature.com/articles/nrn.2016.44 |url-status=live |journal=Nature Reviews Neuroscience |language=en |volume=17 |issue=7 |pages=450–461 |doi=10.1038/nrn.2016.44 |issn=1471-0048 |pmid=27225071 |s2cid=21347087 |url-access=subscription |archive-url=https://web.archive.org/web/20230504082713/https://www.nature.com/articles/nrn.2016.44 |archive-date=2023-05-04 |access-date=2023-05-21}}</ref><ref name=":0" /> In 2023, 124 scholars signed a letter saying that IIT gets disproportionate media attention relative to its supporting empirical evidence, and called it "pseudoscience", arguing that its core assumptions are not adequately testable. This led to academic debate, as some other researchers objected to the "pseudoscience" characterization.<ref>{{Cite journal |last=Lenharo |first=Mariana |date=2023-09-20 |title=Consciousness theory slammed as 'pseudoscience' — sparking uproar |url=https://www.nature.com/articles/d41586-023-02971-1 |journal=Nature |language=en |doi=10.1038/d41586-023-02971-1|pmid=37730789|url-access=subscription}}</ref>
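IIT's actual quantity, Φ ("phi"), is defined over the cause-effect structure of a system and the partition that least disrupts it, and computing it exactly is tractable only for very small systems. The sketch below does not compute Φ. It only illustrates the weaker intuition that an "integrated" system carries information jointly that its parts do not carry separately, using ordinary mutual information between the two halves of a toy two-element system; the example distributions are assumptions chosen for clarity.
<syntaxhighlight lang="python">
import numpy as np

# Toy illustration only: plain mutual information between two subsystems as a
# crude stand-in for "how much the whole carries beyond its parts". This is
# not IIT's Phi.

def mutual_information(joint):
    """Mutual information (in bits) of a 2-D joint probability table."""
    joint = np.asarray(joint, dtype=float)
    pa = joint.sum(axis=1, keepdims=True)     # marginal of subsystem A
    pb = joint.sum(axis=0, keepdims=True)     # marginal of subsystem B
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pa @ pb)[nz])))

# Two binary subsystems that always agree: strongly "integrated".
coupled = [[0.5, 0.0],
           [0.0, 0.5]]
# Two binary subsystems with the same marginals but no interaction.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

print("coupled     :", mutual_information(coupled), "bits")      # 1.0
print("independent :", mutual_information(independent), "bits")  # 0.0
</syntaxhighlight>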
==== Orchestrated objective reduction ====
[[Orchestrated objective reduction]] (Orch-OR), or the quantum theory of mind, was proposed by scientists [[Roger Penrose]] and [[Stuart Hameroff]], and states that consciousness originates at the quantum level inside neurons. The mechanism is held to be a quantum process called objective reduction that is orchestrated by cellular structures called [[microtubule]]s, which form the cytoskeleton around which the brain is built. The duo proposed that these quantum processes accounted for creativity, innovation, and problem-solving abilities. Penrose published his views in the book ''[[The Emperor's New Mind]]''. In 2014, the discovery of quantum vibrations inside microtubules gave new life to the argument.<ref name=":0" />
However, scientists and philosophers have criticized Penrose's interpretation of [[Gödel's incompleteness theorems|Gödel's theorem]] and his conclusion that quantum phenomena play a role in human cognition.<ref>{{Cite web |title=Lucas-Penrose Argument about Gödel's Theorem |url=https://iep.utm.edu/lp-argue/ |access-date=2025-07-27 |website=Internet Encyclopedia of Philosophy |language=en-US}}</ref> | |||
==== Attention schema theory ====
In 2011, [[Michael Graziano]] and Kastner<ref name="Graziano&Kastner2011">{{cite journal|author1=Graziano, M.S.A.|author2=Kastner, S|year=2011|title=Human consciousness and its relationship to social neuroscience: A novel hypothesis|journal=Cog. Neurosci|volume=2|issue=2|pages=98–113|doi=10.1080/17588928.2011.565121|pmid=22121395|pmc=3223025}}</ref> proposed the [[attention schema theory|"attention schema" theory of awareness]]. Graziano went on to publish an expanded discussion of this theory in his book ''Consciousness and the Social Brain''.<ref>{{Cite book|last=Graziano|first=Michael|year=2013|title=Consciousness and the Social Brain|publisher=Oxford University Press|isbn=978-0-19-992864-4}}</ref> In that theory, specific cortical areas, notably in the superior temporal sulcus and the temporo-parietal junction, are used to build the construct of awareness and attribute it to other people. The same cortical machinery is also used to attribute awareness to oneself. Damage to these cortical regions can lead to deficits in consciousness such as [[hemispatial neglect]]. In the [[attention]] schema theory, the value of explaining the feature of awareness and attributing it to a person is to gain a useful predictive model of that person's attentional processing. [[Attention]] is a style of [[Information processing (psychology)|information processing]] in which a brain focuses its resources on a limited set of interrelated signals. Awareness, in this theory, is a useful, simplified schema that represents attentional states. To be aware of X is explained by constructing a model of one's attentional focus on X.
==== Entropic brain theory ====
The entropic brain is a theory of conscious states informed by neuroimaging research with [[psychedelic drugs]]. The theory suggests that the brain in primary states such as [[Rapid eye movement sleep|rapid eye movement]] (REM) sleep, early [[psychosis]] and under the influence of psychedelic drugs, is in a disordered state; normal waking consciousness constrains some of this freedom and makes possible [[metacognitive]] functions such as internal self-administered [[reality testing]] and [[self-awareness]].<ref>{{cite journal|last1=Carhart-Harris|first1=R. L.|author1-link=Robin Carhart-Harris|last2=Friston|first2=K. J.|last3=Barker|first3=Eric L.|title=REBUS and the Anarchic Brain: Toward a Unified Model of the Brain Action of Psychedelics|journal=Pharmacological Reviews|date=20 June 2019|volume=71|issue=3|pages=316–344|doi=10.1124/pr.118.017160|pmid=31221820|pmc=6588209}}</ref><ref>{{cite journal|last1=Carhart-Harris|first1=Robin L.|title=The entropic brain – revisited|journal=Neuropharmacology|date=November 2018|volume=142|pages=167–178|doi=10.1016/j.neuropharm.2018.03.010|pmid=29548884|s2cid=4483591}}</ref><ref>{{cite journal|last1=Carhart-Harris|first1=Robin L.|last2=Leech|first2=Robert|last3=Hellyer|first3=Peter J.|last4=Shanahan|first4=Murray|last5=Feilding|first5=Amanda|last6=Tagliazucchi|first6=Enzo|last7=Chialvo|first7=Dante R.|last8=Nutt|first8=David|title=The entropic brain: a theory of conscious states informed by neuroimaging research with psychedelic drugs|journal=Frontiers in Human Neuroscience|date=2014|volume=8|page=20|doi=10.3389/fnhum.2014.00020|pmid=24550805|pmc=3909994|doi-access=free}}</ref><ref>{{Cite web|url=https://mind-foundation.org/entropy-as-more-than-chaos/|title=Entropy as More than Chaos in the Brain: Expanding Field, Expanding Minds|date=2018-06-22|access-date=2019-07-05|archive-date=2019-07-05|archive-url=https://web.archive.org/web/20190705111205/https://mind-foundation.org/entropy-as-more-than-chaos/|url-status=live}}</ref> Criticism has included questioning whether the theory has been adequately tested.<ref>{{cite journal|last1=Papo|first1=David|title=Commentary: The entropic brain: a theory of conscious states informed by neuroimaging research with psychedelic drugs|journal=Frontiers in Human Neuroscience|date=30 August 2016|volume=10|page=423|doi=10.3389/fnhum.2016.00423|pmid=27624312|pmc=5004455|doi-access=free}}</ref>
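The "entropy" in this account can be read, at its simplest, as the Shannon entropy of the distribution of brain states that are visited: a flat, diverse repertoire has higher entropy than one dominated by a few states. The sketch below illustrates only that calculation; the two eight-state distributions are made-up numbers, not data from the cited studies.
<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch with hypothetical numbers: Shannon entropy of how brain
# activity is distributed over a repertoire of discrete "states".

def shannon_entropy(p):
    """Shannon entropy in bits of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Hypothetical occupancy of eight connectivity states in two conditions.
ordinary_waking = [0.55, 0.20, 0.10, 0.05, 0.04, 0.03, 0.02, 0.01]
primary_state   = [0.16, 0.15, 0.14, 0.13, 0.12, 0.11, 0.10, 0.09]

print("ordinary waking:", round(shannon_entropy(ordinary_waking), 2), "bits")
print("primary state  :", round(shannon_entropy(primary_state), 2), "bits")
</syntaxhighlight>
The flatter "primary state" distribution comes out close to the three-bit maximum for eight states, while the more constrained waking distribution has markedly lower entropy.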
==== Projective consciousness model ====
In 2004, molecular biologist [[Francis Crick]] (co-discoverer of the double helix) proposed that binding together an individual's experience requires something like the conductor of an orchestra. Together with neuroscientist [[Christof Koch]], he proposed that this conductor would have to collate information rapidly from various regions of the brain, and the two reckoned that the [[claustrum]] was well suited for the task. However, Crick died while working on the idea.<ref name=":0" />
The proposal is supported by a 2014 study in which a team at the [[George Washington University]] induced unconsciousness in a 54-year-old woman suffering from [[Epilepsy|intractable epilepsy]] by stimulating her claustrum. The woman underwent depth electrode implantation and electrical stimulation mapping, and the electrode between the left claustrum and anterior-dorsal insula was the one that induced unconsciousness. Correlations for interactions affecting medial parietal and posterior frontal channels also increased significantly during stimulation. Their findings suggested that the left claustrum or anterior insula is an important part of a network that subserves consciousness, and that disruption of consciousness is related to increased [[Electroencephalography|EEG]] signal synchrony within frontal-parietal networks. However, this remains an isolated and therefore inconclusive study.<ref name=":0" /><ref>{{Cite journal |last1=Koubeissi |first1=Mohamad Z. |last2=Bartolomei |first2=Fabrice |last3=Beltagy |first3=Abdelrahman |last4=Picard |first4=Fabienne |date=2014 |title=Electrical stimulation of a small brain area reversibly disrupts consciousness |url=https://linkinghub.elsevier.com/retrieve/pii/S1525505014002017 |journal=[[Epilepsy & Behavior]] |language=en |volume=37 |pages=32–35 |doi=10.1016/j.yebeh.2014.05.027|pmid=24967698|url-access=subscription}}</ref>
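One standard way to quantify the "EEG signal synchrony" referred to above is the phase-locking value (PLV) between two channels: the instantaneous phase of each signal is extracted (here with a Hilbert transform) and the PLV measures how constant the phase difference stays. The sketch below applies the measure to synthetic signals; it is not the analysis pipeline of the cited study, and the frequencies and noise level are assumptions.
<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import hilbert

# Phase-locking value (PLV) between two synthetic "EEG" traces: an
# illustration of the synchrony measure itself, not of the cited study.

fs = 256                                   # sampling rate in Hz
t = np.arange(0, 4, 1 / fs)                # 4 s of data
rng = np.random.default_rng(1)

frontal         = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
parietal_locked = np.sin(2 * np.pi * 10 * t + 0.8) + 0.5 * rng.standard_normal(t.size)
parietal_drift  = np.sin(2 * np.pi * 11.3 * t) + 0.5 * rng.standard_normal(t.size)

def plv(x, y):
    """PLV: 1 means a constant phase difference, 0 means no phase locking."""
    phase_x = np.angle(hilbert(x))          # instantaneous phase of each signal
    phase_y = np.angle(hilbert(y))
    return float(np.abs(np.mean(np.exp(1j * (phase_x - phase_y)))))

print("frontal vs. phase-locked parietal:", round(plv(frontal, parietal_locked), 2))
print("frontal vs. drifting parietal    :", round(plv(frontal, parietal_drift), 2))
</syntaxhighlight>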
A study published in 2022 opposed the idea that the [[claustrum]] is the seat of consciousness, instead concluding that it acts more like a "router" transferring commands and information across the brain.<ref>{{Cite web |date=2022-11-14 |title=A Brain Area Thought to Impart Consciousness Instead Behaves Like an Internet Router |url=https://neurosciencenews.com/claustrum-cognition-21834/ |access-date=2025-06-22 |website=Neuroscience News |language=en-US}}</ref><ref>Madden, M. B., Stewart, B. W., White, M. G., Krimmel, S. R., Qadir, H., Barrett, F. S., ... & Mathur, B. N. (2022). A role for the claustrum in cognitive control. Trends in Cognitive Sciences, 26(12), 1133–1152. doi: 10.1016/j.tics.2022.09.006</ref> The study showed that complex tasks could not be performed when the claustrum was disabled.
=== Biological function and evolution ===
The emergence of consciousness during [[biological evolution]] remains a topic of ongoing scientific inquiry. While consciousness appears to play a crucial role in human cognition, decision-making, and self-awareness, its survival value and adaptive significance across different species remain subjects of debate.
Some people question whether consciousness has any survival value. Some argue that consciousness is a [[Spandrel (biology)|by-product of evolution]]. [[Thomas Henry Huxley]], for example, defends in an essay titled "On the Hypothesis that Animals are [[Automata]], and its History" an [[epiphenomenalist]] theory of consciousness, according to which consciousness is a causally inert effect of neural activity—"as the steam-whistle which accompanies the work of a locomotive engine is without influence upon its machinery".<ref>{{cite journal|author=T.H. Huxley|title=On the hypothesis that animals are automata, and its history|journal=The Fortnightly Review|volume=16|issue=253|pages=555–580|year=1874|author-link=T.H. Huxley|bibcode=1874Natur..10..362.|doi=10.1038/010362a0|doi-access=free}}</ref> To this [[William James]] objects in his essay ''Are We Automata?'' by stating an evolutionary argument for mind-brain interaction implying that if the preservation and development of consciousness in biological evolution is a result of [[natural selection]], it is plausible that consciousness has not only been influenced by neural processes but has had a survival value itself; and it could only have had this if it had been efficacious.<ref>{{cite journal|author=W. James|title=Are we automata?|journal=Mind|volume=4|issue=13|pages=1–22|year=1879|doi=10.1093/mind/os-4.13.1|author-link=William James|url=https://zenodo.org/record/1431809|access-date=2019-07-05|archive-date=2019-12-24|archive-url=https://web.archive.org/web/20191224150924/https://zenodo.org/record/1431809|url-status=live}}</ref><ref>{{cite journal|author=B.I.B. Lindahl|title=Consciousness and biological evolution|journal=Journal of Theoretical Biology|volume=187|issue=4|pages=613–629|year=1997|doi=10.1006/jtbi.1996.0394|pmid=9299304|bibcode=1997JThBi.187..613L}}</ref> [[Karl Popper]] develops a similar evolutionary argument in the book ''The Self and Its Brain''.<ref name=Popper1977>{{cite book|title=The Self and Its Brain|author=[[Karl Popper|Karl R. Popper]], [[John Eccles (neurophysiologist)|John C. Eccles]]|publisher=Springer International|year=1977|isbn=978-0-387-08307-0|url-access=registration|url=https://archive.org/details/selfitsbrain0000popp}}</ref>
Opinions are divided on when and how consciousness first arose. It has been argued that consciousness emerged (i) exclusively with the first humans, (ii) exclusively with the first mammals, (iii) independently in mammals and birds, or (iv) with the first reptiles.<ref>{{cite book|author=Peter Århem|author2=B.I.B. Lindahl|author3=Paul R. Manger|author4=Ann B. Butler|year=2008|editor=Hans Liljenström|editor2=Peter Århem|chapter=On the origin of consciousness—some amniote scenarios|title=Consciousness Transitions: Phylogenetic, Ontogenetic, and Physiological Aspects|publisher=Elsevier|chapter-url=https://books.google.com/books?id=OQGJz1DVQNMC&pg=PA77|isbn=978-0-444-52977-0}}</ref> Other authors date the origins of consciousness to the first animals with nervous systems or early vertebrates in the Cambrian over 500 million years ago.<ref name="FeinbergMallat">{{cite journal|last1=Feinberg|first1=TE|last2=Mallatt|first2=J|date=October 2013|title=The evolutionary and genetic origins of consciousness in the Cambrian Period over 500 million years ago.|journal=Frontiers in Psychology|doi=10.3389/fpsyg.2013.00667|pmid=24109460|volume=4|page=667|pmc=3790330|doi-access=free}}</ref> [[Donald Griffin]] suggests in his book ''Animal Minds'' a gradual evolution of consciousness.<ref name="Griffin2001"/> Further exploration of the origins of consciousness, particularly in molluscs, has been done by Peter Godfrey-Smith in his book ''Metazoa''.<ref>{{Cite book |title=Metazoa |last=Godfrey-Smith |first=Peter |year=2021 |isbn=978-0-00-832123-9}}</ref>
Regarding the primary function of conscious processing, a recurring idea in recent theories is that phenomenal states somehow integrate neural activities and information-processing that would otherwise be independent.<ref>{{cite journal|author=Bernard Baars|title=The conscious access hypothesis: Origins and recent evidence|journal=Trends in Cognitive Sciences|volume=6|pages=47–52|pmid=11849615|doi=10.1016/S1364-6613(00)01819-2|issue=1|date=January 2002|s2cid=6386902|author-link=Bernard Baars}}</ref> This has been called the ''integration consensus''. Another example is Gerald Edelman's dynamic core hypothesis, which puts emphasis on [[reentry (neural circuitry)|reentrant]] connections that reciprocally link areas of the brain in a massively parallel manner.<ref>{{cite journal|last=Seth|first=Anil|author2=Eugene Izhikevich|author3=George Reeke|author4=Gerald Edelman|title=Theories and measures of consciousness: An extended framework|journal=Proceedings of the National Academy of Sciences|year=2006|volume=103|issue=28|doi=10.1073/pnas.0604347103|pages=10799–10804|pmid=16818879|pmc=1487169|bibcode=2006PNAS..10310799S|doi-access=free}}</ref> Edelman also stresses the importance of the evolutionary emergence of higher-order consciousness in humans from the historically older trait of primary consciousness which humans share with non-human animals (see ''[[#Neural correlates|Neural correlates]]'' section above). These theories of integrative function present solutions to two classic problems associated with consciousness: differentiation and unity. They show how our conscious experience can discriminate between a virtually unlimited number of different possible scenes and details (differentiation) because it integrates those details from our sensory systems, while the integrative nature of consciousness in this view easily explains how our experience can seem unified as one whole despite all of these individual parts. However, it remains unspecified which kinds of information are integrated in a conscious manner and which kinds can be integrated without consciousness. Nor is it explained what specific causal role conscious integration plays, nor why the same functionality cannot be achieved without consciousness. Not all kinds of information are capable of being disseminated consciously (e.g., neural activity related to vegetative functions, reflexes, unconscious motor programs, low-level perceptual analyses, etc.), and many kinds of information can be disseminated and combined with other kinds without consciousness, as in intersensory interactions such as the [[ventriloquism effect]].<ref name="ReferenceA">{{cite journal|author=Ezequiel Morsella|year=2005|title=The function of phenomenal states: Supramodular Interaction Theory|journal=Psychological Review|volume=112|pages=1000–1021|pmid=16262477|issue=4|doi=10.1037/0033-295X.112.4.1000|s2cid=2298524|url=http://pdfs.semanticscholar.org/fdd7/81a15d0405a888abe4584a99ed9cbc6fb3ff.pdf|archive-url=https://web.archive.org/web/20201118022838/http://pdfs.semanticscholar.org/fdd7/81a15d0405a888abe4584a99ed9cbc6fb3ff.pdf|archive-date=2020-11-18}}</ref> Hence it remains unclear why any of it is conscious. For a review of the differences between conscious and unconscious integrations, see the article by Ezequiel Morsella.<ref name="ReferenceA"/>
As noted earlier, even among writers who consider consciousness to be well-defined, there is [[Animal consciousness|widespread dispute]] about which animals other than humans can be said to possess it.<ref name="ingvww">{{cite book|author=S. Budiansky|title=If a Lion Could Talk: Animal Intelligence and the Evolution of Consciousness|year=1998|publisher=The Free Press|isbn=978-0-684-83710-9|url=https://archive.org/details/iflioncouldtalka00budi}}</ref> Edelman has described this distinction as that of humans possessing higher-order consciousness while sharing the trait of primary consciousness with non-human animals (see previous paragraph). Thus, any examination of the evolution of consciousness is faced with great difficulties. Nevertheless, some writers have argued that consciousness can be viewed from the standpoint of [[evolutionary biology]] as an [[adaptation]] in the sense of a [[Phenotypic trait|trait]] that increases [[Fitness (biology)|fitness]].<ref>{{cite journal|author=S. Nichols|author2=T. Grantham|title=Adaptive Complexity and Phenomenal Consciousness|year=2000|journal=Philosophy of Science|volume=67|issue=4|pages=648–670|doi=10.1086/392859|url=http://dingo.sbs.arizona.edu/~snichols/Papers/evolcons(final).pdf|citeseerx=10.1.1.515.9722|s2cid=16484193|access-date=2017-10-25|archive-url=https://web.archive.org/web/20170813055023/http://dingo.sbs.arizona.edu/~snichols/Papers/evolcons(final).pdf|archive-date=2017-08-13}}</ref> In his article "Evolution of consciousness", John Eccles argued that special anatomical and physical properties of the mammalian [[cerebral cortex]] gave rise to consciousness ("[a] psychon ... linked to [a] dendron through quantum physics").<ref>{{cite journal|author=John Eccles|title=Evolution of consciousness|journal=Proc. Natl. Acad. Sci. USA|volume=89|issue=16|pages=7320–7324|year=1992|pmid=1502142|pmc=49701|doi=10.1073/pnas.89.16.7320|bibcode=1992PNAS...89.7320E|author-link=John Eccles (neurophysiologist)|doi-access=free}}</ref> Bernard Baars proposed that once in place, this "recursive" circuitry may have provided a basis for the subsequent development of many of the functions that consciousness facilitates in higher organisms.<ref name=Baars>{{cite book|author=Bernard Baars|title=A Cognitive Theory of Consciousness|year=1993|publisher=Cambridge University Press|isbn=978-0-521-42743-2|author-link=Bernard Baars}}</ref> [[Peter Carruthers (philosopher)|Peter Carruthers]] has put forth one such potential adaptive advantage gained by conscious creatures by suggesting that consciousness allows an individual to make distinctions between appearance and reality.<ref>{{cite book|last=Carruthers|first=Peter|title=Phenomenal Consciousness: A Naturalistic Theory|year=2004|publisher=Cambridge University Press|location=Cambridge}}</ref> This ability would enable a creature to recognize the likelihood that their perceptions are deceiving them (e.g. that water in the distance may be a mirage) and behave accordingly, and it could also facilitate the manipulation of others by recognizing how things appear to them for both cooperative and devious ends.
Other philosophers, however, have suggested that consciousness would not be necessary for any functional advantage in evolutionary processes.<ref>{{cite journal|author=Owen Flanagan|author2=T.W. Polger|year=1995|title=Zombies and the function of consciousness|journal=Journal of Consciousness Studies|volume=2|pages=313–321|author-link=Owen Flanagan}}</ref><ref>{{cite journal|last=Rosenthal|first=David|title=Consciousness and its function|journal=Neuropsychologia|year=2008|volume=46|issue=3|pages=829–840|doi=10.1016/j.neuropsychologia.2007.11.012|pmid=18164042|s2cid=7791431}}</ref> No one has given a causal explanation, they argue, of why it would not be possible for a functionally equivalent non-conscious organism (i.e., a philosophical zombie) to achieve the very same survival advantages as a conscious organism. If evolutionary processes are blind to the difference between function ''F'' being performed by conscious organism ''O'' and non-conscious organism ''O*'', it is unclear what adaptive advantage consciousness could provide.<ref>{{cite book|author=Stevan Harnad|year=2002|chapter=Turing indistinguishability and the Blind Watchmaker|editor=J.H. Fetzer|title=Consciousness Evolving|publisher=John Benjamins|chapter-url=http://cogprints.org/1615|access-date=2011-10-26|author-link=Stevan Harnad|archive-date=2011-10-28|archive-url=https://web.archive.org/web/20111028162407/http://cogprints.org/1615/|url-status=live}}</ref> As a result, an exaptive explanation of consciousness has gained favor with some theorists who posit that consciousness did not evolve as an adaptation but was an [[exaptation]] arising as a consequence of other developments such as increases in brain size or cortical rearrangement.<ref name="FeinbergMallat"/> Consciousness in this sense has been compared to the blind spot in the retina, where it is not an adaptation of the retina, but instead just a by-product of the way the retinal axons were wired.<ref>{{cite journal|author1=Zack Robinson|author2=Corey J. Maley|author3=Gualtiero Piccinini|year=2015|title=Is Consciousness a Spandrel?|journal=Journal of the American Philosophical Association|volume=1|issue=2|pages=365–383|doi=10.1017/apa.2014.10|s2cid=170892645}}</ref> Several scholars including [[Steven Pinker|Pinker]], [[Noam Chomsky|Chomsky]], [[Gerald Edelman|Edelman]], and [[Salvador Luria|Luria]] have indicated the importance of the emergence of human language as an important regulative mechanism of learning and memory in the context of the development of higher-order consciousness (see ''[[#Neural correlates|Neural correlates]]'' section above).
=== Altered states ===
{{Main|Altered state of consciousness}}
[[File:Abbot of Watkungtaphao in Phu Soidao Waterfall.jpg|thumb|upright|A Buddhist monk [[Meditation|meditating]]]]
There are some brain states in which consciousness seems to be absent, including dreamless sleep or coma. There are also a variety of circumstances that can change the relationship between the mind and the world in less drastic ways, producing what are known as altered states of consciousness. Some altered states occur naturally; others can be produced by drugs or brain damage.<ref name=Vaitl>{{cite journal|last=Vaitl|first=Dieter|s2cid=6909813|title=Psychobiology of altered states of consciousness|year=2005|journal=Psychological Bulletin|volume=131|pages=98–127|doi=10.1037/0033-2909.131.1.98|pmid=15631555|issue=1|url=https://pdfs.semanticscholar.org/09d8/95b85d772fb505144969310255c0cbdc74a7.pdf|archive-url=https://web.archive.org/web/20201022093127/https://pdfs.semanticscholar.org/09d8/95b85d772fb505144969310255c0cbdc74a7.pdf|archive-date=2020-10-22}}</ref> Altered states can be accompanied by changes in thinking, disturbances in the sense of time, feelings of loss of control, changes in emotional expression, alterations in body image and changes in meaning or significance.<ref>{{cite book|last1=Schacter|first1=Daniel|last2=Gilbert|first2=Daniel|last3=Wegner|first3=Daniel|year=2011|title=Psychology 2nd Ed.|url=https://archive.org/details/psychology0000scha/page/190|location=New York|publisher=Worth Publishers|page=[https://archive.org/details/psychology0000scha/page/190 190]|isbn=978-1-4292-3719-2|access-date=27 October 2020}}</ref>
The two most widely accepted altered states are [[sleep]] and [[dream]]ing. Although dream sleep and non-dream sleep appear very similar to an outside observer, each is associated with a distinct pattern of brain activity, metabolic activity, and eye movement; each is also associated with a distinct pattern of experience and cognition. During ordinary non-dream sleep, people who are awakened report only vague and sketchy thoughts, and their experiences do not cohere into a continuous narrative.
During dream sleep, in contrast, people who are awakened report rich and detailed experiences in which events form a continuous progression, which may however be interrupted by bizarre or fantastic intrusions.<ref>{{cite journal|last=Coenen|first=Anton|title=Subconscious Stimulus Recognition and Processing During Sleep|url=http://journalpsyche.org/files/0xbb10.pdf|year=2010|journal=Psyche: An Interdisciplinary Journal of Research on Consciousness|volume=16-2|url-status=live|archive-url=https://web.archive.org/web/20170611115233/http://journalpsyche.org/files/0xbb10.pdf|archive-date=2017-06-11}}</ref>{{Failed verification|date=August 2021|reason=This source talks about responses to auditory stimuli during sleep, but not about dreams or the difference between dream and non-dream sleep.}} Thought processes during the dream state frequently show a high level of irrationality. Both dream and non-dream states are associated with severe disruption of memory: it usually disappears in seconds during the non-dream state, and in minutes after awakening from a dream unless actively refreshed.<ref>{{cite book|last1= Hobson|first1=J. Allan|author-link1=Allan Hobson|last2=Pace-Schott|first2=Edward F.|last3=Stickgold|first3=Robert|author-link3=Robert Stickgold|year=2003|title=Sleep and Dreaming: Scientific Advances and Reconsiderations|editor1-last=Pace-Schott|editor1-first=Edward F.|editor2-last=Solms|editor2-first=Mark|editor3-last=Blagrove|editor3-first=Mark|editor4-last=Harnad|editor4-first=Stevan|publisher=Cambridge University Press|chapter=Dreaming and the brain: Toward a cognitive neuroscience of conscious states|isbn=978-0-521-00869-3|chapter-url=https://www.researchgate.net/publication/2599957|archive-url=https://web.archive.org/web/20210810234114/https://www.researchgate.net/profile/Edward-Pace-Schott/publication/2599957_Dreaming_and_the_Brain_Toward_a_Cognitive_Neuroscience_of_Conscious_States/links/02e7e52f240372e115000000/Dreaming-and-the-Brain-Toward-a-Cognitive-Neuroscience-of-Conscious-States.pdf|archive-date=2021-08-10|url-status=live}}</ref>
There has been some research into physiological changes in yogis and people who practice various techniques of [[meditation]]. Some research with brain waves during meditation has reported differences between those corresponding to ordinary relaxation and those corresponding to meditation. It has been disputed, however, whether there is enough evidence to count these as physiologically distinct states of consciousness.<ref name=MurphyMeditation>{{cite book|author1=M. Murphy|author2=S. Donovan|author3=E. Taylor|year=1997|title=The Physical and Psychological Effects of Meditation: A Review of Contemporary Research With a Comprehensive Bibliography, 1931–1996|publisher=Institute of Noetic Sciences}}</ref>
The most extensive study of the characteristics of altered states of consciousness was made by psychologist [[Charles Tart]] in the 1960s and 1970s. Tart analyzed a state of consciousness as made up of a number of component processes, including exteroception (sensing the external world); [[interoception]] (sensing the body); input-processing (seeing meaning); emotions; memory; time sense; sense of identity; evaluation and cognitive processing; motor output; and interaction with the environment.<ref>{{cite book|last=Tart|first=Charles|author-link=Charles Tart|year=2001|title=States of Consciousness|publisher=IUniverse.com|chapter=Ch. 2: The components of consciousness|chapter-url=http://www.psychedelic-library.org/soc2.htm|isbn=978-0-595-15196-7|access-date=5 October 2011|archive-date=6 November 2011|archive-url=https://web.archive.org/web/20111106032020/http://www.psychedelic-library.org/soc2.htm|url-status=live}}</ref>{{self-published source|date=January 2023}} Each of these, in his view, could be altered in multiple ways by drugs or other manipulations. The components that Tart identified have not, however, been validated by empirical studies.
Research in this area has not yet reached firm conclusions, but a recent questionnaire-based study identified eleven significant factors contributing to drug-induced states of consciousness: experience of unity; spiritual experience; blissful state; insightfulness; disembodiment; impaired control and cognition; anxiety; complex imagery; elementary imagery; audio-visual [[synesthesia]]; and changed meaning of percepts.<ref>{{cite journal|last1=Studerus|first1=Erich|last2=Gamma|first2=Alex|last3=Vollenweider|first3=Franz X.|year=2010|editor-last=Bell|editor-first=Vaughan|title=Psychometric evaluation of the altered states of consciousness rating scale (OAV)|journal=[[PLOS One]]|volume=5|issue=8|article-number=e12412|bibcode=2010PLoSO...512412S|doi=10.1371/journal.pone.0012412|pmc=2930851|pmid=20824211|doi-access=free}}</ref>
== Medical aspects ==
{{Main|Altered level of consciousness}}
The medical approach to consciousness is scientifically oriented. It derives from a need to treat people whose brain function has been impaired as a result of disease, brain damage, toxins, or drugs. In medicine, conceptual distinctions are considered useful to the degree that they can help to guide treatments. The medical approach mainly focuses on the amount of consciousness a person has: in medicine, consciousness is assessed as a "level" ranging from coma and brain death at the low end, to full alertness and purposeful responsiveness at the high end.<ref name=Blumenfeld>{{cite book|title=The Neurology of Consciousness: Cognitive Neuroscience and Neuropathology|editor=Steven Laureys|editor2=Giulio Tononi|chapter=The neurological examination of consciousness|author=Hal Blumenfeld|year=2009|publisher=Academic Press|isbn=978-0-12-374168-4}}</ref>
Consciousness is of concern to patients and physicians, especially [[neurology|neurologists]] and [[anesthesia|anesthesiologists]]. Patients may have disorders of consciousness or may need to be anesthetized for a surgical procedure. Physicians may perform consciousness-related interventions such as instructing the patient to sleep, administering [[general anesthesia]], or inducing [[induced coma|medical coma]].<ref name=Blumenfeld/> Also, [[bioethics|bioethicists]] may be concerned with the ethical implications of consciousness in medical cases of patients such as the [[Karen Ann Quinlan case]],<ref>{{cite journal|vauthors=Kinney HC, Korein J, Panigrahy A, Dikkes P, Goode R|date=26 May 1994|issue=21|journal=N Engl J Med|pages=1469–1475|pmid=8164698|title=Neuropathological findings in the brain of Karen Ann Quinlan – the role of the thalamus in the persistent vegetative state|volume=330|doi=10.1056/NEJM199405263302101|s2cid=5112573|url=http://pdfs.semanticscholar.org/44a2/3798f5dc002a79512bfa9bff974bdbb611e1.pdf|archive-url=https://web.archive.org/web/20201118022837/http://pdfs.semanticscholar.org/44a2/3798f5dc002a79512bfa9bff974bdbb611e1.pdf|archive-date=18 November 2020}}</ref> while neuroscientists may study patients with impaired consciousness in hopes of gaining information about how the brain works.<ref>Koch, ''The Quest for Consciousness'', pp. 216–226</ref>
=== Assessment ===
In medicine, consciousness is examined using a set of procedures known as [[neuropsychological assessment]].<ref name=Giacino>{{cite journal|author1=J.T. Giacino|author2=C.M. Smart|year=2007|doi=10.1097/WCO.0b013e3282f189ef|journal=Current Opinion in Neurology|pages=614–619|pmid=17992078|title=Recent advances in behavioral assessment of individuals with disorders of consciousness|volume=20|issue=6|s2cid=7097163}}</ref> There are two commonly used methods for assessing the level of consciousness of a patient: a simple procedure that requires minimal training, and a more complex procedure that requires substantial expertise. The simple procedure begins by asking whether the patient is able to move and react to physical stimuli. If so, the next question is whether the patient can respond meaningfully to questions and commands. If so, the patient is asked for their name, current location, and current day and time. A patient who can answer all of these questions is said to be "alert and oriented times four" (sometimes denoted "A&Ox4" on a medical chart), and is usually considered fully conscious.<ref>{{cite book|title=Essentials of Abnormal Psychology|url=https://archive.org/details/isbn_9780495806134|url-access=registration|author=V. Mark Durand|author2=David H. Barlow|publisher=Cengage Learning|year=2009|isbn=978-0-495-59982-1|pages=[https://archive.org/details/isbn_9780495806134/page/74 74–75]}} Note: A patient who can additionally describe the current situation may be referred to as "oriented times four".</ref>
The more complex procedure is known as a [[neurological examination]], and is usually carried out by a neurologist in a hospital setting. A formal neurological examination runs through a precisely delineated series of tests, beginning with tests for basic sensorimotor reflexes, and culminating with tests for sophisticated use of language. The outcome may be summarized using the [[Glasgow Coma Scale]], which yields a number in the range 3–15, with a score of 3 to 8 indicating coma, and 15 indicating full consciousness. The Glasgow Coma Scale has three subscales, measuring the best motor response (ranging from "no motor response" to "obeys commands"), the best eye response (ranging from "no eye opening" to "eyes opening spontaneously") and the best verbal response (ranging from "no verbal response" to "fully oriented"). There is also a simpler [[Paediatric Glasgow Coma Scale|pediatric]] version of the scale, for children too young to be able to use language.<ref name=Blumenfeld/>
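The total score is simply the sum of the three subscale scores. The following sketch is illustrative only, not clinical software; it assumes the standard subscale point ranges (eye opening 1–4, verbal response 1–5, motor response 1–6), and the function names are invented for this example.
<syntaxhighlight lang="python">
# Illustrative sketch only, not clinical software. Assumes the standard
# Glasgow Coma Scale subscale ranges: eye opening 1-4, verbal 1-5, motor 1-6.

def glasgow_coma_score(eye: int, verbal: int, motor: int) -> int:
    """Return the total GCS score (3-15) as the sum of the three subscales."""
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("subscale score out of range")
    return eye + verbal + motor

def band(total: int) -> str:
    """Map a total score to the coarse bands mentioned above."""
    if total <= 8:
        return "coma (3-8)"
    if total == 15:
        return "full consciousness (15)"
    return "intermediate (9-14)"

# Example: eyes open to speech (3), confused speech (4), obeys commands (6).
total = glasgow_coma_score(3, 4, 6)
print(total, band(total))  # -> 13 intermediate (9-14)
</syntaxhighlight>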
In 2013, an experimental procedure was developed to measure degrees of consciousness, the procedure involving stimulating the brain with a magnetic pulse, measuring resulting waves of electrical activity, and developing a consciousness score based on the complexity of the brain activity.<ref name=NBCnews20130814>{{cite web|url=https://www.nbcnews.com/healthmain/new-tool-peeks-brain-measure-consciousness-6c10919906|title=New tool peeks into brain to measure consciousness|last=Neergaard|first=Lauren|date=August 14, 2013|publisher=Associated Press through NBC News|archive-url=https://web.archive.org/web/20130816144320/https://www.nbcnews.com/health/new-tool-peeks-brain-measure-consciousness-6C10919906|archive-date=August 16, 2013|access-date=March 2, 2022}}</ref> | In 2013, an experimental procedure was developed to measure degrees of consciousness, the procedure involving stimulating the brain with a magnetic pulse, measuring resulting waves of electrical activity, and developing a consciousness score based on the complexity of the brain activity.<ref name=NBCnews20130814>{{cite web|url=https://www.nbcnews.com/healthmain/new-tool-peeks-brain-measure-consciousness-6c10919906|title=New tool peeks into brain to measure consciousness|last=Neergaard|first=Lauren|date=August 14, 2013|publisher=Associated Press through NBC News|archive-url=https://web.archive.org/web/20130816144320/https://www.nbcnews.com/health/new-tool-peeks-brain-measure-consciousness-6C10919906|archive-date=August 16, 2013|access-date=March 2, 2022}}</ref> | ||
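The published procedure is considerably more involved, but the underlying idea of scoring the "complexity" of an activity pattern can be conveyed with a toy sketch: a Lempel–Ziv-style phrase count over a binarized signal, in which highly regular signals parse into fewer phrases than irregular ones. The code below is a rough illustration under that assumption and is not the published scoring method.
<syntaxhighlight lang="python">
# Toy illustration only: counts distinct phrases in a left-to-right
# Lempel-Ziv-style parse of a binarized signal. This is NOT the published
# consciousness-scoring procedure, just a sketch of "pattern complexity".
import random

def lz_phrase_count(bits: str) -> int:
    """Return the number of new phrases found while scanning the string."""
    seen, phrase, count = set(), "", 0
    for b in bits:
        phrase += b
        if phrase not in seen:   # phrase not seen before: record it, restart
            seen.add(phrase)
            count += 1
            phrase = ""
    return count

random.seed(0)
regular = "0" * 64                                           # flat, predictable signal
irregular = "".join(random.choice("01") for _ in range(64))  # noisy signal

# A constant signal parses into few phrases; a noisy one typically into many more.
print(lz_phrase_count(regular), lz_phrase_count(irregular))
</syntaxhighlight>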
=== Disorders ===
Medical conditions that inhibit consciousness are considered [[disorders of consciousness]].<ref name="chronic"/> This category generally includes [[minimally conscious state]] and [[persistent vegetative state]], but sometimes also includes the less severe [[locked-in syndrome]] and more severe [[coma|chronic coma]].<ref name="chronic">{{cite journal|author=Bernat JL|date=8 Apr 2006|doi=10.1016/S0140-6736(06)68508-5|issue=9517|journal=Lancet|pages=1181–1192|pmid=16616561|title=Chronic disorders of consciousness|volume=367|s2cid=13550675}}
</ref><ref>{{cite journal|author=Bernat JL|date=20 Jul 2010|doi=10.1212/WNL.0b013e3181e8e960|issue=3|journal=Neurology|pages=206–207|pmid=20554939|title=The natural history of chronic disorders of consciousness|volume=75|s2cid=30959964}}</ref> [[Differential diagnosis]] of these disorders is an active area of [[biomedical research]].<ref>{{cite journal|vauthors=Coleman MR, Davis MH, Rodd JM, Robson T, Ali A, Owen AM, Pickard JD|date=September 2009|doi=10.1093/brain/awp183|issue=9|journal=Brain|pages=2541–2552|pmid=19710182|title=Towards the routine use of brain imaging to aid the clinical diagnosis of disorders of consciousness|volume=132|doi-access=free}}</ref><ref>{{cite journal|vauthors=Monti MM, Vanhaudenhuyse A, Coleman MR, Boly M, Pickard JD, Tshibanda L, Owen AM, Laureys S|s2cid=13358991|date=18 Feb 2010|doi=10.1056/NEJMoa0905370|issue=7|journal=N Engl J Med|pages=579–589|pmid=20130250|title=Willful modulation of brain activity in disorders of consciousness|volume=362|url=http://pdfs.semanticscholar.org/560f/d2dd08c0532dcf5c61668690dd88d19d7114.pdf|archive-url=https://web.archive.org/web/20190224091809/http://pdfs.semanticscholar.org/560f/d2dd08c0532dcf5c61668690dd88d19d7114.pdf|archive-date=24 February 2019}}</ref><ref>{{cite journal|vauthors=Seel RT, Sherer M, Whyte J, Katz DI, Giacino JT, Rosenbaum AM, Hammond FM, Kalmar K, Pape TL|date=December 2010|doi=10.1016/j.apmr.2010.07.218|issue=12|journal=Arch Phys Med Rehabil|pages=1795–1813|pmid=21112421|title=Assessment scales for disorders of consciousness: evidence-based recommendations for clinical practice and research|volume=91|display-authors=etal}}</ref> Finally, [[brain death]] results in possible irreversible disruption of consciousness.<ref name="chronic"/> While other conditions may cause a moderate deterioration (e.g., [[dementia]] and [[delirium]]) or transient interruption (e.g., [[tonic–clonic seizure|grand mal]] and [[absence seizure|petit mal seizures]]) of consciousness, they are not included in this category.
{| class="wikitable" style="width:100%" | {| class="wikitable" style="width:100%" | ||
| Line 282: | Line 313: | ||
|} | |} | ||
Medical experts increasingly view [[anosognosia]] as a disorder of consciousness.<ref name="prigatano-2009">{{cite journal|last1=Prigatano|first1=George P.|title=Anosognosia: clinical and ethical considerations|journal=Current Opinion in Neurology|date=2009|volume=22|issue=6|pages=606–611|doi=10.1097/WCO.0b013e328332a1e7|pmid=19809315|s2cid=40751848}}</ref> ''Anosognosia'' is a Greek-derived term meaning "unawareness of disease". This is a condition in which patients are disabled in some way, most commonly as a result of a [[stroke]], but either misunderstand the nature of the problem or deny that there is anything wrong with them.<ref>{{cite book|editor=George Prigatano|editor2=[[Daniel Schacter]]|title=Awareness of Deficit After Brain Injury: Clinical and Theoretical Issues|publisher=Oxford University Press|year=1991|chapter=Introduction|author=George P. Prigatano|author2=Daniel Schacter|pages=3–16|isbn=978-0-19-505941-0|author2-link=Daniel Schacter}}</ref> The most frequently occurring form is seen in people who have experienced a stroke damaging the [[parietal lobe]] in the right hemisphere of the brain, giving rise to a syndrome known as [[hemispatial neglect]], characterized by an inability to direct action or attention toward objects located to the left with respect to their bodies. Patients with hemispatial neglect are often paralyzed on the left side of the body, but sometimes deny being unable to move. When questioned about the obvious problem, the patient may avoid giving a direct answer, or may give an explanation that does not make sense. Patients with hemispatial neglect may also fail to recognize paralyzed parts of their bodies: one frequently mentioned case is of a man who repeatedly tried to throw his own paralyzed right leg out of the bed he was lying in, and when asked what he was doing, complained that somebody had put a dead leg into the bed with him.
An even more striking type of anosognosia is [[Anton–Babinski syndrome]], a rarely occurring condition in which patients become blind but claim to be able to see normally, and persist in this claim in spite of all evidence to the contrary.<ref>{{cite book|editor=George Prigatano|editor2=[[Daniel Schacter]]|title=Awareness of Deficit After Brain Injury: Clinical and Theoretical Issues|publisher=Oxford University Press|year=1991|chapter=Anosognosia: possible neuropsychological mechanisms|author=Kenneth M. Heilman|pages=53–62|isbn=978-0-19-505941-0}}</ref>
== Outside human adults ==
=== In children ===
{{See also|Theory of mind}}
Of the eight types of consciousness in the Lycan classification, some are detectable in utero and others develop years after birth. Psychologist and educator William Foulkes studied children's dreams and concluded that prior to the shift in cognitive maturation that humans experience during ages five to seven,<ref>{{cite book|editor1=[[Arnold J. Sameroff]]|editor2=Marshall M. Haith|date=1996|title=The Five to Seven Year Shift: The Age of Reason and Responsibility|location=Chicago|publisher=University of Chicago Press}}</ref> children lack the Lockean consciousness that Lycan had labeled "introspective consciousness" and that Foulkes labels "self-reflection".<ref>{{cite book|last=Foulkes|first=David|date=1999|title=Children's Dreaming and the Development of Consciousness|page=13|location=Cambridge, Massachusetts|publisher=Harvard University Press|quote= In defining 'consciousness' as a self-reflective act, psychology loses much of the glamour and mystery of other areas of consciousness-study, but it also can proceed on a workaday basis without becoming paralyzed in pure abstraction.}}</ref> In a 2020 paper, [[Katherine Nelson]] and [[Robyn Fivush]] use "autobiographical consciousness" to label essentially the same faculty, and agree with Foulkes on the timing of this faculty's acquisition.
Nelson and Fivush contend that "language is the tool by which humans create a new, uniquely human form of consciousness, namely, autobiographical consciousness".<ref>{{cite journal|last1=Nelson|first1=Katherine|last2=Fivush|first2=Robin|title=The Development of Autobiographical Memory, Autobiographical Narratives, and Autobiographical Consciousness|journal=Psychological Reports|year=2020|volume=123|issue=1|page=74|doi=10.1177/0033294119852574|pmid=31142189|s2cid=169038149|doi-access=free}}</ref> [[Julian Jaynes]] had staked out these positions decades earlier.<ref>{{cite book|last=Jaynes|first=Julian|title=The Origin of Consciousness in the Breakdown of the Bicameral Mind|publisher=Houghton Mifflin|orig-date=1976| year=2000|page=447|quote=''Consciousness is based on language''.... Consciousness is not the same as cognition and should be sharply distinguished from it.|isbn=0-618-05707-2}}</ref><ref>{{cite book|last=Jaynes|first=Julian|title=The Origin of Consciousness in the Breakdown of the Bicameral Mind|publisher=Houghton Mifflin|orig-date=1976| year=2000|page=450|quote=The basic connotative definition of consciousness is thus an analog 'I' narratizing in a functional mind-space. The denotative definition is, as it was for Descartes, Locke, and Hume, what is introspectable.|isbn=0-618-05707-2}}</ref> Citing the developmental steps that lead the infant to autobiographical consciousness, Nelson and Fivush point to the acquisition of "[[theory of mind]]", calling theory of mind "necessary for autobiographical consciousness" and defining it as "understanding differences between one's own mind and others' minds in terms of beliefs, desires, emotions and thoughts". They write, "The hallmark of theory of mind, the understanding of false belief, occurs ... at five to six years of age".<ref>{{cite journal|last1=Nelson|first1=Katherine|last2=Fivush|first2=Robin|title=The Development of Autobiographical Memory, Autobiographical Narratives, and Autobiographical Consciousness|journal=Psychological Reports|year=2020|volume=123|issue=1|pages=80–83|doi=10.1177/0033294119852574|pmid=31142189|s2cid=169038149|doi-access=free}}</ref>
=== In animals ===
{{Main|Animal consciousness}}
"Convergent evidence indicates that non-human animals ..., including all mammals and birds, and other creatures, ... have the necessary neural substrates of consciousness and the capacity to exhibit intentional behaviors."<ref>{{cite web|url=http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf|archive-date=2022-10-09|url-status=live|title=Cambridge Declaration on Consciousness}}</ref> | "Convergent evidence indicates that non-human animals ..., including all mammals and birds, and other creatures, ... have the necessary neural substrates of consciousness and the capacity to exhibit intentional behaviors."<ref>{{cite web|url=http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://fcmconference.org/img/CambridgeDeclarationOnConsciousness.pdf|archive-date=2022-10-09|url-status=live|title=Cambridge Declaration on Consciousness}}</ref> | ||
=== In artificial intelligence ===
{{Main|Artificial consciousness}}
{{blockquote|It is desirable to guard against the possibility of exaggerated ideas that might arise as to the powers of the Analytical Engine. ... The Analytical Engine has no pretensions whatever to ''originate'' anything. It can do whatever we ''know how to order it'' to perform. It can ''follow'' analysis; but it has no power of ''anticipating'' any analytical relations or truths. Its province is to assist us in making ''available'' what we are already acquainted with.<ref>{{cite web|author=Ada Lovelace|title=Sketch of The Analytical Engine, Note G|url=http://www.fourmilab.ch/babbage/sketch.html|author-link=Ada Lovelace|access-date=2011-09-10|archive-date=2010-09-13|archive-url=https://web.archive.org/web/20100913042032/http://www.fourmilab.ch/babbage/sketch.html|url-status=live}}</ref>}}
One of the most influential contributions to this question was an essay written in 1950 by pioneering computer scientist [[Alan Turing]], titled ''[[Computing Machinery and Intelligence]]''. Turing disavowed any interest in terminology, saying that even "Can machines think?" is too loaded with spurious connotations to be meaningful; but he proposed to replace all such questions with a specific operational test, which has become known as the [[Turing test]].<ref name="tu">{{cite book|author=Stuart Shieber|title=The Turing Test: Verbal Behavior as the Hallmark of Intelligence|publisher=MIT Press|year=2004|isbn=978-0-262-69293-9}}</ref> To pass the test, a computer must be able to imitate a human well enough to fool interrogators. In his essay Turing discussed a variety of possible objections, and presented a counterargument to each of them. The Turing test is commonly cited in discussions of [[artificial intelligence]] as a proposed criterion for machine consciousness; it has provoked a great deal of philosophical debate. For example, Daniel Dennett and [[Douglas Hofstadter]] argue that anything capable of passing the Turing test is necessarily conscious,<ref name=MindsI>{{cite book|author=Daniel Dennett|author2=Douglas Hofstadter|year=1985|title=The Mind's I|publisher=Basic Books|isbn=978-0-553-34584-1|author2-link=Douglas Hofstadter|author-link=Daniel Dennett|url=https://archive.org/details/mindsifantasiesr1982hofs}}</ref> while [[David Chalmers]] argues that a [[philosophical zombie]] could pass the test, yet fail to be conscious.<ref name=Chalmers>{{cite book|author=David Chalmers|year=1997|title=The Conscious Mind: In Search of a Fundamental Theory|publisher=Oxford University Press|isbn=978-0-19-511789-9|author-link=David Chalmers}}</ref> A third group of scholars has argued that, with technological growth, once machines begin to display any substantial signs of human-like behavior, the dichotomy (of human consciousness compared to human-like consciousness) becomes passé and issues of machine autonomy begin to prevail, as already observed in nascent form within contemporary industry and [[technology]].<ref name="Ridley Scott pp. 133-144"/><ref name="Machine Morals 2010"/> [[Jürgen Schmidhuber]] argues that consciousness is the result of compression.<ref name="Schmidhuber2009">{{cite book|author=Jürgen Schmidhuber|year=2009|title=Driven by Compression Progress: A Simple Principle Explains Essential Aspects of Subjective Beauty, Novelty, Surprise, Interestingness, Attention, Curiosity, Creativity, Art, Science, Music, Jokes|url=https://archive.org/details/arxiv-0812.4360|author-link=Jürgen Schmidhuber|bibcode=2008arXiv0812.4360S|arxiv=0812.4360}}</ref> As an agent sees representations of itself recurring in the environment, the compression of those representations can be called consciousness.
[[File:John searle2.jpg|thumb|upright|John Searle in December 2005]]
In a lively exchange over what has come to be referred to as "the [[Chinese room]] argument", [[John Searle]] sought to refute the claim of proponents of what he calls "strong artificial intelligence (AI)" that a computer program can be conscious, though he does agree with advocates of "weak AI" that computer programs can be formatted to "simulate" conscious states. His own view is that consciousness has subjective, first-person causal powers by being essentially intentional due to the way human brains function biologically; conscious persons can perform computations, but consciousness is not inherently computational the way computer programs are.
To make a Turing machine that speaks Chinese, Searle imagines a room with one monolingual English speaker (Searle himself, in fact), a rulebook that specifies which combination of Chinese symbols to output in response to each combination of Chinese symbols received as input, and boxes filled with Chinese symbols. In this case, the English speaker is acting as a computer and the rulebook as a program. Searle argues that with such a machine, he would be able to process the inputs to outputs perfectly without having any understanding of Chinese, nor having any idea what the questions and answers could possibly mean. If the experiment were done in English, since Searle knows English, he would be able to take questions and give answers without any algorithms for English questions, and he would be effectively aware of what was being said and the purposes it might serve. Searle would pass the Turing test of answering the questions in both languages, but he is only conscious of what he is doing when he speaks English. Another way of putting the argument is to say that computer programs can pass the Turing test for processing the syntax of a language, but that the syntax cannot lead to semantic meaning in the way strong AI advocates hoped.<ref name=Searle1990>{{cite journal|author=John R. Searle|title=Is the brain's mind a computer program|journal=Scientific American|year=1990|volume= 262|issue=1|pages=26–31|url=http://www.cs.princeton.edu/courses/archive/spr06/cos116/Is_The_Brains_Mind_A_Computer_Program.pdf|archive-url=https://ghostarchive.org/archive/20221009/http://www.cs.princeton.edu/courses/archive/spr06/cos116/Is_The_Brains_Mind_A_Computer_Program.pdf|archive-date=2022-10-09|url-status=live|doi=10.1038/scientificamerican0190-26|pmid=2294583|bibcode=1990SciAm.262a..26S|author-link=John R. Searle}}</ref><ref name=SearleSEP>{{cite book| title=The Chinese Room Argument| url=http://plato.stanford.edu/entries/chinese-room| publisher=Metaphysics Research Lab, Stanford University| year=2019| access-date=2012-02-20| archive-date=2012-01-12| archive-url=https://web.archive.org/web/20120112034000/http://plato.stanford.edu/entries/chinese-room/| url-status=live}}</ref>
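The purely formal character of the rulebook can be conveyed with a minimal sketch: a lookup table pairing input symbol strings with output symbol strings, in which no step depends on what the symbols mean. The table entries below are invented placeholders for illustration, not anything taken from Searle's argument.
<syntaxhighlight lang="python">
# Minimal sketch of the "rulebook as program" idea. The entries are invented
# placeholders; the only point is that answering requires no understanding.
RULEBOOK = {
    "你好吗": "我很好",    # hypothetical pairing ("How are you?" -> "I am fine")
    "你会思考吗": "会",    # hypothetical pairing ("Can you think?" -> "Yes")
}

def chinese_room(question: str) -> str:
    """Return whatever string of symbols the rulebook pairs with the input.

    Every step is purely syntactic: symbols are matched by form alone,
    and nothing here depends on their meaning.
    """
    return RULEBOOK.get(question, "")  # unrecognized input: return nothing

print(chinese_room("你好吗"))  # emits the paired symbols without "understanding" them
</syntaxhighlight>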
In 2014, Victor Argonov suggested a non-Turing test for machine consciousness based on a machine's ability to produce philosophical judgments.<ref>{{cite journal|author=Victor Argonov|title=Experimental Methods for Unraveling the Mind-body Problem: The Phenomenal Judgment Approach|journal=Journal of Mind and Behavior|volume=35|year=2014|pages=51–70|url=http://philpapers.org/rec/ARGMAA-2|access-date=2016-12-06|archive-date=2016-10-20|archive-url=https://web.archive.org/web/20161020014221/http://philpapers.org/rec/ARGMAA-2|url-status=live}}</ref> He argues that a deterministic machine must be regarded as conscious if it is able to produce judgments on all problematic properties of consciousness (such as qualia or binding) while having no innate (preloaded) philosophical knowledge on these issues, no philosophical discussions while learning, and no informational models of other creatures in its memory (such models may implicitly or explicitly contain knowledge about these creatures' consciousness). However, this test can be used only to detect, but not to refute, the existence of consciousness. A positive result proves that a machine is conscious, but a negative result proves nothing. For example, absence of philosophical judgments may be caused by lack of the machine's intellect, not by absence of consciousness.
In 2023, [[Nick Bostrom]] argued that being very sure that [[Large language model|large language models]] (LLMs) are not conscious would require unwarranted confidence about which theory of consciousness is correct and how it applies to machines.<ref>{{Cite web |last=Leith |first=Sam |date=2022-07-07 |title=Nick Bostrom: How can we be certain a machine isn't conscious? |url=https://www.spectator.co.uk/article/nick-bostrom-how-can-we-be-certain-a-machine-isnt-conscious/ |access-date=2025-08-09 |website=The Spectator |language=en-GB}}</ref> He views consciousness as a matter of degree,<ref>{{Cite news |date=2023-04-12 |title=What if A.I. Sentience Is a Question of Degree? |url=https://www.nytimes.com/2023/04/12/world/artificial-intelligence-nick-bostrom.html |access-date=2025-08-09 |work=The New York Times |language=en}}</ref> and has argued that machines could in theory be much more conscious than humans.<ref>{{Cite journal |last1=Shulman |first1=Carl |last2=Bostrom |first2=Nick |date=August 2021 |title=Sharing the World with Digital Minds |url=https://www.researchgate.net/publication/353967146 |journal=Rethinking Moral Status |pages=306–326 |doi=10.1093/oso/9780192894076.003.0018 |isbn=978-0-19-289407-6 }}</ref><ref>{{Cite web |date=2020-11-13 |title=The intelligent monster that you should let eat you |url=https://www.bbc.com/future/article/20201111-philosophy-of-utility-monsters-and-artificial-intelligence |access-date=2025-08-09 |website=BBC |language=en-GB}}</ref>
==Stream of consciousness==
# ''It is interested in some parts of these objects to the exclusion of others.''
A similar concept appears in Buddhist philosophy, expressed by the Sanskrit term ''Citta-saṃtāna'', which is usually translated as [[mindstream]] or "mental continuum". Buddhist teachings describe consciousness as manifesting moment to moment as sense impressions and mental phenomena that are continuously changing.<ref name=Aggregates>{{cite journal|author= Karunamuni N.D.|title=The Five-Aggregate Model of the Mind|journal=SAGE Open|volume=5|issue=2|page=215824401558386|date=May 2015|article-number=2158244015583860 |doi=10.1177/2158244015583860|doi-access=free}}</ref> The teachings list six triggers that can result in the generation of different mental events.<ref name="Aggregates"/> These triggers are input from the five senses (seeing, hearing, smelling, tasting or touch sensations), or a thought (relating to the past, present or the future) that happens to arise in the mind. The mental events generated as a result of these triggers are: feelings, perceptions and intentions/behaviour. The moment-by-moment manifestation of the mind-stream is said to happen in every person all the time. It even happens in a scientist who analyzes various phenomena in the world, or analyzes the material body, including the brain.<ref name="Aggregates"/> The manifestation of the mindstream is also described as being influenced by physical laws, biological laws, psychological laws, volitional laws, and universal laws.<ref name="Aggregates"/> The purpose of the Buddhist practice of [[mindfulness]] is to understand the inherent nature of the consciousness and its characteristics.<ref>{{cite book|title=Losing the Clouds, Gaining the Sky: Buddhism and the Natural Mind|chapter=Taming the mindstream|author=Dzogchen Rinpoche|editor=Doris Wolter|year=2007|publisher=Wisdom Publications|isbn=978-0-86171-359-2|pages=[https://archive.org/details/losingcloudsgain0000unse/page/81 81–92]|chapter-url=https://archive.org/details/losingcloudsgain0000unse/page/81}}</ref>
=== Narrative form ===
In the West, the primary impact of the idea has been on literature rather than science: "[[stream of consciousness (narrative mode)|stream of consciousness as a narrative mode]]" means writing in a way that attempts to portray the moment-to-moment thoughts and experiences of a character. This technique perhaps had its beginnings in the monologues of Shakespeare's plays and reached its fullest development in the novels of [[James Joyce]] and [[Virginia Woolf]], although it has also been used by many other noted writers.<ref>{{cite book|author=Robert Humphrey|title=Stream of Consciousness in the Modern Novel|date=1992 |orig-date=1954|publisher=University of California Press|isbn=978-0-520-00585-3|pages=23–49}}</ref>
{{blockquote|Yes because he never did a thing like that before as ask to get his breakfast in bed with a couple of eggs since the City Arms hotel when he used to be pretending to be laid up with a sick voice doing his highness to make himself interesting for that old faggot Mrs Riordan that he thought he had a great leg of and she never left us a farthing all for masses for herself and her soul greatest miser ever was actually afraid to lay out 4d for her methylated spirit telling me all her ailments she had too much old chat in her about politics and earthquakes and the end of the world let us have a bit of fun first God help the world if all the women were her sort down on bathingsuits and lownecks of course nobody wanted her to wear them I suppose she was pious because no man would look at her twice I hope Ill never be like her a wonder she didnt want us to cover our faces but she was a well-educated woman certainly and her gabby talk about Mr Riordan here and Mr Riordan there I suppose he was glad to get shut of her.<ref>{{cite book|author=James Joyce|title=Ulysses|year=1990|publisher=BompaCrazy.com|page=620|author-link=James Joyce}}</ref>}}
== Spiritual approaches ==
{{Further|Higher consciousness}}
The [[Upanishads]] hold the oldest recorded map of consciousness, as explored by sages through meditation.<ref>{{Cite book |last=Thompson |first=Evan |url=https://books.google.com/books?id=q_vpBAAAQBAJ |title=Waking, Dreaming, Being: Self and Consciousness in Neuroscience, Meditation, and Philosophy |date=2014-11-18 |publisher=Columbia University Press |isbn=978-0-231-53831-2 |page=19 |language=en}}</ref>
The Canadian psychiatrist [[Richard Maurice Bucke]], author of the 1901 book ''[[Cosmic Consciousness|Cosmic Consciousness: A Study in the Evolution of the Human Mind]]'', distinguished between three types of consciousness: 'Simple Consciousness', awareness of the body, possessed by many animals; 'Self Consciousness', awareness of being aware, possessed only by humans; and 'Cosmic Consciousness', awareness of the life and order of the universe, possessed only by humans who have attained "intellectual enlightenment or illumination".<ref>{{cite book|author=Richard Maurice Bucke|title=Cosmic Consciousness: A Study in the Evolution of the Human Mind|publisher=Innes & Sons|year=1905|url=https://archive.org/details/cosmicconsciousn01buck|pages=[https://archive.org/details/cosmicconsciousn01buck/page/n19 1]–2|author-link=Richard Maurice Bucke}}</ref>
Other examples include the various levels of spiritual consciousness presented by [[Prem Saran Satsangi]] and [[Stuart Hameroff]].<ref>{{cite book|editor1-link=Prem Saran Satsangi|editor1-last=Satsangi|editor1-first=Prem Saran|editor2-link=Stuart Hameroff|editor2-last=Hameroff|editor2-first=Stuart|year=2016|title=Consciousness: Integrating Eastern and Western Perspectives|publisher=New Age Books|isbn=978-81-7822-493-0}}</ref>
== See also ==
{{Cols|colwidth=26em}}
* {{Annotated link|Animal consciousness}}
* {{Annotated link|Artificial consciousness}}
* {{Annotated link|Bicameral mentality}}
* {{Annotated link|Chaitanya (consciousness)|Chaitanya}}
* {{Annotated link|Claustrum}}
* {{Annotated link|Habenula}}
* {{Annotated link|Higher-order theories of consciousness}}
* {{Annotated link|Models of consciousness}}
* {{Annotated link|Plant perception (paranormal)|Plant perception}}
* {{Annotated link|Sakshi (witness)|Sakshi}}
* {{Annotated link|Vertiginous question}}
{{Colend}}
== Notes ==
{{Notelist|30em}}
== References ==
{{Reflist}}
== Further reading ==
{{Div col|colwidth=30em}}
* {{cite book |last1=Dehaene |first1=Stanislas |author1-link=Stanislas Dehaene |title=Consciousness and the Brain: Deciphering How the Brain Codes Our Thoughts |date=2014 |publisher=Viking Press |isbn=978-0-670-02543-5 |ref=none}}
* {{cite book |last1=Frankish |first1=Keith |author1-link=Keith Frankish |title=Consciousness: The Basics |date=2021 |publisher=Routledge |isbn=978-1-138-65598-0 |ref=none}}
* {{cite book |last=Harley |first=Trevor |title=The Science of Consciousness: Waking, Sleeping, and Dreaming |year=2021 |publisher=Cambridge University Press |doi=10.1017/9781316408889 |isbn=978-1-107-56330-8 |s2cid=233977060 |ref=none}}
* {{cite book |last1=Irvine |first1=Elizabeth |title=Consciousness as a Scientific Concept: A Philosophy of Science Perspective |date=2013 |publisher=Springer |location=Dordrecht, Netherlands |isbn=978-94-007-5172-9 |doi=10.1007/978-94-007-5173-6 |ref=none}}
* {{cite book |last=Koch |first=Christof |author-link=Christof Koch |title=The Feeling of Life Itself: Why Consciousness Is Widespread but Can't Be Computed |year=2019 |publisher=MIT Press |isbn=978-0-262-04281-9 |ref=none}}
* {{cite book |editor1-last=Overgaard |editor1-first=Morten |editor2-last=Mogensen |editor2-first=Jesper |editor2-link=Jesper Mogensen |editor3-last=Kirkeby-Hinrup |editor3-first=Asger |title=Beyond Neural Correlates of Consciousness |date=2021 |publisher=Routledge |isbn=978-1-138-63798-6 |ref=none}}
* {{cite book |last1=Prinz |first1=Jesse |author1-link=Jesse Prinz |title=The Conscious Brain: How Attention Engenders Experience |date=2012 |publisher=Oxford University Press |isbn=978-0-19-531459-5 |doi=10.1093/acprof:oso/9780195314595.001.0001 |ref=none}}
* {{cite book |editor1-last=Schneider |editor1-first=Susan |editor2-last=Velmans |editor2-first=Max |editor1-link=Susan Schneider |editor2-link=Max Velmans |title=The Blackwell Companion to Consciousness |date=2017 |publisher=Wiley-Blackwell |isbn=978-0-470-67406-2 |edition=2nd |ref=none}}
* {{cite book |last1=Seth |first1=Anil |author1-link=Anil Seth |title=Being You: A New Science of Consciousness |date=2021 |publisher=Penguin Random House |isbn=978-1-5247-4287-4 |ref=none}}
* {{cite book |last1=Thompson |first1=Evan |author1-link=Evan Thompson |title=Waking, Dreaming, Being: Self and Consciousness in Neuroscience, Meditation, and Philosophy |date=2014 |publisher=Columbia University Press |isbn=978-0-231-13695-2 |ref=none}}
* {{cite book |editor1-last=Zelazo |editor1-first=Philip David |editor1-link=Philip David Zelazo |editor2-last=Moscovitch |editor2-first=Morris |editor2-link=Morris Moscovitch |editor3-last=Thompson |editor3-first=Evan |editor3-link=Evan Thompson |title=The Cambridge Handbook of Consciousness |year=2007 |publisher=Cambridge University Press |isbn=978-0-521-67412-6 |doi=10.1017/CBO9780511816789 |ref=none}}
{{Div col end}}
=== Articles ===
* Lewis, Ralph. ''[https://www.psychologytoday.com/intl/blog/finding-purpose/202308/an-overview-of-the-leading-theories-of-consciousness An Overview of the Leading Theories of Consciousness]. Organizing and comparing the major candidate theories in the field.'' Psychology Today, November 25, 2023.
== External links ==
{{Spoken Wikipedia|En-Consciousness-article.ogg|date=2023-07-30}}
* {{Commons category-inline}}
* {{Library resources about}}
[[Category:Metaphysical properties]]
[[Category:Metaphysics of mind]]
[[Category:Mind–body problem]]
[[Category:Neuropsychological assessment]]
[[Category:Ontology]]
[[Category:Phenomenology]]
[[Category:Theory of mind]]
Latest revision as of 01:05, 9 November 2025
Examples of the range of descriptions, definitions and explanations are: ordered distinction between self and environment, simple wakefulness, one's sense of selfhood or soul explored by "looking within", being a metaphorical "stream" of contents, or being a mental state, mental event, or mental process of the brain.
Etymology
The words "conscious" and "consciousness" in the English language date to the 17th century, and the first recorded use of "conscious" as a simple adjective was applied figuratively to inanimate objects ("the conscious Groves", 1643).[6] It derived from the Latin conscius (con- "together" and scio "to know") which meant "knowing with" or "having joint or common knowledge with another", especially as in sharing a secret.[7] Thomas Hobbes in Leviathan (1651) wrote: "Where two, or more men, know of one and the same fact, they are said to be Conscious of it one to another".[8] There were also many occurrences in Latin writings of the phrase conscius sibi, which translates literally as "knowing with oneself", or in other words "sharing knowledge with oneself about something". This phrase has the figurative sense of "knowing that one knows", which is something like the modern English word "conscious", but it was rendered into English as "conscious to oneself" or "conscious unto oneself". For example, Archbishop Ussher wrote in 1613 of "being so conscious unto myself of my great weakness".[9]
The Latin conscientia, literally "knowledge-with", first appears in Roman juridical texts by writers such as Cicero. It means a kind of shared knowledge with moral value, specifically what a witness knows of someone else's deeds.[10][11] Although René Descartes (1596–1650), writing in Latin, is generally taken to be the first philosopher to use conscientia in a way less like the traditional meaning and more like the way modern English speakers would use "conscience", his meaning is nowhere defined.[12] In Search after Truth (Amsterdam 1701) he wrote the word with a gloss: conscientiâ, vel interno testimonio (translatable as "conscience, or internal testimony").[13][14] It might mean the knowledge of the value of one's own thoughts.[12]
The origin of the modern concept of consciousness is often attributed to John Locke who defined the word in his Essay Concerning Human Understanding, published in 1690, as "the perception of what passes in a man's own mind".[15][16] The essay strongly influenced 18th-century British philosophy, and Locke's definition appeared in Samuel Johnson's celebrated Dictionary (1755).[17]
The French term conscience is defined roughly like English "consciousness" in the 1753 volume of Diderot and d'Alembert's Encyclopédie as "the opinion or internal feeling that we ourselves have from what we do".[18]
Problem of definition
Scholars are divided as to whether Aristotle had a concept of consciousness. He does not use any single word or terminology that is clearly similar to the phenomenon or concept defined by John Locke. Victor Caston contends that Aristotle did have a concept more clearly similar to perception.[19]
Modern dictionary definitions of the word consciousness evolved over several centuries and reflect a range of seemingly related meanings, with some differences that have been controversial, such as the distinction between inward awareness and perception of the physical world, or the distinction between conscious and unconscious, or the notion of a mental entity or mental activity that is not physical.
The common-usage definitions of consciousness in Webster's Third New International Dictionary (1966) are as follows:
- awareness or perception of an inward psychological or spiritual fact; intuitively perceived knowledge of something in one's inner self
- inward awareness of an external object, state, or fact
- concerned awareness; INTEREST, CONCERN—often used with an attributive noun [e.g. class consciousness]
- the state or activity that is characterized by sensation, emotion, volition, or thought; mind in the broadest possible sense; something in nature that is distinguished from the physical
- the totality in psychology of sensations, perceptions, ideas, attitudes, and feelings of which an individual or a group is aware at any given time or within a particular time span—
- waking life (as that to which one returns after sleep, trance, fever) wherein all one's mental powers have returned . . .
- the part of mental life or psychic content in psychoanalysis that is immediately available to the ego—
The Cambridge English Dictionary defines consciousness as "the state of being awake, thinking, and knowing what is happening around you", as well as "the state of understanding and realizing something".[20] The Oxford Living Dictionary defines consciousness as "[t]he state of being aware of and responsive to one's surroundings", "[a] person's awareness or perception of something", and "[t]he fact of awareness by the mind of itself and the world".[21]
Philosophers have attempted to clarify technical distinctions by using a jargon of their own. The corresponding entry in the Routledge Encyclopedia of Philosophy (1998) reads:
- Consciousness
- Philosophers have used the term consciousness for four main topics: knowledge in general, intentionality, introspection (and the knowledge it specifically generates) and phenomenal experience... Something within one's mind is 'introspectively conscious' just in case one introspects it (or is poised to do so). Introspection is often thought to deliver one's primary knowledge of one's mental life. An experience or other mental entity is 'phenomenally conscious' just in case there is 'something it is like' for one to have it. The clearest examples are: perceptual experience, such as tastings and seeings; bodily-sensational experiences, such as those of pains, tickles and itches; imaginative experiences, such as those of one's own actions or perceptions; and streams of thought, as in the experience of thinking 'in words' or 'in images'. Introspection and phenomenality seem independent, or dissociable, although this is controversial.[22]
Traditional metaphors for mind
During the early 19th century, the emerging field of geology inspired a popular metaphor that the mind likewise had hidden layers "which recorded the past of the individual". By 1875, most psychologists believed that "consciousness was but a small part of mental life", and this idea underlies the goal of Freudian therapy, to expose the unconscious layers of the mind.
Other metaphors from various sciences inspired other analyses of the mind, for example: Johann Friedrich Herbart described ideas as being attracted and repulsed like magnets; John Stuart Mill developed the idea of "mental chemistry" and "mental compounds", and Edward B. Titchener sought the "structure" of the mind by analyzing its "elements". The abstract idea of states of consciousness mirrored the concept of states of matter.
In 1892, William James noted that the "ambiguous word 'content' has been recently invented instead of 'object'" and that the metaphor of mind as a container seemed to minimize the dualistic problem of how "states of consciousness can know" things, or objects; by 1899 psychologists were busily studying the "contents of conscious experience by introspection and experiment".[23] Another popular metaphor was James's doctrine of the stream of consciousness, with continuity, fringes, and transitions.
James discussed the difficulties of describing and studying psychological phenomena, recognizing that commonly used terminology was a necessary and acceptable starting point towards more precise, scientifically justified language. Prime examples were phrases like inner experience and personal consciousness:
The first and foremost concrete fact which every one will affirm to belong to his inner experience is the fact that consciousness of some sort goes on. [...] But everyone knows what the terms mean [only] in a rough way; [...] When I say [...], 'personal consciousness' is one of the terms in question. Its meaning we know so long as no one asks us to define it, but to give an accurate account of it is the most difficult of philosophic tasks. [...] The only states of consciousness that we naturally deal with are found in personal consciousnesses, minds, selves, concrete particular I's and you's.
From introspection to awareness and experience
Prior to the 20th century, philosophers treated the phenomenon of consciousness as the "inner world [of] one's own mind", and introspection was the mind "attending to" itself, an activity seemingly distinct from that of perceiving the "outer world" and its physical phenomena. In 1892 William James noted the distinction along with doubts about the inward character of the mind:
'Things' have been doubted, but thoughts and feelings have never been doubted. The outer world, but never the inner world, has been denied. Everyone assumes that we have direct introspective acquaintance with our thinking activity as such, with our consciousness as something inward and contrasted with the outer objects which it knows. Yet I must confess that for my part I cannot feel sure of this conclusion. [...] It seems as if consciousness as an inner activity were rather a postulate than a sensibly given fact...[24]
By the 1960s, for many philosophers and psychologists who talked about consciousness, the word no longer meant the 'inner world' but an indefinite, large category called awareness, as in the following example:
It is difficult for modern Western man to grasp that the Greeks really had no concept of consciousness in that they did not class together phenomena as varied as problem solving, remembering, imagining, perceiving, feeling pain, dreaming, and acting on the grounds that all these are manifestations of being aware or being conscious.[25]
Many philosophers and scientists have been unhappy about the difficulty of producing a definition that does not involve circularity or fuzziness.[26] In The Macmillan Dictionary of Psychology (1989 edition), Stuart Sutherland emphasized external awareness, and expressed a skeptical attitude more than a definition:
Consciousness—The having of perceptions, thoughts, and feelings; awareness. The term is impossible to define except in terms that are unintelligible without a grasp of what consciousness means. Many fall into the trap of equating consciousness with self-consciousness—to be conscious it is only necessary to be aware of the external world. Consciousness is a fascinating but elusive phenomenon: it is impossible to specify what it is, what it does, or why it has evolved. Nothing worth reading has been written on it.[26]
Using 'awareness', however, as a definition or synonym of consciousness is not a simple matter:
If awareness of the environment . . . is the criterion of consciousness, then even the protozoans are conscious. If awareness of awareness is required, then it is doubtful whether the great apes and human infants are conscious.[23]
In 1974, philosopher Thomas Nagel used 'consciousness', 'conscious experience', 'subjective experience' and the 'subjective character of experience' as synonyms for something that "occurs at many levels of animal life ... [although] it is difficult to say in general what provides evidence of it."[27] Nagel's terminology also included what has been described as "the standard 'what it's like' locution"[28] in reference to the impenetrable subjectivity of any organism's experience, which Nagel referred to as "inner life" without implying any kind of introspection. On Nagel's approach, Peter Hacker commented: "Consciousness, thus conceived, is extended to the whole domain of 'experience'—of 'Life' [...]". He regarded this as a "novel analysis of consciousness" and has been particularly critical of Nagel's terminology and its philosophical consequences. In 2002 he attacked Nagel's 'what it's like' phrase as "malconstructed" and meaningless English—it sounds as if it asks for an analogy, but does not—and he called Nagel's approach logically "misconceived" as a definition of consciousness.[29] In 2012 Hacker went further and asserted that Nagel had "laid the groundwork for ... forty years of fresh confusion about consciousness" and that "the contemporary philosophical conception of consciousness that is embraced by the 'consciousness studies community' is incoherent".
Influence on research
Many philosophers have argued that consciousness is a unitary concept that is understood by the majority of people despite the difficulty philosophers have had defining it.[30] The term 'subjective experience', following Nagel, is ambiguous, as philosophers seem to differ from non-philosophers in their intuitions about its meaning.[31] Max Velmans proposed that the "everyday understanding of consciousness" uncontroversially "refers to experience itself rather than any particular thing that we observe or experience" and he added that consciousness "is [therefore] exemplified by all the things that we observe or experience", whether thoughts, feelings, or perceptions. Velmans noted, however, as of 2009, that there was a deep level of "confusion and internal division"[32] among experts about the phenomenon of consciousness, because researchers lacked "a sufficiently well-specified use of the term...to agree that they are investigating the same thing". He argued additionally that "pre-existing theoretical commitments" to competing explanations of consciousness might be a source of bias.
Within the "modern consciousness studies" community the technical phrase 'phenomenal consciousness' is a common synonym for all forms of awareness, or simply 'experience',Template:R without differentiating between inner and outer, or between higher and lower types. With advances in brain research, "the presence or absence of experienced phenomena"Template:R of any kind underlies the work of those neuroscientists who seek "to analyze the precise relation of conscious phenomenology to its associated information processing" in the brain.Template:R This neuroscientific goal is to find the "neural correlates of consciousness" (NCC). One criticism of this goal is that it begins with a theoretical commitment to the neurological origin of all "experienced phenomena" whether inner or outer.Template:Efn Also, the fact that the easiest 'content of consciousness' to be so analyzed is "the experienced three-dimensional world (the phenomenal world) beyond the body surface"Template:R invites another criticism, that most consciousness research since the 1990s, perhaps because of bias, has focused on processes of external perception.[33]
From a history of psychology perspective, Julian Jaynes rejected popular but "superficial views of consciousness", especially those which equate it with "that vaguest of terms, experience".[34] In 1976 he insisted that if not for introspection, which for decades had been ignored or taken for granted rather than explained, there could be no "conception of what consciousness is", and in 1990, he reaffirmed the traditional idea of the phenomenon called 'consciousness', writing that "its denotative definition is, as it was for René Descartes, John Locke, and David Hume, what is introspectable". Jaynes saw consciousness as an important but small part of human mentality, and he asserted: "there can be no progress in the science of consciousness until ... what is introspectable [is] sharply distinguished" from the other processes of cognition such as perception, reactive awareness and attention, and automatic forms of learning, problem-solving, and decision-making.
The cognitive science point of view—with an inter-disciplinary perspective involving fields such as psychology, linguistics and anthropology[35]—requires no agreed definition of "consciousness" but studies the interaction of many processes besides perception. For some researchers, consciousness is linked to some kind of "selfhood", for example to certain pragmatic issues such as the feeling of agency and the effects of regret[33] and action on experience of one's own body or social identity.[36] Similarly Daniel Kahneman, who focused on systematic errors in perception, memory and decision-making, has differentiated between two kinds of mental processes, or cognitive "systems":[37] the "fast" activities that are primary, automatic and "cannot be turned off", and the "slow", deliberate, effortful activities of a secondary system "often associated with the subjective experience of agency, choice, and concentration". Kahneman's two systems have been described as "roughly corresponding to unconscious and conscious processes".[38] The two systems can interact, for example in sharing the control of attention. While System 1 can be impulsive, "System 2 is in charge of self-control", and "When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do".
Some have argued that we should eliminate the concept from our understanding of the mind, a position known as consciousness semanticism.[39]
Medical definition
In medicine, a "level of consciousness" terminology is used to describe a patient's arousal and responsiveness, which can be seen as a continuum of states ranging from full alertness and comprehension, through disorientation, delirium, loss of meaningful communication, and finally loss of movement in response to painful stimuli.[40] Issues of practical concern include how the level of consciousness can be assessed in severely ill, comatose, or anesthetized people, and how to treat conditions in which consciousness is impaired or disrupted.[41] The degree or level of consciousness is measured by standardized behavior observation scales such as the Glasgow Coma Scale.
Philosophy of mind
While historically philosophers have defended various views on consciousness, surveys indicate that physicalism is now the dominant position among contemporary philosophers of mind.[42] Overviews of the field typically combine historical perspectives (e.g., Descartes, Locke, Kant) with an organization around the key issues of contemporary debate; an alternative is to focus primarily on current philosophical stances and empirical findings.
Coherence of the concept
Philosophers differ from non-philosophers in their intuitions about what consciousness is.[43] While most people have a strong intuition for the existence of what they refer to as consciousness,[30] skeptics argue that this intuition is too narrow, either because the concept of consciousness is embedded in our intuitions, or because we are all illusions. Gilbert Ryle, for example, argued that the traditional understanding of consciousness depends on a Cartesian dualist outlook that improperly distinguishes between mind and body, or between mind and world. He proposed that we speak not of minds, bodies, and the world, but of entities, or identities, acting in the world. Thus, by speaking of "consciousness" we end up misleading ourselves into thinking that there is some sort of thing, consciousness, separate from behavioral and linguistic understandings.[44]
Types
Ned Block argues that discussions on consciousness often fail to properly distinguish phenomenal consciousness from access consciousness. These terms had been used before Block, but he adopted the short forms P-consciousness and A-consciousness.[45] According to Block:
- P-consciousness is raw experience: it is moving, colored forms, sounds, sensations, emotions and feelings with our bodies and responses at the center. These experiences, considered independently of any impact on behavior, are called qualia.
- A-consciousness is the phenomenon whereby information in our minds is accessible for verbal report, reasoning, and the control of behavior. So, when we perceive, information about what we perceive is access conscious; when we introspect, information about our thoughts is access conscious; when we remember, information about the past is access conscious, and so on.
Block adds that P-consciousness does not allow of easy definition: he admits that he "cannot define P-consciousness in any remotely noncircular way".[45]
Although some philosophers, such as Daniel Dennett, have disputed the validity of this distinction,[46] others have broadly accepted it. David Chalmers has argued that A-consciousness can in principle be understood in mechanistic terms, but that understanding P-consciousness is much more challenging: he calls this the hard problem of consciousness.[47]
Some philosophers believe that Block's two types of consciousness are not the end of the story. William Lycan, for example, argued in his book Consciousness and Experience that at least eight clearly distinct types of consciousness can be identified (organism consciousness; control consciousness; consciousness of; state/event consciousness; reportability; introspective consciousness; subjective consciousness; self-consciousness)—and that even this list omits several more obscure forms.[48]
There is also debate over whether A-consciousness and P-consciousness always coexist or whether they can exist separately. Although P-consciousness without A-consciousness is more widely accepted, there have been some hypothetical examples of A without P. Block, for instance, suggests the case of a "zombie" that is computationally identical to a person but without any subjectivity. However, he remains somewhat skeptical, concluding, "I don't know whether there are any actual cases of A-consciousness without P-consciousness, but I hope I have illustrated their conceptual possibility".[49]
Distinguishing consciousness from its contents
Sam Harris observes: "At the level of your experience, you are not a body of cells, organelles, and atoms; you are consciousness and its ever-changing contents".[50] Seen in this way, consciousness is a subjectively experienced, ever-present field in which things (the contents of consciousness) come and go.
Christopher Tricker argues that this field of consciousness is symbolized by the mythical bird that opens the Daoist classic the Zhuangzi. This bird's name is Of a Flock (peng 鵬), yet its back is countless thousands of miles across and its wings are like clouds arcing across the heavens. "Like Of a Flock, whose wings arc across the heavens, the wings of your consciousness span to the horizon. At the same time, the wings of every other being's consciousness span to the horizon. You are of a flock, one bird among kin."[51]
Mind–body problem
Mental processes (such as consciousness) and physical processes (such as brain events) seem to be correlated; however, the specific nature of the connection is unknown.
The first influential philosopher to discuss this question specifically was Descartes, and the answer he gave is known as mind–body dualism. Descartes proposed that consciousness resides within an immaterial domain he called res cogitans (the realm of thought), in contrast to the domain of material things, which he called res extensa (the realm of extension).[52] He suggested that the interaction between these two domains occurs inside the brain, perhaps in a small midline structure called the pineal gland.[53]
Although it is widely accepted that Descartes explained the problem cogently, few later philosophers have been happy with his solution, and his ideas about the pineal gland have especially been ridiculed.[53] However, no alternative solution has gained general acceptance. Proposed solutions can be divided broadly into two categories: dualist solutions that maintain Descartes's rigid distinction between the realm of consciousness and the realm of matter but give different answers for how the two realms relate to each other; and monist solutions that maintain that there is really only one realm of being, of which consciousness and matter are both aspects. Each of these categories itself contains numerous variants. The two main types of dualism are substance dualism (which holds that the mind is formed of a distinct type of substance not governed by the laws of physics), and property dualism (which holds that the laws of physics are universally valid but cannot be used to explain the mind). The three main types of monism are physicalism (which holds that the mind is made out of matter), idealism (which holds that only thought or experience truly exists, and matter is merely an illusion), and neutral monism (which holds that both mind and matter are aspects of a distinct essence that is itself identical to neither of them). There are also, however, a large number of idiosyncratic theories that cannot cleanly be assigned to any of these schools of thought.[54]
Since the dawn of Newtonian science with its vision of simple mechanical principles governing the entire universe, some philosophers have been tempted by the idea that consciousness could be explained in purely physical terms. The first influential writer to propose such an idea explicitly was Julien Offray de La Mettrie, in his book Man a Machine (L'homme machine). His arguments, however, were very abstract.[55] The most influential modern physical theories of consciousness are based on psychology and neuroscience. Theories proposed by neuroscientists such as Gerald Edelman[56] and Antonio Damasio,[57] and by philosophers such as Daniel Dennett,[58] seek to explain consciousness in terms of neural events occurring within the brain. Many other neuroscientists, such as Christof Koch,[59] have explored the neural basis of consciousness without attempting to frame all-encompassing global theories. At the same time, computer scientists working in the field of artificial intelligence have pursued the goal of creating digital computer programs that can simulate or embody consciousness.[60]
A few theoretical physicists have argued that classical physics is intrinsically incapable of explaining the holistic aspects of consciousness, but that quantum theory may provide the missing ingredients. Several theorists have therefore proposed quantum mind (QM) theories of consciousness.[61] Notable theories falling into this category include the holonomic brain theory of Karl Pribram and David Bohm, and the Orch-OR theory formulated by Stuart Hameroff and Roger Penrose. Some of these QM theories offer descriptions of phenomenal consciousness, as well as QM interpretations of access consciousness. None of the quantum mechanical theories have been confirmed by experiment. Recent publications by G. Guerreshi, J. Cia, S. Popescu, and H. Briegel[62] could falsify proposals such as those of Hameroff, which rely on quantum entanglement in protein. At the present time, many scientists and philosophers consider the arguments for an important role of quantum phenomena to be unconvincing.[63] Empirical evidence also weighs against the notion of quantum consciousness: an experiment on wave function collapse led by Catalina Curceanu in 2022 suggests that quantum consciousness, as proposed by Roger Penrose and Stuart Hameroff, is highly implausible.[64]
Apart from the general question of the "hard problem" of consciousness (which is, roughly speaking, the question of how mental experience can arise from a physical basis[65]), a more specialized question is how to square the subjective notion that we are in control of our decisions (at least in some small measure) with the customary view of causality that subsequent events are caused by prior events. The topic of free will is the philosophical and scientific examination of this conundrum.
Problem of other minds
Many philosophers consider experience to be the essence of consciousness, and believe that experience can only fully be known from the inside, subjectively. The problem of other minds is a philosophical problem traditionally stated as the following epistemological question: Given that I can only observe the behavior of others, how can I know that others have minds?[66] The problem of other minds is particularly acute for people who believe in the possibility of philosophical zombies, that is, people who think it is possible in principle to have an entity that is physically indistinguishable from a human being and behaves like a human being in every way but nevertheless lacks consciousness.[67] Related issues have also been studied extensively by Greg Littmann of the University of Illinois,[68] and by Colin Allen (a professor at the University of Pittsburgh) regarding the literature and research studying artificial intelligence in androids.[69]
The most commonly given answer is that we attribute consciousness to other people because we see that they resemble us in appearance and behavior; we reason that if they look like us and act like us, they must be like us in other ways, including having experiences of the sort that we do.[70] There are, however, a variety of problems with that explanation. For one thing, it seems to violate the principle of parsimony, by postulating an invisible entity that is not necessary to explain what we observe.[70] Some philosophers, such as Daniel Dennett in a research paper titled "The Unimagined Preposterousness of Zombies", argue that people who give this explanation do not really understand what they are saying.[71] More broadly, philosophers who do not accept the possibility of zombies generally believe that consciousness is reflected in behavior (including verbal behavior), and that we attribute consciousness on the basis of behavior. A more straightforward way of saying this is that we attribute experiences to people because of what they can do, including the fact that they can tell us about their experiences.[72]
Qualia
The term "qualia" was introduced in philosophical literature by C. I. Lewis. The word is derived from Latin and means "of what sort". It is basically a quantity or property of something as perceived or experienced by an individual, like the scent of rose, the taste of wine, or the pain of a headache. They are difficult to articulate or describe. The philosopher and scientist Daniel Dennett describes them as "the way things seem to us", while philosopher and cognitive scientist David Chalmers expanded on qualia as the "hard problem of consciousness" in the 1990s. When qualia are experienced, activity is simulated in the brain, and these processes are called neural correlates of consciousness (NCCs). Many scientific studies have been done to attempt to link particular brain regions with emotions or experiences.[73][74][75]
Species which experience qualia are said to have sentience, which is central to the animal rights movement, because it includes the ability to experience pain and suffering.[73]
Identity
An unsolved problem in the philosophy of consciousness is how it relates to the nature of personal identity.[76] This includes questions regarding whether someone is the "same person" from moment to moment. If that is the case, another question is what exactly the "identity carrier" is that makes a conscious being "the same" being from one moment to the next. The problem of determining personal identity also includes questions such as Benj Hellie's vertiginous question, which can be summarized as "Why am I me and not someone else?".[77] The philosophical problems regarding the nature of personal identity have been extensively discussed by Thomas Nagel in his book The View from Nowhere.
A common view of personal identity is that an individual has a continuous identity that persists from moment to moment, forming a line segment that stretches across time from birth to death. In the case of an afterlife as described in Abrahamic religions, one's personal identity is believed to stretch infinitely into the future, forming a ray or line. This notion of identity is similar to the form of dualism advocated by René Descartes. However, some philosophers argue that this common notion of personal identity is unfounded. Daniel Kolak has argued extensively against it in his book I am You.[78] Kolak refers to the aforementioned notion of personal identity being linear as "Closed individualism". Another view of personal identity according to Kolak is "Empty individualism", in which one's personal identity only exists for a single moment of time. However, Kolak advocates for a view of personal identity called Open individualism, in which all consciousness is in reality a single being and individual personal identity does not really exist at all. Another philosopher who has contested the notion of personal identity is Derek Parfit. In his book Reasons and Persons,[79] he describes a thought experiment known as the teletransportation paradox. In Buddhist philosophy, the concept of anattā refers to the idea that the self is an illusion.
Other philosophers have argued that Hellie's vertiginous question has a number of philosophical implications relating to the metaphysical nature of consciousness. Christian List argues that the vertiginous question and the existence of first-personal facts is evidence against physicalism, and evidence against other third-personal metaphysical pictures, including standard versions of dualism.[80] List also argues that the vertiginous question implies a "quadrilemma" for theories of consciousness. He claims that at most three of the following metaphysical claims can be true: 'first-person realism', 'non-solipsism', 'non-fragmentation', and 'one world'—and at least one of these four must be false.[81] List has proposed a model he calls the "many-worlds theory of consciousness" in order to reconcile the subjective nature of consciousness without lapsing into solipsism.[82] Vincent Conitzer argues that the nature of identity is connected to A series and B series theories of time, and that A-theory being true implies that the "I" is metaphysically distinguished from other perspectives.[83] Other philosophical theories regarding the metaphysical nature of self are Caspar Hare's theories of perspectival realism,[84] in which things within perceptual awareness have a defining intrinsic property that exists absolutely and not relative to anything, and egocentric presentism, in which the experiences of other individuals are not present in the way that one's current perspective is.[85][86]
Scientific study
For many decades, consciousness as a research topic was avoided by the majority of mainstream scientists, because of a general feeling that a phenomenon defined in subjective terms could not properly be studied using objective experimental methods.[87] In 1975 George Mandler published an influential psychological study which distinguished between slow, serial, and limited conscious processes and fast, parallel and extensive unconscious ones.[88] The Science and Religion Forum[89] 1984 annual conference, 'From Artificial Intelligence to Human Consciousness' identified the nature of consciousness as a matter for investigation; Donald Michie was a keynote speaker. Starting in the 1980s, an expanding community of neuroscientists and psychologists have associated themselves with a field called Consciousness Studies, giving rise to a stream of experimental work published in books,[90] journals such as Consciousness and Cognition, Frontiers in Consciousness Research, Psyche, and the Journal of Consciousness Studies, along with regular conferences organized by groups such as the Association for the Scientific Study of Consciousness[91] and the Society for Consciousness Studies.
Modern medical and psychological investigations into consciousness are based on psychological experiments (including, for example, the investigation of priming effects using subliminal stimuli),[92] and on case studies of alterations in consciousness produced by trauma, illness, or drugs. Broadly viewed, scientific approaches are based on two core concepts. The first identifies the content of consciousness with the experiences that are reported by human subjects; the second makes use of the concept of consciousness that has been developed by neurologists and other medical professionals who deal with patients whose behavior is impaired. In either case, the ultimate goals are to develop techniques for assessing consciousness objectively in humans as well as other animals, and to understand the neural and psychological mechanisms that underlie it.[59]
Measurement via verbal report
Experimental research on consciousness presents special difficulties, due to the lack of a universally accepted operational definition. In the majority of experiments that are specifically about consciousness, the subjects are human, and the criterion used is verbal report: in other words, subjects are asked to describe their experiences, and their descriptions are treated as observations of the contents of consciousness.[93]
For example, subjects who stare continuously at a Necker cube usually report that they experience it "flipping" between two 3D configurations, even though the stimulus itself remains the same.[94] The objective is to understand the relationship between the conscious awareness of stimuli (as indicated by verbal report) and the effects the stimuli have on brain activity and behavior. In several paradigms, such as the technique of response priming, the behavior of subjects is clearly influenced by stimuli for which they report no awareness, and suitable experimental manipulations can lead to increasing priming effects despite decreasing prime identification (double dissociation).[95]
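To make the arithmetic behind such priming studies concrete, the following Python sketch computes the two quantities that are typically compared: the priming effect on reaction times and the rate at which subjects can identify the masked prime in a forced-choice test. The numbers are entirely synthetic and illustrate the computation only, not any particular study.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reaction times (ms) for one masking condition.
rt_congruent = rng.normal(450, 40, size=200)      # prime and target agree
rt_incongruent = rng.normal(480, 40, size=200)    # prime and target conflict
priming_effect = rt_incongruent.mean() - rt_congruent.mean()

# Hypothetical forced-choice prime identification (1 = correct, 0 = incorrect).
identification = rng.binomial(1, 0.52, size=200)  # barely above the 50% chance level
identification_rate = identification.mean()

# A large priming effect combined with near-chance identification is the kind
# of pattern reported as a dissociation between processing and awareness.
print(f"priming effect: {priming_effect:.1f} ms")
print(f"prime identification: {identification_rate:.1%} correct (chance = 50%)")
</syntaxhighlight>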
Verbal report is widely considered to be the most reliable indicator of consciousness, but it raises a number of issues.[96] For one thing, if verbal reports are treated as observations, akin to observations in other branches of science, then the possibility arises that they may contain errors—but it is difficult to make sense of the idea that subjects could be wrong about their own experiences, and even more difficult to see how such an error could be detected.[97] Daniel Dennett has argued for an approach he calls heterophenomenology, which means treating verbal reports as stories that may or may not be true, but his ideas about how to do this have not been widely adopted.[98] Another issue with verbal report as a criterion is that it restricts the field of study to humans who have language: this approach cannot be used to study consciousness in other species, pre-linguistic children, or people with types of brain damage that impair language. As a third issue, philosophers who dispute the validity of the Turing test may feel that it is possible, at least in principle, for verbal report to be dissociated from consciousness entirely: a philosophical zombie may give detailed verbal reports of awareness in the absence of any genuine awareness.[99]
Although verbal report is in practice the "gold standard" for ascribing consciousness, it is not the only possible criterion.[96] In medicine, consciousness is assessed as a combination of verbal behavior, arousal, brain activity, and purposeful movement. The last three of these can be used as indicators of consciousness when verbal behavior is absent.[100][101] The scientific literature regarding the neural bases of arousal and purposeful movement is very extensive. Their reliability as indicators of consciousness is disputed, however, due to numerous studies showing that alert human subjects can be induced to behave purposefully in a variety of ways in spite of reporting a complete lack of awareness.[95] Studies related to the neuroscience of free will have also shown that the influence consciousness has on decision-making is not always straightforward.[102]
Mirror test and contingency awareness
Another approach applies specifically to the study of self-awareness, that is, the ability to distinguish oneself from others. In the 1970s Gordon Gallup developed an operational test for self-awareness, known as the mirror test. The test examines whether animals are able to differentiate between seeing themselves in a mirror versus seeing other animals. The classic example involves placing a spot of coloring on the skin or fur near the individual's forehead and seeing if they attempt to remove it or at least touch the spot, thus indicating that they recognize that the individual they are seeing in the mirror is themselves.[103] Humans (older than 18 months) and other great apes, bottlenose dolphins, orcas, pigeons, European magpies and elephants have all been observed to pass this test.[104] Some other animals, such as pigs, have been shown to use a mirror to find food.[105]
Contingency awareness is another such approach: broadly, the conscious understanding of one's actions and their effects on one's environment.[106] It is recognized as a factor in self-recognition. The brain processes underlying contingency awareness and learning are believed to depend on an intact medial temporal lobe and on age. A study done in 2020 involving transcranial direct current stimulation, magnetic resonance imaging (MRI) and eyeblink classical conditioning supported the idea that the parietal cortex serves as a substrate for contingency awareness and that age-related disruption of this region is sufficient to impair awareness.[107]
Neural correlates
A major part of the scientific literature on consciousness consists of studies that examine the relationship between the experiences reported by subjects and the activity that simultaneously takes place in their brains—that is, studies of the neural correlates of consciousness. The hope is to find activity in a particular part of the brain, or a particular pattern of global brain activity, that is strongly predictive of conscious awareness. Several brain imaging techniques, such as EEG and fMRI, have been used to obtain physical measures of brain activity in these studies.[108]
Another idea that has drawn attention for several decades is that consciousness is associated with high-frequency (gamma band) oscillations in brain activity. This idea arose from proposals in the 1980s, by Christof von der Malsburg and Wolf Singer, that gamma oscillations could solve the so-called binding problem, by linking information represented in different parts of the brain into a unified experience.[109] Rodolfo Llinás, for example, proposed that consciousness results from recurrent thalamo-cortical resonance where the specific thalamocortical systems (content) and the non-specific (centromedial thalamus) thalamocortical systems (context) interact in the gamma band frequency via synchronous oscillations.[110]
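As a simplified illustration of what "gamma-band activity" means in practice, the sketch below band-pass filters a synthetic signal to roughly 30–80 Hz and computes its mean power. Real EEG/MEG analyses involve far more careful preprocessing, artifact rejection, and statistics; this only shows the basic operation.
<syntaxhighlight lang="python">
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000                                    # sampling rate (Hz)
t = np.arange(0, 2.0, 1 / fs)                # two seconds of synthetic "data"
signal = (np.sin(2 * np.pi * 10 * t)                             # alpha-like 10 Hz component
          + 0.3 * np.sin(2 * np.pi * 40 * t)                     # gamma-like 40 Hz component
          + 0.5 * np.random.default_rng(1).normal(size=t.size))  # broadband noise

b, a = butter(4, [30, 80], btype="bandpass", fs=fs)  # 4th-order Butterworth band-pass
gamma = filtfilt(b, a, signal)               # zero-phase filtering keeps timing intact

gamma_power = np.mean(gamma ** 2)            # mean squared amplitude within the band
print(f"gamma-band power: {gamma_power:.4f}")
</syntaxhighlight>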
A number of studies have shown that activity in primary sensory areas of the brain is not sufficient to produce consciousness: it is possible for subjects to report a lack of awareness even when areas such as the primary visual cortex (V1) show clear electrical responses to a stimulus.[111] Higher brain areas are seen as more promising, especially the prefrontal cortex, which is involved in a range of higher cognitive functions collectively known as executive functions.[112] There is substantial evidence that a "top-down" flow of neural activity (i.e., activity propagating from the frontal cortex to sensory areas) is more predictive of conscious awareness than a "bottom-up" flow of activity.[113] The prefrontal cortex is not the only candidate area, however: studies by Nikos Logothetis and his colleagues have shown, for example, that visually responsive neurons in parts of the temporal lobe reflect the visual perception in the situation when conflicting visual images are presented to different eyes (i.e., bistable percepts during binocular rivalry).[114] Furthermore, top-down feedback from higher to lower visual brain areas may be weaker or absent in the peripheral visual field, as suggested by some experimental data and theoretical arguments;[115] nevertheless humans can perceive visual inputs in the peripheral visual field arising from bottom-up V1 neural activities.[115][116] Meanwhile, bottom-up V1 activities for the central visual fields can be vetoed, and thus made invisible to perception, by the top-down feedback, when these bottom-up signals are inconsistent with the brain's internal model of the visual world.[115][116]
Modulation of neural responses may correlate with phenomenal experiences. In contrast to the raw electrical responses, which do not correlate with consciousness, the modulation of these responses by other stimuli correlates surprisingly well with an important aspect of consciousness: namely, the phenomenal experience of stimulus intensity (brightness, contrast). In the research group of Danko Nikolić it was shown that some of the changes in subjectively perceived brightness correlated with the modulation of firing rates, while others correlated with the modulation of neural synchrony.[117] An fMRI investigation suggested that these findings were strictly limited to the primary visual areas.[118] This indicates that, in the primary visual areas, changes in firing rates and synchrony can be considered neural correlates of qualia, at least for some types of qualia.
In 2013, the perturbational complexity index (PCI) was proposed: a measure of the algorithmic complexity of the electrophysiological response of the cortex to transcranial magnetic stimulation. This measure was shown to be higher in individuals who are awake, in REM sleep, or in a locked-in state than in those who are in deep sleep or in a vegetative state,[119] making it potentially useful as a quantitative assessment of states of consciousness.
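The PCI is, in essence, a normalized measure of how incompressible the binarized spatiotemporal response to the stimulation pulse is. The sketch below is a simplified, hypothetical proxy (it uses generic zlib compression and a crude entropy normalization rather than the published PCI pipeline, which involves source modeling and statistical thresholding), but it conveys why a differentiated, wake-like response scores higher than a stereotyped one.

```python
import zlib
import numpy as np

def complexity_index(binary_response):
    """Compression-based proxy for the algorithmic complexity of a binarized
    spatiotemporal response (channels x time samples), crudely normalized by entropy."""
    bits = binary_response.astype(np.uint8).flatten()
    compressed_bytes = len(zlib.compress(bits.tobytes(), 9))
    p1 = bits.mean()
    if p1 in (0.0, 1.0):                     # a flat response carries no information
        return 0.0
    entropy = -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))
    return compressed_bytes / (entropy * bits.size / 8)

rng = np.random.default_rng(0)
wake_like = rng.integers(0, 2, size=(60, 300))                    # spatially differentiated response
slow_wave_like = np.tile(rng.integers(0, 2, size=(60, 1)), 300)   # stereotyped, repetitive response
print(complexity_index(wake_like) > complexity_index(slow_wave_like))  # True
```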
Assuming that not only humans but even some non-mammalian species are conscious, a number of evolutionary approaches to the problem of neural correlates of consciousness open up. For example, assuming that birds are conscious (a common assumption among neuroscientists and ethologists, given the extensive cognitive repertoire of birds), there are comparative neuroanatomical ways to validate some of the principal, currently competing, mammalian consciousness–brain theories. The rationale for such a comparative study is that the avian brain deviates structurally from the mammalian brain, raising the questions of how similar the two are and what homologs can be identified. The general conclusion from the study by Butler et al.[120] is that some of the major theories for the mammalian brain[121][122][123] also appear to be valid for the avian brain: the structures assumed to be critical for consciousness in mammalian brains have homologous counterparts in avian brains. Thus the main portions of the theories of Crick and Koch,[121] Edelman and Tononi,[122] and Cotterill[123] seem to be compatible with the assumption that birds are conscious. Edelman also differentiates between what he calls primary consciousness (a trait shared by humans and non-human animals) and higher-order consciousness, which appears in humans alone along with human language capacity.[122] Certain aspects of the three theories, however, seem less easy to apply to the hypothesis of avian consciousness. For instance, the suggestion by Crick and Koch that layer 5 neurons of the mammalian brain have a special role seems difficult to apply to the avian brain, since the avian homologs have a different morphology. Likewise, the theory of Eccles[124][125] seems incompatible, since a structural homolog/analogue to the dendron has not been found in avian brains. The assumption of avian consciousness also brings the reptilian brain into focus, because of the structural continuity between avian and reptilian brains; this means that the phylogenetic origin of consciousness may be earlier than suggested by many leading neuroscientists.
Joaquin Fuster of UCLA has argued that the prefrontal cortex, along with Wernicke's and Broca's areas, is of particular importance for the development of the human language capacities that are neuroanatomically necessary for the emergence of higher-order consciousness in humans.[126]
A study in 2016 looked at lesions in specific areas of the brainstem that were associated with coma and vegetative states. A small region of the rostral dorsolateral pontine tegmentum in the brainstem was suggested to drive consciousness through functional connectivity with two cortical regions, the left ventral anterior insular cortex, and the pregenual anterior cingulate cortex. These three regions may work together as a triad to maintain consciousness.[127]
Krista and Tatiana Hogan have a unique thalamic connection that may provide insight into the philosophical and neurological foundations of consciousness. It has been argued that there is no empirical test that can conclusively establish whether, for some sensations, the twins share one token experience rather than two exactly matching token experiences. Yet background considerations about the way the brain assigns specific locations to conscious contents, combined with the evident overlapping pathways in the twins' brains, arguably imply that the twins share some conscious experiences. If this is true, then the twins may offer a proof of concept for how experiences in general could be shared between brains.[128][129][130]
Academic definitions of consciousness
Clear definitions of consciousness in academic literature are rare; David Chalmers famously framed the difficulty of explaining it as the hard problem of consciousness. Academic definitions do exist, however, including those of Tononi's integrated information theory, Craig McKenzie, and Cleeremans and Jiménez, the last being a definition of learning with remarkable similarity to both Tononi's and McKenzie's definitions. Both Bernard Baars and Igor Aleksander have worked out the aspects necessary for consciousness.
Tononi's definition is as follows:[131]
according to Integrated information theory (IIT), consciousness requires a grouping of elements within a system that have physical cause-effect power upon one another. This in turn implies that only reentrant architecture consisting of feedback loops, whether neural or computational, will realize consciousness.
McKenzie's definition begins:[132]
Consciousness is the capacity to generate desires and decisions about perceived or imagined realities by distinguishing self from non-self through the use of perception, memory and imagination. ...
According to Axel Cleeremans and Luis Jiménez, learning is defined as:[133]
a set of phylogenetically advanced adaptation processes that critically depend on an evolved sensitivity to subjective experience so as to enable agents to afford flexible control over their actions in complex, unpredictable environments.
This definition is notable for its similarity to the theater analogy of global workspace theory (GWT).
Models
A wide range of empirical theories of consciousness have been proposed.[134][135][136] Adrian Doerig and colleagues list 13 notable theories,[136] while Anil Seth and Tim Bayne list 22 notable theories.[135]
Global workspace theory
Global workspace theory (GWT) is a cognitive architecture and theory of consciousness proposed by the cognitive psychologist Bernard Baars in 1988. Baars explains the theory with the metaphor of a theater, with conscious processes represented by an illuminated stage. This theater integrates inputs from a variety of unconscious and otherwise autonomous networks in the brain and then broadcasts them to unconscious networks (represented in the metaphor by a broad, unlit "audience"). The theory has since been expanded upon by other scientists including cognitive neuroscientist Stanislas Dehaene and Lionel Naccache.[137][138] See also the Dehaene–Changeux model.
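The architecture can be pictured as a competition-and-broadcast loop. The Python sketch below is a minimal, hypothetical illustration of that loop (the class names, salience values, and winner-take-all rule are invented for the example and are not drawn from Baars's or Dehaene's models): specialized processors submit content with a salience score, the winning content occupies the workspace, and it is then broadcast back to every processor.

```python
from dataclasses import dataclass, field

@dataclass
class Process:
    """A specialized, normally unconscious processor."""
    name: str
    received: list = field(default_factory=list)

    def on_broadcast(self, content):
        self.received.append(content)            # every process hears the broadcast

@dataclass
class GlobalWorkspace:
    processes: list

    def cycle(self, submissions):
        """submissions: {process name: (salience, content)}; the most salient wins."""
        winner_name, (_, content) = max(submissions.items(), key=lambda kv: kv[1][0])
        for p in self.processes:                 # broadcast the winning content to all processes
            p.on_broadcast((winner_name, content))
        return winner_name, content

procs = [Process("vision"), Process("audition"), Process("memory")]
workspace = GlobalWorkspace(procs)
print(workspace.cycle({"vision": (0.9, "red ball"), "audition": (0.4, "hum")}))
# -> ('vision', 'red ball'); all three processes now hold that broadcast content
```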
Integrated information theory
Integrated information theory (IIT), pioneered by neuroscientist Giulio Tononi in 2004, postulates that consciousness resides in the information being processed and arises once the information reaches a certain level of complexity. IIT proposes a 1:1 mapping between conscious states and precise, formal mathematical descriptions of those mental states. Proponents of this model suggest that it may provide a physical grounding for consciousness in neurons, as they provide the mechanism by which information is integrated. This also relates to the "hard problem of consciousness" proposed by David Chalmers.[139][73] In 2023, 124 scholars signed a letter saying that IIT gets disproportionate media attention relative to its supporting empirical evidence, and called it "pseudoscience", arguing that its core assumptions are not adequately testable. This led to academic debate, as some other researchers objected to the "pseudoscience" characterization.[140]
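As a rough intuition pump only (this is not Tononi's Φ, which is defined over a system's cause-effect structure and requires a search over partitions), the notion of integration can be illustrated by comparing the information in a whole system with the information in its parts taken separately. The Python sketch below computes the total correlation of a small joint distribution: it is zero when the parts are independent and grows as their states become interdependent.

```python
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def total_correlation(joint):
    """Sum of marginal entropies minus joint entropy (multi-information).
    Zero iff the variables are independent; larger means more 'integration'."""
    marginals = [joint.sum(axis=tuple(j for j in range(joint.ndim) if j != i))
                 for i in range(joint.ndim)]
    return sum(entropy(m) for m in marginals) - entropy(joint.flatten())

independent = np.outer([0.5, 0.5], [0.5, 0.5])    # two independent fair bits
coupled = np.array([[0.5, 0.0], [0.0, 0.5]])      # two perfectly correlated bits
print(total_correlation(independent))   # 0.0
print(total_correlation(coupled))       # 1.0 bit
```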
Orchestrated objective reduction
Orchestrated objective reduction (Orch-OR), or the quantum theory of mind, was proposed by scientists Roger Penrose and Stuart Hameroff, and states that consciousness originates at the quantum level inside neurons. The mechanism is held to be a quantum process called objective reduction that is orchestrated by cellular structures called microtubules, which form the cytoskeleton around which the brain is built. The duo proposed that these quantum processes accounted for creativity, innovation, and problem-solving abilities. Penrose published his views in the book The Emperor's New Mind. In 2014, the discovery of quantum vibrations inside microtubules gave new life to the argument.[73]
However, scientists and philosophers have criticized Penrose's interpretation of Gödel's theorem and his conclusion that quantum phenomena play a role in human cognition.[141]
Attention schema theory
In 2011, Michael Graziano and Sabine Kastner[142] proposed the "attention schema" theory of awareness. Graziano went on to publish an expanded discussion of this theory in his book Consciousness and the Social Brain.[143] In that theory, specific cortical areas, notably in the superior temporal sulcus and the temporo-parietal junction, are used to build the construct of awareness and attribute it to other people. The same cortical machinery is also used to attribute awareness to oneself. Damage to these cortical regions can lead to deficits in consciousness such as hemispatial neglect. In the attention schema theory, the value of explaining the feature of awareness and attributing it to a person is to gain a useful predictive model of that person's attentional processing. Attention is a style of information processing in which a brain focuses its resources on a limited set of interrelated signals. Awareness, in this theory, is a useful, simplified schema that represents attentional states. To be aware of X is explained by constructing a model of one's attentional focus on X.
Entropic brain theory
The entropic brain is a theory of conscious states informed by neuroimaging research with psychedelic drugs. The theory suggests that the brain in primary states such as rapid eye movement (REM) sleep, early psychosis and under the influence of psychedelic drugs, is in a disordered state; normal waking consciousness constrains some of this freedom and makes possible metacognitive functions such as internal self-administered reality testing and self-awareness.[144][145][146][147] Criticism has included questioning whether the theory has been adequately tested.[148]
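The "entropy" at issue can be read, in its simplest form, as Shannon entropy over the repertoire of brain states the system visits: a brain that spreads its time across many states scores higher than one confined to a few. The Python sketch below applies that measure to hypothetical state-occupancy counts; it is a schematic illustration, not a reproduction of the neuroimaging analyses cited above.

```python
import numpy as np

def state_entropy(counts):
    """Shannon entropy (bits) of how often each coarse-grained brain state is visited."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical occupancy counts over eight coarse-grained brain states.
ordinary_waking = [40, 30, 10, 8, 5, 4, 2, 1]        # dominated by a few states
primary_state   = [14, 13, 13, 12, 12, 12, 12, 12]   # time spread more evenly
print(state_entropy(ordinary_waking) < state_entropy(primary_state))  # True
```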
Projective consciousness model
In 2017, work by David Rudrauf and colleagues, including Karl Friston, applied the active inference paradigm to consciousness, leading to the projective consciousness model (PCM), a model of how sensory data is integrated with priors in a process of projective transformation. The authors argue that, while their model identifies a key relationship between computation and phenomenology, it does not completely solve the hard problem of consciousness or completely close the explanatory gap.[149]
The claustrum as the conductor of consciousness
In 2004, molecular biologist Francis Crick (co-discoverer of the double helix) proposed that binding together an individual's experience requires something akin to the conductor of an orchestra. Together with neuroscientist Christof Koch, he proposed that this conductor would have to collate information rapidly from various regions of the brain, and the two reckoned that the claustrum was well suited for the task. However, Crick died while working on the idea.[73]
The proposal is backed by a study done in 2014, where a team at the George Washington University induced unconsciousness in a 54-year-old woman suffering from intractable epilepsy by stimulating her claustrum. The woman underwent depth electrode implantation and electrical stimulation mapping. The electrode between the left claustrum and anterior-dorsal insula was the one which induced unconsciousness. Correlation for interactions affecting medial parietal and posterior frontal channels during stimulation increased significantly as well. Their findings suggested that the left claustrum or anterior insula is an important part of a network that subserves consciousness, and that disruption of consciousness is related to increased EEG signal synchrony within frontal-parietal networks. However, this remains an isolated, hence inconclusive study.[73][150]
A study published in 2022 opposed the idea that the claustrum is the seat of consciousness, concluding instead that it acts more like a "router" that relays commands and information across the brain.[151][152] The study showed that when the claustrum was disabled, complex tasks could no longer be performed.
Biological function and evolution
The emergence of consciousness during biological evolution, and its survival value, remain topics of ongoing scientific inquiry. While consciousness appears to play a crucial role in human cognition, decision-making, and self-awareness, its adaptive significance across different species remains a subject of debate.
Some writers question whether consciousness has any survival value, arguing that it is a by-product of evolution. Thomas Henry Huxley, for example, defends an epiphenomenalist theory of consciousness in an essay titled "On the Hypothesis that Animals are Automata, and its History", according to which consciousness is a causally inert effect of neural activity, "as the steam-whistle which accompanies the work of a locomotive engine is without influence upon its machinery".[153] William James objects to this in his essay "Are We Automata?", offering an evolutionary argument for mind–brain interaction: if the preservation and development of consciousness in biological evolution is a result of natural selection, it is plausible that consciousness has not only been influenced by neural processes but has had a survival value itself, and it could only have had this if it had been efficacious.[154][155] Karl Popper develops a similar evolutionary argument in the book The Self and Its Brain.[156]
Opinions are divided on when and how consciousness first arose. It has been argued that consciousness emerged (i) exclusively with the first humans, (ii) exclusively with the first mammals, (iii) independently in mammals and birds, or (iv) with the first reptiles.[157] Other authors date the origins of consciousness to the first animals with nervous systems or early vertebrates in the Cambrian, over 500 million years ago.[158] In his book Animal Minds, Donald Griffin suggests a gradual evolution of consciousness.[159] Further exploration of the origins of consciousness, particularly in molluscs, has been done by Peter Godfrey-Smith in his book Metazoa.[160]
Regarding the primary function of conscious processing, a recurring idea in recent theories is that phenomenal states somehow integrate neural activities and information-processing that would otherwise be independent.[161] This has been called the integration consensus. Another example is Gerald Edelman's dynamic core hypothesis, which puts emphasis on reentrant connections that reciprocally link areas of the brain in a massively parallel manner.[162] Edelman also stresses the importance of the evolutionary emergence of higher-order consciousness in humans from the historically older trait of primary consciousness which humans share with non-human animals (see Neural correlates section above). These theories of integrative function present solutions to two classic problems associated with consciousness: differentiation and unity. They show how our conscious experience can discriminate between a virtually unlimited number of different possible scenes and details (differentiation) because it integrates those details from our sensory systems, while the integrative nature of consciousness in this view easily explains how our experience can seem unified as one whole despite all of these individual parts. It remains unspecified, however, which kinds of information are integrated in a conscious manner and which kinds can be integrated without consciousness, what specific causal role conscious integration plays, and why the same functionality cannot be achieved without consciousness. Not all kinds of information are capable of being disseminated consciously (e.g., neural activity related to vegetative functions, reflexes, unconscious motor programs, low-level perceptual analyses, etc.), and many kinds of information can be disseminated and combined with other kinds without consciousness, as in intersensory interactions such as the ventriloquism effect.[163] Hence it remains unclear why any of it is conscious. For a review of the differences between conscious and unconscious integrations, see the article by Ezequiel Morsella.[163]
As noted earlier, even among writers who consider consciousness to be well-defined, there is widespread dispute about which animals other than humans can be said to possess it.[164] Edelman has described this distinction as that of humans possessing higher-order consciousness while sharing the trait of primary consciousness with non-human animals (see previous paragraph). Any examination of the evolution of consciousness therefore faces great difficulties. Nevertheless, some writers have argued that consciousness can be viewed from the standpoint of evolutionary biology as an adaptation, in the sense of a trait that increases fitness.[165] In his article "Evolution of consciousness", John Eccles argued that special anatomical and physical properties of the mammalian cerebral cortex gave rise to consciousness ("[a] psychon ... linked to [a] dendron through quantum physics").[166] Bernard Baars proposed that once in place, this "recursive" circuitry may have provided a basis for the subsequent development of many of the functions that consciousness facilitates in higher organisms.[167] Peter Carruthers has suggested one potential adaptive advantage gained by conscious creatures: consciousness allows an individual to make distinctions between appearance and reality.[168] This ability would enable a creature to recognize the likelihood that its perceptions are deceiving it (e.g. that water in the distance may be a mirage) and behave accordingly; it could also facilitate the manipulation of others by recognizing how things appear to them, for both cooperative and devious ends.
Other philosophers, however, have suggested that consciousness would not be necessary for any functional advantage in evolutionary processes.[169][170] No one has given a causal explanation, they argue, of why it would not be possible for a functionally equivalent non-conscious organism (i.e., a philosophical zombie) to achieve the very same survival advantages as a conscious organism. If evolutionary processes are blind to the difference between function F being performed by conscious organism O and non-conscious organism O*, it is unclear what adaptive advantage consciousness could provide.[171] As a result, an exaptive explanation of consciousness has gained favor with some theorists, who posit that consciousness did not evolve as an adaptation but was an exaptation arising as a consequence of other developments, such as increases in brain size or cortical rearrangement.[158] Consciousness in this sense has been compared to the blind spot in the retina, which is not an adaptation of the retina but merely a by-product of the way the retinal axons are wired.[172] Several scholars, including Pinker, Chomsky, Edelman, and Luria, have indicated the importance of the emergence of human language as an important regulative mechanism of learning and memory in the context of the development of higher-order consciousness (see Neural correlates section above).
Altered states
There are some brain states in which consciousness seems to be absent, including dreamless sleep or coma. There are also a variety of circumstances that can change the relationship between the mind and the world in less drastic ways, producing what are known as altered states of consciousness. Some altered states occur naturally; others can be produced by drugs or brain damage.[173] Altered states can be accompanied by changes in thinking, disturbances in the sense of time, feelings of loss of control, changes in emotional expression, alterations in body image and changes in meaning or significance.[174]
The two most widely accepted altered states are sleep and dreaming. Although dream sleep and non-dream sleep appear very similar to an outside observer, each is associated with a distinct pattern of brain activity, metabolic activity, and eye movement; each is also associated with a distinct pattern of experience and cognition. During ordinary non-dream sleep, people who are awakened report only vague and sketchy thoughts, and their experiences do not cohere into a continuous narrative. During dream sleep, in contrast, people who are awakened report rich and detailed experiences in which events form a continuous progression, which may however be interrupted by bizarre or fantastic intrusions.[175] Thought processes during the dream state frequently show a high level of irrationality. Both dream and non-dream states are associated with severe disruption of memory: it usually disappears in seconds during the non-dream state, and in minutes after awakening from a dream unless actively refreshed.[176]
Research on the effects of partial epileptic seizures on consciousness found that patients who have such seizures experience altered states of consciousness.[177][178] In partial epileptic seizures, consciousness is impaired or lost while some aspects of consciousness, often automated behaviors, remain intact. Studies measuring the qualitative features of partial epileptic seizures found that patients exhibited an increase in arousal and became absorbed in the experience of the seizure, followed by difficulty in focusing and shifting attention.
A variety of psychoactive drugs, including alcohol, have notable effects on consciousness.[179] These range from a simple dulling of awareness produced by sedatives, to increases in the intensity of sensory qualities produced by stimulants, cannabis, empathogens–entactogens such as MDMA ("Ecstasy"), or most notably by the class of drugs known as psychedelics.[173] LSD, mescaline, psilocybin, dimethyltryptamine, and others in this group can produce major distortions of perception, including hallucinations; some users even describe their drug-induced experiences as mystical or spiritual in quality. The brain mechanisms underlying these effects are not as well understood as those induced by use of alcohol,[179] but there is substantial evidence that alterations in the brain system that uses the chemical neurotransmitter serotonin play an essential role.[180]
There has been some research into physiological changes in yogis and people who practice various techniques of meditation. Some research on brain waves during meditation has reported differences between those corresponding to ordinary relaxation and those corresponding to meditation. It has been disputed, however, whether there is enough evidence to count these as physiologically distinct states of consciousness.[181]
The most extensive study of the characteristics of altered states of consciousness was made by psychologist Charles Tart in the 1960s and 1970s. Tart analyzed a state of consciousness as made up of a number of component processes, including exteroception (sensing the external world); interoception (sensing the body); input-processing (seeing meaning); emotions; memory; time sense; sense of identity; evaluation and cognitive processing; motor output; and interaction with the environment.[182] Each of these, in his view, could be altered in multiple ways by drugs or other manipulations. The components that Tart identified have not, however, been validated by empirical studies. Research in this area has not yet reached firm conclusions, but a recent questionnaire-based study identified eleven significant factors contributing to drug-induced states of consciousness: experience of unity; spiritual experience; blissful state; insightfulness; disembodiment; impaired control and cognition; anxiety; complex imagery; elementary imagery; audio-visual synesthesia; and changed meaning of percepts.[183]
Medical aspects
The medical approach to consciousness is scientifically oriented. It derives from a need to treat people whose brain function has been impaired as a result of disease, brain damage, toxins, or drugs. In medicine, conceptual distinctions are considered useful to the degree that they can help to guide treatments. The medical approach mainly focuses on the amount of consciousness a person has: in medicine, consciousness is assessed as a "level" ranging from coma and brain death at the low end, to full alertness and purposeful responsiveness at the high end.[184]
Consciousness is of concern to patients and physicians, especially neurologists and anesthesiologists. Patients may have disorders of consciousness or may need to be anesthetized for a surgical procedure. Physicians may perform consciousness-related interventions such as instructing the patient to sleep, administering general anesthesia, or inducing medical coma.[184] Also, bioethicists may be concerned with the ethical implications of consciousness in medical cases of patients such as the Karen Ann Quinlan case,[185] while neuroscientists may study patients with impaired consciousness in hopes of gaining information about how the brain works.[186]
Assessment
In medicine, consciousness is examined using a set of procedures known as neuropsychological assessment.[100] There are two commonly used methods for assessing the level of consciousness of a patient: a simple procedure that requires minimal training, and a more complex procedure that requires substantial expertise. The simple procedure begins by asking whether the patient is able to move and react to physical stimuli. If so, the next question is whether the patient can respond meaningfully to questions and commands. If so, the patient is asked for their name, current location, and current day and time. A patient who can answer all of these questions is said to be "alert and oriented times four" (sometimes denoted "A&Ox4" on a medical chart), and is usually considered fully conscious.[187]
The more complex procedure is known as a neurological examination, and is usually carried out by a neurologist in a hospital setting. A formal neurological examination runs through a precisely delineated series of tests, beginning with tests for basic sensorimotor reflexes, and culminating with tests for sophisticated use of language. The outcome may be summarized using the Glasgow Coma Scale, which yields a number in the range 3–15, with a score of 3 to 8 indicating coma, and 15 indicating full consciousness. The Glasgow Coma Scale has three subscales, measuring the best motor response (ranging from "no motor response" to "obeys commands"), the best eye response (ranging from "no eye opening" to "eyes opening spontaneously") and the best verbal response (ranging from "no verbal response" to "fully oriented"). There is also a simpler pediatric version of the scale, for children too young to be able to use language.[184]
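Because the total score is simply the sum of three ordinal subscores, the scale is easy to express as a short function. The Python sketch below encodes the standard adult ranges (eye 1–4, verbal 1–5, motor 1–6) and the commonly cited coma cut-off of 8; it is an illustration of the arithmetic described above, not a clinical tool.

```python
def glasgow_coma_scale(eye, verbal, motor):
    """Sum the three GCS subscores: eye (1-4), verbal (1-5), motor (1-6)."""
    if not (1 <= eye <= 4 and 1 <= verbal <= 5 and 1 <= motor <= 6):
        raise ValueError("subscore out of range")
    total = eye + verbal + motor      # overall range: 3 (deep coma) to 15 (fully conscious)
    return total, "coma" if total <= 8 else "not comatose"

print(glasgow_coma_scale(4, 5, 6))   # (15, 'not comatose')
print(glasgow_coma_scale(1, 1, 2))   # (4, 'coma')
```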
In 2013, an experimental procedure was developed to measure degrees of consciousness: the brain is stimulated with a magnetic pulse, the resulting waves of electrical activity are recorded, and a consciousness score is derived from the complexity of that brain activity.[188]
Disorders
Medical conditions that inhibit consciousness are considered disorders of consciousness.[189] This category generally includes minimally conscious state and persistent vegetative state, but sometimes also includes the less severe locked-in syndrome and the more severe chronic coma.[189][190] Differential diagnosis of these disorders is an active area of biomedical research.[191][192][193] Finally, brain death results in an irreversible disruption of consciousness.[189] While other conditions may cause a moderate deterioration (e.g., dementia and delirium) or transient interruption (e.g., grand mal and petit mal seizures) of consciousness, they are not included in this category.
| Disorder | Description |
|---|---|
| Locked-in syndrome | The patient has awareness, sleep-wake cycles, and meaningful behavior (viz., eye-movement), but is isolated due to quadriplegia and pseudobulbar palsy. |
| Minimally conscious state | The patient has intermittent periods of awareness and wakefulness and displays some meaningful behavior. |
| Persistent vegetative state | The patient has sleep-wake cycles, but lacks awareness and only displays reflexive and non-purposeful behavior. |
| Chronic coma | The patient lacks awareness and sleep-wake cycles and only displays reflexive behavior. |
| Brain death | The patient lacks awareness, sleep-wake cycles, and brain-mediated reflexive behavior. |
Medical experts increasingly view anosognosia as a disorder of consciousness.[194] Anosognosia is a Greek-derived term meaning "unawareness of disease". It is a condition in which patients are disabled in some way, most commonly as a result of a stroke, but either misunderstand the nature of the problem or deny that there is anything wrong with them.[195] The most frequently occurring form is seen in people who have experienced a stroke damaging the parietal lobe in the right hemisphere of the brain, giving rise to a syndrome known as hemispatial neglect, characterized by an inability to direct action or attention toward objects located to the left of their bodies. Patients with hemispatial neglect are often paralyzed on the left side of the body, but sometimes deny being unable to move. When questioned about the obvious problem, the patient may avoid giving a direct answer, or may give an explanation that does not make sense. Patients with hemispatial neglect may also fail to recognize paralyzed parts of their bodies: one frequently mentioned case is of a man who repeatedly tried to throw his own paralyzed right leg out of the bed he was lying in and, when asked what he was doing, complained that somebody had put a dead leg into the bed with him. An even more striking type of anosognosia is Anton–Babinski syndrome, a rarely occurring condition in which patients become blind but claim to be able to see normally, and persist in this claim in spite of all evidence to the contrary.[196]
Outside human adults
In children
Of the eight types of consciousness in the Lycan classification, some are detectable in utero and others develop years after birth. Psychologist and educator David Foulkes studied children's dreams and concluded that prior to the shift in cognitive maturation that humans experience during ages five to seven,[197] children lack the Lockean consciousness that Lycan had labeled "introspective consciousness" and that Foulkes labels "self-reflection".[198] In a 2020 paper, Katherine Nelson and Robyn Fivush use "autobiographical consciousness" to label essentially the same faculty, and agree with Foulkes on the timing of this faculty's acquisition. Nelson and Fivush contend that "language is the tool by which humans create a new, uniquely human form of consciousness, namely, autobiographical consciousness".[199] Julian Jaynes had staked out these positions decades earlier.[200][201] Citing the developmental steps that lead the infant to autobiographical consciousness, Nelson and Fivush point to the acquisition of "theory of mind", calling theory of mind "necessary for autobiographical consciousness" and defining it as "understanding differences between one's own mind and others' minds in terms of beliefs, desires, emotions and thoughts". They write, "The hallmark of theory of mind, the understanding of false belief, occurs ... at five to six years of age".[202]
In animals
The topic of animal consciousness is beset by a number of difficulties. It poses the problem of other minds in an especially severe form, because non-human animals, lacking the ability to express human language, cannot tell humans about their experiences.[203] Also, it is difficult to reason objectively about the question, because a denial that an animal is conscious is often taken to imply that it does not feel, its life has no value, and that harming it is not morally wrong. Descartes, for example, has sometimes been blamed for mistreatment of animals due to the fact that he believed only humans have a non-physical mind.[204] Most people have a strong intuition that some animals, such as cats and dogs, are conscious, while others, such as insects, are not; but the sources of this intuition are not obvious, and are often based on personal interactions with pets and other animals they have observed.[203]
Philosophers who consider subjective experience the essence of consciousness also generally believe, as a correlate, that the existence and nature of animal consciousness can never rigorously be known. Thomas Nagel spelled out this point of view in an influential essay titled "What Is it Like to Be a Bat?". He said that an organism is conscious "if and only if there is something that it is like to be that organism—something it is like for the organism"; and he argued that no matter how much we know about an animal's brain and behavior, we can never really put ourselves into the mind of the animal and experience its world in the way it does itself.[205] Other thinkers, such as Douglas Hofstadter, dismiss this argument as incoherent.[206] Several psychologists and ethologists have argued for the existence of animal consciousness by describing a range of behaviors that appear to show animals holding beliefs about things they cannot directly perceive—Donald Griffin's 2001 book Animal Minds reviews a substantial portion of the evidence.[159]
On July 7, 2012, eminent scientists from different branches of neuroscience gathered at the University of Cambridge for the Francis Crick Memorial Conference, which dealt with consciousness in humans and pre-linguistic consciousness in nonhuman animals. After the conference, they signed, in the presence of Stephen Hawking, the Cambridge Declaration on Consciousness, which summarizes the conference's most important conclusions:
"We decided to reach a consensus and make a statement directed to the public that is not scientific. It's obvious to everyone in this room that animals have consciousness, but it is not obvious to the rest of the world. It is not obvious to the rest of the Western world or the Far East. It is not obvious to the society."[207]
"Convergent evidence indicates that non-human animals ..., including all mammals and birds, and other creatures, ... have the necessary neural substrates of consciousness and the capacity to exhibit intentional behaviors."[208]
In artificial intelligence
The idea of an artifact made conscious is an ancient theme of mythology, appearing for example in the Greek myth of Pygmalion, who carved a statue that was magically brought to life, and in medieval Jewish stories of the Golem, a magically animated homunculus built of clay.[209] However, the possibility of actually constructing a conscious machine was probably first discussed by Ada Lovelace, in a set of notes written in 1842 about the Analytical Engine invented by Charles Babbage, a precursor (never built) to modern electronic computers. Lovelace was essentially dismissive of the idea that a machine such as the Analytical Engine could think in a humanlike way. She wrote:
It is desirable to guard against the possibility of exaggerated ideas that might arise as to the powers of the Analytical Engine. ... The Analytical Engine has no pretensions whatever to originate anything. It can do whatever we know how to order it to perform. It can follow analysis; but it has no power of anticipating any analytical relations or truths. Its province is to assist us in making available what we are already acquainted with.[210]
One of the most influential contributions to this question was an essay written in 1950 by pioneering computer scientist Alan Turing, titled Computing Machinery and Intelligence. Turing disavowed any interest in terminology, saying that even "Can machines think?" is too loaded with spurious connotations to be meaningful; but he proposed to replace all such questions with a specific operational test, which has become known as the Turing test.[211] To pass the test, a computer must be able to imitate a human well enough to fool interrogators. In his essay Turing discussed a variety of possible objections, and presented a counterargument to each of them. The Turing test is commonly cited in discussions of artificial intelligence as a proposed criterion for machine consciousness, and it has provoked a great deal of philosophical debate. For example, Daniel Dennett and Douglas Hofstadter argue that anything capable of passing the Turing test is necessarily conscious,[212] while David Chalmers argues that a philosophical zombie could pass the test yet fail to be conscious.[213] A third group of scholars have argued that, as technology grows and machines begin to display any substantial signs of human-like behavior, the dichotomy between human consciousness and human-like consciousness becomes passé, and questions of machine autonomy begin to prevail, as already observed in nascent form within contemporary industry and technology.[68][69] Jürgen Schmidhuber argues that consciousness is the result of compression:[214] as an agent sees representations of itself recurring in the environment, the compression of these representations can be called consciousness.
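As a toy illustration of the compression intuition only (not Schmidhuber's own formalism, which is framed in terms of a learning agent's predictive world model), the sketch below shows that an observation history in which a representation of the agent keeps recurring compresses far better than an equally long history with no such regularity.

```python
import os
import zlib

def compression_ratio(data):
    """Compressed size divided by raw size; lower means more regularity was found."""
    return len(zlib.compress(data, 9)) / len(data)

# Hypothetical observation history: each step embeds the same "self" token,
# standing in for the agent repeatedly observing a representation of itself.
self_token = b"<agent-state:pos=3,vel=1>"
history_with_self = b"".join(os.urandom(8) + self_token for _ in range(200))
history_without = os.urandom(len(history_with_self))   # no recurring structure at all

print(compression_ratio(history_with_self))  # well below 1: the recurring self-representation compresses
print(compression_ratio(history_without))    # near 1: incompressible noise
```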
In a lively exchange over what has come to be referred to as "the Chinese room argument", John Searle sought to refute the claim of proponents of what he calls "strong artificial intelligence (AI)" that a computer program can be conscious, though he does agree with advocates of "weak AI" that computer programs can be formatted to "simulate" conscious states. His own view is that consciousness has subjective, first-person causal powers by being essentially intentional due to the way human brains function biologically; conscious persons can perform computations, but consciousness is not inherently computational the way computer programs are. To make a Turing machine that speaks Chinese, Searle imagines a room with one monolingual English speaker (Searle himself, in fact), a book that designates a combination of Chinese symbols to be output paired with Chinese symbol input, and boxes filled with Chinese symbols. In this case, the English speaker is acting as a computer and the rulebook as a program. Searle argues that with such a machine, he would be able to process the inputs to outputs perfectly without having any understanding of Chinese, nor having any idea what the questions and answers could possibly mean. If the experiment were done in English, since Searle knows English, he would be able to take questions and give answers without any algorithms for English questions, and he would be effectively aware of what was being said and the purposes it might serve. Searle would pass the Turing test of answering the questions in both languages, but he is only conscious of what he is doing when he speaks English. Another way of putting the argument is to say that computer programs can pass the Turing test for processing the syntax of a language, but that the syntax cannot lead to semantic meaning in the way strong AI advocates hoped.[215][216]
In the literature concerning artificial intelligence, Searle's essay has been second only to Turing's in the volume of debate it has generated.[217] Searle himself was vague about what extra ingredients it would take to make a machine conscious: all he proposed was that what was needed was "causal powers" of the sort that the brain has and that computers lack. But other thinkers sympathetic to his basic argument have suggested that the necessary (though perhaps still not sufficient) extra conditions may include the ability to pass not just the verbal version of the Turing test, but the robotic version,[218] which requires grounding the robot's words in the robot's sensorimotor capacity to categorize and interact with the things in the world that its words are about, Turing-indistinguishably from a real person. Turing-scale robotics is an empirical branch of research on embodied cognition and situated cognition.[219]
In 2014, Victor Argonov suggested a non-Turing test for machine consciousness based on a machine's ability to produce philosophical judgments.[220] He argues that a deterministic machine must be regarded as conscious if it is able to produce judgments on all problematic properties of consciousness (such as qualia or binding) while having no innate (preloaded) philosophical knowledge of these issues, no philosophical discussions during learning, and no informational models of other creatures in its memory (such models may implicitly or explicitly contain knowledge about those creatures' consciousness). However, this test can be used only to detect, not to refute, the existence of consciousness: a positive result proves that a machine is conscious, but a negative result proves nothing. For example, an absence of philosophical judgments may be caused by a lack of intellect, not by an absence of consciousness.
In 2023, Nick Bostrom argued that being very sure that large language models (LLMs) are not conscious would require unwarranted confidence in which theory of consciousness is correct and in how it applies to machines.[221] He views consciousness as a matter of degree,[222] and has argued that machines could in theory be much more conscious than humans.[223][224]
Stream of consciousness
William James is usually credited with popularizing the idea that human consciousness flows like a stream, in his Principles of Psychology of 1890.
According to James, the "stream of thought" is governed by five characteristics:[225]
- Every thought tends to be part of a personal consciousness.
- Within each personal consciousness thought is always changing.
- Within each personal consciousness thought is sensibly continuous.
- It always appears to deal with objects independent of itself.
- It is interested in some parts of these objects to the exclusion of others.
A similar concept appears in Buddhist philosophy, expressed by the Sanskrit term Citta-saṃtāna, which is usually translated as mindstream or "mental continuum". Buddhist teachings describe consciousness as manifesting moment to moment as sense impressions and mental phenomena that are continuously changing.[226] The teachings list six triggers that can result in the generation of different mental events.[226] These triggers are input from the five senses (seeing, hearing, smelling, tasting, or touch sensations) or a thought (relating to the past, present, or future) that happens to arise in the mind. The mental events generated as a result of these triggers are feelings, perceptions, and intentions/behavior. The moment-by-moment manifestation of the mind-stream is said to happen in every person all the time, even in a scientist who analyzes various phenomena in the world or the material body, including the brain.[226] The manifestation of the mindstream is also described as being influenced by physical laws, biological laws, psychological laws, volitional laws, and universal laws.[226] The purpose of the Buddhist practice of mindfulness is to understand the inherent nature of consciousness and its characteristics.[227]
Narrative form
In the West, the primary impact of the idea has been on literature rather than science: "stream of consciousness as a narrative mode" means writing in a way that attempts to portray the moment-to-moment thoughts and experiences of a character. This technique perhaps had its beginnings in the monologues of Shakespeare's plays and reached its fullest development in the novels of James Joyce and Virginia Woolf, although it has also been used by many other noted writers.[228]
Here, for example, is a passage from Joyce's Ulysses about the thoughts of Molly Bloom:
Yes because he never did a thing like that before as ask to get his breakfast in bed with a couple of eggs since the City Arms hotel when he used to be pretending to be laid up with a sick voice doing his highness to make himself interesting for that old faggot Mrs Riordan that he thought he had a great leg of and she never left us a farthing all for masses for herself and her soul greatest miser ever was actually afraid to lay out 4d for her methylated spirit telling me all her ailments she had too much old chat in her about politics and earthquakes and the end of the world let us have a bit of fun first God help the world if all the women were her sort down on bathingsuits and lownecks of course nobody wanted her to wear them I suppose she was pious because no man would look at her twice I hope Ill never be like her a wonder she didnt want us to cover our faces but she was a well-educated woman certainly and her gabby talk about Mr Riordan here and Mr Riordan there I suppose he was glad to get shut of her.[229]
Spiritual approaches
The Upanishads hold the oldest recorded map of consciousness, as explored by sages through meditation.[230]
The Canadian psychiatrist Richard Maurice Bucke, author of the 1901 book Cosmic Consciousness: A Study in the Evolution of the Human Mind, distinguished between three types of consciousness: 'Simple Consciousness', awareness of the body, possessed by many animals; 'Self Consciousness', awareness of being aware, possessed only by humans; and 'Cosmic Consciousness', awareness of the life and order of the universe, possessed only by humans who have attained "intellectual enlightenment or illumination".[231]
Another thorough account of the spiritual approach is Ken Wilber's 1977 book The Spectrum of Consciousness, a comparison of western and eastern ways of thinking about the mind. Wilber described consciousness as a spectrum with ordinary awareness at one end, and more profound types of awareness at higher levels.[232]
Other examples include the various levels of spiritual consciousness presented by Prem Saran Satsangi and Stuart Hameroff.[233]
Notes
References
Further reading
Articles
- Lewis, Ralph. An Overview of the Leading Theories of Consciousness. Organizing and comparing the major candidate theories in the field. Psychology Today, November 25, 2023.
- ↑ Template:Cite dictionary
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Charles Adam, Paul Tannery (eds.), Oeuvres de Descartes X, 524 (1908).
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Jaucourt, Louis, chevalier de. "Consciousness." The Encyclopedia of Diderot & d'Alembert Collaborative Translation Project. Translated by Scott St. Louis. Ann Arbor: Michigan Publishing, University of Michigan Library, 2014. Originally published as "Conscience," Encyclopédie ou Dictionnaire raisonné des sciences, des arts et des métiers, 3:902 (Paris, 1753).
- ↑ Script error: No such module "citation/CS1".
- ↑ Template:Cite dictionary
- ↑ Template:Cite dictionary
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Levine, Joseph (2010). Review of Uriah Kriegel, Subjective Consciousness: A Self-Representational Theory. Notre Dame Philosophical Reviews 2010 (3).
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1". Pages 230 and 231 in the version on the author's own website.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Harris, S. (12 October 2011). The mystery of consciousness. Sam Harris. https://www.samharris.org/blog/the-mystery-of-consciousness
- ↑ Tricker, C. (2022). The cicada and the bird. The usefulness of a useless philosophy. Chuang Tzu's ancient wisdom translated for modern life. Page 52. (Google Books)
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Ron Sun and Stan Franklin, Computational models of consciousness: A taxonomy and some examples. In: P.D. Zelazo, M. Moscovitch, and E. Thompson (eds.), The Cambridge Handbook of Consciousness, pp. 151–174. Cambridge University Press, New York. 2007
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b The Culture and Philosophy of Ridley Scott, Greg Littmann, pp. 133–144, Lexington Books (2013).
- ↑ a b Moral Machines, Wendell Wallach and Colin Allen, 288 pages, Oxford University Press, USA (June 3, 2010).
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ a b c d e f Script error: No such module "citation/CS1".
- ↑ Oxford English Dictionary, "qualia", 3rd ed., Oxford University Press, 2010. Accessed October 3, 2024. https://www.oed.com/search/dictionary/?scope=Entries&q=qualia.
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Mandler, G. "Consciousness: Respectable, useful, and probably necessary". In R. Solso (Ed.) Information processing and cognition: NJ: LEA.
- ↑ Script error: No such module "citation/CS1".
- ↑ Mandler, G. Consciousness recovered: Psychological functions and origins of thought. Philadelphia: John Benjamins. 2002
- ↑ Script error: No such module "citation/CS1".
- ↑ Lucido, R. J. (2023). Testing the consciousness causing collapse interpretation of quantum mechanics using subliminal primes derived from random fluctuations in radioactive decay. Journal of Consciousness Exploration & Research, 14(3), 185-194. https://doi.org/10.13140/RG.2.2.20344.72969
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ a b Script error: No such module "citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ a b Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "citation/CS1".
- ↑ Koch, The Quest for Consciousness, pp. 105–116
- ↑ Script error: No such module "Citation/CS1".
- ↑ Script error: No such module "Citation/CS1".
- ↑ Koch, The Quest for Consciousness, pp. 269–286
- ↑ Joaquin Fuster, The Prefrontal Cortex, Second Edition.
- ↑ Madden, M. B., Stewart, B. W., White, M. G., Krimmel, S. R., Qadir, H., Barrett, F. S., ... & Mathur, B. N. (2022). "A role for the claustrum in cognitive control". Trends in Cognitive Sciences, 26(12), 1133–1152. doi:10.1016/j.tics.2022.09.006
- ↑ Koch, The Quest for Consciousness, pp. 216–226
- ↑ Note: A patient who can additionally describe the current situation may be referred to as "oriented times four".
- ↑ Note: In many stories the Golem was mindless, but some gave it emotions or thoughts.
- Cognitive neuroscience
- Cognitive psychology
- Concepts in epistemology
- Concepts in the philosophy of mind
- Concepts in the philosophy of science
- Consciousness
- Emergence
- Mental processes
- Metaphysical properties
- Metaphysics of mind
- Mind–body problem
- Neuropsychological assessment
- Ontology
- Phenomenology
- Theory of mind