== External links == | == External links == | ||
* [https://web.archive.org/web/20150324015937/https://www.paulekman.com/research/ Paul Ekman's articles relating to | * [https://web.archive.org/web/20150324015937/https://www.paulekman.com/research/ Paul Ekman's articles relating to F.A.C.S.] | ||
* [https://web.archive.org/web/20080606005626/http://www.face-and-emotion.com/dataface/facs/description.jsp Paul Ekman's Facial Action Coding System ( | * [https://web.archive.org/web/20080606005626/http://www.face-and-emotion.com/dataface/facs/description.jsp Paul Ekman's Facial Action Coding System (F.A.C.S.)] | ||
* [http://www.animalfacs.com/ More information on the different animal | * [http://www.animalfacs.com/ More information on the different animal F.A.C.S. projects] | ||
*[https://www.newyorker.com/magazine/2002/08/05/the-naked-face New Yorker article discussing | *[https://www.newyorker.com/magazine/2002/08/05/the-naked-face New Yorker article discussing F.A.C.S.] | ||
* [http://www-2.cs.cmu.edu/afs/cs/project/face/www/facs.htm Details from 1978 edition of | * [http://www-2.cs.cmu.edu/afs/cs/project/face/www/facs.htm Details from 1978 edition of F.A.C.S.] | ||
* [http://www.cs.wpi.edu/~matt/courses/cs563/talks/face_anim/ekman.html Site at WPI] | * [http://www.cs.wpi.edu/~matt/courses/cs563/talks/face_anim/ekman.html Site at WPI] | ||
* download of [http://diglib.uibk.ac.at/ulbtirol/content/titleinfo/782346 Carl-Herman Hjortsjö, Man's face and mimic language"] {{Webarchive|url=https://web.archive.org/web/20220806115847/https://diglib.uibk.ac.at/ulbtirol/content/titleinfo/782346 |date=2022-08-06 }} (the original Swedish title of the book is: "Människans ansikte och mimiska språket". The correct translation would be: "Man's face and facial language") | * download of [http://diglib.uibk.ac.at/ulbtirol/content/titleinfo/782346 Carl-Herman Hjortsjö, Man's face and mimic language"] {{Webarchive|url=https://web.archive.org/web/20220806115847/https://diglib.uibk.ac.at/ulbtirol/content/titleinfo/782346 |date=2022-08-06 }} (the original Swedish title of the book is: "Människans ansikte och mimiska språket". The correct translation would be: "Man's face and facial language") | ||
The Facial Action Coding System (F.A.C.S.) is a system to taxonomize human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.[1] It was later adopted by Paul Ekman and Wallace V. Friesen, and published in 1978.[2] Ekman, Friesen, and Joseph C. Hager published a significant update to F.A.C.S. in 2002.[3] The F.A.C.S. encodes the movements of individual facial muscles from slight, momentary changes in facial appearance. It has proven useful to psychologists and to animators.
Background
A 2009 study of spontaneous facial expressions in sighted and blind judo athletes found that many facial expressions are innate rather than visually learned.[4]
Method
Using the F.A.C.S.,[5] human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific "action units" (A.U.s) and the temporal segments that produced it. Because A.U.s are independent of any interpretation, they can feed any higher-order decision-making process, including recognition of basic emotions or pre-programmed commands for an ambient intelligence environment. The F.A.C.S. manual is over five hundred pages long and provides the A.U.s together with Ekman's interpretation of their meanings.
The F.A.C.S. defines A.U.s as contractions or relaxations of one or more muscles. It also defines a number of "action descriptors", which differ from A.U.s in that the authors of the F.A.C.S. have not specified the muscular basis for the action and have not distinguished specific behaviors as precisely as they have for the A.U.s.
For example, the F.A.C.S. can be used to distinguish two types of smiles as follows:[6]
- the insincere and voluntary Pan-Am smile: contraction of zygomatic major alone
- the sincere and involuntary Duchenne smile: contraction of zygomatic major and inferior part of orbicularis oculi.
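In A.U. terms, the two smiles above differ only in whether A.U. 6 (cheek raiser, orbicularis oculi) accompanies A.U. 12 (lip corner puller, zygomatic major). A minimal sketch of that rule; the function and its labels are illustrative, not part of the F.A.C.S. itself:

```python
# Distinguishing the two smile types by their active action units (A.U.s):
# A.U. 12 = lip corner puller (zygomatic major);
# A.U. 6  = cheek raiser (orbicularis oculi, pars orbitalis).
def classify_smile(active_aus: set[int]) -> str:
    """Label an expression containing A.U. 12 by smile type."""
    if 12 not in active_aus:
        return "not a smile"
    # The Duchenne marker is A.U. 6 co-occurring with A.U. 12.
    return "Duchenne smile" if 6 in active_aus else "Pan-Am smile"

print(classify_smile({6, 12}))   # Duchenne smile
print(classify_smile({12}))      # Pan-Am smile
```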
The F.A.C.S. is designed to be self-instructional. People can learn the technique from a number of sources including manuals and workshops,[7] and obtain certification through testing.[8]
Although labeling expressions currently requires trained experts, researchers have had some success in using computers to identify F.A.C.S. codes automatically.[9] One obstacle to automatic F.A.C.S. code recognition is a shortage of manually coded ground-truth data.[10]
Uses
Baby F.A.C.S.
Baby F.A.C.S. (Facial Action Coding System for Infants and Young Children)[11] is a behavioral coding system that adapts the adult F.A.C.S. to code facial expressions in infants aged 0–2 years. Its codes correspond to specific underlying facial muscles but are tailored to infant facial anatomy and expression patterns.
It was created by Dr. Harriet Oster and colleagues to address the limitations of applying adult F.A.C.S. directly to infants, whose facial musculature, proportions and developmental capabilities differ significantly.
Use in medicine
The F.A.C.S. has been proposed for use in the analysis of depression[12] and in the measurement of pain in patients unable to express themselves verbally.[13]
Cross-species applications
The original F.A.C.S. has been modified to analyze facial movements in several non-human primates, namely chimpanzees,[14] rhesus macaques,[15] gibbons and siamangs,[16] and orangutans.[17] More recently, it has also been developed for domesticated species, including dogs,[18] horses[19] and cats.[20] As with the human F.A.C.S., the non-human systems have manuals available online for each species, with corresponding certification tests.[21]
Because of its anatomical basis, the F.A.C.S. can be used to compare facial repertoires across species. A study by Vick and others (2006) suggests that the F.A.C.S. can be modified to take differences in underlying morphology into account. Such considerations enable comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species involve highly noticeable changes in appearance. The development of F.A.C.S. tools for different species allows the objective, anatomically based study of facial expressions in communicative and emotional contexts. Furthermore, cross-species analysis of facial expressions can help answer interesting questions, such as which emotions are uniquely human.[22]
The Emotional Facial Action Coding System (E.M.F.A.C.S.)[23] and the Facial Action Coding System Affect Interpretation Dictionary (F.A.C.S.A.I.D.)[24] consider only emotion-related facial actions. Examples of these are:
| Emotion | Action units |
|---|---|
| Happiness | 6+12 |
| Sadness | 1+4+15 |
| Surprise | 1+2+5B+26 |
| Fear | 1+2+4+5+7+20+26 |
| Anger | 4+5+7+23 |
| Disgust | 9+15+17 |
| Contempt | R12A+R14A |
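The plain-numbered rows of the table above can be read as required A.U. combinations. A toy matcher in that spirit; rows containing intensity or side annotations (surprise's 5B, contempt's R12A+R14A) are omitted here because they would need extra parsing, and the function name and dictionary are illustrative:

```python
# Emotion prototypes transcribed from the E.M.F.A.C.S.-style table above,
# limited to rows that list plain A.U. numbers.
PROTOTYPES = {
    "happiness": frozenset({6, 12}),
    "sadness":   frozenset({1, 4, 15}),
    "fear":      frozenset({1, 2, 4, 5, 7, 20, 26}),
    "anger":     frozenset({4, 5, 7, 23}),
    "disgust":   frozenset({9, 15, 17}),
}

def match_emotions(active_aus: set[int]) -> list[str]:
    """Return every prototype whose required A.U.s are all active."""
    return [name for name, required in PROTOTYPES.items()
            if required <= active_aus]

print(match_emotions({6, 12}))          # ['happiness']
print(match_emotions({1, 4, 15, 17}))   # ['sadness']
```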
Computer-generated imagery
F.A.C.S. coding is also used extensively in computer animation, in particular for computer facial animation, with facial expressions expressed as vectors of A.U.s.[25] F.A.C.S. vectors are used as weights for blend shapes corresponding to each A.U., and the resulting face mesh is then used to render the finished face.[26][27] Deep-learning techniques can be used to determine the F.A.C.S. vectors from face images obtained during motion capture acting, facial motion capture or other performances.[28]
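The blend-shape arithmetic described here amounts to adding weighted per-A.U. vertex offsets to a neutral mesh. A schematic sketch, with made-up mesh sizes and deltas rather than data from any production rig:

```python
import numpy as np

# Neutral face mesh: N vertices in 3-D (a toy N=4 for illustration).
neutral = np.zeros((4, 3))

# One vertex-offset ("delta") array per blend shape, keyed by A.U. number.
# In a real rig these come from sculpted targets; here they are arbitrary.
au_deltas = {
    12: np.array([[0.0, 0.1, 0.0]] * 4),   # lip corner puller
    6:  np.array([[0.0, 0.0, 0.05]] * 4),  # cheek raiser
}

def blend(weights: dict[int, float]) -> np.ndarray:
    """Apply F.A.C.S. weights (0..1 per A.U.) to the neutral mesh."""
    mesh = neutral.copy()
    for au, w in weights.items():
        mesh += w * au_deltas[au]   # weighted sum of per-A.U. offsets
    return mesh

posed = blend({12: 1.0, 6: 0.5})   # a half-strength Duchenne-style pose
```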
Codes for action units
See also: List of muscles in the human body § The muscles of the head
The F.A.C.S. is an index of facial expressions; it does not itself provide any biomechanical information about the degree of muscle activation. Although muscle activation is not part of the F.A.C.S., the main muscles involved in each facial expression are listed here.
Action units (A.U.s) are the fundamental actions of individual muscles or groups of muscles.
Action descriptors (A.D.s) are unitary movements that may involve the actions of several muscle groups (e.g., a forward-thrusting movement of the jaw). The muscular basis for these actions has not been specified, and specific behaviors have not been distinguished as precisely as for the A.U.s.
For the most accurate annotation, the F.A.C.S. suggests agreement from at least two independent certified F.A.C.S. encoders.
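One simple way to quantify agreement between two certified coders is a Dice-style index over their A.U. sets: twice the number of shared A.U.s divided by the total scored by both. This formula is an assumption for illustration — the F.A.C.S. manual's own reliability procedure is not quoted in this article:

```python
def agreement_index(coder_a: set[int], coder_b: set[int]) -> float:
    """Dice-style agreement between two coders' A.U. sets:
    twice the shared A.U. count over the total A.U.s both scored."""
    if not coder_a and not coder_b:
        return 1.0  # both scored no action: trivial agreement
    shared = len(coder_a & coder_b)
    return 2 * shared / (len(coder_a) + len(coder_b))

# Two coders agree on A.U.s 1 and 2 but differ on a third unit.
print(agreement_index({1, 2, 5}, {1, 2, 26}))  # 0.666...
```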
Intensity scoring
A.U. intensities are annotated by appending the letters A–E (from minimal to maximal intensity) to the action unit number (e.g. A.U. 1A is the weakest trace of A.U. 1, and A.U. 1E is the maximum intensity possible for the individual person).
- A Trace
- B Slight
- C Marked or pronounced
- D Severe or extreme
- E Maximum
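A full score therefore combines the unit number with its intensity letter, e.g. "1B". A small parser under that convention (an illustrative helper, not an official F.A.C.S. tool):

```python
# The five F.A.C.S. intensity letters and their labels, as listed above.
INTENSITY = {"A": "trace", "B": "slight", "C": "marked or pronounced",
             "D": "severe or extreme", "E": "maximum"}

def parse_intensity(code: str) -> tuple[int, str]:
    """Split a code like '1B' into (A.U. number, intensity label)."""
    if code[-1] in INTENSITY:
        return int(code[:-1]), INTENSITY[code[-1]]
    return int(code), "unspecified"   # bare A.U. number, no letter

print(parse_intensity("1B"))   # (1, 'slight')
print(parse_intensity("12"))   # (12, 'unspecified')
```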
Other letter modifiers
Other modifiers appear in F.A.C.S. codes for emotional expressions: "R" marks an action that occurs on the right side of the face, and "L" marks one that occurs on the left. An action that is unilateral (occurs on only one side of the face) but has no specific side is marked "U", and an action that is bilateral but has a stronger side is marked "A" for "asymmetric".
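Combining these side modifiers with the intensity letters yields codes such as the "R12A" used in the contempt prototype above. An illustrative parser for the combined form (the helper name and return format are assumptions, not part of the F.A.C.S.):

```python
# Side modifiers prefix the code; intensity letters A-E suffix it.
SIDES = {"R": "right", "L": "left", "U": "unilateral", "A": "asymmetric"}
INTENSITIES = "ABCDE"

def parse_au_code(code: str) -> dict:
    """Parse e.g. 'R12A' -> side 'right', A.U. 12, intensity 'A'."""
    side = None
    if code and code[0] in SIDES:
        side, code = SIDES[code[0]], code[1:]
    intensity = None
    if code and code[-1] in INTENSITIES:
        intensity, code = code[-1], code[:-1]
    return {"side": side, "au": int(code), "intensity": intensity}

print(parse_au_code("R12A"))  # {'side': 'right', 'au': 12, 'intensity': 'A'}
print(parse_au_code("4"))     # {'side': None, 'au': 4, 'intensity': None}
```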
List of A.U.s and A.D.s (with underlying facial muscles)
Main codes
| A.U. number | F.A.C.S. name | Muscular basis |
|---|---|---|
| 0 | Neutral face | |
| 1 | Inner brow raiser | frontalis (pars medialis) |
| 2 | Outer brow raiser | frontalis (pars lateralis) |
| 4 | Brow lowerer | depressor glabellae, depressor supercilii, corrugator supercilii |
| 5 | Upper lid raiser | levator palpebrae superioris, superior tarsal muscle |
| 6 | Cheek raiser | orbicularis oculi (pars orbitalis) |
| 7 | Lid tightener | orbicularis oculi (pars palpebralis) |
| 8 | Lips toward each other | orbicularis oris |
| 9 | Nose wrinkler | levator labii superioris alaeque nasi |
| 10 | Upper lip raiser | levator labii superioris, caput infraorbitalis |
| 11 | Nasolabial deepener | zygomaticus minor |
| 12 | Lip corner puller | zygomaticus major |
| 13 | Sharp lip puller | levator anguli oris (also known as caninus) |
| 14 | Dimpler | buccinator |
| 15 | Lip corner depressor | depressor anguli oris (also known as triangularis) |
| 16 | Lower lip depressor | depressor labii inferioris |
| 17 | Chin raiser | mentalis |
| 18 | Lip pucker | incisivii labii superioris and incisivii labii inferioris |
| 19 | Tongue show | |
| 20 | Lip stretcher | risorius with platysma |
| 21 | Neck tightener | platysma |
| 22 | Lip funneler | orbicularis oris |
| 23 | Lip tightener | orbicularis oris |
| 24 | Lip pressor | orbicularis oris |
| 25 | Lips part | depressor labii inferioris, or relaxation of mentalis or orbicularis oris |
| 26 | Jaw drop | masseter; relaxed temporalis and internal pterygoid |
| 27 | Mouth stretch | pterygoids, digastric |
| 28 | Lip suck | orbicularis oris |
Head movement codes
| A.U. number | F.A.C.S. name | Action |
|---|---|---|
| 51 | Head turn left | |
| 52 | Head turn right | |
| 53 | Head up | |
| 54 | Head down | |
| 55 | Head tilt left | |
| M55 | Head tilt left | The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the left. |
| 56 | Head tilt right | |
| M56 | Head tilt right | The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the right. |
| 57 | Head forward | |
| M57 | Head thrust forward | The onset of 17+24 is immediately preceded, accompanied, or followed by a head thrust forward. |
| 58 | Head back | |
| M59 | Head shake up and down | The onset of 17+24 is immediately preceded, accompanied, or followed by an up-down head shake (nod). |
| M60 | Head shake side to side | The onset of 17+24 is immediately preceded, accompanied, or followed by a side-to-side head shake. |
| M83 | Head upward and to the side | The onset of the symmetrical 14 is immediately preceded or accompanied by a movement of the head, upward and turned or tilted to either the left or right. |
Eye movement codes
| A.U. number | F.A.C.S. name | Action |
|---|---|---|
| 61 | Eyes turn left | |
| M61 | Eyes left | The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the left. |
| 62 | Eyes turn right | |
| M62 | Eyes right | The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the right. |
| 63 | Eyes up | |
| 64 | Eyes down | |
| 65 | Walleye | |
| 66 | Cross-eye | |
| M68 | Upward rolling of eyes | The onset of the symmetrical 14 is immediately preceded or accompanied by an upward rolling of the eyes. |
| 69 | Eyes positioned to look at other person | The 4, 5, or 7, alone or in combination, occurs while the eye position is fixed on the other person in the conversation. |
| M69 | Head or eyes look at other person | The onset of the symmetrical 14 or A.U.s 4, 5, and 7, alone or in combination, is immediately preceded or accompanied by a movement of the eyes or of the head and eyes to look at the other person in the conversation. |
Visibility codes
| A.U. number | F.A.C.S. name |
|---|---|
| 70 | Brows and forehead not visible |
| 71 | Eyes not visible |
| 72 | Lower face not visible |
| 73 | Entire face not visible |
| 74 | Unscorable |
Gross behavior codes
These codes are reserved for recording information about gross behaviors that may be relevant to the facial actions that are scored.
| A.U. number | F.A.C.S. name | Muscular basis |
|---|---|---|
| 29 | Jaw thrust | |
| 30 | Jaw sideways | |
| 31 | Jaw clencher | masseter |
| 32 | [Lip] bite | |
| 33 | [Cheek] blow | |
| 34 | [Cheek] puff | |
| 35 | [Cheek] suck | |
| 36 | [Tongue] bulge | |
| 37 | Lip wipe | |
| 38 | Nostril dilator | nasalis (pars alaris) |
| 39 | Nostril compressor | nasalis (pars transversa) and depressor septi nasi |
| 40 | Sniff | |
| 41 | Lid droop | levator palpebrae superioris (relaxation) |
| 42 | Slit | orbicularis oculi muscle |
| 43 | Eyes closed | relaxation of levator palpebrae superioris |
| 44 | Squint | corrugator supercilii and orbicularis oculi muscle |
| 45 | Blink | relaxation of levator palpebrae superioris; contraction of orbicularis oculi (pars palpebralis) |
| 46 | Wink | orbicularis oculi |
| 50 | Speech | |
| 80 | Swallow | |
| 81 | Chewing | |
| 82 | Shoulder shrug | |
| 84 | Head shake back and forth | |
| 85 | Head nod up and down | |
| 91 | Flash | |
| 92 | Partial flash | |
| 97* | Shiver/tremble | |
| 98* | Fast up-down look | |
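The tables above partition the numeric codes into categories by explicit membership rather than by contiguous ranges (for example, there is no A.U. 3, and the gross behavior codes are scattered across several ranges). A hedged sketch of a category lookup, with the code sets transcribed from the tables above (the `category` helper is an illustrative assumption, not part of FACS):

```python
# Code sets transcribed from the tables above (illustrative helper, not part of FACS).
MAIN = {0, 1, 2} | set(range(4, 29))        # main codes 0-28; no A.U. 3 exists
HEAD = set(range(51, 59))                   # head movement codes 51-58
EYE = {61, 62, 63, 64, 65, 66, 69}          # eye movement codes
VISIBILITY = set(range(70, 75))             # visibility codes 70-74
GROSS = set(range(29, 47)) | {50, 80, 81, 82, 84, 85, 91, 92, 97, 98}

def category(au: int) -> str:
    """Return the table category a numeric code belongs to."""
    for name, codes in [("main", MAIN), ("head movement", HEAD),
                        ("eye movement", EYE), ("visibility", VISIBILITY),
                        ("gross behavior", GROSS)]:
        if au in codes:
            return name
    return "unknown"

print(category(12), category(54), category(72))
```

The M-prefixed movement codes (M55, M61, and so on) are modifier variants of the plain numeric codes and are not modeled here.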
See also
- Computer facial animation
- Computer processing of body language
- Emotion classification
- Facial electromyography
- Facial feedback hypothesis
- Facial muscles
- Microexpression
External links
- Paul Ekman's articles relating to F.A.C.S.
- Paul Ekman's Facial Action Coding System (F.A.C.S.)
- More information on the different animal F.A.C.S. projects
- New Yorker article discussing F.A.C.S.
- Details from 1978 edition of F.A.C.S.
- Site at WPI
- Download of Carl-Herman Hjortsjö, "Man's face and mimic language" (the original Swedish title of the book is "Människans ansikte och mimiska språket"; a more accurate translation would be "Man's face and facial language")