
Facial Action Coding System

System of classifying human facial movements

[Image: Muscles of head and neck]

The Facial Action Coding System (F.A.C.S.) is a system for taxonomizing human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö. It was later adopted by Paul Ekman and Wallace V. Friesen, and published in 1978. Ekman, Friesen, and Joseph C. Hager published a significant update to the F.A.C.S. in 2002. The F.A.C.S. encodes movements of individual facial muscles from slight, momentary changes in facial appearance. It has proven useful to psychologists and to animators.

Background

In 2009, a study of spontaneous facial expressions in sighted and blind judo athletes found that many facial expressions are innate rather than visually learned.

Method

Using the F.A.C.S., human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific "action units" (A.U.s) and their temporal segments that produced the expression. As A.U.s are independent of any interpretation, they can be used for any higher-order decision-making process, including recognition of basic emotions or pre-programmed commands for an ambient intelligent environment. The F.A.C.S. manual is over five hundred pages in length and provides the A.U.s, as well as Ekman's interpretation of their meanings.

The F.A.C.S. defines A.U.s as contractions or relaxations of one or more muscles. It also defines a number of "action descriptors", which differ from A.U.s in that the authors of the F.A.C.S. have not specified the muscular basis for the action and have not distinguished specific behaviors as precisely as they have for the A.U.s.

For example, the F.A.C.S. can be used to distinguish two types of smiles as follows:

  • the insincere and voluntary Pan-Am smile: contraction of zygomatic major alone
  • the sincere and involuntary Duchenne smile: contraction of zygomatic major and inferior part of orbicularis oculi.

The F.A.C.S. is designed to be self-instructional. People can learn the technique from a number of sources including manuals and workshops, and obtain certification through testing.

Although the labeling of expressions currently requires trained experts, researchers have had some success in using computers to automatically identify the F.A.C.S. codes. One obstacle to automatic F.A.C.S. code recognition is a shortage of manually coded ground truth data.

Uses

Baby F.A.C.S.

Baby F.A.C.S. (Facial Action Coding System for Infants and Young Children) is a behavioral coding system that adapts the adult F.A.C.S. to code facial expressions in infants aged 0–2 years. Its codes correspond to specific underlying facial muscles, tailored to infant facial anatomy and expression patterns.

It was created by Dr. Harriet Oster and colleagues to address the limitations of applying adult F.A.C.S. directly to infants, whose facial musculature, proportions and developmental capabilities differ significantly.

Use in medicine

The F.A.C.S. has been proposed for use in the analysis of depression and in the measurement of pain in patients unable to express themselves verbally.

Cross-species applications

The original F.A.C.S. has been modified to analyze facial movements in several non-human primates, namely chimpanzees, rhesus macaques, gibbons and siamangs, and orangutans. More recently, it has also been developed for domesticated species, including dogs, horses, and cats. As with the human F.A.C.S., the non-human F.A.C.S. manuals are available online for each species, with the respective certification tests.

Thus, because of its anatomical basis, the F.A.C.S. can be used to compare facial repertoires across species. A study by Vick and others (2006) suggests that the F.A.C.S. can be modified to take differences in underlying morphology into account. Such considerations enable a comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species result from extremely notable appearance changes. The development of F.A.C.S. tools for different species allows the objective, anatomically based study of facial expressions in communicative and emotional contexts. Furthermore, a cross-species analysis of facial expressions can help to answer interesting questions, such as which emotions are uniquely human.

The Emotional Facial Action Coding System (E.M.F.A.C.S.) and the Facial Action Coding System Affect Interpretation Dictionary (F.A.C.S.A.I.D.) consider only emotion-related facial actions. Examples of these are:

Emotion | Action units
Happiness | 6+12
Sadness | 1+4+15
Surprise | 1+2+5B+26
Fear | 1+2+4+5+7+20+26
Anger | 4+5+7+23
Disgust | 9+15+17
Contempt | R12A+R14A
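As a rough sketch of how the emotion prototypes in the table above could be represented in software, the following Python snippet (with hypothetical helper names; this is an illustration, not part of the E.M.F.A.C.S. specification) stores each prototype as a list of A.U. codes and checks which prototypes are contained in an observed set of A.U.s:

```python
# Emotion prototypes from the EMFACS-style table above. A.U. codes are
# kept as strings because some carry intensity ("5B") or laterality
# ("R12A") annotations.
EMOTION_PROTOTYPES = {
    "happiness": ["6", "12"],
    "sadness":   ["1", "4", "15"],
    "surprise":  ["1", "2", "5B", "26"],
    "fear":      ["1", "2", "4", "5", "7", "20", "26"],
    "anger":     ["4", "5", "7", "23"],
    "disgust":   ["9", "15", "17"],
    "contempt":  ["R12A", "R14A"],
}

def base_au(code: str) -> str:
    """Strip the laterality prefix (R/L/U/A) and intensity suffix (A-E),
    leaving the bare A.U. number, e.g. 'R12A' -> '12', '5B' -> '5'."""
    return code.lstrip("RLUA").rstrip("ABCDE")

def matching_emotions(observed_aus):
    """Return emotions whose full prototype is contained in the observed
    A.U. set, compared on bare A.U. numbers (intensity/side ignored)."""
    observed = {base_au(c) for c in observed_aus}
    return [emotion for emotion, proto in EMOTION_PROTOTYPES.items()
            if {base_au(c) for c in proto} <= observed]

print(matching_emotions(["6", "12", "25"]))  # -> ['happiness']
```

Note that real E.M.F.A.C.S. coding is done by trained human coders; this subset check is only a toy model of the table's structure.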

Computer-generated imagery

F.A.C.S. coding is also used extensively in computer animation, in particular for computer facial animation, with facial expressions being expressed as vectors of A.U.s. F.A.C.S. vectors are used as weights for blend shapes corresponding to each A.U., with the resulting face mesh then being used to render the finished face. Deep-learning techniques can be used to determine the F.A.C.S. vectors from face images obtained during motion-capture acting, facial motion capture, or other performances.
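The blend-shape scheme described above can be sketched as follows. This is a minimal toy model (the mesh, the per-A.U. displacement data, and all names are invented for illustration): a neutral mesh is deformed by adding each A.U.'s displacement mesh scaled by that A.U.'s activation weight.

```python
import numpy as np

# Toy mesh: 4 vertices with (x, y, z) coordinates. Real face meshes
# have thousands of vertices, but the blending math is the same.
n_vertices = 4
neutral = np.zeros((n_vertices, 3))

# One displacement ("delta") mesh per action unit, same shape as the
# neutral mesh. Here AU6 (cheek raiser) and AU12 (lip corner puller)
# get random placeholder deltas; in production these come from an artist
# or a scan.
au_deltas = {
    6:  np.random.default_rng(0).normal(size=(n_vertices, 3)),
    12: np.random.default_rng(1).normal(size=(n_vertices, 3)),
}

def blend(neutral, au_deltas, weights):
    """Linear blend-shape model: neutral + sum_i w_i * delta_i."""
    mesh = neutral.copy()
    for au, w in weights.items():
        mesh += w * au_deltas[au]
    return mesh

# A Duchenne-like smile: both AU6 and AU12 active.
smile = blend(neutral, au_deltas, {6: 0.8, 12: 1.0})
```

With all weights at zero the function returns the neutral mesh unchanged, which is the defining property of a linear blend-shape rig.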

Codes for action units

For clarification, the F.A.C.S. is an index of facial expressions; it does not actually provide any biomechanical information about the degree of muscle activation. Although muscle activation is not part of the F.A.C.S., the main muscles involved in each facial expression have been added here.

Action units (A.U.s) are the fundamental actions of individual muscles or groups of muscles.

Action descriptors (A.D.s) are unitary movements that may involve the actions of several muscle groups (e.g., a forward‐thrusting movement of the jaw). The muscular basis for these actions has not been specified and specific behaviors have not been distinguished as precisely as for the A.U.s.

For the most accurate annotation, the F.A.C.S. suggests agreement from at least two independent certified F.A.C.S. coders.

Intensity scoring

Intensities of the F.A.C.S. are annotated by appending letters A–E (for minimal-maximal intensity) to the action unit number (e.g. A.U. 1A is the weakest trace of A.U. 1 and A.U. 1E is the maximum intensity possible for the individual person).

  • A Trace
  • B Slight
  • C Marked or pronounced
  • D Severe or extreme
  • E Maximum

Other letter modifiers

There are other modifiers present in F.A.C.S. codes for emotional expressions, such as "R", which represents an action that occurs on the right side of the face, and "L" for actions that occur on the left. An action that is unilateral (occurs on only one side of the face) but has no specific side is indicated with a "U", and an action that is bilateral but has a stronger side is indicated with an "A" for "asymmetric".
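Putting the intensity and laterality conventions together, a code such as "R12A" can be decomposed mechanically. The following Python sketch (a hypothetical parser, not an official F.A.C.S. tool) splits an annotated code into its side prefix, A.U. number, and intensity letter:

```python
import re

# An annotated FACS code, per the conventions described above, is an
# optional side/asymmetry prefix (R, L, U, or A), the A.U. number, and
# an optional intensity letter (A-E).
CODE_RE = re.compile(r"^(?P<side>[RLUA])?(?P<au>\d+)(?P<intensity>[A-E])?$")

def parse_code(code: str) -> dict:
    """Decompose a code like 'R12A', 'U14', or '1E' into its parts."""
    m = CODE_RE.match(code)
    if m is None:
        raise ValueError(f"not a valid FACS code: {code!r}")
    return {
        "side": m.group("side"),            # None if bilateral/unmarked
        "au": int(m.group("au")),
        "intensity": m.group("intensity"),  # None if unscored
    }

print(parse_code("R12A"))
# {'side': 'R', 'au': 12, 'intensity': 'A'}
```

So "R12A" reads as: lip corner puller (A.U. 12), right side only, at trace (A) intensity, matching the contempt prototype given earlier.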

List of A.U.s and A.D.s (with underlying facial muscles)

Main codes

A.U. number | F.A.C.S. name | Muscular basis
0 | Neutral face |
1 | Inner brow raiser | frontalis (pars medialis)
2 | Outer brow raiser | frontalis (pars lateralis)
4 | Brow lowerer | depressor glabellae, depressor supercilii, corrugator supercilii
5 | Upper lid raiser | levator palpebrae superioris, superior tarsal muscle
6 | Cheek raiser | orbicularis oculi (pars orbitalis)
7 | Lid tightener | orbicularis oculi (pars palpebralis)
8 | Lips toward each other | orbicularis oris
9 | Nose wrinkler | levator labii superioris alaeque nasi
10 | Upper lip raiser | levator labii superioris, caput infraorbitalis
11 | Nasolabial deepener | zygomaticus minor
12 | Lip corner puller | zygomaticus major
13 | Sharp lip puller | levator anguli oris (also known as caninus)
14 | Dimpler | buccinator
15 | Lip corner depressor | depressor anguli oris (also known as triangularis)
16 | Lower lip depressor | depressor labii inferioris
17 | Chin raiser | mentalis
18 | Lip pucker | incisivii labii superioris and incisivii labii inferioris
19 | Tongue show |
20 | Lip stretcher | risorius with platysma
21 | Neck tightener | platysma
22 | Lip funneler | orbicularis oris
23 | Lip tightener | orbicularis oris
24 | Lip pressor | orbicularis oris
25 | Lips part | depressor labii inferioris, or relaxation of mentalis or orbicularis oris
26 | Jaw drop | masseter; relaxed temporalis and internal pterygoid
27 | Mouth stretch | pterygoids, digastric
28 | Lip suck | orbicularis oris

Head movement codes

A.U. number | F.A.C.S. name | Action
51 | Head turn left |
52 | Head turn right |
53 | Head up |
54 | Head down |
55 | Head tilt left |
M55 | Head tilt left | The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the left.
56 | Head tilt right |
M56 | Head tilt right | The onset of the symmetrical 14 is immediately preceded or accompanied by a head tilt to the right.
57 | Head forward |
M57 | Head thrust forward | The onset of 17+24 is immediately preceded, accompanied, or followed by a head thrust forward.
58 | Head back |
M59 | Head shake up and down | The onset of 17+24 is immediately preceded, accompanied, or followed by an up-down head shake (nod).
M60 | Head shake side to side | The onset of 17+24 is immediately preceded, accompanied, or followed by a side-to-side head shake.
M83 | Head upward and to the side | The onset of the symmetrical 14 is immediately preceded or accompanied by a movement of the head, upward and turned or tilted to either the left or right.

Eye movement codes

A.U. number | F.A.C.S. name | Action
61 | Eyes turn left |
M61 | Eyes left | The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the left.
62 | Eyes turn right |
M62 | Eyes right | The onset of the symmetrical 14 is immediately preceded or accompanied by eye movement to the right.
63 | Eyes up |
64 | Eyes down |
65 | Walleye |
66 | Cross-eye |
M68 | Upward rolling of eyes | The onset of the symmetrical 14 is immediately preceded or accompanied by an upward rolling of the eyes.
69 | Eyes positioned to look at other person | The 4, 5, or 7, alone or in combination, occurs while the eye position is fixed on the other person in the conversation.
M69 | Head or eyes look at other person | The onset of the symmetrical 14 or A.U.s 4, 5, and 7, alone or in combination, is immediately preceded or accompanied by a movement of the eyes or of the head and eyes to look at the other person in the conversation.

Visibility codes

A.U. number | F.A.C.S. name
70 | Brows and forehead not visible
71 | Eyes not visible
72 | Lower face not visible
73 | Entire face not visible
74 | Unscorable

Gross behavior codes

These codes are reserved for recording information about gross behaviors that may be relevant to the facial actions that are scored.

A.U. number | F.A.C.S. name | Muscular basis
29 | Jaw thrust |
30 | Jaw sideways |
31 | Jaw clencher | masseter
32 | [Lip] bite |
33 | [Cheek] blow |
34 | [Cheek] puff |
35 | [Cheek] suck |
36 | [Tongue] bulge |
37 | Lip wipe |
38 | Nostril dilator | nasalis (pars alaris)
39 | Nostril compressor | nasalis (pars transversa) and depressor septi nasi
40 | Sniff |
41 | Lid droop | levator palpebrae superioris (relaxation)
42 | Slit | orbicularis oculi muscle
43 | Eyes closed | relaxation of levator palpebrae superioris
44 | Squint | corrugator supercilii and orbicularis oculi muscle
45 | Blink | relaxation of levator palpebrae superioris; contraction of orbicularis oculi (pars palpebralis)
46 | Wink | orbicularis oculi
50 | Speech |
80 | Swallow |
81 | Chewing |
82 | Shoulder shrug |
84 | Head shake back and forth |
85 | Head nod up and down |
91 | Flash |
92 | Partial flash |
97* | Shiver/tremble |
98* | Fast up-down look |

References

  1. (2022-08-06). "Man's face and mimic language".
  2. (1978). "Facial Action Coding System: A Technique for the Measurement of Facial Movement". Consulting Psychologists Press.
  3. (2002). "Facial Action Coding System: The Manual on CD ROM". A Human Face.
  4. Matsumoto, D., & Willingham, B. (2009). "Spontaneous facial expressions of emotion of blind individuals". Journal of Personality and Social Psychology, 96(1), 1–10.
  5. (2012). "Encyclopedia of Human Behavior". Elsevier/Academic Press.
  6. (May 2007). "Differences between children and adults in the recognition of enjoyment smiles". Developmental Psychology.
  7. Rosenberg, Erika L.. "Example and web site of one teaching professional".
  8. "Facial Action Coding System".
  9. "Facial Action Coding System". http://www.cs.wpi.edu/~matt/courses/cs563/talks/face_anim/ekman.html. Retrieved July 21, 2007.
  10. (10 Mar 2023). "Self-supervised Facial Action Unit Detection with Region and Relation Learning".
  11. (2006). "Baby FACS: Facial Action Coding System for Infants and Young Children". Unpublished monograph and coding manual. New York University..
  12. (November 2007). "Impact of depression on response to comedy: a dynamic facial coding analysis". Journal of Abnormal Psychology.
  13. (2007). "A psychophysical investigation of the facial action coding system as an index of pain variability among older adults with and without Alzheimer's disease". Pain Medicine.
  14. (February 2007). "Classifying chimpanzee facial expressions using muscle action". Emotion.
  15. (December 2010). "Brief communication: MaqFACS: A muscle-based facial movement coding system for the rhesus macaque". American Journal of Physical Anthropology.
  16. (2012). "GibbonFACS: A Muscle-Based Facial Movement Coding System for Hylobatids". International Journal of Primatology.
  17. (2012). "OrangFACS: A Muscle-Based Facial Movement Coding System for Orangutans (''Pongo'' spp.)". International Journal of Primatology.
  18. (2013). "Paedomorphic facial expressions give dogs a selective advantage". PLOS ONE.
  19. (2015-08-05). "EquiFACS: The Equine Facial Action Coding System". PLOS ONE.
  20. (2017-04-01). "Development and application of CatFACS: Are human cat adopters influenced by cat facial expressions?". Applied Animal Behaviour Science.
  21. "Home".
  22. (March 2007). "A Cross-species Comparison of Facial Morphology and Movement in Humans and Chimpanzees Using the Facial Action Coding System (FACS)". Journal of Nonverbal Behavior.
  23. (1983). "EMFACS-7: Emotional Facial Action Coding System. Unpublished manuscript". University of California at San Francisco.
  24. "Facial Action Coding System Affect Interpretation Dictionary (FACSAID)".
  25. Walsh, Joseph. (2016-12-16). "Rogue One: the CGI resurrection of Peter Cushing is thrilling – but is it right?". The Guardian.
  26. (October 2021). "FACSHuman, a software program for creating experimental material by modeling 3D facial expressions". Behavior Research Methods.
  27. "Discover how to create FACS facial blendshapes in Maya {{!}} CG Channel".
  28. (2015). "2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG)".
Info: Wikipedia Source

This article was imported from Wikipedia and is available under the Creative Commons Attribution-ShareAlike 4.0 License. Content has been adapted to SurfDoc format. Original contributors can be found on the article history page.
