When are facial gestures in a signed language linguistic?

Elisabeth Engberg-Pedersen, Department of Nordic Studies and Linguistics, University of Copenhagen


Signers of Danish Sign Language (DTS) use two clearly identifiable nonmanual signals involving muscles around the mouth. For g-a (guarded assessment), signers typically raise their chins, the corners of the mouth appear to be lowered, and the lips may protrude. In n-c (nose consent), the muscles of the cheek and on the side of the nose contract and raise the upper lip, and the brows may be lowered. The two signals have different distributions, but they also overlap distributionally. Both can be used in responses with and without a manual sign.

Grammatical nonmanual signals in signed languages are described as co-extensive with the manual signals in their semantic scope (Baker-Shenk & Cokely 1980). However, n-c is brief and cannot be extended, whereas g-a can be extended in time, and although it has a clear start, it may linger after the signer has finished the signs in its scope. N-c appears to be a nonmanual equivalent of the manual sign YES used for consent and to be an areal feature of Northern European signed languages. G-a signals skepticism, but can be used with the gesture palm-up (Müller 2004) as a tag to encourage agreement from one’s conversation partner (cf. tags like English isn’t it).

The muscles involved in making g-a and n-c are used in non-linguistic facial expressions signaling disgust and anger (Ekman & Friesen 2003), i.e., negative feelings. Comparing human and chimpanzee facial expressions, Vick et al. describe Chin Raiser as the pushing of “the chin and lower lip upwards, often causing it to protrude… As the center of the lips is pushed upwards, the mouth corners appear to be pulled downwards” (2007: 12). In chimpanzees, this particular signal typically occurs in pouts in “contexts of embraces, invitations, play, approaches, and in response to aggression… Therefore, pouts may represent a need for contact, or reassurance, and physical affinity” (Parr et al. 2007: 177).

In my presentation, I will discuss the signals’ possible origins, linguistic status, and routes into DTS (Wilcox 2004) and their status in relation to Crasborn et al.’s (2008) classification of mouth actions in signed languages.



Crasborn, Onno, Els van der Kooij, Dafydd Waters, Bencie Woll & Johanna Mesch. 2008. Frequency distribution and spreading behavior of different types of mouth actions in three sign languages. Sign Language & Linguistics 11(1), 45–67.

Ekman, Paul & Wallace V. Friesen. 2003. Unmasking the face: A guide to recognizing emotions from facial expressions. Cambridge, MA: Malor Books.

Vick, Sarah-Jane, Bridget M. Waller, Lisa A. Parr, Marcia C. S. Pasqualini & Kim A. Bard. 2007. A cross-species comparison of facial morphology and movement in humans and chimpanzees using the Facial Action Coding System (FACS). Journal of Nonverbal Behavior 31, 1–20.

Wilcox, Sherman. 2004. Gesture and language: Cross-linguistic and historical data from signed languages. Gesture 4(1), 43–73.
