Tool Module: Sign Language

When
the vast majority of us think about language, we inevitably think of words spoken 
by the human voice. Be it our mother tongue, our second language, or a language 
that we don’t know at all, our first and primal experience of that language 
is through our sense of hearing and our faculty of speech. Then, a bit later, 
we learn to transpose these sounds into writing. Learning a new language thus 
means learning to pronounce new sounds, worrying about our accents, and trying 
to decode sequences of sounds that at first seem totally undifferentiated. For 
millennia, people have thought about spoken and written language in this way. 
We analyze it, we estheticize it, we perfect its use for persuasion or seduction. 
Literary authors and critics explore it endlessly, while linguists attempt to 
dissect it, and neurobiologists look for the parts of the brain that become active 
when we hear it, understand it, or speak it. In fact, ever since the Sophists 
of Ancient Greece, philosophers have inquired into this human ability to express 
thoughts in words. Even today, we still tend to accept the Platonic concept that 
language, conveyed by speech, is noble, pure, spiritual, and somehow separate 
from the body, the seat of all that is base and vile, the origin of all the impulses 
that Western culture tries to repress.

One therefore has to wonder what
all these philosophers would think if they could see entire languages being expressed 
through movements of the body: the sign languages that are now used by people 
all around the world every day. One thing is certain: this sight would make it
hard to defend the time-honoured view of language as completely dissociated
from the body.

Sign languages now represent the cornerstone of the cultural
identities of deaf communities throughout the world. These languages are not universal. 
Each community has its own, and each of these languages, like any other language, 
has a developmental history that is inextricably linked to the social and cultural 
circumstances of the people who use it. Nor are sign languages mutually intelligible; 
someone who uses Quebec Sign Language cannot understand someone who uses Japanese 
Sign Language. Yet Quebec Sign Language is closer to Japanese Sign Language than 
to spoken Quebec French, for example. Thus, sign languages are full-fledged languages 
that, just like oral language, enable their users to discourse on philosophy and 
to create literature.

Sign languages are visual/spatial languages, that
is, they are designed to be seen, and their grammar is structured in space. Though 
they are not perfectly equivalent, the signs in a sign language can be regarded 
as the counterparts of words in an oral language. Each sign consists of a complex 
movement of the hands and can be described by several parameters: the shape of 
the hand, the point of articulation, the orientation of the palm, movement, the 
arrangement of the hands in relation to each other, and rhythm. In linguistic 
descriptions of sign languages, this movement of the hands is called “manual 
behaviour”.

In addition to hand movements, signing requires non-manual
behaviours, in particular the position of the head and trunk, facial expression, 
and the direction in which the signer is looking. For example, in Quebec Sign 
Language, the sign [SPORT] (meaning “sports”) is distinguished from 
the sign [DRÔLE] (meaning “funny”) solely by the configuration 
of the face: for [SPORT], the eyebrows are in a neutral position and the mouth 
is slightly rounded; for [DRÔLE], the eyebrows are raised and the mouth 
is smiling. In both cases, the hand movement is the same: opening and closing 
the index and middle fingers together in front of the nose while keeping the thumb 
extended and the ring finger and little finger folded under.      
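
To see how such a parametric description can be made concrete, the two signs can
be sketched as simple records of their parameters. The short Python sketch below
is purely illustrative: the parameter names and values are informal paraphrases
of the description above (and only cover some of the parameters mentioned), not
an official transcription system for Quebec Sign Language.

    from dataclasses import dataclass, asdict

    @dataclass(frozen=True)
    class Sign:
        """A toy representation of a sign as a bundle of parameters."""
        handshape: str   # shape of the hand (manual)
        location: str    # point of articulation (manual)
        movement: str    # hand movement (manual)
        eyebrows: str    # facial configuration (non-manual)
        mouth: str       # facial configuration (non-manual)

    # Informal encodings of the two signs described above.
    SPORT = Sign(
        handshape="index and middle fingers extended, thumb out, other fingers folded",
        location="in front of the nose",
        movement="open and close the index and middle fingers",
        eyebrows="neutral",
        mouth="slightly rounded",
    )
    DROLE = Sign(
        handshape="index and middle fingers extended, thumb out, other fingers folded",
        location="in front of the nose",
        movement="open and close the index and middle fingers",
        eyebrows="raised",
        mouth="smiling",
    )

    # The two signs form a minimal pair: they differ only in non-manual parameters.
    differences = {field: (value, asdict(DROLE)[field])
                   for field, value in asdict(SPORT).items()
                   if value != asdict(DROLE)[field]}
    print(differences)
    # {'eyebrows': ('neutral', 'raised'), 'mouth': ('slightly rounded', 'smiling')}

Running the sketch prints only the two facial parameters, which is exactly the
contrast that distinguishes [SPORT] from [DRÔLE].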
[Figure: The word “funny” in Quebec Sign Language]
[Figure: The word “sports” in Quebec Sign Language]

With some signs, such
as the Quebec Sign Language signs meaning “to sleep” and “to
drink”, the connection between the sign and the action or object that it 
represents is illustrative, so that even someone who does not know the sign language 
may recognize it. This aspect of a sign is called its iconicity. The degree of 
iconicity can vary. For example, the sign for “psychologist” in both 
American Sign Language and Quebec Sign Language consists of a representation of 
the Greek letter “psi”, but this iconic origin is less obvious and usually
goes unrecognized by people unfamiliar with these languages. In still
other cases, the sign is completely arbitrary and no direct link can be made between 
it and the thing that it designates.

The grammar of sign languages is constructed
in space, and someone who is telling a story in sign language must make it visually 
coherent. Here is one simple example. When we say that a car passes over a bridge, 
the rules of English syntax require us to mention the car first and the bridge 
second. The rules of French syntax require the same, so that a French-speaking 
Quebecer will say: “La voiture passe sur le pont.” But someone signing 
the same message in Quebec Sign Language must first make the sign designating 
the bridge and position it in space, and then make the sign for the car, in motion, 
passing over that position. Not only that, but the signer must also describe the
car’s movement qualitatively: faster if the driver has the pedal to the floor,
or zig-zagging from left to right if the driver is drunk and shouldn’t be behind
the wheel.

Brain activity in signers

The study of
the brain areas involved in signing has contributed greatly to our understanding 
of the relationship between the brain and language. One first, rather surprising 
finding is that contrary to what one might have supposed, it is generally not 
the right hemisphere that plays the dominant role in processing the visual/spatial 
code of a sign language, but rather the left, just as with spoken language. Brain-imaging 
studies have shown that the two large areas involved in language processing in 
the left hemisphere—Broca’s area and Wernicke’s area—are 
activated in exactly the same way in people who communicate with a sign language 
as in people who use a spoken language. And as would follow logically, if hearing 
persons do not understand sign language, their Broca’s and Wernicke’s 
areas do not show any particular activity when they watch deaf people signing. 

These same studies did, of course, show differences between the primary
auditory perception areas for language in the hearing subjects and the primary 
visual areas in the signers.

Studies of left-hemisphere lesions in deaf
persons who used sign language pointed to the same conclusion regarding dominance 
for language. In these subjects, left-hemisphere lesions that included Broca’s 
area resulted in agrammatism and a reduction in the number of sentences produced, 
closely matching the classic model for expressive aphasia. When the lesions instead 
affected Wernicke’s area in the left temporal lobe, the symptoms were very 
much like those of receptive aphasia in hearing persons: the subjects displayed 
significantly impaired understanding, while producing signs that were abundant 
but imprecise.

One remarkable finding is that when deaf people who know sign language have a
right-hemisphere lesion, they tend to ignore the left side of the space in front
of them most of the time, but not when they are signing. This
lends further support to the idea that their brains treat space differently when 
using it to express grammatical relationships in sign language rather than ordinary 
visual/spatial relationships.

The left hemisphere of the human brain thus
does indeed seem to specialize in symbolic representation and communication—i.e., 
language—in general, regardless of whether the channels for receiving and 
expressing language are auditory or visual. Rather than being programmed solely 
for spoken language, humans thus appear to be programmed for language in general, 
be it verbal or gestural.

One last interesting observation along these
same lines: when deaf couples who sign have babies who are congenitally deaf, 
these babies babble with their hands, making gestures that are the precursors 
of actual signs, just the way hearing babies babble with their vocal apparatus! 

To find a course in sign language:

Courses in sign language are offered at specialized training centres for the deaf
and at some universities. You can also get information about sign-language courses
from your local Association of the Deaf.

 Thanks to Julie Châteauvert for her contributions 
to this module.
 
 
 