Any attempt to define the precise
boundaries of a particular area of the brain, such as Broca’s area or Wernicke’s
area, will involve some serious problems. But we do know that the cytoarchitectonic
areas described by Brodmann provide better anatomical correlates for brain functions
than does the shape of the brain’s convolutions. That said, a cortical area
such as Broca’s cannot be precisely described by reference to Brodmann areas
alone. Though many authors regard Broca’s area as consisting of Brodmann
areas 44 and 45, other authors say it consists only of area 44, still others only
of area 45, and yet others of areas 44, 45, and 47.
Broca’s area
may also include the most ventral portion of Brodmann area 6, as well as other
parts of the cortex lying deep within the lateral sulcus. It is even possible
that only certain parts of these areas are actually dedicated to language.
Language acquisition in humans is
based on our capacities for abstraction and for applying rules of syntax—capacities
that other animals lack. For example, brain-imaging experiments have shown that
Broca’s area becomes active when subjects are learning actual rules of grammar
in another language, but not when they are exposed to fictitious rules that actually
violate the grammar of that language.
These findings
suggest that in Broca’s area, biological constraints interact with experience
to make the acquisition of languages possible. Broca’s area may thus represent
the neuronal substrate of the “universal grammar” shared by all of
the world’s languages.
BROCA’S AREA, WERNICKE’S AREA,
AND OTHER LANGUAGE-PROCESSING AREAS IN THE BRAIN
Broca’s
area is generally defined as comprising Brodmann areas 44 and 45, which lie
anterior to the premotor
cortex in the inferior posterior portion of the frontal lobe. Though both
area 44 and area 45 contribute to verbal fluency, each seems to have a separate
function, so that Broca’s area can be divided into two functional units.
Area
44 (the posterior part of the inferior frontal gyrus) seems to
be involved in phonological processing and in language production as such; this
role would be facilitated by its position close to the motor
centres for the mouth and the tongue. Area 45 (the anterior part
of the inferior frontal gyrus) seems more involved in the semantic aspects of
language. Though not directly involved in accessing meaning, Broca’s area
nonetheless plays a role in verbal memory (selecting and manipulating semantic elements).
Wernicke’s
area lies in the left temporal lobe and, like Broca’s area, is no longer
regarded as a single, uniform anatomical/functional region of the brain. By analyzing
data from numerous brain-imaging experiments, researchers have now distinguished
three sub-areas within Wernicke’s area. The first responds to spoken words
(including the individual’s own) and other sounds. The second responds only
to words spoken by someone else but is also activated when the individual recalls
a list of words. The third sub-area seems more closely associated with producing
speech than with perceiving it. All of these findings are still compatible, however,
with the general role of Wernicke’s area, which relates to the representation
of phonetic sequences, regardless of whether the individual hears them, generates
them himself or herself, or recalls them from memory.
Wernicke’s
area, of which the temporal
planum is a key anatomical component, is located on the superior temporal
gyrus, in the superior portion of Brodmann area 22. This is a strategic location,
given the language functions that Wernicke’s area performs. It lies between
the primary auditory cortex (Brodmann areas 41 and 42) and the inferior parietal
lobule.
This lobule is composed mainly of two distinct regions: caudally,
the angular gyrus (area 39), which itself is bounded by the
visual occipital areas (areas 17, 18, and 19), and dorsally, the supramarginal
gyrus (area 40), which arches over the end of the lateral
sulcus, adjacent to the inferior portion of the somatosensory cortex.
The
supramarginal gyrus seems to be involved in
phonological and articulatory processing of words, whereas the angular
gyrus (together with the posterior cingulate gyrus)
seems more involved in semantic processing. The right angular gyrus is often
active along with the left, revealing that the right hemisphere also contributes
to semantic processing of language.
Together, the angular and supramarginal
gyri constitute a multimodal associative area that receives auditory,
visual, and somatosensory inputs. The neurons in this area are thus very well
positioned to process the phonological and semantic aspects of language that enable
us to identify and categorize objects.
The language areas of the brain
are distinct from the circuits responsible for auditory perception of the words
we hear or visual perception of the words we read. The auditory cortex lets us
recognize sounds, an essential prerequisite for understanding language. The visual
cortex, which lets us consciously see the outside world, is also crucial for
language, because it enables us to read words and to recognize objects as the
first step in identifying them by a name.
There are wide variations in the
size and position of Broca’s area and Wernicke’s area as described
by various authors.
Brain
areas such as these, which perform high-level integration functions, are more
heterogeneous than areas that perform primary functions. This greater heterogeneity
might reflect greater sensitivity to environmental influences and greater plasticity
(ability to adapt to them). The functional organization of language would even
appear to vary within the same individual at various stages of his or her life!
One important idea in Mesulam’s
model is that the function of a brain area dedicated to language is not fixed
but rather varies according to the “neural context”. In other words,
the function of a particular area depends on the task to be performed, because
these areas do not always activate the same connections between them. For instance,
the left inferior frontal gyrus interacts with different areas depending on whether
it is processing the sound of a word or its meaning.
This networked
type of organization takes us beyond the “one area = one function”
equation and explains many language disorders, some of which are
highly specific. For example, some people cannot state
the names of tools or the colours of objects. Other people can explain an object’s
function without being able to say its name, and vice versa.
Brain-imaging studies have shown
the extent to which cognitive tasks such as those involving language correspond
to a complex pattern of activation of various areas distributed throughout the
cortex. That a particular area of the brain becomes activated when the brain is
performing certain tasks therefore does not imply that this area constitutes the
only clearly defined location for a given function. In the more distributed model
of cognitive functions that is now increasingly accepted by cognitive scientists,
all it means is that the neurons in this particular area of the brain are more
involved in this particular task than their neighbours. It in no way excludes
the possibility that other neurons located elsewhere, and sometimes even quite
far from this area, may be just as involved.
Thus, just because the
content of a word is encoded in a particular neuronal assembly does not necessarily
mean that all of the neurons in this assembly are located at the same place in
the brain. On the contrary, understanding or producing a spoken or written word
can require the simultaneous contribution of several modalities (auditory, visual,
somatosensory, and motor). Hence the interconnected neurons in the assembly responsible
for this task may be distributed across the various cortexes dedicated to these
modalities.
In contrast, the neuronal assemblies involved
in encoding grammatical functions appear to be less widely distributed.
It may therefore be that the brain processes language functions in two ways simultaneously:
in parallel mode by means of distributed networks, and in serial mode by means
of localized convergence zones.
MODELS OF SPOKEN AND WRITTEN LANGUAGE FUNCTIONS
IN THE BRAIN
In the 1980s, American
neurologist Marsel Mesulam proposed an alternative to the Wernicke-Geschwind
model for understanding the brain’s language circuits. Mesulam’s model
posits a hierarchy of networks in which information is processed at successive
levels of complexity.
For example, when you perform simple language processes
such as reciting the months of the year in order, the
motor and premotor areas for language are activated directly. But when you
make a statement that requires a more extensive semantic and phonological analysis,
other areas come into play first.
When you hear words spoken, they are
perceived by the primary auditory cortex, then processed by unimodal associative
areas of the cortex: the superior and anterior temporal lobes and the opercular
part of the left inferior frontal gyrus.
According
to Mesulam’s model, these unimodal areas then send their information on
to two separate sites for integration. One of these is the temporal pole of the
paralimbic system, which provides access to the long-term
memory system and the emotional
system. The other is the posterior terminal portion of the superior temporal sulcus,
which provides access to meaning. The triangular and orbital portions of the inferior
frontal gyrus also play a role in semantic processing.
Approximate location of the inferior
frontal gyrus. It is divided into three parts: the opercular, triangular, and
orbital. The opercular and triangular parts of the inferior frontal gyrus form
Broca’s area.
Mesulam does,
however, still believe that there are two “epicentres” for
semantic processing, i.e., Broca’s
area and Wernicke’s area. This new
conception of these two areas is consistent with the fact that they often work
synchronously when the brain is performing a word processing task, which supports
the idea that there are very strong connections between them.
Mesulam’s
concept of epicentres resembles that of convergence zones as
proposed by other authors: zones where information obtained through various sensory
modalities can be combined. This combining process is achieved through the forming
of cell assemblies: groups of interconnected neurons whose synapses have been
strengthened by their simultaneous firing, in accordance with Hebb’s
law. This concept of language areas as convergence zones where neuronal assemblies
are established thus accords a prominent place to epigenetic influences in the
process of learning a language.
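The Hebbian strengthening that forms such cell assemblies can be sketched in a few lines of Python. This is a minimal, invented illustration, not a model from the text: the network size, input pattern, and learning rate are arbitrary choices for the example.

```python
import numpy as np

# Illustrative sketch of Hebb's law: synapses between neurons that fire
# simultaneously are strengthened, so repeated co-activation binds the
# neurons into a "cell assembly".
n = 6                      # number of model neurons
w = np.zeros((n, n))       # synaptic weights, initially zero
eta = 0.1                  # learning rate

# Repeatedly present a pattern in which neurons 0-2 fire together.
pattern = np.array([1, 1, 1, 0, 0, 0], dtype=float)
for _ in range(20):
    w += eta * np.outer(pattern, pattern)   # Hebb: dw[i,j] = eta * x[i] * x[j]
np.fill_diagonal(w, 0)     # no self-connections

# Co-active neurons (0-2) are now strongly interconnected, while the
# others remain uncoupled -- a minimal cell assembly.
```

Note that only correlated activity matters here: the rule never strengthens a synapse between a firing neuron and a silent one, which is why the assembly stays confined to the co-activated group.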
Unquestionably, one of these convergence
zones is the left inferior parietal lobule, which comprises the angular
gyrus and the supramarginal gyrus. In addition to receiving information from
the right
hemisphere, the left inferior parietal lobule also integrates emotional associations
from the amygdala
and the cingulate gyrus.
Some scientists believe that over the course
of evolution, language remained under limbic
control until the inferior parietal lobule evolved and became a convergence zone
that provides a wealth of inputs to Broca’s
area. Some scientists also think that it was the emergence of the inferior
parietal lobule that gave humans the ability to break down the sounds that they
heard so as to make sense of them and, conversely, to express sounds in a sequential
manner so as to convey meaning. In this way, primitive emotional and social vocalizations
would have eventually come to be governed by grammatical rules of organization
to create what we know as modern language.
Lastly, a number of researchers
now reject classic locationist models of language such as Geschwind’s
and Mesulam’s. Instead, they conceptualize language, and cognitive functions
in general, as being distributed across anatomically separate areas that process
information in parallel (rather than serially, from one “language area”
to another).
Even those researchers who embrace this view that linguistic
information is processed in parallel still accept that the primary language functions,
both auditory and articulatory, are localized to some extent.
This concept
of a parallel, distributed processing network for linguistic information constitutes
a distinctive epistemological
paradigm that is leading to the reassessment of certain functional brain imaging
studies.
The proponents of this paradigm believe that the extensive activation
of various areas in the left hemisphere and the large number of psychological
processes involved make it impossible to associate specific language functions
with specific anatomical areas of the brain. For example, the single act of recalling
words involves a highly distributed network that is located primarily in the left
brain and that includes the inferolateral temporal lobe, the inferior posterior
parietal lobule, the premotor areas of the frontal lobe, the anterior cingulate
gyrus, and the supplementary motor area. According to this paradigm, with such
a widely distributed, parallel processing network, there is no way to ascribe
specific functions to each of these structures that contribute to the performance
of this task.
The brain does seem to access meanings
by way of categories that it stores in different physical locations. For example,
if the temporal pole (the anterior end of the temporal lobe) is damaged, the category
“famous people” is lost; if a lesion occurs in the intermediate and
inferior parts of the temporal lobe, the category “animals” disappears.
It also seems that the networks involved in encoding words activate areas in the
motor and visual systems. The task of naming tools activates the frontal premotor
areas, while that of naming animals activates the visual areas. But in neither
case is Broca’s area or Wernicke’s area activated.
Among those scientists who argue
that the brain’s language processing system is distributed across various
structures, some, such as Philip Lieberman, believe that the basal
ganglia play a very important role in language. These researchers further
believe that other subcortical structures traditionally regarded as involved in
motor control, such as the cerebellum
and the thalamus, also contribute to language processing. These views stand in
opposition to Chomsky’s on the exceptional nature of human language and
fall squarely within an adaptationist, evolutionary
perspective.
Even in many species that are quite
distant from humans in evolutionary terms (frogs, for example), the brain is left-lateralized
for the vocalization function.
In chimpanzees, lateralization for the
anatomical areas corresponding to Broca’s and Wernicke’s areas already
exists, even though it does not yet correspond to the language function. And like
the majority of humans, the majority of chimpanzees use their right hand in preference
to their left.
These asymmetries in the other primates represent persuasive
evidence of the ancient phylogenetic origin of lateralization in the human brain.
The
expansion of the prefrontal cortex in humans might in part reflect its
role in the production of language.
Women have the reputation of being
able to talk and listen while doing all sorts of things at the same time, whereas
men supposedly prefer to talk or hear about various things in succession rather
than simultaneously. Brain-imaging studies may now have revealed an anatomical
substrate for this behavioural difference, by demonstrating that language functions
tend to place more demands on both hemispheres in women while being more lateralized
(and mainly left-lateralized) in men. Women also have more nerve fibres connecting
the two hemispheres of their brains, which further suggests that more information
is exchanged between them.
HANDEDNESS, LANGUAGE, AND BRAIN LATERALIZATION
The brain’s anatomical
asymmetry, its
lateralization for language, and the phenomenon of handedness are all clearly
interrelated, but their influences on one another are complex. Though about
90% of people are right-handed, and about 95% of right-handers have their language
areas on the left side of their brains, that still leaves 5% of right-handers
who are either right-lateralized for language or have their language areas distributed
between their two hemispheres. And then there are the left-handers, among
whom all of these patterns can be found, including left-lateralization.
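A back-of-the-envelope calculation shows how these percentages combine. It uses only the figures quoted above; the text gives no corresponding breakdown for left-handers, so none is assumed here.

```python
# Rough population fractions implied by the figures above (illustrative
# arithmetic only; the source gives no breakdown for left-handers).
right_handed = 0.90               # ~90% of people are right-handed
left_lat_given_rh = 0.95          # ~95% of right-handers: language on the left

rh_left_lateralized = right_handed * left_lat_given_rh        # 0.855
rh_atypical = right_handed * (1 - left_lat_given_rh)          # 0.045

print(f"right-handed, left-lateralized for language: {rh_left_lateralized:.1%}")
print(f"right-handed, atypically lateralized:        {rh_atypical:.1%}")
```

So roughly 85% of the general population is both right-handed and left-lateralized for language, while the right-handers with atypical lateralization amount to only a few percent of everyone.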
Some scientists suggest that the left hemisphere’s dominance for language
evolved from this hemisphere’s better control over the right hand. The circuits
controlling this “skilful hand” may have evolved so as to take control
over the motor circuits involved in language. Broca’s
area, in particular, is basically a premotor module of the neocortex and co-ordinates
muscle contraction patterns that are related to other things besides language.
Brain-imaging
studies have shown that several structures involved in language processing are
larger in the left hemisphere than in the right. For instance, Broca’s area
in the left frontal lobe is larger than the homologous area in the right hemisphere.
But the greatest asymmetries are found mainly in the posterior language areas,
such as the temporal
planum and the angular gyrus.
Two other notable asymmetries are the larger protrusions
of the frontal lobe on the right side and the occipital lobe on the left. These
protrusions might, however, be due to a slight rotation of the hemispheres (counterclockwise,
as seen from above) rather than to a difference in the volume of these areas.
These protrusions are known as the right-frontal and left-occipital petalias (“petalias”
originally referred to the indentations that these protrusions make on the inside
of the skull).
The structures involved in producing and understanding
language seem to be laid down in accordance with genetic instructions that come
into play as neuronal
migration proceeds in the human embryo. Nevertheless, the two hemispheres
can remain just about equipotent until language
acquisition occurs. Normally, the language specialization develops in the
left hemisphere, which matures slightly earlier. The earlier, more intense activity
of the neurons in the left hemisphere would then lead both to right-handedness
and to the control of language functions by this hemisphere.
But if the
left hemisphere is damaged or defective, language
can be acquired by the right hemisphere. An excess of testosterone in newborns
due to stress at the time of birth might well be one of the most common causes
of slower development in the left hemisphere, resulting in greater participation
by the right.
This hypothesis of a central role for
testosterone is supported by experiments which showed that in rats, cortical asymmetry
is altered if the rodents are injected with testosterone at birth. This hormonal
hypothesis would also explain why two-thirds of all left-handed persons are males.
Interindividual variations,
which are essential for natural selection, are expressed in various ways in the
human brain. Volume and weight can vary by a factor of two or even more. The brain’s
vascular structures are extremely variable; the deficit caused by an obstruction
at a given point in the vascular system can vary greatly from one individual to
another. At the macroscopic anatomical level, the folds and grooves in the brain
also vary tremendously from individual to individual, especially in the areas
associated with language. Variability in the language areas can also be
observed at the microscopic level, for example, in the synaptic structure of the
neurons in Wernicke’s area.
Interindividual variability is also
expressed in the brain’s functional organization, and particularly in the
phenomenon of hemispheric asymmetry. For instance, some data indicate that language
functions may be more bilateral in women than in men. The percentage of atypical
lateralization for language also varies with handedness: it is considerably higher
among left-handers than among right-handers.
Lastly, as if all this were
not enough, there is also such a thing as intraindividual variability. In the
same individual, a given mental task can sometimes activate different neuronal
assemblies in different circumstances—for instance, when the individual
is performing this task for the first time, as opposed to when he or she has already
performed it many times before.
Many theories have been offered
to explain people’s ability to adapt their use of language to the interpersonal
context. One of these is the theory of mind. According to Premack
and Woodruff (1978), the theory of mind is the ability that lets people ascribe
mental processes to other people, to reason on the basis of these ascribed processes,
and to understand the behaviours that arise from them. Premack and Woodruff were
the first authors to use the term “theory of mind”. They did so in
a study on the ability of chimpanzees to ascribe beliefs and intentions to human
beings. Since the time of this study, the theory of mind has been applied mainly
in studies comparing the cognitive
development of normal children and autistic children, because the latter represent
a population that is known to display deficits in social reasoning from the very
earliest age.
When experimental subjects are asked
to identify the emotional content of recorded sentences that are played back into
only one of their ears, they perform better if these sentences are played into
their left ear (which sends them to the right hemisphere) than into their right
(which sends them to the left hemisphere).
THE RIGHT HEMISPHERE’S CONTRIBUTION
TO LANGUAGE
To follow a conversation,
a written document, or an exchange of witticisms, you must be able not only to
understand the syntax of sentences and the meanings of words, but also to interrelate
multiple elements and interpret them with respect to a given context. While various
types of damage to the left hemisphere produce the many
documented forms of aphasia, right hemisphere damage (RHD) causes a variety
of communication deficits involving the interpretation of context. These deficits
can be divided into two main categories.
The first category of RHD-induced
deficits affects communication indirectly, by disrupting people’s ability
to interact effectively with their environment.
One example of a deficit
that can be caused by RHD is hemineglect, in which an individual
pays no attention to stimuli presented to the various sensory modalities on the
left side of the body.
Drawings
2, 4, 5, and 6 were made by a patient with hemineglect.
The individual may also suffer
from anosognosia: unawareness of such deficits. For instance,
some people who have damage just posterior to the central sulcus in their right
hemispheres cannot even recognize certain parts of their own bodies as being their
own. Thus this type of RHD produces a kind of indifference that is the opposite
of the minimum emotional investment required to establish harmonious communication.
The other major family of RHD-induced deficits
affects communication and cognition directly. These deficits can be grouped under
the heading of pragmatic communication disorders, pragmatics being the discipline
that studies the relationships between language and the way that people use it
in context. Pragmatic disorders can be subdivided into disorders in prosody, discourse
organization, and understanding of non-literal language.
Image of the brain of a woman
who is deciding whether or not certain words rhyme. As can be seen, the right
hemisphere is very active.
Source: Shaywitz and Shaywitz,
Yale Medical School
Prosody
refers to the intonation and stress with which the phonemes of a language
are pronounced. People with aprosodia—RHD that impairs
their use of prosody—cannot use intonation and stress to effectively express
the emotions they actually feel. As a result, they speak and behave in a way that
seems flat and emotionless.
The second category of pragmatic communication
disorders that can be caused by RHD affects the organization of discourse
according to the rules that govern its construction. In some individuals, these
disorders take the form of a reduced ability to interpret the signs that establish
the context for a communication, or the nuances conveyed by certain words, or
the speaker’s intentions or body language, or the applicable social conventions.
With regard to social conventions, for example, people generally do not address
their boss the same way they would their brother, but people with certain kinds
of RHD have difficulty in making this distinction.
Last
but not least among the types of pragmatic communication disorders caused by RHD
are disorders in the understanding of non-literal language. It
is estimated that more than half of the sentences we speak do not express our
meaning literally, or at least not entirely. For instance, whenever we use
irony, or metaphors, or other forms of indirect language, people’s ability
to understand our actual meaning depends on their ability to interpret our intentions.
To understand irony, for example, people must apply two levels of awareness,
just as they must do to understand jokes. First, they must understand the speaker’s
state of mind, and second, they must understand the speaker’s intentions
as to how his or her words should be construed. Someone who is telling a joke
wants these words not to be taken seriously, while someone who is speaking ironically
wants the listener to perceive their actual meaning as the opposite of their literal
one.
Metaphors too express an intention that belies a literal interpretation
of the words concerned. If a student turns to a classmate and says “This
prof is a real sleeping pill”, the classmate will understand the implicit
analogy between the pill and the prof and realize that the other student finds
this prof boring. But someone with RHD that affects their understanding of non-literal
language might not get this message.
Lastly, the various
indirect ways that we commonly use language in everyday life can cause problems
for people with RHD. In such cases, the speaker’s actual intention lies beneath
the literal meaning of the statement. For example, someone who says “I wonder what
the time is now” is indirectly asking someone to tell them the time,
but a person with RHD may not understand that.
Though
the left hemisphere is still regarded as the dominant hemisphere for language,
the role of the right hemisphere in understanding the context in which language
is used is now well established. We know that in the absence of the left hemisphere
(for example, when Wada’s
test is performed), the right hemisphere can produce some rudimentary language.
But lesion studies have shown that the right hemisphere’s role in language
appears to be far wider—so much so that it is now more accurate to think
of the two hemispheres’ language specializations not as separate functions,
but rather as a variety of abilities that operate in parallel and whose interaction
makes human language in all its complexity possible.