From thought to language

Original modules
Tool Module: Chomsky’s Universal Grammar

 

Phonetic notation uses square brackets [ ] while phonological notation uses slashes / /. For example, in English, the morpheme "bed" would be represented phonetically as [bed] and phonologically as /bed/. Likewise, the morpheme "bet" would be represented as [bet] phonetically and /bet/ phonologically. This is an example of what is called a minimal opposition or a minimal pair. These two morphemes are recognized as different in English, but might be perceived as identical in some other language that does not differentiate between what English-speakers hear as the sounds of "d" and "t".

 

“As soon as a living being has a memory and a plan, he can give a meaning to what he perceives. Meaning therefore does not reside in things themselves. It resides in the living being who uses things to imbue them with meaning.”

“When we cast a light on one piece of the world, we extinguish everything on which we have not cast that light. This is how we create the things we say: to speak is to create a piece of the world; it is to mould it, to make it, and to make it live.”

- Boris Cyrulnik


The content of a spoken message depends not only on the factual meaning of the words and the prosody (intonation) with which they are spoken, but also on non-linguistic codes such as hand gestures and other body movements. Think of just how well mimes, for example, manage to communicate with non-linguistic codes alone.

That is why, if someone speaks a sentence to you over the telephone, it will be less rich in meaning than if they had spoken that same sentence to you in person. That is also why the same sentence, in written form, will have even less potential meaning than if you heard it over the telephone. The tendency of many people to use “smilies” in their e-mails represents an attempt to restore the prosodic dimension to their communications.

THE CONNECTIONS BETWEEN THOUGHT AND LANGUAGE

When someone is speaking to you, they do not separate each word from the next with a pause, like the spaces between written words on a page, yet your brain can still recognize each word individually. This ability is truly remarkable. (To realize just how hard it really is to isolate the components of spoken language, just try listening to someone speaking a language that you don’t know at all.)

In linguistics, the smallest unit of meaning, corresponding roughly to a word, is called a morpheme. To recognize words or morphemes from their sounds, the brain breaks them down into phonemes: the smallest units of sound that are used to construct the words of a given language.

There are two different disciplines that study the role of sounds in language. These disciplines are historically related to each other, but one is more relevant than the other for understanding the language functions of the brain.

The first of these disciplines is phonetics, which describes and classifies the sounds of all languages according to the way that these sounds are physically produced by our organs of speech. In phonetics, the sounds of words are represented by symbols placed between square brackets: [ ] (see the first sidebar on this page for an example). Examples of studies in phonetics would include a comparison of the use of different sounds in different languages, or a description of how the sounds in a given language have evolved.

The second discipline is phonology, which historically grew out of phonetics. Phonology is less concerned with describing the sounds of language precisely than with examining the sound structure specific to a given language. In phonology, what matters is not the sound itself, but rather the way that it contrasts with the other sounds in that language’s phonological system. Phonological descriptions of words are represented by symbols between slashes: / / (see the first sidebar on this page for an example). It is therefore phonological analysis that lets us study the cerebral substrate of how the words, sentences, and meanings of a given language are encoded and decoded.
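To make the notion of phonological contrast more concrete, here is a minimal sketch in Python (the transcriptions are simplified stand-ins, not real IPA symbols): two words form a minimal pair when their phonological forms differ by exactly one phoneme, which is what tells us that the two contrasting sounds are distinct phonemes in that language.

# A minimal sketch: minimal-pair detection over simplified transcriptions.
def is_minimal_pair(a, b):
    """True if the phoneme sequences a and b differ in exactly one position."""
    if len(a) != len(b):
        return False
    return sum(1 for x, y in zip(a, b) if x != y) == 1

# /bed/ and /bet/ differ only in their final phoneme, so they form a minimal
# pair: English therefore treats /d/ and /t/ as distinct phonemes.
print(is_minimal_pair(["b", "e", "d"], ["b", "e", "t"]))   # True
print(is_minimal_pair(["b", "e", "d"], ["p", "a", "t"]))   # False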

The ultimate function of language is to convey meaning. Once a word has been recognized from its phonemes, its meaning will depend on several factors: what it designates in the world, the context in which it is being spoken, and, most important, the way that it fits together with its neighbours in the sentence—that is, its syntax.

The order of the words in a sentence can be extremely important. The English sentences “The man eats the alligator.” and “The alligator eats the man.” contain the same words but mean two very different things. Only the order of the two nouns, and hence their relationship to the verb, has changed.

In every language, there are certain words that mean nothing in themselves but perform a syntactic function in the sequence of words that constitutes a sentence. The usefulness of these “relational” words, such as and, the, a, and with, becomes especially clear when they are left out in contexts where space is at a premium, such as classified advertisements or newspaper headlines, sometimes causing inadvertently comic ambiguities (“Bicyclist struck by car in fair condition.” “Nuns forgive break-in, assault suspect.”).

The American linguist Noam Chomsky demonstrated the importance of syntax in natural languages with his famous sentence “Colorless green ideas sleep furiously.” Obviously, this sentence has no real meaning, but its syntax is so correct that we try to find one anyway. Such observations led Chomsky to formulate his theory of universal grammar (follow the Tool module link to the left). According to this theory, syntax is independent of meaning, of context, of the information stored in the speaker’s memory, and of what the speaker wants to communicate. Chomsky’s approach is disputed, however, by linguists such as George Lakoff, who instead regard conceptual metaphors based on our bodily experiences as the central feature of language.
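To illustrate how syntax can be checked independently of meaning, here is a minimal sketch in plain Python (a toy grammar invented for this example, not Chomsky’s actual formalism): the grammatical word order of “Colorless green ideas sleep furiously” is accepted, while the same words in scrambled order are rejected, even though neither sentence means anything.

# A minimal sketch: a toy context-free grammar and a naive recursive parser.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["Adj", "NP"], ["N"]],
    "VP": [["V", "Adv"], ["V"]],
}
LEXICON = {
    "colorless": "Adj", "green": "Adj",
    "ideas": "N",
    "sleep": "V",
    "furiously": "Adv",
}

def parses(symbol, words):
    """Return True if the word sequence can be derived from the symbol."""
    if symbol in LEXICON.values():              # part-of-speech symbol
        return len(words) == 1 and LEXICON.get(words[0]) == symbol
    for rule in GRAMMAR.get(symbol, []):
        if len(rule) == 1:
            if parses(rule[0], words):
                return True
        else:                                   # two-symbol rule: try every split
            for i in range(1, len(words)):
                if parses(rule[0], words[:i]) and parses(rule[1], words[i:]):
                    return True
    return False

print(parses("S", "colorless green ideas sleep furiously".split()))   # True
print(parses("S", "furiously sleep ideas green colorless".split()))   # False

The parser consults only word categories and word order, never meaning, which is exactly the point the example sentence makes.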

Be that as it may, the words that we know form a mental lexicon where each word can evoke several different meanings depending on the context in which it is spoken. When we speak, each word is thus related to several other words with which it shares connected meanings. This is what enables the brain to construct categories.

Categorization is one of the most important aspects of language. Without our ability to group similar objects into categories, language would be an infinite set of nouns designating specific objects. In other words, it would be impossible.

The most valuable thing about categorization is that it lets us create concepts: general, abstract mental representations. And concepts in turn make language a tool that expands our cognitive capacities, so that we can apply them more effectively and better understand the world.
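As a small, concrete picture of what a mental lexicon and its categories might look like, here is a minimal sketch (the entries and category labels are invented for illustration): each word points to several context-dependent senses, and shared category labels are what allow many specific words to be grouped under a single concept.

# A minimal sketch: a tiny "mental lexicon" with senses and category labels.
LEXICON = {
    "bank":  [{"sense": "money institution", "category": "place"},
              {"sense": "side of a river",   "category": "landform"}],
    "lion":  [{"sense": "large African cat", "category": "animal"}],
    "tiger": [{"sense": "large Asian cat",   "category": "animal"}],
}

def concept(category):
    """Group every word that has at least one sense in the given category."""
    return sorted(word for word, senses in LEXICON.items()
                  if any(s["category"] == category for s in senses))

print(concept("animal"))   # ['lion', 'tiger'] -> one concept, many instances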


Many experiments have shown that language enables this transformation of information into abstract representations. For example, if a group of people listen to several sentences that form a paragraph, most of these people will then be able to state the general idea of the paragraph in their own words, but not to repeat the exact sentences they heard. It is as if two transformations have taken place. In the first, the people took the sentences that they heard and represented them mentally in more abstract, consolidated terms, which seem easier to memorize. In the second, the people retrieved this more abstract representation and converted it into their own words.

Scientists are still debating whether the meaning of a word and the characteristics of the real-world object that it designates are stored in the same location in the brain. Some researchers believe that there is a single storage site for every individual concept, idea, or object. For instance, all of the characteristics of lions would be stored together in one area of the brain, and the audible and written forms of the word “lion” would be stored in other areas that were connected to the one where this animal’s characteristics were stored.

But other scientists believe that information is processed in the brain in a much more distributed fashion: the lion’s smell, roar, and physical appearance would thus be stored in many areas of the brain that were closely interconnected. Thus, when you heard or read the word “lion”, all of these areas would be activated simultaneously.

In addition to its fundamental role in communication, language also gives us a powerful internal mechanism for retrieving, critiquing, and changing our thoughts. This internal mode of communication enables us to perform more complex mental operations in both the logical and the affective spheres. And the ability to predict consequences from these conceptual operations provides a definite adaptive advantage to a social species such as our own.

 

Brain-imaging studies have shown that within the brain, language is organized according to semantic categories and not according to words. For example, depending on whether you ask the participants in an experiment to name people, or animals, or tools, you will observe increased activity in different regions of the temporal cortex. This form of organization also explains why relatively small lesions in the left temporal lobe sometimes result in the loss of the words that designate one particular category of objects, but not others.

    

Original modules
Tool Module: Chomsky’s Universal Grammar
Experiment Module: Attempts To Teach Language to Primates
Tool Module: Different Types of Bilingualism

When a child is one year old, the temporal lobe that includes Wernicke’s area is still very immature and has scarcely more than 50% of the surface area that it will have when the child becomes an adult. Moreover, the central part of this lobe, which in adults is associated with lexical storage, is scarcely 20% of its adult size. The same thing goes for the inferior parietal lobule, which is connected to Wernicke’s area and enables words to be assigned to visual, auditory, and somatosensory events. The neurons of this lobule show relatively little myelinization during the first year of life, and its surface is less than 40% of an adult’s.

By the age of about 20 months, when the child can speak nearly 100 words and understand twice as many, the surface of the temporal lobe has grown to about 65% of an adult’s. At age 30 months, when the child has mastered about 500 words, its temporal lobe is 85% of the size of an adult’s.

The maturation of Wernicke’s area thus seems to be one factor that contributes to the growth of a child’s lexical capacities.


Procedural (implicit) memory for language depends on the integrity of the cerebellum, the corpus striatum, and other basal ganglia, as well as on a particular area in the left perisylvian cortex. Implicit language skills also seem to call on the limbic system, which governs emotions and motivations.

Declarative (explicit) memory, on the other hand, depends on the integrity of the hippocampus, the medial temporal lobe, and large areas of the associative cortex in both hemispheres.


The neuronal phenomenon of the activation threshold is not associated with any particular system of the brain but affects all the higher functions, including language skills. The neural substrate of any mental representation must receive a certain frequency of nerve impulses to reach its activation threshold, that is, to generate action potentials of its own. Whenever someone uses a particular word or syntactic construction, their activation threshold for it is lowered and its subsequent reuse is facilitated. Conversely, if a neural circuit remains inactive, its activation threshold gradually increases. The same effects are also seen at the molecular level, in two phenomena that play a role in the activation threshold: long-term potentiation (LTP) and long-term depression (LTD).
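The logic described above can be summarized in a minimal sketch (the numerical values are arbitrary and chosen only for illustration): each use of a word lowers its activation threshold, making reuse easier, while prolonged disuse gradually raises the threshold again.

# A minimal sketch: a usage-dependent activation threshold for one word.
class WordCircuit:
    def __init__(self, threshold=1.0, floor=0.2, ceiling=2.0,
                 facilitation=0.3, decay=0.05):
        self.threshold = threshold          # current activation threshold
        self.floor = floor                  # lowest possible threshold
        self.ceiling = ceiling              # highest possible threshold
        self.facilitation = facilitation    # drop per use (LTP-like effect)
        self.decay = decay                  # rise per idle step (LTD-like effect)

    def use(self):
        """Using the word lowers its activation threshold."""
        self.threshold = max(self.floor, self.threshold - self.facilitation)

    def idle(self):
        """Disuse gradually raises the threshold again."""
        self.threshold = min(self.ceiling, self.threshold + self.decay)

word = WordCircuit()
for _ in range(3):
    word.use()                              # frequent use
print(round(word.threshold, 2))             # 0.2 -> the word is easy to reactivate
for _ in range(10):
    word.idle()                             # prolonged disuse
print(round(word.threshold, 2))             # 0.7 -> reactivation is harder again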

 

LEARNING TO SPEAK A FIRST AND SECOND LANGUAGE

Research by authors such as the American linguist Noam Chomsky has shown that for human language to be as sophisticated as it is, the brain must contain some mechanisms that are partly preprogrammed for this purpose (follow the Tool module link to the left). Babies are born with a language-acquisition faculty that lets them master thousands of words and complex rules of grammar in just a few years. This is not the case for our closest primate cousins, who have never succeeded in learning more than a few hundred symbols and a few simple sentences (follow the Experiment module link to the left).

Until babies are about one year old, they cannot utter anything but babble. This limitation is due to the immaturity of their temporal lobe, which includes Wernicke’s area. This area, by associating words with their meanings, is directly involved in the memorization of the signs used in language. The acquisition of vocabulary during the first years of life seems to closely track the maturation of Wernicke’s area, which eventually enables adults to maintain a vocabulary of some 50,000 to 250,000 words.

Our ability to retain such an impressive number of words involves two different types of memory, depending on whether the language in question is our mother tongue or a second language that we learned later in life (follow the Tool module link to the left).

To learn our mother tongue, we rely on procedural memory (also known as implicit memory), the same kind that is involved when we learn skills that become automatic, like riding a bike or tying our shoelaces. Because we are so immersed in our mother tongue, we end up using it just as automatically, without even being aware of the rules that govern it.

In contrast, to learn a second language, we must usually make a conscious effort to store the vocabulary and the grammatical rules of this language in our memories. When learned in this way, a second language depends on declarative memory (also known as explicit memory). Sometimes, however, people learn a second language “in the street” without having to pay much attention. In this case, the learning process is much the same as it was for their first language and, as in that case, is handled by procedural memory.

In fact, the more the method used to teach a second language is based on communicating and practicing, the more the students who learn it will rely on procedural memory when using it. Conversely, the more formal and systematic the method used to teach the second language, the more the students will rely on declarative memory.


Learning and using a language may thus involve applying either implicit linguistic skills or explicit metalinguistic knowledge. Because each of these skill sets is supported by different structures in the brain (see sidebar), language disorders can affect people’s first languages and second languages selectively. Following brain injuries, bilingual people may selectively lose the use of one of their two languages. But the language that they retain is not necessarily their mother tongue, nor is it necessarily the language that they spoke most fluently before their accident.

Some types of brain damage can make people amnesic without affecting their ability to speak their mother tongue (which depends on procedural memory). But other types can cause serious problems in someone’s automatic use of speech without affecting their ability to remember a language that they learned consciously (using declarative memory). Other observations have been made that reflect this same distinction. For example, some people who have aphasia seem to recover their second language more successfully than their first, whereas some people with amnesia lose access to their second language completely. Alzheimer’s patients retain those language functions based on procedural memory but lose those, such as vocabulary, that are based on declarative memory.

But even in one’s mother tongue, not all aspects of language rely on procedural memory. It is believed, for example, that the lexicon for a person’s first language, which consists of the association of groups of phonemes with meanings, may have close connections with declarative memory. Vocabulary thus seems to constitute a special aspect of language: the great apes are capable of learning a large number of symbols related to words (follow the Experiment module link to the left); “wild children” who are deprived of language at the start of their lives can also learn many words, but comparatively little syntax; and people who have anterograde amnesia, though they can acquire new motor or cognitive skills, cannot learn new words.

While the lexicon for a person’s first language depends on declarative memory, which involves the parietal and temporal lobes, the grammar of this language depends on procedural memory, which involves the frontal lobes and the basal ganglia. Procedural memory is used for unconscious learning of motor and cognitive skills that involve chronological sequences of operations. This description clearly applies to grammatical operations, which consist in sequencing the lexical elements of a language in real time.

Broca’s area, the supplementary motor area, and the premotor cortex of the left hemisphere, all of which participate in the production of language, are activated when you repeat words mentally without saying them out loud. In this way, you continually refresh your “phonological buffer” and thus increase the time that you can hold this information in your verbal memory. These frontal areas of the left hemisphere are therefore involved in actively maintaining information in working memory.
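This rehearsal loop can be pictured with a minimal sketch (the lifetime value is arbitrary): items placed in a phonological buffer fade after a few moments unless silent repetition refreshes them.

# A minimal sketch: a phonological buffer refreshed by mental rehearsal.
class PhonologicalBuffer:
    def __init__(self, lifetime=3):
        self.lifetime = lifetime            # ticks an item survives unrehearsed
        self.items = {}                     # word -> remaining ticks

    def store(self, word):
        self.items[word] = self.lifetime

    def rehearse(self):
        """Silently repeating the words resets their remaining lifetime."""
        for word in self.items:
            self.items[word] = self.lifetime

    def tick(self):
        """One moment passes; unrehearsed items fade away."""
        self.items = {w: t - 1 for w, t in self.items.items() if t > 1}

buffer = PhonologicalBuffer()
for word in ["clown", "boy", "girl"]:
    buffer.store(word)
buffer.tick(); buffer.tick()
buffer.rehearse()                           # refresh before the items fade
buffer.tick(); buffer.tick()
print(list(buffer.items))                   # ['clown', 'boy', 'girl'] still held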

Some studies of children with reading problems have shown that these problems were actually due to difficulties in understanding syntax, which were in turn caused by deficiencies in the children’s working memory.

It is also working memory that lets you understand especially long or complex sentences such as “The clown who is carrying the little boy kisses the little girl.” More specifically, working memory lets you keep this verbal information in mind long enough for the sequence of words in the sentence to assume a meaning.


Simultaneous interpretation may well be the most complex verbal task imaginable. To take a speech that is being delivered in one language and simultaneously translate it into another language orally, the interpreter must understand the words that the speaker is saying, hold them in working memory while encoding them into the other language, then speak them in this other language. At the same time, the interpreter must continue listening to the speaker so as to be able to keep repeating this process as the speech goes on.
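The process can be pictured as a small concurrent pipeline, as in the minimal sketch below (the translate() function is a made-up stand-in for the interpreter’s actual re-encoding into the second language): incoming segments are held briefly in a limited working-memory buffer while earlier segments are translated and spoken, and listening never stops.

# A minimal sketch: listening, holding, and re-speaking run as a pipeline.
from queue import Queue
from threading import Thread

def translate(segment):                     # placeholder for re-encoding
    return f"[{segment} rendered in language B]"

def listen(speech, buffer):
    for segment in speech:                  # keep listening as the speech goes on
        buffer.put(segment)                 # hold each segment in working memory
    buffer.put(None)                        # the speech has ended

def interpret(buffer):
    while (segment := buffer.get()) is not None:
        print(translate(segment))           # speak it in the other language

buffer = Queue(maxsize=3)                   # working memory is limited
speech = ["greeting", "main argument", "example", "conclusion"]
listener = Thread(target=listen, args=(speech, buffer))
listener.start()
interpret(buffer)                           # translation proceeds while listening
listener.join()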

Link: FIVE PRINCIPLES AND FIVE SKILLS FOR TRAINING INTERPRETERS
Link: La pédagogie de la traduction simultanée

    


Some famous dyslexics include Einstein, Rodin, Edison, Pasteur, Andersen, and Leonardo da Vinci. In fact, throughout his life, Da Vinci wrote in “mirror script”. When you think about what these geniuses achieved, it almost makes you wish you were dyslexic.

Research: Léonard de Vinci
Research: Renaissance Man
Link: Les dyslexiques, des gens doués?
Link: L'écriture de Léonard de Vinci
Link: Famous dyslexics

One of the rarest and strangest types of aphasia is foreign-accent syndrome, of which fewer than 20 cases have been reported over the past 80 years. From one day to the next, people who come down with this syndrome suddenly start speaking with what sounds like a strong foreign accent. In one case, a woman who was born in Boston, had lived her entire life there, had never travelled overseas, and had never learned a language other than English woke up one morning speaking it as if her first language were French!

But a subsequent acoustic analysis of her speech showed that she was not really speaking with a French accent. In reality, she had developed a speech-production disorder that resulted in an acoustical spectrum similar to that of an American comedian imitating a French accent.

Small lesions in various regions of the brain might be the underlying cause of the subtle changes in pronunciation (longer syllables, different tonalities, and so on) that create the impression of a foreign accent.

The existence of foreign-accent syndrome does not mean that there might be an “accent zone” in the brain, but it does give us some indications about the way that language is produced.

Link: 'Foreign accent syndrome' explained
Link: The foreign accent syndrome: a reconsideration
Link: A case of foreign accent syndrome without aphasia caused by a lesion of the left precentral gyrus
Link: Foreign Accent Syndrome (FAS) Support

How brain damage will affect language use in bilingual people is hard to predict. Factors that may influence whether a bilingual person recovers the use of one or both languages include the order in which they were learned, the person’s ease of expression in each of them, and which language the person used the most recently.

We know, for example, that if the person learned both languages at the same time, the damage will usually affect both languages in the same way. But if the person learned the two languages at different times, one will likely be more affected than the other.

Link: The Bilingual Brain
Link: L'aphasie chez les personnes bilingues
Link: The role of cognates in bilingual aphasia: Implications for assessment and treatment
Researcher: Michel Paradis
Experiment: Pathological switching between languages after frontal lesions in a bilingual patient
Link: The Handbook of Bilingualism
Tool: Différents types de bilinguisme
LANGUAGE DISORDERS

Many language disorders still remain mysterious or surprising. Such is the case with dyslexia, which is one of the developmental language deficits (dysphasias), as well as with certain types of aphasia resulting from highly localized brain lesions.

Dyslexia consists of various degrees of difficulty in learning to read and write. This is a developmental disorder that is discovered when children are learning to read (around age 6 or 7) and that is more common among boys and among children who are left-handed. (Some reading problems can be acquired in adult life, as the result of brain injuries, in which case they are referred to as alexia.)

People with dyslexia may confuse certain sounds (for instance, p with b, or f with v) or letters that are visually similar (such as m and n). To dyslexics, some letters may seem reversed (a d that looks like a b), or even whole words may appear reversed (“bag” looks like “gab”). In cases of what is known as profound dyslexia, when people read out loud, they will actually substitute one word for another with a related meaning (for example, read “cow” in place of “horse”).

Dyslexics represent 5 to 10% of the total population, and their other cognitive abilities are completely normal. The severity of this disorder varies widely; some dyslexics have only slight trouble in reading, while others are completely illiterate.

Dyslexics may be able to express themselves orally in a completely normal way, but the problem begins when they are confronted with written words. However, the nature of dyslexia is probably more complex than a simple difficulty in reading. Some authorities even regard it as more of a problem in sensory processing, while others consider it a memory disorder. The purpose of most research on dyslexia is to establish the entire chain of causal links between certain genes, certain parts of the brain, certain cognitive functions, and the ability to read and write.

Thus researchers have begun to identify various signs of pathology in the brains of dyslexics. For example, some obvious abnormalities have been reported in the arrangement of cortical cells, especially in certain areas of the left frontal and temporal cortexes. The scientists who uncovered these unusual cellular configurations in language-related areas of the brain believe that they probably start developing in the middle period of fetal gestation, when active cell migration is observed in the cerebral cortex.

In most people, the left temporal planum is considerably larger than the right. But some authors say that in dyslexics, these two structures are similar in size. The presence or absence of temporal planum asymmetry in dyslexics remains a controversial topic, however. When differences in age, sex, and overall brain size are taken into account, the anatomical differences between the temporal planum in dyslexics and in control groups are far smaller.

Other studies suggest that some changes in dyslexics’ sensory pathways may be responsible for their problems with reading. Autopsies have shown that in dyslexics, the neurons in the magnocellular layers of the lateral geniculate nucleus were smaller than in control groups and were arranged in a disorganized fashion. These abnormalities might interfere with the rapid processing of changing visual signals that is required for reading.

Brain-imaging studies of dyslexics have shown reduced activity in visual area V5/MT, which is responsible for detecting movement, or in the lower part of the left temporal lobe.

Link: LANGAGE ECRIT
Link: Evaluation et Soutien de l'Organisation de la Parole et du Langage de l'Enfant
Link: Atypical Brain Activity Detected in People with Dyslexia
Link: Dr. Sally Shaywitz - The Brain and Dyslexia - What Brain Imaging Can and Can't Tell Us About Reading Difficulties
Link: La Dyslexie
Link: Dyslexie : la cognition en désordre ?

 

 

Depending on the extent of the brain damage that causes them, the various forms of aphasia range from subtle speech impairments to a complete inability to speak.

Global aphasia is equivalent to having both expressive and receptive aphasia. In global aphasia, extensive damage to the frontal, temporal, and parietal cortexes, including in particular Broca’s area, Wernicke’s area, and the supramarginal gyrus, leads to the total loss of the ability to understand, speak, read, or write language.

People with global aphasia can manage to pronounce just a very few words, unconnected by any syntax. At best, global aphasics have an automatic form of expressive language, composed especially of emotional exclamations. They may also still have control over facial expressions, hand gestures, and vocal intonations. Their prognosis for recovering use of language is nevertheless extremely poor.

In conduction aphasia, language comprehension and spontaneous oral expression are normal, but individuals have a great deal of difficulty when asked to repeat words or phrases. When they try to do so, they mix up the sounds in words and make numerous transformations and omissions of words.

The location of the brain lesions that cause conduction aphasia is still controversial. Wernicke, and later Geschwind, believed that conduction aphasia was caused by the destruction of the arcuate fasciculus, the fibre bundle in the suprasylvian parietal region that connects Wernicke’s area to Broca’s area. But other authors have proposed that the symptoms of conduction aphasia might be produced by dysfunctions in areas such as the auditory cortex, the insula, and the supramarginal gyrus.

In anomic aphasia (also known as nominal aphasia), oral expression and syntactic structure remain intact, and the main difficulty is in finding certain words. People with anomic aphasia compensate for their trouble in finding the right words by using other, vaguer words such as “thing” or “whatsit”, or they may use circumlocutions such as “the instrument that you wear on the wrist and that tells you the time”. If you show someone with anomic aphasia a photo of John F. Kennedy, for example, she may say that he was president of the United States, and that he was assassinated, but will not find his name until you help her by hinting “John F… ”. It is still possible to communicate with anomic aphasics, however, if you know the context or subject of the conversation.

Anomic aphasia is often caused by parietal lobe damage that is limited to the angular gyrus or the area just above it. This disorder has also been associated with damage to the pulvinar in the thalamus. Because the language processing system in the human brain forms a densely interconnected network, damage just about anywhere in the left hemisphere can cause some form of anomic aphasia. Depending on the location of the lesion, an individual may, for example, be unable to name an item when it is shown to him (because of a disconnection between the visual cortex and the inferior parietal cortex) but still be able to name it if he is allowed to touch it or if it is described to him out loud (if the auditory and tactile pathways to the parietal cortex remain intact).

Many other, less common forms of aphasia have also been described. In alexia, caused by damage to the inferior part of the left occipital and temporal lobes, the individual cannot read but can still write. In agraphia, the individual can reason normally, but cannot write. In anarthria, a malfunction in the system that controls the motor aspects of speech prevents individuals from articulating the words that would convey their thoughts. Progressive aphasia develops insidiously, resulting in a gradually worsening loss of speech. Subcortical aphasia is caused by small lesions in the subcortical areas of the left hemisphere and presents a variety of the symptoms seen in other kinds of aphasia. Transcortical motor aphasia (TMA) is characterized by abnormalities in spontaneous expression but distinguished from Broca’s aphasia in that people with TMA can repeat long sentences, whereas Broca’s aphasics can repeat only simple words.

The fact that each of these different types of aphasia typically includes several subtypes clearly shows just how complex language pathologies can be.

Link: Troubles du langage
Link: Les aphasies
Link: alexie sans agraphie
Link: Types d'aphasies
Link: Les aphasies
Link: Mind and Brain

In deaf people who use sign language, left hemisphere damage can cause language deficits comparable to those observed in verbal aphasics. For example, in some cases very similar to Broca’s aphasia, it becomes very difficult for people to sign, even though neither their comprehension nor their non-sign-language gestures are impaired.

Likewise, there is a manifestation of Wernicke’s aphasia that occurs in deaf people. In these cases, people can still sign fluently, but make frequent mistakes, and they have difficulty in understanding other people’s signs.

There was also one very rare case involving a man who could speak but also knew sign language because his parents were deaf. After suffering a left-hemisphere stroke, he displayed global aphasia, from which he recovered gradually. Interestingly, he recovered his ability to express himself in both spoken language and sign language at the same time. Other studies have shown that the two areas of the left hemisphere that are involved in these two types of language overlap, though not completely.

Tool: La langue des signes
