How literacy transforms the human brain
One of the great things about Twitter is that it dredges up and recycles good information for those of us who were gazing out the window thinking about lunch the first time around, sigh.
The Twitterverse recently took me to a June 2013 article called “Inside the letterbox: How literacy transforms the human brain”, by French cognitive neuroscientist Stanislas Dehaene.
It’s 4500 words, and not an easy read, but contains great information I want lots of people to know. So I thought I’d have a crack at summarising it in easier language, both to help myself understand it better and to encourage others to read it.
Here’s roughly what it says, with my apologies for any (almost inevitable) oversimplifications:
All the world’s writing systems rely on the same part of the brain. Under a brain scan, whatever language you’re reading, the same small section at the base of the left hemisphere lights up.
This area is called the visual word form area, or the “brain’s letterbox” because it responds to written words more than almost any other stimulus.
It does this so efficiently that words can flash by too quickly for us to consciously notice, yet still light up the letterbox. Its operations are sophisticated and essential for fluent reading: it recognises that “READ” and “read” are the same word despite one being upper case and one lower case. If a stroke, tumour or other brain damage destroys this area, the person won’t be able to read, though they can typically still recognise faces, objects, digits and Arabic numerals. They may also still be able to speak, understand language and even write. But not read what they’ve written.
Written language was only invented a few thousand years ago, and for most of the time since then most people couldn’t read, so the brain’s letterbox can’t have developed via evolution. There just hasn’t been time. We must instead be repurposing a part of our brain that evolved for something else.
READING AS NEURONAL RECYCLING
New cultural inventions are only possible insofar as our brains can adapt to allow them. As our culture changes, we re-use brain areas that evolved to do one or more jobs, and start using them to do a different job or jobs.
The brain has particular parts that evolved for the jobs of vision and language, which are connected in a particular way. This locks reading into a unique circuit.
Humans rely on particular brain structures for their spoken language: the network of left superior temporal and inferior frontal regions. Even at 2 months of age this network can be seen in brain scans, when babies listen to sentences in their mother tongue.
Because language is processed on the left side of the brain, reading must be processed there too (a rare few people process language on the right, and for them, the brain’s letterbox is on the right). This keeps the distance between the visual and language areas involved short and helps make the process efficient.
The brain’s letterbox is also located in the part of the visual cortex that gets inputs from the fovea, the high-resolution central region of the retina. This allows us to discriminate very small differences between letters, especially in small print.
The neurons in this pathway, both in humans and other primates, often respond well to simple shapes which are common in nature. It seems that over generations, whether they were writing down single speech sounds or larger units like syllables, scribes chose to write using shapes that human brains could easily recognise and learn.
SCANNING ILLITERATE BRAINS
Dehaene has been researching this brain-area-repurposing hypothesis, and particularly what functions get squeezed out when a brain area is used for a new function.
Most of what we know about the brain is based on experimental studies of highly educated students, but for this research, Dehaene and colleagues from several countries studied a group of adults aged around 50 who, despite having jobs and being integrated into society, had never been to school. Ten of them couldn’t read at all, and twenty had learnt to read to some extent as adults. They were compared with 32 literate adults, some from the same socioeconomic background.
The results of this research demonstrate that a vast brain circuit is transformed when we learn to read. There was a direct relationship between how well each person could read and the amount of activity in their brains’ letterboxes.
The literate people also had extra brain activity in response to other visual stimuli, like faces, houses, checkerboards, and contours, suggesting that learning to read makes it easier to recognise any picture, and makes some other visual processes more accurate and efficient.
On the left side of the brain, activation in response to faces decreased, squeezed out by the response to written words. However, it increased on the right side, which explains why many studies (of literate adults) find a specialisation for face recognition in the right hemisphere.
The same sorts of patterns were observed in a group of weak and strong 9-year-old readers. The weaker readers had weaker responses to words in their left brains, and responses to faces that were more similar on both sides of the brain. Reading reorganises the brain to recognise words on the left, shifting recognition of faces towards the right, but with no overall loss in the ability to recognise faces.
UNLEARNING MIRROR INVARIANCE
Our brains, like the brains of all primates, have evolved to recognise an object as the same object from any angle: a monkey can be trained to recognise an object from one side, for example, and then immediately recognise it from the other side without further training.
Letters mess with this. The non-cursive letters b, d, p and q are thus at first hard to discriminate, and the same goes for m and w, n and u, and t and f. Learning to read involves overriding our brain’s tendency to see mirror images as identical.
Ask 5- or 6-year-olds to write their names next to dots at the right side of the page, and most will unhesitatingly write from right to left, in mirror writing. This tendency is slowly unlearnt as children learn to read, but it remains in illiterate adults, whereas literate adults no longer recognise words written backwards.
Parents should worry about occasional mirror errors, like writing “b” instead of “d”, only if they persist beyond the age of 9 or 10; such errors are due to a universal feature of the primate brain which all children possess and must unlearn.
As we become literate we get a bit worse at recognising that two images are the same, but better at noticing the differences between images. We also increase our capacity to discriminate mirror-symmetrical pseudo-words such as “iduo” and “oubi”. The overall effect of literacy is to give us more flexibility in treating things as visually identical or different, depending on context.
LITERACY ENHANCES SPEECH PROCESSING
The same research also shows that learning literacy has a massive, positive effect on the brain’s network for spoken language processing.
Writing acts as a substitute for speech, activating the same areas of the brain, and giving us access to language through vision.
To understand the same spoken sentence, the best readers required less brain activity than the illiterate people. Several brain areas associated with mental effort also showed dramatically reduced activity in literate people, confirming that reading facilitates the comprehension of complex sentences.
However, one area called the left planum temporale, just behind the primary auditory area, increased its response to sentences, words and even meaningless pseudowords. This area might be responsible for turning speech sounds (phonemes) into spellings (graphemes), as it’s important in infancy in learning the sounds and sound rules of one’s language. Once you learn that letters represent sounds as part of learning to read, you become more aware of sounds themselves (gain phonemic awareness) and your brain encodes them differently.
A BIDIRECTIONAL SIGHT-TO-SOUND PATHWAY
Good readers can activate the spelling code when listening to speech. Their brains’ letterboxes light up when they’re asked to decide whether a spoken item is a meaningful word or not, e.g. when listening to the pseudoword “ploot”. Illiterate people show no such activation. With literacy comes the ability to recode speech visually, and the letters then help with the decision about whether something is a real word.
Spelling also affects speech processing. Literate adults tend to think there are more sounds in “pitch” than in “rich” because there are more letters. They tend to think that if you remove the “n” sound from “bind”, you get the word “bid”, rather than “bide”. This increases the number of redundant codes available to process speech and can help explain why literate people usually have more verbal memory than illiterate people.
If Dehaene and colleagues’ theory is correct, reading establishes a fast, bidirectional link between letters and sounds. They’ve also identified the nerve fibres which could be responsible for linking the brain’s letterbox and the area that converts sounds to letters and vice versa.
THE NEXT STEP
Once children learn to read, their brains are massively different. These changes are mostly for the better, enhancing their visual and phonological areas. However, they also shift some face recognition tasks to the right side of the brain and reduce their capacity to see things as the same, no matter how they’re oriented.
Happily, this knowledge dovetails nicely with educational research suggesting that the systematic teaching of how letters map onto speech sounds may help the rapid establishment of the brain’s visual and phonological circuits.
After a little training on sounds (phonemes) and spellings (graphemes) using a computer program called GraphoGame, the brain-letterboxes of six-year-olds started to respond to written words. But a 2018 update: newer research found that GraphoGame did not improve reading or spelling scores.