How the human brain, complex wonder that it is, transforms visual cues such as a drawing or an image into language has never been fully understood, but a new study goes a long way toward explaining the process.

Three British researchers have now mapped the journey written language takes, starting at the eyes and ending in an area of the brain called the ventral occipitotemporal cortex, which is known to process visual information.

The way letters look is processed at the back of the ventral occipitotemporal cortex, while sounds and meanings are processed in an area further forward in that brain region, one better suited to handling abstract concepts.

The study, published in the Proceedings of the National Academy of Sciences on Aug. 19, traces a word's path from the moment it enters the eye to the cortex, and as it moves along the cortex. It was authored by Jo Taylor from University College London, Kathy Rastle from Royal Holloway University of London in Egham and Matthew Davis from the University of Cambridge.

“We didn’t evolve to read,” said Taylor. “So we don’t (start with) a bit of the brain that does reading.”

Taylor said the brain can make sense of words written in different fonts or sizes because it connects the visual cues with what it knows about spoken language. Eventually, “when you see a word, you immediately get its sound and its meaning without any effort,” Taylor explained.

The new study also confirmed that learning to read makes parts of the ventral occipitotemporal cortex more attuned to reading, either by displacing other functions such as recognizing objects or by encroaching on areas less tied to specific functions, according to Rastle.

This reorganization might explain how reading becomes automatic.

“Without that pathway … we would be like children reading letter by letter,” Rastle added.

The study recruited 24 English-speaking adult volunteers who were taught made-up words written in two unfamiliar, archaic scripts over a period of two weeks.

The words were assigned the meanings of common nouns, such as truck. Researchers scanned the volunteers’ brains using functional MRI (fMRI) to track which tiny chunks of the ventral occipitotemporal cortex became active when participants were shown the words learned in training.
