
Language - computation to conversation

Originally published 12 November 2002


Language surrounds us. Everyone picks up a language when they are too young to be aware they are learning one. You don’t have to go to school to learn your first language. It happens so easily that it is easy to take for granted. As easy as an apple dropping off a tree. How hard is that? Well, understanding how things drop off apple trees has made for some interesting science.

Bees don’t do it. Even educated chimps can’t shuffle fridge magnets in a linguistic way. But all normal human beings do pick up language. According to Noam Chomsky, the American linguist, we all speak the same language. The words might be different, but the language is universal. The difference between English and Latin is entirely superficial.

Language is a natural phenomenon, like gravity. It may be all around us, but that doesn’t mean we understand it. Language is something people know, and something even very similar organisms simply do not. A key project in cognitive science is to characterise just what it is that a person who knows a language knows. They know words, and they know how to use those words to articulate their thoughts. When you start to try to spell out that knowledge, you start to run out of paper.

My own research does no more than scrape the surface. Let me describe two of my current projects. The first, with colleagues Vicky Lewis (Open University) and Ros Hill (Aston University), looked at communication between parents and children.

It focused on road safety and was an exploratory study, testing out a new avenue for research. We were interested in the way parents and children talk together when they are solving a problem. Were there any parallels between those conversations and the way they interacted while crossing roads? Pairs whose conversations were shorter and stayed focused on the problem also appeared more disciplined when crossing the road. Whether this result holds up, and how best to explain it, is one of the things I am working on now.

The second project looks at the organisation of word meanings. A word can convey different thoughts on different occasions. For example, I can describe a potato or a candle as waxy. Is there one basic concept from which the distinct meanings can be worked out, or are two separate concepts stored? Using a computer model, I have explored processes that could separate different meanings into concepts. Adaptive resonance networks are circuits of simple processors, modelled on neurons. The network looks at each usage and compares it to those it has seen before. A close enough match brings recognition. But if the match fails, the usage gets stored separately as a new concept. At the right settings, the network can divide usages into senses in the same way as a dictionary.
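To give a flavour of how such a network behaves, here is a minimal sketch in Python of an ART-1-style categoriser, the binary variant of adaptive resonance. The vigilance setting, the invented feature names, and the toy "waxy" usage vectors are illustrative assumptions for this sketch, not the actual model or data from the project.

```python
# A minimal sketch of an ART-1-style categoriser, in the spirit of the
# adaptive resonance networks described above. The vigilance value and
# the toy "usage" vectors below are illustrative assumptions.

import numpy as np

class ART1:
    def __init__(self, n_features, vigilance=0.6, alpha=0.001):
        self.n = n_features
        self.rho = vigilance      # how close a match must be to count as recognition
        self.alpha = alpha        # small constant in the choice function
        self.weights = []         # one binary prototype per stored concept

    def present(self, usage):
        """Present one usage (a binary feature vector); return its concept index."""
        usage = np.asarray(usage, dtype=bool)
        assert usage.shape == (self.n,)
        # Score every stored concept by how well it overlaps the new usage.
        order = sorted(range(len(self.weights)),
                       key=lambda j: -np.sum(usage & self.weights[j]) /
                                      (self.alpha + np.sum(self.weights[j])))
        for j in order:
            overlap = np.sum(usage & self.weights[j])
            # Vigilance test: is the match close enough to bring recognition?
            if overlap / usage.sum() >= self.rho:
                # Resonance: refine the stored concept towards this usage.
                self.weights[j] = usage & self.weights[j]
                return j
        # No stored concept matched closely enough, so the usage is
        # stored separately as a new concept.
        self.weights.append(usage.copy())
        return len(self.weights) - 1


# Toy example: "waxy" applied to a potato versus a candle, coded over
# invented features [edible, made-of-wax, shiny-surface, burns].
net = ART1(n_features=4, vigilance=0.6)
waxy_potato = [1, 0, 1, 0]
waxy_candle = [0, 1, 1, 1]
print(net.present(waxy_potato))   # 0 - first usage founds the first concept
print(net.present(waxy_candle))   # 1 - too different, stored as a second concept
print(net.present(waxy_potato))   # 0 - recognised as the first concept again
```

The vigilance parameter is what "the right settings" trades off: lower it and the network lumps more usages under a single concept; raise it and it splits them into ever finer senses.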

Language is built into all of us. As scholars, we can look at it as a biological, a linguistic, a cognitive, a social, or a computational problem. Each perspective is rewarding. Language challenges the traditional structure of the academy. To truly understand language, we will need to combine the insights of many disciplines.

George Dunbar, Department of Psychology