WASHINGTON — Traveling in a country whose language you don’t speak is certainly difficult. Use your dictionary with care, or you’ll be asking a shopkeeper if you can marry an eggplant. But imagine a day in which nobody used language at all. Ultimately, would it be much different from a day in the life of any other intelligent, social animal?
Language is at the very core of what makes us human, though how we evolved this ability has provoked intense debate. New research may help scientists dissect just what it is about the human brain that endows us with language.
Researchers have found that tamarin monkeys have some distinctly language-like abilities but that they can’t quite master the more complex rules of human grammar. The findings appear in Friday’s issue of the journal Science, published by AAAS, the non-profit science society.
The grammatical toolkit
“A relatively open question concerning language evolution is, ‘What aspects of the language faculty are shared with other animals, and what aspects are unique to humans?’” said study author Marc Hauser of Harvard University.
To investigate, Hauser and W. Tecumseh Fitch of the University of St. Andrews, in Scotland, devised tests for cotton-top tamarin monkeys and human volunteers. Tamarins have been evolving separately from humans for approximately 40 million years — suggesting that any shared machinery in human and tamarin brains is old enough to be relatively common among primates.
Instead of trying to teach the monkeys real words, Hauser and Fitch generated strings of one-syllable words that followed various grammatical rules.
According to linguistics expert Noam Chomsky, the simplest type of grammar is a “finite state grammar” or “FSG,” which dictates which types of words go near each other in a sentence. In English, for example, an adjective like “fast” must go directly in front of “car,” the noun it’s describing.
Building on previous experiments, Hauser and Fitch recorded word-strings that obeyed a specific FSG, in which any syllable spoken by a female voice was automatically followed by one from a male voice.
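The finite-state rule described above can be sketched in code. This is a hypothetical illustration, not the study’s actual stimulus-generation software; syllables are tagged “F” for a female voice and “M” for a male voice, giving legal strings like F M F M.

```python
def follows_fsg(syllables):
    """Return True if the string alternates F, M, F, M, ... —
    i.e., every female-voiced syllable is immediately followed
    by a male-voiced one (the (AB)^n finite-state pattern)."""
    if len(syllables) % 2 != 0:
        return False
    return all(syllables[i] == "F" and syllables[i + 1] == "M"
               for i in range(0, len(syllables), 2))

print(follows_fsg(["F", "M", "F", "M"]))  # True: obeys the rule
print(follows_fsg(["F", "F", "M", "M"]))  # False: violates it
```

A rule like this needs no memory beyond the current syllable, which is what makes it “finite state”: a listener only has to track immediate neighbors.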
After listening to a series of word-strings, the monkeys were able to distinguish between those that followed this rule and others that didn’t. Human test subjects could tell the difference as well, implying that tamarins and humans may share at least some components of what Hauser called “the universal toolkit underlying all languages.”
Mastering this type of grammar represents the ability to compute some simple statistics, something human infants accomplish early on as they learn to speak. This ability may not be specific to language, however.
“Either the same mechanism or some approximation of it is used in mathematics, vision, music and other activities,” Hauser said.
Upping the complexity
The grammatical rules of real languages govern more than just the placement of neighboring words, as anyone who had to diagram sentences in English class may remember all too well.
One of the more complex types of grammar is known as a “phrase structure grammar,” or PSG. These grammars involve relationships between words that aren’t next to each other in a sentence and thus allow for a more complex range of expression. The “if … then” construction, whose two parts depend on each other across any number of intervening words, is an example of the kind of long-distance relationship a PSG can capture.
The researchers generated a second set of word-strings that followed a PSG in which a pairing of syllables spoken by a female and a male could be embedded within another pairing. This grammar produces structures like [female [female, male] male].
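The nested structure above can be sketched the same way. Again this is a hypothetical illustration under the article’s description: a legal string is n female-voiced syllables followed by n male-voiced ones, such as [F [F M] M], a pattern that cannot be checked by looking only at adjacent syllables.

```python
def follows_psg(syllables):
    """Return True for the A^n B^n pattern: a block of F syllables
    followed by an equally long block of M syllables, as produced
    by center-embedding female-male pairs inside one another."""
    n = len(syllables)
    if n == 0 or n % 2 != 0:
        return False
    half = n // 2
    return (all(s == "F" for s in syllables[:half])
            and all(s == "M" for s in syllables[half:]))

print(follows_psg(["F", "F", "M", "M"]))  # True: [F [F M] M]
print(follows_psg(["F", "M", "F", "M"]))  # False: no embedding
```

Verifying this rule requires counting how many F syllables have been heard before the M syllables begin, which is exactly the kind of bookkeeping a finite-state listener cannot do.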
After playing these recordings repeatedly to the monkeys, the researchers found that the animals didn’t seem to notice the difference between word strings that obeyed the PSG and other strings that did not. In contrast, the human volunteers did notice the difference.
The human brain’s specialized language machinery thus seems to engage somewhere between these two levels of grammatical complexity.
If further research shows that monkeys, as well as other nonhuman species, can’t process these and other PSGs, “then there is strong evidence that humans had to pass through a fundamental bottleneck in their capacity to generate a virtually limitless variety of meaningful expressions to tell others what they think and feel,” Hauser said.
“When this occurred is anyone’s guess, but when it happened, it transformed the mind,” he added.
Future experiments may entail using brain imaging technology to monitor the monkeys’ and humans’ brain activity during both sets of experiments. This may set researchers on the path to discovering physical brain differences behind the diverging abilities of humans and tamarins.
Language and intelligence
The fact that a virtually infinite number of phrases can be nested inside one another gives human language an open-endedness, allowing us to express new ideas. In a commentary that accompanies the Science study, psychologist David Premack has proposed that the flexibility of human grammar may be a central aspect of human intelligence.
Whatever it is about the brain that allows such linguistic flexibility may also be key to the human imagination, according to Premack. Unlike other animals, which specialize in various skills, humans are supremely adaptable, able to learn new tasks and develop new technologies.
“Human intelligence and evolution are the only flexible processes on Earth capable of producing endless solutions to the problems confronted by living creatures,” Premack writes.
© 2004 American Association for the Advancement of Science