
why proving universal grammar is harder than it seems

A new study shows that human languages do tend to follow the same logical rules, but the theory's sheer scope means it only takes one exception to invalidate its premise.
Illustration: babel fish, by John Matrz

If you find yourself in a debate with a partisan ideologue who claims that all higher education is simply anti-American socialist brainwashing, he will often bring up that Noam Chomsky is one of the most cited scholars in the world despite his penchant for left-wing radical conspiracies, which he adamantly supports in his books. However, the reason Chomsky is cited so often has zilch to do with his politics and everything to do with his study of language, particularly his theory of a universal grammar. According to his work, all human languages share common patterns which we could use to create universal translators and to pinpoint the semantic details of each word in its proper context. This idea is especially popular in computer science, particularly in a number of AI experiments, because it could give us algorithms for symbol grounding, a fancy term for deciding exactly what a word is supposed to represent in a given situation. That’s one of the fundamental leaps needed for machines to truly understand what humans say.

Of course, as with any theory with the word universal in the title, there’s plenty of criticism about how universal it actually is, and some of it has escalated into a full-blown feud among linguists. Critics of the theory have gone as far as to say that universal grammar is whatever Chomsky wants it to be when it’s being debated, which in academia is actually a pretty vicious burn. But that’s rather expected, since a theory that claims to apply to every language on the planet can be challenged with a single example that fails to conform to it, no matter how obscure. Considering that we have to account not only for modern languages but for the evolution of all known languages to make the theory airtight, there’s still a lot to flesh out in Chomsky’s defining work. Working with all modern languages is hard enough, but working with historical ones is even more challenging because a majority of modern human history was not recorded, and most of what has been is pretty sparse. I’d wager that 95% of all languages ever created are likely lost to time.

Even worse is that our languages change so much that their historical origins can be totally obscured with enough time. While the first anatomically modern humans evolved in Africa well over 100,000 years ago, a comparative analysis of today’s language patterns just doesn’t show any founder effect, meaning that if one of our first ancestors stumbled into a time machine and traveled to today, she would not be able to understand even a single sound out of our mouths without instruction from us. Research like this has led many linguists to believe that language is shaped by culture and history more than by the raw wiring of our brains, as per the universal grammar theory. Others disagree, producing papers such as the recent MIT study of logical patterns in 37 languages, which shows that all of those languages prefer very similar rules when it comes to their grammatical style, implying that the underlying logic had to be the same, even when comparing Ancient Greek to modern languages as different as English and Chinese.

By analyzing how closely related concepts cluster in sentences across all the languages chosen for the project, researchers found that all of them prefer to keep related concepts close to each other in what their speakers consider a proper, grammatically correct sentence. To use the example in the study, in the sentence “John threw the trash out,” the domestic hero of our story stays next to his action, and the villainous refuse stays next to where it was thrown. These concepts weren’t on opposite sides of the sentence or at a random distance from each other. This is what’s known as dependency length minimization, or DLM, in linguist-speak. One of the few undisputed rules of universal grammar is that in every language, the distances between related concepts should be shorter than a random baseline, and this study pretty solidly showed that they were. In fact, every language had an extremely similar DLM measure to the others, seemingly proving one of the key rules of universal grammar. So where exactly does that leave the theory’s critics?
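To make that measurement concrete, here’s a minimal sketch in Python of how you could score a single sentence against a random baseline. It uses a hand-written, hypothetical dependency parse of the study’s example sentence, sums the distance between each word and the word it depends on, then compares that total to the average over shuffled word orders. Both the parse and the baseline procedure are simplified illustrations, not the actual corpora or statistics used in the MIT paper.

```python
import random

# Hypothetical dependency parse for "John threw the trash out".
# Words are numbered 1..5; each pair is (dependent, head), with the
# verb "threw" (word 2) acting as the root of the sentence.
SENTENCE = ["John", "threw", "the", "trash", "out"]
DEPENDENCIES = [
    (1, 2),  # John  <- threw (subject)
    (4, 2),  # trash <- threw (object)
    (3, 4),  # the   <- trash (determiner)
    (5, 2),  # out   <- threw (particle)
]

def dependency_length(order):
    # order is a permutation of the word numbers 1..n, read left to right.
    # Total dependency length is the sum of distances between every
    # dependent and its head in that ordering.
    position = {word: slot for slot, word in enumerate(order)}
    return sum(abs(position[dep] - position[head]) for dep, head in DEPENDENCIES)

observed = list(range(1, len(SENTENCE) + 1))   # the attested English order
observed_len = dependency_length(observed)

# Random baseline: average dependency length over shuffled word orders.
random.seed(0)
samples = []
for _ in range(10_000):
    order = observed[:]
    random.shuffle(order)
    samples.append(dependency_length(order))
baseline = sum(samples) / len(samples)

print(f"attested order:  total dependency length = {observed_len}")
print(f"random baseline: average dependency length = {baseline:.2f}")
```

For this toy sentence the attested order lands just under the shuffled average (7 versus roughly 8); the study’s claim is that across whole corpora in all 37 languages, real sentences consistently come in below that random baseline.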

Well, as said before, calling any theory universal is fraught with problems and leaves it open to the most minor, nit-picking criticism, and we all know of exactly one society based around logic, and that’s the Vulcans from Star Trek. To dispute the theory, linguists had to go out of their way to find tribes so vaguely aware of the modern world that we may as well be from another planet to them, and look for the smallest cultural inconsistencies that conflict with the current interpretation of a theory they say is somewhat vague. Certainly they could produce a language that eschews the rules of universal grammar in favor of tradition and religion, and maybe Chomsky should tone his theory’s presumptuous name down a bit and accept that his work can’t apply to every single language humans have ever used or will invent in the future. But in the end, universal grammar does appear extremely useful and shows that logic plays the most important part in all languages’ initial structures. We might not be able to use the theory to build perfect universal translators, but we could come quite close since the required patterns exist as predicted.

# science // language / linguistics / logic / research
