Did you know that there are approximately 6,500 spoken languages on our planet? Yet no single commonality had been found that connects all of them. That is, not until now.
According to a new study published in PNAS, a team of MIT researchers led by Richard Futrell, Kyle Mahowald, and Edward Gibson claims to have found a “Language Universal”.
According to the authors, all languages organize themselves so that syntactically related words stay as close together as possible, which makes the intended meaning of a sentence easier to reconstruct.
Explaining the variation between human languages and the constraints on that variation is a core goal of linguistics. In the last 20 y, it has been claimed that many striking universals of cross-linguistic variation follow from a hypothetical principle that dependency length—the distance between syntactically related words in a sentence—is minimized. Various models of human sentence production and comprehension predict that long dependencies are difficult or inefficient to process; minimizing dependency length thus enables effective communication without incurring processing difficulty. However, despite widespread application of this idea in theoretical, empirical, and practical work, there is not yet large-scale evidence that dependency length is actually minimized in real utterances across many languages; previous work has focused either on a small number of languages or on limited kinds of data about each language. Here, using parsed corpora of 37 diverse languages, we show that overall dependency lengths for all languages are shorter than conservative random baselines. The results strongly suggest that dependency length minimization is a universal quantitative property of human languages and support explanations of linguistic variation in terms of general properties of human information processing. (source)
The idea of finding a language universal is a big deal, since it sheds light on numerous questions directly related to human cognition. Noam Chomsky, one of the most famous proponents of the idea, suggested there is a universal grammar underlying all the languages on our planet. Finding elements that recur in every single language on earth would suggest that certain elements of language are genetically predetermined, and that there is a specific brain architecture dedicated to language.
Without finding such a universal, it is unclear whether language is a specific cognitive package or simply a more general result of the remarkable capabilities of the human brain. The MIT researchers were interested in whether all languages use a certain type of organization to make sentences easier to understand. They analyzed parsed corpora of 37 different languages, scoring each sentence by its degree of dependency length minimization (DLM), a measure of how far apart interrelated words in a sentence are. As the study puts it, dependency lengths are the distances between linguistic heads and dependents in natural language syntax: roughly speaking, heads are words that license the presence of other words (dependents) modifying them.
The researchers also found that some languages show DLM more strongly than others. Languages that do not rely solely on word order to convey the relationships between words tended to have higher scores. According to the researchers, languages such as German use case markings that aid memory and comprehension, which makes minimizing dependency length slightly less important than in other languages. Even these languages, however, scored below the random baselines.
The study at MIT was limited to 37 languages.
You can find out more about the study by reading this.