
How did language originate?

So you're saying Korean has a less complex language?
It is said that it's the simplest of the widely known Asian languages with a pictographic script.

I don't know for sure, but it's not far off to come to that conclusion based on these facts.
 
What has always puzzled me is: why so many languages? I mean, we know humankind originated in the same place, so how did a species that came from one area create so many different languages? And why are there accents? That's another mystery.

Language is probably older than mankind (Homo sapiens). We have areas of our brain (Broca's and Wernicke's areas) specialized in producing and interpreting spoken language at speeds far faster than other information-processing functions like math. Such structures take more time to evolve than the relatively short time Homo sapiens has been around. It's probably safe to assume Neanderthals, and earlier hominids in Asia, had language as well. Many modern primates are also very vocal.

Even my cat has some kind of vocabulary. She has no sense of grammar; her language is purely nominal, but I'd say she has distinct sounds (words) for at least three concepts ("I want soft food", "I want to go outside", "I just took a dump and I was constipated"), and we're able to recognize and understand them. She also understands a word we use to tell her that there is another cat at our door, and she responds the same way every time.

But why does Korean have the most words? What is so special about that peninsula?

Korean and Japanese have a massive number of loanwords from English, and Korean also has a lot of loanwords from Japanese, acquired through Japan's occupations of Korea. These entries might be words of that kind. Basically, any concept brought by western civilization (objects, types of places, institutions, terms related to the western workplace, etc.) was borrowed as is, with the pronunciation adapted to the local tongue.

Apparently, the farther north of the equator, the fewer languages are needed, because groups become larger. In warmer climates, people can get by in smaller groups, and thus there are more languages.

That's interesting, I've never heard about that.

It is said that it's the simplest of the widely known Asian languages with a pictographic script.

When it comes to writing and reading, it is indeed the simplest, by far. It uses Hangeul, a writing system created in the 15th century, which has nothing to do with the Chinese writing system still used in China and Japan. You can learn Hangeul in a few days, while in Chinese and Japanese, it takes many years to achieve the reading skills of an average adult.
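
Just to illustrate how systematic Hangeul is: each syllable block is assembled from an initial consonant, a vowel, and an optional final, and Unicode even encodes the blocks with a single formula. Here's a minimal Python sketch of that composition (my own toy example; nothing here is official beyond the standard Unicode arithmetic):

```python
# Toy illustration: Hangeul syllable blocks are built from 19 initial consonants,
# 21 vowels, and 28 finals (including "no final"). Unicode encodes every block
# with one formula: 0xAC00 + (initial_index * 21 + vowel_index) * 28 + final_index

INITIALS = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")                     # 19 choseong
VOWELS = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")                  # 21 jungseong
FINALS = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")   # 28 jongseong

def compose(initial: str, vowel: str, final: str = "") -> str:
    """Combine two or three jamo into one precomposed Hangeul syllable."""
    code = 0xAC00 + (INITIALS.index(initial) * 21 + VOWELS.index(vowel)) * 28 + FINALS.index(final)
    return chr(code)

# "han" + "geul" -> 한글 (the name of the script itself)
print(compose("ㅎ", "ㅏ", "ㄴ") + compose("ㄱ", "ㅡ", "ㄹ"))
```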

However, the grammar is quite complex, more so than Chinese or Japanese. I don't know much about it, but they say Chinese grammar is very simple and very powerful, meaning there aren't a lot of constraints on the kinds of sentences one can create. Japanese grammar is also very flexible: in a given sentence, you can omit many pieces of information that you'd expect in another language, as long as they are understood or obvious in context. This is partly why Japanese is still a nightmare for machine translation. Korean grammar has a lot of similarities with Japanese, but is even more complex.

Sometimes, I wonder if the more people speak a language, the simpler its grammar gets.
 
Personally, I wish they would devise an app that could act as a sort of spoken "Google Translate". As an American living in Holland, I would happily accept the first prototype.

*Speaks sentence in English - watches in amazement as flawless Dutch emerges*
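
Something like that can already be roughed out by chaining existing pieces together. Here's a minimal Python sketch assuming the SpeechRecognition (with PyAudio), googletrans, and gTTS packages; that's just one possible stack among many, and the Dutch will be far from flawless:

```python
# Rough prototype of a spoken translator: English speech in, Dutch speech out.
# googletrans is an unofficial client and may break; quality will vary.
import speech_recognition as sr
from googletrans import Translator
from gtts import gTTS

recognizer = sr.Recognizer()
with sr.Microphone() as source:
    print("Speak a sentence in English...")
    audio = recognizer.listen(source)

# Speech -> English text (Google's free web speech API)
english = recognizer.recognize_google(audio, language="en-US")
print("Heard:", english)

# English text -> Dutch text
dutch = Translator().translate(english, src="en", dest="nl").text
print("Dutch:", dutch)

# Dutch text -> spoken audio (play dutch.mp3 with any audio player)
gTTS(dutch, lang="nl").save("dutch.mp3")
```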
 
What has always puzzled me is: why so many languages? I mean, we know humankind originated in the same place, so how did a species that came from one area create so many different languages? And why are there accents? That's another mystery.
I think you can see how new languages evolve from old ones by looking at some of the differences between closely related ones. In the UK a rubber is something you use to erase pencilled writing, but it means something very different in the US. Similarly, a trunk is something on the front of an elephant in the UK, but is primarily part of a car in the US. Over many generations these differences accumulate and can carry two languages with a common origin so far apart that they are no longer mutually intelligible. Perhaps modern global communications will stop this happening with English and other widely spoken languages. It doesn't take long to find out that it's unwise to yell out for a rubber in a US office when it's really an eraser that you need lol.
 
Accents are perhaps a result of climate; climate may shape how intonation evolves. What's interesting is that in harsher climates, in places like Korea, Japan, China, Norway, Finland, Sweden, Russia, Iceland, the UK, and Germany, the languages tend to have the most words and the most complex grammar.
The brain is just a processor, and we lost fur partly for a better cooling system for it. So maybe that kept some regions from overcomplicating their grammar, to prevent overheating.
 
I suppose the number of words depends on how big the tribe, kingdom, or empire is.
Korea unified its country by simplifying the alphabet. Korean also makes it easy to add suffixes or prefixes to create new words, rather than recombining existing ones. This is common among many of the wordiest languages on the list you posted.

They often have their own native Korean words on top of keeping the Sino-Korean version of much of their original vocabulary (which makes up 60-70% of it, IIRC). So it's almost like multiple languages in one.
 
There's a surprising amount of information on this topic on Wikipedia.
https://en.wikipedia.org/wiki/Origin_of_language

Proposed theory


Chomsky's single step theory

According to Chomsky's single mutation theory, the emergence of language resembled the formation of a crystal; with digital infinity as the seed crystal in a super-saturated primate brain, on the verge of blossoming into the human mind, by physical law, once evolution added a single small but crucial keystone.[70][64] Thus, in this theory, language appeared rather suddenly within the history of human evolution. Chomsky, writing with computational linguist and computer scientist Robert C. Berwick, suggests that this scenario is completely compatible with modern biology. They note "none of the recent accounts of human language evolution seem to have completely grasped the shift from conventional Darwinism to its fully stochastic modern version—specifically, that there are stochastic effects not only due to sampling like directionless drift, but also due to directed stochastic variation in fitness, migration, and heritability—indeed, all the "forces" that affect individual or gene frequencies. ... All this can affect evolutionary outcomes—outcomes that as far as we can make out are not brought out in recent books on the evolution of language, yet would arise immediately in the case of any new genetic or individual innovation, precisely the kind of scenario likely to be in play when talking about language's emergence."

Citing evolutionary geneticist Svante Pääbo they concur that a substantial difference must have occurred to differentiate Homo sapiens from Neanderthals to "prompt the relentless spread of our species who had never crossed open water up and out of Africa and then on across the entire planet in just a few tens of thousands of years. ... What we do not see is any kind of "gradualism" in new tool technologies or innovations like fire, shelters, or figurative art." Berwick and Chomsky therefore suggest language emerged approximately between 200,000 years ago and 60,000 years ago (between the arrival of the first anatomically modern humans in southern Africa, and the last exodus from Africa, respectively). "That leaves us with about 130,000 years, or approximately 5,000–6,000 generations of time for evolutionary change. This is not 'overnight in one generation' as some have (incorrectly) inferred—but neither is it on the scale of geological eons. It's time enough—within the ballpark for what Nilsson and Pelger (1994) estimated as the time required for the full evolution of a vertebrate eye from a single cell, even without the invocation of any 'evo-devo' effects."

https://academic.oup.com/jole/article/2/2/114/3112197

Language and Thought

A common theme among those who argue for the sudden emergence of language is that it signaled a change in the manner of thought itself. Thus, Tattersall writes that from about 100 000 years ago ‘we start finding plausible indications that members of the new species were starting to think symbolically’ [5]. Chomsky writes of ‘internal language’ (I-language) as the fundamental basis of human symbolic thought with communication merely a byproduct [1,7]. It is, then, through a secondary process of ‘externalization’ that the diverse languages, spoken and signed, are formed. I-language is considered common to all humans, underlying what Chomsky calls ‘universal grammar’ – a term that goes back to 17th-century scholars who sought to identify aspects of language common to all languages [16]. There have been many attempts to specify such a grammar, once satirized by James D. McCawley in his book Thirty Million Theories of Grammar [17], but Chomsky’s most recent and most economical account is the Minimalist Program [18]. The main ingredient is ‘unbounded Merge’, whereby elements are merged recursively to generate a potential infinity of structures.

The pressure to develop output systems flexible enough to communicate our thoughts and experiences probably grew during the Pleistocene when our hominin forebears were increasingly forced from a forested environment to more open territory with dangerous predators, and survival depended on cooperation and social interaction – the aforementioned ‘cognitive niche’ [13]. The initial impetus may have come from the sharing of episodic events, whether remembered, planned, or fictitious but perhaps also increasingly including information about territory, danger, food sources, tool making, and the habits and abilities of individuals. Its course of evolution is probably best indexed by the threefold increase in brain size that began some 2 million years ago and is probably attributable to the emergence of grammatical language [75] and the vast increase in knowledge that it afforded. The increase in brain size was incremental through this period, again suggesting gradual evolution rather than the sudden appearance of a prodigious Prometheus within the past 100 000 years.

https://www.univie.ac.at/mcogneu/lit/corballis-17.pdf
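
As a toy illustration of the 'unbounded Merge' idea from the excerpt above (just a sketch of the concept, not code from either source): one binary operation, applied recursively to its own output, is enough to build unboundedly deep structure from a finite vocabulary.

```python
# Toy sketch of "unbounded Merge": one binary operation that combines two
# syntactic objects into a new object, which can itself be merged again,
# giving unbounded hierarchical structure from a finite lexicon.

def merge(x, y):
    """Combine two syntactic objects into an unordered set {x, y}."""
    return frozenset([x, y])

# Build "the dog chased the cat" bottom-up:
np1 = merge("the", "dog")        # {the, dog}
np2 = merge("the", "cat")        # {the, cat}
vp = merge("chased", np2)        # {chased, {the, cat}}
clause = merge(np1, vp)          # {{the, dog}, {chased, {the, cat}}}

print(clause)
# Nothing stops merge from applying to its own output, hence "digital infinity".
```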
 
What has always puzzled me is: why so many languages?

I recall asking this same question when I was younger. My instructor stated that war is a huge factor.

It's best if the enemy doesn't know what you're saying/can't understand.
 
I recall asking this same question when I was younger. My instructor stated that war is a huge factor.

It's best if the enemy doesn't know what you're saying/can't understand.

It's more about region.

Language evolves with slang. Locals know the slang, foreigners don't. Eventually slang becomes just language.

Humans have been very separated for a long time.

Languages evolved in parallel.
 
How did they come up with vowels and consonants?
1. To expand the phonetic inventory.
2. They didn't. They didn't plan on creating language, so the phrase "come up with" is misleading and leads to false conclusions. They "came up" with syllables that incidentally consist of vowels framed by consonants. I didn't study language acquisition in children as much as I had hoped, but it is key to approaching any hypothesis on universal grammar and language structure.

You could argue that vowels were part of the human vocal repertoire long before consonants were added to form syllables, if early hominid depictions of communication are in any way accurate. I suppose since then the voice box, tongue and velum have evolved to allow for a more defined phonetic inventory. It's very much observable in infants growing up into children, in how their ability to swallow, gag and speak develops over time; there are biological indicators on a less time-consuming scale as well.
 
but it is key to approaching any hypothesis on universal grammar and language structure.

You could argue that vowels were part of the human vocal repertoire long before consonants were added to form syllables, if early hominid depictions of communication are in any way accurate. I suppose since then the voice box, tongue and velum have evolved to allow for a more defined phonetic inventory. It's very much observable in infants growing up into children, in how their ability to swallow, gag and speak develops over time; there are biological indicators on a less time-consuming scale as well.

@Ginny is there a hypothesis you prefer in regards to universal grammar and language?
It's an interesting topic (genuinely interested).
 
@Ginny is there a hypothesis you prefer in regards to universal grammar and language?
It's an interesting topic (genuinely interested).
There is a hypothesis I like, but it's less focussed on history than the structure of language itself. I have been exposed to RRG (role and reference grammar) by my prof in uni, Dr. Van Valin. I took two more of his courses, one of them on different morphological theories, among them Chomsky's theory on universal grammar. He's a hard one to read. Back then I really had trouble parsing the excerpt. Van Valin was very different in that respect, I highly recommend reading his book - although it is rather focussed on an educated audience that knows the basics of morphology and syntax.

What I consider really important in a grammar theory (especially a universal grammar) is how it treats the lexicon and how it works together with both semantics and pragmatics. Usually pragmatics is left out entirely, because it's so unpredictable and requires a lot of intuition and internal logic to work. It's not exactly compatible with Te, which is the lifeblood of a linguistic structure theory. An ideal theory would have to be able to balance this paradox.
 
There is a hypothesis I like, but it's less focussed on history than the structure of language itself. I have been exposed to RRG (role and reference grammar) by my prof in uni, Dr. Van Valin. I took two more of his courses, one of them on different morphological theories, among them Chomsky's theory on universal grammar. He's a hard one to read. Back then I really had trouble parsing the excerpt. Van Valin was very different in that respect, I highly recommend reading his book - although it is rather focussed on an educated audience that knows the basics of morphology and syntax.

Noted, I don't have the proper educational background for this, but I'll try to have a read of Van Valin's work. Thanks Ginny!
 
@dragulagu

I'm trying to find the document we read in the course (basically his whole book) but then I found this article from 2015:

https://www.computerworld.com/article/2929085/blame-chomsky-for-non-speaking-ai.html

Blame Chomsky for non-speaking A.I.
Communications is the key.


Professor Noam Chomsky revolutionized linguistics in 1957 with his publication of Syntactic Structures, and his Chomsky hierarchy from the previous year remains a foundation stone in computer science for programming languages. But programming languages are a far cry from speaking A.I., and Chomsky’s unprecedented success in that part of linguistics should bear the blame for holding back the advancement in another part of linguistics -- the use of human language for A.I.

Obviously, how we use language to communicate is key, but there are a few flavors of the science of language, or linguistics. Chomsky studied formal linguistics, or "the formal relations between linguistic elements," but another type, functional linguistics, studies "the way language is actually used in communicative context." In other words, amazingly, Chomsky's approach, unlike functional linguistics, is not concerned with actual communications!

Chomsky’s linguistics, without communications, has been responsible for A.I. that doesn’t speak. While it’s not his fault that others used his approach to solve the wrong problems, we now have the opportunity to progress with different science.


Formal linguistics emerges from early computer days
How did we get here? The birth of A.I. was tumultuous. A number of new sciences were coming together, computer science and linguistics in particular, and they were still being developed.

This early work in A.I. was dominated by mathematicians partly due to the archaic stage of digital computers, but while human brains can be good at mathematics, it is just one of the skills they can learn. The problem arises when trying to fit a mathematical model to a non-mathematical brain.

Cognitive science, my discipline, focuses on how our brains work. It combines computer science with philosophy, linguistics, neuroscience, psychology and anthropology. It emerged with the goal of replicating cognition on machines roughly 20 years after A.I. was named at the 1956 Dartmouth Summer Research Project on Artificial Intelligence.

In the first sixty years since computers exploded into our world, we have seen formal and computational linguistics dominate, despite their scientific conflicts. Early success is good, but hitting the target once isn't the same as hitting a bulls-eye. Also, hitting the bulls-eye once isn’t the same as doing it repeatedly. Science is about ongoing accuracy, hitting the bulls-eye every time.



As I wrote recently on this blog, in 1969 John Pierce of Bell Labs advised us to work out the science before pushing ahead with engineering. But probably due to frustration at the lack of progress for over a decade, engineering based on statistics was embraced anyway, before the science was ready.

To meet the increasing demand for speaking A.I., the key is functional linguistics combined with a brain-based platform. Our goal should be to talk like Sonny because, like the evolution of personal computing, once unleashed, progress will be unstoppable.

The right linguistics
Patom theory is my computing approach, in which stored patterns do the work of programmers. But in 2006, as I was adding patterns to the system, the limitations of Chomsky's linguistics hit me.

What's the best way to extract meaning from a matched sentence?

I spent a lot of time researching the answer and decided to create my own model. It was a big decision because it was like starting a whole new scientific investigation. The implementation was difficult, too, because Chomsky's model was a bit like working in an office tower with a broken elevator where each floor possibly held something important. Moving between floors to check was annoying!

And then while browsing in a New Jersey bookshop, I stumbled across the answer. How could I have a degree in cognitive science, but still have missed out on the answers, based on more than 30 years of development, from Role and Reference Grammar (RRG) theory?

RRG deals with functional linguistics and considers language to consist of three pieces – grammar linking to meaning in context. You know, word sequences and meaning in conversation. Communication!

RRG was developed with the inspiration that all human languages are based on common principles and that clauses (parts of sentences) contain meaning. Its success in modeling the range of human languages is impressive. Speaking A.I. can use RRG’s linking algorithm to map word sequences in context to meaning, and vice versa.

It was an eye-opener.

The science speaks for itself in whatever language you read it.

I subsequently met with the primary developer of RRG, Professor Robert D. Van Valin, Jr., who convinced me that I no longer needed to develop a scientific model to link phrases and meaning because RRG already explains how to do it in depth, like a cook book.

It just got better and better. He also pointed out that the same algorithm works for any human language. I was sold, as it not only filled the Chomsky gap, but it meant Patom theory could be used with any language as well. [Disclosure: As our work is synergistic, Van Valin has become one of the advisers to my lab at Thinking Solutions.]

Why isn’t RRG used to speak to machines?
Here we have unfortunate timing. In the 1980s, as RRG was being developed, programmers continued to struggle with Chomsky’s linguistics.

Without waiting for another underlying scientific solution, the industry finally decided to proceed with a method of incremental improvement for computational linguistics, based only on the statistics of sequences of sounds and words.

Despite not meeting expectations, computational linguistics and its fixation on word sequences independent of meaning remains at the core of today's A.I. troubles.

Our next step will build on the new scientific approach using RRG for linguistics and Patom theory for programmer-free computing. It promises progress while the dominant paradigms deliver disappointment. With a plan for the future, speaking A.I. is finally coming of age.

The book: https://1drv.ms/b/s!AgOamNGJZGJn8jsW6DNJ3GyskPl-?e=jZZaQE
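
For anyone curious what "linking word sequences to meaning" could look like in practice, here is a deliberately tiny Python sketch in the spirit of RRG's predicate-argument logical structures. It is my own toy example, not Patom theory and not Van Valin's actual linking algorithm, and it only handles bare "actor verb undergoer" clauses:

```python
# Deliberately tiny sketch of "linking": map a word sequence to a
# predicate-argument structure (who did what to whom), instead of stopping
# at the word sequence itself.

# Hypothetical mini-lexicon: verb -> (predicate, ordered semantic roles)
LEXICON = {
    "chased": ("chase", ["actor", "undergoer"]),
    "saw": ("see", ["actor", "undergoer"]),
}

def link(clause: str) -> dict:
    """Naive linking for bare 'actor verb undergoer' clauses only."""
    actor, verb, undergoer = clause.split()
    predicate, roles = LEXICON[verb]
    return {"predicate": predicate, roles[0]: actor, roles[1]: undergoer}

print(link("dog chased cat"))
# -> {'predicate': 'chase', 'actor': 'dog', 'undergoer': 'cat'}
```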
 
How did they come up with vowels and consonants?

Depends on the alphabet and phonetics. Chinese is very different from the Latin alphabet. Kanji is different from Cyrillic.

There are some common sounds the voice box can make but constructing those sounds into letters and words varies between communities.

Here is the Indo European Tree.

As people moved to new regions, new languages evolved from the local communities.

 