Of all the behaviors that people engage in, speech is the biggest mystery. Even the physical act of speaking is bizarre, if you think about it. We use our tongues to mold air currents into the right frequencies, building structured strings of sounds that our fellow humans can pick up once they hit their eardrums; somehow, our brains then decode these sounds back into the intended content, building up a coherent, meaningful idea.
Because I am able to put the ideas from my brain into these sound signals, and because you are able to decode them back into ideas, I can put my ideas in your head and you can put your ideas in mine. This ability to influence each other’s mental states through verbal communication thus gives rise to shared belief systems, which underlie things like religion, money, and science – basically, everything that distinguishes us from other species.
A key question in psychology is which of these linguistic abilities are inborn. Influential nativists like Noam Chomsky and Steven Pinker argue that we are born into the world with a language acquisition device in our heads, preprogrammed with thoughts in the shape of grammatical sentences. We just have to learn the meanings of words and how to produce the sounds that express them (or gestures, or anything else with sufficiently rich expressive possibilities). But otherwise, language development is more a matter of biological maturation than of real learning.
The prime argument motivating this idea is the so-called poverty of the stimulus argument. This argument holds that children do not receive properly structured grammatical sentences as input (anybody who has ever read a verbatim transcript of speech knows this to be true). In addition, children learn without instruction, and do so at a staggering pace: nobody seems to actively teach young children how to speak (i.e., nothing resembling high school grammar lessons takes place in how parents interact with toddlers), yet they learn it faster than they learn to tie their shoelaces. Since it doesn't look like children get either the data or the feedback required to learn grammatical rules, these rules must have been there all along, the nativists say – genetically encoded into our brains by evolution.
This used to be a strong argument. However, the development of artificial intelligence shows that statistical algorithms are able to pick up a staggering amount of detail from imperfect data in pattern recognition, without much preprogrammed structure. Newer programs also seem able to pick up rules by themselves; e.g., they can learn how to play computer games all on their own. This is all the more interesting if you consider that our current computers are still small and insignificant compared to the human brain, with its massive architecture involving some 10¹⁵ connections between 100 billion neurons.
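To make the statistical idea concrete, here is a minimal sketch (not any particular AI system, just an illustration): a bigram model that counts which words follow which in a tiny, imperfect sample of speech. No grammatical rule is programmed in; regularities such as "the" being followed by a noun emerge purely from the counts. All names and the toy corpus are invented for the example.

```python
from collections import defaultdict, Counter

def train_bigrams(sentences):
    """Count word-to-word transitions; no grammar rules are built in."""
    counts = defaultdict(Counter)
    for sentence in sentences:
        words = ["<s>"] + sentence.split() + ["</s>"]  # sentence boundary markers
        for prev, cur in zip(words, words[1:]):
            counts[prev][cur] += 1
    return counts

def most_likely_next(counts, word):
    """Predict the most frequent follower of `word` in the training data."""
    return counts[word].most_common(1)[0][0]

# A tiny, "impoverished" input sample: the model still extracts the pattern
# that "the" is followed by a noun, without being told what a noun is.
corpus = ["the dog barks", "the cat sleeps", "the dog sleeps"]
model = train_bigrams(corpus)
print(most_likely_next(model, "the"))  # prints "dog" (seen twice after "the")
```

Real language models are vastly more sophisticated, but the principle is the same: structure is extracted from the statistics of the input rather than built in beforehand.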
So the next decades are going to be extremely interesting for the language debate. If we can't create linguistically competent AI without putting the basic thought structure into the program, this will reinforce the nativist position. If we are able to build AI that can learn language from the get-go, however, then the nativist position will be seriously weakened. I wouldn't dare to place a bet right now.