London: Researchers have developed a new cognitive model, made up of two million interconnected artificial neurons, which can learn to communicate using human language.
The model, developed by researchers from the University of Sassari in Italy and the University of Plymouth in the UK, is called ANNABELL (Artificial Neural Network with Adaptive Behaviour Exploited for Language Learning).
It can learn a new language from scratch, only through communication with a human interlocutor.
The research sheds light on the neural processes that underlie the development of language.
Researchers have not yet been able to explain how our brain develops the ability to perform complex cognitive functions, such as those needed for language and reasoning.
In the human brain there are about one hundred billion neurons that communicate by means of electrical signals.
We might think that the brain works in a similar way to a computer, since computers also operate through electrical signals.
However, apart from the structural differences, there are profound differences between the brain and a computer, especially in learning and information processing mechanisms.
Computers work through programmes developed by humans. These programmes encode rules that the computer must follow when handling information to perform a given task.
However, there is no evidence of such programmes in our brain.
Many researchers believe that our brain develops higher cognitive skills simply by interacting with the environment, starting from very little innate knowledge. The ANNABELL model appears to confirm this perspective.
ANNABELL has no pre-coded language knowledge; it learns only through communication with a human interlocutor, using two fundamental mechanisms that are also present in the biological brain: synaptic plasticity and neural gating.
Synaptic plasticity is the ability of the connection between two neurons to increase its efficiency when the two neurons are often active simultaneously.
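The plasticity rule described above (a connection strengthens when its two neurons are repeatedly active together) can be sketched in a few lines. This is only an illustrative toy, not ANNABELL's actual implementation; the function name, the learning rate, and the saturating update are assumptions made for the example.

```python
# Toy sketch of Hebbian synaptic plasticity (illustrative, not ANNABELL's code):
# the connection weight grows whenever the pre- and post-synaptic neurons
# are active at the same time, saturating toward a maximum efficiency of 1.0.

def hebbian_update(weight, pre_active, post_active, learning_rate=0.1):
    """Strengthen the synapse when both neurons fire simultaneously."""
    if pre_active and post_active:
        weight += learning_rate * (1.0 - weight)  # move part-way toward 1.0
    return weight

w = 0.2
# repeated co-activation makes the connection progressively more efficient
for _ in range(5):
    w = hebbian_update(w, pre_active=True, post_active=True)
print(round(w, 3))  # → 0.528
```

Because the update fires only on simultaneous activity, connections between neurons that rarely co-activate stay weak, which is what makes the mechanism suitable for learning and long-term memory.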
This mechanism is essential for learning and for long-term memory.
Neural gating mechanisms are based on the ability of certain neurons (called bistable neurons) to behave as switches that can be turned "on" or "off" by a control signal coming from other neurons.
When turned on, the bistable neurons transmit the signal from one part of the brain to another; otherwise, they block it.
Through synaptic plasticity, the model learns to control the signals that open and close these neural gates, thereby regulating the flow of information among different areas.
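The gating mechanism can likewise be sketched as a latch that either relays or blocks a signal, depending on a control input from other neurons. Again this is a minimal illustration under assumed names, not the model described in the paper.

```python
# Sketch of a neural gate built from a bistable neuron (illustrative only):
# the neuron latches "on" or "off" under a control signal, and in the "on"
# state it relays a signal from one brain area to another; otherwise it blocks it.

class BistableGate:
    """A switch-like neuron: stays in its last state until controlled again."""

    def __init__(self):
        self.open = False  # gate starts closed

    def control(self, turn_on):
        # a control signal coming from other neurons flips the switch
        self.open = turn_on

    def transmit(self, signal):
        # when open, pass the signal through; when closed, block it
        return signal if self.open else 0.0

gate = BistableGate()
print(gate.transmit(0.8))  # gate closed: signal blocked → 0.0
gate.control(True)
print(gate.transmit(0.8))  # gate open: signal relayed → 0.8
```

In this picture, "learning to control the flow of information" means using plasticity to adjust which control signals reach which gates, so that the right pathways open for a given task.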
The cognitive model was validated using a database of about 1,500 input sentences based on the literature on early language development. It responded by producing about 500 output sentences containing nouns, verbs, adjectives, pronouns, and other word classes, demonstrating a wide range of capabilities in human language processing.
The study was published in the journal PLOS ONE.