How we learn is a question that scientists from different fields approach from different perspectives. Those who work in artificial intelligence, in how its processes are built and how it acquires knowledge, believe that AI can learn vital lessons from our own biology about how to function optimally.
Researchers at Imperial College London have found that variability between brain cells can speed up learning and improve performance, with implications for the future of AI.
The new research found that by varying the electrical properties of individual cells in simulated brain networks, the networks learned faster than simulations made up of identical cells.
The researchers also found that the networks needed fewer of the varied cells to achieve the same results, and that this approach consumed less energy than models built from identical cells.
The authors say their findings could teach us why our brains are so good at learning, and could help us build better artificial intelligence systems, such as digital assistants that can recognize voices and faces, or self-driving car technology.
First author Nicholas Perez, a doctoral student in Electrical and Electronic Engineering at Imperial College London, said: “Our work suggests that the brain needs to be energy efficient while still excelling at solving complex tasks, and that diversity of neurons, in both brains and artificial intelligence, can meet both of these needs and improve learning.”
Neurons like snowflakes
The brain is made up of billions of cells called neurons, which are connected by vast networks that allow us to learn about the world. Neurons are like snowflakes: they look the same from a distance, but on closer inspection it becomes clear that no two are exactly alike.
By contrast, each cell in an artificial neural network, the technology on which AI is based, is identical; only their connectivity differs. Despite the speed at which AI is advancing, these neural networks do not learn as accurately or as quickly as the human brain, and the researchers wondered whether their lack of cellular variability might be the reason.
They set out to study whether mimicking the brain by varying the properties of neural network cells could boost learning in AI. They found that cellular variability improved learning and reduced energy consumption.
Dan Goodman, another author of the paper, from the same institution's Department of Electrical and Electronic Engineering, said: “Evolution has given us incredible brain functions, many of which we are only beginning to understand.”
To conduct the study, the researchers focused on varying each cell's “time constant”, that is, how quickly a cell decides what to do based on what the cells connected to it are doing. Some cells decide very quickly, looking only at what their connected neighbours have just done. Others react more slowly, basing their decision on what other cells have been doing for a while.
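The effect of a time constant can be illustrated with a simple leaky integrator, where a parameter tau controls how quickly a unit's state tracks its input. The sketch below is illustrative only; the function name, values, and update rule are assumptions for demonstration and are not taken from the study itself.

```python
import numpy as np

def leaky_integrate(inputs, tau, dt=1.0):
    """Simulate a leaky-integrator unit: larger tau means a slower response."""
    v = 0.0
    trace = []
    for x in inputs:
        # v relaxes toward the current input x at a rate set by tau
        v += (dt / tau) * (x - v)
        trace.append(v)
    return np.array(trace)

# A step input: off for 5 time steps, then on for 15 time steps
inputs = np.array([0.0] * 5 + [1.0] * 15)

fast = leaky_integrate(inputs, tau=2.0)   # "fast" cell: reacts quickly to recent input
slow = leaky_integrate(inputs, tau=10.0)  # "slow" cell: averages input over a longer window

# The fast cell has climbed much closer to the input by the end
print(fast[-1] > slow[-1])  # prints True
```

A fast unit (small tau) reacts almost immediately to a change in its input, while a slow unit (large tau) responds gradually, effectively smoothing over recent history, which is the distinction the researchers varied across cells.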
After varying the cells' time constants, the researchers set the network a series of benchmark machine learning tasks: classifying images of clothing and handwritten digits, recognizing human gestures, and identifying spoken digits and commands.
The results show that allowing the network to combine slow and fast information lets it solve tasks better in more complex, real-world-like settings.
When they varied the amount of variability in the simulated networks, they found that the best-performing networks matched the amount of variability seen in the brain, indicating that the brain may have evolved just the right amount of variability for optimal learning.
“We show that AI can be brought closer to how our brain works by mimicking certain of its properties. However, current AI systems are far from reaching the level of energy efficiency we find in biological systems,” Perez explains. “As a next step, we will look at how to reduce the energy consumption of these networks, to bring AI networks closer to brain-like performance.”