Human brain structure inspires artificial intelligence
Scientists are looking at how the brain processes information, and how that can be used in AI
The human brain is the most powerful supercomputer on Earth, and now researchers from the University of Southern California are taking inspiration from the structure of the human brain to make better artificial intelligence systems.
What is artificial intelligence?
Artificial intelligence (or AI) is a system of computing that aims to mimic the power of the human brain. We have roughly 86 billion neurons, or electrically active cells, in our brains, linked by some 100 trillion connections, which give us the incredible computing power for which we are known. Computers can do things like multiply 134,341 by 989,999 really well, but they struggle with things like recognizing human faces, learning, or changing their understanding of the world. At least not yet, and that's the goal of AI: to devise a computer system that can learn, process images and otherwise be human-like.
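The gap described above is easy to see on the arithmetic side: the multiplication in the paragraph is something any computer finishes in a fraction of a microsecond. A minimal check, using the same numbers:

```python
# Computers excel at exact arithmetic: the multiplication mentioned
# above completes almost instantly on any modern machine.
product = 134_341 * 989_999
print(product)  # 132997455659
```

The hard part, as the article notes, is everything this one-liner can't do: recognizing a face takes no conscious effort for us, but no single instruction for a computer.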
Why do we want a computer that is human-like?
Very good question! Part of this answer is: why not? AI is the holy grail for computer scientists who want to make a computer as powerful as the human brain. Basically, they want to create a computer that doesn't need to be programmed with all the variables because it can learn them just like our brain does.
Another reason scientists are interested in AI is that it could be used for things like surveillance and face recognition, and having computer systems that can learn new terrain or solve a new problem somewhat autonomously, which, in certain situations, could be very beneficial.
Why is it so hard to mimic the human brain?
In order to fully mimic the power of our own cognitive capacity, we have to first understand how the brain works, which is a feat in and of itself. We have to re-engineer and re-envision the computer to be completely different from hardware to software and everything in between, and the reason we have to do this has to do with how our brains are powered.
"If we compare, for example, our brain to the supercomputers we have today, they run on megawatts, [which is] a huge amount of power that's equivalent to a few hundred households, while our brain only relies on water and sandwiches to function," said artificial intelligence and computing expert Han Wang of the University of Southern California. "It consumes power that's equivalent to a light bulb."
So you see the incredible efficiency of millions of years of evolution on our brain means we have learned to work with limited resources and become so power-efficient that we can beat a supercomputer for complex processing without breaking the energy bank.
How does the brain work at such low energy levels?
This is where the main difference between the brain and the computer lies.
"Our current computers, there's a very powerful core…but then you have a long queue of tasks [which] come in sequentially and are processed sequentially," Wang said. "While our brain, the computation[al] units, which are the neurons, are connected in [a] highly parallel manner. It's this high level [of] parallelism that has advantages in learning and recognition."
So it's the parallelism in the brain that allows us to use only what we need only when we need it, and to not waste energy on running background processes that we all know slow down our computing power.
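To make the contrast concrete, here is a toy calculation, not from the article, comparing how many time steps a queue of tasks needs on one powerful sequential core versus many simple units working in parallel:

```python
import math

def steps_needed(num_tasks: int, num_units: int) -> int:
    """Time steps to finish num_tasks when num_units process them in parallel.

    With one unit, tasks are handled one after another (the queue Wang
    describes); with many units, the work is divided among them.
    """
    return math.ceil(num_tasks / num_units)

tasks = 1000
print(steps_needed(tasks, 1))    # one sequential core: 1000 steps
print(steps_needed(tasks, 100))  # 100 parallel units: 10 steps
```

This is of course a simplification: it assumes the tasks are independent, which is exactly the kind of workload, like pattern recognition, where the brain's parallelism shines.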
What's the new finding that helps us get closer to making computers like the brain?
It's this concept of running at low energy in parallel circuits. The key to this is to make computer circuits more complex in the messages they can send.
In a typical computer, each node sends a one or a zero, and long series of ones and zeros encode every program and piece of data.
In the brain, very small circuits can send a one, which means go; a zero, which means no signal; a two, which says stop; or both a one and a two at the same time.
In other words, our brains can send double the information in any given exchange compared to a computer, and that, coupled with smaller networks working in parallel, reduces the power strain.
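The "double the information" claim checks out with a line of information theory: a signal with four distinguishable states (no signal, go, stop, or go-and-stop together) carries log2(4) = 2 bits per exchange, versus 1 bit for a binary signal. A quick sketch:

```python
import math

def bits_per_symbol(num_states: int) -> float:
    """Information carried by one signal that can take num_states values."""
    return math.log2(num_states)

print(bits_per_symbol(2))  # binary wire: 1.0 bit per exchange
print(bits_per_symbol(4))  # four-state signal: 2.0 bits, i.e. double
```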
What Wang and colleagues did was to create a device built from tin selenide and black phosphorus that can send a stop, go, do-nothing, or combined signal, depending on the voltage applied.
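In software terms, such a device behaves like a voltage-controlled, multi-valued switch. The sketch below is purely illustrative; the threshold voltages are invented for the example and are not from the study:

```python
def signal_state(voltage: float) -> str:
    """Map an input voltage to one of four output states.

    The thresholds here are hypothetical, chosen only to illustrate
    the idea of one wire carrying more than two distinguishable signals.
    """
    if voltage < 0.2:
        return "none"  # no signal (the zero)
    elif voltage < 0.5:
        return "go"    # the one
    elif voltage < 0.8:
        return "stop"  # the two
    else:
        return "both"  # one and two together

print(signal_state(0.3))  # go
```

A conventional transistor, by contrast, only distinguishes "below threshold" from "above threshold", which is why it can only ever say zero or one.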
Now the plan is to re-engineer the computer from the ground up: a machine with the capacity for these low-voltage decisions, its messages not funneled through the few powerful cores we see today, but carried by many circuits working in parallel, the way the brain does.
Until recently, this was only a theoretical concept, because there was no way to pack this much information into a single transmission.
So, brain-like artificial intelligence may be only a few brilliant research careers away from becoming a reality.