According to prescouter.com, the brain easily keeps its current lead in intelligence over machines for a number of reasons. First, it stores and processes information within the same units: neurons and their synapses. Second, beyond this superior architectural design, the brain holds a clear advantage in sheer numbers if neurons are taken as the analogue of processor cores. Advanced supercomputers have up to 10 million cores, while the brain features nearly 100 billion neurons.
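The gap in raw unit counts is easy to put in perspective with a quick back-of-the-envelope calculation, using only the two figures quoted above (and ignoring that a core and a neuron are not directly comparable units):

```python
# Figures from the paragraph above: ~10 million cores in an advanced
# supercomputer versus ~100 billion neurons in the human brain.
cores = 10_000_000
neurons = 100_000_000_000

# The brain has roughly 10,000 times as many "units".
print(neurons // cores)  # 10000
```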
"In computer science, significant research is directed to creating new computing units modeled according to neuronal function. This direction is referred to as neuromorphic engineering. In neuroscience, most efforts are directed towards understanding, as well as preventing age and disease-induced deterioration of brain function. Relatively small efforts are put to research for enhancing overall processing power and functioning of the normal human nervous system. Enhancing human brain power by interfering with the basic functional parameters, may provide the sufficient counterweight to the “existential risks” posed by the rise of AI," wrote Giorgi Kharebava.
In the developed brain, significant improvements to architecture will be nearly impossible to implement in the near future. However, a temporary or even permanent improvement to the brain’s processing speed could be a much easier reach for current neuroscience. The cognitive power of the brain is, in significant part, a reflection of two processes: impulse conduction along the axon and synaptic transmission. The speed of these functions is key to better processing power and is highly variable across the brain. Maximizing or enhancing these parameters through molecular manipulation may significantly boost overall processing speed, and hence cognitive function.
source: Wall Street Daily
AI can be simply divided into two streams: generalised AI, which we call Machine Learning (ML), and applied AI, which focuses on replicating human behavior, such as building robots. "Artificial intelligence (or AI) is a system of computing that aims to mimic the power of the human brain. We have more than 100 trillion neurons, or electrically conducting cells in our brain, that give us the incredible computing power for which we are known. Computers can do things like multiply 134,341 by 989,999 really well, but they can't do things like recognize human faces or learn or change their understanding of the world. At least not yet, and that's the goal of AI: to devise a computer system that can learn, process images and otherwise be human-like," wrote Torah Kachur for cbc.ca.
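As a trivial illustration of the point in the quote above, the exact multiplication Kachur mentions is effortless and error-free for a computer:

```python
# Computers excel at exact arithmetic: the multiplication from the
# quoted example is evaluated instantly and without error.
product = 134341 * 989999
print(product)  # 132997455659
```

Recognizing a face, by contrast, cannot be written down as a single exact expression like this, which is precisely the gap AI research tries to close.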
In either case, generalized AI (ML) or applied AI, the system learns from historical data and parameters, translates that learning into higher-order logic or pattern recognition, and does its job.
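A minimal sketch of "learning from historical data" is a 1-nearest-neighbour classifier: it memorizes past examples and labels a new point by pattern similarity. The training points and labels below are invented purely for illustration; real ML systems use far richer data and models.

```python
def nearest_neighbor(train, labels, point):
    """Return the label of the training point closest to `point`."""
    def dist(a, b):
        # Squared Euclidean distance between two coordinate tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best = min(range(len(train)), key=lambda i: dist(train[i], point))
    return labels[best]

# "Historical data": two small clusters of 2-D points with labels.
history = [(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.2, 4.8)]
tags = ["low", "low", "high", "high"]

# New points are labelled by which past examples they resemble.
print(nearest_neighbor(history, tags, (0.3, 0.1)))  # low
print(nearest_neighbor(history, tags, (4.9, 5.1)))  # high
```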
According to journal.thriveglobal.com, as of 2017 brains still had a leg up on AI. By comparison, human brains can process far more information than the fastest computers; in the 2000s, the complexity of the entire Internet was compared to that of a single human brain. After all, brains are great at parallel processing and sorting information. "They are so good at some activities that we take their strengths for granted, like being able to recognize a cat, tell a joke, or make a jump shot. Brains are also about 100,000 times more energy-efficient than computers, but that will change as technology advances," said Frits van Paasschen.
At the same time, estimates are that computers will surpass the capability of human brains around the year 2040, plus or minus a few decades. Once computers reach "human capacity," they may just keep on improving, since they are not burdened by the constraints that hold brains back. Neurons, the brain’s building blocks, can fire only about 200 times per second, or 200 hertz.
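To see how stark that constraint is, compare the 200 Hz figure from the text with a processor clock rate. The 3 GHz value below is an assumed figure for a typical modern CPU, used only for scale, and a clock cycle is of course not the same thing as a neuron firing:

```python
# Rough back-of-the-envelope comparison. The 200 Hz figure comes from
# the text; the 3 GHz clock rate is an assumption for a typical CPU.
neuron_rate_hz = 200
cpu_clock_hz = 3_000_000_000  # assumed 3 GHz

# A single core cycles roughly 15 million times for each neuron spike.
print(cpu_clock_hz // neuron_rate_hz)  # 15000000
```

The brain compensates for this slow unit speed with massive parallelism, which is exactly the trade-off the preceding paragraphs describe.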
When it comes to differences, http://scienceblogs.com points out 10 important differences between brains and computers.