1. Machine learning is moving more and more towards indirect programming, i.e. you program the computer with a learning algorithm and let it work out what to do (Google "reinforcement learning" or "machine learning" for background). This greatly reduces the programming bottleneck.
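To make the idea concrete, here is a minimal sketch of tabular Q-learning on a toy five-cell corridor. The environment, reward, and hyperparameters are all invented for illustration; the point is that the only thing written by hand is the update rule, and the policy is discovered from reward.

```python
# Minimal tabular Q-learning sketch: the "program" is just the update rule;
# the policy itself is learned from reward, not written by hand.
# The toy corridor environment and all hyperparameters are invented for illustration.
import random

N_STATES, ACTIONS = 5, [-1, +1]          # positions 0..4, move left or right
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1    # learning rate, discount, exploration
Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

def step(state, action):
    """Move in the corridor; reward 1 only for reaching the rightmost cell."""
    nxt = min(max(state + action, 0), N_STATES - 1)
    reward = 1.0 if nxt == N_STATES - 1 else 0.0
    return nxt, reward, nxt == N_STATES - 1

for episode in range(500):
    state, done = 0, False
    while not done:
        # Epsilon-greedy: mostly exploit the current estimate, sometimes explore.
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = max(ACTIONS, key=lambda a: Q[(state, a)])
        nxt, reward, done = step(state, action)
        # The whole "program": nudge Q toward reward plus discounted future value.
        best_next = 0.0 if done else max(Q[(nxt, a)] for a in ACTIONS)
        Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])
        state = nxt

# The greedy policy now moves right from every cell, though no line says "go right".
print([max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(N_STATES)])
```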
2. People underestimate how much processing power the human brain has. Think 100,000,000,000 (10^11) neurons, each with roughly 1,000 active connections on average and perhaps 10,000 latent ones (being updated via Hebbian learning). The connections (axons and dendrites) are the active processing units, and the cycle time is about 0.01 seconds. That works out to roughly 10^16 operations per second, and only the very largest computers come anywhere near it. My current desktop is about 10,000 times less powerful. Now imagine trying to build a tractor with a 1/100 horsepower motor: such a difference is beyond being a gap, it is a qualitative difference.
Given the limited processing power available, it is amazing computers can do what they can. Back in the 1980s a large bank was run on the equivalent of a cube of brain tissue 0.1 mm on a side.
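As a sanity check, here is the back-of-the-envelope arithmetic behind both claims. The neuron count, connections per neuron, and cycle time are taken from point 2 above; the whole-brain volume of roughly 1.4 litres is my own assumed figure for the tissue comparison.

```python
# Arithmetic behind the claims above. Neuron count, connections per neuron,
# and cycle time are from the text; the whole-brain volume is an assumption.
neurons = 1e11              # neurons in the human brain
connections = 1e3           # active connections per neuron
cycles_per_sec = 1 / 0.01   # one update every ~0.01 s

brain_ops = neurons * connections * cycles_per_sec
print(f"whole brain: ~{brain_ops:.0e} ops/s")            # ~1e+16

# The 1980s-bank comparison: a (0.1 mm)^3 cube as a fraction of the brain.
brain_volume_mm3 = 1.4e6    # assumed: ~1.4 litres of brain tissue
cube_mm3 = 0.1 ** 3
tissue_ops = brain_ops * cube_mm3 / brain_volume_mm3
print(f"(0.1 mm)^3 of tissue: ~{tissue_ops:.0e} ops/s")  # ~7e+06
```

Roughly 10^7 operations per second is indeed in the territory of a 1980s mainframe, so the comparison holds up on the stated numbers.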
There are some caveats to comparing the computing power of the brain to computers. First, computers have vastly more serial processing power than the brain. Transistors are thousands to millions of times faster than neurons: a neuron can fire only about 200 times per second, so signals step through the brain relatively slowly. For problems that can be done iteratively, computers are vastly superior (see the rough comparison after the next point).
Second, computers are intentionally designed to be general purpose, and they sacrifice a lot of potential speed to do this. If you build an algorithm directly into hardware you get vastly more performance (e.g. Bitcoin mining ASICs), but doing so is extremely expensive. Being general purpose has a lot of advantages, though, and computers will always be faster than neurons at many things.
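Returning to the serial-speed caveat: a quick comparison of dependent steps per second, assuming a 3 GHz desktop CPU retiring roughly one dependent operation per cycle (the ~200 per-second neuron figure is from above, and the 3 GHz clock is my assumption).

```python
# Rough serial-speed gap: dependent steps per second, CPU vs a single neuron.
# The 3 GHz figure is an assumed modern desktop clock; ~200 Hz is from the text.
cpu_steps = 3e9        # assumed: ~one dependent operation per cycle at 3 GHz
neuron_steps = 200     # a neuron fires at most ~200 times per second

print(f"serial advantage: ~{cpu_steps / neuron_steps:.0e}x")  # ~2e+07
```

That seven-order-of-magnitude gap is why anything that must be computed step by dependent step goes to the computer.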