The Skinny on Artificial Intelligence
It seems every week there is a report of yet another advancement in artificial intelligence (AI), each replicating or exceeding some ability once deemed the exclusive domain of human intellect.
The popular media tout self-driving cars as the coming norm, claiming that computer-controlled autopilots are safer than human drivers. Even hearing aids now incorporate AI to minimize the need for volume adjustments.
We are told there are computers that can visually recognize and classify objects with greater speed and accuracy than the average person. While this may be true, there is more to this story than meets the eye, so keep reading.
So what exactly is AI? In simple terms, it is a computer system that mimics one or more aspects of human intelligence.
Due to a shift in approaches, the past decade has brought exceptionally rapid developments in AI. Instead of trying to program every facet of human knowledge and understanding into a computer, today’s computer engineers are exploiting the advantages of something called computer learning.
Computer learning does not rely on breaking down every detail of human understanding into raw data and then entering it into a computer. Scientists have tried this method for decades, but it has proven to be an insurmountable task.
The seemingly simple things we understand are almost innumerable. For example, we know that the second hand on a clock face may disappear momentarily as it crosses over the hour hand, or that a potted plant will die if you give it bleach instead of water, along with trillions of other random facts we take for granted and don't even consider to be actual knowledge. Identifying and systemizing these random chunks of human knowledge seems impossible through data programming alone.
By contrast, computer learning allows the computer to run trillions of hypothetical experiments and observe the outcomes for itself, or to extrapolate relationships from millions of examples. Computer learning relies on an architecture that lets the analysis of data evolve into an understanding of how things work, much the way a child learns about their environment over time.
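To make the idea of learning from examples concrete, here is a minimal sketch in Python. The data and labels are entirely hypothetical: rather than hand-coding a rule that distinguishes a bat from a bird, the program is given a few labelled examples and classifies a new case by finding the most similar one.

```python
# A toy "learning from examples" sketch: nearest-neighbour classification.
# All measurements and labels below are made up for illustration.

def nearest_neighbour(examples, query):
    """Classify `query` by copying the label of the closest known example."""
    def distance(a, b):
        # Squared Euclidean distance between two feature tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    best_features, best_label = min(examples, key=lambda ex: distance(ex[0], query))
    return best_label

# Hypothetical training examples: (wingspan_cm, weight_g) -> label
examples = [
    ((15, 8),   "bat"),
    ((18, 10),  "bat"),
    ((60, 300), "bird"),
    ((70, 350), "bird"),
]

# The program was never told a rule; it infers the answer from the examples.
print(nearest_neighbour(examples, (16, 9)))    # a small, light animal
print(nearest_neighbour(examples, (65, 320)))  # a large, heavy animal
```

Real systems use millions of examples and far richer models, but the principle is the same: the relationship between features and labels is extracted from data rather than programmed in by hand.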
The advantage of the computer is that it can learn a task far more quickly and accurately than a human. For example, a computer may learn to visually identify a bat, regardless of its species, whether it is flying, hanging, feeding, young, old, dead or alive. Once learnt, this knowledge can be shared with other computers relatively quickly. Imagine trying to impart everything you know on a particular subject to another person.
The limitation of current forms of AI is that they operate within immense yet still narrow facets of knowledge; extrapolating the bigger picture is something humans do far better. While a computer may recognize the correct species of bat, it doesn't necessarily know that feeding it bleach is harmful! Nor can a computer know whose voice matters most in the cross-talk of a group conversation.
AI is augmenting human abilities and decision making, but it still has a long way to go to perceive the world on a truly human scale.