Artificial intelligence (AI) represents the best of computing without the drawbacks of human reasoning
THE MIL & AERO COMMENTARY – Seems like everything that has to do with military computing today has some sort of artificial intelligence (AI) angle to it. AI is a catchy phrase that captures the imagination. Yet when we look beyond the shiny veneer, there’s a lot of serious and difficult computer science going on to help humans make quick decisions, help unmanned vehicles navigate and carry out missions on their own, and many other computer tasks thought to be impossible only a few years ago.
Military electronics designers are asking a lot from AI today. We want it for rudimentary jobs like helping manned and unmanned aircraft navigate to and from their mission areas, and making sure the right equipment is on the battlefield to supply the troops. We also want artificial intelligence for new endeavors like ferreting out fact from potential enemy propaganda in news reports, deploying new control surfaces on aircraft to make the most of aerodynamic efficiency, and evading enemy attempts to jam tactical communications.
Artificial intelligence can be a tough term to pin down. Is it real intelligence? Well, no, and it probably won't be in my lifetime, maybe never. Essentially it describes a computer's ability to learn from its experience, particularly from its mistakes, like a human does. I don't think the goal of AI research is to create complex human-like thinking. The aim is to pull relevant data out of experience and then do what a computer does best, which is to process that data very quickly.
Does AI do some kinds of information processing better than a human? Well, it might. Computing is not bogged down by emotions, a difficult upbringing, burning last night's dinner, partying too hard on the weekend, or doing things the way we've always done them.
Is AI a substitute for human intelligence? Probably not. The human brain is staggeringly complex, and is shaped through evolution to make potentially life-saving decisions with a minimum of information. Human reasoning relies on intuition, a gut feel, a hunch. What are those, by the way? I can't really explain them, much less program them. Any takers out there on programming a computer to have intuition?
Just how does the human brain do what it does? I don’t think anyone’s quite sure. People might claim they know, but a deep understanding of the human mind is in its infancy. People a hundred years from now might not be any closer to figuring out how the brain works than they are today.
What will change, however, is what we know about how computers work, and what we’ll learn about them over time. Will we be able to emulate the complex firing of a countless number of neurons in an advanced computer architecture? Maybe, and maybe not.
The point is that it doesn't much matter, because we have a pretty good understanding of how computers work. We know what their strengths and weaknesses are, and how to build computers to do what they do best. Computers can sift through mountains of data very quickly — much faster than humans can. Computers don't get tired, hungry, distracted, or bored, and they don't have to stop in mid-task and explain to a 3-year-old why grass is green.
Furthermore, we can design computers to run several different kinds of processing in parallel, with each doing what it's strongest at. Think of a team of home organizers descending on your house and accomplishing in a matter of hours what you hadn't even dreamed of tackling over the past three years.
This takes us back to the question, what is artificial intelligence? Today it’s a nice marketing phrase for some of today’s most advanced computing. Today’s artificial intelligence will be tomorrow’s conventional processing … and THEN what will artificial intelligence be? Probably what it is today — computing on the cutting-edge.
Suffice it to say that today’s artificial intelligence researchers should stick to what computers do best, and improve each new generation of technology. The most advanced computing always will be described as intelligent. Let us just be thankful that machine intelligence doesn’t have the worst weaknesses of human reasoning.