The Honolulu Advertiser

Posted on: Tuesday, May 21, 2002

Artificial intelligence

By John Yaukey
Gannett News Service

A funny thing happens to the virtual creatures in the popular computer game "Black & White: Creature Isle." They respond independently, which is to say in an unprogrammed way, to the actions of the player and to their surroundings. For example, if you punish a creature for eating another, it will stop devouring its cohorts and curb similar behaviors, establishing a cycle of feedback with the player and its environment. Ultimately, the character develops a highly nuanced relationship with its surroundings — on its own.

They may not be Einsteins. But at a simple level, these creatures learn.
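
The mechanics behind that kind of learning can be surprisingly simple. The Python sketch below, with invented behavior names and numbers, shows the basic feedback idea; it is an illustration of the principle, not the game's actual code:

```python
import random

class Creature:
    """Toy illustration of feedback-driven behavior (not the game's real code)."""

    def __init__(self):
        # Each behavior starts out equally likely.
        self.weights = {"eat_villager": 1.0, "help_farm": 1.0, "throw_rocks": 1.0}
        self.last_action = None

    def act(self):
        # Pick a behavior with probability proportional to its learned weight.
        actions, weights = zip(*self.weights.items())
        self.last_action = random.choices(actions, weights=weights)[0]
        return self.last_action

    def feedback(self, reward):
        # Player praise (+) or punishment (-) nudges the weight of what it just did.
        w = self.weights[self.last_action] + 0.5 * reward
        self.weights[self.last_action] = max(0.05, w)  # never drops all the way to zero

creature = Creature()
for _ in range(100):
    action = creature.act()
    # Punish eating villagers, praise everything else; its habits shift over time.
    creature.feedback(-1.0 if action == "eat_villager" else 1.0)
print(creature.weights)
```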

If you've thought of artificial intelligence — "AI" — as something destined to guide space probes and control robots, you need to take a look right under your nose.

"We have recently witnessed a real surge in the commercialization of AI," said John Laird, a University of Michigan computer scientist who has been researching artificial intelligence for 25 years. "We're talking about AI applications that don't require anything close to human-level intelligence. These are much simpler, but they are AI nevertheless."

Call it AI lite.

The term "artificial intelligence" was coined in the mid-1950s to mean "a subfield of computer science concerned with the concepts and methods of symbolic inference by computer." Translation: computers capable of learning rather than just spitting back preprogrammed answers.

In the early days of AI, its gurus sought to create computers capable of humanlike reasoning but never came close, plunging the field into what came to be known as "AI winter." They have since scaled back their ambitions considerably, producing smart, tightly defined applications for specific tasks, and they're succeeding.

AI is being used widely not only in computer games, but also in movie animation, financial services and consumer credit protection, popular software, airport security and other applications that pop up in daily life.

If you've bought a book recommended by Amazon.com, you've responded to the Web site's "collaborative filtering" software — a fancy name for AI — which makes purchase recommendations based on intelligent associations between your previous purchases and other data on you.
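
Stripped to its essentials, collaborative filtering amounts to "people whose purchases overlap with yours also bought this." The Python sketch below works through that idea on a tiny made-up purchase history; it illustrates the principle rather than Amazon.com's actual system:

```python
from collections import Counter

# Hypothetical purchase histories; a real system would have millions of them.
purchases = {
    "alice": {"sci-fi novel", "robot kit", "star atlas"},
    "bob":   {"sci-fi novel", "robot kit"},
    "carol": {"cookbook", "star atlas"},
}

def recommend(user, purchases, top_n=2):
    """Suggest items bought by customers whose baskets overlap with this user's."""
    mine = purchases[user]
    scores = Counter()
    for other, basket in purchases.items():
        if other == user:
            continue
        overlap = len(mine & basket)       # how much do our tastes overlap?
        if overlap:
            for item in basket - mine:     # only items the user doesn't already own
                scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("bob", purchases))  # ['star atlas']
```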

If you've seen the movie "The Lord of the Rings: The Fellowship of the Ring," you've witnessed the handiwork of AI in creating autonomous animated characters — a real timesaver because filmmakers don't have to program each one.

"A lot of AI is invisible to the user," said David Leake, editor of AI Magazine (www.aaai.org/Magazine/magazine.html). "Because users don't see AI, it's been unsung in many ways."

Learning from the past

AI is an umbrella term that refers to numerous methods of writing code that "thinks."

Some AI programmers create highly adaptable algorithms (formulas that define a sequence of steps) capable of reasoning and self-correcting, usually through complex feedback loops.
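
In miniature, a self-correcting feedback loop can fit in a few lines. The sketch below, with invented numbers, keeps a running estimate and nudges it toward each new observation in proportion to how wrong it was:

```python
def self_correcting_estimate(observations, learning_rate=0.2):
    """Track a value by repeatedly correcting the estimate with its own error."""
    estimate = 0.0
    for observed in observations:
        error = observed - estimate        # feedback: how wrong were we?
        estimate += learning_rate * error  # correct in proportion to the error
    return estimate

# The estimate drifts toward the true signal (about 10) despite noisy readings.
readings = [9.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.2]
print(round(self_correcting_estimate(readings), 2))
```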

Others have found inspiration from the past.

One of the most successful AI paradigms is based on the work of 18th-century Presbyterian minister Thomas Bayes, who dabbled in probability modeling as a hobby. Bayes' work focused on calculating the relationship among multiple variables and then determining how changes in one variable would affect the others.

For example, in scanning documents or databases for information about computer chips, Bayesian-powered AI software is capable of looking for word clusters related to computer topics such as "memory capacity" and "processor speed." By weighing the relationships among the variables (the words and how they relate to one another), the software can tell the difference between documents about computer chips and chocolate chips.
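
That chocolate-versus-computer distinction is a textbook job for a Bayesian classifier. The sketch below applies Bayes-style word probabilities to two tiny invented training sets; real systems learn from vastly more text, but the principle is the same:

```python
import math
from collections import Counter

# Tiny invented training sets for each topic.
computer_docs = ["processor speed and memory capacity of the chip",
                 "the chip doubles memory capacity and processor speed"]
chocolate_docs = ["melt the chocolate chips with butter and sugar",
                  "fold chocolate chips into the cookie dough"]

def train(docs):
    words = Counter(w for d in docs for w in d.split())
    return words, sum(words.values())

def log_prob(doc, words, total, vocab_size):
    # Sum of log P(word | topic), with add-one smoothing for unseen words.
    return sum(math.log((words[w] + 1) / (total + vocab_size)) for w in doc.split())

def classify(doc):
    comp_words, comp_total = train(computer_docs)
    choc_words, choc_total = train(chocolate_docs)
    vocab = len(set(comp_words) | set(choc_words))
    comp = log_prob(doc, comp_words, comp_total, vocab)
    choc = log_prob(doc, choc_words, choc_total, vocab)
    return "computer chips" if comp > choc else "chocolate chips"

print(classify("chip with more memory capacity"))  # computer chips
print(classify("chocolate chips in the dough"))    # chocolate chips
```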

Some of the areas where AI is showing great promise or has already started affecting the daily lives of consumers:

• Gaming. For a variety of reasons, computer games have become one of the leading areas for developing AI applications.

"Games are very attractive places to develop AI because you have closed, circumscribed environments that are protected from the almost infinite number of variables that can seep in from the real world," Leake said. "They're great little laboratories.

AI also helps with more practical concerns such as memory and processor limits. Designing increasingly lifelike games sucks up considerable computing power, forcing game creators to make the game code do some of the heavy lifting.

According to a recent poll cited in Wired magazine (www.wired.com), AI code in the average computer game has increased sevenfold since 1997, injecting new levels of complexity.

Multisensory data

Consider the nonplayer characters in the Xbox game "Halo." Their actions are based on real-time multisensory data about their environments. For example, they synthesize visual and auditory information against a backdrop of memories. The result is characters that become anxious or relaxed in real time based on events around them. This system of simultaneous, multisensory input resembles a human mind far more closely than the old linear "if 'X' then 'Y' " method of programming games did.
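
A toy version of the idea looks something like this. The names, weights and thresholds below are invented for illustration and have nothing to do with how "Halo" is actually programmed; they simply show sensory readings being blended with a short memory instead of a single hard-coded rule:

```python
from collections import deque

class NPC:
    """Toy non-player character whose anxiety blends several senses with memory."""

    def __init__(self):
        self.memory = deque(maxlen=5)   # remembers the last few moments of threat

    def update(self, sees_enemy, hears_gunfire, allies_nearby):
        # Fuse this moment's senses into a single threat reading.
        threat = 0.6 * sees_enemy + 0.3 * hears_gunfire - 0.2 * allies_nearby
        self.memory.append(threat)
        # Anxiety reflects the recent past, not just this instant.
        anxiety = sum(self.memory) / len(self.memory)
        if anxiety > 0.5:
            return "take cover"
        elif anxiety > 0.2:
            return "advance cautiously"
        return "patrol"

npc = NPC()
print(npc.update(sees_enemy=1, hears_gunfire=1, allies_nearby=0))  # take cover
print(npc.update(sees_enemy=0, hears_gunfire=0, allies_nearby=1))  # anxiety eases
```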

"AI is rapidly becoming the differentiator in games," said Steven Woodcock, a gaming programmer from Colorado who also maintains a Web site that tracks AI in gaming (www.gameai.com). "It's the games with smart enemies and good stories that people remember."

• Consumer protection. HNC Software in San Diego makes software that uses AI to expose credit card fraud. The software examines transaction, cardholder and merchant data to ferret out fraud based on what it has learned from patterns in legitimate and fraudulent transactions. The company said it can improve fraud detection rates by up to 70 percent and significantly lower embarrassing false positives.
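
One simple way such software can flag a suspicious charge is to learn a cardholder's normal spending pattern and measure how far a new transaction strays from it. The sketch below does that with a basic statistical threshold; it is an illustration of the approach, not HNC's product:

```python
import statistics

def fraud_score(history, amount):
    """Score a new charge by how many standard deviations it sits from past spending."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid dividing by zero
    return abs(amount - mean) / stdev

# Hypothetical cardholder history, in dollars.
history = [24.0, 18.5, 31.0, 22.0, 27.5, 19.0, 30.0]
for charge in (26.0, 1450.0):
    flag = "review" if fraud_score(history, charge) > 3 else "ok"
    print(f"${charge:>8.2f}: {flag}")
```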

• Personal communications. Microsoft's Outlook Mobile Manager, an application designed to extend communication and personal management functions from desktop computers out to wireless devices, uses AI to filter and prioritize e-mail and other communications it sends.

Mobile Manager uses what Microsoft calls "artificial-intelligence engines" to gather information from user profiles and create customized filters capable of deciding what messages — e-mails, calendar entries and contact information — should be sent as a priority to wireless devices.
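
In spirit, such a filter is a scoring function applied to each incoming message, weighted by what the user's profile says matters. The sketch below is a made-up example of that idea, not Microsoft's engine:

```python
# Hypothetical user profile: senders and keywords the user cares about.
profile = {
    "vip_senders": {"boss@example.com", "spouse@example.com"},
    "keywords": {"meeting", "deadline", "flight"},
}

def priority(message, profile):
    """Score a message; higher scores get pushed to the wireless device first."""
    score = 0
    if message["from"] in profile["vip_senders"]:
        score += 2
    score += sum(1 for word in profile["keywords"] if word in message["subject"].lower())
    return score

inbox = [
    {"from": "newsletter@example.com", "subject": "Weekly specials"},
    {"from": "boss@example.com", "subject": "Deadline moved: meeting at 3"},
]
for msg in sorted(inbox, key=lambda m: priority(m, profile), reverse=True):
    print(priority(msg, profile), msg["subject"])
```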

• Voice recognition. AI is the driving force behind potentially the most important development in computer innovation since the graphical user interface: voice recognition. This goes beyond the fairly pedestrian technology now available through some wireless services where you can ask for "stock quotes."

Legions of application writers are incorporating AI into natural-language voice recognition systems capable of acting as an automated secretary, butler and personal valet rolled into one.

These systems will look up movie times, buy tickets, search for nearby Italian restaurants, make reservations and check the weather — without a single mouse click.

• The desktop. AI's evolution hasn't been without a few duds, and the computer desktop has seen some of the most infamous. To wit: "Clippy," Microsoft's cartoon paper clip character designed to pop onto computer screens and help with tasks such as writing formal letters. Clippy used some components of AI, but not to the satisfaction of many Windows users who said the icon intruded more often than it helped, and successfully called for its ouster.

Clippy, however, taught AI developers an important lesson:

Help only works when it's wanted, and sometimes the smartest move for an AI application is to stay out of the way.

Where experts once predicted an unstoppable revolution of AI, they now speak more of a comfortable renaissance where machines learn to lend a hand when needed rather than take over.

Most of it will continue to unfold under the radar screen and behind the scenes where AI has already been quietly changing the way chips and wires interact with flesh and blood.