
Talk Show

After crushing mankind on Jeopardy!, IBM’s Watson, the cutting-edge, speech-understanding computer, is going after the world’s most loathed invention: the phone tree

By Nitasha Tiku | Illustration by Alex Nabaum

SCIENTISTS HAVE BEEN TRYING to build a heap of metal that can understand people since well before Captain Kirk stood on the deck of the U.S.S. Enterprise asking Computer to “Locate crew member Uhura.” But while Hollywood was recasting artificial intelligence as the omnipotent Skynet wreaking havoc on its unsuspecting masters, real-world computer programmers were struggling with a much more mundane problem: how to get a bunch of 1s and 0s to understand natural language.

IBM had a breakthrough and built Watson, a computer designed to beat human encyclopedia Ken Jennings at Jeopardy! After all, to win the game, Watson would need to understand Alex Trebek. But now that Watson’s off the quiz-show circuit — and has proven it plays well with others — IBM is looking to shop the technology around to industries like retail and customer service. For all its sci-fi roots, Watson’s skill in understanding human speech, mining massive databases and returning questions with answers will likely end up being harnessed to address more pedestrian concerns, the kinds of nagging inefficiencies that make you want to slam down the phone when you call an automated customer-service line or that make it hard for store owners to turn a Twitter follower into a sale.

“Retailers are going to jump to the top of the queue for Watson-like technologies because of all the data traffic they generate,” says Peter Charness, president of Manthan Systems, a company that produces analytics software for businesses. Every time you drag a piece of clothing into an online shopping cart, look up the closest store on your smartphone or post about a bad restaurant on Twitter, “it generates an unbelievable blizzard of data,” says Charness. Between web analytics, behavioral tracking and location-aware mobile apps, companies are inundated. The language element comes into play when it’s time for a company’s marketing department to figure out what to do with all that data. “They’re not Ph.D.s in statistical mathematics,” notes Charness. But with Watson, suddenly they can recognize patterns across databases like a number-crunching whiz. Ask Watson, “How do I send that customer a coupon she’ll actually use?” and Watson might tell you she just bought dog food for the first time and suggest you send a deal on a dog bed to her iPad at 8 p.m.

“The whole notion of getting a computer to fluently dialogue in natural language is arguably one of the holy grails of artificial intelligence,” says David Ferrucci, the Ph.D. who led the effort to create Watson. Ferrucci is sitting inside the auditorium of IBM’s upstate New York research center, across from a custom-made Jeopardy! set that no one here seems to be in a hurry to dismantle. In order to teach Watson how to talk to a human, Ferrucci explains, “I would often think about how kids process language. I might say to my kid, ‘You didn’t understand that because it’s extremely intricate.’ Kids find the part of the sentence they’re not able to match, and they generate a hypothesis. They say, ‘Does intricate mean hard, Daddy?’ And I’ll say, ‘Yes!’” He pauses with pride. “Well, we got Watson to do something very similar.”

For every query, Watson generates its own questions, combs the information you’ve fed it and scores the evidence. That diagnostic skill set makes the software a natural fit for healthcare. IBM just launched a five-year plan to commercialize the technology to help diagnose patients. Retail applications are still under discussion, but it’s not hard to jump from a souped-up WebMD to a blissfully easy-to-use customer service line, with Watson explaining that you can’t hook up your Blu-ray because it looks as if you bought the wrong cables.
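For the technically curious, the loop described above (generate candidate answers, gather the evidence, score it) can be sketched in a few lines of Python. Everything below is invented for illustration: the knowledge snippets, the candidate answers and the word-overlap scoring are toy stand-ins, not IBM’s DeepQA code.

# Toy sketch of a generate-and-score question-answering loop.
# All names and data here are hypothetical; this is not IBM's DeepQA.

from dataclasses import dataclass


@dataclass
class Hypothesis:
    answer: str
    score: float = 0.0


# Stand-in "knowledge base": snippets of text the system has been fed.
KNOWLEDGE = [
    "HDMI cables carry both video and audio from a Blu-ray player to a TV.",
    "Component cables carry analog video and cannot replace an HDMI connection.",
    "A Blu-ray player connects to most modern TVs with an HDMI cable.",
]

# Candidate answers; a real system would generate these from the question
# and its corpus rather than from a fixed list.
CANDIDATES = ["HDMI cable", "component cable", "optical cable"]


def generate_hypotheses(question: str) -> list[Hypothesis]:
    """Turn each candidate answer into a hypothesis to be tested."""
    return [Hypothesis(answer=c) for c in CANDIDATES]


def score_hypothesis(hyp: Hypothesis, question: str) -> float:
    """Crude evidence scoring: for every knowledge snippet that mentions the
    candidate, add the number of question words that also appear in it."""
    question_words = set(question.lower().split())
    score = 0.0
    for snippet in KNOWLEDGE:
        text = snippet.lower()
        if hyp.answer.split()[0].lower() in text:
            score += len(question_words & set(text.split()))
    return score


def answer(question: str) -> Hypothesis:
    """Generate hypotheses, score the evidence for each, return the best one."""
    hypotheses = generate_hypotheses(question)
    for hyp in hypotheses:
        hyp.score = score_hypothesis(hyp, question)
    return max(hypotheses, key=lambda h: h.score)


if __name__ == "__main__":
    best = answer("which cable do I need to hook up my Blu-ray player")
    print(f"Best-supported answer: {best.answer} (score {best.score})")

Run as a script, the sketch picks the HDMI candidate simply because more of the snippets support it, a crude echo of the evidence-weighing Watson does at vastly greater scale.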

Of course, the irony here is that businesses need to buy a machine to let employees and consumers be their most human — which is to say, imprecise, uninformed and unable to discern patterns of behavior — and still respond with an answer worthy of a Jeopardy! champ. And while calling 1-800-WATSON may still seem a distant second to talking to actual people, Katharine Frase, IBM’s vice president of industry solutions and emerging business, argues that the program may ultimately offer better customer service. “When I call a help desk, even when I get to talk to a human, I’m never sure they understood what I meant,” she says. “And I’m pretty sure that I never completely understand what they say back.”

SPEECH!

A selected history of natural language processing

1950
English mathematician Alan Turing proposes the Turing Test: a measure of a computer program’s ability to use real-time written conversation to fool a judge into thinking it’s human.

1957
Noam Chomsky introduces the notion of generative grammar, which attempts to describe the framework of rules that govern all natural languages.

1966
MIT develops a computer program named ELIZA that uses pattern matching techniques to converse in ways that appear human — such as mimicking a therapist.

1973
In less than two weeks, video game designer Don Daglow writes Ecala, a computer conversation program that surpasses ELIZA’s functionality.

1994
Lycos founder Michael “Fuzzy” Mauldin coins the term “chatterbot” to describe text-based conversation agents, like, say, Microsoft Office’s “Clippy.”

2004
Charles Lickel, retired vice president of global research software strategy for IBM, gets a bright idea while watching Ken Jennings’ historic Jeopardy! run on TV.
