IBM Watson
Watson is an artificial intelligence program developed by IBM to answer questions posed in natural language; it is named after IBM's founder, Thomas J. Watson.[1] It is being developed as part of the DeepQA research project.[2] The program is in the final stages of completion and will run on a POWER7 processor-based system. It is scheduled to compete on the television quiz show Jeopardy! as a test of its abilities; the competition will air in three Jeopardy! episodes running February 14–16, 2011, at 7 PM EST. Watson will compete against Brad Rutter, the biggest all-time money winner on Jeopardy!, and Ken Jennings, the record holder for the longest championship streak.[3][4]
Technology
IBM states:
Watson is an application of advanced natural language processing, information retrieval, knowledge representation and reasoning, and machine learning technologies to the field of open domain question answering. At its core, Watson is built on IBM's DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring.[5]
Watson is a workload-optimized system designed for complex analytics, made possible by integrating massively parallel POWER7 processors and the IBM DeepQA software to answer Jeopardy! questions in under three seconds. Watson is made up of a cluster of ninety IBM Power 750 servers (plus additional I/O, network and cluster controller nodes in 10 racks) with a total of 2880 POWER7 processor cores and 16 terabytes of RAM. Each Power 750 server uses a 3.5 GHz POWER7 eight-core processor, with four threads per core. The POWER7 processor's massively parallel processing capability is an ideal match for Watson's IBM DeepQA software, which is embarrassingly parallel (that is, a workload whose many threads execute in parallel independently of one another).[6] A minimal illustrative sketch of such an embarrassingly parallel scoring workload appears below.
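The following Python sketch is purely illustrative and is not IBM's DeepQA code: every function and variable name is invented for exposition, and a toy lexical-overlap scorer stands in for the many real evidence-scoring analytics DeepQA runs. It shows only the structural point the quoted description makes: each (candidate answer, evidence passage) pair can be scored independently, so the work fans out across processor cores with essentially no coordination between tasks.

```python
# Illustrative sketch of an embarrassingly parallel evidence-scoring workload.
# Not IBM's DeepQA API; all names here are invented for exposition.
from concurrent.futures import ProcessPoolExecutor
from itertools import product


def score_pair(pair):
    """Score one (candidate answer, evidence passage) pair independently.

    A crude lexical-overlap count stands in for a real evidence scorer."""
    candidate, passage = pair
    overlap = len(set(candidate.lower().split()) & set(passage.lower().split()))
    return candidate, overlap


def gather_evidence(candidates, passages, workers=8):
    """Fan every pair out across worker processes and sum per-candidate scores.

    No pair depends on any other, so the tasks parallelize trivially."""
    totals = {c: 0 for c in candidates}
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for candidate, score in pool.map(score_pair, product(candidates, passages)):
            totals[candidate] += score
    return totals


if __name__ == "__main__":
    candidates = ["Thomas J. Watson", "Herman Hollerith"]
    passages = [
        "IBM was led for decades by Thomas J. Watson",
        "the Watson system is named for IBM founder Thomas J. Watson",
    ]
    print(gather_evidence(candidates, passages))
```

Because no scored pair depends on any other, adding cores shortens wall-clock time almost linearly; on this view, a 2880-core cluster is simply a very wide fan-out of the same pattern.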
While primarily an IBM effort, the development team includes faculty and students from Carnegie Mellon University, University of Massachusetts, University of Southern California/Information Sciences Institute, University of Texas, Massachusetts Institute of Technology, University of Trento, and Rensselaer Polytechnic Institute.[7]
Genesis
An IBM executive had proposed that Watson compete on Jeopardy!, but the suggestion was initially dismissed. In competitions run by the United States government, Watson's predecessors had been able to answer no more than 70% of questions correctly, and they often took several minutes to come up with an answer. To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and the challenges posed by competing on the game show were initially deemed impossible to solve.[8]
In initial tests run in 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the top real-life competitors buzzed in half the time and answered as many as 95% of questions correctly, Watson's first pass could get only about 15% right. In 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems posed. The team provided Watson with millions of documents, including dictionaries, encyclopedias, bibles, novels, plays and other material it could use to build its knowledge. Rather than relying on a single algorithm, Watson uses thousands of algorithms simultaneously to understand the question being asked and find the correct path to the answer.[9] The more algorithms that independently arrive at the same answer, the more likely Watson is to be correct; once it has narrowed the field to a small number of potential solutions, it checks them against its database to ascertain whether they make sense (a minimal sketch of this consensus idea appears after this paragraph). In a sequence of 20 mock games, human participants used the six to eight seconds it takes to read the clue to decide whether to buzz in with the correct answer; Watson uses that same interval to evaluate its answer and determine whether it is sufficiently confident in the result to buzz in.[8]
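The consensus-and-confidence behavior described above can be sketched in a few lines of Python, shown below. This is an illustrative toy, not IBM's actual method: treating each algorithm's output as a "vote", measuring confidence as the fraction of algorithms that agree on the most common answer, and the specific buzz threshold are all assumptions made for exposition.

```python
# Illustrative sketch: independent algorithms "vote" on an answer; agreement
# raises confidence, and the system buzzes only when confidence clears a
# threshold. Not IBM's method; the voting scheme and threshold are assumed.
from collections import Counter

BUZZ_THRESHOLD = 0.5  # assumed confidence required before buzzing in


def combine_votes(votes):
    """Return (best_answer, confidence), where confidence is the fraction of
    independent algorithms agreeing on the most common answer."""
    counts = Counter(votes)
    best, n = counts.most_common(1)[0]
    return best, n / len(votes)


def decide_buzz(votes, threshold=BUZZ_THRESHOLD):
    """Buzz in (return the answer) only when confidence clears the threshold."""
    answer, confidence = combine_votes(votes)
    return (answer if confidence >= threshold else None), confidence


# Example: seven of ten independent algorithms converge on the same response.
votes = ["Who is Thomas J. Watson?"] * 7 + ["Who is Herman Hollerith?"] * 3
print(decide_buzz(votes))  # -> ('Who is Thomas J. Watson?', 0.7)
```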
By 2008, the developers had advanced to the point where Watson could compete with Jeopardy! champions. That year, IBM contacted Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete as a contestant on the show. The show's producers agreed.[8][10]
Performance
As of February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[11] IBM set up a mock set mimicking the one used on Jeopardy! in a conference room at one of its technology sites and had individuals, including former Jeopardy! contestants, participate in mock games against Watson, with Todd Alan Crain of The Onion playing host. Watson, located on another floor, was given the clues electronically and was able to buzz in and speak with an electronic voice when it gave its responses in Jeopardy!'s question format.[8]
In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question.[12] None of the three players answered a question incorrectly; the match was decided by who could buzz in more quickly with their correct answer.[13]
Future uses
According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[14]
Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, on top of the roughly $1 million needed to acquire the complete system of servers that runs Watson. IBM expects the price to drop substantially within a decade as the technology improves.[8]
References
- ^ "Smartest Machine on Earth". Retrieved 2011-02-14.
- ^ "The DeepQA Project". IBM Research.
- ^ Markoff, John (2009-04-26). "Computer Program to Take On 'Jeopardy!'". New York Times. Retrieved 2009-04-27.
- ^ Loftus, Jack (2009-04-26). "IBM Prepping Soul-Crushing 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Retrieved 2009-04-27.
- ^ "What kind of technology is Watson based on?". IBM Corporation. Retrieved 2011-02-11.
- ^ "Is Watson the smartest machine on earth?". Computer Science and Electrical Engineering Department, UMBC. February 10, 2011. Retrieved 2011-02-11.
- ^ "Building Watson: An Overview of the DeepQA Project". AI Magazine, Vol. 31, No. 3, 2010.
- ^ a b c d e Clive Thompson (June 14, 2010). "What Is I.B.M.'s Watson?". New York Times. Retrieved 2010-07-02.
(help) - ^ "PBS Nova ScienceNOW: Will Watson Win On Jeopardy!?". 2011-01-20. Retrieved 2011-01-27. Three artificial-intelligence experts, including the leader of the Watson team, discuss the supercomputer’s prospects.
- ^ Brian Stelter (December 14, 2010). "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars". New York Times. Retrieved 2010-12-14.
An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011.
- ^ "IBM's Jeopardy-playing machine can now beat human contestants". NetworkWorld. February 10, 2010.
- ^ Dignan, Larry (2011-01-13). "IBM's Watson wins Jeopardy practice round: Can humans hang?". ZDNet. Retrieved 2011-01-13.
- ^ Pepitone, Julianne (2011-01-13). "IBM's Jeopardy supercomputer beats humans in practice bout". CNNMoney. Retrieved 2011-01-13.
- ^ http://www.networkworld.com/news/2010/021010-ibm-jeopardy-game.html?page=2
Further reading
- "Building Watson: An Overview of the DeepQA Project". AI Magazine, Vol. 31, No. 3, 2010.
- Baker, Stephen (2011). Final Jeopardy: Man vs. Machine and the Quest to Know Everything. NY: Houghton Mifflin Harcourt. ISBN 9780547483160.
External links
- Watson homepage
- DeepQA homepage
- About Watson on Jeopardy.com
- NOVA: Smartest Machine on Earth (documentary)
- POWER7
- Power 750
- Power Systems
- Noam Chomsky on Watson and AI
Videos
- Smartest Machine on Earth (documentary)
- IBM and the Jeopardy Challenge on YouTube (3:59), IBM
- IBM "Watson" System to Challenge Humans at Jeopardy! on YouTube (2:29), IBMLabs
- Building Watson – A Brief Overview of the DeepQA Project on YouTube (21:42), IBMLabs