IBM Watson

From Wikipedia, the free encyclopedia

Watson, named after IBM's founder, Thomas J. Watson, is an artificial intelligence program developed by IBM to answer questions posed in natural language.[1] It is being developed as part of the DeepQA research project.[2] The program is in the final stages of completion and will run on a POWER7 processor-based system. It is scheduled to compete on the television quiz show Jeopardy! as a test of its abilities; the competition will air in three Jeopardy! episodes from February 14 to 16, 2011, at 7 PM EST. Watson will compete against Brad Rutter, the biggest all-time money winner on Jeopardy!, and Ken Jennings, the record holder for the longest championship streak.[3][4]

Technology

IBM states:

Watson is an application of advanced natural language processing, information retrieval, knowledge representation and reasoning, and machine learning technologies to the field of open domain question answering. At its core, Watson is built on IBM's DeepQA technology for hypothesis generation, massive evidence gathering, analysis, and scoring.[5]
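
The quoted description maps onto a pipeline: analyze the question, generate candidate answers (hypotheses), gather evidence for each, and score and rank the results. The sketch below illustrates that flow in Python over a toy two-document corpus; the corpus, the keyword heuristic and the scoring function are illustrative assumptions, not IBM's implementation.

    # A minimal DeepQA-style pipeline sketch: question analysis, hypothesis
    # generation, evidence scoring, ranking. All data and heuristics are toys.
    CORPUS = {
        "Toronto": "Toronto is the largest city in Canada, in the province of Ontario.",
        "Chicago": "Chicago is a U.S. city on Lake Michigan, home to O'Hare airport.",
    }

    def analyze(text: str) -> set[str]:
        """Reduce a clue or passage to a bag of lowercase keywords."""
        return {w.strip(".,!?").lower() for w in text.split()}

    def generate_hypotheses(keywords: set[str]) -> list[str]:
        """Propose every answer whose passage shares a keyword with the clue."""
        return [a for a, passage in CORPUS.items() if keywords & analyze(passage)]

    def score_evidence(candidate: str, keywords: set[str]) -> float:
        """Score a candidate by the fraction of clue keywords its passage supports."""
        return len(keywords & analyze(CORPUS[candidate])) / len(keywords)

    def answer(clue: str) -> tuple[str, float]:
        """Rank hypotheses by evidence score and return the best with its score."""
        keywords = analyze(clue)
        ranked = sorted(((c, score_evidence(c, keywords))
                         for c in generate_hypotheses(keywords)),
                        key=lambda pair: pair[1], reverse=True)
        return ranked[0] if ranked else ("no answer", 0.0)

    print(answer("This U.S. city sits on Lake Michigan."))  # ('Chicago', ...)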

Watson is a workload-optimized system designed for complex analytics, made possible by integrating massively parallel POWER7 processors and IBM's DeepQA software to answer Jeopardy! questions in under three seconds. Watson comprises a cluster of ninety IBM Power 750 servers (plus additional I/O, network and cluster controller nodes in 10 racks) with a total of 2,880 POWER7 processor cores and 16 terabytes of RAM. Each Power 750 server uses a 3.5 GHz POWER7 eight-core processor with four threads per core. The POWER7 processor's massively parallel processing capability is an ideal match for Watson's IBM DeepQA software, which is embarrassingly parallel: a workload that splits into many independent tasks that can execute simultaneously.[6]
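
"Embarrassingly parallel" here means the candidate-scoring work divides into tasks that share no state and never wait on one another, so throughput scales almost linearly with added cores. Below is a minimal sketch of that property using Python's standard multiprocessing module; the candidate list and scoring function are toy assumptions, not DeepQA code.

    # Embarrassingly parallel scoring: each candidate is scored independently,
    # so the work fans out across all available cores with no coordination.
    from multiprocessing import Pool

    CANDIDATES = ["candidate-%d" % i for i in range(1000)]

    def score(candidate: str) -> tuple[str, float]:
        """CPU-bound scoring of one candidate; depends on no other task (toy)."""
        return candidate, (sum(ord(ch) for ch in candidate) % 100) / 100.0

    if __name__ == "__main__":
        with Pool() as pool:                      # one worker per core by default
            scored = pool.map(score, CANDIDATES)  # tasks run independently
        print(max(scored, key=lambda pair: pair[1]))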

While primarily an IBM effort, the development team includes faculty and students from Carnegie Mellon University, University of Massachusetts, University of Southern California/Information Sciences Institute, University of Texas, Massachusetts Institute of Technology, University of Trento, and Rensselaer Polytechnic Institute.[7]

Genesis

An IBM executive had proposed that Watson compete on Jeopardy!, but the suggestion was initially dismissed. In competitions run by the United States government, Watson's predecessors were able to answer no more than 70% of questions correctly and often took several minutes to produce an answer. To compete successfully on Jeopardy!, Watson would need to respond in no more than a few seconds, and the problems posed by the game show were initially deemed impossible to solve.[8]

In initial tests run in 2006 by David Ferrucci, the senior manager of IBM's Semantic Analysis and Integration department, Watson was given 500 clues from past Jeopardy! programs. While the top real-life competitors buzzed in half the time and answered as many as 95% of questions correctly, Watson's first pass got only about 15% right. In 2007, the IBM team was given three to five years and a staff of 15 people to solve the problems posed. The team provided Watson with millions of documents it could use to build its knowledge, including dictionaries, encyclopedias, bibles, novels, plays and other reference material. Rather than relying on a single algorithm, Watson uses thousands of algorithms simultaneously to understand the question being asked and find the correct path to the answer.[9] The more algorithms that independently arrive at the same answer, the more likely Watson is to be correct. Once Watson comes up with a small number of potential answers, it checks them against its database to ascertain whether they make sense. In a sequence of 20 mock games, human participants used the six to eight seconds it takes to read the clue to decide whether to buzz in with the correct answer; during that time, Watson likewise evaluates the clue and determines whether it is sufficiently confident in its answer to buzz in.[8]
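
Trusting agreement among many independent algorithms amounts to an ensemble vote with a confidence cutoff: Watson buzzes only when enough evidence converges on a single answer. The sketch below shows that decision rule; the toy proposals and the 0.5 threshold are assumptions, and the real system combines weighted evidence scores rather than a simple vote.

    # Ensemble voting with a buzz threshold: independent "algorithms" each
    # propose an answer; agreement raises confidence; buzz only above a cutoff.
    from collections import Counter

    BUZZ_THRESHOLD = 0.5  # assumed cutoff; in practice it would be tuned empirically

    def confidence_vote(proposals: list[str]) -> tuple[str, float]:
        """Confidence = fraction of algorithms agreeing on the top answer."""
        top, count = Counter(proposals).most_common(1)[0]
        return top, count / len(proposals)

    guess, conf = confidence_vote(["Chicago", "Chicago", "Toronto",
                                   "Chicago", "Milwaukee"])
    if conf >= BUZZ_THRESHOLD:
        print("Buzz in: %s (confidence %.2f)" % (guess, conf))
    else:
        print("Stay silent: best guess %s (%.2f)" % (guess, conf))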

By 2008, the developers had advanced to the point where Watson could compete with Jeopardy! champions. That year, IBM contacted Jeopardy! executive producer Harry Friedman about the possibility of having Watson compete as a contestant on the show. The show's producers agreed.[8][10]

Performance

As of February 2010, Watson could beat human Jeopardy! contestants on a regular basis.[11] In a conference room at one of its technology sites, IBM set up a mock set mimicking the one used on Jeopardy! and had individuals, including former Jeopardy! contestants, play mock games against Watson, with Todd Alan Crain of The Onion as host. Watson, located on another floor, received the clues electronically, buzzed in, and spoke with an electronic voice when giving its responses in Jeopardy!'s question format.[8]

In a practice match before the press on January 13, 2011, Watson won a 15-question round against Ken Jennings and Brad Rutter with a score of $4,400 to Jennings's $3,400 and Rutter's $1,200, though Jennings and Watson were tied before the final $1,000 question.[12] None of the three players answered a question incorrectly; the match was decided by which player buzzed in first with the correct answer.[13]

Future uses

According to IBM, "The goal is to have computers start to interact in natural human terms across a range of applications and processes, understanding the questions that humans ask and providing answers that humans can understand and justify."[14]

Watson is based on commercially available IBM Power 750 servers that have been marketed since February 2010. IBM also intends to market the DeepQA software to large corporations, with a price in the millions of dollars, reflecting the $1 million needed to acquire the complete system that runs Watson. IBM expects the price to drop substantially within a decade as the technology improves.[8]

References

  1. ^ "Smartest Machine on Earth". Retrieved 2011-02-14.
  2. ^ "The DeepQA Project".
  3. ^ Markoff, John (2009-04-26). "Computer Program to Take On 'Jeopardy!'". New York Times. Retrieved 2009-04-27.
  4. ^ Loftus, Jack (2009-04-26). "IBM Prepping Soul-Crushing 'Watson' Computer to Compete on Jeopardy!". Gizmodo. Retrieved 2009-04-27.
  5. ^ "What kind of technology is Watson based on?". IBM Corporation. Retrieved 2011-02-11.
  6. ^ "Is Watson the smartest machine on earth?". Computer Science and Electrical Engineering Department, UMBC. 2011-02-10. Retrieved 2011-02-11.
  7. ^ "Building Watson: An Overview of the DeepQA Project". AI Magazine, Vol. 31, No. 3. 2010.
  8. ^ a b c d e Thompson, Clive (2010-06-14). "What Is I.B.M.'s Watson?". New York Times. Retrieved 2010-07-02.
  9. ^ "PBS Nova ScienceNOW: Will Watson Win On Jeopardy!?". 2011-01-20. Retrieved 2011-01-27. Three artificial-intelligence experts, including the leader of the Watson team, discuss the supercomputer's prospects.
  10. ^ Stelter, Brian (2010-12-14). "I.B.M. Supercomputer 'Watson' to Challenge 'Jeopardy' Stars". New York Times. Retrieved 2010-12-14. An I.B.M. supercomputer system named after the company's founder, Thomas J. Watson Sr., is almost ready for a televised test: a bout of questioning on the quiz show "Jeopardy." I.B.M. and the producers of "Jeopardy" will announce on Tuesday that the computer, "Watson," will face the two most successful players in "Jeopardy" history, Ken Jennings and Brad Rutter, in three episodes that will be broadcast Feb. 14–16, 2011.
  11. ^ "IBM's Jeopardy-playing machine can now beat human contestants". NetworkWorld. 2010-02-10.
  12. ^ Dignan, Larry (2011-01-13). "IBM's Watson wins Jeopardy practice round: Can humans hang?". ZDNet. Retrieved 2011-01-13.
  13. ^ Pepitone, Julianne (2011-01-13). "IBM's Jeopardy supercomputer beats humans in practice bout". CNNMoney. Retrieved 2011-01-13.
  14. ^ http://www.networkworld.com/news/2010/021010-ibm-jeopardy-game.html?page=2
