Eliezer Yudkowsky

From Wikipedia, the free encyclopedia

Eliezer Yudkowsky
Eliezer Yudkowsky at the 2006 Stanford Singularity Summit
Born: September 11, 1979
Nationality: American
Citizenship: American
Known for: Seed AI, Friendly AI, Harry Potter and the Methods of Rationality
Scientific career
Fields: Artificial intelligence
Institutions: Singularity Institute for Artificial Intelligence

Eliezer Shlomo Yudkowsky (born September 11, 1979[1]) is an American writer and blogger who advocates the development of friendly artificial intelligence[2] and the study of a possible future singularity.

Biography

Yudkowsky, who lives in Redwood City, California,[3] did not attend high school and is an autodidact, with no formal education in computer science or artificial intelligence. He co-founded the nonprofit Singularity Institute for Artificial Intelligence (SIAI) in 2000 and continues to work there as a full-time Research Fellow.[4]

Work

Yudkowsky's research focuses on artificial intelligence theory for self-understanding, self-modification, and recursive self-improvement (seed AI), and on AI architectures and decision theories for stably benevolent motivational structures (Friendly AI, and Coherent Extrapolated Volition in particular).[5] Apart from his research, Yudkowsky has written non-academic explanations of various philosophical topics, particularly rationality, such as "An Intuitive Explanation of Bayes' Theorem".[6]
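For context, the theorem that essay explains can be stated compactly: for events A and B with P(B) > 0, Bayes' theorem expresses the probability of A conditional on B in terms of the reverse conditional probability and the prior probabilities of the two events:

\[
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
\]

In the essay's framing, A plays the role of a hypothesis and B of observed evidence, so the formula prescribes how a prior belief P(A) should be updated after seeing B.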

Yudkowsky is also a strong proponent of cryonics, the practice of freezing one's body after death in the hope of future resuscitation.[7] Mainstream science-based medicine is skeptical of the practice, with some viewing it as outright quackery.[8]

Publications

Yudkowsky was, along with Robin Hanson, one of the principal contributors to the blog Overcoming Bias,[9] sponsored by Oxford University's Future of Humanity Institute. In early 2009, he helped to found Less Wrong, a "community blog devoted to refining the art of human rationality".[10] The Sequences[11] on Less Wrong, comprising over two years of blog posts on epistemology, artificial intelligence, and metaethics, make up the largest single body of Yudkowsky's writing.

He contributed two chapters to Global Catastrophic Risks,[12] a volume edited by Oxford philosopher Nick Bostrom and Milan Ćirković, and presented "Complex Value Systems are Required to Realize Valuable Futures"[13] at the AGI-11 conference.

Yudkowsky is the author of the Singularity Institute publications "Creating Friendly AI"[14] (2001), "Levels of Organization in General Intelligence"[15] (2002), "Coherent Extrapolated Volition"[16] (2004), and "Timeless Decision Theory"[17] (2010).[18]

Yudkowsky has also written several works[19] of science fiction and other fiction. His Harry Potter fan fiction Harry Potter and the Methods of Rationality illustrates topics in cognitive science and rationality; The New Yorker described it as "a thousand-page online 'fanfic' text called 'Harry Potter and the Methods of Rationality', which recasts the original story in an attempt to explain Harry's wizardry through the scientific method".[20] It has been reviewed favorably by the authors David Brin[21][22][23] and Rachel Aaron,[24][25] by Robin Hanson[26] and Aaron Swartz,[27] and by programmer Eric S. Raymond.[28]

References

  1. ^ Autobiography
  2. ^ "Singularity Institute for Artificial Intelligence: Team". Singularity Institute for Artificial Intelligence. Retrieved 2009-07-16.
  3. ^ Eliezer Yudkowsky: About
  4. ^ Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 599. ISBN 0-670-03384-7.
  5. ^ Kurzweil, Ray (2005). The Singularity Is Near. New York, US: Viking Penguin. p. 420. ISBN 0-670-03384-7.
  6. ^ An Intuitive Explanation of Bayes' Theorem
  7. ^ "Normal Cryonics". Less Wrong. Retrieved 2012-08-31.
  8. ^ Quackwatch. http://www.quackwatch.com/04ConsumerEducation/QA/cryonics.html
  9. ^ "Overcoming Bias: About". Robin Hanson. Retrieved 2012-02-01.
  10. ^ "Welcome to Less Wrong". Less Wrong. Retrieved 2012-02-01.
  11. ^ "Sequences-Lesswrongwiki". Retrieved 2012-02-01.
  12. ^ Bostrom, Nick; Ćirković, Milan M., eds. (2008). Global Catastrophic Risks. Oxford, UK: Oxford University Press. pp. 91–119, 308–345. ISBN 978-0-19-857050-9.
  13. ^ Yudkowsky, Eliezer (2011). "Complex Value Systems are Required to Realize Valuable Futures" (PDF). AGI-11.
  14. ^ Yudkowsky, Eliezer. "Creating Friendly AI". Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  15. ^ Yudkowsky, Eliezer. "Levels of Organization in General Intelligence" (PDF). Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  16. ^ Yudkowsky, Eliezer. "Coherent Extrapolated Volition". Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  17. ^ Yudkowsky, Eliezer. "Timeless Decision Theory" (PDF). Singularity Institute for Artificial Intelligence. Retrieved 2012-02-01.
  18. ^ "Eliezer Yudkowsky Profile". Accelerating Future.
  19. ^ "Yudkowsky- Fiction". Eliezer Yudkowsky. {{cite web}}: Cite has empty unknown parameter: |1= (help)
  20. ^ "No Death, No Taxes: The libertarian futurism of a Silicon Valley billionaire". The New Yorker. p. 54.
  21. ^ David Brin (2010-06-21). "CONTRARY BRIN: A secret of college life... plus controversies and science!". Davidbrin.blogspot.com. Retrieved 2012-08-31.
  22. ^ "'Harry Potter' and the Key to Immortality", Daniel Snyder, The Atlantic
  23. ^ David Brin (2012-01-20). "CONTRARY BRIN: David Brin's List of "Greatest Science Fiction and Fantasy Tales"". Davidbrin.blogspot.com. Retrieved 2012-08-31.
  24. ^ Authors (2012-04-02). "Rachel Aaron interview (April 2012)". Fantasybookreview.co.uk. Retrieved 2012-08-31.
  25. ^ "Civilian Reader: An Interview with Rachel Aaron". Civilian-reader.blogspot.com. 2011-05-04. Retrieved 2012-08-31.
  26. ^ Hanson, Robin (2010-10-31). "Hyper-Rational Harry". Overcoming Bias. Retrieved 2012-08-31.
  27. ^ Swartz, Aaron. "The 2011 Review of Books (Aaron Swartz's Raw Thought)". Aaronsw.com. Retrieved 2012-08-31.
  28. ^ "Harry Potter and the Methods of Rationality". Esr.ibiblio.org. 2010-07-06. Retrieved 2012-08-31.

Further reading

  • Our Molecular Future: How Nanotechnology, Robotics, Genetics and Artificial Intelligence Will Transform Our World by Douglas Mulhall, 2002, p. 321.
  • The Spike: How Our Lives Are Being Transformed By Rapidly Advancing Technologies by Damien Broderick, 2001, pp. 236, 265-272, 289, 321, 324, 326, 337-339, 345, 353, 370.
