WikiTrust

From Wikipedia, the free encyclopedia


WikiTrust
Developer(s): UCSC Online Collaboration Lab
Stable release: v2.12 / November 19, 2008[1]
Preview release: v3.0.pre1 / August 21, 2009[2]
Written in: PHP, OCaml[3]
Type: MediaWiki plug-in
License: BSD, GPL[4]
Website: wikitrust.soe.ucsc.edu

WikiTrust is a software product, available as a Firefox plugin, that aimed to assist editors in detecting vandalism and dubious edits by highlighting "untrustworthy" text with a yellow or orange background. As of September 2017, the server is offline,[5] but the code is still available for download, and parts of the code are being updated.

When the UCSC server was active, WikiTrust assessed the credibility of content and the reputation of authors of wiki articles using an automated algorithm. WikiTrust provided a plug-in for servers using the MediaWiki platform, such as Wikipedia. When installed on a MediaWiki website, it was designed to let users of that website obtain information about the author, origin, and reliability of that website's wiki text.[5] Content that was stable, based on an analysis of article history, was displayed in normal black-on-white type, while content that was not stable was highlighted in varying shades of yellow or orange. It was formerly available for several language versions of Wikipedia.

WikiTrust on Wikipedia was a project undertaken by the Online Collaboration Lab at the University of California, Santa Cruz, in response to a Meta-Wiki quality initiative sponsored by the Wikimedia Foundation.[5] The project, discussed at Wikimania 2009, was one of a number of quality/rating tools for Wikipedia content that the Wikimedia Foundation was considering.[6] Communications of the ACM published an article on it in August 2011.[7] WikiTrust could be used for English and German Wikipedia articles via the Wiki-Watch page details,[8] for several languages via a Firefox plugin, and it could be installed in any MediaWiki configuration.[9] By 2012, WikiTrust appeared to be an inactive project.[10]

A variant of the WikiTrust code was also used to select vandalism-free revision IDs for the Wikipedia Version 0.8 offline selection. As of September 2017, this part of the code is reported to be under development again, for use in the Version 0.9 and 1.0 offline collections.

Software application

Computing reliability

WikiTrust computes, for each word, three pieces of information:

  • The author of the word.
  • The revision where the word (and the immediately surrounding text) was inserted. By clicking on a word, visitors are sent to the revision where the word originated.
  • The "trust" of the word, indicated by the word background coloring (orange for "untrusted" text, white for "trusted" text).
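For illustration only, this per-word information can be pictured as a small OCaml record (WikiTrust itself is written in PHP and OCaml); the field names, the 0.0–1.0 trust scale, and the colour thresholds below are assumptions of this sketch, not the project's actual types or values:

  (* Hypothetical per-word annotation; field names are illustrative. *)
  type word_info = {
    word   : string;  (* the word itself *)
    author : string;  (* editor who introduced the word *)
    origin : int;     (* revision ID where the word first appeared *)
    trust  : float;   (* assumed scale: 0.0 = untrusted .. 1.0 = trusted *)
  }

  (* Map a trust value to the background colouring described above;
     the thresholds are illustrative, not WikiTrust's actual parameters. *)
  let background_colour info =
    if info.trust >= 0.9 then "white"
    else if info.trust >= 0.5 then "light yellow"
    else "orange"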

The trust of the word is computed according to how much the word, and the surrounding text, have been revised by users that WikiTrust considers of "high authority."[11][12] This project is still in a beta test stage.[12][13]
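As a rough sketch of that idea (not the published formula), the trust of a word that survives a revision can be nudged toward the reputation of the editor who made that revision; the pull constant and the shared 0.0–1.0 scale below are assumptions:

  (* Illustrative update: a revision that keeps the word pulls its trust
     toward the editor's reputation by an assumed fraction. *)
  let updated_trust ~current_trust ~editor_reputation =
    let pull = 0.2 in  (* assumed constant, not WikiTrust's actual parameter *)
    current_trust +. pull *. (editor_reputation -. current_trust)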

Criticism

The criticism has been raised[11] that "the software doesn’t really measure trustworthiness, and the danger is that people will trust the software to measure something that it does not." Generally, users whose content persists for a long time without being "reverted" by other editors are deemed more trustworthy by the software.[12][13] This may mean that users who edit controversial articles subject to frequent reversion are found to be less trustworthy than others.[13] The software uses a variation of Levenshtein distance to measure how much of a user's edit is kept or rearranged, so that users can receive "partial credit" for their work.[14]
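A minimal sketch of such "partial credit", using plain word-level Levenshtein distance rather than the more elaborate edit distance described in the paper (which also accounts for rearranged blocks); the function names and the normalisation are assumptions of this sketch:

  (* Word-level Levenshtein distance between two revisions of a passage. *)
  let levenshtein a b =
    let n = Array.length a and m = Array.length b in
    let d = Array.make_matrix (n + 1) (m + 1) 0 in
    for i = 0 to n do d.(i).(0) <- i done;
    for j = 0 to m do d.(0).(j) <- j done;
    for i = 1 to n do
      for j = 1 to m do
        let cost = if a.(i - 1) = b.(j - 1) then 0 else 1 in
        d.(i).(j) <-
          min (min (d.(i - 1).(j) + 1) (d.(i).(j - 1) + 1))
              (d.(i - 1).(j - 1) + cost)
      done
    done;
    d.(n).(m)

  (* Credit in [0, 1]: 1.0 if a later "judging" revision keeps the edit
     verbatim, approaching 0.0 as later editors rewrite it entirely. *)
  let partial_credit ~edited ~judged =
    let dist = float_of_int (levenshtein edited judged) in
    let len = float_of_int (max (Array.length edited) (Array.length judged)) in
    if len = 0.0 then 1.0 else 1.0 -. dist /. len

For example, if a 10-word passage an editor added is later trimmed so that 8 of its words remain, this sketch would award a credit of roughly 0.8 rather than all-or-nothing.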

Community bias

The software has also been described as measuring the amount of consensus in an article.[15] The community of editors collaborates on articles, revising each other's contributions until agreement is reached. Users whose edits are closer to the final agreement receive more reputation. The point has also been made that consensus revolves around the beliefs of the community, so the reputation computed is also a reflection of that community.

References

  1. ^ WikiTrust Release History from the UCSC Online Collaboration Lab
  2. ^ Browse Files for WikiTrust on GitHub
  3. ^ Installation Advice from the UCSC Online Collaboration Lab
  4. ^ WikiTrust on GitHub
  5. ^ a b c Main Page from the UCSC Online Collaboration Lab
  6. ^ Wikipedia Considers Coloring Untested Text, an August 31, 2009 article from Information Week
  7. ^ Luca de Alfaro, Ashutosh Kulshreshtha, Ian Pye, B. Thomas Adler. "Reputation Systems for Open Collaboration". Communications of the ACM 54 (8): 81–87, August 2011.
  8. ^ Wiki-Watch.org: a new tool to evaluate the quality of Wikipedia’s articles, January 13, 2011.
  9. ^ WikiTrust: Wikitrust.soe.ucsc.edu
  10. ^ Andrew G. West (25 November 2012). "WikiTrust". Wikipedia:Village pump (technical).
  11. ^ a b "Computing Wikipedia's Authority". ACRLog. Association of College & Research Libraries.
  12. ^ a b c See "UCSC Wiki Lab".
  13. ^ a b c "Software tests accuracy of Wikipedia entries". Canadian Broadcasting Corporation. September 6, 2007.
  14. ^ B. Adler and L. de Alfaro. A Content-Driven Reputation System for the Wikipedia. In WWW 2007, Proceedings of the 16th International World Wide Web Conference, ACM Press, 2007.
  15. ^ Wikipedia to Color Code Untrustworthy Text, an August 30, 2009 article from Wired News