Media Bias/Fact Check
Founded: 2015
Headquarters: Greensboro, North Carolina
Owner: Dave M. Van Zandt[1]
URL: mediabiasfactcheck.com
Current status: Active

Media Bias/Fact Check (MBFC) is an American website founded in 2015 by Dave M. Van Zandt.[1] MBFC uses an explicit methodology to rate media outlets.[2] It considers four main categories and multiple subcategories in assessing the "Political bias" and "Factual Reporting" of each source.[3]

It is widely used, and has been criticised for its methodology and accuracy.[4] A 2020 study published in Scientific Reports wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."[5] Scientific studies using its ratings note that ratings from Media Bias/Fact Check show high agreement with an independent fact checking dataset from 2017,[6] with NewsGuard[7] and with BuzzFeed journalists.[8]

Methodology

MBFC uses four main categories to assess the political bias and factuality of a source: (1) use of wording and headlines, (2) fact-checking and sourcing, (3) choice of stories, and (4) political affiliation. MBFC additionally considers subcategories such as bias by omission, bias by source selection, and loaded use of language.[3][9] A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low".[10]

Chart showing the degree of bias and factual ratings given to Consumer Reports

Political bias ratings are American-centric[11][9] and include "Extreme left", "Left", "Left center", "Least biased", "Right center", "Right", and "Extreme right".[12]

The category "Pro-science"[2] is used to indicate "evidence based" or "legitimate science". MBFC also associates sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources" and "Satire".[2]

Fact checks are carried out by independent reviewers associated with the International Fact-Checking Network (IFCN), who follow the IFCN Fact-checkers' Code of Principles developed by the Poynter Institute.[13][9] A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example through its use of emotional language.[14][15][16]

Reception

Media Bias/Fact Check is widely used in studies of media, social media, and disinformation.[5][6][17][18] It has been used by both single- and cross-platform studies.[19] A comparison of five fact-checking datasets frequently used as "groundtruth lists" suggested that choosing one groundtruth list over another has little impact on the evaluation of online content.[6][17] In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services, and because it offers the largest dataset covering biased and low-factuality news sources.[6] Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent. Research also suggests that the bias and factualness of a news source are unlikely to change over time.[6][17]

When MBFC factualness ratings of "mostly factual" or higher were compared with an independent fact-checking dataset's "verified" and "suspicious" news sources, the two datasets showed "almost perfect" inter-rater reliability.[6][17][20] A 2022 study, which compared the prevalence of misinformation in URLs shared on Twitter and Facebook in March and April of 2019 and 2020, reports that scores from MediaBiasFactCheck [sic] correlate strongly with those from NewsGuard (r = 0.81).[7] Another study reports high agreement between ratings from Media Bias Fact Check [sic] and BuzzFeed journalists.[8]

The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of "fake news" and questionable sources on social media.[21][22]

According to Daniel Funke and Alexios Mantzarlis of the Poynter Institute, "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."[23] In 2018, the Columbia Journalism Review identified Media Bias/Fact Check as "an armchair media analysis."[24] Additionally, the Columbia Journalism Review described Media Bias/Fact Check as an amateur attempt at categorizing media bias and characterized their assessments as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in".[25] A study published in Scientific Reports wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."[5]

References

  1. ^ a b "About". Media Bias/Fact Check. Retrieved 2019-03-30.
  2. ^ a b c Barclay, Donald A. (25 June 2018). Fake News, Propaganda, and Plain Old Lies: How to Find Trustworthy Information in the Digital Age. Rowman & Littlefield. p. 186. ISBN 978-1-5381-0890-1.
  3. ^ a b Larsen, Meng Zhen; Haupt, Michael R.; McMann, Tiana; Cuomo, Raphael E.; Mackey, Tim K. (January 2023). "The Influence of News Consumption Habits and Dispositional Traits on Trust in Medical Scientists". International Journal of Environmental Research and Public Health. 20 (10): 5842. doi:10.3390/ijerph20105842. ISSN 1660-4601. Retrieved 7 June 2023.
  4. ^ Funke, Daniel; Mantzarlis, Alexios (2018-12-18). "Here's what to expect from fact-checking in 2019". Poynter. Retrieved 2023-06-08.
  5. ^ a b c Chołoniewski, Jan; Sienkiewicz, Julian; Dretnik, Naum; Leban, Gregor; Thelwall, Mike; Hołyst, Janusz A. (2020). "A calibrated measure to compare fluctuations of different entities across timescales". Scientific Reports. 10 (1): 20673. doi:10.1038/s41598-020-77660-4. ISSN 2045-2322. PMC 7691371. PMID 33244096.
  6. ^ a b c d e f Weld, Galen; Glenski, Maria; Althoff, Tim (22 May 2021). "Political Bias and Factualness in News Sharing across more than 100,000 Online Communities". Proceedings of the International AAAI Conference on Web and Social Media. 15: 796–807. doi:10.1609/icwsm.v15i1.18104. ISSN 2334-0770. Retrieved 8 June 2023.
  7. ^ a b Broniatowski, David A.; Kerchner, Daniel; Farooq, Fouzia; Huang, Xiaolei; Jamison, Amelia M.; Dredze, Mark; Quinn, Sandra Crouse; Ayers, John W. (12 January 2022). "Twitter and Facebook posts about COVID-19 are less likely to spread misinformation compared to other health topics". PLOS ONE. 17 (1): e0261768. doi:10.1371/journal.pone.0261768. ISSN 1932-6203.
  8. ^ a b Kiesel, Johannes; Mestre, Maria; Shukla, Rishabh; Vincent, Emmanuel; Adineh, Payam; Corney, David; Stein, Benno; Potthast, Martin (2019). "SemEval-2019 Task 4: Hyperpartisan News Detection". Proceedings of the 13th International Workshop on Semantic Evaluation. Minneapolis, Minnesota, USA: Association for Computational Linguistics. pp. 829–839. doi:10.18653/v1/S19-2145.
  9. ^ a b c "Methodology". Media Bias/Fact Check. 7 June 2023. Retrieved 8 June 2023.
  10. ^ Saro, Robert De (28 March 2023). A Crisis like No Other: Understanding and Defeating Global Warming. Bentham Science Publishers. pp. 74–79. ISBN 978-1-68108-962-1.
  11. ^ Baly, Ramy; Karadzhov, Georgi; Alexandrov, Dimitar; Glass, James; Nakov, Preslav (2018). "Predicting Factuality of Reporting and Bias of News Media Sources". Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium: Association for Computational Linguistics. pp. 3528–3539.
  12. ^ Main, Thomas J. (February 1, 2022). "Both the Right and Left Have Illiberal Factions. Which Is More Dangerous?". The Bulwark. Retrieved February 18, 2022.
  13. ^ "PIEGraph FAQ". University of North Carolina at Chapel Hill.
  14. ^ Christian, Sue Ellen (20 September 2019). Everyday Media Literacy: An Analog Guide for Your Digital Life. Routledge. ISBN 978-1-351-17548-7.
  15. ^ Solender, Andrew (12 June 2018). "How One Website Sets Out to Classify News, Expose 'Fake News'". InsideSources. Retrieved 7 June 2023.
  16. ^ Brown, Lyle; Langenegger, Joyce A.; Garcia, Sonia; Biles, Robert E.; Rynbrandt, Ryan (30 July 2021). Practicing Texas Politics. Cengage Learning. p. 224. ISBN 978-0-357-50532-8.
  17. ^ a b c d Bozarth, Lia; Saraf, Aparajita; Budak, Ceren (26 May 2020). "Higher Ground? How Groundtruth Labeling Impacts Our Understanding of Fake News about the 2016 U.S. Presidential Nominees". Proceedings of the International AAAI Conference on Web and Social Media. 14: 48–59. doi:10.1609/icwsm.v14i1.7278. ISSN 2334-0770. Despite the varied labeling and validation procedures used and domains listed by fake news annotators, the groundtruth selection has a limited to modest impact on studies reporting on the behaviors of fake news sites
  18. ^ Allen, Jennifer; Arechar, Antonio A.; Pennycook, Gordon; Rand, David G. (3 September 2021). "Scaling up fact-checking using the wisdom of crowds". Science Advances. 7 (36). doi:10.1126/sciadv.abf4393. ISSN 2375-2548.
  19. ^ Rogers, R (2021). "Marginalizing the Mainstream: How Social Media Privilege Political Information". Frontiers in Big Data. 4: 689036. doi:10.3389/fdata.2021.689036. PMID 34296078.
  20. ^ Volkova, Svitlana; Shaffer, Kyle; Jang, Jin Yea; Hodas, Nathan (July 2017). "Separating Facts from Fiction: Linguistic Models to Classify Suspicious and Trusted News Posts on Twitter". Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Association for Computational Linguistics: 647–653. doi:10.18653/v1/P17-2102.
  21. ^ Dian Schaffhauser. "U-M Tracker Measures Reliability of News on Facebook, Twitter -- Campus Technology". Campus Technology. Retrieved 2018-12-03.
  22. ^ Paul Resnick; Aviv Ovadya; Garlin Gilchrist. "Iffy Quotient: A Platform Health Metric for Misinformation" (PDF). School of Information - Center for Social Media Responsibility. University of Michigan. p. 5.
  23. ^ Funke, Daniel; Mantzarlis, Alexios (December 18, 2018). "Here's what to expect from fact-checking in 2019". Poynter.
  24. ^ Albarracin, Dolores; Albarracin, Julia; Chan, Man-pui Sally; Jamieson, Kathleen Hall (2021). Creating Conspiracy Beliefs: How Our Thoughts Are Shaped. Cambridge University Press. p. 130. doi:10.1017/9781108990936. ISBN 978-1-108-84578-6. S2CID 244413957.
  25. ^ Tamar Wilner (January 9, 2018). "We can probably measure media bias. But do we want to?". Columbia Journalism Review.