Media Bias/Fact Check
{{Short description|American website}}

{{Infobox website
| name = Media Bias/Fact Check
| logo = Media Bias Fact Check wordmark.png
| screenshot = 
| url = {{Official URL}}
| founded = {{start date and age|2015}}
| location = [[Greensboro, North Carolina]]
| owner = Dave M. Van Zandt<ref name="about" /><ref name="MBFC FAQ">{{Cite web |url=https://mediabiasfactcheck.com/frequently-asked-questions/ |title=Frequently Asked Questions - Media Bias/Fact Check |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-03}}</ref>
| current_status = Active
}}

'''Media Bias/Fact Check''' ('''MBFC''') is an American website founded in 2015 by Dave M. Van Zandt.<ref name="about">{{Cite web |url=https://mediabiasfactcheck.com/about/ |title=About |publisher=Media Bias/Fact Check |access-date=2019-03-30}}</ref> It considers four main categories and multiple subcategories in assessing the "political bias" and "factual reporting" of media outlets,<ref name="Larsen"/><ref name="Barclay"/> relying on a self-described "combination of objective measures and subjective analysis".<ref>{{Cite web |title=LibGuides: Journalism Reporting Resources: Fact Checking Resources |url=https://guides.library.illinois.edu/journreporting/factcheck |access-date=2024-07-10 |website=guides.library.illinois.edu |publisher=University of Illinois Urbana-Champaign |language=en}}</ref><ref>{{Cite web |title=Partnerships {{!}} Computational Social Science Lab |url=https://css.seas.upenn.edu/partnerships/ |access-date=2024-07-10 |publisher=The University of Pennsylvania |language=en-US}}</ref>

It is widely used, but has been criticized for its [[methodology]].<ref name="Funke"/> Scientific studies<ref>{{Cite journal |last1=Lin |first1=Hause |last2=Lasser |first2=Jana |last3=Lewandowsky |first3=Stephan |last4=Cole |first4=Rocky |last5=Gully |first5=Andrew |last6=Rand |first6=David G |last7=Pennycook |first7=Gordon |date=2023-09-05 |editor-last=Contractor |editor-first=Noshir |title=High level of correspondence across different news domain quality rating sets |url=https://academic.oup.com/pnasnexus/article/doi/10.1093/pnasnexus/pgad286/7258994 |journal=PNAS Nexus |language=en |volume=2 |issue=9 |pages=pgad286 |doi=10.1093/pnasnexus/pgad286 |issn=2752-6542 |pmc=10500312 |pmid=37719749}}</ref> using its ratings note that ratings from Media Bias/Fact Check show high agreement with an independent fact-checking dataset from 2017,<ref name="Weld"/> with [[NewsGuard]]<ref name="Broniatowski"/> and with [[BuzzFeed]] journalists.<ref name="Kiesel"/>

== Methodology ==

Four main categories are used by MBFC to assess the political bias and factuality of a source. These are: (1) use of wording and headlines, (2) [[fact-checking]] and sourcing, (3) choice of stories, and (4) political affiliation. MBFC additionally considers subcategories such as bias by omission, bias by source selection, and loaded use of language.<ref name="Larsen">{{cite journal |last1=Larsen |first1=Meng Zhen |last2=Haupt |first2=Michael R. |last3=McMann |first3=Tiana |last4=Cuomo |first4=Raphael E. |last5=Mackey |first5=Tim K. |title=The Influence of News Consumption Habits and Dispositional Traits on Trust in Medical Scientists |journal=International Journal of Environmental Research and Public Health |date=January 2023 |volume=20 |issue=10 |pages=5842 |doi=10.3390/ijerph20105842 |pmid=37239568 |pmc=10218345 |language=en |issn=1660-4601 |doi-access=free }}</ref><ref name="Methodology"/> A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low".<ref name="Saro">{{cite book |last1=Saro |first1=Robert De |title=A Crisis like No Other: Understanding and Defeating Global Warming |date=28 March 2023 |publisher=Bentham Science Publishers |isbn=978-1-68108-962-1 |pages=74–79 |url=https://books.google.com/books?id=EOq3EAAAQBAJ&pg=PA74 |language=en}}</ref>

''Media Bias/Fact Check'' describes itself as "the most comprehensive media bias resource on the internet" and lists over 2,500 media sources on its website.<ref>{{Cite web |url=https://mediabiasfactcheck.com/ |title=Media Bias/Fact Check main page |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-02}}</ref> The site is run by founder and editor Dave Van Zandt.<ref name="MBFC FAQ" />

[[File:Consumer Reports – MBFC Bias and Credibility 2023-05-07.png|thumb|Chart showing the degree of bias and factual ratings given to ''[[Consumer Reports]]'']]

The site's ratings are widely cited by media outlets when discussing the reliability and bias of other news organizations.<ref>{{Cite web |url=https://www.bbc.com/news/blogs-trending-43745629 |title=The online activists pushing Syria conspiracy theories |date=2018-04-19 |website=BBC News |language=en-GB |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.forbes.com/sites/martinrivers/2018/04/30/kremlin-run-news-agency-hints-at-political-motive-for-air-belgium-delay/ |title=Kremlin-Run News Agency Hints At Political Motive For Air Belgium Delay |last=Rivers |first=Martin |website=Forbes |language=en |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://thespinoff.co.nz/the-bulletin/18-06-2018/the-bulletin-police-to-reopen-pike-river-case/ |title=The Bulletin: Police to reopen Pike River case? |date=2018-06-18 |website=The Spinoff |language=en-US |access-date=2018-12-03}}</ref><ref>{{Cite web |url=http://finance.sina.com.cn/stock/usstock/c/2017-07-03/doc-ifyhrxsk1647627.shtml |title=特朗普为何如此High:探寻CNN撤稿事件背后的故事 |website=finance.sina.com.cn |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.latimes.com/opinion/op-ed/la-oe-main-alt-right-audience-20170822-story.html |title=What's the alt-right, and how large is its audience? |last=Main |first=Thomas J. |website=latimes.com |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.newsweek.com/trump-interested-seth-rich-conspiracy-new-report-says-693417 |title=Trump wants to know all about the Seth Rich conspiracy, a new report claims |date=2017-10-26 |website=Newsweek |language=en |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://theconversation.com/unlike-in-2016-there-was-no-spike-in-misinformation-this-election-cycle-105946 |title=Unlike in 2016, there was no spike in misinformation this election cycle |last=Resnick |first=Paul |website=The Conversation |language=en |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.newsweek.com/washington-examiner-twitter-says-trump-going-hell-1200950 |title=Washington Examiner tweets that Trump is "going to hell" |date=2018-11-05 |website=Newsweek |language=en |access-date=2018-12-03}}</ref>

Political bias ratings are American-centric,<ref name="Methodology">{{cite web |title=Methodology |url=https://mediabiasfactcheck.com/methodology/ |website=Media Bias/Fact Check |access-date=8 June 2023 |date=7 June 2023}}</ref><ref name="Baly">{{cite conference|first1=Ramy |last1=Baly|first2=Georgi |last2=Karadzhov |first3=Dimitar |last3=Alexandrov|first4=James |last4=Glass|first5=Preslav |last5=Nakov|year=2018|title=Predicting Factuality of Reporting and Bias of News Media Sources|url=http://aclweb.org/anthology/D18-1389|location=Brussels, Belgium|publisher=Association for Computational Linguistics|pages=3528–3539|authorlink5=Preslav Nakov|book-title=Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing}}</ref> and are "extreme-left", "left", "left-center", "least biased", "right-center", "right", and "extreme-right".<ref name="Main">{{cite web|url=https://www.thebulwark.com/both-the-right-and-left-have-illiberal-factions-which-is-more-dangerous/|title=Both the Right and Left Have Illiberal Factions. Which Is More Dangerous?|website=The Bulwark|date=February 1, 2022|first=Thomas J. |last=Main |access-date=February 18, 2022}}</ref> The category "Pro-science"<ref name="Barclay"/> is used to indicate "evidence based" or "legitimate science". MBFC also associates sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources" and "Satire".<ref name="Barclay">{{cite book |last1=Barclay |first1=Donald A. |title=Fake News, Propaganda, and Plain Old Lies: How to Find Trustworthy Information in the Digital Age |date=25 June 2018 |publisher=Rowman & Littlefield |isbn=978-1-5381-0890-1 |page=186 |url=https://books.google.com/books?id=3UdODwAAQBAJ&pg=PA186 |language=en}}</ref>

Fact checks are carried out by independent reviewers who are associated with the [[International Fact-Checking Network]] (IFCN) and follow the International Fact-Checking Network Fact-checkers' Code of Principles, which was developed by the [[Poynter Institute]].<ref name="PIEGraph">{{cite web |title=PIEGraph FAQ |url=https://pcad.ils.unc.edu/FAQ#about_site_1 |website=University of North Carolina at Chapel Hill}}</ref><ref name="Methodology"/>

Data from the site has been used by researchers from [[MIT]] and the [[Qatar Computing Research Institute]] to train a [[machine learning]] algorithm to identify "fake news".<ref>{{Cite web |url=https://www.geek.com/tech/ai-as-fact-checker-algorithm-identifies-fake-news-1754452/ |title=AI as Fact Checker: Algorithm Identifies Fake News - Geek.com |date=2018-10-04 |website=Geek.com |language=en-US |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.fastcompany.com/90246013/mit-and-qatari-scientists-are-training-computers-to-detect-fake-news-sites |title=MIT and Qatari scientists are training computers to detect fake news sites |date=2018-10-03 |website=Fast Company |language=en-US |access-date=2018-12-03}}</ref><ref>{{Cite web |url=https://www.popsci.com/AI-fake-news |title=This AI can help spot biased websites and false news |website=Popular Science |language=en |access-date=2018-12-03}}</ref>

A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example, through its use of emotional language.<ref name="Christian">{{cite book |last1=Christian |first1=Sue Ellen |title=Everyday Media Literacy: An Analog Guide for Your Digital Life |date=20 September 2019 |publisher=Routledge |isbn=978-1-351-17548-7 |url=https://books.google.com/books?id=V9CwDwAAQBAJ&pg=PT53 |language=en}}</ref><ref name="Solender">{{cite news |last1=Solender |first1=Andrew |title=How One Website Sets Out to Classify News, Expose 'Fake News' |url=https://insidesources.com/one-website-sets-classify-news-expose-fake-news/ |access-date=7 June 2023 |work=InsideSources |date=12 June 2018}}</ref><ref name="Brown">{{cite book |last1=Brown |first1=Lyle |last2=Langenegger |first2=Joyce A. |last3=Garcia |first3=Sonia |last4=Biles |first4=Robert E. |last5=Rynbrandt |first5=Ryan |title=Practicing Texas Politics |date=30 July 2021 |publisher=Cengage Learning |isbn=978-0-357-50532-8 |url=https://books.google.com/books?id=OZs5EAAAQBAJ&pg=PA224 |page=224 |language=en}}</ref>

==Reception==

Media Bias/Fact Check has been used in studies of mainstream media, social media, and disinformation,<ref name="Chołoniewski"/><ref name="Weld"/><ref name="Bozarth"/><ref name="Allen">{{cite journal |last1=Allen |first1=Jennifer |last2=Arechar |first2=Antonio A. |last3=Pennycook |first3=Gordon |last4=Rand |first4=David G. |title=Scaling up fact-checking using the wisdom of crowds |journal=Science Advances |date=3 September 2021 |volume=7 |issue=36 |pages=eabf4393 |doi=10.1126/sciadv.abf4393 |pmid=34516925 |pmc=8442902 |bibcode=2021SciA....7.4393A |language=en |issn=2375-2548}}</ref> among them single- and cross-platform studies of services including TikTok, 4chan, Reddit, Twitter, Facebook, Instagram, and Google Web Search.<ref name="Rogers">{{cite journal |last1=Rogers |first1=R |title=Marginalizing the Mainstream: How Social Media Privilege Political Information. |journal=Frontiers in Big Data |date=2021 |volume=4 |pages=689036 |doi=10.3389/fdata.2021.689036 |pmid=34296078 |pmc=8290493 |doi-access=free }}</ref>

''Media Bias/Fact Check'' describes its methodology and scoring structure in depth on its website.<ref name="methodology MBFC">{{Cite web |url=https://mediabiasfactcheck.com/methodology/ |title=Methodology - Media Bias/Fact Check |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-02}}</ref>

Scientific studies<ref>{{Cite journal |last1=Lin |first1=Hause |last2=Lasser |first2=Jana |last3=Lewandowsky |first3=Stephan |last4=Cole |first4=Rocky |last5=Gully |first5=Andrew |last6=Rand |first6=David G |last7=Pennycook |first7=Gordon |date=2023-09-05 |editor-last=Contractor |editor-first=Noshir |title=High level of correspondence across different news domain quality rating sets |url=https://academic.oup.com/pnasnexus/article/doi/10.1093/pnasnexus/pgad286/7258994 |journal=PNAS Nexus |language=en |volume=2 |issue=9 |pages=pgad286 |doi=10.1093/pnasnexus/pgad286 |issn=2752-6542 |pmc=10500312 |pmid=37719749}}</ref> using its ratings note that ratings from Media Bias/Fact Check show high agreement with an independent fact-checking dataset from 2017,<ref name="Weld" /> with [[NewsGuard]]<ref name="Broniatowski">{{cite journal |last1=Broniatowski |first1=David A. |last2=Kerchner |first2=Daniel |last3=Farooq |first3=Fouzia |last4=Huang |first4=Xiaolei |last5=Jamison |first5=Amelia M. |last6=Dredze |first6=Mark |last7=Quinn |first7=Sandra Crouse |last8=Ayers |first8=John W. |date=12 January 2022 |title=Twitter and Facebook posts about COVID-19 are less likely to spread misinformation compared to other health topics |journal=PLOS ONE |language=en |volume=17 |issue=1 |pages=e0261768 |doi=10.1371/journal.pone.0261768 |issn=1932-6203 |pmc=8754324 |pmid=35020727 |doi-access=free |bibcode=2022PLoSO..1761768B}}</ref> and with [[BuzzFeed]] journalists.<ref name="Kiesel">{{cite book |last1=Kiesel |first1=Johannes |title=Proceedings of the 13th International Workshop on Semantic Evaluation |last2=Mestre |first2=Maria |last3=Shukla |first3=Rishabh |last4=Vincent |first4=Emmanuel |last5=Adineh |first5=Payam |last6=Corney |first6=David |last7=Stein |first7=Benno |last8=Potthast |first8=Martin |date=2019 |publisher=Association for Computational Linguistics |location=Minneapolis, Minnesota, USA |pages=829–839 |chapter=SemEval-2019 Task 4: Hyperpartisan News Detection |doi=10.18653/v1/S19-2145 |s2cid=120224153}}</ref> When MBFC factualness ratings of "mostly factual" or higher were compared to an independent fact-checking dataset's "verified" and "suspicious" news sources, the two datasets showed "almost perfect" [[inter-rater reliability]].<ref name="Weld" /><ref name="Bozarth" /><ref name="Volkova">{{cite journal |last1=Volkova |first1=Svitlana |last2=Shaffer |first2=Kyle |last3=Jang |first3=Jin Yea |last4=Hodas |first4=Nathan |date=July 2017 |title=Separating Facts from Fiction: Linguistic Models to Classify Suspicious and Trusted News Posts on Twitter |url=https://www.aclweb.org/anthology/P17-2102 |journal=Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) |publisher=Association for Computational Linguistics |pages=647–653 |doi=10.18653/v1/P17-2102 |s2cid=29259081 |doi-access=free}}</ref> A 2022 study that compared URL sharing on Twitter and Facebook in March and April of 2019 and 2020, to gauge the prevalence of misinformation, reports that scores from Media Bias/Fact Check correlate strongly with those from [[NewsGuard]] (r = 0.81).<ref name="Broniatowski" />

A comparison of five fact-checking datasets frequently used as "groundtruth lists" has suggested that choosing one groundtruth list over another has little impact on the evaluation of online content.<ref name="Weld"/><ref name="Bozarth"/> In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services.<ref name="Weld">{{cite journal |last1=Weld |first1=Galen |last2=Glenski |first2=Maria |last3=Althoff |first3=Tim |title=Political Bias and Factualness in News Sharing across more than 100,000 Online Communities |journal=Proceedings of the International AAAI Conference on Web and Social Media |date=22 May 2021 |volume=15 |pages=796–807 |doi=10.1609/icwsm.v15i1.18104 |s2cid=231942492 |url=https://ojs.aaai.org/index.php/ICWSM/article/view/18104/17907 |access-date=8 June 2023 |language=en |issn=2334-0770|arxiv=2102.08537 }}</ref> MBFC offers the largest dataset covering biased and low-factual news sources. Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent. Research also suggests that the bias and factualness of a news source are unlikely to change over time.<ref name="Weld"/><ref name="Bozarth">{{cite journal |last1=Bozarth |first1=Lia |last2=Saraf |first2=Aparajita |last3=Budak |first3=Ceren |title=Higher Ground? How Groundtruth Labeling Impacts Our Understanding of Fake News about the 2016 U.S. Presidential Nominees |journal=Proceedings of the International AAAI Conference on Web and Social Media |date=26 May 2020 |volume=14 |pages=48–59 |doi=10.1609/icwsm.v14i1.7278 |url=https://ojs.aaai.org/index.php/ICWSM/article/view/7278/7132 |language=en |issn=2334-0770 |quote="Despite the varied labeling and validation procedures used and domains listed by fake news annotators, the groundtruth selection has a limited to modest impact on studies reporting on the behaviors of fake news sites"|doi-access=free }}</ref>

== Criticism ==

James D. Agresti, writing for ''Just Facts Daily'', criticized ''Media Bias/Fact Check'' for what he considered an inaccurate review of its sister site ''Just Facts''. ''Media Bias/Fact Check'' apologized for the original review, saying that "The [original] reviewer clearly zeroed in on one issue and did not look at the big picture", and later stated that the original reviewer was no longer affiliated with the organization.<ref name="MFBC JFD">{{Cite web |url=https://mediabiasfactcheck.com/just-facts-daily/ |title=Just Facts Daily - Media Bias/Fact Check |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-02}}</ref> ''Just Facts'' was re-classified as 'least-biased' with a 'high' factual reporting score, although this was later changed to 'right-center bias'.<ref>{{Cite web |url=https://mediabiasfactcheck.com/just-facts/ |title=Just Facts - Media Bias/Fact Check |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-02}}</ref> ''Just Facts Daily'' holds a similar rating, but is listed separately.<ref name="MFBC JFD" />

The site has been used by researchers at the [[University of Michigan]] to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and [[NewsWhip]] to track the prevalence of "fake news" and questionable sources on social media.<ref>{{Cite web|author=Dian Schaffhauser|title=U-M Tracker Measures Reliability of News on Facebook, Twitter -- Campus Technology|url=https://campustechnology.com/articles/2018/10/16/u-m-tracker-measures-reliability-of-news-on-facebook-twitter.aspx|access-date=2018-12-03|website=Campus Technology|language=en}}</ref><ref>{{Cite web|author1=Paul Resnick|author2=Aviv Ovadya|author3=Garlin Gilchrist|title=Iffy Quotient: A Platform Health Metric for Misinformation|url=https://csmr.umich.edu/wp-content/uploads/2018/10/UMSI-CSMR-Iffy-Quotient-Whitepaper-810084.pdf|work=School of Information - Center for Social Media Responsibility|publisher=University of Michigan|page=5}}</ref>

''Media Bias/Fact Check'' was also criticized by ''[[World Net Daily]]'' (''WND''), which questioned the qualifications and expertise of the site's founder and reviewers and listed it with eight other online fact-checking sources in a piece called "The 9 fakest fake-news checkers". ''Media Bias/Fact Check'' responded that it was "thrilled to be included with Politifact, Snopes, FactCheck.org, and the International Fact-Checking Network (Poynter Institute). These are the best and most credible fact checkers in the business", adding that ''WND'' had contacted ''Media Bias/Fact Check'' to complain about its 'far right bias' rating days before the story was posted.<ref>{{Cite web |url=https://mediabiasfactcheck.com/2017/02/20/media-biasfact-check-makes-wnds-fakest-fact-checkers-list/ |title=Media Bias/Fact Check Makes WND's Fakest Fact Checkers List - Media Bias/Fact Check |date=2017-02-20 |website=Media Bias/Fact Check |language=en-US |access-date=2018-12-02}}</ref> ''[[The Washington Post]]'' has also described ''WND''<nowiki/>'s political lean as [[alt-right]] or [[Far-right politics|far-right]],<ref>{{Cite web |url=https://www.washingtonpost.com/news/the-fix/wp/2016/12/01/meet-the-alt-left-the-gops-response-to-its-alt-right-problem/ |title=Introducing the 'alt-left': The GOP's response to its alt-right problem |website=Washington Post |access-date=2017-05-26}}</ref><ref name=":0">{{cite web |url=https://www.washingtonpost.com/lifestyle/style/theres-the-major-media-and-then-theres-the-other-white-house-press-corps/2016/02/21/f69c5f92-c460-11e5-8965-0607e0e265ce_story.html |title=There's the major media. And then there's the 'other' White House press corps. |last=Bruno |first=Debra |date=February 21, 2016 |via=washingtonpost.com |quote=Les Kinsolving, a reporter for the far-right World Net Daily, was a familiar White House gadfly from the days of the Nixon administration on.}}</ref><ref name=":2">{{Cite web |url=https://www.washingtonpost.com/news/the-fix/wp/2016/08/12/the-highly-reliable-definitely-not-crazy-places-where-donald-trump-gets-his-news/ |title=The highly reliable, definitely-not-crazy places where Donald Trump gets his news |website=Washington Post |access-date=2017-05-26 |quote=WND is a leader in preserving murder cover-up theories, publishing 'exclusive reports' linking the Clintons to a plot to kill their longtime friend.}}</ref><ref>{{Cite news |url=https://www.washingtonpost.com/blogs/plum-line/wp/2017/03/09/want-to-know-what-deconstruction-of-the-administrative-state-looks-like-look-at-trumps-staffing/ |title=Want to know what 'deconstruction of the administrative state' looks like? Look at Trump's staffing. |last=Posner |first=Sarah |date=2017-03-09 |work=The Washington Post |access-date=2017-05-26 |language=en-US |issn=0190-8286 |quote=One of them is Curtis Ellis, a columnist for the far-right site WorldNetDaily, and now a special assistant to the secretary at the Labor Department.}}</ref> a view shared by [[Michael Massing]] of the ''[[Columbia Journalism Review]]''.<ref>{{cite web |url=http://www.cjr.org/essay/unamerican_1.php |title=Un-American |last=Massing |first=Michael |website=Columbia Journalism Review |quote=Far-right Web sites like World Net Daily and Newsmax.com floated all kinds of specious stories about Obama that quickly careened around the blogosphere and onto talk radio.}}</ref>

A 2018 year-in-review and forecast on [[fact-checking]] from the [[Poynter Institute]] (which operates [[PolitiFact]]<ref>{{cite web |title=Who Pays For PolitiFact? {{!}} PolitiFact |url=https://www.politifact.com/who-pays-for-politifact/ |website=www.politifact.com |access-date=14 June 2023}}</ref>) noted a proliferation of credibility-score projects, including Media Bias/Fact Check, writing that "While these projects are, in theory, a good addition to the efforts combating misinformation, they have the potential to misfire," and stating that "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."<ref name="Funke">{{cite news |last1=Funke |first1=Daniel |last2=Mantzarlis |first2=Alexios |title=Here's what to expect from fact-checking in 2019 |url=https://www.poynter.org/fact-checking/2018/heres-what-to-expect-from-fact-checking-in-2019/ |work=Poynter |date=December 18, 2018}}</ref> Also in 2018, a writer in the ''[[Columbia Journalism Review]]'' described Media Bias/Fact Check as "an armchair media analysis"<ref>{{Cite book |last1=Albarracin |first1=Dolores |url=https://www.cambridge.org/core/books/creating-conspiracy-beliefs/918C84AEC301D40B569399C2B80D0517 |title=Creating Conspiracy Beliefs: How Our Thoughts Are Shaped |last2=Albarracin |first2=Julia |last3=Chan |first3=Man-pui Sally |last4=Jamieson |first4=Kathleen Hall |date=2021 |publisher=[[Cambridge University Press]] |isbn=978-1-108-84578-6 |pages=130 |doi=10.1017/9781108990936 |s2cid=244413957 |author-link=Dolores Albarracín |author-link4=Kathleen Hall Jamieson}}</ref> and characterized its assessments as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in".<ref name="CJR">{{Cite news |author=Tamar Wilner |url=https://www.cjr.org/innovations/measure-media-bias-partisan.php |title=We can probably measure media bias. But do we want to? |work=Columbia Journalism Review |date=January 9, 2018}}</ref> A study published in ''[[Scientific Reports]]'' wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as [[Ground truth|ground-truth]] for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."<ref name="Chołoniewski">{{Cite journal |last1=Chołoniewski |first1=Jan |last2=Sienkiewicz |first2=Julian |last3=Dretnik |first3=Naum |last4=Leban |first4=Gregor |last5=Thelwall |first5=Mike |last6=Hołyst |first6=Janusz A. |date=2020 |title=A calibrated measure to compare fluctuations of different entities across timescales |journal=[[Scientific Reports]] |language=en |volume=10 |issue=1 |pages=20673 |doi=10.1038/s41598-020-77660-4 |issn=2045-2322 |pmc=7691371 |pmid=33244096}}</ref>

==See also==
* [[Ad Fontes Media]]
* [[AllSides]]

==References==
{{reflist}}

==External links==
* {{Official website|https://mediabiasfactcheck.com/}}

{{Authority control}}

[[Category:Criticism of journalism]]
[[Category:Fact-checking websites]]
[[Category:Internet properties established in 2015]]
[[Category:Media analysis organizations and websites]]
[[Category:Media bias]]

Latest revision as of 01:40, 8 November 2024
Founded | 2015 |
---|---|
Headquarters | Greensboro, North Carolina |
Owner | Dave M. Van Zandt[1] |
URL | mediabiasfactcheck |
Current status | Active |
Media Bias/Fact Check (MBFC) is an American website founded in 2015 by Dave M. Van Zandt.[1] It considers four main categories and multiple subcategories in assessing the "political bias" and "factual reporting" of media outlets,[2][3] relying on a self-described "combination of objective measures and subjective analysis".[4][5]
It is widely used, but has been criticized for its methodology.[6] Scientific studies[7] using its ratings note that ratings from Media Bias/Fact Check show high agreement with an independent fact checking dataset from 2017,[8] with NewsGuard[9] and with BuzzFeed journalists.[10]
Methodology
[edit]Four main categories are used by MBFC to assess political bias and factuality of a source. These are: (1) use of wording and headlines (2) fact-checking and sourcing (3) choice of stories and (4) political affiliation. MBFC additionally considers subcategories such as bias by omission, bias by source selection, and loaded use of language.[2][11] A source's "Factual Reporting" is rated on a seven-point scale from "Very high" down to "Very low".[12]
Political bias ratings are American-centric,[11][13] and are "extreme-left", "left", "left-center", "least biased", "right-center", "right", and "extreme-right".[14] The category "Pro-science"[3] is used to indicate "evidence based" or "legitimate science". MBFC also associates sources with warning categories such as "Conspiracy/Pseudoscience", "Questionable Sources" and "Satire".[3]
Fact checks are carried out by independent reviewers who are associated with the International Fact-Checking Network (IFCN) and follow the International Fact-Checking Network Fact-checkers’ Code of Principles, which was developed by the Poynter Institute.[15][11] A source may be credited with high "Factual Reporting" and still show "Political bias" in its presentation of those facts, for example, through its use of emotional language.[16][17][18]
Reception
[edit]Media Bias/Fact Check has been used in studies of mainstream media, social media, and disinformation,[19][8][20][21] among them single- and cross-platform studies of services including TikTok, 4chan, Reddit, Twitter, Facebook, Instagram, and Google Web Search.[22]
Studies using its ratings report high agreement with an independent fact-checking dataset from 2017,[8] with NewsGuard,[9] and with BuzzFeed journalists.[10] When MBFC factualness ratings of "mostly factual" or higher were compared against the "verified" and "suspicious" news sources of an independent fact-checking dataset, the two datasets showed "almost perfect" inter-rater reliability.[8][20][24] A 2022 study that compared the prevalence of misinformation in URLs shared on Twitter and Facebook during March and April of 2019 and 2020 reported that scores from Media Bias/Fact Check correlate strongly with those from NewsGuard (r = 0.81).[9]
A comparison of five fact-checking datasets frequently used as "groundtruth lists" suggested that choosing one groundtruth list over another has little impact on the evaluation of online content.[8][20] In some cases, MBFC has been selected because it categorizes sources using a larger range of labels than other rating services,[8] and it offers the largest dataset covering biased and low-factuality news sources. Over a four-year span, the percentage of links that could be categorized with MBFC was found to be very consistent. Research also suggests that the bias and factualness of a news source are unlikely to change over time.[8][20]
The site has been used by researchers at the University of Michigan to create a tool called the "Iffy Quotient", which draws data from Media Bias/Fact Check and NewsWhip to track the prevalence of "fake news" and questionable sources on social media.[25][26]
A 2018 year-in-review and look ahead at fact-checking from the Poynter Institute (which develops PolitiFact[27]) noted a proliferation of credibility-score projects, including Media Bias/Fact Check, writing that "While these projects are, in theory, a good addition to the efforts combating misinformation, they have the potential to misfire," and stating that "Media Bias/Fact Check is a widely cited source for news stories and even studies about misinformation, despite the fact that its method is in no way scientific."[6] Also in 2018, a writer in the Columbia Journalism Review described Media Bias/Fact Check as "an armchair media analysis"[28] and characterized its assessments as "subjective assessments [that] leave room for human biases, or even simple inconsistencies, to creep in".[29] A study published in Scientific Reports wrote: "While [Media Bias/Fact Check's] credibility is sometimes questioned, it has been regarded as accurate enough to be used as ground-truth for e.g. media bias classifiers, fake news studies, and automatic fact-checking systems."[19]
References
- ^ a b "About". Media Bias/Fact Check. Retrieved 2019-03-30.
- ^ a b Larsen, Meng Zhen; Haupt, Michael R.; McMann, Tiana; Cuomo, Raphael E.; Mackey, Tim K. (January 2023). "The Influence of News Consumption Habits and Dispositional Traits on Trust in Medical Scientists". International Journal of Environmental Research and Public Health. 20 (10): 5842. doi:10.3390/ijerph20105842. ISSN 1660-4601. PMC 10218345. PMID 37239568.
- ^ a b c Barclay, Donald A. (25 June 2018). Fake News, Propaganda, and Plain Old Lies: How to Find Trustworthy Information in the Digital Age. Rowman & Littlefield. p. 186. ISBN 978-1-5381-0890-1.
- ^ "LibGuides: Journalism Reporting Resources: Fact Checking Resources". guides.library.illinois.edu. University of Illinois Urbana-Champaign. Retrieved 2024-07-10.
- ^ "Partnerships | Computational Social Science Lab". The University of Pennsylvania. Retrieved 2024-07-10.
- ^ a b Funke, Daniel; Mantzarlis, Alexios (December 18, 2018). "Here's what to expect from fact-checking in 2019". Poynter.
- ^ Lin, Hause; Lasser, Jana; Lewandowsky, Stephan; Cole, Rocky; Gully, Andrew; Rand, David G; Pennycook, Gordon (2023-09-05). Contractor, Noshir (ed.). "High level of correspondence across different news domain quality rating sets". PNAS Nexus. 2 (9): pgad286. doi:10.1093/pnasnexus/pgad286. ISSN 2752-6542. PMC 10500312. PMID 37719749.
- ^ a b c d e f g Weld, Galen; Glenski, Maria; Althoff, Tim (22 May 2021). "Political Bias and Factualness in News Sharing across more than 100,000 Online Communities". Proceedings of the International AAAI Conference on Web and Social Media. 15: 796–807. arXiv:2102.08537. doi:10.1609/icwsm.v15i1.18104. ISSN 2334-0770. S2CID 231942492. Retrieved 8 June 2023.
- ^ a b c Broniatowski, David A.; Kerchner, Daniel; Farooq, Fouzia; Huang, Xiaolei; Jamison, Amelia M.; Dredze, Mark; Quinn, Sandra Crouse; Ayers, John W. (12 January 2022). "Twitter and Facebook posts about COVID-19 are less likely to spread misinformation compared to other health topics". PLOS ONE. 17 (1): e0261768. Bibcode:2022PLoSO..1761768B. doi:10.1371/journal.pone.0261768. ISSN 1932-6203. PMC 8754324. PMID 35020727.
- ^ a b Kiesel, Johannes; Mestre, Maria; Shukla, Rishabh; Vincent, Emmanuel; Adineh, Payam; Corney, David; Stein, Benno; Potthast, Martin (2019). "SemEval-2019 Task 4: Hyperpartisan News Detection". Proceedings of the 13th International Workshop on Semantic Evaluation. Minneapolis, Minnesota, USA: Association for Computational Linguistics. pp. 829–839. doi:10.18653/v1/S19-2145. S2CID 120224153.
- ^ a b c "Methodology". Media Bias/Fact Check. 7 June 2023. Retrieved 8 June 2023.
- ^ Saro, Robert De (28 March 2023). A Crisis like No Other: Understanding and Defeating Global Warming. Bentham Science Publishers. pp. 74–79. ISBN 978-1-68108-962-1.
- ^ Baly, Ramy; Karadzhov, Georgi; Alexandrov, Dimitar; Glass, James; Nakov, Preslav (2018). "Predicting Factuality of Reporting and Bias of News Media Sources". Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. Brussels, Belgium: Association for Computational Linguistics. pp. 3528–3539.
- ^ Main, Thomas J. (February 1, 2022). "Both the Right and Left Have Illiberal Factions. Which Is More Dangerous?". The Bulwark. Retrieved February 18, 2022.
- ^ "PIEGraph FAQ". University of North Carolina at Chapel Hill.
- ^ Christian, Sue Ellen (20 September 2019). Everyday Media Literacy: An Analog Guide for Your Digital Life. Routledge. ISBN 978-1-351-17548-7.
- ^ Solender, Andrew (12 June 2018). "How One Website Sets Out to Classify News, Expose 'Fake News'". InsideSources. Retrieved 7 June 2023.
- ^ Brown, Lyle; Langenegger, Joyce A.; Garcia, Sonia; Biles, Robert E.; Rynbrandt, Ryan (30 July 2021). Practicing Texas Politics. Cengage Learning. p. 224. ISBN 978-0-357-50532-8.
- ^ a b Chołoniewski, Jan; Sienkiewicz, Julian; Dretnik, Naum; Leban, Gregor; Thelwall, Mike; Hołyst, Janusz A. (2020). "A calibrated measure to compare fluctuations of different entities across timescales". Scientific Reports. 10 (1): 20673. doi:10.1038/s41598-020-77660-4. ISSN 2045-2322. PMC 7691371. PMID 33244096.
- ^ a b c d Bozarth, Lia; Saraf, Aparajita; Budak, Ceren (26 May 2020). "Higher Ground? How Groundtruth Labeling Impacts Our Understanding of Fake News about the 2016 U.S. Presidential Nominees". Proceedings of the International AAAI Conference on Web and Social Media. 14: 48–59. doi:10.1609/icwsm.v14i1.7278. ISSN 2334-0770. Quote: "Despite the varied labeling and validation procedures used and domains listed by fake news annotators, the groundtruth selection has a limited to modest impact on studies reporting on the behaviors of fake news sites."
- ^ Allen, Jennifer; Arechar, Antonio A.; Pennycook, Gordon; Rand, David G. (3 September 2021). "Scaling up fact-checking using the wisdom of crowds". Science Advances. 7 (36): eabf4393. Bibcode:2021SciA....7.4393A. doi:10.1126/sciadv.abf4393. ISSN 2375-2548. PMC 8442902. PMID 34516925.
- ^ Rogers, R (2021). "Marginalizing the Mainstream: How Social Media Privilege Political Information". Frontiers in Big Data. 4: 689036. doi:10.3389/fdata.2021.689036. PMC 8290493. PMID 34296078.
- ^ Lin, Hause; Lasser, Jana; Lewandowsky, Stephan; Cole, Rocky; Gully, Andrew; Rand, David G; Pennycook, Gordon (2023-09-05). Contractor, Noshir (ed.). "High level of correspondence across different news domain quality rating sets". PNAS Nexus. 2 (9): pgad286. doi:10.1093/pnasnexus/pgad286. ISSN 2752-6542. PMC 10500312. PMID 37719749.
- ^ Volkova, Svitlana; Shaffer, Kyle; Jang, Jin Yea; Hodas, Nathan (July 2017). "Separating Facts from Fiction: Linguistic Models to Classify Suspicious and Trusted News Posts on Twitter". Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers). Association for Computational Linguistics: 647–653. doi:10.18653/v1/P17-2102. S2CID 29259081.
- ^ Dian Schaffhauser. "U-M Tracker Measures Reliability of News on Facebook, Twitter -- Campus Technology". Campus Technology. Retrieved 2018-12-03.
- ^ Paul Resnick; Aviv Ovadya; Garlin Gilchrist. "Iffy Quotient: A Platform Health Metric for Misinformation" (PDF). School of Information - Center for Social Media Responsibility. University of Michigan. p. 5.
- ^ "Who Pays For PolitiFact? | PolitiFact". www.politifact.com. Retrieved 14 June 2023.
- ^ Albarracin, Dolores; Albarracin, Julia; Chan, Man-pui Sally; Jamieson, Kathleen Hall (2021). Creating Conspiracy Beliefs: How Our Thoughts Are Shaped. Cambridge University Press. p. 130. doi:10.1017/9781108990936. ISBN 978-1-108-84578-6. S2CID 244413957.
- ^ Tamar Wilner (January 9, 2018). "We can probably measure media bias. But do we want to?". Columbia Journalism Review.