{{Short description|Scientific study of science}}
{{for|the journal|Metascience (journal)}}
{{distinguish|text = [[Science studies]], or 'Meta-science', an obsolete synonym for the [[philosophy of science]]}}
{{Evidence-based practices}}
'''Metascience''' (also known as '''meta-research''') is the use of [[scientific methodology]] to study [[science]] itself. Metascience seeks to increase the quality of scientific research while reducing [[inefficiency]]. It is also known as "research on research" and "the science of science", as it uses [[research methods]] to study how [[research]] is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a [[bird's eye view]] of science".<ref name=Ioannidis2015/> In the words of [[John Ioannidis]], "Science is the best thing that has happened to human beings{{Nbsp}}... but we can do it better."<ref>{{cite web |last1=Bach |first1=Becky |title=On communicating science and uncertainty: A podcast with John Ioannidis |url=https://scopeblog.stanford.edu/2015/12/08/on-communicating-science-and-uncertainty-a-podcast-with-john-ioannidis/ |website=Scope |access-date=20 May 2019 |date=8 December 2015}}</ref>
In 1966, an early meta-research paper examined the [[statistical methods]] of 295 papers published in ten high-profile medical journals. It found that "in almost 73% of the reports read{{Nbsp}}... conclusions were drawn when the justification for these conclusions was invalid." Meta-research in the following decades found many methodological flaws, inefficiencies, and poor practices in research across numerous scientific fields. Many scientific studies could not be [[reproducibility|reproduced]], particularly in [[evidence-based medicine|medicine]] and the [[soft sciences]]. The term "[[replication crisis]]" was coined in the early 2010s as part of a growing awareness of the problem.<ref>{{Cite journal |last1=Pashler |first1=Harold |last2=Harris |first2=Christine R. |date=2012 |title=Is the Replicability Crisis Overblown? Three Arguments Examined |url=http://journals.sagepub.com/doi/10.1177/1745691612463401 |journal=Perspectives on Psychological Science |language=en |volume=7 |issue=6 |pages=531–536 |doi=10.1177/1745691612463401 |pmid=26168109 |s2cid=1342421 |issn=1745-6916}}</ref>
Measures have been implemented to address the issues revealed by metascience. These measures include the [[pre-registration (science)|pre-registration]] of scientific studies and [[Clinical trial registration|clinical trials]] as well as the founding of organizations such as [[CONSORT]] and the [[EQUATOR Network]] that issue guidelines for methodology and reporting. There are continuing efforts to reduce the [[misuse of statistics]], to eliminate [[perverse incentives]] from academia, to improve the [[academic peer review|peer review]] process, to systematically collect data about the scholarly publication system,<ref name="Nishikawa-PacherHeckSchoch2022">{{cite journal | last1 = Nishikawa-Pacher | first1 = Andreas | last2 = Heck | first2 = Tamara | last3 = Schoch | first3 = Kerstin | title = Open Editors: A dataset of scholarly journals' editorial board positions | journal = Research Evaluation | date = 4 October 2022 | issn = 0958-2029 | eissn = 1471-5449 | doi = 10.1093/reseval/rvac037 | pmid = | url = | doi-access = free}}</ref> to combat [[bias]] in scientific literature, and to increase the overall quality and efficiency of the scientific process.
== History ==
[[File:Ioannidis (2005) Why Most Published Research Findings Are False.pdf|thumb|200px|[[John Ioannidis]] (2005), "[[Why Most Published Research Findings Are False]]"<ref name=Ioannidis2005/>]]
In 1966, an early meta-research paper examined the [[statistical methods]] of 295 papers published in ten high-profile medical journals. It found that "in almost 73% of the reports read ... conclusions were drawn when the justification for these conclusions was invalid."<ref name="Schor1966">{{cite journal|last1=Schor|first1=Stanley|title=Statistical Evaluation of Medical Journal Manuscripts|journal=JAMA: The Journal of the American Medical Association|volume=195|issue=13|year=1966|pages=1123–1128|issn=0098-7484|doi=10.1001/jama.1966.03100130097026|pmid=5952081}}</ref> In 2005, [[John Ioannidis]] published a paper titled "[[Why Most Published Research Findings Are False]]", which argued that a majority of papers in the medical field produce conclusions that are wrong.<ref name=Ioannidis2005>{{cite journal |last1=Ioannidis |first1=JP |title=Why most published research findings are false. |journal=PLOS Medicine |date=August 2005 |volume=2 |issue=8 |page=e124 |doi=10.1371/journal.pmed.0020124 |pmid=16060722 |pmc=1182327 |doi-access=free }}</ref> The paper went on to become the most downloaded paper in the [[Public Library of Science]]<ref>{{cite web|title = Highly Cited Researchers |url= http://highlycited.com/ |access-date=September 17, 2015}}</ref><ref>[https://profiles.stanford.edu/john-ioannidis Medicine - Stanford Prevention Research Center.] John P.A. Ioannidis</ref> and is considered foundational to the field of metascience.<ref>{{cite news|author=Robert Lee Hotz|title=Most Science Studies Appear to Be Tainted By Sloppy Analysis|url=https://www.wsj.com/articles/SB118972683557627104|newspaper = Wall Street Journal|publisher=Dow Jones & Company|date=September 14, 2007 |access-date=2016-12-05 |author-link=Robert Lee Hotz}}</ref> In a related study with [[Jeremy Howick]] and [[Despina Koletsi]], Ioannidis showed that only a minority of medical interventions are supported by 'high quality' evidence according to the [[The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach|Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach]].<ref>{{cite journal |last1=Howick |first1=Jeremy |last2=Koletsi |first2=Despina |last3=Pandis |first3=Nikolaos |last4=Fleming |first4=Padhraig S. |last5=Loef |first5=Martin |last6=Walach |first6=Harald |last7=Schmidt |first7=Stefan |last8=Ioannidis |first8=John P. A. |title=The quality of evidence for medical interventions does not improve or worsen: a metaepidemiological study of Cochrane reviews |journal=Journal of Clinical Epidemiology |year=2020 |volume=126 |pages=154–159 |url=https://www.jclinepi.com/article/S0895-4356(20)30777-0/fulltext}}</ref> Later meta-research identified widespread difficulty in [[Reproducibility|replicating]] results in many scientific fields, including [[psychology]] and [[Evidence-based medicine|medicine]]. This problem was termed "[[replication crisis|the replication crisis]]". Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.<ref>{{cite journal|year=2014|title=Researching the researchers|journal=Nature Genetics|volume=46|issue=5|page=417|doi=10.1038/ng.2972|issn=1061-4036|pmid=24769715|doi-access=free}}</ref>
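The paper's argument can be summarized with the positive predictive value (PPV) framework it uses: for pre-study odds <math>R</math> that a probed relationship is true, statistical power <math>1-\beta</math> and significance threshold <math>\alpha</math>, the probability that a claimed (statistically significant) finding is actually true is, in the absence of bias,

<math display="block">\mathrm{PPV} = \frac{(1-\beta)R}{(1-\beta)R + \alpha}.</math>

With conventional values <math>\alpha = 0.05</math>, power 0.8, and pre-study odds <math>R = 0.25</math>, the PPV is 0.8; with the low power (around 0.2) and low pre-study odds (around 0.05) that the paper argues are typical of exploratory research, the PPV falls to roughly 0.17, i.e. most claimed findings would be false.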
Many prominent publishers are interested in meta-research and in improving the quality of their publications. Top journals such as ''[[Science (journal)|Science]]'', ''[[The Lancet]]'', and ''[[Nature (journal)|Nature]]'' provide ongoing coverage of meta-research and problems with reproducibility.<ref name="Enserink20182">{{cite journal|last1=Enserink|first1=Martin|year=2018|title=Research on research|journal=Science|volume=361|issue=6408|pages=1178–1179|doi=10.1126/science.361.6408.1178|issn=0036-8075|pmid=30237336|bibcode=2018Sci...361.1178E|s2cid=206626417}}</ref> In 2012 ''[[PLOS ONE]]'' launched a Reproducibility Initiative. In 2015 [[BioMed Central]] introduced a minimum-standards-of-reporting checklist to four titles.
The first international conference in the broad area of meta-research was the Research Waste/[[EQUATOR Network|EQUATOR]] conference held in Edinburgh in 2015; the first international conference on peer review was the [[Peer Review Congress]] held in 1989.<ref name="Rennie1990">{{cite journal|last1=Rennie|first1=Drummond|title=Editorial Peer Review in Biomedical Publication|journal=JAMA|volume=263|issue=10|year=1990|pages=1317–1441|issn=0098-7484|doi=10.1001/jama.1990.03440100011001|pmid=2304208}}</ref> In 2016, ''[[Research Integrity and Peer Review]]'' was launched. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".<ref name="HarrimanKowalczuk2016">{{cite journal|last1=Harriman|first1=Stephanie L.|last2=Kowalczuk|first2=Maria K.|last3=Simera|first3=Iveta|last4=Wager|first4=Elizabeth|title=A new forum for research on research integrity and peer review|journal=Research Integrity and Peer Review|volume=1|issue=1|page=5|year=2016|issn=2058-8615|doi=10.1186/s41073-016-0010-y|pmid=29451544|pmc=5794038 |doi-access=free }}</ref>
== Fields and topics of meta-research ==
[[File:ToK Simple.jpg|thumb|300px|An exemplary visualization of a conception of scientific [[knowledge]] generation structured by layers, with the "Institution of Science" being the subject of metascience]]
Metascience can be categorized into five major areas of interest: Methods, Reporting, Reproducibility, Evaluation, and Incentives. These correspond, respectively, with how to perform, communicate, verify, evaluate, and reward research.<ref name=Ioannidis2015>{{cite journal |last1=Ioannidis |first1=John P. A. |last2=Fanelli |first2=Daniele |last3=Dunne |first3=Debbie Drake |last4=Goodman |first4=Steven N. |title=Meta-research: Evaluation and Improvement of Research Methods and Practices |journal=PLOS Biology |date=2 October 2015 |volume=13 |issue=10 |page=e1002264 |doi=10.1371/journal.pbio.1002264 |pmid=26431313 |pmc=4592065 |issn=1544-9173 |doi-access=free }}</ref>
=== Methods ===
Metascience seeks to identify poor research practices – including [[bias]]es in research, poor study design, and [[abuse of statistics]] – and to find methods to reduce these practices.<ref name=Ioannidis2015 /> Meta-research has identified numerous biases in scientific literature.<ref>{{cite journal |last1=Fanelli |first1=Daniele |last2=Costas |first2=Rodrigo |last3=Ioannidis |first3=John P. A. |title=Meta-assessment of bias in science |journal=Proceedings of the National Academy of Sciences of the United States of America |date=2017 |volume=114 |issue=14 |pages=3714–3719 |doi=10.1073/pnas.1618569114 |pmid=28320937 |pmc=5389310 |bibcode=2017PNAS..114.3714F |issn=1091-6490|doi-access=free }}</ref> Of particular note is the widespread [[misuse of p-values]] and abuse of [[statistical significance]].<ref>{{cite journal |last1=Check Hayden |first1=Erika |title=Weak statistical standards implicated in scientific irreproducibility |url=https://www.nature.com/news/weak-statistical-standards-implicated-in-scientific-irreproducibility-1.14131 |journal=Nature |access-date=9 May 2019 |language=en |doi=10.1038/nature.2013.14131|year=2013 |s2cid=211729036 |doi-access=free }}</ref>
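One well-documented form of such misuse is testing many comparisons and reporting only the "significant" ones. A minimal simulation of this effect (the sample sizes and number of comparisons are arbitrary illustration, not drawn from a cited study):

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_studies = 10_000  # simulated studies (arbitrary)
n_tests = 20        # comparisons per study; every null hypothesis is true
alpha = 0.05

studies_with_false_positive = 0
for _ in range(n_studies):
    # Both groups are drawn from the same distribution: no real effects exist.
    a = rng.normal(size=(n_tests, 30))
    b = rng.normal(size=(n_tests, 30))
    p = stats.ttest_ind(a, b, axis=1).pvalue
    if (p < alpha).any():
        studies_with_false_positive += 1

# Close to the analytic value 1 - 0.95**20, about 0.64
print(studies_with_false_positive / n_studies)
</syntaxhighlight>

Roughly 64% of such simulated studies contain at least one spurious "significant" result, which is one reason reporting guidelines and pre-registration ask authors to disclose all comparisons performed.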
==== Scientific data science ====
Scientific data science is the use of [[data science]] to analyse research papers. It encompasses both [[Qualitative method|qualitative]] and [[Quantitative method|quantitative]] methods. Research in scientific data science includes [[fraud detection]]<ref>{{cite journal
| last1 = Markowitz
| first1 = David M.
| last2 = Hancock
| first2 = Jeffrey T.
| s2cid = 146174471
| date = 2016
| title = Linguistic obfuscation in fraudulent science
| journal = Journal of Language and Social Psychology
| volume = 35
| issue = 4
| pages = 435–445
| doi = 10.1177/0261927X15614605
}}</ref> and [[citation network]] analysis.<ref>{{cite journal
| last = Ding
| first = Y.
| s2cid = 3752804
| date = 2010
| title = Applying weighted PageRank to author citation networks
| journal = Journal of the American Society for Information Science and Technology
| volume = 62
| issue = 2
| pages = 236–245
| doi = 10.1002/asi.21452
| arxiv= 1102.1760
}}</ref>
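As a toy illustration of the linguistic side of such work – a sketch only; the fraud-detection study cited above used validated psycholinguistic measures, not these crude proxies – simple text features can be extracted from abstracts and compared between suspect and control papers:

<syntaxhighlight lang="python">
# Crude "obfuscation" proxies for a piece of scientific text; illustrative
# stand-ins, not the validated measures of the cited study.
def obfuscation_features(text: str) -> dict:
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = text.split()
    long_words = [w for w in words if len(w.strip(".,;:()")) >= 9]  # jargon proxy
    return {
        "mean_sentence_length": len(words) / max(len(sentences), 1),
        "long_word_fraction": len(long_words) / max(len(words), 1),
    }

abstract = ("We elucidate multifactorial immunomodulatory mechanisms "
            "underpinning heterogeneous phenotypic manifestations.")
print(obfuscation_features(abstract))
</syntaxhighlight>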
==== Journalology ====
{{Main|Journalology}}
Journalology, also known as publication science, is the scholarly study of all aspects of the [[academic publishing]] process.<ref>{{Cite journal |last1=Galipeau |first1=James |last2=Moher |first2=David |last3=Campbell |first3=Craig |last4=Hendry |first4=Paul |last5=Cameron |first5=D. William |last6=Palepu |first6=Anita |last7=Hébert |first7=Paul C. |date=March 2015 |title=A systematic review highlights a knowledge gap regarding the effectiveness of health-related training programs in journalology |journal=Journal of Clinical Epidemiology |language=en |volume=68 |issue=3 |pages=257–265 |doi=10.1016/j.jclinepi.2014.09.024|pmid=25510373 |doi-access=free }}</ref><ref>{{Cite journal |last1=Wilson |first1=Mitch |last2=Moher |first2=David |date=March 2019 |title=The Changing Landscape of Journalology in Medicine |journal=Seminars in Nuclear Medicine |language=en |volume=49 |issue=2 |pages=105–114 |doi=10.1053/j.semnuclmed.2018.11.009|pmid=30819390 |hdl=10393/38493 |s2cid=73471103 |hdl-access=free }}</ref> The field seeks to improve the quality of scholarly research by implementing [[evidence-based practices]] in academic publishing.<ref name= Sciencemag1>{{Cite journal | doi=10.1126/science.aav4758 |title = 'Journalologists' use scientific methods to study academic publishing. Is their work improving science?|journal = Science|date = 18 September 2018|last1 = Couzin-Frankel|first1 = Jennifer|s2cid = 115360831}}</ref> The term "journalology" was coined by [[Stephen Lock]], the former [[editor-in-chief]] of ''[[The BMJ]]''. The first Peer Review Congress, held in 1989 in [[Chicago]], [[Illinois]], is considered a pivotal moment in the founding of journalology as a distinct field.<ref name= Sciencemag1/> The field of journalology has been influential in pushing for study [[pre-registration (science)|pre-registration]] in science, particularly in [[clinical trials]]. [[Clinical trial registration|Clinical-trial registration]] is now expected in most countries.<ref name= Sciencemag1 />
=== Reporting ===
Meta-research has identified poor practices in reporting, explaining, disseminating and popularizing research, particularly within the social and health sciences. Poor reporting makes it difficult to accurately interpret the results of scientific studies, to [[Reproducibility|replicate]] studies, and to identify the authors' biases and conflicts of interest. Solutions include the implementation of reporting standards and greater transparency in scientific studies (including better requirements for disclosure of conflicts of interest). There are attempts to standardize the reporting of data and methodology through the creation of guidelines by reporting agencies such as [[Consolidated Standards of Reporting Trials|CONSORT]] and the larger [[EQUATOR Network]].<ref name=Ioannidis2015 />
=== Reproducibility ===
{{Further|Replication crisis|Reproducibility}}
[[File:Barriers to conducting replications of experiment in cancer research.jpg|thumb|Barriers to conducting replications of experiment in cancer research, ''[[Reproducibility Project|The Reproducibility Project]]: Cancer Biology'']]
The replication crisis is an ongoing [[methodological]] crisis in which it has been found that many scientific studies are difficult or impossible to [[reproducibility|replicate]].<ref>{{Cite journal | doi = 10.1038/515009a| title = Metascience could rescue the 'replication crisis'| journal = Nature| volume = 515| issue = 7525| page = 9| year = 2014| last1 = Schooler | first1 = J. W.| pmid=25373639| bibcode = 2014Natur.515....9S| doi-access = free}}</ref><ref name="Why 'Statistical Significance' Is Often Insignificant">{{cite news|last1=Smith|first1=Noah|title=Why 'Statistical Significance' Is Often Insignificant|url=https://www.bloomberg.com/view/articles/2017-11-02/why-statistical-significance-is-often-insignificant|newspaper=Bloomberg.com|date=2 November 2017|access-date=7 November 2017}}</ref> While the crisis has its roots in the meta-research of the mid- to late 20th century, the phrase "replication crisis" was not coined until the early 2010s<ref name="ReferenceA">{{Cite journal |last1=Pashler |first1=Harold |last2=Wagenmakers |first2=Eric Jan |year=2012 |title=Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? |journal=Perspectives on Psychological Science |volume=7 |issue=6 |pages=528–530 |doi=10.1177/1745691612465253 |pmid=26168108 |s2cid=26361121}}</ref> as part of a growing awareness of the problem.<ref name=Ioannidis2015 /> The replication crisis has been closely studied in [[psychology]] (especially [[social psychology]]) and [[medicine]],<ref>{{cite magazine|url=http://www.newyorker.com/tech/elements/the-crisis-in-social-psychology-that-isnt|title=The Crisis in Social Psychology That Isn't|author=Gary Marcus|magazine=The New Yorker|date=May 1, 2013}}</ref><ref>{{cite magazine|url=http://www.newyorker.com/magazine/2010/12/13/the-truth-wears-off|title=The Truth Wears Off|author=Jonah Lehrer|magazine=The New Yorker|date=December 13, 2010}}</ref> including cancer research.<ref>{{cite news |title=Dozens of major cancer studies can't be replicated |url=https://www.sciencenews.org/article/cancer-biology-studies-research-replication-reproducibility |access-date=19 January 2022 |work=Science News |date=7 December 2021}}</ref><ref>{{cite web |title=Reproducibility Project: Cancer Biology |url=https://www.cos.io/rpcb |website=www.cos.io |publisher=[[Center for Open Science]] |access-date=19 January 2022 |language=en}}</ref> Replication is an essential part of the scientific process, and the widespread failure of replication puts into question the reliability of affected fields.<ref>Staddon, John (2017) Scientific Method: How science works, fails to work or pretends to work. Taylor and Francis.</ref>
Moreover, replication of research (or failure to replicate) is considered less influential than original research, and is less likely to be published in many fields. This discourages the reporting of, and even attempts to replicate, studies.<ref>{{Cite journal|last=Yeung|first=Andy W. K.|date=2017|title=Do Neuroscience Journals Accept Replications? A Survey of Literature|journal=Frontiers in Human Neuroscience|language=en|volume=11|page=468|doi=10.3389/fnhum.2017.00468|pmid=28979201|pmc=5611708|issn=1662-5161|doi-access=free}}</ref><ref>{{Cite journal|last1=Martin|first1=G. N.|last2=Clarke|first2=Richard M.|date=2017|title=Are Psychology Journals Anti-replication? A Snapshot of Editorial Practices|journal=Frontiers in Psychology|language=en|volume=8|page=523|doi=10.3389/fpsyg.2017.00523|pmid=28443044|pmc=5387793|issn=1664-1078|doi-access=free}}</ref>
=== Evaluation and incentives ===
{{See also|Academic journal#Prestige and ranking}}
Metascience seeks to create a scientific foundation for peer review. Meta-research evaluates [[Scholarly peer review|peer review]] systems including [[Scholarly peer review#Pre-publication peer review|pre-publication]] peer review, [[Scholarly peer review#Post-publication peer review|post-publication]] peer review, and [[open peer review]]. It also seeks to develop better research funding criteria.<ref name=Ioannidis2015 />
Metascience seeks to promote better research through better incentive systems. This includes studying the accuracy, effectiveness, costs, and benefits of different approaches to ranking and evaluating research and those who perform it.<ref name=Ioannidis2015 /> Critics argue that [[perverse incentives]] have created a [[publish or perish|publish-or-perish]] environment in academia which promotes the production of [[junk science]], low quality research, and [[false positives]].<ref>{{Cite book|last=Binswanger|first=Mathias|chapter=How Nonsense Became Excellence: Forcing Professors to Publish|date=2015|work=Incentives and Performance: Governance of Research Organizations|pages=19–32|editor-last=Welpe|editor-first=Isabell M.|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-319-09785-5_2|isbn=978-3319097855|editor2-last=Wollersheim|editor2-first=Jutta|editor3-last=Ringelhan|editor3-first=Stefanie|editor4-last=Osterloh|editor4-first=Margit|title=Incentives and Performance|s2cid=110698382 }}</ref><ref>{{Cite journal|last1=Edwards|first1=Marc A.|last2=Roy|first2=Siddhartha|date=2016-09-22|title=Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition|journal=Environmental Engineering Science|volume=34|issue=1|pages=51–61|doi=10.1089/ees.2016.0223|pmc=5206685|pmid=28115824}}</ref> According to [[Brian Nosek]], "The problem that we face is that the incentive system is focused almost entirely on getting research published, rather than on getting research right."<ref>{{cite web |last1=Brookshire |first1=Bethany |title=Blame bad incentives for bad science |url=https://www.sciencenews.org/blog/scicurious/blame-bad-incentives-bad-science |website=Science News |access-date=11 July 2019 |language=en |date=21 October 2016}}</ref> Proponents of reform seek to structure the incentive system to favor higher-quality results.<ref>{{cite journal |last1=Smaldino |first1=Paul E. |last2=McElreath |first2=Richard |title=The natural selection of bad science |journal=Royal Society Open Science |volume=3 |issue=9 |page=160384 |doi=10.1098/rsos.160384 |pmid=27703703 |pmc=5043322 |language=en|arxiv=1605.09511 |bibcode=2016RSOS....360384S |year=2016 }}</ref> Proposed measures include judging quality on the basis of narrative expert evaluations ("rather than [only or mainly] indices"), institutional evaluation criteria, guarantees of transparency, and professional standards.<ref name="10.1098/rspb.2019.2047">{{cite journal |last1=Chapman |first1=Colin A. |last2=Bicca-Marques |first2=Júlio César |last3=Calvignac-Spencer |first3=Sébastien |last4=Fan |first4=Pengfei |last5=Fashing |first5=Peter J. |last6=Gogarten |first6=Jan |last7=Guo |first7=Songtao |last8=Hemingway |first8=Claire A. |last9=Leendertz |first9=Fabian |last10=Li |first10=Baoguo |last11=Matsuda |first11=Ikki |last12=Hou |first12=Rong |last13=Serio-Silva |first13=Juan Carlos |last14=Chr. Stenseth |first14=Nils |title=Games academics play and their consequences: how authorship, h -index and journal impact factors are shaping the future of academia |journal=Proceedings of the Royal Society B: Biological Sciences |date=4 December 2019 |volume=286 |issue=1916 |pages=20192047 |doi=10.1098/rspb.2019.2047 |pmid=31797732 |pmc=6939250 |s2cid=208605640 |language=en |issn=0962-8452}}</ref>
====Contributorship====
Studies have proposed machine-readable standards and a taxonomy of [[Digital badge|badge]]s for science publication management systems that focus on contributorship – who has contributed what and how much of the research labor – rather than on the traditional concept of plain [[academic authorship|authorship]] – who was involved in any way in the creation of a publication.<ref>{{cite journal |last1=Holcombe |first1=Alex O. |title=Contributorship, Not Authorship: Use CRediT to Indicate Who Did What |journal=Publications |date=September 2019 |volume=7 |issue=3 |page=48 |doi=10.3390/publications7030048 |language=en|doi-access=free }}</ref><ref>{{cite journal |last1=McNutt |first1=Marcia K. |last2=Bradford |first2=Monica |last3=Drazen |first3=Jeffrey M. |last4=Hanson |first4=Brooks |last5=Howard |first5=Bob |last6=Jamieson |first6=Kathleen Hall |last7=Kiermer |first7=Véronique |last8=Marcus |first8=Emilie |last9=Pope |first9=Barbara Kline |last10=Schekman |first10=Randy |last11=Swaminathan |first11=Sowmya |last12=Stang |first12=Peter J. |last13=Verma |first13=Inder M. |title=Transparency in authors' contributions and responsibilities to promote integrity in scientific publication |journal=Proceedings of the National Academy of Sciences |date=13 March 2018 |volume=115 |issue=11 |pages=2557–2560 |doi=10.1073/pnas.1715374115 |pmid=29487213 |pmc=5856527 |bibcode=2018PNAS..115.2557M |language=en |issn=0027-8424|doi-access=free }}</ref><ref>{{cite journal |last1=Brand |first1=Amy |last2=Allen |first2=Liz |last3=Altman |first3=Micah |last4=Hlava |first4=Marjorie |last5=Scott |first5=Jo |title=Beyond authorship: attribution, contribution, collaboration, and credit |journal=Learned Publishing |date=1 April 2015 |volume=28 |issue=2 |pages=151–155 |doi=10.1087/20150211 |s2cid=45167271 |url=https://www.researchgate.net/publication/274098676|doi-access=free }}</ref><ref>{{cite journal |last1=Singh Chawla |first1=Dalmeet |title=Digital badges aim to clear up politics of authorship |journal=Nature |date=October 2015 |volume=526 |issue=7571 |pages=145–146 |doi=10.1038/526145a |pmid=26432249 |bibcode=2015Natur.526..145S |s2cid=256770827 |language=en |issn=1476-4687|doi-access=free }}</ref> A study pointed out one of the problems associated with the ongoing neglect of such contribution nuance: it found that "the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers".<ref name="10.1093/gigascience/giz053">{{cite journal |last1=Fire |first1=Michael |last2=Guestrin |first2=Carlos |title=Over-optimization of academic publishing metrics: observing Goodhart's Law in action |journal=GigaScience |date=1 June 2019 |volume=8 |issue=6 |page=giz053 |doi=10.1093/gigascience/giz053|pmid=31144712 |pmc=6541803 }}</ref>
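One such standard is CRediT (Contributor Roles Taxonomy), whose role labels can be embedded in machine-readable publication metadata. A minimal sketch of what such a record might look like – the role labels and the lead/equal/supporting degrees are CRediT's, while the surrounding schema, names and identifiers are invented for illustration:

<syntaxhighlight lang="python">
import json

# Machine-readable contributorship record. The role labels and the
# "lead"/"equal"/"supporting" degrees follow the CRediT taxonomy; the
# surrounding keys, names and ORCID iDs are hypothetical examples.
contributions = [
    {"name": "A. Researcher", "orcid": "0000-0000-0000-0001",
     "roles": {"Conceptualization": "lead",
               "Formal analysis": "equal",
               "Writing – original draft": "lead"}},
    {"name": "B. Collaborator", "orcid": "0000-0000-0000-0002",
     "roles": {"Data curation": "lead",
               "Formal analysis": "equal",
               "Writing – review & editing": "supporting"}},
]
print(json.dumps(contributions, ensure_ascii=False, indent=2))
</syntaxhighlight>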
====Assessment factors====
Factors other than a submission's merits can substantially influence peer reviewers' evaluations.<ref name="10.1177/2515245919895419">{{cite journal |last1=Elson |first1=Malte |last2=Huff |first2=Markus |last3=Utz |first3=Sonja |title=Metascience on Peer Review: Testing the Effects of a Study's Originality and Statistical Significance in a Field Experiment |journal=Advances in Methods and Practices in Psychological Science |date=1 March 2020 |volume=3 |issue=1 |pages=53–65 |doi=10.1177/2515245919895419 |s2cid=212778011 |language=en |issn=2515-2459|url=https://psyarxiv.com/gyds8/ }}</ref> Some such factors may nevertheless be relevant, such as a track record of the veracity of a researcher's prior publications and their alignment with public interests. Nevertheless, evaluation systems – including those of peer review – may substantially lack mechanisms and criteria oriented towards merit, real-world positive impact, progress and public usefulness, rather than towards analytical indicators such as the number of citations or altmetrics, even though such indicators can be used as partial proxies for those ends.<ref>{{cite journal |last1=McLean |first1=Robert K D |last2=Sen |first2=Kunal |title=Making a difference in the real world? A meta-analysis of the quality of use-oriented research using the Research Quality Plus approach |journal=Research Evaluation |date=1 April 2019 |volume=28 |issue=2 |pages=123–135 |doi=10.1093/reseval/rvy026}}</ref><ref>{{cite web |title=Bringing Rigor to Relevant Questions: How Social Science Research Can Improve Youth Outcomes in the Real World |url=http://wtgrantfoundation.org/library/uploads/2017/05/How-Social-Science-Research-Can-Improve-Youth-Outcomes_WTG2017.pdf |access-date=22 November 2021}}</ref> Rethinking the academic reward structure "to offer more formal recognition for intermediate products, such as data" could have positive impacts and reduce data withholding.<ref>{{cite journal |last1=Fecher |first1=Benedikt |last2=Friesike |first2=Sascha |last3=Hebing |first3=Marcel |last4=Linek |first4=Stephanie |title=A reputation economy: how individual reward considerations trump systemic arguments for open access to data |journal=Palgrave Communications |date=20 June 2017 |volume=3 |issue=1 |pages=1–10 |doi=10.1057/palcomms.2017.51 |s2cid=34449408 |language=en |issn=2055-1045|hdl=11108/308 |hdl-access=free }}</ref>
====Recognition of training====
A commentary noted that academic rankings do not consider where (in which country and at which institution) the respective researchers were trained.<ref>{{cite journal |last1=La Porta |first1=Caterina AM |last2=Zapperi |first2=Stefano |title=America's top universities reap the benefit of Italian-trained scientists |url=https://www.nature.com/articles/d43978-022-00163-5 |journal=Nature Italy |access-date=18 December 2022 |language=en |doi=10.1038/d43978-022-00163-5 |date=1 December 2022|s2cid=254331807 }}</ref>
====Scientometrics====
{{Main|Scientometrics}}
Scientometrics concerns itself with measuring [[bibliometrics|bibliographic data]] in scientific publications. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.<ref name="ScientometricsLeydesdorff">[[Loet Leydesdorff|Leydesdorff, L.]] and Milojevic, S., "Scientometrics" [https://arxiv.org/abs/1208.4566 arXiv:1208.4566] (2013), forthcoming in: Lynch, M. (editor), ''International Encyclopedia of Social and Behavioral Sciences'' subsection 85030. (2015)</ref> Studies suggest that "metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades" and have to some degree "ceased" to be good measures,<ref name="10.1093/gigascience/giz053"/> leading to issues such as "overproduction, unnecessary fragmentations, overselling, predatory journals (pay and publish), clever plagiarism, and deliberate obfuscation of scientific results so as to sell and oversell".<ref name="publishless">{{Cite arXiv|last1=Singh |first1=Navinder |title=Plea to publish less |date=8 October 2021|class=physics.soc-ph |eprint=2201.07985 }}</ref>
Novel tools in this area include systems to quantify how much a cited node informs the citing node.<ref>{{cite book |last1=Manchanda |first1=Saurav |last2=Karypis |first2=George |title=Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing |chapter=Evaluating Scholarly Impact: Towards Content-Aware Bibliometrics |date=November 2021 |pages=6041–6053 |doi=10.18653/v1/2021.emnlp-main.488 |publisher=Association for Computational Linguistics|s2cid=243865632 |doi-access=free }}</ref> This can be used to convert unweighted citation networks to weighted ones and then for [[importance]] assessment, deriving "impact metrics for the various entities involved, like the publications, authors etc",<ref>{{cite web |last1=Manchanda |first1=Saurav |last2=Karypis |first2=George |title=Importance Assessment in Scholarly Networks |url=http://ceur-ws.org/Vol-2831/paper17.pdf}}</ref> as well as for search engines and [[recommendation system]]s.
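A minimal sketch of such network-based impact scoring, assuming the NetworkX library, with edges pointing from the citing paper to the cited paper and carrying weights for how much the citation informs the citing work (the papers and weights below are invented for illustration):

<syntaxhighlight lang="python">
import networkx as nx

# Directed citation graph: edge (citing, cited) with an invented weight
# for how much the cited work informs the citing work.
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("paper_B", "paper_A", 0.9),  # B builds heavily on A
    ("paper_C", "paper_A", 0.2),  # C cites A only in passing
    ("paper_C", "paper_B", 0.7),
    ("paper_D", "paper_B", 0.5),
])

# Weighted PageRank: importance flows along citation edges to cited works.
scores = nx.pagerank(G, alpha=0.85, weight="weight")
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
</syntaxhighlight>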
==== Science governance ====
{{See also|Science policy|Science of science policy}}
[[Science funding]] and [[science governance]] can also be explored and informed by metascience.<ref name="10.1007/s11016-020-00581-5">{{cite journal |last1=Nielsen |first1=Kristian H. |title=Science and public policy |journal=Metascience |date=1 March 2021 |volume=30 |issue=1 |pages=79–81 |doi=10.1007/s11016-020-00581-5| pmc=7605730 |s2cid=226237994 |language=en |issn=1467-9981}}</ref>
===== Incentives =====
Various interventions, such as the [[prioritization]] of certain lines of research, can be important. {{anchor|Differential R&D}}For instance, the concept of [[differential technological development]] refers to deliberately developing technologies – e.g. control-, safety- and policy-technologies versus [[biosafety|risky biotechnologies]] – at different precautionary paces to decrease risks, mainly [[global catastrophic risk]], by influencing the sequence in which technologies are developed.<ref>{{cite book|last=Bostrom|first=Nick|title=Superintelligence: Paths, Dangers, Strategies|date=2014|publisher=Oxford University Press|isbn=978-0199678112|location=Oxford|pages=229–237}}</ref><ref>{{Cite book|last=Ord|first=Toby|title=The Precipice: Existential Risk and the Future of Humanity|publisher=[[Bloomsbury Publishing]]|year=2020|isbn=978-1526600219|location=United Kingdom|page=200}}</ref> Relying only on established forms of legislation and incentives to ensure the right outcomes may not be adequate, as these may often be too slow<ref>{{cite web |title=Technology is changing faster than regulators can keep up - here's how to close the gap |url=https://www.weforum.org/agenda/2018/06/law-too-slow-for-new-tech-how-keep-up/ |website=World Economic Forum |date=21 June 2018 |access-date=27 January 2022 |language=en}}</ref> or inappropriate.
Other incentives to govern science and related processes, including via metascience-based reforms, may include: ensuring accountability to the public (for example, in terms of the accessibility of research, especially publicly funded research, and of serious attention to research topics of public interest); increasing the qualified productive scientific workforce; improving the efficiency of science to improve [[problem-solving]] in general; and ensuring that unambiguous societal needs based on solid scientific evidence – such as about human physiology – are adequately prioritized and addressed. Such interventions, incentives and intervention-designs can be subjects of metascience.
===== Science funding and awards =====
{{See also|Patent#Criticism}}
[[File:Locations of papers in a map of science and locations of the key papers for Nobel prizes.tif|thumb|right|200px|Cluster network of scientific publications in relation to Nobel prizes]]
[[File:Funding for climate research in the natural and technical sciences versus the social sciences and humanities.jpg|thumb|Funding for climate research in the natural and technical sciences versus the social sciences and humanities<ref>{{cite journal |last1=Overland |first1=Indra |last2=Sovacool |first2=Benjamin K. |title=The misallocation of climate research funding |journal=Energy Research & Social Science |date=1 April 2020 |volume=62 |pages=101349 |doi=10.1016/j.erss.2019.101349 |s2cid=212789228 |language=en |issn=2214-6296}}</ref>]]
Scientific awards are one category of science incentives. Metascience can explore existing and hypothetical systems of science awards. For instance, it found that work honored by [[Nobel prize]]s clusters in only a few [[scientific fields]]: of the 114 and 849 domains into which science could be divided according to the DC2 and DC3 classification systems, respectively, only 36 and 71 had received at least one Nobel prize. Five of the 114 domains were shown to make up over half of the Nobel prizes awarded 1995–2017 (particle physics [14%], cell biology [12.1%], atomic physics [10.9%], neuroscience [10.1%], molecular chemistry [5.3%]).<ref name="nobelprizes">{{cite news |title=Nobel prize-winning work is concentrated in minority of scientific fields |url=https://phys.org/news/2020-07-nobel-prize-winning-minority-scientific-fields.html |access-date=17 August 2020 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Ioannidis |first1=John P. A. |last2=Cristea |first2=Ioana-Alina |last3=Boyack |first3=Kevin W. |title=Work honored by Nobel prizes clusters heavily in a few scientific fields |journal=PLOS ONE |date=29 July 2020 |volume=15 |issue=7 |page=e0234612 |doi=10.1371/journal.pone.0234612 |pmid=32726312 |pmc=7390258 |bibcode=2020PLoSO..1534612I |language=en |issn=1932-6203|doi-access=free }}</ref>
A study found that the model in which [[policy]]-makers delegate responsibility for knowledge production to science and provide appropriate funding – a centralized, authority-based, top-down approach in which science is expected to somehow deliver "reliable and useful knowledge to society" – is too simple.<ref name="10.1007/s11016-020-00581-5"/>
Measurements show that the allocation of biomedical resources can be more strongly correlated with previous allocations and research than with the [[DALY|burden of diseases]].<ref name="10.1126/science.aao0185"/>
A study suggests that "[i]f peer review is maintained as the primary mechanism of arbitration in the competitive selection of research reports and funding, then the [[scientific community]] needs to make sure it is not arbitrary".<ref name="10.1177/2515245919895419"/>
Studies indicate there is a need to "reconsider how we measure success" {{see below|[[#Factors of success and progress]]}}.<ref name="10.1093/gigascience/giz053"/>
;Funding data
Funding information from grant databases and funding acknowledgment sections can be sources of data for scientometrics studies, e.g. for investigating or recognizing the impact of funding entities on the development of science and technology.<ref>{{cite journal |last1=Fajardo-Ortiz |first1=David |last2=Hornbostel |first2=Stefan |last3=Montenegro de Wit |first3=Maywa |last4=Shattuck |first4=Annie |title=Funding CRISPR: Understanding the role of government and philanthropic institutions in supporting academic research within the CRISPR innovation system |journal=Quantitative Science Studies |date=22 June 2022 |volume=3 |issue=2 |pages=443–456 |doi=10.1162/qss_a_00187|s2cid=235266330 }}</ref>
=====Research questions and coordination=====
{{Excerpt|Research question|Aggregated research questions and coordination}}{{Further|Research question#Aggregated research questions and coordination}}
=====Risk governance=====
{{See also|#Differential R&D}}
{{Excerpt|Biosecurity|The future|paragraphs=-2,3}}
=== Science communication and public use ===
{{See also|Science#Science and the public|#Science education}}
It has been argued that "science has two fundamental attributes that underpin its value as a global public good: that knowledge claims and the evidence on which they are based are made openly available to scrutiny, and that the results of scientific research are communicated promptly and efficiently".<ref name="ISC1">{{cite web |title=Science as a Global Public Good |url=https://council.science/current/news/science-as-a-global-public-good/ |website=[[International Science Council]] |access-date=22 November 2021 |date=8 October 2021}}</ref> Metascientific research is exploring topics of [[science communication]] such as [[media coverage of science]], [[science journalism]] and online communication of results by science educators and scientists.<ref>{{cite book |last1=Jamieson |first1=Kathleen Hall |last2=Kahan |first2=Dan |last3=Scheufele |first3=Dietram A. |title=The Oxford Handbook of the Science of Science Communication |date=17 May 2017 |url=https://books.google.com/books?id=hSgmDwAAQBAJ&pg=PA51 |publisher=Oxford University Press |isbn=978-0190497637 |language=en}}</ref><ref>{{cite journal |last1=Grochala |first1=Rafał |title=Science communication in online media: influence of press releases on coverage of genetics and CRISPR |url=https://www.biorxiv.org/content/10.1101/2019.12.13.875278v1.abstract |page= |language=en |doi=10.1101/2019.12.13.875278 |date=16 December 2019 |s2cid=213125031 }}</ref><ref>{{cite journal |title=FRAMING ANALYSIS OF NEWS COVERAGE ON RENEWABLE ENERGYIN THE STAR ONLINE NEWS PORTAL |url=http://jestec.taylors.edu.my/Special%20Issue%20on%20SU18/SU18_04.pdf |access-date=22 November 2021}}</ref><ref>{{cite journal |last1=MacLaughlin |first1=Ansel |last2=Wihbey |first2=John |last3=Smith |first3=David |title=Predicting News Coverage of Scientific Articles |journal=Proceedings of the International AAAI Conference on Web and Social Media |date=15 June 2018 |volume=12 |issue=1 |doi=10.1609/icwsm.v12i1.14999 |s2cid=49412893 |url=https://ojs.aaai.org/index.php/ICWSM/article/view/14999 |language=en |issn=2334-0770|doi-access=free }}</ref> A study found that the "main incentive academics are offered for using social media is amplification" and that it should be "moving towards an institutional culture that focuses more on how these [or such] platforms can facilitate real engagement with research".<ref>{{cite journal |last1=Carrigan |first1=Mark |last2=Jordan |first2=Katy |title=Platforms and Institutions in the Post-Pandemic University: a Case Study of Social Media and the Impact Agenda |journal=Postdigital Science and Education |date=4 November 2021 |volume=4 |issue=2 |pages=354–372 |doi=10.1007/s42438-021-00269-x |s2cid=243760357 |language=en |issn=2524-4868|doi-access=free }}</ref> Science communication may also involve the communication of societal needs, concerns and requests to scientists.
====Alternative metrics tools====
{{anchor|altmetrics}}Alternative metrics tools can be used not only to help in assessment (of performance and impact)<ref name="10.1126/science.aao0185"/> and findability, but also to aggregate many of the public discussions about a scientific paper in social media such as [[reddit]], [[Science information on Wikipedia|citations on Wikipedia]], and reports about the study in the news media, which can then in turn be analyzed in metascience or provided and used by related tools.<ref>{{cite journal |last1=Baykoucheva |first1=Svetla |title=Measuring attention |journal=Managing Scientific Information and Research Data |date=2015 |pages=127–136 |doi=10.1016/B978-0-08-100195-0.00014-7 |isbn=978-0081001950 |url=https://www.researchgate.net/publication/280568406}}</ref> In terms of assessment and findability, altmetrics rate publications' performance or impact by the interactions they receive through social media or other online platforms,<ref name="10.1162/qss_a_00171">{{cite journal |last1=Zagorova |first1=Olga |last2=Ulloa |first2=Roberto |last3=Weller |first3=Katrin |last4=Flöck |first4=Fabian |title="I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics |journal=Quantitative Science Studies |date=12 April 2022 |volume=3 |issue=1 |pages=147–173 |doi=10.1162/qss_a_00171|s2cid=222177064 }}</ref> which can for example be used for sorting recent studies by measured impact, including before other studies cite them. The specific procedures of established altmetrics are not transparent,<ref name="10.1162/qss_a_00171"/> and the algorithms used cannot be customized or altered by the user as open source software can be. A study described various limitations of altmetrics and pointed "toward avenues for continued research and development".<ref>{{cite journal |last1=Williams |first1=Ann E. |title=Altmetrics: an overview and evaluation |journal=Online Information Review |date=12 June 2017 |volume=41 |issue=3 |pages=311–317 |doi=10.1108/OIR-10-2016-0294}}</ref> They are also limited in their use as a primary tool for researchers to find received constructive feedback.
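A toy sketch of how such a composite altmetric score might be aggregated and used for sorting – the sources, weights and counts below are invented, and, as noted above, real providers' weightings are proprietary:

<syntaxhighlight lang="python">
# Hypothetical per-source mention counts for three papers; the source
# weights are invented for illustration.
WEIGHTS = {"news": 8.0, "wikipedia": 3.0, "reddit": 0.5, "twitter": 0.25}

papers = {
    "10.1000/example.001": {"news": 2, "reddit": 15, "twitter": 120},
    "10.1000/example.002": {"wikipedia": 3, "twitter": 40},
    "10.1000/example.003": {"news": 1, "wikipedia": 1, "reddit": 4},
}

def altmetric_score(mentions: dict) -> float:
    return sum(WEIGHTS.get(src, 0.0) * n for src, n in mentions.items())

# Sort recent papers by measured attention, e.g. before citations accrue.
for doi, mentions in sorted(papers.items(),
                            key=lambda kv: -altmetric_score(kv[1])):
    print(f"{doi}: {altmetric_score(mentions):.1f}")
</syntaxhighlight>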
====Societal implications and applications====
It has been suggested that it may benefit science if "intellectual exchange—particularly regarding the societal implications and applications of science and technology—are better appreciated and incentivized in the future".<ref name="10.1126/science.aao0185"/>
====Knowledge integration====
Primary studies "without context, comparison or summary are ultimately of limited value" and various types{{additional citation needed|date=January 2023}} of research syntheses and summaries integrate primary studies.<ref name="10.1038/nature25753"/> Progress in key social-ecological challenges of the global environmental agenda is "hampered by a lack of [[knowledge integration|integration]] and synthesis of existing scientific evidence", with a "fast-increasing volume of data", compartmentalized information and generally unmet evidence synthesis challenges.<ref>{{cite journal |last1=Balbi |first1=Stefano |last2=Bagstad |first2=Kenneth J. |last3=Magrach |first3=Ainhoa |last4=Sanz |first4=Maria Jose |last5=Aguilar-Amuchastegui |first5=Naikoa |last6=Giupponi |first6=Carlo |last7=Villa |first7=Ferdinando |title=The global environmental agenda urgently needs a semantic web of knowledge |journal=Environmental Evidence |date=17 February 2022 |volume=11 |issue=1 |pages=5 |doi=10.1186/s13750-022-00258-y |s2cid=246872765 |language=en |issn=2047-2382|hdl=10278/5023700 |hdl-access=free |doi-access=free }}</ref> According to Khalil, researchers are facing the problem of [[information overload|too many papers]] – e.g. in March 2014 more than 8,000 papers were submitted to [[arXiv]] – and to "keep up with the huge amount of literature, researchers use reference manager software, they make summaries and [[note-taking|notes]], and they rely on review papers to provide an overview of a particular topic". He notes that review papers are usually (only) written "for topics in which many papers were written already, and they can get outdated quickly", and suggests "wiki-review papers" that get continuously updated with new studies on a topic, summarize many studies' results, and suggest future research.<ref name="10.1007/978-3-319-20717-9_11"/> A study suggests that if a scientific publication is being cited in a Wikipedia article, this could potentially be considered as an indicator of some form of impact for this publication,<ref name="10.1162/qss_a_00171"/> for example as this may, over time, indicate that the reference has contributed to a high-level summary of the given topic.
====Science journalism====
[[Science journalism|Science journalists]] play an important role in the scientific ecosystem and in science communication to the public, and need to "know how to use relevant information when deciding whether to trust a research finding, and whether and how to report on it", vetting the findings that get transmitted to the public.<ref>{{cite web |title=How Do Science Journalists Evaluate Psychology Research? |url=https://psyarxiv.com/26kr3/ |website=psyarxiv.com}}</ref>
=== Science education ===
Some studies investigate [[science education]], e.g. the teaching about selected [[scientific controversy|scientific controversies]]<ref>{{cite journal |last1=Dunlop |first1=Lynda |last2=Veneu |first2=Fernanda |title=Controversies in Science |journal=Science & Education |date=1 September 2019 |volume=28 |issue=6 |pages=689–710 |doi=10.1007/s11191-019-00048-y |s2cid=255016078 |language=en |issn=1573-1901|url=https://eprints.whiterose.ac.uk/146183/1/LD_final_edits.docx }}</ref> and the historical discovery processes behind major scientific conclusions,<ref>{{cite book |last1=Norsen |first1=Travis |title=How Should Humanity Steer the Future? |chapter=Back to the Future: Crowdsourcing Innovation by Refocusing Science Education |series=The Frontiers Collection |date=2016 |pages=85–95 |doi=10.1007/978-3-319-20717-9_9|isbn=978-3-319-20716-2 }}</ref> as well as common [[scientific misconceptions]].<ref>{{cite journal |last1=Bschir |first1=Karim |title=How to make sense of science: Mano Singham: The great paradox of science: why its conclusions can be relied upon even though they cannot be proven. Oxford: Oxford University Press, 2019, 332 pp, £ 22.99 HB |journal=Metascience |date=July 2021 |volume=30 |issue=2 |pages=327–330 |doi=10.1007/s11016-021-00654-z|s2cid=254792908 }}</ref> Education can also be a topic more generally, such as how to improve the quality of scientific outputs and reduce the time needed before scientific work can be conducted, or how to enlarge and retain various scientific workforces.
====Science misconceptions and anti-science attitudes====
{{Further|Science#Anti-science attitudes|Media literacy|Misinformation#Countermeasures}}
Many students have misconceptions about what science is and how it works.<ref>{{cite web |title=Correcting misconceptions - Understanding Science |url=https://undsci.berkeley.edu/for-educators/prepare-and-plan/correcting-misconceptions/ |access-date=25 January 2023 |date=21 April 2022}}</ref> [[Anti-science]] attitudes and beliefs are also a subject of research.<ref>{{cite journal |last1=Philipp-Muller |first1=Aviva |last2=Lee |first2=Spike W. S. |last3=Petty |first3=Richard E. |title=Why are people antiscience, and what can we do about it? |journal=Proceedings of the National Academy of Sciences |date=26 July 2022 |volume=119 |issue=30 |pages=e2120755119 |doi=10.1073/pnas.2120755119 |pmid=35858405 |pmc=9335320 |bibcode=2022PNAS..11920755P |language=en |issn=0027-8424}}</ref><ref>{{cite web |title=The 4 bases of anti-science beliefs – and what to do about them |url=https://scienmag.com/the-4-bases-of-anti-science-beliefs-and-what-to-do-about-them/ |website=SCIENMAG: Latest Science and Health News |access-date=25 January 2023 |date=11 July 2022}}</ref> Hotez suggests antiscience "has emerged as a dominant and highly lethal force, and one that threatens global security", and that there is a need for "new infrastructure" that mitigates it.<ref>{{cite web |last1=Hotez |first1=Peter J. |title=The Antiscience Movement Is Escalating, Going Global and Killing Thousands |url=https://www.scientificamerican.com/article/the-antiscience-movement-is-escalating-going-global-and-killing-thousands/ |website=Scientific American |access-date=25 January 2023 |language=en}}</ref>
=== Evolution of sciences ===
==== Scientific practice ====
[[File:The number of authors of research articles in six journals through time.jpg|thumb|Number of authors of research articles in six journals through time<ref name="10.1098/rspb.2019.2047"/>]]
[[File:Papers and patents are using narrower portions of existing knowledge.png|thumb|Trends of diversity of work cited, mean number of self-citations, and mean age of cited work may indicate papers are using "narrower portions of existing knowledge".<ref name="10.1038/s41586-022-05543-x"/>]]
Metascience can investigate how scientific processes evolve over time. A study found that teams are growing in size, "increasing by an average of 17% per decade".<ref name="10.1126/science.aao0185"/> {{see below|[[#LaborAdvantage|labor advantage]] below}}
[[File:ArXiv's yearly submission rate plot.jpg|thumb|[[ArXiv]]'s yearly submission rate growth over 30 years<ref>{{cite journal |last1=Ginsparg |first1=Paul |title=Lessons from arXiv's 30 years of information sharing |journal=Nature Reviews Physics |date=September 2021 |volume=3 |issue=9 |pages=602–603 |language=en |doi=10.1038/s42254-021-00360-z|pmid=34377944 |pmc=8335983 }}</ref>]]
It has been found that prevalent forms of non-[[open access]] publication, and the prices charged for many conventional journals – even for publicly funded papers – are unwarranted, unnecessary or suboptimal, and constitute detrimental barriers to scientific progress.<ref name="ISC1"/><ref>{{cite news |title=Nature Journals To Charge Authors Hefty Fee To Make Scientific Papers Open Access |url=https://www.iflscience.com/editors-blog/nature-journals-to-charge-authors-hefty-fee-to-make-scientific-papers-open-access/ |access-date=22 November 2021 |work=IFLScience |language=en}}</ref><ref>{{cite news |title=Harvard University says it can't afford journal publishers' prices |url=https://www.theguardian.com/science/2012/apr/24/harvard-university-journal-publishers-prices |access-date=22 November 2021 |work=The Guardian |date=24 April 2012 |language=en}}</ref><ref>{{cite journal |last1=Van Noorden |first1=Richard |title=Open access: The true cost of science publishing |journal=Nature |date=1 March 2013 |volume=495 |issue=7442 |pages=426–429 |doi=10.1038/495426a |pmid=23538808 |bibcode=2013Natur.495..426V |s2cid=27021567 |language=en |issn=1476-4687|doi-access=free }}</ref> Open access can save considerable amounts of financial resources, which could be used otherwise, and level the playing field for researchers in developing countries.<ref>{{cite journal |last1=Tennant |first1=Jonathan P. |last2=Waldner |first2=François |last3=Jacques |first3=Damien C. |last4=Masuzzo |first4=Paola |last5=Collister |first5=Lauren B. |last6=Hartgerink |first6=Chris. H. J. |title=The academic, economic and societal impacts of Open Access: an evidence-based review |journal=F1000Research |date=21 September 2016 |volume=5 |pages=632 |doi=10.12688/f1000research.8460.3|pmid=27158456 |pmc=4837983 |doi-access=free }}</ref> There are substantial expenses for subscriptions, gaining access to specific studies, and for [[article processing charge]]s. ''[[Paywall: The Business of Scholarship]]'' is a documentary on such issues.<ref>{{cite news |title=Paywall: The business of scholarship review – analysis of a scandal |url=https://www.newscientist.com/article/2181744-paywall-the-business-of-scholarship-review-analysis-of-a-scandal/ |access-date=28 January 2023 |work=New Scientist}}</ref>
Another topic is the established styles of scientific communication (e.g. long text-form studies and reviews) and the [[scientific publishing]] practices – there are concerns about a "glacial pace" of conventional publishing.<ref>{{cite journal |last1=Powell |first1=Kendall |title=Does it take too long to publish research? |journal=Nature |date=1 February 2016 |volume=530 |issue=7589 |pages=148–151 |language=en |doi=10.1038/530148a|pmid=26863966 |bibcode=2016Natur.530..148P |s2cid=1013588 |doi-access=free }}</ref> The use of [[preprint]] servers to publish study drafts early is increasing; [[open peer review]],<ref>{{cite web |title=Open peer review: bringing transparency, accountability, and inclusivity to the peer review process |url=https://blogs.lse.ac.uk/impactofsocialsciences/2017/09/13/open-peer-review-bringing-transparency-accountability-and-inclusivity-to-the-peer-review-process/ |website=Impact of Social Sciences |access-date=28 January 2023 |date=13 September 2017}}</ref> new tools to screen studies,<ref>{{cite magazine |last1=Dattani |first1=Saloni |title=The Pandemic Uncovered Ways to Speed Up Science |url=https://www.wired.com/story/covid-19-open-science-public-health-data/ |access-date=28 January 2023 |magazine=Wired}}</ref> and improved matching of submitted manuscripts to reviewers<ref>{{cite web |title=Speeding up the publication process at PLOS ONE |url=https://everyone.plos.org/2019/05/13/publication_timings_2019/ |website=EveryONE |access-date=28 January 2023 |date=13 May 2019}}</ref> are among the proposals to speed up publication.
==== Science overall and intrafield developments ====
[[File:Academic papers by discipline (visualization of 2012–2021 OpenAlex data; v2).png|thumb|A visualization of scientific outputs by [[Branches of science|field]] in OpenAlex.<ref name="openalexa">{{cite web |title=Open Alex Data Evolution |url=https://observablehq.com/@napsternxg/open-alex-data-evolution |website=observablehq.com |date=8 February 2022 |access-date=18 February 2022}}</ref><br />A study can be part of multiple fields,{{clarify|date=January 2023}} and a lower number of papers is not necessarily detrimental<ref name="publishless"/> for a field.]]
[[File:Change of number of scientific papers by field (visualization of 2012–2021 OpenAlex data).png|thumb|Change of number of scientific papers by field according to OpenAlex<ref name="openalexa"/>]]
[[File:Graph of number of papers by year in PubMed containing "coronavirus" up to 2019.png|thumb|Number of PubMed search results for "coronavirus" by year from 1949 to 2020]]
Studies have various kinds of [[metadata]] which can be utilized, complemented and made accessible in useful ways. [[Our Research#OpenAlex|OpenAlex]] is a free online index of over 200 million scientific documents that integrates and provides metadata such as sources, [[scientific citation|citation]]s, [[Academic authorship|author information]], [[scientific field]]s and [[Subject indexing|research topics]]. Its [[API]] and open source website can be used for metascience, [[scientometrics]] and novel tools that query this [[Semantic Web|semantic]] web of [[Academic paper|papers]].<ref>{{cite news |last1=Singh Chawla |first1=Dalmeet |title=Massive open index of scholarly papers launches |url=https://www.nature.com/articles/d41586-022-00138-y |access-date=14 February 2022 |journal=Nature |date=24 January 2022 |language=en |doi=10.1038/d41586-022-00138-y}}</ref><ref>{{cite news |title=OpenAlex: The Promising Alternative to Microsoft Academic Graph |url=https://library.smu.edu.sg/topics-insights/openalex-promising-alternative-microsoft-academic-graph |access-date=14 February 2022 |work=Singapore Management University (SMU) |language=en}}</ref><ref>{{cite web |title=OpenAlex Documentation |url=https://docs.openalex.org/ |access-date=18 February 2022}}</ref> {{anchor|Scholia}}Another project under development, [[Scholia (Wikidata project)|Scholia]], uses [[metadata]] of scientific publications for various visualizations and aggregation features such as providing a simple user interface summarizing literature about a specific feature of the SARS-CoV-2 virus using [[Wikidata]]'s "main subject" property.<ref name="10.1186/s12915-020-00940-y">{{cite journal |last1=Waagmeester |first1=Andra |last2=Willighagen |first2=Egon L. |last3=Su |first3=Andrew I. |last4=Kutmon |first4=Martina |last5=Gayo |first5=Jose Emilio Labra |last6=Fernández-Álvarez |first6=Daniel |last7=Groom |first7=Quentin |last8=Schaap |first8=Peter J. |last9=Verhagen |first9=Lisa M. |last10=Koehorst |first10=Jasper J. |title=A protocol for adding knowledge to Wikidata: aligning resources on human coronaviruses |journal=BMC Biology |date=22 January 2021 |volume=19 |issue=1 |page=12 |doi=10.1186/s12915-020-00940-y |pmid=33482803 |pmc=7820539 |issn=1741-7007 |doi-access=free }}</ref>
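A minimal sketch of querying the OpenAlex [[API]] (the endpoint and field names follow the public documentation at docs.openalex.org; the search term and the courtesy <code>mailto</code> address are illustrative placeholders):

<syntaxhighlight lang="python">
import requests

# Query OpenAlex's /works endpoint (see docs.openalex.org); the search
# term and mailto address are illustrative placeholders.
resp = requests.get(
    "https://api.openalex.org/works",
    params={"search": "metascience", "per-page": 5,
            "mailto": "you@example.org"},
    timeout=30,
)
resp.raise_for_status()

for work in resp.json()["results"]:
    print(work["publication_year"], work["cited_by_count"],
          work["display_name"])
</syntaxhighlight>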
=====Subject-level resolutions=====
Beyond metadata explicitly assigned to studies by humans, [[natural language processing]] and AI can be used to assign research publications [[Subject indexing|to topics]] – one study investigating the impact of science awards used such techniques to associate a paper's text (not just keywords) with the linguistic content of Wikipedia's scientific topics pages ("pages are created and updated by scientists and users through crowdsourcing"), creating meaningful and plausible classifications of high-fidelity scientific topics for further analysis or navigability.<ref>{{cite journal |last1=Jin |first1=Ching |last2=Ma |first2=Yifang |last3=Uzzi |first3=Brian |title=Scientific prizes and the extraordinary growth of scientific topics |journal=Nature Communications |date=5 October 2021 |volume=12 |issue=1 |pages=5619 |doi=10.1038/s41467-021-25712-2 |pmid=34611161 |pmc=8492701 |arxiv=2012.09269 |bibcode=2021NatCo..12.5619J |language=en |issn=2041-1723}}</ref>
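A minimal sketch of this kind of text-based topic assignment, assuming scikit-learn and using TF-IDF cosine similarity in place of the cited study's actual model (the topic descriptions and abstract are invented):

<syntaxhighlight lang="python">
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented stand-ins for topic pages and a paper abstract.
topic_pages = {
    "Graph theory": "graphs vertices edges connectivity paths networks",
    "Immunology": "immune cells antibodies antigens inflammation response",
    "Astrophysics": "stars galaxies supernovae cosmology dark matter",
}

abstract = ("We study citation networks as directed graphs and analyse "
            "paths and connectivity between papers.")

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(list(topic_pages.values()) + [abstract])

# Similarity of the abstract (last row) to each topic page.
sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
best = max(zip(topic_pages, sims), key=lambda kv: kv[1])
print(best)  # highest-similarity topic, here 'Graph theory'
</syntaxhighlight>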
===== Growth or stagnation of science overall =====
{{Further|Scientific method#History|Philosophy of science}}
[[File:Biomarker Publications in Scholia.png|thumb|Rough trend of scholarly publications about [[biomarker (medicine)|biomarkers]] according to Scholia; the number of biomarker-related publications may not closely track the number of viable biomarkers.<ref>{{cite web |title=Scholia – biomarker |url=https://scholia.toolforge.org/chemical-class/Q864574#publications-per-year |access-date=28 January 2023}}</ref>]]
[[File:CD index of high quality science over time.png|thumb|The CD index for papers published in ''Nature'', ''PNAS'', and ''Science'' and Nobel-Prize-winning papers<ref name="10.1038/s41586-022-05543-x"/>]]
[[File:Decline of disruptive science and technology (based on the CD index).png|thumb|The CD index may indicate a "decline of disruptive science and technology".<ref name="10.1038/s41586-022-05543-x"/>]]
Metascience research investigates the growth of science overall, using, for example, data on the number of publications in [[bibliographic database]]s. One study found segments with different growth rates that appear related to phases of "economic (e.g., industrialization)" – money is considered a necessary input to the science system – "and/or political developments (e.g., Second World War)". The study also confirmed a recent exponential growth in the volume of scientific literature and calculated an average doubling period of 17.3 years.<ref>{{cite journal |last1=Bornmann |first1=Lutz |last2=Haunschild |first2=Robin |last3=Mutz |first3=Rüdiger |title=Growth rates of modern science: a latent piecewise growth curve approach to model publication numbers from established and new literature databases |journal=Humanities and Social Sciences Communications |date=7 October 2021 |volume=8 |issue=1 |pages=1–15 |doi=10.1057/s41599-021-00903-w |s2cid=229156128 |language=en |issn=2662-9992|arxiv=2012.07675 }}</ref>
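For reference, a doubling period <math>T_2</math> is equivalent to a continuous annual growth rate of <math>r = \ln 2 / T_2</math>; for the reported 17.3-year doubling period,

: <math>r = \frac{\ln 2}{17.3\ \text{years}} \approx 0.04\ \text{per year},</math>

i.e. roughly 4% more publications each year.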
However, others have pointed out that it is difficult to measure scientific progress in meaningful ways, partly because it is hard to accurately evaluate how important any given scientific discovery is. A variety of perspectives on the trajectory of science overall (impact, number of major discoveries, etc.) have been described in books and articles, including that science is becoming harder (per dollar or hour spent); that if science is "slowing today, it is because science has remained too focused on established fields"; that papers and patents are increasingly less likely to be "disruptive" in terms of breaking with the past, as measured by the "CD index";<ref name="10.1038/s41586-022-05543-x">{{cite journal |last1=Park |first1=Michael |last2=Leahey |first2=Erin |last3=Funk |first3=Russell J. |title=Papers and patents are becoming less disruptive over time |journal=Nature |date=January 2023 |volume=613 |issue=7942 |pages=138–144 |doi=10.1038/s41586-022-05543-x |pmid=36600070 |bibcode=2023Natur.613..138P |s2cid=255466666 |language=en |issn=1476-4687|doi-access=free }}</ref> and that there is a great [[Progress#Stagnation|stagnation]] – possibly as part of a larger trend<ref name="outofideas">{{cite web |last1=Thompson |first1=Derek |title=America Is Running on Fumes |url=https://www.theatlantic.com/ideas/archive/2021/12/america-innovation-film-science-business/620858/ |website=The Atlantic |access-date=27 January 2023 |language=en |date=1 December 2021}}</ref> – whereby, e.g., "things haven't changed nearly as much since the 1970s" once the computer and the Internet are excluded.
A better understanding of the potential slowdowns indicated by some measures could be a major opportunity to improve humanity's future.<ref>{{cite web |last1=Collison |first1=Patrick |last2=Nielsen |first2=Michael |title=Science Is Getting Less Bang for Its Buck |url=https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/ |website=The Atlantic |access-date=27 January 2023 |language=en |date=16 November 2018}}</ref> For example, emphasis on citations in the measurement of scientific productivity, information overload,<ref name="outofideas"/> reliance on a narrower set of existing knowledge (which may include narrow [[Academic specialization|specialization]] and related contemporary practices) {{tooltip|based on three "use of previous knowledge"-indicators|"the diversity of work cited, mean number of self-citations and mean age of work cited"}},<ref name="10.1038/s41586-022-05543-x"/> and risk-avoidant funding structures<ref name="stagnation"/> may have pushed research "toward incremental science and away from [[Exploratory research|exploratory]] projects that are more likely to fail".<ref name="w26752">{{cite journal |last1=Bhattacharya |first1=Jay |last2=Packalen |first2=Mikko |title=Stagnation and Scientific Incentives |date=February 2020 |url=https://www.nber.org/system/files/working_papers/w26752/w26752.pdf |publisher=National Bureau of Economic Research}}</ref> The study that introduced the "CD index" suggests that the overall number of papers has risen while the total number of "highly disruptive" papers, as measured by the index, has not (notably, the [[1998 in science#Astronomy and space exploration|1998]] discovery of the [[accelerating expansion of the universe]] has a CD index of 0). Its results also suggest that scientists and inventors "may be struggling to keep up with the pace of knowledge expansion".<ref>{{cite news |last1=Tejada |first1=Patricia Contreras |title=With fewer disruptive studies, is science becoming an echo chamber? |url=https://www.advancedsciencenews.com/with-fewer-disruptive-studies-is-science-becoming-an-echo-chamber/ |access-date=15 February 2023 |work=Advanced Science News |date=13 January 2023 |archive-date=15 February 2023 |archive-url=https://web.archive.org/web/20230215233007/https://www.advancedsciencenews.com/with-fewer-disruptive-studies-is-science-becoming-an-echo-chamber/ |url-status=live }}</ref><ref name="10.1038/s41586-022-05543-x"/>
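The CD index of a focal paper is computed from the later papers that cite the focal paper and/or its references: a value of 1 indicates maximal disruption (citing papers ignore the focal paper's predecessors), −1 maximal consolidation (citing papers also cite the predecessors). The following Python sketch follows the published definition; the function and variable names are illustrative:

<syntaxhighlight lang="python">
def cd_index(cites_focal, cites_refs):
    """CD index of a focal paper (illustrative implementation).

    cites_focal: set of later papers citing the focal paper
    cites_refs:  set of later papers citing at least one of the
                 focal paper's references
    Returns a value in [-1, 1].
    """
    forward = cites_focal | cites_refs  # forward citation network
    if not forward:
        return None
    score = 0
    for paper in forward:
        f = 1 if paper in cites_focal else 0  # cites the focal paper
        b = 1 if paper in cites_refs else 0   # cites its predecessors
        score += -2 * f * b + f  # +1 disruptive, -1 consolidating, 0 otherwise
    return score / len(forward)

# A paper whose citers all ignore its references is maximally disruptive:
print(cd_index({"p1", "p2"}, set()))         # 1.0
# A paper always co-cited with its references is maximally consolidating:
print(cd_index({"p1", "p2"}, {"p1", "p2"}))  # -1.0
</syntaxhighlight>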
Various novelty metrics – ways of measuring the "novelty" of studies<ref name="w26752"/> – have been proposed to counterbalance a potential anti-novelty bias, such as textual analysis<ref name="w26752"/> or measuring whether a paper makes first-time-ever combinations of referenced journals while taking into account the difficulty of such combinations.<ref>{{cite journal |last1=Wang |first1=Jian |last2=Veugelers |first2=Reinhilde |last3=Stephan |first3=Paula |title=Bias against novelty in science: A cautionary tale for users of bibliometric indicators |journal=Research Policy |date=1 October 2017 |volume=46 |issue=8 |pages=1416–1436 |doi=10.1016/j.respol.2017.06.006 |language=en |issn=0048-7333|url=https://lirias.kuleuven.be/handle/123456789/590071 }}</ref> Other approaches include proactively funding risky projects.<ref name="10.1126/science.aao0185"/>
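As an illustration of the journal-combination approach, the sketch below counts the journal pairs in a paper's reference list that have never been co-cited before; it deliberately omits the difficulty weighting applied in the cited study, and all names are illustrative:

<syntaxhighlight lang="python">
# Simplified sketch of a combinatorial novelty metric: find journal
# pairs in a paper's reference list that were never co-cited before.
from itertools import combinations

def novel_journal_pairs(ref_journals, known_pairs):
    """Return the first-time-ever journal pairs among a paper's references.

    ref_journals: iterable of journal names cited by the paper
    known_pairs:  set of frozensets of journal pairs co-cited in
                  any earlier paper
    """
    pairs = {frozenset(p) for p in combinations(sorted(set(ref_journals)), 2)}
    return pairs - known_pairs

# Example: pairing an established pair with a physics journal yields
# two never-before-seen combinations.
earlier = {frozenset({"Nature", "Science"})}
print(novel_journal_pairs(["Nature", "Science", "Physical Review E"], earlier))
</syntaxhighlight>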
=== Topic mapping ===
Science maps can show the main interrelated topics within a certain scientific domain, their change over time, and their key actors (researchers, institutions, journals). They may help find factors that determine the emergence of new scientific fields and the development of interdisciplinary areas, and could be relevant for science policy purposes.<ref>{{cite web |last1=Petrovich |first1=Eugenio |title=Science mapping |url=https://www.isko.org/cyclo/science_mapping#9 |website=www.isko.org |access-date=27 January 2023 |date=2020}}</ref> [[Theory of change|Theories of scientific change]] could guide "the exploration and interpretation of visualized intellectual structures and dynamic patterns".<ref>{{cite journal |last1=Chen |first1=Chaomei |title=Science Mapping: A Systematic Review of the Literature |journal=Journal of Data and Information Science |date=21 March 2017 |volume=2 |issue=2 |pages=1–40 |doi=10.1515/jdis-2017-0006 |s2cid=57737772 |url=https://www.researchgate.net/publication/313991204|doi-access=free }}</ref> The maps can show the intellectual, social or conceptual structure of a research field.<ref>{{cite journal |last1=Gutiérrez-Salcedo |first1=M. |last2=Martínez |first2=M. Ángeles |last3=Moral-Munoz |first3=J. A. |last4=Herrera-Viedma |first4=E. |last5=Cobo |first5=M. J. |title=Some bibliometric procedures for analyzing and evaluating research fields |journal=Applied Intelligence |date=1 May 2018 |volume=48 |issue=5 |pages=1275–1287 |doi=10.1007/s10489-017-1105-y |s2cid=254227914 |language=en |issn=1573-7497}}</ref> Beyond visual maps, expert [[Survey (human research)|survey]]-based studies and similar approaches could identify understudied or neglected societally important areas, topic-level problems (such as stigma or dogma), or potential misprioritizations.{{additional citation needed|date=January 2023}} Examples are [[policy studies|studies]] about [[policy]] in relation to public health<ref>{{cite journal |last1=Navarro |first1=V. |title=Politics and health: a neglected area of research |journal=The European Journal of Public Health |date=31 March 2008 |volume=18 |issue=4 |pages=354–355 |doi=10.1093/eurpub/ckn040|pmid=18524802 }}</ref> and the social science of climate change mitigation,<ref name="10.1016/j.erss.2019.101349">{{Cite journal|last1=Overland|first1=Indra|last2=Sovacool|first2=Benjamin K.|date=1 April 2020|title=The misallocation of climate research funding|journal=Energy Research & Social Science|language=en|volume=62|pages=101349|doi=10.1016/j.erss.2019.101349|issn=2214-6296|doi-access=free}}</ref> where it has been estimated that only 0.12% of all funding for climate-related research goes to the social science of mitigation, even though the most urgent puzzle at the current juncture is working out how to mitigate climate change; the natural science of climate change is already well established.<ref name="10.1016/j.erss.2019.101349"/>
There are also studies that map a scientific field or topic, such as a study of the use of research evidence [[evidence-based policy|in policy]] and [[evidence-based practice|practice]], based partly on [[Survey (human research)|surveys]].<ref>{{cite journal |last1=Farley-Ripple |first1=Elizabeth N. |last2=Oliver |first2=Kathryn |last3=Boaz |first3=Annette |title=Mapping the community: use of research evidence in policy and practice |journal=Humanities and Social Sciences Communications |date=7 September 2020 |volume=7 |issue=1 |pages=1–10 |doi=10.1057/s41599-020-00571-2 |language=en |issn=2662-9992|doi-access=free}}</ref>
=== Controversies, current debates and disagreement ===
{{See also|#scite.ai|#Topic mapping}}
[[File:Disagreement in the scientific literature by field.jpg|thumb|Percent of all citances in each field that contain signals of disagreement<ref name="10.7554/eLife.72737"/>]]
Some research investigates [[scientific controversy|scientific controversies]], and may identify currently ongoing major debates (e.g. open questions) and [[disagreement]]s between scientists or studies.{{additional citation needed|date=January 2023}} One study suggests the level of disagreement was highest in the [[social sciences]] and [[humanities]] (0.61%), followed by biomedical and health sciences (0.41%), life and earth sciences (0.29%), physical sciences and engineering (0.15%), and mathematics and computer science (0.06%).<ref name="10.7554/eLife.72737">{{cite journal |last1=Lamers |first1=Wout S |last2=Boyack |first2=Kevin |last3=Larivière |first3=Vincent |last4=Sugimoto |first4=Cassidy R |last5=van Eck |first5=Nees Jan |last6=Waltman |first6=Ludo |last7=Murray |first7=Dakota |title=Investigating disagreement in the scientific literature |journal=eLife |date=24 December 2021 |volume=10 |pages=e72737 |doi=10.7554/eLife.72737 |pmid=34951588 |pmc=8709576 |issn=2050-084X |doi-access=free }}</ref> Such research may also show where the disagreements lie, especially if they cluster, including visually, such as with cluster diagrams.
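The cited study detected disagreement by searching citation sentences ("citances") for signal phrases. The toy Python sketch below illustrates the general idea only; the cue list here is arbitrary, whereas the study used a validated set of signal terms and additional filters:

<syntaxhighlight lang="python">
# Toy sketch of citance-based disagreement detection: flag citation
# sentences that contain disagreement cue phrases (illustrative list).
CUES = ("in contrast to", "contrary to", "disagree", "inconsistent with",
        "conflicting results")

def is_disagreement_citance(citance: str) -> bool:
    """Return True if a citation sentence contains a disagreement cue."""
    text = citance.lower()
    return any(cue in text for cue in CUES)

citances = [
    "In contrast to Smith et al. (2019), we find no such effect.",
    "Our results confirm earlier reports [12, 13].",
]
share = sum(map(is_disagreement_citance, citances)) / len(citances)
print(f"{share:.0%} of citances signal disagreement")
</syntaxhighlight>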
=== Challenges of interpretation of pooled results ===
{{See also|Clinical trial#Trial design}}
{{Further|Meta-analysis#Challenges}}
Studies about a specific [[research question|research question or research topic]] are often reviewed in the form of higher-level overviews in which results from various studies are integrated, compared, critically analyzed and interpreted. Examples of such works are [[scientific review]]s and [[meta-analysis|meta-analyses]]. These and related practices<!-- or their intended purposes etc--> face various challenges and are a subject of metascience.
Various issues with the included or available studies, such as heterogeneity of the methods used, may lead to faulty conclusions of a meta-analysis.<ref>{{cite journal |last1=Stone |first1=Dianna L. |last2=Rosopa |first2=Patrick J. |title=The Advantages and Limitations of Using Meta-analysis in Human Resource Management Research |journal=Human Resource Management Review |date=1 March 2017 |volume=27 |issue=1 |pages=1–7 |doi=10.1016/j.hrmr.2016.09.001 |language=en |issn=1053-4822}}</ref><!-- which can for example preempt further research or misplace research focus.-->
=== Knowledge integration and living documents ===
Various problems require swift [[Knowledge integration|integration]] of new and existing science-based knowledge. Settings where there are a large number of loosely related projects and initiatives especially benefit from a common ground or "commons".<ref name="10.1186/s12915-020-00940-y"/>
Evidence synthesis can be applied to important [[List of global issues|global challenges]] that are, notably, both relatively urgent and certain: "[[Climate change mitigation#Overviews, integration and comparisons of measures|climate change]], energy transitions, biodiversity loss, [[antimicrobial resistance]], poverty eradication and so on". It has been suggested that a better system would keep summaries of research evidence up to date via living systematic reviews – e.g. as [[living document]]s. While the number of scientific papers and data (or information and online knowledge) [[Information explosion|has risen substantially]],{{additional citation needed|date=February 2022}} the number of published academic systematic reviews has risen from "around 6,000 in 2011 to more than 45,000 in 2021".<ref>{{cite journal |last1=Elliott |first1=Julian |last2=Lawrence |first2=Rebecca |last3=Minx |first3=Jan C. |last4=Oladapo |first4=Olufemi T. |last5=Ravaud |first5=Philippe |last6=Tendal Jeppesen |first6=Britta |last7=Thomas |first7=James |last8=Turner |first8=Tari |last9=Vandvik |first9=Per Olav |last10=Grimshaw |first10=Jeremy M. |title=Decision makers need constantly updated evidence synthesis |journal=Nature |date=December 2021 |volume=600 |issue=7889 |pages=383–385 |doi=10.1038/d41586-021-03690-1| pmid=34912079 |bibcode=2021Natur.600..383E |s2cid=245220047 |language=en|doi-access=free}}</ref> An [[evidence-based]] approach is important for progress in science, [[policy]], medicine and other practices. For example, meta-analyses can quantify what is known, identify what is not yet known,<ref name="10.1038/nature25753"/> and place "truly innovative and highly [[interdisciplinary]] ideas" into the context of established knowledge, which may enhance their impact.<ref name="10.1126/science.aao0185"/>
=== Factors of success and progress ===
{{See also|#Growth or stagnation of science overall}}
It has been hypothesized that a deeper understanding of factors behind successful science could "enhance prospects of science as a whole to more effectively address societal problems".<ref name="10.1126/science.aao0185">{{cite journal |last1=Fortunato |first1=Santo |last2=Bergstrom |first2=Carl T. |last3=Börner |first3=Katy |last4=Evans |first4=James A. |last5=Helbing |first5=Dirk |last6=Milojević |first6=Staša |last7=Petersen |first7=Alexander M. |last8=Radicchi |first8=Filippo |last9=Sinatra |first9=Roberta |last10=Uzzi |first10=Brian |last11=Vespignani |first11=Alessandro |last12=Waltman |first12=Ludo |last13=Wang |first13=Dashun |last14=Barabási |first14=Albert-László |title=Science of science |journal=Science |date=2 March 2018 |volume=359 |issue=6379 |page=eaao0185 |doi=10.1126/science.aao0185 |pmid=29496846 |pmc=5949209 |url=https://www.researchgate.net/publication/323502497 |access-date=22 November 2021}}</ref>
;Novel ideas and disruptive scholarship
Two metascientists reported that "structures fostering disruptive scholarship and focusing attention on novel [[idea]]s" could be important because, in a growing [[scientific field]], [[scientific citation|citation flows]] disproportionately consolidate onto already well-cited papers, possibly slowing<!--stagnating--> and inhibiting canonical [[Progress#Scientific progress|progress]].<!--https://www.theatlantic.com/ideas/archive/2021/11/grants-american-scientific-revolution/620609/--><ref>{{cite web |last1=Snyder |first1=Alison |title=New ideas are struggling to emerge from the sea of science |date=14 October 2021 |url=https://www.axios.com/science-new-ideas-dbe29601-010c-411a-b79d-bbd1388ec5a0.html |publisher=Axios |access-date=15 November 2021}}</ref><ref>{{cite journal |last1=Chu |first1=Johan S. G. |last2=Evans |first2=James A. |title=Slowed canonical progress in large fields of science |journal=Proceedings of the National Academy of Sciences |date=12 October 2021 |volume=118 |issue=41 |page=e2021636118 |doi=10.1073/pnas.2021636118 |pmid=34607941 |pmc=8522281 |bibcode=2021PNAS..11821636C |language=en |issn=0027-8424|doi-access=free }}</ref> A study concluded that to enhance the impact of truly innovative and highly interdisciplinary novel ideas, they should be placed in the context of established knowledge.<ref name="10.1126/science.aao0185"/>
;Mentorship, partnerships and social factors
Other researchers reported that the most successful [[Mentorship|protégés]] – in terms of "likelihood of [[Lists of science and technology awards|prizewinning]], National Academy of Science (NAS) induction, or superstardom" – studied under mentors who published [[research]] for which those mentors were conferred a prize after the protégés' mentorship. Studying original topics rather than these mentors' research topics was also positively associated with success.<ref>{{cite news |title=Sharing of tacit knowledge is most important aspect of mentorship, study finds |url=https://phys.org/news/2020-06-tacit-knowledge-important-aspect-mentorship.html |access-date=4 July 2020 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Ma |first1=Yifang |last2=Mukherjee |first2=Satyam |last3=Uzzi |first3=Brian |title=Mentorship and protégé success in STEM fields |journal=Proceedings of the National Academy of Sciences |date=23 June 2020 |volume=117 |issue=25 |pages=14077–14083 |doi=10.1073/pnas.1915516117 |pmid=32522881 |pmc=7322065 |bibcode=2020PNAS..11714077M |language=en |issn=0027-8424|doi-access=free }}</ref> Highly productive partnerships are also a topic of research – e.g. "super-ties" of frequent co-authorship between two individuals who can complement each other's skills, likely also the result of other factors such as mutual trust, conviction, commitment and fun.<ref>{{cite news |title=Science of Science authors hope to spark conversations about the scientific enterprise |url=https://phys.org/news/2018-03-science-authors-conversations-scientific-enterprise.html |access-date=28 January 2023 |work=phys.org |language=en}}</ref><ref name="10.1126/science.aao0185"/>
;Study of successful scientists and processes, general skills and activities
The emergence or origin of ideas by successful scientists is also a topic of research, for example through reviews of existing ideas on how [[Gregor Mendel|Mendel]] made his [[scientific discovery|discoveries]]<ref>{{cite journal |last1=van Dijk |first1=Peter J. |last2=Jessop |first2=Adrienne P. |last3=Ellis |first3=T. H. Noel |title=How did Mendel arrive at his discoveries? |journal=Nature Genetics |date=July 2022 |volume=54 |issue=7 |pages=926–933 |doi=10.1038/s41588-022-01109-9 |pmid=35817970 |s2cid=250454204 |language=en |issn=1546-1718}}</ref> – or, more generally, the process of discovery by scientists. Science is a "multifaceted process of appropriation, [[copying]], extending, or combining ideas and [[invention]]s" [and other types of knowledge or information], not an isolated process.<ref name="10.1126/science.aao0185"/> There are also a few studies investigating scientists' habits, common modes of thinking, reading habits, use of information sources, [[digital literacy]] skills, and [[workflow]]s.<ref>{{cite journal |last1=Root-Bernstein |first1=Robert S. |last2=Bernstein |first2=Maurine |last3=Garnier |first3=Helen |title=Correlations Between Avocations, Scientific Style, Work Habits, and Professional Impact of Scientists |journal=Creativity Research Journal |date=1 April 1995 |volume=8 |issue=2 |pages=115–137 |doi=10.1207/s15326934crj0802_2 |issn=1040-0419}}</ref><ref>{{cite journal |last1=Ince |first1=Sharon |last2=Hoadley |first2=Christopher |last3=Kirschner |first3=Paul A. |title=A qualitative study of social sciences faculty research workflows |journal=Journal of Documentation |date=1 January 2022 |volume=78 |issue=6 |pages=1321–1337 |doi=10.1108/JD-08-2021-0168 |s2cid=247078086 |issn=0022-0418}}</ref><ref>{{cite web |last1=Nassi-Calò |first1=Lilian |title=Researchers reading habits for scientific literature {{!}} SciELO in Perspective |url=https://blog.scielo.org/en/2014/04/03/researchers-reading-habits-for-scientific-literature/ |access-date=25 February 2023 |date=3 April 2014}}</ref><ref>{{cite news |last1=Van Noorden |first1=Richard |title=Scientists may be reaching a peak in reading habits |url=https://www.nature.com/articles/nature.2014.14658 |access-date=25 February 2023 |journal=Nature |date=3 February 2014 |language=en |doi=10.1038/nature.2014.14658}}</ref><ref>{{cite journal |last1=Arshad |first1=Alia |last2=Ameen |first2=Kanwal |title=Comparative analysis of academic scientists, social scientists and humanists' scholarly information seeking habits |journal=The Journal of Academic Librarianship |date=1 January 2021 |volume=47 |issue=1 |pages=102297 |doi=10.1016/j.acalib.2020.102297 |s2cid=229433047 |language=en |issn=0099-1333}}</ref>
;Labor advantage
{{anchor|LaborAdvantage}}A study theorized that in many disciplines, larger scientific productivity or success by [[College and university rankings|elite universities]] can be explained by their larger pool of available funded laborers.<ref>{{cite news |title=Why it pays to join a big research group if you want to be more scientifically productive |url=https://physicsworld.com/a/why-it-pays-to-join-a-big-research-group-if-you-want-to-be-more-scientifically-productive/ |access-date=13 December 2022 |work=Physics World |date=24 November 2022}}</ref><ref>{{cite journal |last1=Zhang |first1=Sam |last2=Wapman |first2=K. Hunter |last3=Larremore |first3=Daniel B. |last4=Clauset |first4=Aaron |title=Labor advantages drive the greater productivity of faculty at elite universities |journal=Science Advances |date=16 November 2022 |volume=8 |issue=46 |pages=eabq7056 |doi=10.1126/sciadv.abq7056 |pmid=36399560 |pmc=9674273 |arxiv=2204.05989 |bibcode=2022SciA....8.7056Z |language=en |issn=2375-2548}}</ref>{{elucidate|date=January 2023}}
;Ultimate impacts
Success in science is often measured in terms of metrics like citations, not in terms of the eventual or potential impact on lives and society, which awards sometimes seek to recognize.{{additional citation needed|date=January 2023}} Problems with such metrics are roughly outlined elsewhere in this article and include that [[scientific review|reviews]] replace citations to primary studies.<ref name="10.1038/nature25753">{{cite journal |last1=Gurevitch |first1=Jessica |last2=Koricheva |first2=Julia |last3=Nakagawa |first3=Shinichi |last4=Stewart |first4=Gavin |title=Meta-analysis and the science of research synthesis |journal=Nature |date=March 2018 |volume=555 |issue=7695 |pages=175–182 |doi=10.1038/nature25753 |pmid=29517004 |bibcode=2018Natur.555..175G |s2cid=3761687 |language=en |issn=1476-4687}}</ref> There are also proposals for changes to the academic incentive systems that would increase the recognition of societal impact in the research process.<ref>{{cite web |title=Academic Incentives and Research Impact: Developing Reward and Recognition Systems to Better People's Lives |url=https://sfdora.org/resource/academic-incentives-and-research-impact-developing-reward-and-recognition-systems-to-better-peoples-lives/ |website=DORA |access-date=28 January 2023}}</ref>
;Progress studies
A proposed field of "Progress Studies" could investigate how scientists (or their funders or evaluators) should act, "figuring out interventions", and study [[progress]] itself.<ref>{{cite web |last1=Collison |first1=Patrick|last2=Cowen|first2=Tyler |title=We Need a New Science of Progress |url=https://www.theatlantic.com/science/archive/2019/07/we-need-new-science-progress/594946/ |website=The Atlantic |access-date=25 January 2023 |language=en |date=30 July 2019}}</ref> The field was explicitly proposed in a 2019 essay and described as an [[applied science]] that prescribes action.<ref>{{cite web |last1=Lovely |first1=Garrison |title=Do we need a better understanding of 'progress'? |url=https://www.bbc.com/future/article/20220615-do-we-need-a-better-understanding-of-progress |publisher=BBC |access-date=27 January 2023 |language=en}}</ref>
;As and for acceleration of progress
A study suggests that improving the way science is done could accelerate the rate of scientific discovery and its applications, which could be useful for finding urgent solutions to humanity's problems, improving humanity's conditions, and enhancing the understanding of nature. Metascientific studies can seek to identify aspects of science that need improvement and develop ways to improve them.<ref name="10.1007/978-3-319-20717-9_11">{{cite book |last1=Khalil |first1=Mohammed M. |title=How Should Humanity Steer the Future? |chapter=Improving Science for a Better Future |series=The Frontiers Collection |date=2016 |pages=113–126 |doi=10.1007/978-3-319-20717-9_11 |publisher=Springer International Publishing |isbn=978-3-319-20716-2 |language=en}}</ref> If science is accepted as the fundamental engine of economic growth and social progress, this could raise "the question of what we – as a society – can do to accelerate science, and to direct science toward solving society's most important problems."<ref>{{cite web |title=Developing the science of science |url=https://www.worksinprogress.co/issue/developing-the-science-of-science/ |website=Works in Progress |access-date=25 January 2023|first1=Paul|last1=Niehaus|first2=Heidi |last2=Williams }}</ref> However, one of the authors clarified that a one-size-fits-all approach is not thought to be the right answer – for example, in funding, DARPA models, curiosity-driven methods, allowing "a single reviewer to champion a project even if his or her peers do not agree", and various other approaches all have their uses. Nevertheless, evaluating them can help build knowledge of what works or works best.<ref name="stagnation">{{cite news |title=How to escape scientific stagnation |url=https://www.economist.com/finance-and-economics/2022/10/26/how-to-escape-scientific-stagnation |newspaper=The Economist |access-date=25 January 2023}}</ref>
==Reforms==
Meta-research identifying flaws in scientific practice has inspired reforms in science. These reforms seek to address and fix problems in scientific practice which lead to low-quality or inefficient research.
A 2015 study described existing efforts in meta-research as "fragmented".<ref name=Ioannidis2015/>
===Pre-registration===
{{Further|Pre-registration (science)|Clinical trial registration}}
The practice of registering a scientific study before it is conducted is called [[pre-registration (science)|pre-registration]]. It arose as a means to address the [[replication crisis]]. Pre-registration requires the submission of a registered report, which is then accepted for publication or rejected by a journal based on theoretical justification, experimental design, and the proposed statistical analysis. Pre-registration of studies serves to prevent [[publication bias]] (e.g. not publishing negative results), reduce [[data dredging]], and increase replicability.<ref>{{Cite web|title = Registered Replication Reports|publisher=Association for Psychological Science|url = http://www.psychologicalscience.org/index.php/replication|access-date = 2015-11-13}}</ref><ref>{{Cite news|title = Psychology's 'registration revolution'|url = https://www.theguardian.com/science/head-quarters/2014/may/20/psychology-registration-revolution|newspaper = the Guardian|access-date = 2015-11-13|first = Chris|last = Chambers|date = 2014-05-20}}</ref>
===Reporting standards===
{{Further|CONSORT|EQUATOR Network}}
Studies showing poor consistency and quality of reporting have demonstrated the need for reporting standards and guidelines in science, which has led to the rise of organisations that produce such standards, such as [[CONSORT]] (Consolidated Standards of Reporting Trials) and the [[EQUATOR Network]].
The EQUATOR ('''E'''nhancing the '''QUA'''lity and '''T'''ransparency '''O'''f health '''R'''esearch)<ref>{{cite journal | last1 = Simera | first1 = I | last2 = Moher | first2 = D | last3 = Hirst | first3 = A | last4 = Hoey | first4 = J | last5 = Schulz | first5 = KF | last6 = Altman | first6 = DG | title = Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network | journal = BMC Medicine | volume = 8 | page = 24 | year = 2010 | pmid = 20420659 | pmc = 2874506 | doi = 10.1186/1741-7015-8-24 | doi-access = free }}</ref> Network is an international initiative aimed at promoting transparent and accurate reporting of health research studies to enhance the value and reliability of [[medical research]] literature.<ref>{{cite journal |last1=Simera |first1=I. |last2=Moher |first2=D. |last3=Hoey |first3=J. |last4=Schulz |first4=K. F. |last5=Altman |first5=D. G. |title=A catalogue of reporting guidelines for health research |journal=European Journal of Clinical Investigation |volume=40 |pages=35–53 |year=2010 |doi= 10.1111/j.1365-2362.2009.02234.x |pmid=20055895 |issue=1 |doi-access=free}}</ref> The EQUATOR Network was established with the goals of raising awareness of the importance of good reporting of research, assisting in the development, dissemination and implementation of reporting guidelines for different types of study designs, monitoring the status of the quality of reporting of research studies in the health sciences literature, and conducting research relating to issues that impact the quality of reporting of health research studies.<ref name=Simera2009>{{cite journal|last=Simera|first=I|author2=Altman, DG|s2cid=36739841|title=Writing a research article that is "fit for purpose": EQUATOR Network and reporting guidelines|journal= Evidence-Based Medicine|date=October 2009|volume=14|issue=5|pages=132–134|doi=10.1136/ebm.14.5.132|pmid=19794009}}<!--|access-date=28 July 2013--></ref> The Network acts as an "umbrella" organisation, bringing together developers of reporting guidelines, medical journal editors and peer reviewers, research funding bodies, and other key stakeholders with a mutual interest in improving the quality of research publications and research itself.
==Applications==
The areas of application of metascience include information and communications technologies, medicine, psychology, physics and computer science.
=== Information and communications technologies ===
{{See also|#Reforms|#Evaluation and incentives|#Science overall and intrafield developments|Group decision-making|List of academic databases and search engines|Scientific communication}}
Metascience is used in the creation and improvement of technical systems ([[Information and communications technology|ICTs]]) and standards of science evaluation, incentivization, communication, commissioning, funding, regulation, production, management, use and publication. Such work can be called "applied metascience"<ref>{{cite video |title=Ep. 49: Joel Chan on metascience, creativity, and tools for thought. |url=https://www.youtube.com/watch?v=KT-I_6TERKk |language=en}}</ref>{{better citation needed|date=February 2022}} and may seek to explore ways to increase the quantity, quality and positive impact of research. One example is the [[Scientometrics#Altmetrics|development of alternative metrics]].<ref name="10.1126/science.aao0185"/>
;Study screening and feedback
Various websites and tools identify inappropriate studies and/or enable feedback, such as [[PubPeer]], [[Cochrane (organisation)|Cochrane]]'s Risk of Bias Tool<ref>{{cite web |title=Risk of Bias Tool {{!}} Cochrane Bias |url=https://methods.cochrane.org/bias/risk-bias-tool |website=methods.cochrane.org |access-date=25 January 2023 |language=en}}</ref> and [[RetractionWatch]]. Medical and academic disputes date back to antiquity, and a study calls for research into "constructive and obsessive criticism" and into policies to "help strengthen social media into a vibrant forum for discussion, and not merely an arena for gladiator matches".<ref>{{cite journal |last1=Prasad |first1=Vinay |last2=Ioannidis |first2=John P. A. |title=Constructive and obsessive criticism in science |journal=European Journal of Clinical Investigation |date=November 2022 |volume=52 |issue=11 |pages=e13839 |doi=10.1111/eci.13839 |pmid=35869811 |pmc=9787955 |language=en |issn=0014-2972}}</ref> Feedback on studies can be found via altmetrics, which are often integrated into the study's webpage – most often as an embedded [[Altmetrics]] badge – but may be incomplete, such as showing social media discussions that link to the study directly but not those that link to news reports about the study.
;Tools used, modified, extended or investigated
Tools may be developed through meta-research, or may be used or investigated by it. Notable examples include:
* {{anchor|scite.ai}}The tool scite.ai aims to track and link citations of papers as 'Supporting', 'Mentioning' or 'Contrasting' the study.<ref>{{cite news |last1=Khamsi |first1=Roxanne |title=Coronavirus in context: Scite.ai tracks positive and negative citations for COVID-19 literature |url=https://www.nature.com/articles/d41586-020-01324-6 |access-date=19 February 2022 |journal=Nature |date=1 May 2020 |language=en |doi=10.1038/d41586-020-01324-6}}</ref><ref>{{cite journal |last1=Nicholson |first1=Josh M. |last2=Mordaunt |first2=Milo |last3=Lopez |first3=Patrice |last4=Uppala |first4=Ashish |last5=Rosati |first5=Domenic |last6=Rodrigues |first6=Neves P. |last7=Grabitz |first7=Peter |last8=Rife |first8=Sean C. |title=scite: A smart citation index that displays the context of citations and classifies their intent using deep learning |journal=Quantitative Science Studies |date=5 November 2021 |volume=2 |issue=3 |pages=882–898 |doi=10.1162/qss_a_00146|s2cid=232283218 }}</ref><ref name="newbot"/>
* The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern".<ref name="newbot">{{cite news |title=New bot flags scientific studies that cite retracted papers |url=https://www.nature.com/nature-index/news-blog/new-bot-flags-scientific-research-studies-that-cite-retracted-papers |website=Nature Index |date=2 February 2021 |access-date=25 January 2023 |language=en}}</ref> Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction.<ref name="newbot"/>
* Search engines like [[Google Scholar]] are used to find studies and the notification service [[Google Alerts]] enables notifications for new studies matching specified search terms. Scholarly communication infrastructure includes search databases.<ref>{{cite book |last1=Chan |first1=Joel |last2=Lutters |first2=Wayne |last3=Schneider |first3=Jodi |last4=Kirsanow |first4=Karola |last5=Bessa |first5=Silvia |last6=Saunders |first6=Jonny L. |title=Companion Computer Supported Cooperative Work and Social Computing |chapter=Growing New Scholarly Communication Infrastructures for Sharing, Reusing, and Synthesizing Knowledge |date=8 November 2022 |pages=278–281 |doi=10.1145/3500868.3559398 |publisher=Association for Computing Machinery|isbn=9781450391900 |s2cid=253385733 }}</ref>
* The [[shadow library]] [[Sci-hub]] is a topic of metascience.<ref>{{cite journal |last1=Segado-Boj |first1=Francisco |last2=Martín-Quevedo |first2=Juan |last3=Prieto-Gutiérrez |first3=Juan-José |title=Jumping over the paywall: Strategies and motivations for scholarly piracy and other alternatives |journal=Information Development |date=12 December 2022 |doi=10.1177/02666669221144429 |s2cid=254564205 |language=en |issn=0266-6669|url=https://eprints.ucm.es/id/eprint/75874/1/Preprint_Segado-Boj_ID_2022.pdf }}</ref>
* [[Personal knowledge management]] systems for research, [[knowledge worker|knowledge]] and task management, such as for saving information in organized ways<ref>{{cite journal |last1=Gosztyla |first1=Maya |title=How to find, read and organize papers |url=https://www.nature.com/articles/d41586-022-01878-7 |journal=Nature |access-date=28 January 2023 |language=en |doi=10.1038/d41586-022-01878-7 |date=7 July 2022|pmid=35804061 |s2cid=250388551 }}</ref> with [[Comparison of note-taking software|multi-document]] [[text editor]]s for future use.<ref>{{cite book |last1=Fastrez |first1=Pierre |last2=Jacques |first2=Jerry |title=Human Interface and the Management of Information. Information and Knowledge Design |chapter=Managing References by Filing and Tagging: An Exploratory Study of Personal Information Management by Social Scientists |series=Lecture Notes in Computer Science |date=2015 |volume=9172 |pages=291–300 |doi=10.1007/978-3-319-20612-7_28 |publisher=Springer International Publishing |isbn=978-3-319-20611-0 |language=en}}</ref><ref>{{cite journal |last1=Chaudhry |first1=Abdus Sattar |last2=Alajmi |first2=Bibi M. |title=Personal information management practices: how scientists find and organize information |url=https://www.emerald.com/insight/content/doi/10.1108/GKMC-04-2022-0082/full/html |journal=Global Knowledge, Memory and Communication |doi=10.1108/GKMC-04-2022-0082 |date=1 January 2022|volume=ahead-of-print |issue=ahead-of-print |s2cid=253363619 }}</ref> Along with, e.g., Web browsers ([[Tab (interface)#Development|tab add-ons]]<ref>{{cite book |last1=Chang |first1=Joseph Chee |last2=Kim |first2=Yongsung |last3=Miller |first3=Victor |last4=Liu |first4=Michael Xieyang |last5=Myers |first5=Brad A |last6=Kittur |first6=Aniket |title=The 34th Annual ACM Symposium on User Interface Software and Technology |chapter=Tabs.do: Task-Centric Browser Tab Management |date=12 October 2021 |pages=663–676 |doi=10.1145/3472749.3474777 |publisher=Association for Computing Machinery|isbn=9781450386357 |s2cid=237102658 }}</ref> etc.) and search software,{{additional citation needed|date=January 2023}} such systems could be described as part of "mind-machine partnerships" that metascience could investigate for how they could improve science.<ref name="10.1126/science.aao0185"/>
* Scholia – efforts to open scholarly publication metadata and use it via Wikidata.<ref>{{cite journal |last1=Rasberry |first1=Lane |last2=Tibbs |first2=Sheri |last3=Hoos |first3=William |last4=Westermann |first4=Amy |last5=Keefer |first5=Jeffrey |last6=Baskauf |first6=Steven James |last7=Anderson |first7=Clifford |last8=Walker |first8=Philip |last9=Kwok |first9=Cherrie |last10=Mietchen |first10=Daniel |title=WikiProject Clinical Trials for Wikidata |date=4 April 2022 |doi=10.1101/2022.04.01.22273328|s2cid=247936371 |website=medRxiv }}</ref>
* Various software enables common metascientific practices such as bibliometric analysis.<ref>{{cite journal |last1=Moral-Muñoz |first1=José A. |last2=Herrera-Viedma |first2=Enrique |last3=Santisteban-Espejo |first3=Antonio |last4=Cobo |first4=Manuel J. |title=Software tools for conducting bibliometric analysis in science: An up-to-date review |journal=El Profesional de la Información |date=19 January 2020 |volume=29 |issue=1 |doi=10.3145/epi.2020.ene.03|s2cid=210926828 |hdl=10481/62406 |hdl-access=free }}</ref>
;Development
According to one study, "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed due to reproducibility issues in science.<ref>{{cite news |title=A new replication crisis: Research that is less likely to be true is cited more |url=https://phys.org/news/2021-05-replication-crisis-true-cited.html |access-date=14 June 2021 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Serra-Garcia |first1=Marta |last2=Gneezy |first2=Uri |title=Nonreplicable publications are cited more than replicable ones |journal=Science Advances |date=2021-05-01 |volume=7 |issue=21 |page=eabd1705 |doi=10.1126/sciadv.abd1705 |pmid=34020944 |pmc=8139580 |bibcode=2021SciA....7.1705S |language=en |issn=2375-2548}}</ref> Another study proposes a tool for screening studies for early warning signs of research fraud.<ref>{{cite journal |last1=Parker |first1=Lisa |last2=Boughton |first2=Stephanie |last3=Lawrence |first3=Rosa |last4=Bero |first4=Lisa |title=Experts identified warning signs of fraudulent research: a qualitative study to inform a screening tool |journal=Journal of Clinical Epidemiology |date=1 November 2022 |volume=151 |pages=1–17 |doi=10.1016/j.jclinepi.2022.07.006|pmid=35850426 |s2cid=250632662 }}</ref>
===Medicine===
{{See also|Profit motive#Criticisms}}
Clinical research in medicine is often of low quality, and many studies cannot be replicated.<ref name="Ioannidis2016">{{cite journal | last1 = Ioannidis | first1 = JPA | year = 2016 | title = Why Most Clinical Research Is Not Useful | journal = PLOS Med | volume = 13 | issue = 6| page = e1002049 | doi = 10.1371/journal.pmed.1002049 | pmid = 27328301 | pmc = 4915619 | doi-access = free }}</ref><ref>{{cite journal|title=Contradicted and initially stronger effects in highly cited clinical research|last=Ioannidis JA|date=13 July 2005|journal=JAMA|volume=294|issue=2|pages=218–228|doi=10.1001/jama.294.2.218|pmid=16014596|doi-access=free}}</ref> An estimated 85% of research funding is wasted.<ref name="ChalmersGlasziou2009">{{cite journal|last1=Chalmers|first1=Iain|last2=Glasziou|first2=Paul|s2cid=11797088|year=2009|title=Avoidable waste in the production and reporting of research evidence|journal=The Lancet|volume=374|issue=9683|pages=86–89|doi=10.1016/S0140-6736(09)60329-9|issn=0140-6736|pmid=19525005|url=http://timetravel.mementoweb.org/memento/2009/http://www.thelancet.com/journals/lancet }}</ref> Additionally, the presence of bias affects research quality.<ref>{{cite web |last=Hsu |first=Jeremy |title=Dark Side of Medical Research: Widespread Bias and Omissions |url=https://www.livescience.com/8365-dark-side-medical-research-widespread-bias-omissions.html |website=Live Science |date=24 June 2010 |access-date=24 May 2019}}</ref> The [[pharmaceutical companies|pharmaceutical industry]] exerts substantial influence on the design and execution of medical research. Conflicts of interest are common among authors of medical literature<ref>{{cite journal |title=Confronting conflict of interest |journal=Nature Medicine |date=November 2018 |volume=24 |issue=11 |page=1629 |doi=10.1038/s41591-018-0256-7 |pmid=30401866 |language=en |issn=1546-170X|doi-access=free }}</ref> and among editors of medical journals. While almost all medical journals require their authors to disclose conflicts of interest, editors are not required to do so.<ref>{{cite journal |last1=Haque |first1=Waqas |last2=Minhajuddin |first2=Abu |last3=Gupta |first3=Arjun |last4=Agrawal |first4=Deepak |title=Conflicts of interest of editors of medical journals |journal=PLOS ONE |date=2018 |volume=13 |issue=5 |page=e0197141 |doi=10.1371/journal.pone.0197141 |pmid=29775468 |pmc=5959187 |issn=1932-6203|bibcode=2018PLoSO..1397141H |doi-access=free }}</ref> Financial [[conflicts of interest]] have been linked to higher rates of positive study results. In antidepressant trials, pharmaceutical sponsorship is the best predictor of trial outcome.<ref>{{cite journal |last1=Moncrieff |first1=J |title=The antidepressant debate. |journal=The British Journal of Psychiatry |date=March 2002 |volume=180 |issue=3 |pages=193–194 |pmid=11872507 |language=en |issn=0007-1250|doi=10.1192/bjp.180.3.193 |doi-access=free }}</ref>
[[blinded experiment|Blinding]] is another focus of meta-research, as error caused by poor blinding is a source of [[bias|experimental bias]]. Blinding is not well reported in medical literature, and widespread misunderstanding of the subject has resulted in poor implementation of blinding in [[clinical trial]]s.<ref>{{cite journal |last1=Bello |first1=S |last2=Moustgaard |first2=H |last3=Hróbjartsson |first3=A |title=The risk of unblinding was infrequently and incompletely reported in 300 randomized clinical trial publications. |journal=Journal of Clinical Epidemiology |date=October 2014 |volume=67 |issue=10 |pages=1059–1069 |doi=10.1016/j.jclinepi.2014.05.007 |pmid=24973822 |issn=1878-5921}}</ref> Furthermore, [[unblinding|failure of blinding]] is rarely measured or reported.<ref>{{cite journal |last1=Tuleu |first1=Catherine |last2=Legay |first2=Helene |last3=Orlu-Gul |first3=Mine |last4=Wan |first4=Mandy |title=Blinding in pharmacological trials: the devil is in the details |journal=Archives of Disease in Childhood |date=1 September 2013 |volume=98 |issue=9 |pages=656–659 |doi=10.1136/archdischild-2013-304037 |pmid=23898156 |pmc=3833301 |language=en |issn=0003-9888}}</ref> Research showing the failure of blinding in [[antidepressant]] trials has led some scientists to argue that antidepressants are no better than [[placebo]].<ref>{{cite journal |last1=Kirsch |first1=I |title=Antidepressants and the Placebo Effect. |journal=Zeitschrift für Psychologie |date=2014 |volume=222 |issue=3 |pages=128–134 |doi=10.1027/2151-2604/a000176 |pmid=25279271 |pmc=4172306 |issn=2190-8370}}</ref><ref>{{cite journal |last1=Ioannidis |first1=John PA |title=Effectiveness of antidepressants: an evidence myth constructed from a thousand randomized trials? |journal=Philosophy, Ethics, and Humanities in Medicine |date=27 May 2008 |volume=3 |page=14 |doi=10.1186/1747-5341-3-14 |pmid=18505564 |pmc=2412901 |issn=1747-5341 |doi-access=free }}</ref> In light of meta-research showing failures of blinding, [[CONSORT]] standards recommend that all clinical trials assess and report the quality of blinding.<ref name=":0">{{cite journal |last1=Moher |first1=David |last2=Altman |first2=Douglas G. |last3=Schulz |first3=Kenneth F. |title=CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials |journal=BMJ |date=24 March 2010 |volume=340 |page=c332 |doi=10.1136/bmj.c332 |pmid=20332509 |pmc=2844940 |language=en |issn=0959-8138}}</ref>
Studies have shown that systematic reviews of existing research evidence are sub-optimally used in planning new research or summarizing results.<ref name=pmid9676682>{{cite journal |doi=10.1001/jama.280.3.280 |pmid=9676682 |title=Discussion Sections in Reports of Controlled Trials Published in General Medical Journals |journal=JAMA |volume=280 |issue=3 |pages=280–282 |year=1998 |last1=Clarke |first1=Michael |last2=Chalmers |first2=Iain |doi-access=free }}</ref> Cumulative meta-analyses of studies evaluating the effectiveness of medical interventions have shown that many clinical trials could have been avoided if a systematic review of existing evidence had been done prior to conducting a new trial.<ref name=pmid1614465>{{cite journal |doi=10.1056/NEJM199207233270406 |pmid=1614465 |title=Cumulative Meta-Analysis of Therapeutic Trials for Myocardial Infarction |journal=New England Journal of Medicine |volume=327 |issue=4 |pages=248–254 |year=1992 |last1=Lau |first1=Joseph |last2=Antman |first2=Elliott M |last3=Jimenez-Silva |first3=Jeanette |last4=Kupelnick |first4=Bruce |last5=Mosteller |first5=Frederick |last6=Chalmers |first6=Thomas C |doi-access=free}}</ref><ref name=pmid16279145>{{cite journal |doi=10.1191/1740774505cn085oa |pmid=16279145 |title=Randomized controlled trials of aprotinin in cardiac surgery: Could clinical equipoise have stopped the bleeding? |journal=Clinical Trials|volume=2 |issue=3 |pages=218–229; discussion 229–232 |year=2016 |last1=Fergusson |first1=Dean |last2=Glass |first2=Kathleen Cranley |last3=Hutton |first3=Brian |last4=Shapiro |first4=Stan |s2cid=31375469 }}</ref><ref name=pmid25068257>{{cite journal |doi=10.1371/journal.pone.0102670 |pmid=25068257 |pmc=4113310 |title=Accumulating Research: A Systematic Account of How Cumulative Meta-Analyses Would Have Provided Knowledge, Improved Health, Reduced Harm and Saved Resources |journal=PLOS ONE |volume=9 |issue=7 |page=e102670 |year=2014 |last1=Clarke |first1=Mike |last2=Brice |first2=Anne |last3=Chalmers |first3=Iain |bibcode=2014PLoSO...9j2670C |doi-access=free }}</ref> For example, Lau et al.<ref name=pmid1614465/> analyzed 33 clinical trials (involving 36,974 patients) evaluating the effectiveness of intravenous [[streptokinase]] for [[Myocardial infarction|acute myocardial infarction]]. Their cumulative meta-analysis demonstrated that 25 of the 33 trials could have been avoided if a systematic review had been conducted prior to each new trial. In other words, randomizing 34,542 patients was potentially unnecessary. One study<ref name="pmid21200038">{{cite journal |doi=10.7326/0003-4819-154-1-201101040-00007 |pmid=21200038 |title=A Systematic Examination of the Citation of Prior Research in Reports of Randomized, Controlled Trials |journal=Annals of Internal Medicine |volume=154 |issue=1 |pages=50–55 |year=2011 |last1=Robinson |first1=Karen A |last2=Goodman |first2=Steven N |s2cid=207536137 }}</ref> analyzed 1,523 clinical trials included in 227 [[Meta-analysis|meta-analyses]] and concluded that "less than one quarter of relevant prior studies" were cited. They also confirmed earlier findings that most clinical trial reports do not present a systematic review to justify the research or summarize the results.<ref name="pmid21200038" />
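A minimal sketch of the idea behind such cumulative meta-analyses follows: the pooled estimate is recomputed each time a new trial is added, here using simple fixed-effect inverse-variance weighting and purely illustrative numbers:

<syntaxhighlight lang="python">
# Sketch of a cumulative meta-analysis: fixed-effect inverse-variance
# pooling, updated as each trial is added in chronological order.
import math

def cumulative_meta_analysis(trials):
    """trials: list of (effect, standard_error) tuples, e.g. log odds
    ratios, sorted by publication date. Yields the pooled effect and
    its 95% confidence interval after each trial."""
    sum_w = 0.0   # running sum of inverse-variance weights
    sum_we = 0.0  # running sum of weight * effect
    for effect, se in trials:
        w = 1.0 / se ** 2
        sum_w += w
        sum_we += w * effect
        pooled = sum_we / sum_w
        half_ci = 1.96 / math.sqrt(sum_w)
        yield pooled, pooled - half_ci, pooled + half_ci

trials = [(-0.65, 0.40), (-0.40, 0.25), (-0.55, 0.15)]  # illustrative data
for k, (est, lo, hi) in enumerate(cumulative_meta_analysis(trials), 1):
    print(f"after trial {k}: {est:.2f} (95% CI {lo:.2f} to {hi:.2f})")
</syntaxhighlight>

Once the pooled confidence interval clearly excludes the null, additional trials add little to the conclusion – the situation Lau et al. identified retrospectively for streptokinase.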
Many treatments used in modern medicine have been proven to be ineffective, or even harmful. A 2007 study by John Ioannidis found that it took an average of ten years for the medical community to stop referencing popular practices after their efficacy was unequivocally disproven.<ref>{{cite web |last1=Epstein |first1=David |title=When Evidence Says No, but Doctors Say Yes - The Atlantic |url=https://getpocket.com/explore/item/when-evidence-says-no-but-doctors-say-yes |website=Pocket |access-date=10 April 2020}}</ref><ref>{{cite journal |last1=Tatsioni |first1=A |last2=Bonitsis |first2=NG |last3=Ioannidis |first3=JP |title=Persistence of contradicted claims in the literature. |journal=JAMA |date=5 December 2007 |volume=298 |issue=21 |pages=2517–2526 |doi=10.1001/jama.298.21.2517 |pmid=18056905 |issn=1538-3598|doi-access=free }}</ref>
===Psychology===
{{Further|Replication crisis}}
Metascience has revealed significant problems in psychological research. The field suffers from high bias, low [[reproducibility]], and widespread [[misuse of statistics]].<ref>{{cite journal|last1=Franco|first1=Annie|last2=Malhotra|first2=Neil|author-link2=Neil Malhotra|last3=Simonovits|first3=Gabor|s2cid=143182733|date=1 January 2016|title=Underreporting in Psychology Experiments: Evidence From a Study Registry|journal=Social Psychological and Personality Science|language=en|volume=7|issue=1|pages=8–12|doi=10.1177/1948550615598377|issn=1948-5506}}</ref><ref>{{cite journal|last1=Munafò|first1=Marcus|date=29 March 2017|title=Metascience: Reproducibility blues|journal=Nature|language=en|volume=543|issue=7647|pages=619–620|doi=10.1038/543619a|issn=1476-4687|bibcode=2017Natur.543..619M|doi-access=free}}</ref><ref>{{Cite journal | doi=10.1126/science.aav4784 |title = This research group seeks to expose weaknesses in science{{snd}}and they'll step on some toes if they have to|journal = Science|date =20 September 2018|last1 = Stokstad|first1 = Erik|s2cid = 158525979}}</ref> The replication crisis affects [[psychology]] more strongly than any other field; as many as two-thirds of highly publicized findings may be impossible to replicate.<ref>{{cite journal |doi=10.1126/science.aac4716 |pmid=26315443 |title=Estimating the reproducibility of psychological science |journal=Science |volume=349 |issue=6251 |page=aac4716 |year=2015 |url=http://eprints.keele.ac.uk/877/1/Open%20Science%20%28Science%20Pre-Print%29.pdf |author1=Open Science Collaboration |s2cid=218065162 |hdl=10722/230596 |hdl-access=free }}</ref> Meta-research finds that 80–95% of psychological studies support their initial hypotheses, which strongly implies the existence of [[publication bias]].<ref name=":1" />
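The reasoning can be made explicit. If a fraction <math>\pi</math> of tested hypotheses is true, the average [[statistical power]] is <math>1-\beta</math>, and the false-positive rate is <math>\alpha</math>, then the expected share of supportive results under unbiased reporting is

: <math>\Pr(\text{support}) = \pi(1-\beta) + (1-\pi)\,\alpha \,\le\, 1-\beta,</math>

so the support rate cannot exceed the average power. Since the average power of psychological studies has been estimated at well below 80%, observed support rates of 80–95% imply selective analysis or publication.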
The replication crisis has led to renewed efforts to re-test important findings.<ref name="Simmons et al. (2011)">{{cite journal |doi=10.1177/0956797611417632 |pmid=22006061 |title=False-Positive Psychology |journal=Psychological Science |volume=22 |issue=11 |pages=1359–1366 |year=2011 |last1=Simmons |first1=Joseph P. |last2=Nelson |first2=Leif D. |last3=Simonsohn |first3=Uri |doi-access=free }}</ref><ref>{{cite journal |doi=10.1177/1745691613514450 |pmid=26173241 |title=The Alleged Crisis and the Illusion of Exact Replication |journal=Perspectives on Psychological Science |volume=9 |issue=1 |pages=59–71 |year=2014 |last1=Stroebe |first1=Wolfgang |last2=Strack |first2=Fritz |s2cid=31938129 |url=https://pure.rug.nl/ws/files/12588700/postprint_Stroebe_Strack_2014.pdf }}</ref> In response to concerns about [[publication bias]] and [[Data dredging|''p''-hacking]], more than 140 psychology journals have adopted [[Scholarly peer review#Result-blind peer review|result-blind peer review]], in which studies are [[Registered report|pre-registered]] and published without regard for their outcome.<ref>{{cite journal|last=Aschwanden|first=Christie|title=Psychology's Replication Crisis Has Made The Field Better|website=[[FiveThirtyEight]]|date=6 December 2018|access-date=19 December 2018|url=https://fivethirtyeight.com/features/psychologys-replication-crisis-has-made-the-field-better/}}</ref> An analysis of these reforms estimated that 61 percent of result-blind studies produce [[null result]]s, in contrast with 5 to 20 percent in earlier research. This analysis shows that result-blind peer review substantially reduces publication bias.<ref name=":1">{{cite journal |doi=10.31234/osf.io/3czyt |title=Open Science challenges, benefits and tips in early career and beyond |last1=Allen |first1=Christopher P G. |last2=Mehler |first2=David Marc Anton |s2cid=240061030 |url=http://psyarxiv.com/3czyt/ }}</ref>
Psychologists routinely confuse [[statistical significance]] with practical importance, enthusiastically reporting great certainty in unimportant facts.<ref>{{cite journal|last1=Cohen|first1=Jacob|s2cid=380942|year=1994|title=The earth is round (p < .05)|journal=American Psychologist|volume=49|issue=12|pages=997–1003|doi=10.1037/0003-066X.49.12.997}}</ref> With a sufficiently large sample, even a trivially small difference can reach ''p''&nbsp;<&nbsp;.05 while having no practical relevance. Some psychologists have responded with an increased use of [[effect size]] statistics, rather than sole reliance on the [[P value|''p'' values]].{{Citation needed|date=June 2010}}
===Physics===
[[Richard Feynman]] noted that estimates of [[physical constant]]s were closer to published values than would be expected by chance. This was believed to be the result of [[confirmation bias]]: results that agreed with existing literature were more likely to be believed, and therefore published. Physicists now implement blinding to prevent this kind of bias.<ref>{{cite journal |last1=MacCoun |first1=Robert |last2=Perlmutter |first2=Saul |title=Blind analysis: Hide results to seek the truth |journal=Nature |volume=526 |issue=7572 |pages=187–189 |language=en |doi=10.1038/526187a |pmid=26450040 |date=8 October 2015|bibcode=2015Natur.526..187M |doi-access=free }}</ref>
===Computer science===
Web measurement studies are essential for understanding the workings of the modern Web, particularly in the fields of security and privacy. However, these studies often require custom-built or modified crawling setups, leading to a plethora of analysis tools for similar tasks. In a paper by Demir et al., the authors surveyed 117 recent research papers to derive best practices for Web-based measurement studies and establish criteria for reproducibility and replicability. They found that experimental setups and other critical information for reproducing and replicating results are often missing. In a large-scale Web measurement study on 4.5 million pages with 24 different measurement setups, the authors demonstrated the impact of slight differences in experimental setups on the overall results, emphasizing the need for accurate and comprehensive documentation.<ref>{{cite conference |last1=Demir |first1=Nurullah |last2=Große-Kampmann |first2=Matteo |last3=Urban |first3=Tobias |last4=Wressnegger |first4=Christian |last5=Holz |first5=Thorsten |last6=Pohlmann |first6=Norbert |title=Reproducibility and Replicability of Web Measurement Studies |year=2022 |publisher=Association for Computing Machinery |location=New York |url=https://doi.org/10.1145/3485447.3512214 |doi=10.1145/3485447.3512214 |book-title=Proceedings of the ACM Web Conference 2022 |pages=533–544 |series=WWW '22 }}</ref>
==Organizations and institutes==
{{Further|List of metascience research centers}}
There are several organizations and universities across the globe that work on meta-research – these include the Meta-Research Innovation Center Berlin,<ref>{{Cite web|last=Berlin|first=Meta-Research Innovation Center|title=Meta-Research Innovation Center Berlin|url=https://www.metricberlin.bihealth.org/|access-date=2021-12-06|website=Meta-Research Innovation Center Berlin|language=en-us}}</ref> the [[Meta-Research Innovation Center at Stanford]],<ref>{{Cite web|title=Home {{!}} Meta-research Innovation Center at Stanford|url=https://metrics.stanford.edu/|access-date=2021-12-06|website=metrics.stanford.edu}}</ref> the [[Meta-Research Center at Tilburg University]], the Meta-research and Evidence Synthesis Unit at The George Institute for Global Health, India,<ref>{{Cite web|title=Meta-research and Evidence Synthesis Unit|url=https://www.georgeinstitute.org.in/units/meta-research-and-evidence-synthesis-unit|access-date=2021-12-19|website=The George Institute for Global Health|language=en}}</ref><!--the Sheffield Metascience Network https://www.sheffield.ac.uk/is/research/centres/metanet--> and the [[Center for Open Science]]. Organizations that develop tools for metascience include [[Our Research]], the [[Center for Scientific Integrity]] and [[Altmetrics#Adoption|altmetrics companies]]. There is an annual Metascience Conference hosted by the Association for Interdisciplinary Meta-Research and Open Science (AIMOS) and a biannual conference hosted by the Center for Open Science.<ref>{{cite web |title=AIMOS 2022 |url=https://www.eventcreate.com/e/aimos2022 |website=AIMOS 2022 |access-date=20 March 2023}}</ref><ref>{{cite web |title=Metascience 2023 |url=https://metascience.info/ |website=Metascience 2023 Conference |access-date=20 March 2023}}</ref>
== See also ==
{{columns-list|colwidth=18em|
* [[Accelerating change]]
* [[Basic research]]
* [[Citation analysis]]
* [[Epistemology]]
* [[Evidence-based practices]]
* [[Evidence-based medicine]]
* [[Evidence-based policy]]
* [[Further research is needed]]
* [[HARKing]]
* [[Logology (science)]]
* {{slink|Metadata#Science}}
* [[Metatheory]]
* [[Open science]]
* [[Philosophy of science]]
* [[Sociology of scientific knowledge]]
* [[Self-Organized Funding Allocation]]
}}
== References ==
{{Reflist}}
==Further reading==
* Bonett, D. G. (2021). "Design and analysis of replication studies". ''Organizational Research Methods'', 24, 513–529. https://doi.org/10.1177/1094428120911088
* Lydia Denworth, "[https://www.scientificamerican.com/article/the-significant-problem-of-p-values/ A Significant Problem: Standard scientific methods are under fire. Will anything change?]", ''[[Scientific American]]'', vol. 321, no. 4 (October 2019), pp. 62–67.
**"The use of [[p value|''p'' values]] for nearly a century [since 1925] to determine [[statistical significance]] of [[experiment]]al results has contributed to an illusion of [[certainty]] and [to] [[reproducibility|reproducibility crises]] in many [[science|scientific fields]]. There is growing determination to reform statistical analysis... Some [researchers] suggest changing statistical methods, whereas others would do away with a threshold for defining "significant" results." (p. 63.)
* {{cite book | last=Harris | first=Richard | title=Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hopes, and Wastes Billions | year=2017 | url=https://books.google.com/books?id=lk5IDgAAQBAJ |publisher=Basic Books|isbn=978-0465097913}}
* {{cite journal |last1=Fortunato |first1=Santo |last2=Bergstrom |first2=Carl T. |display-authors=et al. |title=Science of science |journal=Science |date=2 March 2018 |volume=359 |issue=6379 |page=eaao0185 |doi=10.1126/science.aao0185 |pmid=29496846 |pmc=5949209 |url=https://www.researchgate.net/publication/323502497}}
==External links==
'''Journals'''
* ''[https://www.springer.com/journal/11024 Minerva: A Journal of Science, Learning and Policy]''
* ''[https://researchintegrityjournal.biomedcentral.com/ Research Integrity and Peer Review]''
* ''[https://www.journals.elsevier.com/research-policy Research Policy]''
* ''[https://web.archive.org/web/20120405101059/http://spp.oxfordjournals.org/ Science and Public Policy]''
'''Conferences'''
* ''[https://metascience.info/ Annual Metascience Conference]''
{{Evidence-based practice}}
{{Science and technology studies}}
{{Meta-prefix}}
{{DEFAULTSORT:Metascience}}
[[Category:Metascience| ]]
[[Category:Epistemology of science]]
[[Category:Ethics and statistics]]
[[Category:Evidence-based practices]]
[[Category:Metatheory of science]]
[[Category:Research]]
[[Category:Science policy]]
[[Category:Scientific method]]' |
New page wikitext, after the edit (new_wikitext ) | '
== History ==
[[File:Ioannidis (2005) Why Most Published Research Findings Are False.pdf|thumb|200px|[[John Ioannidis]] (2005), "[[Why Most Published Research Findings Are False]]"<ref name=Ioannidis2005/>]]
In 1966, an early meta-research paper examined the [[statistical methods]] of 295 papers published in ten high-profile medical journals. It found that, "in almost 73% of the reports read ... conclusions were drawn when the justification for these conclusions was invalid."<ref name="Schor1966">{{cite journal|last1=Schor|first1=Stanley|title=Statistical Evaluation of Medical Journal Manuscripts|journal=JAMA: The Journal of the American Medical Association|volume=195|issue=13|year=1966|pages=1123–1128|issn=0098-7484|doi=10.1001/jama.1966.03100130097026|pmid=5952081}}</ref> In 2005, [[John Ioannidis]] published a paper titled "[[Why Most Published Research Findings Are False]]", which argued that a majority of papers in the medical field produce conclusions that are wrong.<ref name=Ioannidis2005>{{cite journal |last1=Ioannidis |first1=JP |title=Why most published research findings are false. |journal=PLOS Medicine |date=August 2005 |volume=2 |issue=8 |page=e124 |doi=10.1371/journal.pmed.0020124 |pmid=16060722 |pmc=1182327 |doi-access=free }}</ref> The paper went on to become the most downloaded paper in the [[Public Library of Science]]<ref>{{cite web|title = Highly Cited Researchers |url= http://highlycited.com/ |access-date=September 17, 2015}}</ref><ref>[https://profiles.stanford.edu/john-ioannidis Medicine - Stanford Prevention Research Center.] John P.A. Ioannidis</ref> and is considered foundational to the field of metascience.<ref>{{cite news|author=Robert Lee Hotz|title=Most Science Studies Appear to Be Tainted By Sloppy Analysis|url=https://www.wsj.com/articles/SB118972683557627104|newspaper = Wall Street Journal|publisher=Dow Jones & Company|date=September 14, 2007 |access-date=2016-12-05 |author-link=Robert Lee Hotz}}</ref> In a related study with [[Jeremy Howick]] and [[Despina Koletsi]], Ioannidis showed that only a minority of medical interventions are supported by 'high quality' evidence according to [[The Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach]].<ref>Howick J, Koletsi D, Pandis N, Fleming PS, Loef M, Walach H, Schmidt S, Ioannidis JA. The quality of evidence for medical interventions does not improve or worsen: a metaepidemiological study of Cochrane reviews. Journal of Clinical Epidemiology 2020;126:154–159 [https://www.jclinepi.com/article/S0895-4356(20)30777-0/fulltext]</ref> Later meta-research identified widespread difficulty in [[Reproducibility|replicating]] results in many scientific fields, including [[psychology]] and [[Evidence-based medicine|medicine]]. This problem was termed "[[replication crisis|the replication crisis]]". Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.<ref>{{cite journal|year=2014|title=Researching the researchers|journal=Nature Genetics|volume=46|issue=5|page=417|doi=10.1038/ng.2972|issn=1061-4036|pmid=24769715|doi-access=free}}</ref>
Many prominent publishers are interested in meta-research and in improving the quality of their publications. Top journals such as ''[[Science (journal)|Science]]'', ''[[The Lancet]]'' and ''[[Nature (journal)|Nature]]'' provide ongoing coverage of meta-research and problems with reproducibility.<ref name="Enserink20182">{{cite journal|last1=Enserink|first1=Martin|year=2018|title=Research on research|journal=Science|volume=361|issue=6408|pages=1178–1179|doi=10.1126/science.361.6408.1178|issn=0036-8075|pmid=30237336|bibcode=2018Sci...361.1178E|s2cid=206626417}}</ref> In 2012, ''[[PLOS ONE]]'' launched a Reproducibility Initiative. In 2015, [[BioMed Central]] introduced a minimum-standards-of-reporting checklist for four titles.
The first international conference in the broad area of meta-research was the Research Waste/[[EQUATOR Network|EQUATOR]] conference held in Edinburgh in 2015; the first international conference on peer review was the [[Peer Review Congress]] held in 1989.<ref name="Rennie1990">{{cite journal|last1=Rennie|first1=Drummond|title=Editorial Peer Review in Biomedical Publication|journal=JAMA|volume=263|issue=10|year=1990|pages=1317–1441|issn=0098-7484|doi=10.1001/jama.1990.03440100011001|pmid=2304208}}</ref> In 2016, ''[[Research Integrity and Peer Review]]'' was launched. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".<ref name="HarrimanKowalczuk2016">{{cite journal|last1=Harriman|first1=Stephanie L.|last2=Kowalczuk|first2=Maria K.|last3=Simera|first3=Iveta|last4=Wager|first4=Elizabeth|title=A new forum for research on research integrity and peer review|journal=Research Integrity and Peer Review|volume=1|issue=1|page=5|year=2016|issn=2058-8615|doi=10.1186/s41073-016-0010-y|pmid=29451544|pmc=5794038 |doi-access=free }}</ref>
== Fields and topics of meta-research ==
[[File:ToK Simple.jpg|thumb|300px|An exemplary visualization of a conception of scientific [[knowledge]] generation structured by layers, with the "Institution of Science" being the subject of metascience]]
Metascience can be categorized into five major areas of interest: Methods, Reporting, Reproducibility, Evaluation, and Incentives. These correspond, respectively, with how to perform, communicate, verify, evaluate, and reward research.<ref name=Ioannidis2015>{{cite journal |last1=Ioannidis |first1=John P. A. |last2=Fanelli |first2=Daniele |last3=Dunne |first3=Debbie Drake |last4=Goodman |first4=Steven N. |title=Meta-research: Evaluation and Improvement of Research Methods and Practices |journal=PLOS Biology |date=2 October 2015 |volume=13 |issue=10 |page=e1002264 |doi=10.1371/journal.pbio.1002264 |pmid=26431313 |pmc=4592065 |issn=1544-9173 |doi-access=free }}</ref>
=== Methods ===
Metascience seeks to identify poor research practices – including [[bias]]es in research, poor study design and the [[abuse of statistics]] – and to find methods to reduce these practices.<ref name=Ioannidis2015 /> Meta-research has identified numerous biases in scientific literature.<ref>{{cite journal |last1=Fanelli |first1=Daniele |last2=Costas |first2=Rodrigo |last3=Ioannidis |first3=John P. A. |title=Meta-assessment of bias in science |journal=Proceedings of the National Academy of Sciences of the United States of America |date=2017 |volume=114 |issue=14 |pages=3714–3719 |doi=10.1073/pnas.1618569114 |pmid=28320937 |pmc=5389310 |bibcode=2017PNAS..114.3714F |issn=1091-6490|doi-access=free }}</ref> Of particular note is the widespread [[misuse of p-values|misuse of ''p''-values]] and abuse of [[statistical significance]].<ref>{{cite journal |last1=Check Hayden |first1=Erika |title=Weak statistical standards implicated in scientific irreproducibility |url=https://www.nature.com/news/weak-statistical-standards-implicated-in-scientific-irreproducibility-1.14131 |journal=Nature |access-date=9 May 2019 |language=en |doi=10.1038/nature.2013.14131|year=2013 |s2cid=211729036 |doi-access=free }}</ref>
==== Scientific data science ====
Scientific data science is the use of [[data science]] to analyse research papers. It encompasses both [[Qualitative method|qualitative]] and [[Quantitative method|quantitative]] methods. Research in scientific data science includes [[fraud detection]]<ref>{{cite journal
| last1 = Markowitz
| first1 = David M.
| last2 = Hancock
| first2 = Jeffrey T.
| s2cid = 146174471
| date = 2016
| title = Linguistic obfuscation in fraudulent science
| journal = Journal of Language and Social Psychology
| volume = 35
| issue = 4
| pages = 435–445
| doi = 10.1177/0261927X15614605
}}</ref> and [[citation network]] analysis.<ref>{{cite journal
| last = Ding
| first = Y.
| s2cid = 3752804
| date = 2010
| title = Applying weighted PageRank to author citation networks
| journal = Journal of the American Society for Information Science and Technology
| volume = 62
| issue = 2
| pages = 236–245
| doi = 10.1002/asi.21452
| arxiv= 1102.1760
}}</ref>
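To illustrate the citation-network strand of such work, the sketch below runs PageRank over a toy weighted citation graph using the NetworkX library; the papers and edge weights are invented for the example.
<syntaxhighlight lang="python">
import networkx as nx

# Toy citation network: an edge A -> B means paper A cites paper B.
# Weights stand in for how strongly the cited work informs the citing
# one (e.g. as estimated from the citation context).
G = nx.DiGraph()
G.add_weighted_edges_from([
    ("paper_A", "paper_B", 1.0),
    ("paper_A", "paper_C", 0.2),
    ("paper_B", "paper_C", 0.8),
    ("paper_D", "paper_B", 1.0),
    ("paper_D", "paper_C", 0.5),
])

# Weighted PageRank yields an influence score per paper.
scores = nx.pagerank(G, alpha=0.85, weight="weight")
for paper, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
</syntaxhighlight>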
==== Journalology ====
{{Main|Journalology}}
Journalology, also known as publication science, is the scholarly study of all aspects of the [[academic publishing]] process.<ref>{{Cite journal |last1=Galipeau |first1=James |last2=Moher |first2=David |last3=Campbell |first3=Craig |last4=Hendry |first4=Paul |last5=Cameron |first5=D. William |last6=Palepu |first6=Anita |last7=Hébert |first7=Paul C. |date=March 2015 |title=A systematic review highlights a knowledge gap regarding the effectiveness of health-related training programs in journalology |journal=Journal of Clinical Epidemiology |language=en |volume=68 |issue=3 |pages=257–265 |doi=10.1016/j.jclinepi.2014.09.024|pmid=25510373 |doi-access=free }}</ref><ref>{{Cite journal |last1=Wilson |first1=Mitch |last2=Moher |first2=David |date=March 2019 |title=The Changing Landscape of Journalology in Medicine |journal=Seminars in Nuclear Medicine |language=en |volume=49 |issue=2 |pages=105–114 |doi=10.1053/j.semnuclmed.2018.11.009|pmid=30819390 |hdl=10393/38493 |s2cid=73471103 |hdl-access=free }}</ref> The field seeks to improve the quality of scholarly research by implementing [[evidence-based practices]] in academic publishing.<ref name= Sciencemag1>{{Cite journal | doi=10.1126/science.aav4758 |title = 'Journalologists' use scientific methods to study academic publishing. Is their work improving science?|journal = Science|date = 18 September 2018|last1 = Couzin-Frankel|first1 = Jennifer|s2cid = 115360831}}</ref> The term "journalology" was coined by [[Stephen Lock]], the former [[editor-in-chief]] of ''[[The BMJ]]''. The first Peer Review Congress, held in 1989 in [[Chicago]], [[Illinois]], is considered a pivotal moment in the founding of journalology as a distinct field.<ref name= Sciencemag1/> The field of journalology has been influential in pushing for study [[pre-registration (science)|pre-registration]] in science, particularly in [[clinical trials]]. [[Clinical trial registration|Clinical-trial registration]] is now expected in most countries.<ref name= Sciencemag1 />
=== Reporting ===
Meta-research has identified poor practices in reporting, explaining, disseminating and popularizing research, particularly within the social and health sciences. Poor reporting makes it difficult to accurately interpret the results of scientific studies, to [[Reproducibility|replicate]] studies, and to identify biases and conflicts of interest among authors. Solutions include the implementation of reporting standards and greater transparency in scientific studies (including better requirements for disclosure of conflicts of interest). There is an attempt to standardize the reporting of data and methodology through the creation of guidelines by reporting agencies such as [[Consolidated Standards of Reporting Trials|CONSORT]] and the larger [[EQUATOR Network]].<ref name=Ioannidis2015 />
=== Reproducibility ===
{{Further|Replication crisis|Reproducibility}}
[[File:Barriers to conducting replications of experiment in cancer research.jpg|thumb|Barriers to conducting replications of experiment in cancer research, ''[[Reproducibility Project|The Reproducibility Project]]: Cancer Biology'']]
The replication crisis is an ongoing [[methodological]] crisis in which it has been found that many scientific studies are difficult or impossible to [[reproducibility|replicate]].<ref>{{Cite journal | doi = 10.1038/515009a| title = Metascience could rescue the 'replication crisis'| journal = Nature| volume = 515| issue = 7525| page = 9| year = 2014| last1 = Schooler | first1 = J. W.| pmid=25373639| bibcode = 2014Natur.515....9S| doi-access = free}}</ref><ref name="Why 'Statistical Significance' Is Often Insignificant">{{cite news|last1=Smith|first1=Noah|title=Why 'Statistical Significance' Is Often Insignificant|url=https://www.bloomberg.com/view/articles/2017-11-02/why-statistical-significance-is-often-insignificant|newspaper=Bloomberg.com|date=2 November 2017|access-date=7 November 2017}}</ref> While the crisis has its roots in the meta-research of the mid- to late 20th century, the phrase "replication crisis" was not coined until the early 2010s<ref name="ReferenceA">{{Cite journal |last1=Pashler |first1=Harold |last2=Wagenmakers |first2=Eric Jan |year=2012 |title=Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? |journal=Perspectives on Psychological Science |volume=7 |issue=6 |pages=528–530 |doi=10.1177/1745691612465253 |pmid=26168108 |s2cid=26361121}}</ref> as part of a growing awareness of the problem.<ref name=Ioannidis2015 /> The replication crisis has been closely studied in [[psychology]] (especially [[social psychology]]) and [[medicine]],<ref>{{cite magazine|url=http://www.newyorker.com/tech/elements/the-crisis-in-social-psychology-that-isnt|title=The Crisis in Social Psychology That Isn't|author=Gary Marcus|magazine=The New Yorker|date=May 1, 2013}}</ref><ref>{{cite magazine|url=http://www.newyorker.com/magazine/2010/12/13/the-truth-wears-off|title=The Truth Wears Off|author=Jonah Lehrer|magazine=The New Yorker|date=December 13, 2010}}</ref> including cancer research.<ref>{{cite news |title=Dozens of major cancer studies can't be replicated |url=https://www.sciencenews.org/article/cancer-biology-studies-research-replication-reproducibility |access-date=19 January 2022 |work=Science News |date=7 December 2021}}</ref><ref>{{cite web |title=Reproducibility Project: Cancer Biology |url=https://www.cos.io/rpcb |website=www.cos.io |publisher=[[Center for Open Science]] |access-date=19 January 2022 |language=en}}</ref> Replication is an essential part of the scientific process, and the widespread failure of replication puts into question the reliability of affected fields.<ref>Staddon, John (2017) Scientific Method: How science works, fails to work or pretends to work. Taylor and Francis.</ref>
Moreover, replications of research (or failures to replicate) are considered less influential than original research and are less likely to be published in many fields. This discourages researchers from reporting, and even from attempting, replication studies.<ref>{{Cite journal|last=Yeung|first=Andy W. K.|date=2017|title=Do Neuroscience Journals Accept Replications? A Survey of Literature|journal=Frontiers in Human Neuroscience|language=en|volume=11|page=468|doi=10.3389/fnhum.2017.00468|pmid=28979201|pmc=5611708|issn=1662-5161|doi-access=free}}</ref><ref>{{Cite journal|last1=Martin|first1=G. N.|last2=Clarke|first2=Richard M.|date=2017|title=Are Psychology Journals Anti-replication? A Snapshot of Editorial Practices|journal=Frontiers in Psychology|language=en|volume=8|page=523|doi=10.3389/fpsyg.2017.00523|pmid=28443044|pmc=5387793|issn=1664-1078|doi-access=free}}</ref>
=== Evaluation and incentives ===
{{See also|Academic journal#Prestige and ranking}}
Metascience seeks to create a scientific foundation for peer review. Meta-research evaluates [[Scholarly peer review|peer review]] systems including [[Scholarly peer review#Pre-publication peer review|pre-publication]] peer review, [[Scholarly peer review#Post-publication peer review|post-publication]] peer review, and [[open peer review]]. It also seeks to develop better research funding criteria.<ref name=Ioannidis2015 />
Metascience seeks to promote better research through better incentive systems. This includes studying the accuracy, effectiveness, costs, and benefits of different approaches to ranking and evaluating research and those who perform it.<ref name=Ioannidis2015 /> Critics argue that [[perverse incentives]] have created a [[publish or perish|publish-or-perish]] environment in academia which promotes the production of [[junk science]], low-quality research, and [[false positives]].<ref>{{Cite book|last=Binswanger|first=Mathias|chapter=How Nonsense Became Excellence: Forcing Professors to Publish|date=2015|work=Incentives and Performance: Governance of Research Organizations|pages=19–32|editor-last=Welpe|editor-first=Isabell M.|publisher=Springer International Publishing|language=en|doi=10.1007/978-3-319-09785-5_2|isbn=978-3319097855|editor2-last=Wollersheim|editor2-first=Jutta|editor3-last=Ringelhan|editor3-first=Stefanie|editor4-last=Osterloh|editor4-first=Margit|title=Incentives and Performance|s2cid=110698382 }}</ref><ref>{{Cite journal|last1=Edwards|first1=Marc A.|last2=Roy|first2=Siddhartha|date=2016-09-22|title=Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition|journal=Environmental Engineering Science|volume=34|issue=1|pages=51–61|doi=10.1089/ees.2016.0223|pmc=5206685|pmid=28115824}}</ref> According to [[Brian Nosek]], "The problem that we face is that the incentive system is focused almost entirely on getting research published, rather than on getting research right."<ref>{{cite web |last1=Brookshire |first1=Bethany |title=Blame bad incentives for bad science |url=https://www.sciencenews.org/blog/scicurious/blame-bad-incentives-bad-science |website=Science News |access-date=11 July 2019 |language=en |date=21 October 2016}}</ref> Proponents of reform seek to structure the incentive system to favor higher-quality results.<ref>{{cite journal |last1=Smaldino |first1=Paul E. |last2=McElreath |first2=Richard |title=The natural selection of bad science |journal=Royal Society Open Science |volume=3 |issue=9 |page=160384 |doi=10.1098/rsos.160384 |pmid=27703703 |pmc=5043322 |language=en|arxiv=1605.09511 |bibcode=2016RSOS....360384S |year=2016 }}</ref> For example, quality could be judged on the basis of narrative expert evaluations ("rather than [only or mainly] indices"), institutional evaluation criteria, guarantees of transparency, and professional standards.<ref name="10.1098/rspb.2019.2047">{{cite journal |last1=Chapman |first1=Colin A. |last2=Bicca-Marques |first2=Júlio César |last3=Calvignac-Spencer |first3=Sébastien |last4=Fan |first4=Pengfei |last5=Fashing |first5=Peter J. |last6=Gogarten |first6=Jan |last7=Guo |first7=Songtao |last8=Hemingway |first8=Claire A. |last9=Leendertz |first9=Fabian |last10=Li |first10=Baoguo |last11=Matsuda |first11=Ikki |last12=Hou |first12=Rong |last13=Serio-Silva |first13=Juan Carlos |last14=Chr. Stenseth |first14=Nils |title=Games academics play and their consequences: how authorship, h -index and journal impact factors are shaping the future of academia |journal=Proceedings of the Royal Society B: Biological Sciences |date=4 December 2019 |volume=286 |issue=1916 |pages=20192047 |doi=10.1098/rspb.2019.2047 |pmid=31797732 |pmc=6939250 |s2cid=208605640 |language=en |issn=0962-8452}}</ref>
====Contributorship====
Studies have proposed machine-readable standards and (a taxonomy of) [[Digital badge|badge]]s for science publication management systems that focus on contributorship – who has contributed what and how much of the research labor – rather than on the traditional concept of plain [[academic authorship|authorship]] – who was involved in any way in the creation of a publication.<ref>{{cite journal |last1=Holcombe |first1=Alex O. |title=Contributorship, Not Authorship: Use CRediT to Indicate Who Did What |journal=Publications |date=September 2019 |volume=7 |issue=3 |page=48 |doi=10.3390/publications7030048 |language=en|doi-access=free }}</ref><ref>{{cite journal |last1=McNutt |first1=Marcia K. |last2=Bradford |first2=Monica |last3=Drazen |first3=Jeffrey M. |last4=Hanson |first4=Brooks |last5=Howard |first5=Bob |last6=Jamieson |first6=Kathleen Hall |last7=Kiermer |first7=Véronique |last8=Marcus |first8=Emilie |last9=Pope |first9=Barbara Kline |last10=Schekman |first10=Randy |last11=Swaminathan |first11=Sowmya |last12=Stang |first12=Peter J. |last13=Verma |first13=Inder M. |title=Transparency in authors' contributions and responsibilities to promote integrity in scientific publication |journal=Proceedings of the National Academy of Sciences |date=13 March 2018 |volume=115 |issue=11 |pages=2557–2560 |doi=10.1073/pnas.1715374115 |pmid=29487213 |pmc=5856527 |bibcode=2018PNAS..115.2557M |language=en |issn=0027-8424|doi-access=free }}</ref><ref>{{cite journal |last1=Brand |first1=Amy |last2=Allen |first2=Liz |last3=Altman |first3=Micah |last4=Hlava |first4=Marjorie |last5=Scott |first5=Jo |title=Beyond authorship: attribution, contribution, collaboration, and credit |journal=Learned Publishing |date=1 April 2015 |volume=28 |issue=2 |pages=151–155 |doi=10.1087/20150211 |s2cid=45167271 |url=https://www.researchgate.net/publication/274098676|doi-access=free }}</ref><ref>{{cite journal |last1=Singh Chawla |first1=Dalmeet |title=Digital badges aim to clear up politics of authorship |journal=Nature |date=October 2015 |volume=526 |issue=7571 |pages=145–146 |doi=10.1038/526145a |pmid=26432249 |bibcode=2015Natur.526..145S |s2cid=256770827 |language=en |issn=1476-4687|doi-access=free }}</ref> A study pointed out one of the problems associated with the ongoing neglect of such contribution nuances – it found that "the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers".<ref name="10.1093/gigascience/giz053">{{cite journal |last1=Fire |first1=Michael |last2=Guestrin |first2=Carlos |title=Over-optimization of academic publishing metrics: observing Goodhart's Law in action |journal=GigaScience |date=1 June 2019 |volume=8 |issue=6 |page=giz053 |doi=10.1093/gigascience/giz053|pmid=31144712 |pmc=6541803 }}</ref>
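A machine-readable contributorship record could look like the following sketch, which borrows role names from the CRediT (Contributor Roles Taxonomy); the record format itself is an illustrative assumption rather than a published standard.
<syntaxhighlight lang="python">
# Hypothetical contributorship records using CRediT role names.
contributions = [
    {"author": "A. Author", "roles": ["Conceptualization", "Writing – original draft"]},
    {"author": "B. Author", "roles": ["Data curation", "Formal analysis", "Software"]},
    {"author": "C. Author", "roles": ["Supervision", "Funding acquisition"]},
]

# Unlike a plain author list, such records can be queried by role.
analysts = [c["author"] for c in contributions if "Formal analysis" in c["roles"]]
print(analysts)  # ['B. Author']
</syntaxhighlight>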
====Assessment factors====
Factors other than a submission's merits can substantially influence peer reviewers' evaluations.<ref name="10.1177/2515245919895419">{{cite journal |last1=Elson |first1=Malte |last2=Huff |first2=Markus |last3=Utz |first3=Sonja |title=Metascience on Peer Review: Testing the Effects of a Study's Originality and Statistical Significance in a Field Experiment |journal=Advances in Methods and Practices in Psychological Science |date=1 March 2020 |volume=3 |issue=1 |pages=53–65 |doi=10.1177/2515245919895419 |s2cid=212778011 |language=en |issn=2515-2459|url=https://psyarxiv.com/gyds8/ }}</ref> Some such factors may nevertheless be legitimate – for example, track records of the veracity of a researcher's prior publications and of their alignment with public interests. At the same time, evaluation systems – including those of peer review – may substantially lack mechanisms and criteria oriented towards merit, real-world positive impact, progress and public usefulness, rather than towards analytical indicators such as the number of citations or altmetrics, even though such indicators can be used as partial proxies for those ends.<ref>{{cite journal |last1=McLean |first1=Robert K D |last2=Sen |first2=Kunal |title=Making a difference in the real world? A meta-analysis of the quality of use-oriented research using the Research Quality Plus approach |journal=Research Evaluation |date=1 April 2019 |volume=28 |issue=2 |pages=123–135 |doi=10.1093/reseval/rvy026}}</ref><ref>{{cite web |title=Bringing Rigor to Relevant Questions: How Social Science Research Can Improve Youth Outcomes in the Real World |url=http://wtgrantfoundation.org/library/uploads/2017/05/How-Social-Science-Research-Can-Improve-Youth-Outcomes_WTG2017.pdf |access-date=22 November 2021}}</ref> Rethinking the academic reward structure "to offer more formal recognition for intermediate products, such as data" could have positive impacts and reduce data withholding.<ref>{{cite journal |last1=Fecher |first1=Benedikt |last2=Friesike |first2=Sascha |last3=Hebing |first3=Marcel |last4=Linek |first4=Stephanie |title=A reputation economy: how individual reward considerations trump systemic arguments for open access to data |journal=Palgrave Communications |date=20 June 2017 |volume=3 |issue=1 |pages=1–10 |doi=10.1057/palcomms.2017.51 |s2cid=34449408 |language=en |issn=2055-1045|hdl=11108/308 |hdl-access=free }}</ref>
====Recognition of training====
A commentary noted that academic rankings do not consider where (in which country and at which institute) the respective researchers were trained.<ref>{{cite journal |last1=La Porta |first1=Caterina AM |last2=Zapperi |first2=Stefano |title=America's top universities reap the benefit of Italian-trained scientists |url=https://www.nature.com/articles/d43978-022-00163-5 |journal=Nature Italy |access-date=18 December 2022 |language=en |doi=10.1038/d43978-022-00163-5 |date=1 December 2022|s2cid=254331807 }}</ref>
====Scientometrics====
{{Main|Scientometrics}}
Scientometrics concerns itself with measuring [[bibliometrics|bibliographic data]] in scientific publications. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.<ref name="ScientometricsLeydesdorff">[[Loet Leydesdorff|Leydesdorff, L.]] and Milojevic, S., "Scientometrics" [https://arxiv.org/abs/1208.4566 arXiv:1208.4566] (2013), forthcoming in: Lynch, M. (editor), ''International Encyclopedia of Social and Behavioral Sciences'' subsection 85030. (2015)</ref> Studies suggest that "metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades" and have to some degree "ceased" to be good measures,<ref name="10.1093/gigascience/giz053"/> leading to issues such as "overproduction, unnecessary fragmentations, overselling, predatory journals (pay and publish), clever plagiarism, and deliberate obfuscation of scientific results so as to sell and oversell".<ref name="publishless">{{Cite arXiv|last1=Singh |first1=Navinder |title=Plea to publish less |date=8 October 2021|class=physics.soc-ph |eprint=2201.07985 }}</ref>
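Such citation-based metrics are typically simple functions of per-paper citation counts; for instance, the widely used [[h-index]] can be computed as in the following minimal sketch (the citation counts are invented for the example).
<syntaxhighlight lang="python">
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Illustrative citation counts for one author's papers.
print(h_index([25, 8, 5, 3, 3, 1]))  # prints 3
</syntaxhighlight>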
Novel tools in this area include systems that quantify how much the cited node informs the citing node.<ref>{{cite book |last1=Manchanda |first1=Saurav |last2=Karypis |first2=George |title=Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing |chapter=Evaluating Scholarly Impact: Towards Content-Aware Bibliometrics |date=November 2021 |pages=6041–6053 |doi=10.18653/v1/2021.emnlp-main.488 |publisher=Association for Computational Linguistics|s2cid=243865632 |doi-access=free }}</ref> This can be used to convert an unweighted citation network into a weighted one, which can then be used for [[importance]] assessment – deriving "impact metrics for the various entities involved, like the publications, authors etc"<ref>{{cite web |last1=Manchanda |first1=Saurav |last2=Karypis |first2=George |title=Importance Assessment in Scholarly Networks |url=http://ceur-ws.org/Vol-2831/paper17.pdf}}</ref> – as well as, among other tools, for search engines and [[recommendation system]]s.
==== Science governance ====
{{See also|Science policy|Science of science policy}}
[[Science funding]] and [[science governance]] can also be explored and informed by metascience.<ref name="10.1007/s11016-020-00581-5">{{cite journal |last1=Nielsen |first1=Kristian H. |title=Science and public policy |journal=Metascience |date=1 March 2021 |volume=30 |issue=1 |pages=79–81 |doi=10.1007/s11016-020-00581-5| pmc=7605730 |s2cid=226237994 |language=en |issn=1467-9981}}</ref>
===== Incentives =====
Various interventions, such as the [[prioritization]] of certain lines of research and development, can be important. {{anchor|Differential R&D}}For instance, the concept of [[differential technological development]] refers to deliberately developing technologies – e.g. control-, safety- and policy-technologies versus [[biosafety|risky biotechnologies]] – at different precautionary paces to decrease risks, mainly [[global catastrophic risk]], by influencing the sequence in which technologies are developed.<ref>{{cite book|last=Bostrom|first=Nick|title=Superintelligence: Paths, Dangers, Strategies|date=2014|publisher=Oxford University Press|isbn=978-0199678112|location=Oxford|pages=229–237}}</ref><ref>{{Cite book|last=Ord|first=Toby|title=The Precipice: Existential Risk and the Future of Humanity|publisher=[[Bloomsbury Publishing]]|year=2020|isbn=978-1526600219|location=United Kingdom|page=200}}</ref> Relying only on established forms of legislation and incentives to ensure the right outcomes may not be adequate, as these may often be too slow<ref>{{cite web |title=Technology is changing faster than regulators can keep up - here's how to close the gap |url=https://www.weforum.org/agenda/2018/06/law-too-slow-for-new-tech-how-keep-up/ |website=World Economic Forum |date=21 June 2018 |access-date=27 January 2022 |language=en}}</ref> or inappropriate.
Other incentives to govern science and related processes, including via metascience-based reforms, may include ensuring accountability to the public (in terms of, for example, the accessibility of research – especially of publicly funded research – or of whether research topics of public interest are addressed in serious manners), increasing the qualified productive scientific workforce, improving the efficiency of science to improve [[problem-solving]] in general, and facilitating that unambiguous societal needs based on solid scientific evidence – such as those concerning human physiology – are adequately prioritized and addressed. Such interventions, incentives and intervention-designs can be subjects of metascience.
===== Science funding and awards =====
{{See also|Patent#Criticism}}
[[File:Locations of papers in a map of science and locations of the key papers for Nobel prizes.tif|thumb|right|200px|Cluster network of scientific publications in relation to Nobel prizes]]
[[File:Funding for climate research in the natural and technical sciences versus the social sciences and humanities.jpg|thumb|Funding for climate research in the natural and technical sciences versus the social sciences and humanities<ref>{{cite journal |last1=Overland |first1=Indra |last2=Sovacool |first2=Benjamin K. |title=The misallocation of climate research funding |journal=Energy Research & Social Science |date=1 April 2020 |volume=62 |pages=101349 |doi=10.1016/j.erss.2019.101349 |s2cid=212789228 |language=en |issn=2214-6296}}</ref>]]
Scientific awards are one category of science incentives. Metascience can explore existing and hypothetical systems of science awards. For instance, it has found that work honored by [[Nobel prize]]s clusters in only a few [[scientific fields]]: of the 114 domains (DC2 classification) and 849 domains (DC3 classification) into which science could be divided, only 36 and 71, respectively, had received at least one Nobel prize. Five of the 114 domains were shown to make up over half of the Nobel prizes awarded 1995–2017 (particle physics [14%], cell biology [12.1%], atomic physics [10.9%], neuroscience [10.1%], molecular chemistry [5.3%]).<ref name="nobelprizes">{{cite news |title=Nobel prize-winning work is concentrated in minority of scientific fields |url=https://phys.org/news/2020-07-nobel-prize-winning-minority-scientific-fields.html |access-date=17 August 2020 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Ioannidis |first1=John P. A. |last2=Cristea |first2=Ioana-Alina |last3=Boyack |first3=Kevin W. |title=Work honored by Nobel prizes clusters heavily in a few scientific fields |journal=PLOS ONE |date=29 July 2020 |volume=15 |issue=7 |page=e0234612 |doi=10.1371/journal.pone.0234612 |pmid=32726312 |pmc=7390258 |bibcode=2020PLoSO..1534612I |language=en |issn=1932-6203|doi-access=free }}</ref>
A study found that the delegation of responsibility for knowledge production and appropriate funding by [[policy]]-makers to science – a centralized, authority-based, top-down approach in which science is then expected to somehow deliver "reliable and useful knowledge to society" – is too simple a model.<ref name="10.1007/s11016-020-00581-5"/>
Measurements show that the allocation of bio-medical resources can be more strongly correlated with previous allocations and research than with the [[DALY|burden of diseases]].<ref name="10.1126/science.aao0185"/>
A study suggests that "[i]f peer review is maintained as the primary mechanism of arbitration in the competitive selection of research reports and funding, then the [[scientific community]] needs to make sure it is not arbitrary".<ref name="10.1177/2515245919895419"/>
Studies indicate that there is a need to "reconsider how we measure success" {{see below|[[#Factors of success and progress]]}}.<ref name="10.1093/gigascience/giz053"/>
;Funding data
Funding information from grant databases and funding acknowledgment sections can be sources of data for scientometrics studies, e.g. for investigating or recognition of the impact of funding entities on the development of science and technology.<ref>{{cite journal |last1=Fajardo-Ortiz |first1=David |last2=Hornbostel |first2=Stefan |last3=Montenegro de Wit |first3=Maywa |last4=Shattuck |first4=Annie |title=Funding CRISPR: Understanding the role of government and philanthropic institutions in supporting academic research within the CRISPR innovation system |journal=Quantitative Science Studies |date=22 June 2022 |volume=3 |issue=2 |pages=443–456 |doi=10.1162/qss_a_00187|s2cid=235266330 }}</ref>
=====Research questions and coordination=====
{{main|Research question#Aggregated research questions and coordination}}
=====Risk governance=====
{{main|Biosecurity#Future}}
=== Science communication and public use ===
{{See also|Science#Science and the public}}
It has been argued that "science has two fundamental attributes that underpin its value as a global public good: that knowledge claims and the evidence on which they are based are made openly available to scrutiny, and that the results of scientific research are communicated promptly and efficiently".<ref name="ISC1">{{cite web |title=Science as a Global Public Good |url=https://council.science/current/news/science-as-a-global-public-good/ |website=[[International Science Council]] |access-date=22 November 2021 |date=8 October 2021}}</ref> Metascientific research is exploring topics of [[science communication]] such as [[media coverage of science]], [[science journalism]] and online communication of results by science educators and scientists.<ref>{{cite book |last1=Jamieson |first1=Kathleen Hall |last2=Kahan |first2=Dan |last3=Scheufele |first3=Dietram A. |title=The Oxford Handbook of the Science of Science Communication |date=17 May 2017 |url=https://books.google.com/books?id=hSgmDwAAQBAJ&pg=PA51 |publisher=Oxford University Press |isbn=978-0190497637 |language=en}}</ref><ref>{{cite journal |last1=Grochala |first1=Rafał |title=Science communication in online media: influence of press releases on coverage of genetics and CRISPR |url=https://www.biorxiv.org/content/10.1101/2019.12.13.875278v1.abstract |page= |language=en |doi=10.1101/2019.12.13.875278 |date=16 December 2019 |s2cid=213125031 }}</ref><ref>{{cite journal |title=Framing Analysis of News Coverage on Renewable Energy in The Star Online News Portal |url=http://jestec.taylors.edu.my/Special%20Issue%20on%20SU18/SU18_04.pdf |access-date=22 November 2021}}</ref><ref>{{cite journal |last1=MacLaughlin |first1=Ansel |last2=Wihbey |first2=John |last3=Smith |first3=David |title=Predicting News Coverage of Scientific Articles |journal=Proceedings of the International AAAI Conference on Web and Social Media |date=15 June 2018 |volume=12 |issue=1 |doi=10.1609/icwsm.v12i1.14999 |s2cid=49412893 |url=https://ojs.aaai.org/index.php/ICWSM/article/view/14999 |language=en |issn=2334-0770|doi-access=free }}</ref> A study found that the "main incentive academics are offered for using social media is amplification" and that it should be "moving towards an institutional culture that focuses more on how these [or such] platforms can facilitate real engagement with research".<ref>{{cite journal |last1=Carrigan |first1=Mark |last2=Jordan |first2=Katy |title=Platforms and Institutions in the Post-Pandemic University: a Case Study of Social Media and the Impact Agenda |journal=Postdigital Science and Education |date=4 November 2021 |volume=4 |issue=2 |pages=354–372 |doi=10.1007/s42438-021-00269-x |s2cid=243760357 |language=en |issn=2524-4868|doi-access=free }}</ref> Science communication may also involve the communication of societal needs, concerns and requests to scientists.
====Alternative metrics tools====
{{anchor|altmetrics}}Alternative metrics tools can be used not only to help in assessment (of performance and impact)<ref name="10.1126/science.aao0185"/> and findability, but also to aggregate many of the public discussions about a scientific paper in social media such as [[reddit]], [[Science information on Wikipedia|citations on Wikipedia]], and reports about the study in the news media, which can then in turn be analyzed in metascience or provided and used by related tools.<ref>{{cite journal |last1=Baykoucheva |first1=Svetla |title=Measuring attention |journal=Managing Scientific Information and Research Data |date=2015 |pages=127–136 |doi=10.1016/B978-0-08-100195-0.00014-7 |isbn=978-0081001950 |url=https://www.researchgate.net/publication/280568406}}</ref> In terms of assessment and findability, altmetrics rate publications' performance or impact by the interactions they receive through social media or other online platforms,<ref name="10.1162/qss_a_00171">{{cite journal |last1=Zagorova |first1=Olga |last2=Ulloa |first2=Roberto |last3=Weller |first3=Katrin |last4=Flöck |first4=Fabian |title="I updated the <ref>": The evolution of references in the English Wikipedia and the implications for altmetrics |journal=Quantitative Science Studies |date=12 April 2022 |volume=3 |issue=1 |pages=147–173 |doi=10.1162/qss_a_00171|s2cid=222177064 }}</ref> which can, for example, be used for sorting recent studies by measured impact, including before other studies cite them. The specific procedures of established altmetrics are not transparent,<ref name="10.1162/qss_a_00171"/> and the algorithms used cannot be customized or altered by the user in the way [[open source software]] can. A study described various limitations of altmetrics and pointed "toward avenues for continued research and development".<ref>{{cite journal |last1=Williams |first1=Ann E. |title=Altmetrics: an overview and evaluation |journal=Online Information Review |date=12 June 2017 |volume=41 |issue=3 |pages=311–317 |doi=10.1108/OIR-10-2016-0294}}</ref> They are also limited in their use as a primary tool for researchers to find constructive feedback they have received.
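In spirit, an altmetrics-style score is a weighted aggregation of interaction counts across platforms; the toy sketch below makes that concrete – the platforms and weights are arbitrary assumptions and do not reproduce any provider's actual, non-transparent algorithm.
<syntaxhighlight lang="python">
# Invented interaction counts for one paper across platforms.
mentions = {"news": 3, "wikipedia": 1, "reddit": 12, "blogs": 2}

# Invented per-platform weights.
weights = {"news": 8.0, "wikipedia": 3.0, "reddit": 0.25, "blogs": 5.0}

score = sum(weights[platform] * count for platform, count in mentions.items())
print(score)  # 3*8.0 + 1*3.0 + 12*0.25 + 2*5.0 = 40.0
</syntaxhighlight>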
====Societal implications and applications====
It has been suggested that it may benefit science if "intellectual exchange—particularly regarding the societal implications and applications of science and technology—are better appreciated and incentivized in the future".<ref name="10.1126/science.aao0185"/>
====Knowledge integration====
Primary studies "without context, comparison or summary are ultimately of limited value" and various types{{additional citation needed|date=January 2023}} of research syntheses and summaries integrate primary studies.<ref name="10.1038/nature25753"/> Progress in key social-ecological challenges of the global environmental agenda is "hampered by a lack of [[knowledge integration|integration]] and synthesis of existing scientific evidence", with a "fast-increasing volume of data", compartmentalized information and generally unmet evidence synthesis challenges.<ref>{{cite journal |last1=Balbi |first1=Stefano |last2=Bagstad |first2=Kenneth J. |last3=Magrach |first3=Ainhoa |last4=Sanz |first4=Maria Jose |last5=Aguilar-Amuchastegui |first5=Naikoa |last6=Giupponi |first6=Carlo |last7=Villa |first7=Ferdinando |title=The global environmental agenda urgently needs a semantic web of knowledge |journal=Environmental Evidence |date=17 February 2022 |volume=11 |issue=1 |pages=5 |doi=10.1186/s13750-022-00258-y |s2cid=246872765 |language=en |issn=2047-2382|hdl=10278/5023700 |hdl-access=free |doi-access=free }}</ref> According to Khalil, researchers are facing the problem of [[information overload|too many papers]] – e.g. in March 2014 more than 8,000 papers were submitted to [[arXiv]] – and to "keep up with the huge amount of literature, researchers use reference manager software, they make summaries and [[note-taking|notes]], and they rely on review papers to provide an overview of a particular topic". He notes that review papers are usually (only)" for topics in which many papers were written already, and they can get outdated quickly" and suggests "wiki-review papers" that get continuously updated with new studies on a topic and summarize many studies' results and suggest future research.<ref name="10.1007/978-3-319-20717-9_11"/> A study suggests that if a scientific publication is being cited in a Wikipedia article this could potentially be considered as an indicator of some form of impact for this publication,<ref name="10.1162/qss_a_00171"/> for example as this may, over time, indicate that the reference has contributed to a high-level of summary of the given topic.
====Science journalism====
[[Science journalism|Science journalists]] play an important role in the scientific ecosystem and in science communication to the public and need to "know how to use relevant information when deciding whether to trust a research finding, and whether and how to report on it", vetting the findings that get transmitted to the public.<ref>{{cite web |title=How Do Science Journalists Evaluate Psychology Research? |url=https://psyarxiv.com/26kr3/ |website=psyarxiv.com}}</ref>
=== Science education ===
Some studies investigate [[science education]] – e.g. the teaching of selected [[scientific controversy|scientific controversies]]<ref>{{cite journal |last1=Dunlop |first1=Lynda |last2=Veneu |first2=Fernanda |title=Controversies in Science |journal=Science & Education |date=1 September 2019 |volume=28 |issue=6 |pages=689–710 |doi=10.1007/s11191-019-00048-y |s2cid=255016078 |language=en |issn=1573-1901|url=https://eprints.whiterose.ac.uk/146183/1/LD_final_edits.docx }}</ref> and of the historical discovery processes behind major scientific conclusions,<ref>{{cite book |last1=Norsen |first1=Travis |title=How Should Humanity Steer the Future? |chapter=Back to the Future: Crowdsourcing Innovation by Refocusing Science Education |series=The Frontiers Collection |date=2016 |pages=85–95 |doi=10.1007/978-3-319-20717-9_9|isbn=978-3-319-20716-2 }}</ref> and common [[scientific misconceptions]].<ref>{{cite journal |last1=Bschir |first1=Karim |title=How to make sense of science: Mano Singham: The great paradox of science: why its conclusions can be relied upon even though they cannot be proven. Oxford: Oxford University Press, 2019, 332 pp, £ 22.99 HB |journal=Metascience |date=July 2021 |volume=30 |issue=2 |pages=327–330 |doi=10.1007/s11016-021-00654-z|s2cid=254792908 }}</ref> Education can also be a topic of metascience more generally – for example, how to improve the quality of scientific outputs, how to shorten the training time needed before scientific work can begin, or how to enlarge and retain various scientific workforces.
====Science misconceptions and anti-science attitudes====
{{Further|Science#Anti-science attitudes|Media literacy|Misinformation#Countermeasures}}
Many students have misconceptions about what science is and how it works.<ref>{{cite web |title=Correcting misconceptions - Understanding Science |url=https://undsci.berkeley.edu/for-educators/prepare-and-plan/correcting-misconceptions/ |access-date=25 January 2023 |date=21 April 2022}}</ref> [[Anti-science]] attitudes and beliefs are also a subject of research.<ref>{{cite journal |last1=Philipp-Muller |first1=Aviva |last2=Lee |first2=Spike W. S. |last3=Petty |first3=Richard E. |title=Why are people antiscience, and what can we do about it? |journal=Proceedings of the National Academy of Sciences |date=26 July 2022 |volume=119 |issue=30 |pages=e2120755119 |doi=10.1073/pnas.2120755119 |pmid=35858405 |pmc=9335320 |bibcode=2022PNAS..11920755P |language=en |issn=0027-8424}}</ref><ref>{{cite web |title=The 4 bases of anti-science beliefs – and what to do about them |url=https://scienmag.com/the-4-bases-of-anti-science-beliefs-and-what-to-do-about-them/ |website=SCIENMAG: Latest Science and Health News |access-date=25 January 2023 |date=11 July 2022}}</ref> Hotez suggests antiscience "has emerged as a dominant and highly lethal force, and one that threatens global security", and that there is a need for "new infrastructure" that mitigates it.<ref>{{cite web |last1=Hotez |first1=Peter J. |title=The Antiscience Movement Is Escalating, Going Global and Killing Thousands |url=https://www.scientificamerican.com/article/the-antiscience-movement-is-escalating-going-global-and-killing-thousands/ |website=Scientific American |access-date=25 January 2023 |language=en}}</ref>
=== Evolution of sciences ===
==== Scientific practice ====
[[File:The number of authors of research articles in six journals through time.jpg|thumb|Number of authors of research articles in six journals through time<ref name="10.1098/rspb.2019.2047"/>]]
[[File:Papers and patents are using narrower portions of existing knowledge.png|thumb|Trends of diversity of work cited, mean number of self-citations, and mean age of cited work may indicate papers are using "narrower portions of existing knowledge".<ref name="10.1038/s41586-022-05543-x"/>]]
Metascience can investigate how scientific processes evolve over time. A study found that teams are growing in size, "increasing by an average of 17% per decade".<ref name="10.1126/science.aao0185"/> {{see below|[[#LaborAdvantage|labor advantage]] below}}
[[File:ArXiv's yearly submission rate plot.jpg|thumb|[[ArXiv]]'s yearly submission rate growth over 30 years<ref>{{cite journal |last1=Ginsparg |first1=Paul |title=Lessons from arXiv's 30 years of information sharing |journal=Nature Reviews Physics |date=September 2021 |volume=3 |issue=9 |pages=602–603 |language=en |doi=10.1038/s42254-021-00360-z|pmid=34377944 |pmc=8335983 }}</ref>]]
It has been found that prevalent forms of non-[[open access]] publication, and the prices charged for many conventional journals – even for publicly funded papers – are unwarranted, unnecessary or suboptimal, and constitute detrimental barriers to scientific progress.<ref name="ISC1"/><ref>{{cite news |title=Nature Journals To Charge Authors Hefty Fee To Make Scientific Papers Open Access |url=https://www.iflscience.com/editors-blog/nature-journals-to-charge-authors-hefty-fee-to-make-scientific-papers-open-access/ |access-date=22 November 2021 |work=IFLScience |language=en}}</ref><ref>{{cite news |title=Harvard University says it can't afford journal publishers' prices |url=https://www.theguardian.com/science/2012/apr/24/harvard-university-journal-publishers-prices |access-date=22 November 2021 |work=The Guardian |date=24 April 2012 |language=en}}</ref><ref>{{cite journal |last1=Van Noorden |first1=Richard |title=Open access: The true cost of science publishing |journal=Nature |date=1 March 2013 |volume=495 |issue=7442 |pages=426–429 |doi=10.1038/495426a |pmid=23538808 |bibcode=2013Natur.495..426V |s2cid=27021567 |language=en |issn=1476-4687|doi-access=free }}</ref> Open access can save considerable amounts of financial resources, which could be used otherwise, and can level the playing field for researchers in developing countries.<ref>{{cite journal |last1=Tennant |first1=Jonathan P. |last2=Waldner |first2=François |last3=Jacques |first3=Damien C. |last4=Masuzzo |first4=Paola |last5=Collister |first5=Lauren B. |last6=Hartgerink |first6=Chris. H. J. |title=The academic, economic and societal impacts of Open Access: an evidence-based review |journal=F1000Research |date=21 September 2016 |volume=5 |pages=632 |doi=10.12688/f1000research.8460.3|pmid=27158456 |pmc=4837983 |doi-access=free }}</ref> There are substantial expenses for subscriptions, for gaining access to specific studies, and for [[article processing charge]]s. ''[[Paywall: The Business of Scholarship]]'' is a documentary on such issues.<ref>{{cite news |title=Paywall: The business of scholarship review – analysis of a scandal |url=https://www.newscientist.com/article/2181744-paywall-the-business-of-scholarship-review-analysis-of-a-scandal/ |access-date=28 January 2023 |work=New Scientist}}</ref>
Another topic is the established styles of scientific communication (e.g. long text-form studies and reviews) and the established [[scientific publishing]] practices – there are concerns about a "glacial pace" of conventional publishing.<ref>{{cite journal |last1=Powell |first1=Kendall |title=Does it take too long to publish research? |journal=Nature |date=1 February 2016 |volume=530 |issue=7589 |pages=148–151 |language=en |doi=10.1038/530148a|pmid=26863966 |bibcode=2016Natur.530..148P |s2cid=1013588 |doi-access=free }}</ref> The use of [[preprint]] servers to publish study drafts early is increasing, and [[open peer review]],<ref>{{cite web |title=Open peer review: bringing transparency, accountability, and inclusivity to the peer review process |url=https://blogs.lse.ac.uk/impactofsocialsciences/2017/09/13/open-peer-review-bringing-transparency-accountability-and-inclusivity-to-the-peer-review-process/ |website=Impact of Social Sciences |access-date=28 January 2023 |date=13 September 2017}}</ref> new tools to screen studies,<ref>{{cite magazine |last1=Dattani |first1=Saloni |title=The Pandemic Uncovered Ways to Speed Up Science |url=https://www.wired.com/story/covid-19-open-science-public-health-data/ |access-date=28 January 2023 |magazine=Wired}}</ref> and improved matching of submitted manuscripts to reviewers<ref>{{cite web |title=Speeding up the publication process at PLOS ONE |url=https://everyone.plos.org/2019/05/13/publication_timings_2019/ |website=EveryONE |access-date=28 January 2023 |date=13 May 2019}}</ref> are among the proposals to speed up publication.
==== Science overall and intrafield developments ====
[[File:Academic papers by discipline (visualization of 2012–2021 OpenAlex data; v2).png|thumb|A visualization of scientific outputs by [[Branches of science|field]] in OpenAlex.<ref name="openalexa">{{cite web |title=Open Alex Data Evolution |url=https://observablehq.com/@napsternxg/open-alex-data-evolution |website=observablehq.com |date=8 February 2022 |access-date=18 February 2022}}</ref><br />A publication can be assigned to multiple fields, and a lower number of papers is not necessarily detrimental for a field.<ref name="publishless"/>]]
[[File:Change of number of scientific papers by field (visualization of 2012–2021 OpenAlex data).png|thumb|Change of number of scientific papers by field according to OpenAlex<ref name="openalexa"/>]]
[[File:Graph of number of papers by year in PubMed containing "coronavirus" up to 2019.png|thumb|Number of PubMed search results for "coronavirus" by year from 1949 to 2020]]
Studies have various kinds of [[metadata]] which can be utilized, complemented and made accessible in useful ways. [[Our Research#OpenAlex|OpenAlex]] is a free online index of over 200 million scientific documents that integrates and provides metadata such as sources, [[scientific citation|citation]]s, [[Academic authorship|author information]], [[scientific field]]s and [[Subject indexing|research topics]]. Its [[API]] and open source website can be used for metascience and [[scientometrics]] research and for building novel tools that query this [[Semantic Web|semantic]] web of [[Academic paper|papers]].<ref>{{cite news |last1=Singh Chawla |first1=Dalmeet |title=Massive open index of scholarly papers launches |url=https://www.nature.com/articles/d41586-022-00138-y |access-date=14 February 2022 |journal=Nature |date=24 January 2022 |language=en |doi=10.1038/d41586-022-00138-y}}</ref><ref>{{cite news |title=OpenAlex: The Promising Alternative to Microsoft Academic Graph |url=https://library.smu.edu.sg/topics-insights/openalex-promising-alternative-microsoft-academic-graph |access-date=14 February 2022 |work=Singapore Management University (SMU) |language=en}}</ref><ref>{{cite web |title=OpenAlex Documentation |url=https://docs.openalex.org/ |access-date=18 February 2022}}</ref> {{anchor|Scholia}}Another project under development, [[Scholia (Wikidata project)|Scholia]], uses [[metadata]] of scientific publications for various visualizations and aggregation features, such as providing a simple user interface that summarizes the literature about a specific feature of the SARS-CoV-2 virus using [[Wikidata]]'s "main subject" property.<ref name="10.1186/s12915-020-00940-y">{{cite journal |last1=Waagmeester |first1=Andra |last2=Willighagen |first2=Egon L. |last3=Su |first3=Andrew I. |last4=Kutmon |first4=Martina |last5=Gayo |first5=Jose Emilio Labra |last6=Fernández-Álvarez |first6=Daniel |last7=Groom |first7=Quentin |last8=Schaap |first8=Peter J. |last9=Verhagen |first9=Lisa M. |last10=Koehorst |first10=Jasper J. |title=A protocol for adding knowledge to Wikidata: aligning resources on human coronaviruses |journal=BMC Biology |date=22 January 2021 |volume=19 |issue=1 |page=12 |doi=10.1186/s12915-020-00940-y |pmid=33482803 |pmc=7820539 |issn=1741-7007 |doi-access=free }}</ref>
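For illustration, a minimal sketch of querying the OpenAlex API from Python, counting works that match a search term per publication year (the endpoint and parameters follow OpenAlex's public REST API; the search term is an arbitrary example):
<syntaxhighlight lang="python">
import requests

# Ask the public OpenAlex REST API (no key required) for works matching
# a search term, grouped by publication year.
resp = requests.get(
    "https://api.openalex.org/works",
    params={"search": "coronavirus", "group_by": "publication_year"},
    timeout=30,
)
resp.raise_for_status()
for group in resp.json()["group_by"]:
    print(group["key"], group["count"])  # a year and its work count
</syntaxhighlight>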
=====Subject-level resolutions=====
Beyond metadata explicitly assigned to studies by humans, [[natural language processing]] and AI can be used to assign research publications [[Subject indexing|to topics]]. One study investigating the impact of science awards used this approach to associate a paper's text (not just its keywords) with the linguistic content of Wikipedia's scientific topic pages ("pages are created and updated by scientists and users through crowdsourcing"), creating meaningful and plausible classifications of high-fidelity scientific topics for further analysis or navigability.<ref>{{cite journal |last1=Jin |first1=Ching |last2=Ma |first2=Yifang |last3=Uzzi |first3=Brian |title=Scientific prizes and the extraordinary growth of scientific topics |journal=Nature Communications |date=5 October 2021 |volume=12 |issue=1 |pages=5619 |doi=10.1038/s41467-021-25712-2 |pmid=34611161 |pmc=8492701 |arxiv=2012.09269 |bibcode=2021NatCo..12.5619J |language=en |issn=2041-1723}}</ref>
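Topic assignment of this kind is commonly based on text similarity. The following toy sketch – not the cited study's actual pipeline – assigns a paper to the lexically most similar of two invented topic descriptions using TF-IDF vectors and cosine similarity:
<syntaxhighlight lang="python">
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented stand-ins for topic-page texts and a paper abstract.
topics = {
    "CRISPR": "genome editing guide RNA Cas9 bacterial immune system",
    "Graphene": "two-dimensional carbon lattice electronic transport",
}
paper = "We edit mammalian genomes with Cas9 and engineered guide RNAs."

matrix = TfidfVectorizer().fit_transform(list(topics.values()) + [paper])
sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
print(max(zip(topics, sims), key=lambda t: t[1]))  # best-matching topic
</syntaxhighlight>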
===== Growth or stagnation of science overall =====
{{Further|Scientific method#History|Philosophy of science}}
[[File:Biomarker Publications in Scholia.png|thumb|Rough trend of scholarly publications about [[biomarker (medicine)|biomarkers]] according to Scholia; the number of biomarker-related publications may not closely track the number of viable biomarkers.<ref>{{cite web |title=Scholia – biomarker |url=https://scholia.toolforge.org/chemical-class/Q864574#publications-per-year |access-date=28 January 2023}}</ref>]]
[[File:CD index of high quality science over time.png|thumb|The CD index for papers published in ''Nature'', ''PNAS'', and ''Science'' and Nobel-Prize-winning papers<ref name="10.1038/s41586-022-05543-x"/>]]
[[File:Decline of disruptive science and technology (based on the CD index).png|thumb|The CD index may indicate a "decline of disruptive science and technology".<ref name="10.1038/s41586-022-05543-x"/>]]
Metascience research is investigating the growth of science overall, using e.g. data on the number of publications in [[bibliographic database]]s. A study found that segments with different growth rates appear to be related to phases of "economic (e.g., industrialization)" and/or "political developments (e.g., Second World War)" – money being considered a necessary input to the science system. It also confirmed a recent exponential growth in the volume of scientific literature and calculated an average doubling period of 17.3 years.<ref>{{cite journal |last1=Bornmann |first1=Lutz |last2=Haunschild |first2=Robin |last3=Mutz |first3=Rüdiger |title=Growth rates of modern science: a latent piecewise growth curve approach to model publication numbers from established and new literature databases |journal=Humanities and Social Sciences Communications |date=7 October 2021 |volume=8 |issue=1 |pages=1–15 |doi=10.1057/s41599-021-00903-w |s2cid=229156128 |language=en |issn=2662-9992|arxiv=2012.07675 }}</ref>
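The doubling period follows directly from the fitted exponential growth rate: for growth ''N''(''t'') = ''N''<sub>0</sub>''e''<sup>''gt''</sup>, the doubling time is ln(2)/''g''. A minimal Python sketch with synthetic publication counts (a hypothetical 4% annual growth rate, which happens to yield roughly the 17.3-year doubling period reported above):
<syntaxhighlight lang="python">
import math
import numpy as np

# Synthetic publication counts growing at 4% per year (invented data).
years = np.arange(2000, 2021)
counts = 1000 * np.exp(0.04 * (years - 2000))

# A log-linear fit recovers the growth rate g; doubling time = ln(2)/g.
g, _ = np.polyfit(years, np.log(counts), 1)
print("growth rate per year:", round(g, 4))           # ~0.04
print("doubling period:", round(math.log(2) / g, 1))  # ~17.3 years
</syntaxhighlight>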
However, others have pointed out that it is difficult to measure scientific progress in meaningful ways, partly because it is hard to accurately evaluate how important any given scientific discovery is. A variety of perspectives on the trajectory of science overall (impact, number of major discoveries, etc.) have been described in books and articles, including that science is becoming harder (per dollar or hour spent), that if science is "slowing today, it is because science has remained too focused on established fields", that papers and patents are increasingly less likely to be "disruptive" in terms of breaking with the past as measured by the "CD index",<ref name="10.1038/s41586-022-05543-x">{{cite journal |last1=Park |first1=Michael |last2=Leahey |first2=Erin |last3=Funk |first3=Russell J. |title=Papers and patents are becoming less disruptive over time |journal=Nature |date=January 2023 |volume=613 |issue=7942 |pages=138–144 |doi=10.1038/s41586-022-05543-x |pmid=36600070 |bibcode=2023Natur.613..138P |s2cid=255466666 |language=en |issn=1476-4687|doi-access=free }}</ref> and that there is a great [[Progress#Stagnation|stagnation]] – possibly as part of a larger trend<ref name="outofideas">{{cite web |last1=Thompson |first1=Derek |title=America Is Running on Fumes |url=https://www.theatlantic.com/ideas/archive/2021/12/america-innovation-film-science-business/620858/ |website=The Atlantic |access-date=27 January 2023 |language=en |date=1 December 2021}}</ref> – whereby e.g. "things haven't changed nearly as much since the 1970s" when excluding the computer and the Internet.
A better understanding of the potential slowdowns suggested by some measures could be a major opportunity to improve humanity's future.<ref>{{cite web |last1=Collison |first1=Patrick |last2=Nielsen |first2=Michael |title=Science Is Getting Less Bang for Its Buck |url=https://www.theatlantic.com/science/archive/2018/11/diminishing-returns-science/575665/ |website=The Atlantic |access-date=27 January 2023 |language=en |date=16 November 2018}}</ref> For example, emphasis on citations in the measurement of scientific productivity, information overloads,<ref name="outofideas"/> reliance on a narrower set of existing knowledge (which may include narrow [[Academic specialization|specialization]] and related contemporary practices) {{tooltip|based on three "use of previous knowledge"-indicators|"the diversity of work cited, mean number of self-citations and mean age of work cited"}},<ref name="10.1038/s41586-022-05543-x"/> and risk-avoidant funding structures<ref name="stagnation"/> may have shifted research "toward incremental science and away from [[Exploratory research|exploratory]] projects that are more likely to fail".<ref name="w26752">{{cite journal |last1=Bhattacharya |first1=Jay |last2=Packalen |first2=Mikko |title=Stagnation and Scientific Incentives |date=February 2020 |url=https://www.nber.org/system/files/working_papers/w26752/w26752.pdf |publisher=National Bureau of Economic Research}}</ref> The study that introduced the "CD index" suggests that the overall number of papers has risen while the total number of "highly disruptive" papers as measured by the index has not (notably, the [[1998 in science#Astronomy and space exploration|1998]] discovery of the [[accelerating expansion of the universe]] has a CD index of 0). Their results also suggest that scientists and inventors "may be struggling to keep up with the pace of knowledge expansion".<ref>{{cite news |last1=Tejada |first1=Patricia Contreras |title=With fewer disruptive studies, is science becoming an echo chamber? |url=https://www.advancedsciencenews.com/with-fewer-disruptive-studies-is-science-becoming-an-echo-chamber/ |access-date=15 February 2023 |work=Advanced Science News |date=13 January 2023 |archive-date=15 February 2023 |archive-url=https://web.archive.org/web/20230215233007/https://www.advancedsciencenews.com/with-fewer-disruptive-studies-is-science-becoming-an-echo-chamber/ |url-status=live }}</ref><ref name="10.1038/s41586-022-05543-x"/>
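The CD index compares how later papers cite a focal work: citing the focal paper without its references signals disruption, while citing the focal paper together with its references signals consolidation. A simplified Python sketch with toy data (the published index additionally uses fixed citation windows over large citation graphs):
<syntaxhighlight lang="python">
def cd_index(cites_focal, cites_refs):
    """Simplified CD index over later papers: +1 if a paper cites only
    the focal work, -1 if it cites the focal work and its references,
    0 if it cites only the references. Result lies in [-1, 1]."""
    scores = [-2 * f * b + f
              for f, b in zip(cites_focal, cites_refs) if f or b]
    return sum(scores) / len(scores) if scores else 0.0

# Toy data: three later papers cite only the focal paper (disruptive),
# one cites the focal paper alongside its references (consolidating).
print(cd_index([1, 1, 1, 1], [0, 0, 0, 1]))  # (1 + 1 + 1 - 1) / 4 = 0.5
</syntaxhighlight>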
Various novelty metrics – ways of measuring the "novelty" of studies<ref name="w26752"/> – have been proposed to counterbalance a potential anti-novelty bias, such as textual analysis<ref name="w26752"/> or measuring whether a paper makes first-time-ever combinations of referenced journals, taking the difficulty of such combinations into account.<ref>{{cite journal |last1=Wang |first1=Jian |last2=Veugelers |first2=Reinhilde |last3=Stephan |first3=Paula |title=Bias against novelty in science: A cautionary tale for users of bibliometric indicators |journal=Research Policy |date=1 October 2017 |volume=46 |issue=8 |pages=1416–1436 |doi=10.1016/j.respol.2017.06.006 |language=en |issn=0048-7333|url=https://lirias.kuleuven.be/handle/123456789/590071 }}</ref> Other approaches include proactively funding risky projects.<ref name="10.1126/science.aao0185"/>
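A minimal sketch of the journal-combination idea, with invented journal names (the published metric additionally weights new pairs by how dissimilar the paired journals' citation profiles are):
<syntaxhighlight lang="python">
from itertools import combinations

# Journal pairs co-referenced by earlier papers (invented data).
prior_pairs = set()
for refs in [{"Nature", "Cell"}, {"Cell", "Science"}]:
    prior_pairs.update(combinations(sorted(refs), 2))

# Journals referenced by the focal paper.
focal_refs = {"Nature", "Science", "Physical Review Letters"}
focal_pairs = set(combinations(sorted(focal_refs), 2))

# First-time-ever journal combinations are taken as a novelty signal.
print(focal_pairs - prior_pairs)
</syntaxhighlight>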
=== Topic mapping ===
Science maps could show main interrelated topics within a certain scientific domain, their change over time, and their key actors (researchers, institutions, journals). They may help find factors that determine the emergence of new scientific fields and the development of interdisciplinary areas, and could be relevant for science policy purposes.<ref>{{cite web |last1=Petrovich |first1=Eugenio |title=Science mapping |url=https://www.isko.org/cyclo/science_mapping#9 |website=www.isko.org |access-date=27 January 2023 |date=2020}}</ref> [[Theory of change|Theories of scientific change]] could guide "the exploration and interpretation of visualized intellectual structures and dynamic patterns".<ref>{{cite journal |last1=Chen |first1=Chaomei |title=Science Mapping: A Systematic Review of the Literature |journal=Journal of Data and Information Science |date=21 March 2017 |volume=2 |issue=2 |pages=1–40 |doi=10.1515/jdis-2017-0006 |s2cid=57737772 |url=https://www.researchgate.net/publication/313991204|doi-access=free }}</ref> The maps can show the intellectual, social or conceptual structure of a research field.<ref>{{cite journal |last1=Gutiérrez-Salcedo |first1=M. |last2=Martínez |first2=M. Ángeles |last3=Moral-Munoz |first3=J. A. |last4=Herrera-Viedma |first4=E. |last5=Cobo |first5=M. J. |title=Some bibliometric procedures for analyzing and evaluating research fields |journal=Applied Intelligence |date=1 May 2018 |volume=48 |issue=5 |pages=1275–1287 |doi=10.1007/s10489-017-1105-y |s2cid=254227914 |language=en |issn=1573-7497}}</ref> Beyond visual maps, expert [[Survey (human research)|survey]]-based studies and similar approaches could identify understudied or neglected societally important areas, topic-level problems (such as stigma or dogma), or potential misprioritizations.{{additional citation needed|date=January 2023}} Examples are [[policy studies|studies]] about [[policy]] in relation to public health<ref>{{cite journal |last1=Navarro |first1=V. |title=Politics and health: a neglected area of research |journal=The European Journal of Public Health |date=31 March 2008 |volume=18 |issue=4 |pages=354–355 |doi=10.1093/eurpub/ckn040|pmid=18524802 }}</ref> and the social science of climate change mitigation,<ref name="10.1016/j.erss.2019.101349">{{Cite journal|last1=Overland|first1=Indra|last2=Sovacool|first2=Benjamin K.|date=1 April 2020|title=The misallocation of climate research funding|journal=Energy Research & Social Science|language=en|volume=62|pages=101349|doi=10.1016/j.erss.2019.101349|issn=2214-6296|doi-access=free}}</ref> where it has been estimated that only 0.12% of all funding for climate-related research is spent on the social science of mitigation, even though the most urgent puzzle at the current juncture is working out how to mitigate climate change, while the natural science of climate change is already well established.<ref name="10.1016/j.erss.2019.101349"/>
There are also studies that map a scientific field or topic, such as a study of the use of research evidence [[evidence-based policy|in policy]] and [[evidence-based practice|practice]], conducted partly using [[Survey (human research)|surveys]].<ref>{{cite journal |last1=Farley-Ripple |first1=Elizabeth N. |last2=Oliver |first2=Kathryn |last3=Boaz |first3=Annette |title=Mapping the community: use of research evidence in policy and practice |journal=Humanities and Social Sciences Communications |date=7 September 2020 |volume=7 |issue=1 |pages=1–10 |doi=10.1057/s41599-020-00571-2 |language=en |issn=2662-9992|doi-access=free}}</ref>
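Science maps are often built from co-citation data: two works (or topics) are linked whenever later papers cite them together. A toy Python sketch using the networkx library with hand-typed reference lists (real maps are computed from large bibliographic databases):
<syntaxhighlight lang="python">
import networkx as nx

# Each inner list holds the references of one citing paper (toy data).
citing_papers = [["A", "B", "C"], ["A", "B"], ["B", "C", "D"]]

G = nx.Graph()
for refs in citing_papers:
    for u, v in ((refs[i], refs[j]) for i in range(len(refs))
                 for j in range(i + 1, len(refs))):
        w = G.get_edge_data(u, v, {"weight": 0})["weight"]
        G.add_edge(u, v, weight=w + 1)  # count co-citations as weight

# Heavily connected nodes indicate core works/topics of the field.
print(sorted(G.degree(weight="weight"), key=lambda kv: -kv[1]))
</syntaxhighlight>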
=== Controversies, current debates and disagreement ===
{{See also|#scite.ai|#Topic mapping}}
[[File:Disagreement in the scientific literature by field.jpg|thumb|Percent of all citances in each field that contain signals of disagreement<ref name="10.7554/eLife.72737"/>]]
Some research investigates [[scientific controversy|scientific controversies]] and may identify currently ongoing major debates (e.g. open questions) and [[disagreement]] between scientists or studies.{{additional citation needed|date=January 2023}} One study suggests that the level of disagreement was highest in the [[social sciences]] and [[humanities]] (0.61%), followed by biomedical and health sciences (0.41%), life and earth sciences (0.29%), physical sciences and engineering (0.15%), and mathematics and computer science (0.06%).<ref name="10.7554/eLife.72737">{{cite journal |last1=Lamers |first1=Wout S |last2=Boyack |first2=Kevin |last3=Larivière |first3=Vincent |last4=Sugimoto |first4=Cassidy R |last5=van Eck |first5=Nees Jan |last6=Waltman |first6=Ludo |last7=Murray |first7=Dakota |title=Investigating disagreement in the scientific literature |journal=eLife |date=24 December 2021 |volume=10 |pages=e72737 |doi=10.7554/eLife.72737 |pmid=34951588 |pmc=8709576 |issn=2050-084X |doi-access=free }}</ref> Such research may also show where the disagreements lie, especially if they cluster, including visually, such as with cluster diagrams.
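Such studies typically flag "citances" (citation sentences) containing disagreement cue phrases. A minimal Python sketch with a few invented cue phrases and sentences (the published study uses a larger, manually validated set of signal terms):
<syntaxhighlight lang="python">
import re

# A few illustrative cue phrases; not the validated instrument itself.
SIGNALS = [r"\bin contrast to\b", r"\bcontrary to\b", r"\bdisagree",
           r"\bchallenges? the\b", r"\binconsistent with\b"]
pattern = re.compile("|".join(SIGNALS), flags=re.IGNORECASE)

citances = [
    "In contrast to [12], we find no effect of X on Y.",
    "Our results confirm the mechanism proposed by [7].",
]
flagged = [c for c in citances if pattern.search(c)]
print(len(flagged) / len(citances))  # share of citances with cues: 0.5
</syntaxhighlight>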
=== Challenges of interpretation of pooled results ===
{{See also|Clinical trial#Trial design}}
{{Further|Meta-analysis#Challenges}}
Studies about a specific [[research question|research question or research topic]] are often reviewed in the form of higher-level overviews in which results from various studies are integrated, compared, critically analyzed and interpreted. Examples of such works are [[scientific review]]s and [[meta-analysis|meta-analyses]]. These and related practices<!-- or their intended purposes etc--> face various challenges and are a subject of metascience.
Various issues with the included or available studies, such as heterogeneity of the methods used, may lead to faulty conclusions of a meta-analysis.<ref>{{cite journal |last1=Stone |first1=Dianna L. |last2=Rosopa |first2=Patrick J. |title=The Advantages and Limitations of Using Meta-analysis in Human Resource Management Research |journal=Human Resource Management Review |date=1 March 2017 |volume=27 |issue=1 |pages=1–7 |doi=10.1016/j.hrmr.2016.09.001 |language=en |issn=1053-4822}}</ref><!-- which can for example preempt further research or misplace research focus.-->
=== Knowledge integration and living documents ===
Various problems require the swift [[Knowledge integration|integration]] of new and existing science-based knowledge. Settings where there are a large number of loosely related projects and initiatives benefit especially from a common ground or "commons".<ref name="10.1186/s12915-020-00940-y"/>
Evidence synthesis can be applied to important [[List of global issues|global challenges]], notably ones that are both relatively urgent and certain: "[[Climate change mitigation#Overviews, integration and comparisons of measures|climate change]], energy transitions, biodiversity loss, [[antimicrobial resistance]], poverty eradication and so on". It has been suggested that a better system would keep summaries of research evidence up to date via living systematic reviews – e.g. as [[living document]]s. While the number of scientific papers and the amount of data (or information and online knowledge) [[Information explosion|have risen substantially]],{{additional citation needed|date=February 2022}} the number of published academic systematic reviews has risen from "around 6,000 in 2011 to more than 45,000 in 2021".<ref>{{cite journal |last1=Elliott |first1=Julian |last2=Lawrence |first2=Rebecca |last3=Minx |first3=Jan C. |last4=Oladapo |first4=Olufemi T. |last5=Ravaud |first5=Philippe |last6=Tendal Jeppesen |first6=Britta |last7=Thomas |first7=James |last8=Turner |first8=Tari |last9=Vandvik |first9=Per Olav |last10=Grimshaw |first10=Jeremy M. |title=Decision makers need constantly updated evidence synthesis |journal=Nature |date=December 2021 |volume=600 |issue=7889 |pages=383–385 |doi=10.1038/d41586-021-03690-1| pmid=34912079 |bibcode=2021Natur.600..383E |s2cid=245220047 |language=en|doi-access=free}}</ref> An [[evidence-based]] approach is important for progress in science, [[policy]], medicine and other practices. For example, meta-analyses can quantify what is known, identify what is not yet known,<ref name="10.1038/nature25753"/> and place "truly innovative and highly [[interdisciplinary]] ideas" into the context of established knowledge, which may enhance their impact.<ref name="10.1126/science.aao0185"/>
=== Factors of success and progress ===
{{See also|#Growth or stagnation of science overall}}
It has been hypothesized that a deeper understanding of factors behind successful science could "enhance prospects of science as a whole to more effectively address societal problems".<ref name="10.1126/science.aao0185">{{cite journal |last1=Fortunato |first1=Santo |last2=Bergstrom |first2=Carl T. |last3=Börner |first3=Katy |last4=Evans |first4=James A. |last5=Helbing |first5=Dirk |last6=Milojević |first6=Staša |last7=Petersen |first7=Alexander M. |last8=Radicchi |first8=Filippo |last9=Sinatra |first9=Roberta |last10=Uzzi |first10=Brian |last11=Vespignani |first11=Alessandro |last12=Waltman |first12=Ludo |last13=Wang |first13=Dashun |last14=Barabási |first14=Albert-László |title=Science of science |journal=Science |date=2 March 2018 |volume=359 |issue=6379 |page=eaao0185 |doi=10.1126/science.aao0185 |pmid=29496846 |pmc=5949209 |url=https://www.researchgate.net/publication/323502497 |access-date=22 November 2021}}</ref>
;Novel ideas and disruptive scholarship
Two metascientists reported that "structures fostering disruptive scholarship and focusing attention on novel [[idea]]s" could be important because, in a growing [[scientific field]], [[scientific citation|citation flows]] disproportionately consolidate onto already well-cited papers, possibly slowing<!--stagnating--> and inhibiting canonical [[Progress#Scientific progress|progress]].<!--https://www.theatlantic.com/ideas/archive/2021/11/grants-american-scientific-revolution/620609/--><ref>{{cite web |last1=Snyder |first1=Alison |title=New ideas are struggling to emerge from the sea of science |date=14 October 2021 |url=https://www.axios.com/science-new-ideas-dbe29601-010c-411a-b79d-bbd1388ec5a0.html |publisher=Axios |access-date=15 November 2021}}</ref><ref>{{cite journal |last1=Chu |first1=Johan S. G. |last2=Evans |first2=James A. |title=Slowed canonical progress in large fields of science |journal=Proceedings of the National Academy of Sciences |date=12 October 2021 |volume=118 |issue=41 |page=e2021636118 |doi=10.1073/pnas.2021636118 |pmid=34607941 |pmc=8522281 |bibcode=2021PNAS..11821636C |language=en |issn=0027-8424|doi-access=free }}</ref> A study concluded that to enhance the impact of truly innovative and highly interdisciplinary novel ideas, they should be placed in the context of established knowledge.<ref name="10.1126/science.aao0185"/>
;Mentorship, partnerships and social factors
Other researchers reported that the most successful protégés – in terms of "likelihood of [[Lists of science and technology awards|prizewinning]], National Academy of Science (NAS) induction, or superstardom" – [[Mentorship|studied under mentors]] who were conferred a prize for their [[research]] only after the protégés' mentorship. Studying original topics rather than these mentors' research topics was also positively associated with success.<ref>{{cite news |title=Sharing of tacit knowledge is most important aspect of mentorship, study finds |url=https://phys.org/news/2020-06-tacit-knowledge-important-aspect-mentorship.html |access-date=4 July 2020 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Ma |first1=Yifang |last2=Mukherjee |first2=Satyam |last3=Uzzi |first3=Brian |title=Mentorship and protégé success in STEM fields |journal=Proceedings of the National Academy of Sciences |date=23 June 2020 |volume=117 |issue=25 |pages=14077–14083 |doi=10.1073/pnas.1915516117 |pmid=32522881 |pmc=7322065 |bibcode=2020PNAS..11714077M |language=en |issn=0027-8424|doi-access=free }}</ref> Highly productive partnerships are also a topic of research – e.g. "super-ties" of frequent co-authorship between two individuals who can complement each other's skills, likely also the result of other factors such as mutual trust, conviction, commitment and fun.<ref>{{cite news |title=Science of Science authors hope to spark conversations about the scientific enterprise |url=https://phys.org/news/2018-03-science-authors-conversations-scientific-enterprise.html |access-date=28 January 2023 |work=phys.org |language=en}}</ref><ref name="10.1126/science.aao0185"/>
;Study of successful scientists and processes, general skills and activities
The emergence or origin of ideas by successful scientists is also a topic of research – for example, reviewing existing ideas on how [[Gregor Mendel|Mendel]] made his [[scientific discovery|discoveries]]<ref>{{cite journal |last1=van Dijk |first1=Peter J. |last2=Jessop |first2=Adrienne P. |last3=Ellis |first3=T. H. Noel |title=How did Mendel arrive at his discoveries? |journal=Nature Genetics |date=July 2022 |volume=54 |issue=7 |pages=926–933 |doi=10.1038/s41588-022-01109-9 |pmid=35817970 |s2cid=250454204 |language=en |issn=1546-1718}}</ref> – or, more generally, the process of discovery by scientists. Science is a "multifaceted process of appropriation, [[copying]], extending, or combining ideas and [[invention]]s" [and other types of knowledge or information], and not an isolated process.<ref name="10.1126/science.aao0185"/> There are also a few studies investigating scientists' habits, common modes of thinking, reading habits, use of information sources, [[digital literacy]] skills, and [[workflow]]s.<ref>{{cite journal |last1=Root-Bernstein |first1=Robert S. |last2=Bernstein |first2=Maurine |last3=Garnier |first3=Helen |title=Correlations Between Avocations, Scientific Style, Work Habits, and Professional Impact of Scientists |journal=Creativity Research Journal |date=1 April 1995 |volume=8 |issue=2 |pages=115–137 |doi=10.1207/s15326934crj0802_2 |issn=1040-0419}}</ref><ref>{{cite journal |last1=Ince |first1=Sharon |last2=Hoadley |first2=Christopher |last3=Kirschner |first3=Paul A. |title=A qualitative study of social sciences faculty research workflows |journal=Journal of Documentation |date=1 January 2022 |volume=78 |issue=6 |pages=1321–1337 |doi=10.1108/JD-08-2021-0168 |s2cid=247078086 |issn=0022-0418}}</ref><ref>{{cite web |last1=Nassi-Calò |first1=Lilian |title=Researchers reading habits for scientific literature {{!}} SciELO in Perspective |url=https://blog.scielo.org/en/2014/04/03/researchers-reading-habits-for-scientific-literature/ |access-date=25 February 2023 |date=3 April 2014}}</ref><ref>{{cite news |last1=Van Noorden |first1=Richard |title=Scientists may be reaching a peak in reading habits |url=https://www.nature.com/articles/nature.2014.14658 |access-date=25 February 2023 |journal=Nature |date=3 February 2014 |language=en |doi=10.1038/nature.2014.14658}}</ref><ref>{{cite journal |last1=Arshad |first1=Alia |last2=Ameen |first2=Kanwal |title=Comparative analysis of academic scientists, social scientists and humanists' scholarly information seeking habits |journal=The Journal of Academic Librarianship |date=1 January 2021 |volume=47 |issue=1 |pages=102297 |doi=10.1016/j.acalib.2020.102297 |s2cid=229433047 |language=en |issn=0099-1333}}</ref>
;Labor advantage
{{anchor|LaborAdvantage}}A study theorized that, in many disciplines, the larger scientific productivity or success of [[College and university rankings|elite universities]] can be explained by their larger pool of available funded laborers.<ref>{{cite news |title=Why it pays to join a big research group if you want to be more scientifically productive |url=https://physicsworld.com/a/why-it-pays-to-join-a-big-research-group-if-you-want-to-be-more-scientifically-productive/ |access-date=13 December 2022 |work=Physics World |date=24 November 2022}}</ref><ref>{{cite journal |last1=Zhang |first1=Sam |last2=Wapman |first2=K. Hunter |last3=Larremore |first3=Daniel B. |last4=Clauset |first4=Aaron |title=Labor advantages drive the greater productivity of faculty at elite universities |journal=Science Advances |date=16 November 2022 |volume=8 |issue=46 |pages=eabq7056 |doi=10.1126/sciadv.abq7056 |pmid=36399560 |pmc=9674273 |arxiv=2204.05989 |bibcode=2022SciA....8.7056Z |language=en |issn=2375-2548}}</ref>{{elucidate|date=January 2023}}
;Ultimate impacts
Success in science is often measured in terms of metrics like citations rather than in terms of the eventual or potential impact on lives and society, which awards sometimes recognize.{{additional citation needed|date=January 2023}} Problems with such metrics are roughly outlined elsewhere in this article and include that [[scientific review|reviews]] replace citations to primary studies.<ref name="10.1038/nature25753">{{cite journal |last1=Gurevitch |first1=Jessica |last2=Koricheva |first2=Julia |last3=Nakagawa |first3=Shinichi |last4=Stewart |first4=Gavin |title=Meta-analysis and the science of research synthesis |journal=Nature |date=March 2018 |volume=555 |issue=7695 |pages=175–182 |doi=10.1038/nature25753 |pmid=29517004 |bibcode=2018Natur.555..175G |s2cid=3761687 |language=en |issn=1476-4687}}</ref> There are also proposals for changes to the academic incentive systems that would increase the recognition of societal impact in the research process.<ref>{{cite web |title=Academic Incentives and Research Impact: Developing Reward and Recognition Systems to Better People's Lives |url=https://sfdora.org/resource/academic-incentives-and-research-impact-developing-reward-and-recognition-systems-to-better-peoples-lives/ |website=DORA |access-date=28 January 2023}}</ref>
;Progress studies
A proposed field of "Progress Studies" could investigate how scientists (or funders or evaluators of scientists) should be acting, "figuring out interventions" and study [[progress]] itself.<ref>{{cite web |last1=Collison |first1=Patrick|last2=Cowen|first2=Tyler |title=We Need a New Science of Progress |url=https://www.theatlantic.com/science/archive/2019/07/we-need-new-science-progress/594946/ |website=The Atlantic |access-date=25 January 2023 |language=en |date=30 July 2019}}</ref> The field was explicitly proposed in a 2019 essay and described as an [[applied science]] that prescribes action.<ref>{{cite web |last1=Lovely |first1=Garrison |title=Do we need a better understanding of 'progress'? |url=https://www.bbc.com/future/article/20220615-do-we-need-a-better-understanding-of-progress |publisher=BBC |access-date=27 January 2023 |language=en}}</ref>
;As and for acceleration of progress
A study suggests that improving the way science is done could accelerate the rate of scientific discovery and its applications, which could be useful for finding urgent solutions to humanity's problems, improving humanity's conditions, and enhancing understanding of nature. Metascientific studies can seek to identify aspects of science that need improvement and to develop ways to improve them.<ref name="10.1007/978-3-319-20717-9_11">{{cite book |last1=Khalil |first1=Mohammed M. |title=How Should Humanity Steer the Future? |chapter=Improving Science for a Better Future |series=The Frontiers Collection |date=2016 |pages=113–126 |doi=10.1007/978-3-319-20717-9_11 |publisher=Springer International Publishing |isbn=978-3-319-20716-2 |language=en}}</ref> If science is accepted as the fundamental engine of economic growth and social progress, this raises "the question of what we – as a society – can do to accelerate science, and to direct science toward solving society's most important problems."<ref>{{cite web |title=Developing the science of science |url=https://www.worksinprogress.co/issue/developing-the-science-of-science/ |website=Works in Progress |access-date=25 January 2023|first1=Paul|last1=Niehaus|first2=Heidi |last2=Williams }}</ref> However, one of the authors clarified that a one-size-fits-all approach is not thought to be the right answer – for example, in funding, DARPA models, curiosity-driven methods, allowing "a single reviewer to champion a project even if his or her peers do not agree", and various other approaches all have their uses. Nevertheless, evaluating them can help build knowledge of what works or works best.<ref name="stagnation">{{cite news |title=How to escape scientific stagnation |url=https://www.economist.com/finance-and-economics/2022/10/26/how-to-escape-scientific-stagnation |newspaper=The Economist |access-date=25 January 2023}}</ref>
==Reforms==
Meta-research identifying flaws in scientific practice has inspired reforms in science. These reforms seek to address and fix problems in scientific practice which lead to low-quality or inefficient research.
A 2015 study noted that existing efforts in meta-research were "fragmented".<ref name=Ioannidis2015/>
===Pre-registration===
{{Further|Pre-registration (science)|Clinical trial registration}}
The practice of registering a scientific study before it is conducted is called [[pre-registration (science)|pre-registration]]. It arose as a means to address the [[replication crisis]]. Pre-registration requires the submission of a registered report, which is then accepted for publication or rejected by a journal based on its theoretical justification, experimental design, and proposed statistical analysis. Pre-registration of studies serves to prevent [[publication bias]] (e.g. not publishing negative results), reduce [[data dredging]], and increase replicability.<ref>{{Cite web|title = Registered Replication Reports|publisher=Association for Psychological Science|url = http://www.psychologicalscience.org/index.php/replication|access-date = 2015-11-13}}</ref><ref>{{Cite news|title = Psychology's 'registration revolution'|url = https://www.theguardian.com/science/head-quarters/2014/may/20/psychology-registration-revolution|newspaper = the Guardian|access-date = 2015-11-13|first = Chris|last = Chambers|date = 2014-05-20}}</ref>
===Reporting standards===
{{Further|CONSORT|EQUATOR Network}}
Studies showing poor consistency and quality of reporting have demonstrated the need for reporting standards and guidelines in science, which has led to the rise of organisations that produce such standards, such as [[CONSORT]] (Consolidated Standards of Reporting Trials) and the [[EQUATOR Network]].
The EQUATOR ('''E'''nhancing the '''QUA'''lity and '''T'''ransparency '''O'''f health '''R'''esearch)<ref>{{cite journal | last1 = Simera | first1 = I | last2 = Moher | first2 = D | last3 = Hirst | first3 = A | last4 = Hoey | first4 = J | last5 = Schulz | first5 = KF | last6 = Altman | first6 = DG | title = Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network | journal = BMC Medicine | volume = 8 | page = 24 | year = 2010 | pmid = 20420659 | pmc = 2874506 | doi = 10.1186/1741-7015-8-24 | doi-access = free }}</ref> Network is an international initiative aimed at promoting transparent and accurate reporting of health research studies to enhance the value and reliability of [[medical research]] literature.<ref>{{cite journal |last1=Simera |first1=I. |last2=Moher |first2=D. |last3=Hoey |first3=J. |last4=Schulz |first4=K. F. |last5=Altman |first5=D. G. |title=A catalogue of reporting guidelines for health research |journal=European Journal of Clinical Investigation |volume=40 |pages=35–53 |year=2010 |doi= 10.1111/j.1365-2362.2009.02234.x |pmid=20055895 |issue=1 |doi-access=free}}</ref> The EQUATOR Network was established with the goals of raising awareness of the importance of good reporting of research, assisting in the development, dissemination and implementation of reporting guidelines for different types of study designs, monitoring the status of the quality of reporting of research studies in the health sciences literature, and conducting research relating to issues that impact the quality of reporting of health research studies.<ref name=Simera2009>{{cite journal|last=Simera|first=I|author2=Altman, DG|s2cid=36739841|title=Writing a research article that is "fit for purpose": EQUATOR Network and reporting guidelines|journal= Evidence-Based Medicine|date=October 2009|volume=14|issue=5|pages=132–134|doi=10.1136/ebm.14.5.132|pmid=19794009}}<!--|access-date=28 July 2013--></ref> The Network acts as an "umbrella" organisation, bringing together developers of reporting guidelines, medical journal editors and peer reviewers, research funding bodies, and other key stakeholders with a mutual interest in improving the quality of research publications and research itself.
==Applications==
The areas of application of metascience include information and communications technologies (ICTs), medicine, psychology, physics, and computer science.
=== Information and communications technologies ===
{{See also|#Reforms|#Evaluation and incentives|#Science overall and intrafield developments|Group decision-making|List of academic databases and search engines|Scientific communication}}
Metascience is used in the creation and improvement of technical systems ([[Information and communications technology|ICTs]]) and of standards for the evaluation, incentivization, communication, commissioning, funding, regulation, production, management, use and publication of science. This can be called "applied metascience"<ref>{{cite video |title=Ep. 49: Joel Chan on metascience, creativity, and tools for thought. |url=https://www.youtube.com/watch?v=KT-I_6TERKk |language=en}}</ref>{{better citation needed|date=February 2022}} and may seek to explore ways to increase the quantity, quality and positive impact of research. One example is the [[Scientometrics#Altmetrics|development of alternative metrics]].<ref name="10.1126/science.aao0185"/>
;Study screening and feedback
Various websites or tools identify problematic studies and/or enable feedback, such as [[PubPeer]], [[Cochrane (organisation)|Cochrane]]'s Risk of Bias Tool<ref>{{cite web |title=Risk of Bias Tool {{!}} Cochrane Bias |url=https://methods.cochrane.org/bias/risk-bias-tool |website=methods.cochrane.org |access-date=25 January 2023 |language=en}}</ref> and [[RetractionWatch]]. Medical and academic disputes are as ancient as antiquity, and a study calls for research into "constructive and obsessive criticism" and into policies to "help strengthen social media into a vibrant forum for discussion, and not merely an arena for gladiator matches".<ref>{{cite journal |last1=Prasad |first1=Vinay |last2=Ioannidis |first2=John P. A. |title=Constructive and obsessive criticism in science |journal=European Journal of Clinical Investigation |date=November 2022 |volume=52 |issue=11 |pages=e13839 |doi=10.1111/eci.13839 |pmid=35869811 |pmc=9787955 |language=en |issn=0014-2972}}</ref> Feedback to studies can be found via altmetrics, which are often integrated at the website of the study – most often as an embedded [[Altmetrics]] badge – but may be incomplete, such as only showing social media discussions that link to the study directly but not those that link to news reports about the study.
;Tools used, modified, extended or investigated
Tools may be developed through metaresearch, or may be used or investigated by it. Notable examples include:
* {{anchor|scite.ai}}The tool scite.ai aims to track and link citations of papers as 'Supporting', 'Mentioning' or 'Contrasting' the study.<ref>{{cite news |last1=Khamsi |first1=Roxanne |title=Coronavirus in context: Scite.ai tracks positive and negative citations for COVID-19 literature |url=https://www.nature.com/articles/d41586-020-01324-6 |access-date=19 February 2022 |journal=Nature |date=1 May 2020 |language=en |doi=10.1038/d41586-020-01324-6}}</ref><ref>{{cite journal |last1=Nicholson |first1=Josh M. |last2=Mordaunt |first2=Milo |last3=Lopez |first3=Patrice |last4=Uppala |first4=Ashish |last5=Rosati |first5=Domenic |last6=Rodrigues |first6=Neves P. |last7=Grabitz |first7=Peter |last8=Rife |first8=Sean C. |title=scite: A smart citation index that displays the context of citations and classifies their intent using deep learning |journal=Quantitative Science Studies |date=5 November 2021 |volume=2 |issue=3 |pages=882–898 |doi=10.1162/qss_a_00146|s2cid=232283218 }}</ref><ref name="newbot"/>
* The Scite Reference Check bot is an extension of scite.ai that scans new article PDFs "for references to retracted papers, and posts both the citing and retracted papers on Twitter" and also "flags when new studies cite older ones that have issued corrections, errata, withdrawals, or expressions of concern".<ref name="newbot">{{cite news |title=New bot flags scientific studies that cite retracted papers |url=https://www.nature.com/nature-index/news-blog/new-bot-flags-scientific-research-studies-that-cite-retracted-papers |website=Nature Index |date=2 February 2021 |access-date=25 January 2023 |language=en}}</ref> Studies have suggested as few as 4% of citations to retracted papers clearly recognize the retraction.<ref name="newbot"/>
* Search engines like [[Google Scholar]] are used to find studies and the notification service [[Google Alerts]] enables notifications for new studies matching specified search terms. Scholarly communication infrastructure includes search databases.<ref>{{cite book |last1=Chan |first1=Joel |last2=Lutters |first2=Wayne |last3=Schneider |first3=Jodi |last4=Kirsanow |first4=Karola |last5=Bessa |first5=Silvia |last6=Saunders |first6=Jonny L. |title=Companion Computer Supported Cooperative Work and Social Computing |chapter=Growing New Scholarly Communication Infrastructures for Sharing, Reusing, and Synthesizing Knowledge |date=8 November 2022 |pages=278–281 |doi=10.1145/3500868.3559398 |publisher=Association for Computing Machinery|isbn=9781450391900 |s2cid=253385733 }}</ref>
* The [[shadow library]] [[Sci-hub]] is a topic of metascience.<ref>{{cite journal |last1=Segado-Boj |first1=Francisco |last2=Martín-Quevedo |first2=Juan |last3=Prieto-Gutiérrez |first3=Juan-José |title=Jumping over the paywall: Strategies and motivations for scholarly piracy and other alternatives |journal=Information Development |date=12 December 2022 |doi=10.1177/02666669221144429 |s2cid=254564205 |language=en |issn=0266-6669|url=https://eprints.ucm.es/id/eprint/75874/1/Preprint_Segado-Boj_ID_2022.pdf }}</ref>
* [[Personal knowledge management]] systems for research, [[knowledge worker|knowledge]] and task management, such as tools for saving information in organized ways<ref>{{cite journal |last1=Gosztyla |first1=Maya |title=How to find, read and organize papers |url=https://www.nature.com/articles/d41586-022-01878-7 |journal=Nature |access-date=28 January 2023 |language=en |doi=10.1038/d41586-022-01878-7 |date=7 July 2022|pmid=35804061 |s2cid=250388551 }}</ref> with [[Comparison of note-taking software|multi-document]] [[text editor]]s for future use.<ref>{{cite book |last1=Fastrez |first1=Pierre |last2=Jacques |first2=Jerry |title=Human Interface and the Management of Information. Information and Knowledge Design |chapter=Managing References by Filing and Tagging: An Exploratory Study of Personal Information Management by Social Scientists |series=Lecture Notes in Computer Science |date=2015 |volume=9172 |pages=291–300 |doi=10.1007/978-3-319-20612-7_28 |publisher=Springer International Publishing |isbn=978-3-319-20611-0 |language=en}}</ref><ref>{{cite journal |last1=Chaudhry |first1=Abdus Sattar |last2=Alajmi |first2=Bibi M. |title=Personal information management practices: how scientists find and organize information |url=https://www.emerald.com/insight/content/doi/10.1108/GKMC-04-2022-0082/full/html |journal=Global Knowledge, Memory and Communication |doi=10.1108/GKMC-04-2022-0082 |date=1 January 2022|volume=ahead-of-print |issue=ahead-of-print |s2cid=253363619 }}</ref> Such systems – along with e.g. Web browsers ([[Tab (interface)#Development|tab add-ons]]<ref>{{cite book |last1=Chang |first1=Joseph Chee |last2=Kim |first2=Yongsung |last3=Miller |first3=Victor |last4=Liu |first4=Michael Xieyang |last5=Myers |first5=Brad A |last6=Kittur |first6=Aniket |title=The 34th Annual ACM Symposium on User Interface Software and Technology |chapter=Tabs.do: Task-Centric Browser Tab Management |date=12 October 2021 |pages=663–676 |doi=10.1145/3472749.3474777 |publisher=Association for Computing Machinery|isbn=9781450386357 |s2cid=237102658 }}</ref> etc.) and search software{{additional citation needed|date=January 2023}} – could be described as part of "mind-machine partnerships" that could be investigated by metascience for how they could improve science.<ref name="10.1126/science.aao0185"/>
* Scholia – efforts to open scholarly publication metadata and use it via Wikidata.<ref>{{cite journal |last1=Rasberry |first1=Lane |last2=Tibbs |first2=Sheri |last3=Hoos |first3=William |last4=Westermann |first4=Amy |last5=Keefer |first5=Jeffrey |last6=Baskauf |first6=Steven James |last7=Anderson |first7=Clifford |last8=Walker |first8=Philip |last9=Kwok |first9=Cherrie |last10=Mietchen |first10=Daniel |title=WikiProject Clinical Trials for Wikidata |date=4 April 2022 |doi=10.1101/2022.04.01.22273328|s2cid=247936371 |website=medRxiv }}</ref>
* Various software enables common metascientific practices such as bibliometric analysis.<ref>{{cite journal |last1=Moral-Muñoz |first1=José A. |last2=Herrera-Viedma |first2=Enrique |last3=Santisteban-Espejo |first3=Antonio |last4=Cobo |first4=Manuel J. |title=Software tools for conducting bibliometric analysis in science: An up-to-date review |journal=El Profesional de la Información |date=19 January 2020 |volume=29 |issue=1 |doi=10.3145/epi.2020.ene.03|s2cid=210926828 |hdl=10481/62406 |hdl-access=free }}</ref>
;Development
According to a study "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed due to reproducibility issues in science.<ref>{{cite news |title=A new replication crisis: Research that is less likely to be true is cited more |url=https://phys.org/news/2021-05-replication-crisis-true-cited.html |access-date=14 June 2021 |work=phys.org |language=en}}</ref><ref>{{cite journal |last1=Serra-Garcia |first1=Marta |last2=Gneezy |first2=Uri |title=Nonreplicable publications are cited more than replicable ones |journal=Science Advances |date=2021-05-01 |volume=7 |issue=21 |page=eabd1705 |doi=10.1126/sciadv.abd1705 |pmid=34020944 |pmc=8139580 |bibcode=2021SciA....7.1705S |language=en |issn=2375-2548}}</ref> A study suggests a tool for screening studies for early warning signs for research fraud.<ref>{{cite journal |last1=Parker |first1=Lisa |last2=Boughton |first2=Stephanie |last3=Lawrence |first3=Rosa |last4=Bero |first4=Lisa |title=Experts identified warning signs of fraudulent research: a qualitative study to inform a screening tool |journal=Journal of Clinical Epidemiology |date=1 November 2022 |volume=151 |pages=1–17 |doi=10.1016/j.jclinepi.2022.07.006|pmid=35850426 |s2cid=250632662 }}</ref>
===Medicine===
{{See also|Profit motive#Criticisms}}
Clinical research in medicine is often of low quality, and many studies cannot be replicated.<ref name="Ioannidis2016">{{cite journal | last1 = Ioannidis | first1 = JPA | year = 2016 | title = Why Most Clinical Research Is Not Useful | journal = PLOS Med | volume = 13 | issue = 6| page = e1002049 | doi = 10.1371/journal.pmed.1002049 | pmid = 27328301 | pmc = 4915619 | doi-access = free }}</ref><ref>{{cite journal|title=Contradicted and initially stronger effects in highly cited clinical research|last=Ioannidis JA|date=13 July 2005|journal=JAMA|volume=294|issue=2|pages=218–228|doi=10.1001/jama.294.2.218|pmid=16014596|doi-access=free}}</ref> An estimated 85% of research funding is wasted.<ref name="ChalmersGlasziou2009">{{cite journal|last1=Chalmers|first1=Iain|last2=Glasziou|first2=Paul|s2cid=11797088|year=2009|title=Avoidable waste in the production and reporting of research evidence|journal=The Lancet|volume=374|issue=9683|pages=86–89|doi=10.1016/S0140-6736(09)60329-9|issn=0140-6736|pmid=19525005|url=http://timetravel.mementoweb.org/memento/2009/http://www.thelancet.com/journals/lancet }}</ref> Additionally, widespread bias affects research quality.<ref>{{cite web |first1=Jeremy |last1=Hsu |title=Dark Side of Medical Research: Widespread Bias and Omissions |url=https://www.livescience.com/8365-dark-side-medical-research-widespread-bias-omissions.html |website=Live Science |date=24 June 2010 |access-date=24 May 2019}}</ref> The [[pharmaceutical companies|pharmaceutical industry]] exerts substantial influence on the design and execution of medical research. Conflicts of interest are common among authors of medical literature<ref>{{cite journal |title=Confronting conflict of interest |journal=Nature Medicine |date=November 2018 |volume=24 |issue=11 |page=1629 |doi=10.1038/s41591-018-0256-7 |pmid=30401866 |language=en |issn=1546-170X|doi-access=free }}</ref> and among editors of medical journals. While almost all medical journals require their authors to disclose conflicts of interest, editors are not required to do so.<ref>{{cite journal |last1=Haque |first1=Waqas |last2=Minhajuddin |first2=Abu |last3=Gupta |first3=Arjun |last4=Agrawal |first4=Deepak |title=Conflicts of interest of editors of medical journals |journal=PLOS ONE |date=2018 |volume=13 |issue=5 |page=e0197141 |doi=10.1371/journal.pone.0197141 |pmid=29775468 |pmc=5959187 |issn=1932-6203|bibcode=2018PLoSO..1397141H |doi-access=free }}</ref> Financial [[conflicts of interest]] have been linked to higher rates of positive study results. In antidepressant trials, pharmaceutical sponsorship is the best predictor of trial outcome.<ref>{{cite journal |last1=Moncrieff |first1=J |title=The antidepressant debate. |journal=The British Journal of Psychiatry |date=March 2002 |volume=180 |issue=3 |pages=193–194 |pmid=11872507 |language=en |issn=0007-1250|doi=10.1192/bjp.180.3.193 |doi-access=free }}</ref>
[[blinded experiment|Blinding]] is another focus of meta-research, as error caused by poor blinding is a source of [[bias|experimental bias]]. Blinding is not well reported in medical literature, and widespread misunderstanding of the subject has resulted in poor implementation of blinding in [[clinical trial]]s.<ref>{{cite journal |last1=Bello |first1=S |last2=Moustgaard |first2=H |last3=Hróbjartsson |first3=A |title=The risk of unblinding was infrequently and incompletely reported in 300 randomized clinical trial publications. |journal=Journal of Clinical Epidemiology |date=October 2014 |volume=67 |issue=10 |pages=1059–1069 |doi=10.1016/j.jclinepi.2014.05.007 |pmid=24973822 |issn=1878-5921}}</ref> Furthermore, [[unblinding|failure of blinding]] is rarely measured or reported.<ref>{{cite journal |last1=Tuleu |first1=Catherine |last2=Legay |first2=Helene |last3=Orlu-Gul |first3=Mine |last4=Wan |first4=Mandy |title=Blinding in pharmacological trials: the devil is in the details |journal=Archives of Disease in Childhood |date=1 September 2013 |volume=98 |issue=9 |pages=656–659 |doi=10.1136/archdischild-2013-304037 |pmid=23898156 |pmc=3833301 |language=en |issn=0003-9888}}</ref> Research showing the failure of blinding in [[antidepressant]] trials has led some scientists to argue that antidepressants are no better than [[placebo]].<ref>{{cite journal |last1=Kirsch |first1=I |title=Antidepressants and the Placebo Effect. |journal=Zeitschrift für Psychologie |date=2014 |volume=222 |issue=3 |pages=128–134 |doi=10.1027/2151-2604/a000176 |pmid=25279271 |pmc=4172306 |issn=2190-8370}}</ref><ref>{{cite journal |last1=Ioannidis |first1=John PA |title=Effectiveness of antidepressants: an evidence myth constructed from a thousand randomized trials? |journal=Philosophy, Ethics, and Humanities in Medicine |date=27 May 2008 |volume=3 |page=14 |doi=10.1186/1747-5341-3-14 |pmid=18505564 |pmc=2412901 |issn=1747-5341 |doi-access=free }}</ref> In light of meta-research showing failures of blinding, [[CONSORT]] standards recommend that all clinical trials assess and report the quality of blinding.<ref name=":0">{{cite journal |last1=Moher |first1=David |last2=Altman |first2=Douglas G. |last3=Schulz |first3=Kenneth F. |title=CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials |journal=BMJ |date=24 March 2010 |volume=340 |page=c332 |doi=10.1136/bmj.c332 |pmid=20332509 |pmc=2844940 |language=en |issn=0959-8138}}</ref>
Studies have shown that systematic reviews of existing research evidence are sub-optimally used in planning new research or summarizing results.<ref name=pmid9676682>{{cite journal |doi=10.1001/jama.280.3.280 |pmid=9676682 |title=Discussion Sections in Reports of Controlled Trials Published in General Medical Journals |journal=JAMA |volume=280 |issue=3 |pages=280–282 |year=1998 |last1=Clarke |first1=Michael |last2=Chalmers |first2=Iain |doi-access=free }}</ref> Cumulative meta-analyses of studies evaluating the effectiveness of medical interventions have shown that many clinical trials could have been avoided if a systematic review of existing evidence had been done prior to conducting a new trial.<ref name=pmid1614465>{{cite journal |doi=10.1056/NEJM199207233270406 |pmid=1614465 |title=Cumulative Meta-Analysis of Therapeutic Trials for Myocardial Infarction |journal=New England Journal of Medicine |volume=327 |issue=4 |pages=248–254 |year=1992 |last1=Lau |first1=Joseph |last2=Antman |first2=Elliott M |last3=Jimenez-Silva |first3=Jeanette |last4=Kupelnick |first4=Bruce |last5=Mosteller |first5=Frederick |last6=Chalmers |first6=Thomas C |doi-access=free}}</ref><ref name=pmid16279145>{{cite journal |doi=10.1191/1740774505cn085oa |pmid=16279145 |title=Randomized controlled trials of aprotinin in cardiac surgery: Could clinical equipoise have stopped the bleeding? |journal=Clinical Trials|volume=2 |issue=3 |pages=218–229; discussion 229–232 |year=2016 |last1=Fergusson |first1=Dean |last2=Glass |first2=Kathleen Cranley |last3=Hutton |first3=Brian |last4=Shapiro |first4=Stan |s2cid=31375469 }}</ref><ref name=pmid25068257>{{cite journal |doi=10.1371/journal.pone.0102670 |pmid=25068257 |pmc=4113310 |title=Accumulating Research: A Systematic Account of How Cumulative Meta-Analyses Would Have Provided Knowledge, Improved Health, Reduced Harm and Saved Resources |journal=PLOS ONE |volume=9 |issue=7 |page=e102670 |year=2014 |last1=Clarke |first1=Mike |last2=Brice |first2=Anne |last3=Chalmers |first3=Iain |bibcode=2014PLoSO...9j2670C |doi-access=free }}</ref> For example, Lau et al.<ref name=pmid1614465/> analyzed 33 clinical trials (involving 36,974 patients) evaluating the effectiveness of intravenous [[streptokinase]] for [[Myocardial infarction|acute myocardial infarction]]. Their cumulative meta-analysis demonstrated that 25 of the 33 trials could have been avoided if a systematic review had been conducted before each new trial. In other words, randomizing 34,542 patients was potentially unnecessary. One study<ref name="pmid21200038">{{cite journal |doi=10.7326/0003-4819-154-1-201101040-00007 |pmid=21200038 |title=A Systematic Examination of the Citation of Prior Research in Reports of Randomized, Controlled Trials |journal=Annals of Internal Medicine |volume=154 |issue=1 |pages=50–55 |year=2011 |last1=Robinson |first1=Karen A |last2=Goodman |first2=Steven N |s2cid=207536137 }}</ref> analyzed 1,523 clinical trials included in 227 [[Meta-analysis|meta-analyses]] and concluded that "less than one quarter of relevant prior studies" were cited. They also confirmed earlier findings that most clinical trial reports do not present a systematic review to justify the research or summarize the results.<ref name="pmid21200038" />
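In a cumulative meta-analysis, the pooled estimate is recomputed each time a new trial is added, showing when the evidence first became conclusive. A minimal fixed-effect (inverse-variance) Python sketch with invented trial results:
<syntaxhighlight lang="python">
import math

# Invented trials as (effect estimate, standard error), in order of
# publication; a real analysis would use extracted trial data.
trials = [(-0.30, 0.20), (-0.25, 0.15), (-0.28, 0.10), (-0.05, 0.25)]

sum_w = sum_wy = 0.0
for i, (effect, se) in enumerate(trials, start=1):
    w = 1.0 / se ** 2            # inverse-variance weight
    sum_w += w
    sum_wy += w * effect
    pooled = sum_wy / sum_w
    ci = 1.96 * math.sqrt(1.0 / sum_w)
    print(f"after trial {i}: pooled effect {pooled:+.3f} ± {ci:.3f}")
</syntaxhighlight>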
Many treatments used in modern medicine have been proven to be ineffective, or even harmful. A 2007 study by John Ioannidis found that it took an average of ten years for the medical community to stop referencing popular practices after their efficacy was unequivocally disproven.<ref>{{cite web |last1=Epstein |first1=David |title=When Evidence Says No, but Doctors Say Yes - The Atlantic |url=https://getpocket.com/explore/item/when-evidence-says-no-but-doctors-say-yes |website=Pocket |access-date=10 April 2020}}</ref><ref>{{cite journal |last1=Tatsioni |first1=A |last2=Bonitsis |first2=NG |last3=Ioannidis |first3=JP |title=Persistence of contradicted claims in the literature. |journal=JAMA |date=5 December 2007 |volume=298 |issue=21 |pages=2517–2526 |doi=10.1001/jama.298.21.2517 |pmid=18056905 |issn=1538-3598|doi-access=free }}</ref>
===Psychology===
{{Further|Replication crisis}}
Metascience has revealed significant problems in psychological research. The field suffers from high bias, low [[reproducibility]], and widespread [[misuse of statistics]].<ref>{{cite journal|last1=Franco|first1=Annie|last2=Malhotra|first2=Neil|author-link2=Neil Malhotra|last3=Simonovits|first3=Gabor|s2cid=143182733|date=1 January 2016|title=Underreporting in Psychology Experiments: Evidence From a Study Registry|journal=Social Psychological and Personality Science|language=en|volume=7|issue=1|pages=8–12|doi=10.1177/1948550615598377|issn=1948-5506}}</ref><ref>{{cite journal|last1=Munafò|first1=Marcus|date=29 March 2017|title=Metascience: Reproducibility blues|journal=Nature|language=en|volume=543|issue=7647|pages=619–620|doi=10.1038/543619a|issn=1476-4687|bibcode=2017Natur.543..619M|doi-access=free}}</ref><ref>{{Cite journal | doi=10.1126/science.aav4784 |title = This research group seeks to expose weaknesses in science{{snd}}and they'll step on some toes if they have to|journal = Science|date =20 September 2018|last1 = Stokstad|first1 = Erik|s2cid = 158525979}}</ref> The replication crisis affects [[psychology]] more strongly than any other field; as many as two-thirds of highly publicized findings may be impossible to replicate.<ref>{{cite journal |doi=10.1126/science.aac4716 |pmid=26315443 |title=Estimating the reproducibility of psychological science |journal=Science |volume=349 |issue=6251 |page=aac4716 |year=2015 |url=http://eprints.keele.ac.uk/877/1/Open%20Science%20%28Science%20Pre-Print%29.pdf |author1=Open Science Collaboration |s2cid=218065162 |hdl=10722/230596 |hdl-access=free }}</ref> Meta-research finds that 80–95% of psychological studies support their initial hypotheses, which strongly implies the existence of [[publication bias]].<ref name=":1" />
The replication crisis has led to renewed efforts to re-test important findings.<ref name="Simmons et al. (2011)">{{cite journal |doi=10.1177/0956797611417632 |pmid=22006061 |title=False-Positive Psychology |journal=Psychological Science |volume=22 |issue=11 |pages=1359–1366 |year=2011 |last1=Simmons |first1=Joseph P. |last2=Nelson |first2=Leif D. |last3=Simonsohn |first3=Uri |doi-access=free }}</ref><ref>{{cite journal |doi=10.1177/1745691613514450 |pmid=26173241 |title=The Alleged Crisis and the Illusion of Exact Replication |journal=Perspectives on Psychological Science |volume=9 |issue=1 |pages=59–71 |year=2014 |last1=Stroebe |first1=Wolfgang |last2=Strack |first2=Fritz |s2cid=31938129 |url=https://pure.rug.nl/ws/files/12588700/postprint_Stroebe_Strack_2014.pdf }}</ref> In response to concerns about [[publication bias]] and [[Data dredging|''p''-hacking]], more than 140 psychology journals have adopted [[Scholarly peer review#Result-blind peer review|result-blind peer review]], in which studies are [[Registered report|pre-registered]] and accepted for publication without regard to their outcome.<ref>{{cite journal|last=Aschwanden|first=Christie|title=Psychology's Replication Crisis Has Made The Field Better|website=[[FiveThirtyEight]]|date=6 December 2018|access-date=19 December 2018|url=https://fivethirtyeight.com/features/psychologys-replication-crisis-has-made-the-field-better/}}</ref> An analysis of these reforms estimated that 61 percent of result-blind studies yield [[null result]]s, compared with an estimated 5 to 20 percent in earlier research, suggesting that result-blind peer review substantially reduces publication bias.<ref name=":1">{{cite journal |doi=10.31234/osf.io/3czyt |title=Open Science challenges, benefits and tips in early career and beyond |last1=Allen |first1=Christopher P G. |last2=Mehler |first2=David Marc Anton |s2cid=240061030 |url=http://psyarxiv.com/3czyt/ }}</ref>
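The contrast between the two publication regimes can be reproduced in a toy simulation: under traditional review, null results are largely filtered out before publication, while under result-blind review the published literature mirrors the underlying outcome rates. The parameters below are assumptions chosen to land near the reported figures, not estimates from the cited analysis.

<syntaxhighlight lang="python">
import random

random.seed(1)

# Assumed parameters: half of tested hypotheses are true, power is 0.7,
# and the false-positive rate is 0.05. All values are illustrative.
def study_is_significant():
    is_true = random.random() < 0.5
    return random.random() < (0.7 if is_true else 0.05)

results = [study_is_significant() for _ in range(100_000)]

# Traditional review: assume a null result reaches print only 5% of the time.
published = [r for r in results if r or random.random() < 0.05]
traditional_null_share = published.count(False) / len(published)

# Result-blind review: publication is decided before results exist,
# so every outcome is published.
blind_null_share = results.count(False) / len(results)

print(f"Null results, traditional review: {traditional_null_share:.0%}")  # ~8%
print(f"Null results, result-blind review: {blind_null_share:.0%}")       # ~62%
</syntaxhighlight>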
Psychologists routinely conflate [[statistical significance]] with practical importance, reporting great certainty about effects of negligible size.<ref>{{cite journal|last1=Cohen|first1=Jacob|s2cid=380942|year=1994|title=The earth is round (p < .05)|journal=American Psychologist|volume=49|issue=12|pages=997–1003|doi=10.1037/0003-066X.49.12.997}}</ref> Some psychologists have responded by making greater use of [[effect size]] statistics rather than relying solely on [[P value|''p'' values]].{{Citation needed|date=June 2010}}
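The distinction is easy to demonstrate: with a large enough sample, an effect far too small to matter in practice still produces a very small ''p'' value. A minimal sketch using NumPy and SciPy, on synthetic data chosen purely for illustration:

<syntaxhighlight lang="python">
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two groups whose true means differ by only 0.02 standard deviations,
# a negligible difference in practical terms.
a = rng.normal(0.00, 1.0, 500_000)
b = rng.normal(0.02, 1.0, 500_000)

t, p = stats.ttest_ind(a, b)
# Cohen's d: standardized mean difference (pooled SD is about 1 here).
d = (b.mean() - a.mean()) / np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
print(f"p = {p:.2g}, Cohen's d = {d:.3f}")
# Prints a "highly significant" p alongside d near 0.02:
# statistical significance without practical importance.
</syntaxhighlight>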
===Physics===
[[Richard Feynman]] noted that estimates of [[physical constant]]s were closer to published values than would be expected by chance. This was believed to be the result of [[confirmation bias]]: results that agreed with the existing literature were more likely to be believed, and therefore published. Physicists now implement blinding to prevent this kind of bias, for example by hiding the true result behind an offset that is removed only after the analysis is complete.<ref>{{cite journal |last1=MacCoun |first1=Robert |last2=Perlmutter |first2=Saul |title=Blind analysis: Hide results to seek the truth |journal=Nature |volume=526 |issue=7572 |pages=187–189 |language=en |doi=10.1038/526187a |pmid=26450040 |date=8 October 2015|bibcode=2015Natur.526..187M |doi-access=free }}</ref>
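A common blinding scheme, in line with the "hide results" approach described by MacCoun and Perlmutter, adds a secret offset to the data so that analysts cannot steer their choices toward the expected value; the offset is removed only once the analysis is frozen. A minimal sketch of the idea (the mechanism shown is illustrative, not a specific experiment's procedure):

<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng()           # analysts must never inspect this offset
_blind_offset = rng.uniform(-1.0, 1.0)

def blind(measurements):
    """Shift raw measurements by a hidden constant."""
    return [m + _blind_offset for m in measurements]

def unblind(result):
    """Remove the offset; run exactly once, after the analysis is frozen."""
    return result - _blind_offset

raw = [3.02, 2.98, 3.05, 2.99]          # hypothetical measurements
blinded = blind(raw)                    # all cuts and fits use blinded data
estimate = sum(blinded) / len(blinded)
print("Final value:", unblind(estimate))  # revealed only at the end
</syntaxhighlight>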
===Computer science===
Web measurement studies are essential for understanding the workings of the modern Web, particularly in the fields of security and privacy. Such studies often require custom-built or modified crawling setups, which has produced a plethora of single-purpose analysis tools for similar tasks. Demir et al. surveyed 117 recent research papers to derive best practices for Web-based measurement studies and to establish criteria for reproducibility and replicability. They found that experimental setups and other information critical for reproducing and replicating results are often missing. In a large-scale study crawling 4.5 million pages with 24 different measurement setups, they also demonstrated that slight differences in experimental setup can change the overall results, underscoring the need for accurate and comprehensive documentation.<ref>{{cite conference |last1=Demir |first1=Nurullah |last2=Große-Kampmann |first2=Matteo |last3=Urban |first3=Tobias |last4=Wressnegger |first4=Christian |last5=Holz |first5=Thorsten |last6=Pohlmann |first6=Norbert |title=Reproducibility and Replicability of Web Measurement Studies |year=2022 |publisher=Association for Computing Machinery |location=New York |url=https://doi.org/10.1145/3485447.3512214 |doi=10.1145/3485447.3512214 |book-title=Proceedings of the ACM Web Conference 2022 |pages=533–544 |series=WWW '22 }}</ref>
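A practical consequence of such findings is that measurement studies should archive their full experimental configuration alongside the results. The sketch below shows what such a record might contain; the field names are hypothetical and are not a schema from the Demir et al. paper.

<syntaxhighlight lang="python">
import json
import platform
from datetime import datetime, timezone

# Hypothetical record of a crawl configuration (illustrative fields only).
setup = {
    "crawler": "custom-headless-chromium",     # tool and variant used
    "browser_version": "chromium-120.0",       # exact engine version
    "user_agent": "Mozilla/5.0 (...)",         # as sent to servers
    "vantage_point": "university network, DE", # network location of the crawler
    "page_list": "tranco-top-1m-2022-01-15",   # which ranking snapshot was crawled
    "wait_after_load_s": 5,                    # timing choices can change results
    "runs_per_page": 3,
    "started_at": datetime.now(timezone.utc).isoformat(),
    "host_os": platform.platform(),
}

with open("measurement_setup.json", "w") as f:
    json.dump(setup, f, indent=2)
</syntaxhighlight>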
==Organizations and institutes==
{{Further|List of metascience research centers}}
Several organizations and universities across the globe work on meta-research; these include the Meta-Research Innovation Center Berlin,<ref>{{Cite web|last=Berlin|first=Meta-Research Innovation Center|title=Meta-Research Innovation Center Berlin|url=https://www.metricberlin.bihealth.org/|access-date=2021-12-06|website=Meta-Research Innovation Center Berlin|language=en-us}}</ref> the [[Meta-Research Innovation Center at Stanford]],<ref>{{Cite web|title=Home {{!}} Meta-research Innovation Center at Stanford|url=https://metrics.stanford.edu/|access-date=2021-12-06|website=metrics.stanford.edu}}</ref> the [[Meta-Research Center at Tilburg University]], the Meta-research and Evidence Synthesis Unit at The George Institute for Global Health, India,<ref>{{Cite web|title=Meta-research and Evidence Synthesis Unit|url=https://www.georgeinstitute.org.in/units/meta-research-and-evidence-synthesis-unit|access-date=2021-12-19|website=The George Institute for Global Health|language=en}}</ref><!--the Sheffield Metascience Network https://www.sheffield.ac.uk/is/research/centres/metanet--> and the [[Center for Open Science]]. Organizations that develop tools for metascience include [[Our Research]], the [[Center for Scientific Integrity]], and [[Altmetrics#Adoption|altmetrics companies]]. The Association for Interdisciplinary Meta-Research and Open Science (AIMOS) hosts an annual metascience conference, and the Center for Open Science hosts a biennial one.<ref>{{cite web |title=AIMOS 2022 |url=https://www.eventcreate.com/e/aimos2022 |website=AIMOS 2022 |access-date=20 March 2023}}</ref><ref>{{cite web |title=Metascience 2023 |url=https://metascience.info/ |website=Metascience 2023 Conference |access-date=20 March 2023}}</ref>
== See also ==
{{columns-list|colwidth=18em|
* [[Accelerating change]]
* [[Basic research]]
* [[Citation analysis]]
* [[Epistemology]]
* [[Evidence-based medicine]]
* [[Evidence-based policy]]
* [[Evidence-based practices]]
* [[Further research is needed]]
* [[HARKing]]
* [[Logology (science)]]
* {{slink|Metadata#Science}}
* [[Metatheory]]
* [[Open science]]
* [[Philosophy of science]]
* [[Self-Organized Funding Allocation]]
* [[Sociology of scientific knowledge]]
}}
== References ==
{{Reflist}}
==Further reading==
* Bonett, D. G. (2021). Design and analysis of replication studies. ''Organizational Research Methods'', 24, 513–529. https://doi.org/10.1177/1094428120911088
* Lydia Denworth, "[https://www.scientificamerican.com/article/the-significant-problem-of-p-values/ A Significant Problem: Standard scientific methods are under fire. Will anything change?]", ''[[Scientific American]]'', vol. 321, no. 4 (October 2019), pp. 62–67.
**"The use of [[p value|''p'' values]] for nearly a century [since 1925] to determine [[statistical significance]] of [[experiment]]al results has contributed to an illusion of [[certainty]] and [to] [[reproducibility|reproducibility crises]] in many [[science|scientific fields]]. There is growing determination to reform statistical analysis... Some [researchers] suggest changing statistical methods, whereas others would do away with a threshold for defining "significant" results." (p. 63.)
* {{cite book | last=Harris | first=Richard | title=Rigor Mortis: How Sloppy Science Creates Worthless Cures, Crushes Hopes, and Wastes Billions | year=2017 | url=https://books.google.com/books?id=lk5IDgAAQBAJ |publisher=Basic Books|isbn=978-0465097913}}
* {{cite journal |last1=Fortunato |first1=Santo |last2=Bergstrom |first2=Carl T. |display-authors=et al. |title=Science of science |journal=Science |date=2 March 2018 |volume=359 |issue=6379 |page=eaao0185 |doi=10.1126/science.aao0185 |pmid=29496846 |pmc=5949209 |url=https://www.researchgate.net/publication/323502497}}
==External links==
'''Journals'''
* ''[https://www.springer.com/journal/11024 Minerva: A Journal of Science, Learning and Policy]''
* ''[https://researchintegrityjournal.biomedcentral.com/ Research Integrity and Peer Review]''
* ''[https://www.journals.elsevier.com/research-policy Research Policy]''
* ''[https://web.archive.org/web/20120405101059/http://spp.oxfordjournals.org/ Science and Public Policy]''
'''Conferences'''
* ''[https://metascience.info/ Annual Metascience Conference]''
{{Evidence-based practice}}
{{Science and technology studies}}
{{Meta-prefix}}
{{DEFAULTSORT:Metascience}}
[[Category:Metascience| ]]
[[Category:Epistemology of science]]
[[Category:Ethics and statistics]]
[[Category:Evidence-based practices]]
[[Category:Metatheory of science]]
[[Category:Research]]
[[Category:Science policy]]
[[Category:Scientific method]]' |