Metascience

From Wikipedia, the free encyclopedia

Metascience (also known as meta-research) is the use of scientific methodology to study science itself. Metascience seeks to increase the quality of scientific research while reducing inefficiency. It is also known as "research on research" and "the science of science", as it uses research methods to study how research is done and find where improvements can be made. Metascience concerns itself with all fields of research and has been described as "a bird's eye view of science".[1] In the words of John Ioannidis, "Science is the best thing that has happened to human beings ... but we can do it better."[2]

In 1966, an early meta-research paper examined the statistical methods of 295 papers published in ten high-profile medical journals. It found that "in almost 73% of the reports read ... conclusions were drawn when the justification for these conclusions was invalid." Meta-research in the following decades found many methodological flaws, inefficiencies, and poor practices in research across numerous scientific fields. Many scientific studies could not be reproduced, particularly in medicine and the soft sciences. The term "replication crisis" was coined in the early 2010s as part of a growing awareness of the problem.[3]

Measures have been implemented to address the issues revealed by metascience. These measures include the pre-registration of scientific studies and clinical trials as well as the founding of organizations such as CONSORT and the EQUATOR Network that issue guidelines for methodology and reporting. There are continuing efforts to reduce the misuse of statistics, to eliminate perverse incentives from academia, to improve the peer review process, to systematically collect data about the scholarly publication system,[4] to combat bias in scientific literature, and to increase the overall quality and efficiency of the scientific process.

History

John Ioannidis (2005), "Why Most Published Research Findings Are False"[5]

In 1966, an early meta-research paper examined the statistical methods of 295 papers published in ten high-profile medical journals. It found that, "in almost 73% of the reports read ... conclusions were drawn when the justification for these conclusions was invalid."[6] In 2005, John Ioannidis published a paper titled "Why Most Published Research Findings Are False", which argued that a majority of papers in the medical field produce conclusions that are wrong.[5] The paper went on to become the most downloaded paper in the Public Library of Science[7][8] and is considered foundational to the field of metascience.[9] In a related study with Jeremy Howick and Despina Koletsi, Ioannidis showed that only a minority of medical interventions are supported by 'high quality' evidence according to the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach.[10] Later meta-research identified widespread difficulty in replicating results in many scientific fields, including psychology and medicine. This problem was termed "the replication crisis". Metascience has grown as a reaction to the replication crisis and to concerns about waste in research.[11]

Many prominent publishers are interested in meta-research and in improving the quality of their publications. Top journals such as Science, The Lancet, and Nature provide ongoing coverage of meta-research and problems with reproducibility.[12] In 2012, PLOS ONE launched a Reproducibility Initiative. In 2015, BioMed Central introduced a minimum-standards-of-reporting checklist to four titles.

The first international conference in the broad area of meta-research was the Research Waste/EQUATOR conference held in Edinburgh in 2015; the first international conference on peer review was the Peer Review Congress held in 1989.[13] In 2016, Research Integrity and Peer Review was launched. The journal's opening editorial called for "research that will increase our understanding and suggest potential solutions to issues related to peer review, study reporting, and research and publication ethics".[14]

Fields and topics of meta-research

An illustrative visualization of a layered conception of scientific knowledge generation, with the "Institution of Science" as the subject of metascience.

Metascience can be categorized into five major areas of interest: Methods, Reporting, Reproducibility, Evaluation, and Incentives. These correspond, respectively, with how to perform, communicate, verify, evaluate, and reward research.[15]

Methods

Metascience seeks to identify poor research practices – including biases in research, poor study design, and abuse of statistics – and to find methods to reduce these practices.[15] Meta-research has identified numerous biases in scientific literature.[16] Of particular note is the widespread misuse of p-values and abuse of statistical significance.[17]
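
The mechanics of this misuse are easy to demonstrate. The following is a minimal illustrative simulation (not drawn from any study cited here) showing how testing many hypotheses on pure noise reliably produces "significant" p-values at the conventional 0.05 threshold:

```python
# Illustrative sketch: 100 independent tests on pure noise yield roughly
# 5 false positives at p < 0.05, i.e. "significant" findings with no effect.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_tests = 100
false_positives = 0
for _ in range(n_tests):
    control = rng.normal(0, 1, 30)    # both groups drawn from the
    treatment = rng.normal(0, 1, 30)  # same distribution: no real effect
    if stats.ttest_ind(control, treatment).pvalue < 0.05:
        false_positives += 1
print(f"{false_positives}/{n_tests} 'significant' findings despite no true effect")
```

Running many such tests and reporting only the "significant" ones is one form of the data dredging that metascience documents.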

Causality

The field of causality has evolved in an attempt to understand reality in terms of data-generating models rather than distributions of observed variables. This is pertinent to machine learning, where trustworthy AI systems capable of explaining the logic behind their predictions are becoming a focus, especially in applications involving highly sensitive data such as finance. The term black box is commonly used in this field to describe models whose internal structure lacks interpretability. Causal discovery and causal inference both provide axiom-based frameworks for such problems: the former identifies the causal structure inherent in a dataset, while the latter estimates the causal effect of a specific variable (a treatment) on an outcome of interest. Causal reasoning is also becoming more common in empirical research such as the social sciences, where causal conclusions are often drawn from observational data without conclusive justification.
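
As a rough illustration of the causal-inference side, the following sketch (with invented variable names and coefficients) shows how a naive regression is biased by a confounder, while adjusting for the confounder recovers the true causal effect:

```python
# Hedged sketch: estimating the effect of treatment T on outcome Y when a
# confounder Z influences both. All data and coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
z = rng.normal(size=n)                        # confounder
t = 0.8 * z + rng.normal(size=n)              # treatment depends on Z
y = 1.5 * t + 2.0 * z + rng.normal(size=n)    # true causal effect of T is 1.5

# Naive regression of Y on T alone is biased by the backdoor path T <- Z -> Y.
naive_slope = np.polyfit(t, y, 1)[0]

# Adjusting for Z (multiple regression) recovers the causal coefficient.
X = np.column_stack([t, z, np.ones(n)])
coefs, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"naive estimate: {naive_slope:.2f}, adjusted estimate: {coefs[0]:.2f}")
```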

Journalology

Journalology, also known as publication science, is the scholarly study of all aspects of the academic publishing process.[18][19] The field seeks to improve the quality of scholarly research by implementing evidence-based practices in academic publishing.[20] The term "journalology" was coined by Stephen Lock, the former editor-in-chief of The BMJ. The first Peer Review Congress, held in 1989 in Chicago, Illinois, is considered a pivotal moment in the founding of journalology as a distinct field.[20] The field of journalology has been influential in pushing for study pre-registration in science, particularly in clinical trials. Clinical-trial registration is now expected in most countries.[20]

Reporting

Meta-research has identified poor practices in reporting, explaining, disseminating and popularizing research, particularly within the social and health sciences. Poor reporting makes it difficult to accurately interpret the results of scientific studies, to replicate studies, and to identify biases and conflicts of interest in the authors. Solutions include the implementation of reporting standards, and greater transparency in scientific studies (including better requirements for disclosure of conflicts of interest). There is an attempt to standardize reporting of data and methodology through the creation of guidelines by reporting agencies such as CONSORT and the larger EQUATOR Network.[15]

Reproducibility

Barriers to conducting replications of experiments in cancer research, The Reproducibility Project: Cancer Biology

The replication crisis is an ongoing methodological crisis in which it has been found that many scientific studies are difficult or impossible to replicate.[21][22] While the crisis has its roots in the meta-research of the mid- to late-1900s, the phrase "replication crisis" was not coined until the early 2010s[23] as part of a growing awareness of the problem.[15] The replication crisis particularly affects psychology (especially social psychology) and medicine,[24][25] including cancer research.[26][27] Replication is an essential part of the scientific process, and the widespread failure of replication puts into question the reliability of affected fields.[28]

Moreover, replication of research (or failure to replicate) is considered less influential than original research, and is less likely to be published in many fields. This discourages the reporting of, and even attempts to replicate, studies.[29][30]

Evaluation and incentives

Metascience seeks to create a scientific foundation for peer review. Meta-research evaluates peer review systems including pre-publication peer review, post-publication peer review, and open peer review. It also seeks to develop better research funding criteria.[15]

Metascience seeks to promote better research through better incentive systems. This includes studying the accuracy, effectiveness, costs, and benefits of different approaches to ranking and evaluating research and those who perform it.[15] Critics argue that perverse incentives have created a publish-or-perish environment in academia which promotes the production of junk science, low quality research, and false positives.[31][32] According to Brian Nosek, "The problem that we face is that the incentive system is focused almost entirely on getting research published, rather than on getting research right."[33] Proponents of reform seek to structure the incentive system to favor higher-quality results.[34]

Studies have proposed machine-readable standards for science publication management systems that home in on contributorship – who has contributed what and how much of the research labor – rather than the traditional concept of plain authorship, i.e. who was involved in any way in the creation of a publication.[35][36][37] A study pointed out one of the problems associated with the ongoing neglect of such contribution nuance: it found that "the number of publications has ceased to be a good metric as a result of longer author lists, shorter papers, and surging publication numbers".[38]
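
As a hedged sketch, a machine-readable contributorship record in the spirit of the CRediT taxonomy cited above might look like the following; the field names are illustrative, not a mandated schema:

```python
# Illustrative contributorship metadata using CRediT-style role names.
contributions = [
    {"author": "A. Researcher", "roles": ["Conceptualization", "Methodology", "Writing - original draft"]},
    {"author": "B. Analyst", "roles": ["Software", "Formal analysis"]},
    {"author": "C. Supervisor", "roles": ["Supervision", "Funding acquisition"]},
]

# Structured records let evaluators credit specific research labor instead
# of treating every listed author identically.
for person in contributions:
    print(person["author"], "->", ", ".join(person["roles"]))
```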

Factors other than a submission's merits can substantially influence peer reviewers' evaluations.[39] Some such factors may nevertheless be relevant, such as track records of the veracity of a researcher's prior publications and their alignment with public interests. Still, evaluation systems – including those of peer review – may largely lack mechanisms and criteria oriented towards merit, real-world positive impact, progress and public usefulness, rather than towards analytical indicators such as citation counts or altmetrics, even when the latter can serve as partial indicators of such ends.[40][41]

Scientometrics

Scientometrics concerns itself with measuring bibliographic data in scientific publications. Major research issues include the measurement of the impact of research papers and academic journals, the understanding of scientific citations, and the use of such measurements in policy and management contexts.[42] Studies suggest that "metrics used to measure academic success, such as the number of publications, citation number, and impact factor, have not changed for decades" and have to some degree "ceased" to be good measures.[38]
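
For example, one such metric, the two-year journal impact factor, reduces to a simple ratio; the figures below are invented for illustration:

```python
# Illustrative computation of a two-year journal impact factor.
def impact_factor(citations_this_year_to_prev2, items_published_prev2):
    """Citations received this year to items the journal published in the
    two prior years, divided by the number of citable items in those years."""
    return citations_this_year_to_prev2 / items_published_prev2

# e.g. 950 citations in 2022 to the 400 articles published in 2020-2021:
print(round(impact_factor(950, 400), 2))  # -> 2.38
```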

Science governance

Science funding and science governance can also be explored by metascience.[43]

Incentives

Various interventions, such as prioritization, can be important. For instance, the concept of differential technological development refers to deliberately developing technologies – e.g. control, safety and policy technologies versus risky biotechnologies – at different precautionary paces to decrease risks, mainly global catastrophic risk, by influencing the sequence in which technologies are developed.[44][45] Relying only on established forms of legislation and incentives to ensure the right outcomes may not be adequate, as these may often be too slow[46] or inappropriate.

Other incentives to govern science and related processes, including via metascience-based reforms, may include ensuring accountability to the public (for example, the accessibility of research, especially publicly funded research, or whether research addresses topics of public interest in serious ways), increasing the qualified productive scientific workforce, improving the efficiency of science to improve problem-solving in general, and ensuring that unambiguous societal needs based on solid scientific evidence – such as about human physiology – are adequately prioritized and addressed. Such interventions, incentives and intervention designs can be subjects of metascience.

Science funding and awards
Cluster network of scientific publications in relation to Nobel prizes.

Scientific awards are one category of science incentives. Metascience can explore existing and hypothetical systems of science awards. For instance, it found that work honored by Nobel prizes clusters in only a few scientific fields: of the 114 domains (DC2 classification) and 849 domains (DC3 classification) into which science can be divided, only 36 and 71, respectively, had received at least one Nobel prize. Five of the 114 domains were shown to make up over half of the Nobel prizes awarded 1995–2017 (particle physics [14%], cell biology [12.1%], atomic physics [10.9%], neuroscience [10.1%], molecular chemistry [5.3%]).[47][48]

A study found that the common model – a centralized, authority-based top-down approach in which policy-makers delegate responsibility for knowledge production to science, provide appropriate funding, and expect science to somehow deliver "reliable and useful knowledge to society" – is too simple.[43]

Measurements show that the allocation of biomedical resources can be more strongly correlated with previous allocations and research than with the burden of disease.[49]

A study suggests that "[i]f peer review is maintained as the primary mechanism of arbitration in the competitive selection of research reports and funding, then the scientific community needs to make sure it is not arbitrary".[39]

A study suggests there to be a need to "reconsider how we measure success" (see #Factors of success and progress).[38]

Science communication and public use

It has been argued that "science has two fundamental attributes that underpin its value as a global public good: that knowledge claims and the evidence on which they are based are made openly available to scrutiny, and that the results of scientific research are communicated promptly and efficiently".[50] Metascientific research is exploring topics of science communication such as media coverage of science, science journalism and online communication of results by science educators and scientists.[51][52][53][54] A study found that the "main incentive academics are offered for using social media is amplification" and that it should be "moving towards an institutional culture that focuses more on how these [or such] platforms can facilitate real engagement with research".[55] Science communication may also involve the communication of societal needs, concerns and requests to scientists.

Alternative metrics tools can be used not only to help with assessment and findability, but also to aggregate many of the public discussions about a scientific paper – on social media such as Reddit, in citations on Wikipedia, and in news-media reports about the study – which can then in turn be analyzed in metascience or provided and used by related tools.[56]

Scientific data science

Scientific data science is the use of data science to analyse research papers. It encompasses both qualitative and quantitative methods. Research in scientific data science includes fraud detection[57] and citation network analysis.[58]
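
A minimal sketch of citation network analysis, in the spirit of the weighted-PageRank work cited above, using an invented toy citation graph and the networkx library:

```python
# Hedged sketch: ranking papers in a toy citation graph with PageRank.
import networkx as nx

g = nx.DiGraph()
# An edge A -> B means "paper A cites paper B".
g.add_edges_from([
    ("paper_A", "paper_C"), ("paper_B", "paper_C"),
    ("paper_C", "paper_D"), ("paper_E", "paper_C"),
])

# PageRank rewards papers cited by other well-cited papers,
# rather than counting raw citations alone.
for paper, score in sorted(nx.pagerank(g).items(), key=lambda kv: -kv[1]):
    print(f"{paper}: {score:.3f}")
```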

Evolution of sciences

Scientific practice

Metascience can investigate how scientific processes evolve over time. A study found that teams are growing in size, "increasing by an average of 17% per decade".[49]

ArXiv's yearly submission rate growth over 30 years.

It was found that prevalent forms of non-open-access publication, and the prices charged for many conventional journals – even for publicly funded papers – are unwarranted, unnecessary or suboptimal, and constitute detrimental barriers to scientific progress.[50][59][60][61]

Science overall and intrafield developments

A visualization of scientific outputs by field in OpenAlex.
A study can be part of multiple fields; a lower number of papers is not necessarily detrimental to a field.
Number of PubMed search results for "coronavirus" by year from 1949 to 2020.

Studies have various kinds of metadata which can be utilized, complemented and made accessible in useful ways. OpenAlex is a free online index of over 200 million scientific documents that integrates and provides metadata such as sources, citations, author information, scientific fields and research topics. Its API and open source website can be used for metascience, scientometrics and novel tools that query this semantic web of papers.[62][63][64] Another project under development, Scholia, uses metadata of scientific publications for various visualizations and aggregation features such as providing a simple user interface summarizing literature about a specific feature of the SARS-CoV-2 virus using Wikidata's "main subject" property.[65]
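
A minimal sketch of querying the OpenAlex API for such metadata; the endpoint and field names follow OpenAlex's public documentation at the time of writing and may change:

```python
# Hedged sketch: fetching works and citation counts from the OpenAlex API.
import requests

resp = requests.get(
    "https://api.openalex.org/works",
    params={"search": "metascience", "per-page": 5},
    timeout=30,
)
resp.raise_for_status()
for work in resp.json()["results"]:
    print(work["display_name"], "-", work["cited_by_count"], "citations")
```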

Challenges of interpretation of pooled results

Studies about a specific research question or research topic are often reviewed in the form of higher-level overviews in which results from various studies are integrated, compared, critically analyzed and interpreted. Examples of such works are scientific reviews and meta-analyses. These and related practices face various challenges and are a subject of metascience.

Knowledge integration and living documents

Various problems require swift integration of new and existing science-based knowledge. Settings where there are a large number of loosely related projects and initiatives especially benefit from a common ground or "commons".[65]

Evidence synthesis can be applied to important global challenges that are both relatively urgent and relatively certain: "climate change, energy transitions, biodiversity loss, antimicrobial resistance, poverty eradication and so on". It was suggested that a better system would keep summaries of research evidence up to date via living systematic reviews – e.g. as living documents. While the number of scientific papers and data (or information and online knowledge) has risen substantially,[additional citation(s) needed] the number of published academic systematic reviews has risen from "around 6,000 in 2011 to more than 45,000 in 2021".[66]

Meta-analyses

A meta-analysis of several small studies does not always predict the results of a single large study.[67] Some have argued that a weakness of the method is that sources of bias are not controlled by it: a good meta-analysis cannot correct for poor design or bias in the original studies.[68] This would mean that only methodologically sound studies should be included in a meta-analysis, a practice called 'best evidence synthesis'.[68] Other meta-analysts would include weaker studies and add a study-level predictor variable that reflects the methodological quality of the studies, to examine the effect of study quality on the effect size.[69] However, others have argued that a better approach is to preserve information about the variance in the study sample, casting as wide a net as possible, and that methodological selection criteria introduce unwanted subjectivity, defeating the purpose of the approach.[70] More recently, under the influence of the push for open practices in science, tools have been developed for "crowd-sourced" living meta-analyses that are updated by communities of scientists,[71][72] in hopes of making all the subjective choices more explicit.
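
For concreteness, the core computation shared by these approaches is an inverse-variance weighted pooled estimate; the sketch below (with invented effect sizes and standard errors) shows the fixed-effect version:

```python
# Hedged sketch: fixed-effect (inverse-variance weighted) meta-analysis.
import math

effects = [0.30, 0.10, 0.45, 0.20]   # per-study effect sizes (invented)
ses = [0.15, 0.08, 0.25, 0.10]       # per-study standard errors (invented)

weights = [1 / se**2 for se in ses]  # more precise studies weigh more
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

print(f"pooled effect: {pooled:.3f} +/- {1.96 * pooled_se:.3f} (95% CI half-width)")
```

A study-level quality predictor, as described above, would extend this into a meta-regression rather than a simple weighted average.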

Various issues with the included or available studies, such as heterogeneity of methods, may lead to faulty conclusions of a meta-analysis.[73]

Factors of success and progress

Two metascientists reported that "structures fostering disruptive scholarship and focusing attention on novel ideas" could be important, as in a growing scientific field citation flows disproportionately consolidate onto already well-cited papers, possibly slowing and inhibiting canonical progress.[74][75] A study found that, to enhance the impact of truly innovative and highly interdisciplinary novel ideas, they should be placed in the context of established knowledge.[49]

Other researchers reported that the most successful – in terms of "likelihood of prizewinning, National Academy of Science (NAS) induction, or superstardom" – protégés studied under mentors who published research for which they were conferred a prize after the protégés' mentorship. Studying original topics rather than these mentors' research topics was also positively associated with success.[76][77]

It has been hypothesized that a deeper understanding of factors behind successful science could "enhance prospects of science as a whole to more effectively address societal problems".[49]

Reforms

Meta-research identifying flaws in scientific practice has inspired reforms in science. These reforms seek to address and fix problems in scientific practice which lead to low-quality or inefficient research.

Pre-registration

The practice of registering a scientific study before it is conducted is called pre-registration. It arose as a means to address the replication crisis. Pre-registration requires the submission of a registered report, which is then accepted for publication or rejected by a journal based on theoretical justification, experimental design, and the proposed statistical analysis. Pre-registration of studies serves to prevent publication bias (e.g. not publishing negative results), reduce data dredging, and increase replicability.[78][79]

Reporting standards

Studies showing poor consistency and quality of reporting have demonstrated the need for reporting standards and guidelines in science, which has led to the rise of organisations that produce such standards, such as CONSORT (Consolidated Standards of Reporting Trials) and the EQUATOR Network.

The EQUATOR (Enhancing the QUAlity and Transparency Of health Research)[80] Network is an international initiative aimed at promoting transparent and accurate reporting of health research studies to enhance the value and reliability of medical research literature.[81] The EQUATOR Network was established with the goals of raising awareness of the importance of good reporting of research, assisting in the development, dissemination and implementation of reporting guidelines for different types of study designs, monitoring the status of the quality of reporting of research studies in the health sciences literature, and conducting research relating to issues that impact the quality of reporting of health research studies.[82] The Network acts as an "umbrella" organisation, bringing together developers of reporting guidelines, medical journal editors and peer reviewers, research funding bodies, and other key stakeholders with a mutual interest in improving the quality of research publications and research itself.

Applications

The areas of application of metascience include ICTs, medicine, psychology and physics.

ICTs

Metascience is used in the creation and improvement of technical systems (ICTs) and standards of science evaluation, incentivization, communication, commissioning, funding, regulation, production, management, use and publication. This has been called "applied metascience"[83][better source needed] and may seek to explore ways to increase the quantity, quality and positive impact of research. One example is the development of alternative metrics.[49] According to a study, "a simple way to check how often studies have been repeated, and whether or not the original findings are confirmed" is needed due to reproducibility issues in science.[84][85] The tool scite.ai aims to track and link citations of papers as 'Supporting', 'Mentioning' or 'Contrasting' the study.[86]

Medicine

Clinical research in medicine is often of low quality, and many studies cannot be replicated.[87][88] An estimated 85% of research funding is wasted.[89] Additionally, the presence of bias affects research quality.[90] The pharmaceutical industry exerts substantial influence on the design and execution of medical research. Conflicts of interest are common among authors of medical literature[91] and among editors of medical journals. While almost all medical journals require their authors to disclose conflicts of interest, editors are not required to do so.[92] Financial conflicts of interest have been linked to higher rates of positive study results. In antidepressant trials, pharmaceutical sponsorship is the best predictor of trial outcome.[93]

Blinding is another focus of meta-research, as error caused by poor blinding is a source of experimental bias. Blinding is not well reported in medical literature, and widespread misunderstanding of the subject has resulted in poor implementation of blinding in clinical trials.[94] Furthermore, failure of blinding is rarely measured or reported.[95] Research showing the failure of blinding in antidepressant trials has led some scientists to argue that antidepressants are no better than placebo.[96][97] In light of meta-research showing failures of blinding, CONSORT standards recommend that all clinical trials assess and report the quality of blinding.[98]

Studies have shown that systematic reviews of existing research evidence are sub-optimally used in planning new research or summarizing results.[99] Cumulative meta-analyses of studies evaluating the effectiveness of medical interventions have shown that many clinical trials could have been avoided if a systematic review of existing evidence had been done prior to conducting a new trial.[100][101][102] For example, Lau et al.[100] analyzed 33 clinical trials (involving 36,974 patients) evaluating the effectiveness of intravenous streptokinase for acute myocardial infarction. Their cumulative meta-analysis demonstrated that 25 of the 33 trials could have been avoided if a systematic review had been conducted prior to conducting a new trial. In other words, randomizing 34,542 patients was potentially unnecessary. One study[103] analyzed 1,523 clinical trials included in 227 meta-analyses and concluded that "less than one quarter of relevant prior studies" were cited. They also confirmed earlier findings that most clinical trial reports do not present a systematic review to justify the research or summarize the results.[103]
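
A hedged sketch of the cumulative meta-analysis idea: the pooled estimate is recomputed each time a new trial is added, revealing the point at which the accumulated evidence already excluded "no effect". The data below are invented, not from Lau et al.:

```python
# Hedged sketch: cumulative inverse-variance meta-analysis over invented trials.
import math

# (effect, standard_error) in order of publication
trials = [(0.50, 0.40), (0.45, 0.30), (0.40, 0.20), (0.42, 0.12), (0.41, 0.10)]

weight_sum = 0.0
weighted_effects = 0.0
for i, (effect, se) in enumerate(trials, start=1):
    w = 1 / se**2
    weight_sum += w
    weighted_effects += w * effect
    pooled = weighted_effects / weight_sum
    ci = 1.96 * math.sqrt(1 / weight_sum)
    conclusive = "yes" if pooled - ci > 0 else "no"
    print(f"after trial {i}: pooled={pooled:.2f} +/- {ci:.2f}, excludes zero: {conclusive}")
```

Once the interval excludes zero, further trials mainly confirm what the pooled evidence already showed, which is the argument for reviewing existing evidence before randomizing new patients.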

Many treatments used in modern medicine have been proven to be ineffective, or even harmful. A 2007 study by John Ioannidis found that it took an average of ten years for the medical community to stop referencing popular practices after their efficacy was unequivocally disproven.[104][105]

Psychology

Metascience has revealed significant problems in psychological research. The field suffers from high bias, low reproducibility, and widespread misuse of statistics.[106][107][108] The replication crisis affects psychology more strongly than any other field; as many as two-thirds of highly publicized findings may be impossible to replicate.[109] Meta-research finds that 80–95% of psychological studies support their initial hypotheses, which strongly implies the existence of publication bias.[110]

The replication crisis has led to renewed efforts to re-test important findings.[111][112] In response to concerns about publication bias and p-hacking, more than 140 psychology journals have adopted result-blind peer review, in which studies are pre-registered and published without regard for their outcome.[113] An analysis of these reforms estimated that 61 percent of result-blind studies produce null results, in contrast with 5 to 20 percent in earlier research. This analysis shows that result-blind peer review substantially reduces publication bias.[110]
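
The mechanism these reforms target can be illustrated with a short simulation (assumptions invented): if only statistically significant studies are published, the published literature systematically overstates a small true effect:

```python
# Illustrative simulation of publication bias via a "file drawer" filter.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
true_effect, n_per_group, n_studies = 0.2, 20, 2000

published = []
for _ in range(n_studies):
    control = rng.normal(0.0, 1.0, n_per_group)
    treatment = rng.normal(true_effect, 1.0, n_per_group)
    # Only "significant" results make it into the literature.
    if stats.ttest_ind(treatment, control).pvalue < 0.05:
        published.append(treatment.mean() - control.mean())

print(f"true effect: {true_effect}")
print(f"mean published effect: {np.mean(published):.2f} "
      f"({len(published)}/{n_studies} studies published)")
```

Result-blind review removes exactly this filter, which is why it yields far more null results than the conventional literature.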

Psychologists routinely confuse statistical significance with practical importance, enthusiastically reporting great certainty in unimportant facts.[114] Some psychologists have responded with an increased use of effect size statistics, rather than sole reliance on the p values.[citation needed]

Physics

Richard Feynman noted that estimates of physical constants were closer to published values than would be expected by chance. This was believed to be the result of confirmation bias: results that agreed with existing literature were more likely to be believed, and therefore published. Physicists now implement blinding to prevent this kind of bias.[115]
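
A hedged sketch of one such blinding technique, hiding the measured value behind a random offset until the analysis is frozen; the data and offset range are invented:

```python
# Illustrative offset blinding: analysts never see the true central value
# until all analysis choices are finalized.
import numpy as np

rng = np.random.default_rng(7)
measurements = rng.normal(loc=9.81, scale=0.05, size=100)  # invented data

blinding_offset = rng.uniform(-1.0, 1.0)   # kept secret from the analysts
blinded = measurements + blinding_offset

blinded_estimate = blinded.mean()          # all cuts and fits decided on this
unblinded_estimate = blinded_estimate - blinding_offset  # revealed at the end
print(f"blinded: {blinded_estimate:.3f}, unblinded: {unblinded_estimate:.3f}")
```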

Organizations and institutes

There are several organizations and universities across the globe which work on meta-research – these include the Meta-Research Innovation Center in Berlin,[116] the Meta-Research Innovation Center at Stanford,[117][118] the Meta-Research Center at Tilburg University, the Meta-research and Evidence Synthesis Unit at The George Institute for Global Health in India, and the Center for Open Science. Organizations that develop tools for metascience include Our Research, the Center for Scientific Integrity, and altmetrics companies. There is an annual Metascience Conference.[119]

References

  1. ^ Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2015-10-02). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1545-7885. PMC 4592065. PMID 26431313.
  2. ^ Bach, Becky (8 December 2015). "On communicating science and uncertainty: A podcast with John Ioannidis". Scope. Retrieved 20 May 2019.
  3. ^ Pashler, Harold; Harris, Christine R. (2012). "Is the Replicability Crisis Overblown? Three Arguments Examined". Perspectives on Psychological Science. 7 (6): 531–536. doi:10.1177/1745691612463401. ISSN 1745-6916.
  4. ^ Nishikawa-Pacher, Andreas; Heck, Tamara; Schoch, Kerstin (4 October 2022). "Open Editors: A dataset of scholarly journals' editorial board positions". Research Evaluation. doi:10.1093/reseval/rvac037. eISSN 1471-5449. ISSN 0958-2029.
  5. ^ a b Ioannidis, JP (August 2005). "Why most published research findings are false". PLOS Medicine. 2 (8): e124. doi:10.1371/journal.pmed.0020124. PMC 1182327. PMID 16060722.
  6. ^ Schor, Stanley (1966). "Statistical Evaluation of Medical Journal Manuscripts". JAMA: The Journal of the American Medical Association. 195 (13): 1123–1128. doi:10.1001/jama.1966.03100130097026. ISSN 0098-7484. PMID 5952081.
  7. ^ "Highly Cited Researchers". Retrieved September 17, 2015.
  8. ^ Medicine - Stanford Prevention Research Center. John P.A. Ioannidis
  9. ^ Robert Lee Hotz (September 14, 2007). "Most Science Studies Appear to Be Tainted By Sloppy Analysis". Wall Street Journal. Dow Jones & Company. Retrieved 2016-12-05.
  10. ^ Howick J, Koletsi D, Pandis N, Fleming PS, Loef M, Walach H, Schmidt S, Ioannidis JA (2020). "The quality of evidence for medical interventions does not improve or worsen: a metaepidemiological study of Cochrane reviews". Journal of Clinical Epidemiology. 126: 154–159.
  11. ^ "Researching the researchers". Nature Genetics. 46 (5): 417. 2014. doi:10.1038/ng.2972. ISSN 1061-4036. PMID 24769715.
  12. ^ Enserink, Martin (2018). "Research on research". Science. 361 (6408): 1178–1179. Bibcode:2018Sci...361.1178E. doi:10.1126/science.361.6408.1178. ISSN 0036-8075. PMID 30237336. S2CID 206626417.
  13. ^ Rennie, Drummond (1990). "Editorial Peer Review in Biomedical Publication". JAMA. 263 (10): 1317–1441. doi:10.1001/jama.1990.03440100011001. ISSN 0098-7484. PMID 2304208.
  14. ^ Harriman, Stephanie L.; Kowalczuk, Maria K.; Simera, Iveta; Wager, Elizabeth (2016). "A new forum for research on research integrity and peer review". Research Integrity and Peer Review. 1 (1): 5. doi:10.1186/s41073-016-0010-y. ISSN 2058-8615. PMC 5794038. PMID 29451544.
  15. ^ a b c d e f Ioannidis, John P. A.; Fanelli, Daniele; Dunne, Debbie Drake; Goodman, Steven N. (2 October 2015). "Meta-research: Evaluation and Improvement of Research Methods and Practices". PLOS Biology. 13 (10): e1002264. doi:10.1371/journal.pbio.1002264. ISSN 1544-9173. PMC 4592065. PMID 26431313.
  16. ^ Fanelli, Daniele; Costas, Rodrigo; Ioannidis, John P. A. (2017). "Meta-assessment of bias in science". Proceedings of the National Academy of Sciences of the United States of America. 114 (14): 3714–3719. doi:10.1073/pnas.1618569114. ISSN 1091-6490. PMC 5389310. PMID 28320937.
  17. ^ Check Hayden, Erika (2013). "Weak statistical standards implicated in scientific irreproducibility". Nature. doi:10.1038/nature.2013.14131. S2CID 211729036. Retrieved 9 May 2019.
  18. ^ Galipeau, James; Moher, David; Campbell, Craig; Hendry, Paul; Cameron, D. William; Palepu, Anita; Hébert, Paul C. (March 2015). "A systematic review highlights a knowledge gap regarding the effectiveness of health-related training programs in journalology". Journal of Clinical Epidemiology. 68 (3): 257–265. doi:10.1016/j.jclinepi.2014.09.024. PMID 25510373.
  19. ^ Wilson, Mitch; Moher, David (March 2019). "The Changing Landscape of Journalology in Medicine". Seminars in Nuclear Medicine. 49 (2): 105–114. doi:10.1053/j.semnuclmed.2018.11.009. hdl:10393/38493. PMID 30819390. S2CID 73471103.
  20. ^ a b c Couzin-Frankel, Jennifer (18 September 2018). "'Journalologists' use scientific methods to study academic publishing. Is their work improving science?". Science. doi:10.1126/science.aav4758. S2CID 115360831.
  21. ^ Schooler, J. W. (2014). "Metascience could rescue the 'replication crisis'". Nature. 515 (7525): 9. Bibcode:2014Natur.515....9S. doi:10.1038/515009a. PMID 25373639.
  22. ^ Smith, Noah (2 November 2017). "Why 'Statistical Significance' Is Often Insignificant". Bloomberg.com. Retrieved 7 November 2017.
  23. ^ Pashler, Harold; Wagenmakers, Eric Jan (2012). "Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence?". Perspectives on Psychological Science. 7 (6): 528–530. doi:10.1177/1745691612465253. PMID 26168108. S2CID 26361121.
  24. ^ Gary Marcus (May 1, 2013). "The Crisis in Social Psychology That Isn't". The New Yorker.
  25. ^ Jonah Lehrer (December 13, 2010). "The Truth Wears Off". The New Yorker.
  26. ^ "Dozens of major cancer studies can't be replicated". Science News. 7 December 2021. Retrieved 19 January 2022.
  27. ^ "Reproducibility Project: Cancer Biology". www.cos.io. Center for Open Science. Retrieved 19 January 2022.
  28. ^ Staddon, John (2017) Scientific Method: How science works, fails to work or pretends to work. Taylor and Francis.
  29. ^ Yeung, Andy W. K. (2017). "Do Neuroscience Journals Accept Replications? A Survey of Literature". Frontiers in Human Neuroscience. 11: 468. doi:10.3389/fnhum.2017.00468. ISSN 1662-5161. PMC 5611708. PMID 28979201.
  30. ^ Martin, G. N.; Clarke, Richard M. (2017). "Are Psychology Journals Anti-replication? A Snapshot of Editorial Practices". Frontiers in Psychology. 8: 523. doi:10.3389/fpsyg.2017.00523. ISSN 1664-1078. PMC 5387793. PMID 28443044.
  31. ^ Binswanger, Mathias (2015). "How Nonsense Became Excellence: Forcing Professors to Publish". In Welpe, Isabell M.; Wollersheim, Jutta; Ringelhan, Stefanie; Osterloh, Margit (eds.). Incentives and Performance. Springer International Publishing. pp. 19–32. doi:10.1007/978-3-319-09785-5_2. ISBN 978-3319097855. S2CID 110698382.
  32. ^ Edwards, Marc A.; Roy, Siddhartha (2016-09-22). "Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition". Environmental Engineering Science. 34 (1): 51–61. doi:10.1089/ees.2016.0223. PMC 5206685. PMID 28115824.
  33. ^ Brookshire, Bethany (21 October 2016). "Blame bad incentives for bad science". Science News. Retrieved 11 July 2019.
  34. ^ Smaldino, Paul E.; McElreath, Richard (2016). "The natural selection of bad science". Royal Society Open Science. 3 (9): 160384. arXiv:1605.09511. Bibcode:2016RSOS....360384S. doi:10.1098/rsos.160384. PMC 5043322. PMID 27703703.
  35. ^ Holcombe, Alex O. (September 2019). "Contributorship, Not Authorship: Use CRediT to Indicate Who Did What". Publications. 7 (3): 48. doi:10.3390/publications7030048.
  36. ^ McNutt, Marcia K.; Bradford, Monica; Drazen, Jeffrey M.; Hanson, Brooks; Howard, Bob; Jamieson, Kathleen Hall; Kiermer, Véronique; Marcus, Emilie; Pope, Barbara Kline; Schekman, Randy; Swaminathan, Sowmya; Stang, Peter J.; Verma, Inder M. (13 March 2018). "Transparency in authors' contributions and responsibilities to promote integrity in scientific publication". Proceedings of the National Academy of Sciences. 115 (11): 2557–2560. doi:10.1073/pnas.1715374115. ISSN 0027-8424. PMC 5856527. PMID 29487213.
  37. ^ Brand, Amy; Allen, Liz; Altman, Micah; Hlava, Marjorie; Scott, Jo (1 April 2015). "Beyond authorship: attribution, contribution, collaboration, and credit". Learned Publishing. 28 (2): 151–155. doi:10.1087/20150211. S2CID 45167271.
  38. ^ a b c Fire, Michael; Guestrin, Carlos (1 June 2019). "Over-optimization of academic publishing metrics: observing Goodhart's Law in action". GigaScience. 8 (6): giz053. doi:10.1093/gigascience/giz053. PMC 6541803. PMID 31144712.
  39. ^ a b Elson, Malte; Huff, Markus; Utz, Sonja (1 March 2020). "Metascience on Peer Review: Testing the Effects of a Study's Originality and Statistical Significance in a Field Experiment". Advances in Methods and Practices in Psychological Science. 3 (1): 53–65. doi:10.1177/2515245919895419. ISSN 2515-2459. S2CID 212778011.
  40. ^ McLean, Robert K D; Sen, Kunal (1 April 2019). "Making a difference in the real world? A meta-analysis of the quality of use-oriented research using the Research Quality Plus approach". Research Evaluation. 28 (2): 123–135. doi:10.1093/reseval/rvy026.
  41. ^ "Bringing Rigor to Relevant Questions: How Social Science Research Can Improve Youth Outcomes in the Real World" (PDF). Retrieved 22 November 2021.
  42. ^ Leydesdorff, L. and Milojevic, S., "Scientometrics" arXiv:1208.4566 (2013), forthcoming in: Lynch, M. (editor), International Encyclopedia of Social and Behavioral Sciences subsection 85030. (2015)
  43. ^ a b Nielsen, Kristian H. (1 March 2021). "Science and public policy". Metascience. 30 (1): 79–81. doi:10.1007/s11016-020-00581-5. ISSN 1467-9981. PMC 7605730. S2CID 226237994.
  44. ^ Bostrom, Nick (2014). Superintelligence: Paths, Dangers, Strategies. Oxford: Oxford University Press. pp. 229–237. ISBN 978-0199678112.
  45. ^ Ord, Toby (2020). The Precipice: Existential Risk and the Future of Humanity. United Kingdom: Bloomsbury Publishing. p. 200. ISBN 978-1526600219.
  46. ^ "Technology is changing faster than regulators can keep up - here's how to close the gap". World Economic Forum. Retrieved 27 January 2022.
  47. ^ "Nobel prize-winning work is concentrated in minority of scientific fields". phys.org. Retrieved 17 August 2020.
  48. ^ Ioannidis, John P. A.; Cristea, Ioana-Alina; Boyack, Kevin W. (29 July 2020). "Work honored by Nobel prizes clusters heavily in a few scientific fields". PLOS ONE. 15 (7): e0234612. Bibcode:2020PLoSO..1534612I. doi:10.1371/journal.pone.0234612. ISSN 1932-6203. PMC 7390258. PMID 32726312.
  49. ^ a b c d e Fortunato, Santo; Bergstrom, Carl T.; Börner, Katy; Evans, James A.; Helbing, Dirk; Milojević, Staša; Petersen, Alexander M.; Radicchi, Filippo; Sinatra, Roberta; Uzzi, Brian; Vespignani, Alessandro; Waltman, Ludo; Wang, Dashun; Barabási, Albert-László (2 March 2018). "Science of science". Science. 359 (6379): eaao0185. doi:10.1126/science.aao0185. PMC 5949209. PMID 29496846. Retrieved 22 November 2021.
  50. ^ a b "Science as a Global Public Good". International Science Council. 8 October 2021. Retrieved 22 November 2021.
  51. ^ Jamieson, Kathleen Hall; Kahan, Dan; Scheufele, Dietram A. (17 May 2017). The Oxford Handbook of the Science of Science Communication. Oxford University Press. ISBN 978-0190497637.
  52. ^ Grochala, Rafał (16 December 2019). "Science communication in online media: influence of press releases on coverage of genetics and CRISPR". doi:10.1101/2019.12.13.875278. S2CID 213125031.
  53. ^ "FRAMING ANALYSIS OF NEWS COVERAGE ON RENEWABLE ENERGYIN THE STAR ONLINE NEWS PORTAL" (PDF). Retrieved 22 November 2021. {{cite journal}}: Cite journal requires |journal= (help)
  54. ^ MacLaughlin, Ansel; Wihbey, John; Smith, David (15 June 2018). "Predicting News Coverage of Scientific Articles". Proceedings of the International AAAI Conference on Web and Social Media. 12 (1). ISSN 2334-0770.
  55. ^ Carrigan, Mark; Jordan, Katy (4 November 2021). "Platforms and Institutions in the Post-Pandemic University: a Case Study of Social Media and the Impact Agenda". Postdigital Science and Education. 4 (2): 354–372. doi:10.1007/s42438-021-00269-x. ISSN 2524-4868. S2CID 243760357.
  56. ^ Baykoucheva, Svetla (2015). "Measuring attention". Managing Scientific Information and Research Data: 127–136. doi:10.1016/B978-0-08-100195-0.00014-7. ISBN 978-0081001950.
  57. ^ Markowitz, David M.; Hancock, Jeffrey T. (2016). "Linguistic obfuscation in fraudulent science". Journal of Language and Social Psychology. 35 (4): 435–445. doi:10.1177/0261927X15614605. S2CID 146174471.
  58. ^ Ding, Y. (2010). "Applying weighted PageRank to author citation networks". Journal of the American Society for Information Science and Technology. 62 (2): 236–245. arXiv:1102.1760. doi:10.1002/asi.21452. S2CID 3752804.
  59. ^ "Nature Journals To Charge Authors Hefty Fee To Make Scientific Papers Open Access". IFLScience. Retrieved 22 November 2021.
  60. ^ "Harvard University says it can't afford journal publishers' prices". The Guardian. 24 April 2012. Retrieved 22 November 2021.
  61. ^ Van Noorden, Richard (1 March 2013). "Open access: The true cost of science publishing". Nature. 495 (7442): 426–429. Bibcode:2013Natur.495..426V. doi:10.1038/495426a. ISSN 1476-4687. PMID 23538808. S2CID 27021567.
  62. ^ Singh Chawla, Dalmeet (24 January 2022). "Massive open index of scholarly papers launches". Nature. doi:10.1038/d41586-022-00138-y. Retrieved 14 February 2022.
  63. ^ "OpenAlex: The Promising Alternative to Microsoft Academic Graph". Singapore Management University (SMU). Retrieved 14 February 2022.
  64. ^ "OpenAlex Documentation". Retrieved 18 February 2022.
  65. ^ a b Waagmeester, Andra; Willighagen, Egon L.; Su, Andrew I.; Kutmon, Martina; Gayo, Jose Emilio Labra; Fernández-Álvarez, Daniel; Groom, Quentin; Schaap, Peter J.; Verhagen, Lisa M.; Koehorst, Jasper J. (22 January 2021). "A protocol for adding knowledge to Wikidata: aligning resources on human coronaviruses". BMC Biology. 19 (1): 12. doi:10.1186/s12915-020-00940-y. ISSN 1741-7007. PMC 7820539. PMID 33482803.
  66. ^ Elliott, Julian; Lawrence, Rebecca; Minx, Jan C.; Oladapo, Olufemi T.; Ravaud, Philippe; Tendal Jeppesen, Britta; Thomas, James; Turner, Tari; Vandvik, Per Olav; Grimshaw, Jeremy M. (December 2021). "Decision makers need constantly updated evidence synthesis". Nature. 600 (7889): 383–385. Bibcode:2021Natur.600..383E. doi:10.1038/d41586-021-03690-1. PMID 34912079. S2CID 245220047.
  67. ^ LeLorier J, Grégoire G, Benhaddad A, Lapierre J, Derderian F (August 1997). "Discrepancies between meta-analyses and subsequent large randomized, controlled trials". The New England Journal of Medicine. 337 (8): 536–542. doi:10.1056/NEJM199708213370806. PMID 9262498.
  68. ^ a b Slavin RE (1986). "Best-Evidence Synthesis: An Alternative to Meta-Analytic and Traditional Reviews". Educational Researcher. 15 (9): 5–9. doi:10.3102/0013189X015009005. S2CID 146457142.
  69. ^ Hunter JE, Schmidt FL, Jackson GB, et al. (American Psychological Association. Division of Industrial-Organizational Psychology) (1982). Meta-analysis: cumulating research findings across studies. Beverly Hills, California: Sage. ISBN 978-0-8039-1864-1.
  70. ^ Glass GV, McGaw B, Smith ML (1981). Meta-analysis in social research. Beverly Hills, California: Sage Publications. ISBN 978-0-8039-1633-3.
  71. ^ Wolf, Vinzent; Kühnel, Anne; Teckentrup, Vanessa; Koenig, Julian; Kroemer, Nils B. (2021). "Does transcutaneous auricular vagus nerve stimulation affect vagally mediated heart rate variability? A living and interactive Bayesian meta-analysis". Psychophysiology. 58 (11): e13933. doi:10.1111/psyp.13933. ISSN 0048-5772. PMID 34473846.
  72. ^ Allbritton, David; Gómez, Pablo; Angele, Bernhard; Vasilev, Martin; Perea, Manuel (2024-07-22). "Breathing Life Into Meta-Analytic Methods". Journal of Cognition. 7 (1): 61. doi:10.5334/joc.389. ISSN 2514-4820. PMC 11276543. PMID 39072210.
  73. ^ Stone, Dianna L.; Rosopa, Patrick J. (1 March 2017). "The Advantages and Limitations of Using Meta-analysis in Human Resource Management Research". Human Resource Management Review. 27 (1): 1–7. doi:10.1016/j.hrmr.2016.09.001. ISSN 1053-4822.
  74. ^ Snyder, Alison (14 October 2021). "New ideas are struggling to emerge from the sea of science". Axios. Retrieved 15 November 2021.
  75. ^ Chu, Johan S. G.; Evans, James A. (12 October 2021). "Slowed canonical progress in large fields of science". Proceedings of the National Academy of Sciences. 118 (41): e2021636118. doi:10.1073/pnas.2021636118. ISSN 0027-8424. PMC 8522281. PMID 34607941.
  76. ^ "Sharing of tacit knowledge is most important aspect of mentorship, study finds". phys.org. Retrieved 4 July 2020.
  77. ^ Ma, Yifang; Mukherjee, Satyam; Uzzi, Brian (23 June 2020). "Mentorship and protégé success in STEM fields". Proceedings of the National Academy of Sciences. 117 (25): 14077–14083. doi:10.1073/pnas.1915516117. ISSN 0027-8424. PMC 7322065. PMID 32522881.
  78. ^ "Registered Replication Reports". Association for Psychological Science. Retrieved 2015-11-13.
  79. ^ Chambers, Chris (2014-05-20). "Psychology's 'registration revolution'". the Guardian. Retrieved 2015-11-13.
  80. ^ Simera, I; Moher, D; Hirst, A; Hoey, J; Schulz, KF; Altman, DG (2010). "Transparent and accurate reporting increases reliability, utility, and impact of your research: reporting guidelines and the EQUATOR Network". BMC Medicine. 8: 24. doi:10.1186/1741-7015-8-24. PMC 2874506. PMID 20420659.
  81. ^ Simera, I.; Moher, D.; Hoey, J.; Schulz, K. F.; Altman, D. G. (2010). "A catalogue of reporting guidelines for health research". European Journal of Clinical Investigation. 40 (1): 35–53. doi:10.1111/j.1365-2362.2009.02234.x. PMID 20055895.
  82. ^ Simera, I; Altman, DG (October 2009). "Writing a research article that is "fit for purpose": EQUATOR Network and reporting guidelines". Evidence-Based Medicine. 14 (5): 132–134. doi:10.1136/ebm.14.5.132. PMID 19794009. S2CID 36739841.
  83. ^ Ep. 49: Joel Chan on metascience, creativity, and tools for thought.
  84. ^ "A new replication crisis: Research that is less likely to be true is cited more". phys.org. Retrieved 14 June 2021.
  85. ^ Serra-Garcia, Marta; Gneezy, Uri (2021-05-01). "Nonreplicable publications are cited more than replicable ones". Science Advances. 7 (21): eabd1705. Bibcode:2021SciA....7D1705S. doi:10.1126/sciadv.abd1705. ISSN 2375-2548. PMC 8139580. PMID 34020944.
  86. ^ Khamsi, Roxanne (1 May 2020). "Coronavirus in context: Scite.ai tracks positive and negative citations for COVID-19 literature". Nature. doi:10.1038/d41586-020-01324-6. Retrieved 19 February 2022.
  87. ^ Ioannidis, JPA (2016). "Why Most Clinical Research Is Not Useful". PLOS Medicine. 13 (6): e1002049. doi:10.1371/journal.pmed.1002049. PMC 4915619. PMID 27328301.
  88. ^ Ioannidis JA (13 July 2005). "Contradicted and initially stronger effects in highly cited clinical research". JAMA. 294 (2): 218–228. doi:10.1001/jama.294.2.218. PMID 16014596.
  89. ^ Chalmers, Iain; Glasziou, Paul (2009). "Avoidable waste in the production and reporting of research evidence". The Lancet. 374 (9683): 86–89. doi:10.1016/S0140-6736(09)60329-9. ISSN 0140-6736. PMID 19525005. S2CID 11797088.
  90. ^ Hsu, Jeremy (24 June 2010). "Dark Side of Medical Research: Widespread Bias and Omissions". Live Science. Retrieved 24 May 2019.
  91. ^ "Confronting conflict of interest". Nature Medicine. 24 (11): 1629. November 2018. doi:10.1038/s41591-018-0256-7. ISSN 1546-170X. PMID 30401866.
  92. ^ Haque, Waqas; Minhajuddin, Abu; Gupta, Arjun; Agrawal, Deepak (2018). "Conflicts of interest of editors of medical journals". PLOS ONE. 13 (5): e0197141. Bibcode:2018PLoSO..1397141H. doi:10.1371/journal.pone.0197141. ISSN 1932-6203. PMC 5959187. PMID 29775468.
  93. ^ Moncrieff, J (March 2002). "The antidepressant debate". The British Journal of Psychiatry. 180 (3): 193–194. doi:10.1192/bjp.180.3.193. ISSN 0007-1250. PMID 11872507.
  94. ^ Bello, S; Moustgaard, H; Hróbjartsson, A (October 2014). "The risk of unblinding was infrequently and incompletely reported in 300 randomized clinical trial publications". Journal of Clinical Epidemiology. 67 (10): 1059–1069. doi:10.1016/j.jclinepi.2014.05.007. ISSN 1878-5921. PMID 24973822.
  95. ^ Tuleu, Catherine; Legay, Helene; Orlu-Gul, Mine; Wan, Mandy (1 September 2013). "Blinding in pharmacological trials: the devil is in the details". Archives of Disease in Childhood. 98 (9): 656–659. doi:10.1136/archdischild-2013-304037. ISSN 0003-9888. PMC 3833301. PMID 23898156.
  96. ^ Kirsch, I (2014). "Antidepressants and the Placebo Effect". Zeitschrift für Psychologie. 222 (3): 128–134. doi:10.1027/2151-2604/a000176. ISSN 2190-8370. PMC 4172306. PMID 25279271.
  97. ^ Ioannidis, John PA (27 May 2008). "Effectiveness of antidepressants: an evidence myth constructed from a thousand randomized trials?". Philosophy, Ethics, and Humanities in Medicine. 3: 14. doi:10.1186/1747-5341-3-14. ISSN 1747-5341. PMC 2412901. PMID 18505564.
  98. ^ Moher, David; Altman, Douglas G.; Schulz, Kenneth F. (24 March 2010). "CONSORT 2010 Statement: updated guidelines for reporting parallel group randomised trials". BMJ. 340: c332. doi:10.1136/bmj.c332. ISSN 0959-8138. PMC 2844940. PMID 20332509.
  99. ^ Clarke, Michael; Chalmers, Iain (1998). "Discussion Sections in Reports of Controlled Trials Published in General Medical Journals". JAMA. 280 (3): 280–282. doi:10.1001/jama.280.3.280. PMID 9676682.
  100. ^ a b Lau, Joseph; Antman, Elliott M; Jimenez-Silva, Jeanette; Kupelnick, Bruce; Mosteller, Frederick; Chalmers, Thomas C (1992). "Cumulative Meta-Analysis of Therapeutic Trials for Myocardial Infarction". New England Journal of Medicine. 327 (4): 248–254. doi:10.1056/NEJM199207233270406. PMID 1614465.
  101. ^ Fergusson, Dean; Glass, Kathleen Cranley; Hutton, Brian; Shapiro, Stan (2016). "Randomized controlled trials of aprotinin in cardiac surgery: Could clinical equipoise have stopped the bleeding?". Clinical Trials. 2 (3): 218–229, discussion 229–232. doi:10.1191/1740774505cn085oa. PMID 16279145. S2CID 31375469.
  102. ^ Clarke, Mike; Brice, Anne; Chalmers, Iain (2014). "Accumulating Research: A Systematic Account of How Cumulative Meta-Analyses Would Have Provided Knowledge, Improved Health, Reduced Harm and Saved Resources". PLOS ONE. 9 (7): e102670. Bibcode:2014PLoSO...9j2670C. doi:10.1371/journal.pone.0102670. PMC 4113310. PMID 25068257.
  103. ^ a b Robinson, Karen A; Goodman, Steven N (2011). "A Systematic Examination of the Citation of Prior Research in Reports of Randomized, Controlled Trials". Annals of Internal Medicine. 154 (1): 50–55. doi:10.7326/0003-4819-154-1-201101040-00007. PMID 21200038. S2CID 207536137.
  104. ^ Epstein, David. "When Evidence Says No, but Doctors Say Yes - The Atlantic". Pocket. Retrieved 10 April 2020.
  105. ^ Tatsioni, A; Bonitsis, NG; Ioannidis, JP (5 December 2007). "Persistence of contradicted claims in the literature". JAMA. 298 (21): 2517–2526. doi:10.1001/jama.298.21.2517. ISSN 1538-3598. PMID 18056905.
  106. ^ Franco, Annie; Malhotra, Neil; Simonovits, Gabor (1 January 2016). "Underreporting in Psychology Experiments: Evidence From a Study Registry". Social Psychological and Personality Science. 7 (1): 8–12. doi:10.1177/1948550615598377. ISSN 1948-5506. S2CID 143182733.
  107. ^ Munafò, Marcus (29 March 2017). "Metascience: Reproducibility blues". Nature. 543 (7647): 619–620. Bibcode:2017Natur.543..619M. doi:10.1038/543619a. ISSN 1476-4687.
  108. ^ Stokstad, Erik (20 September 2018). "This research group seeks to expose weaknesses in science – and they'll step on some toes if they have to". Science. doi:10.1126/science.aav4784. S2CID 158525979.
  109. ^ Open Science Collaboration (2015). "Estimating the reproducibility of psychological science" (PDF). Science. 349 (6251): aac4716. doi:10.1126/science.aac4716. hdl:10722/230596. PMID 26315443. S2CID 218065162.
  110. ^ a b Allen, Christopher P G.; Mehler, David Marc Anton. "Open Science challenges, benefits and tips in early career and beyond". doi:10.31234/osf.io/3czyt. S2CID 240061030.
  111. ^ Simmons, Joseph P.; Nelson, Leif D.; Simonsohn, Uri (2011). "False-Positive Psychology". Psychological Science. 22 (11): 1359–1366. doi:10.1177/0956797611417632. PMID 22006061.
  112. ^ Stroebe, Wolfgang; Strack, Fritz (2014). "The Alleged Crisis and the Illusion of Exact Replication" (PDF). Perspectives on Psychological Science. 9 (1): 59–71. doi:10.1177/1745691613514450. PMID 26173241. S2CID 31938129.
  113. ^ Aschwanden, Christie (6 December 2018). "Psychology's Replication Crisis Has Made The Field Better". FiveThirtyEight. Retrieved 19 December 2018.
  114. ^ Cohen, Jacob (1994). "The earth is round (p < .05)". American Psychologist. 49 (12): 997–1003. doi:10.1037/0003-066X.49.12.997. S2CID 380942.
  115. ^ MacCoun, Robert; Perlmutter, Saul (8 October 2015). "Blind analysis: Hide results to seek the truth". Nature. 526 (7572): 187–189. Bibcode:2015Natur.526..187M. doi:10.1038/526187a. PMID 26450040.
  116. ^ Berlin, Meta-Research Innovation Center. "Meta-Research Innovation Center Berlin". Meta-Research Innovation Center Berlin. Retrieved 2021-12-06.
  117. ^ "Home | Meta-research Innovation Center at Stanford". metrics.stanford.edu. Retrieved 2021-12-06.
  118. ^ "Meta-research and Evidence Synthesis Unit". The George Institute for Global Health. Retrieved 2021-12-19.
  119. ^ "Metascience 2021". Metascience 2021. Retrieved 20 February 2022.
