By Eugenio Petrovich
In recent decades, the number of retractions of scientific articles has grown significantly in all disciplines (Steen et al., 2013). Even prestigious journals such as Science are not immune to this growth (Wray & Andersen, 2018). The spread of the phenomenon, as well as its accelerating pace, gives rise to concern in the scientific community, as a rising proportion of retractions is due to the manipulation of data, the use of fabricated or fraudulent data, plagiarism, and other types of research misconduct (Fang et al., 2012). Some striking cases have even reached the general public, such as the infamous article by Andrew Wakefield claiming a connection between vaccines and autism, which was published in The Lancet in 1998 and retracted only twelve years later. Such cases are particularly troublesome because they risk seriously undermining society's trust in science.
According to some authors, growing retraction rates might be an alarming signal of declining research integrity, a decline that could be caused, or at least exacerbated, by the hyper-competitive environment of research, dominated by the publish-or-perish imperative, and by the increasing importance of quantitative metrics in the assessment of researchers (Biagioli et al., 2019; Edwards & Roy, 2017). Others, however, have argued that the increasing number of retractions may be a sign of the good health of the research system, as it could indicate a growing capacity to detect flawed and fraudulent papers (Fanelli, 2013).
Retractions and their motivations, however, are not only interesting as indicators of high or low research ethics. They are also highly informative about the research practices of different scientific areas. Malpractices such as the fabrication of data or the hacking of statistical tests can occur only in disciplines based on experiments. The manipulation of images is a typical fraudulent practice in the biological sciences, where images resulting from analyses play a pivotal role. Undisclosed conflicts of interest are more relevant in pharmaceutics and the applied sciences than, say, in theoretical physics.
If the kind of research malpractice reflects the dominant research practice of a field, it also seems reasonable to assume that fraudulent research behavior is easier to spot and characterize in fields with solid, shared methodological and epistemological underpinnings.
But what happens in epistemologically more fluid areas, where methodological consensus is low and incommensurable paradigms coexist? The analysis of retractions in these areas could be particularly instructive, since it could reveal whether, in spite of their epistemological diversity, they share a common core of research ethos that allows unacceptable research practices to be stigmatized. Moreover, it could highlight the main norms and values belonging to such a core.
From this point of view, the humanities seem a perfectly tailored case study, as they are traditionally characterized by a high level of theoretical pluralism and a non-cumulative process of knowledge creation. This post, then, offers a preliminary analysis of retractions in the humanities. To the best of our knowledge, only one other study has addressed this topic so far (Halevi, 2020).
Methods and Data
I will use the database created and maintained by Retraction Watch, a public blog founded in 2010 by two science journalists, Ivan Oransky and Adam Marcus, to monitor the phenomenon of retractions.
In March 2020, the Retraction Watch database included 21,381 retracted documents, mainly concentrated in the category Basic Life Sciences, which alone accounts for 70.3% of the documents (note that the same document can be classified in multiple disciplinary categories). The category Humanities (HUM) includes the lowest number of documents: 412 (1.9% of the total). Of these 412 documents, 142 are classified exclusively as Humanities, the rest being co-classified in other areas. The following analyses focus only on these 142 purely humanistic records, whose publication years span from 2006 to 2020.
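For transparency, the selection step can be sketched in a few lines of Python. This is only a minimal sketch: it assumes a CSV export of the Retraction Watch database with a semicolon-separated, multi-valued Subject column whose humanities labels carry a "(HUM)" prefix; the file name and column names are assumptions, not the actual export schema.

```python
# Minimal sketch (not the actual extraction script) of selecting the records
# classified exclusively as Humanities. File and column names are assumed.
import pandas as pd

df = pd.read_csv("retraction_watch_export.csv")

# Split the multi-valued subject field into a clean list of labels per record.
df["subjects"] = df["Subject"].fillna("").str.split(";").apply(
    lambda labels: [s.strip() for s in labels if s.strip()]
)

# Keep only records whose every subject label belongs to the Humanities (HUM).
hum_only = df[df["subjects"].apply(
    lambda labels: len(labels) > 0 and all(s.startswith("(HUM)") for s in labels)
)]

print(len(hum_only))  # 142 records in the March 2020 snapshot analyzed here
```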
Results
As for the sub-disciplinary composition, 111 documents are classified in only one humanistic sub-area, whereas 31 have multiple classifications (the maximum number of sub-areas for the same article is 4).
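Under the same assumptions as the sketch above (a hypothetical hum_only data frame with a list-valued subjects column), the split between single and multiple classifications is a simple tally:

```python
# Continues the hypothetical "hum_only" data frame from the previous sketch.
n_subareas = hum_only["subjects"].apply(len)

print((n_subareas == 1).sum())  # records with a single sub-area (111 in our corpus)
print((n_subareas > 1).sum())   # records with multiple sub-areas (31)
print(n_subareas.max())         # maximum number of sub-areas per record (4)
```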
Literature, Religion, Philosophy, and Arts are the most represented sub-areas.
The 142 retracted articles were published in 77 journals or conference proceedings. The venues with at least two articles are reported in the following table. A sizeable number of records are concentrated in the proceedings of a 2009 IEEE conference and in Neohelicon, a Springer journal specializing in comparative literature, critical theory, and practice.
| Venue | Records |
|---|---|
| 2009 IEEE 10th International Conference on Computer-Aided Industrial Design & Conceptual Design | 20 |
| Neohelicon | 14 |
| Revue Romane | 7 |
| The Catholic Register | 7 |
| The Cambridge Quarterly | 4 |
| Procedia – Social and Behavioral Sciences | 3 |
| College Literature | 3 |
| Journal of Dramatic Theory and Criticism | 3 |
| Literature Film Quarterly | 3 |
| Les Lettres Romanes | 3 |
| Anuario de Estudios Americanos | 2 |
| Symposium: A Quarterly Journal in Modern Literatures | 2 |
| Studies in Communication Sciences | 2 |
| Humanitas | 2 |
| The Bible Today | 2 |
| Journal of Fundamental and Applied Sciences | 2 |
| America: The Jesuit Review | 2 |
| The Priest | 2 |
The publishers with more than three records are presented in the following table. Apart from IEEE, a publisher with a substantial number of retractions in the database as a whole, the list includes high-profile publishers such as Springer, as well as publishers specializing in religious studies.
| Publisher | Records |
|---|---|
| Institute of Electrical and Electronics Engineers (IEEE) | 24 |
| Springer | 18 |
| Taylor and Francis | 14 |
| Archdiocese of Toronto | 7 |
| John Benjamins | 7 |
| Oxford Academic | 7 |
| Consejo Superior de Investigaciones Científicas (CSIC) | 4 |
| Elsevier | 4 |
As for the countries of the authors of the retracted documents, the United States and China dominate the ranking, with 32 and 24 retracted publications, respectively. This is not unexpected, given the size of their academic populations. They are followed by Canada (20) and Turkey (14).
Interestingly, six publications are American-French co-authored articles produced by just two authors, Richard Lawrence and Etienne Barnett, all of which appeared in the journal Neohelicon in 2015. In fact, these two authors are the most prolific in our corpus: between them, they authored 26 of the 142 humanities records (almost 1 in 5). Their publications are classified in the categories Arts – Literature/Poetry (25) and Philosophy (1).
Thomas M. Rosica and Robert James Cardullo are two other prolific authors of retracted articles, with 22 and 19 publications, respectively. The first specializes in Religion, the second in Literature/Poetry, Film Studies, and Philosophy. Together with Lawrence and Barnett, their publications account for almost half of the total number of publications!
Retraction gossip aside, the most interesting data reported by Retraction Watch are probably the motivations for the retractions. Retraction Watch attributes to each retracted article one or more motivations drawn from a standard list. The following table shows the absolute and relative frequencies of the reasons for retraction in our corpus (a sketch of how such a multi-label tally can be computed follows the table). Note that since several motivations can be attributed to the same article, the percentages sum to more than 100%.
| Retraction motivation | Number of articles | Percentage of total |
|---|---|---|
| Plagiarism of Article | 54 | 38% |
| Plagiarism of Text | 23 | 16% |
| Euphemisms for Plagiarism | 23 | 16% |
| Breach of Policy by Author | 22 | 15% |
| Duplication of Article | 20 | 14% |
| Date of Retraction/Other Unknown | 17 | 12% |
| Euphemisms for Duplication | 15 | 11% |
| Withdrawal | 13 | 9% |
| Investigation by Journal/Publisher | 13 | 9% |
| Duplication of Text | 12 | 8% |
| Notice – Limited or No Information | 11 | 8% |
| Error by Journal/Publisher | 9 | 6% |
| Notice – Lack of | 9 | 6% |
| Investigation by Company/Institution | 6 | 4% |
| Error in Text | 3 | 2% |
| Copyright Claims | 2 | 1% |
| Error in Analyses | 2 | 1% |
| Retract and Replace | 2 | 1% |
| Legal Reasons/Legal Threats | 2 | 1% |
| Withdrawn to Publish in Different Journal | 2 | 1% |
| Lack of Approval from Company/Institution | 1 | 1% |
| Unreliable Results | 1 | 1% |
| Taken from Dissertation/Thesis | 1 | 1% |
| Concerns/Issues about Referencing/Attributions | 1 | 1% |
| Falsification/Fabrication of Data | 1 | 1% |
| Misconduct – Official Investigation/Finding | 1 | 1% |
| Error in Results and/or Conclusions | 1 | 1% |
| Concerns/Issues About Results | 1 | 1% |
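As a rough indication of how such a multi-label tally can be reproduced, here is a minimal sketch under the same assumptions as the previous ones; the Reason column and its formatting (semicolon-separated labels with a leading "+") are assumptions about the export, not documented facts. Because one article can carry several reasons, the percentages are computed over the 142 articles and therefore sum to more than 100%.

```python
# Tally the multi-label retraction reasons for the hypothetical "hum_only"
# data frame built in the earlier sketch.
import pandas as pd

reasons = (
    hum_only["Reason"]   # assumed format: "+Plagiarism of Article;+Duplication of Text"
    .fillna("")
    .str.split(";")
    .explode()
    .str.strip(" +")
)
reasons = reasons[reasons != ""]

counts = reasons.value_counts()
shares = (counts / len(hum_only) * 100).round(1)  # share of the 142 articles

print(pd.DataFrame({"articles": counts, "percent": shares}))
```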
What do these results say about the humanities?
As we observed above, the malpractices we find in retractions reflect the standard research practices of a field as well as its methodological, epistemological, and academic norms. In other words, from a list of malpractices we can infer, to a certain extent, the epistemological and academic values of disciplines, perhaps even their relative weights. In the case of the humanities, our results clearly show that plagiarism, duplication, and other forms of misappropriation of texts are the most common causes of retraction: 135 articles out of 142 (95%) report at least one such motivation. This result may seem trivial but, on closer reflection, it highlights two key aspects of humanistic research.
First, the results highlight that data, experiments, and materials other than concepts and texts do not play a central role in the humanities. Retractions due to the manipulation, falsification, or fabrication of results or to unreliable data are almost entirely absent from our corpus, and no case of statistical manipulation of data is recorded. The difference between humanistic research and the experimental sciences is thus reaffirmed by our analysis. We could further speculate that the lack of methodological malpractices confirms the absence of a shared methodological core in the humanities playing a role comparable to that of statistics in the experimental sciences. Conflicts of interest are lacking too, probably reflecting the scarcity of applied research in the humanities.
Second, and most importantly, the result underscores the centrality of writing and authorship in the humanities. As Hellqvist (2009) notes:
In the humanities the act of writing is seldom separated from research itself; it is truly a performative rather than a reporting act. (p. 314)
Unlike in the sciences, where a scientist can be credited with authorship even without contributing to the actual writing of the paper, in the humanities writing is an essential and ineliminable part of the research process. Given this intertwining of research and writing, it is very difficult to distinguish “publishing misconduct” from “research misconduct” in the humanities, as can be done in the sciences (see, e.g., Grieneisen & Zhang, 2012).
Taking into account the centrality of writing and the peculiar role of the author in the humanities, we can thus better appreciate why almost all the motivations for retractions in our corpus have to do with the infringement of the norms that govern the attribution of texts to authors. The ethical management of written products – an important component of which is originality – lies at the core of the humanistic research ethos.
If authorship and authorial identity are so central to the humanistic research ethos, what will happen to this ethos if co-authorship becomes as widespread in the humanities as it is in the sciences, perhaps as a response to the increasing pressure of research evaluation systems centered on productivity? Will the dilution of the centrality of the author affect the normative structure of the humanities, leading, for instance, to a more lenient assessment of less severe cases of authorship infringement, such as self-plagiarism? The analysis of retractions may be useful for monitoring and better understanding these normative changes.
The Retraction Watch database can be a useful entry point to these questions. However, the low number of humanities articles in the database, as well as their concentration among relatively few authors, invites some caution. Retraction Watch data may not be representative of humanistic production, or may be biased towards certain areas. Moreover, the list of retraction motivations used by the site is probably tailored to scientific research and may miss nuances specific to retractions in the humanities. Lastly, it is not entirely clear how Retraction Watch collects its data.
Nonetheless, we hope to have shown how the analysis of retractions may shed new light on research ethos and practices in the humanities.
Acknowledgments
I am grateful to Mariano Senese for the preliminary analyses of the database he did in his MA thesis. Thanks also to Emiliano Tolusso and Simona Azzan for their useful feedback on an earlier draft of this post.
Cited references
Biagioli, M., Kenney, M., Martin, B. R., & Walsh, J. P. (2019). Academic misconduct, misrepresentation and gaming: A reassessment. Research Policy, 48(2), 401–413. https://doi.org/10.1016/j.respol.2018.10.025
Edwards, M. A., & Roy, S. (2017). Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science, 34(1), 51–61. https://doi.org/10.1089/ees.2016.0223
Fanelli, D. (2013). Why Growing Retractions Are (Mostly) a Good Sign. PLoS Medicine, 10(12), e1001563. https://doi.org/10.1371/journal.pmed.1001563
Fang, F. C., Steen, R. G., & Casadevall, A. (2012). Misconduct accounts for the majority of retracted scientific publications. Proceedings of the National Academy of Sciences, 109(42), 17028–17033. https://doi.org/10.1073/pnas.1212247109
Grieneisen, M. L., & Zhang, M. (2012). A Comprehensive Survey of Retracted Articles from the Scholarly Literature. PLoS ONE, 7(10), e44118. https://doi.org/10.1371/journal.pone.0044118
Halevi, G. (2020). Why Articles in Arts and Humanities Are Being Retracted? Publishing Research Quarterly, 36(1), 55–62. https://doi.org/10.1007/s12109-019-09699-9
Hellqvist, B. (2009). Referencing in the humanities and its implications for citation analysis. Journal of the American Society for Information Science and Technology. https://doi.org/10.1002/asi.21256
Steen, R. G., Casadevall, A., & Fang, F. C. (2013). Why Has the Number of Scientific Retractions Increased? PLoS ONE, 8(7), e68397. https://doi.org/10.1371/journal.pone.0068397
Wray, K. B., & Andersen, L. E. (2018). Retractions in Science. Scientometrics, 117(3), 2009–2019. https://doi.org/10.1007/s11192-018-2922-4