4.3 Academic integrity
USC defines ‘academic integrity’ as
…taking an ethical, honest and responsible approach to study and research
— From USC webpages
The opposite of academic integrity is academic misconduct. You are strongly encouraged to read the USC webpage about academic integrity and academic misconduct, which includes a discussion of the consequences of academic misconduct.
In the context of research design, academic integrity covers areas such as plagiarism, collusion and fraud.
The USC webpage defines collusion in this way:
In a research context, collusion means failing to acknowledge the contributions and ideas of others.
In the context of research, the USC webpage defines fraud in this way:
Definition 4.2 (Fraud) In a research context, fraud can occur by (for example):
- falsifying research data and findings;
- altering or fabricating information;
- forging a document.
The webpage then explains how to avoid fraud:
By acting ethically, honestly and responsibly in all your academic activities.
Example 4.1 (Fraud) Microbiologist Keka Sarkar had papers retracted due to fraud, including self-plagiarism and reusing figures that were claimed to be from different studies:
Two figures in the paper in Biotechnology Letters had been taken from [another paper of Sarkar’s]: Fig. 2c was identical to Fig. 3 in the J Nanobiotech paper and Fig. 4 (inset) was identical to Fig. 7 (left). No acknowledgement was given that these figures were identical. In addition, the two figures illustrated results from apparently different experiments […] Figure 2a in the Biotechnology Letters paper was also used without modification in another publication of this group…
4.3.3 Reproducible research
One way to ensure that the results of research are reliable and trustworthy is to ensure that research is reproducible: that someone else can repeat the study (including the analysis):
Reproducibility involves methods to ensure that independent scientists can reproduce published results by using the same procedures and data as the original investigators. It also requires that the primary investigators share their data and methodological details. These include, at a minimum, the original protocol, the dataset used for the analysis, and the computer code used to produce the results.
The means of ensuring that all components of research are reproducible are discipline dependent, and are beyond the scope of this book. Nonetheless, recognising the value of reproducibility matters; for example, it underscores the need to describe the protocol carefully. Different journals also have different expectations regarding reproducibility.
Many researchers strongly advise against using point-and-click interfaces (such as those found in Excel) for analysis, since the results are not reproducible:
… it is increasingly clear that traditional means of data management, such as storing and manipulating data using spreadsheets, are no longer adequate […], and new approaches are required for analyzing data.
Reproducibility is crucial in the analysis phase:
There are serious medical consequences to errors attributable to the effects of spreadsheet programs and software operated through a graphical user interface… Fundamentally, the issue is one of reproducibility. The opacity of graphical user interface–based statistical analysis and the importance of research transparency and reproducibility have been highlighted by scientific scandals that could have been avoided through a reproducible research paradigm…
Rather than spreadsheets, which have significant problems when used for analysis, analysis tools that enable reproducible research are recommended, such as R (R Core Team 2018), on which jamovi is based; because scripts record every step, all actions are documented:
We have all had the experience of having performed a laborious calculation in a spreadsheet program only to later be required to redo the analysis because of the availability of additional data, the discovery of an error, or because the analysis is part of a recurring report (e.g., monthly quality indicators). At that point we may have to return and begin the calculation all over, except we may not even remember what we did, or we may inadvertently perform the analysis in a slightly different way each time.
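The scenario in the quote above illustrates the point: when every step of an analysis lives in a script, redoing the analysis after new data arrive (or after an error is found) means simply rerunning the script, and the analysis is performed the same way every time. A minimal sketch of this idea, using hypothetical data and written in Python for illustration (the text itself recommends R):

```python
# A reproducible analysis: every step, from raw data to summary,
# is recorded in the script, so rerunning it repeats the whole
# analysis identically -- unlike manual spreadsheet manipulation.
import statistics

def analyse(weights_kg):
    """Summarise a list of weights (a hypothetical recurring report)."""
    return {
        "n": len(weights_kg),
        "mean": round(statistics.mean(weights_kg), 2),
        "sd": round(statistics.stdev(weights_kg), 2),
    }

# Hypothetical data for illustration only.
march = [68.1, 72.4, 65.0, 70.2]
print(analyse(march))

# When additional data become available, no step needs to be
# remembered or redone by hand: extend the data and rerun.
april = march + [69.3, 71.8]
print(analyse(april))
```

Because the script documents the full path from data to results, it also satisfies the sharing requirement quoted earlier: the code itself is the methodological record that other investigators can rerun.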
Example 4.2 (Unethical reporting and practice) A ‘Letter to the Editor’ of a paramedicine journal (George et al. 2015) questioned an article (Hosseini et al. 2015) in which researchers claimed to have randomly allocated participants into two groups. The Letter noted that the initial average weights of the participants in each group were significantly different. The Letter states:
It is extraordinarily unlikely that any variable would be that different between two groups if allocation was truly random. Even if it was truly random, the stated method of “the samples were randomly divided into two groups”… does not describe the “method used to generate the random allocation sequence” […] details specified by the Consolidated Standards of Reporting Trials (CONSORT)…