The editors of a volume of conference proceedings have retracted a paper on machine-learning methods to predict autism, because of concerns about the authors’ use of unusual phrases, such as “counterfeit neural organization.”
The move follows reporting by Spectrum last year that highlighted work identifying such “tortured phrases” — odd ways of rewording established terms, often to avoid plagiarism detection — in half a dozen autism studies.
It is unclear why the paper was included in the first place in the conference proceedings, titled Technology Enabled Ergonomic Design, says Guillaume Cabanac, professor of computer science at the University of Toulouse in France. “This could have been a red flag for the editors” at the start, he says.
Cabanac highlighted the paper’s potential problems in a March 2022 post on PubPeer, an online forum for discussing published studies. The paper contains “several tortured phrases that make some passages hard to parse,” he noted, including the terms “arbitrary timberland” and “irregular woodland” instead of the established term “random forest.”
Even though the article had not been cited in other research, its publication undermines the scientific process, Cabanac says. “This is damaging science. This is damaging what society expects from science.”
The 9 April retraction notice does not cite Cabanac’s PubPeer post, but it does state that “concerns were raised” about the paper’s phrases and that the study authors “did not provide a reasonable explanation” for their use. (The study authors and journal editor did not respond to Spectrum’s request for comment.)
Cabanac says he first came across the paper via the Problematic Paper Screener, which he created to identify published work that contains tortured phrases or other issues, such as nonsensical grammar or improper citations. Cabanac and his “science detective” colleagues then sift through the flagged papers. Of the more than 12,000 papers flagged as potentially problematic since the work began in 2021, 644 have since been retracted, according to the screener.
One awkward phrase may be a coincidence, or a reflection of someone’s difficulty with the English language, Cabanac says. But the recently retracted study contains 25, by his latest count. Certain terms tend to be particularly telling, he says, such as “blunder rate” in place of “error rate”; he has also frequently seen other studies write about a nucleic or amino “corrosive” rather than “acid.”
“If an author relied on third-party software to write his paper, or her paper, quicker, what can we think about their respect of scientific protocol, of scientific ethics?” he says. “And if I find tortured phrases like ‘corrosive’ in a paper, I don’t believe that I should trust the scientists.”
Cite this article: https://doi.org/10.53053/OXMJ1455