The papers in question investigate causes of and rehabilitative tools for autism and ADHD.
Photograph by Richard Drury

Work of autism researcher questioned again

Four studies, including the results of a clinical trial, were flagged on PubPeer over a range of criticisms.

A neuroscientist whose work in autism and attention-deficit/hyperactivity disorder has previously come under fire is once again under scrutiny.

In October, Dorothy Bishop, emeritus professor of developmental neuropsychology at the University of Oxford, posted comments on the online discussion forum PubPeer that criticize four papers by Gerry Leisman and his colleagues, pointing out flawed methodology, missing data in a supplementary information section and problems with the details of a clinical trial.

The papers, published between 2013 and 2023, investigate causes of and rehabilitative tools for autism and ADHD. Leisman is a research fellow at the University of Haifa in Israel who was also cited for scientific misconduct decades ago for misrepresenting his academic credentials, professional experience and awarded patents in a grant application.

Of Leisman, the corresponding author on these four papers, and his co-investigators, Bishop says: “They do what a lot of these people like them do, which is to try and impress you with this huge amount of neurosciency-sounding, fancy methods, like qEEG.”

Bishop says she began looking at the work after she was tipped off by a researcher who was unhappy with the scientific content at the 2022 Movement: Brain, Body, Cognition conference in Paris, France, which was organized by Leisman. (That researcher declined to speak with The Transmitter, other than to confirm tipping off Bishop.)

Among the four papers authored by Leisman that Bishop flagged on PubPeer, the most recent is a 2023 study published in Brain Sciences titled “The relationship between retained primitive reflexes and hemispheric connectivity in autism spectrum disorders.” (Leisman is a section editor-in-chief for that journal, though not the section in which this paper was published.) Bishop wrote on PubPeer that the paper presents quantitative EEG (qEEG) data in the text of the article, but those data are not found in the associated dataset. Leisman has since made the qEEG data available by request on ResearchGate.

Bishop also critiqued on PubPeer a 2020 study published in Frontiers in Public Health on which Leisman is the corresponding author. She wrote that the study should not draw definitive conclusions about a “synchronized metronome training” method the researchers used with the aim of improving cognitive, sensorimotor and academic performance in people with ADHD, because it had only a pre- and post-test analysis and no control group. Bishop also questioned whether participants in the study had paid to access the synchronized metronome method being tested, based on a conflict-of-interest statement at the end of the paper.

Bishop flagged a third paper by Leisman and his colleagues, published in 2013, for containing mischaracterized references — for example, attributing information about ADHD to a book that does not mention that topic — and she noted that it “overlaps substantially” with earlier publications by the same researchers.

Finally, Bishop raised questions on PubPeer about a clinical trial discussed in a 2018 book chapter written by Leisman and his colleagues, pointing out that the data associated with the trial were not publicly available. The trial, “An evaluation of low level laser light therapy for autistic disorder,” involved aiming a laser (made by the company Erchonia) at the back of the head — described in the study plan as the “base of the brain and temporal areas” — for five minutes at a time, over eight sessions spread across four weeks. The trial included a self-reported survey of participants’ caregivers and claimed a statistically significant improvement in irritability in autistic children aged 5 to 17.

Leisman has since posted the data related to that clinical trial on ResearchGate. André Gillibert, a biostatistician at Rouen University Hospital in France, and Florian Naudet, a psychiatrist at the University of Rennes 1 in France, then questioned the data on PubPeer, noting that 15 of the 19 controls were “perfectly stable,” meaning their irritability scores showed no change at any time point across the study. That kind of uniformity “is not expected given that such subjective scales always have some variance,” they wrote. “Even the perfect stability of one patient on all subscales would be hardly believable.”

One of Leisman’s co-investigators on the 2018 paper is Calixto Machado, a neurologist at the Institute of Neurology and Neurosurgery in Havana, Cuba, who led the laser trial. And Robert Melillo, a New York-based chiropractor, was a co-investigator on the 2020 and 2023 papers. He is also co-founder of the Brain Balance Achievement Centers, which sell programs to “strengthen and build brain connectivity” in children with ADHD, autism and other conditions through cognitive exercises, coaching and customized nutrition plans.

Investigations into Brain Balance by NPR and NBC News in 2018 and 2019, respectively, pointed out that the sessions generally cost families thousands of dollars and are not covered by insurance. The studies supporting the Brain Balance Program also have “serious scientific shortcomings,” NPR said. The NPR investigation pointed out that experts questioned the quality of the methods of the 2013 study, which is promoted on the Brain Balance website.

Neither Machado nor Melillo responded to a request for an interview from The Transmitter. But Leisman confirmed in an interview that the participants in the 2020 study in question came from a de-identified database of Brain Balance clients. He defended the paper’s methodology, saying that post-hoc analyses of databases are often done in the epidemiology field, for instance.

Leisman also addressed some of Bishop’s concerns on PubPeer, including by fully describing the institute at the center of the trial. After The Transmitter contacted him about this article, Leisman again posted on PubPeer, touching on the trial design and on questions about the control group. He also wrote about the statistical methods used in the paper, saying that Haifa’s Department of Statistics found them valid and that “our analytic methodology is appropriate and correct as is.” He told The Transmitter that it was a double-blind study, and “those were the numbers we got.”

Regarding the irritability scale used in that study, Leisman acknowledges that self-reported questionnaires “are not particularly reliable in the first place,” but he told The Transmitter that this was the study protocol put forth by the company providing the laser. The Erchonia HLS laser was eventually submitted to the U.S. Food and Drug Administration for 510(k) market clearance to treat autism traits, but in a 2019 interview, the company’s president indicated that the application had been rejected.

Leisman says the critiques of the clinical trial, specifically, “came to me quite by surprise, because that paper has been up for a long time.” He also told The Transmitter that he considered his 1994 misconduct finding “ancient history.” The questions around the clinical trial work have bothered him, he says, because “we were so, so, so careful with doing this.”

“Many of the treatment components used at Brain Balance lack empirical evidence, thus essentially meeting the definition of pseudoscientific or unproven fad treatments,” according to a 2021 review of Brain Balance’s methods for the nonprofit Association for Science in Autism Treatment.

Particularly in the autism field, parents and caregivers are looking for answers, says Thomas Zane, co-author of that review and director of online programs in behavior analysis at the University of Kansas in Lawrence. “All they want is something that works. They just want to help their children. And so if a study comes out that says if you wear specialized tinted glasses and that fixes the problem, or if you give mega doses of B12, that fixes the problem, they’re going to jump on that,” he says. “We should all have standards for what constitutes rigorous evidence, so that what’s published really does meet standards of something that will really help.”