Spotted: Celebrating scrutiny; impact detractor
Highlighting retractions can keep researchers in check, and a top journal editor wants to rethink ‘impact.’
The world’s most populous country takes stock of autism, and guinea pigs ease anxiety in children with the disorder.
A sexist peer review sparks a Twitter firestorm, and business is booming for some firms that employ people with autism.
A biotech breakthrough sparks a high-stakes patent war, and two new films follow people with autism looking for love.
Imagine a world in which researchers reveal all their clinical trial data, allowing their peers to do their own analyses and confirm the findings. A new report by the Institute of Medicine outlines ways to make this scenario a reality.
The more researchers poke around, the more likely they are to find a significant effect — and the more likely that the effect they end up reporting is just a fluke. A new kind of journal article, the ‘registered report,’ may address this problem, says Jon Brock.
Researchers must use better measures to show that experimental and control groups are well matched, says Jon Brock.
The SHANK3 mouse model described in a 2011 Cell paper that was retracted 17 January is still worth studying, says Alan Packer.
Papers rejected by one journal and ultimately published by another are cited significantly more often than papers accepted by the first-choice journal, according to an analysis published 12 October in Science.