Wednesday, July 04, 2012

Using Statistics to Detect Scientific Fraud

This early preview of an article forthcoming in the journal Science has the details:

The most startling thing about the latest scandal to hit social psychology isn't the alleged violation of scientific ethics itself, scientists say, or the fact that it happened in the Netherlands, the home of fallen research star and serial fraudster Diederik Stapel, whose case shook the field to its core less than a year ago. Instead, what fascinates them most is how the new case, which led to the resignation of psychologist Dirk Smeesters of Erasmus University Rotterdam and the requested retraction of two of his papers by his school, came to light: through an unpublished statistical method to detect data fraud.

The technique was developed by Uri Simonsohn, a social psychologist at the Wharton School of the University of Pennsylvania, who tells Science that he has also notified a U.S. university of a psychology paper his method flagged.

That paper's main author, too, has been investigated and has resigned, he says. As Science went to press, Simonsohn said he planned to reveal details about his method, and both cases, as early as this week.
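The article doesn't describe the method itself, but the general idea behind this kind of fraud test is to ask whether reported summary statistics are too consistent to have come from real samples. The sketch below is my own hypothetical illustration of that logic, not Simonsohn's actual procedure: it simulates how often the means of several independent conditions would land as close together as the reported means, given the reported standard deviation and sample size. All numbers in the example are invented.

```python
import math
import random
import statistics

def too_similar_p(means, sd, n, sims=20000, seed=0):
    """Estimate the probability that k independent condition means,
    each based on n observations with standard deviation sd and a
    common true mean, would spread as little as the reported means.
    A very small value suggests the means are implausibly similar."""
    rng = random.Random(seed)
    grand = statistics.mean(means)         # generous null: one shared true mean
    observed_spread = statistics.stdev(means)
    se = sd / math.sqrt(n)                 # sampling SD of each condition mean
    hits = 0
    for _ in range(sims):
        simulated = [rng.gauss(grand, se) for _ in means]
        if statistics.stdev(simulated) <= observed_spread:
            hits += 1
    return hits / sims

# Hypothetical reported results: five conditions, n = 15 each, SD about 1.2.
suspicious = [4.98, 5.01, 5.00, 4.99, 5.02]   # almost no spread between means
ordinary   = [4.40, 5.30, 4.90, 5.60, 4.70]   # spread typical of real samples

print(too_similar_p(suspicious, sd=1.2, n=15))  # near zero: a red flag
print(too_similar_p(ordinary,   sd=1.2, n=15))  # unremarkable
```

A real investigation would need the raw data and careful handling of multiple comparisons across many papers; the sketch only conveys why excessive similarity can be statistical evidence of fabrication.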

If it proves valid, Simonsohn's technique might find other possible cases of misconduct lurking in the vast body of scientific literature. "There's a lot of interest in this," says Brian Nosek of the University of Virginia in Charlottesville, who recently launched an examination of replicability in social psychology findings (Science, 30 March, p. 1558).

The method may help the field of psychological science clean up its act and restore its credibility, he adds--but it may also turn colleagues into adversaries and destroy careers. The field will need ample debate on how to use it, Nosek says, much the way physicists had to grapple with the advent of nuclear physics. "This is psychology's atomic bomb," he says.

Simonsohn already created a stir last year with a paper in Psychological Science showing that it's "unacceptably easy" to prove almost anything using common ways to massage data and suggesting that a large proportion of papers in the field may be false positives. He first contacted Smeesters on 29 August 2011 about a paper on the psychological effects of color, published earlier that year. The two corresponded for months, and Smeesters sent Simonsohn the underlying data file on 30 November. Smeesters also informed a university official about the exchange. Simonsohn says he was then contacted by the university.
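The "unacceptably easy" claim can be made concrete with a small simulation. The sketch below is my own illustration, not code from the Psychological Science paper: it runs experiments in which no true effect exists, and lets the analyst peek at the data once, collecting more subjects if the first test misses significance. Even this single degree of freedom pushes the false-positive rate well above the nominal 5%.

```python
import math
import random
import statistics

def approx_p(a, b):
    """Two-sided p-value for a difference in means (Welch t statistic
    with a normal approximation, to keep the sketch dependency-free)."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    t = (statistics.mean(a) - statistics.mean(b)) / se
    return math.erfc(abs(t) / math.sqrt(2))

rng = random.Random(42)
SIMS, N0, EXTRA = 2000, 20, 10
false_positives = 0
for _ in range(SIMS):
    # Both groups come from the same distribution: any "effect" is noise.
    a = [rng.gauss(0, 1) for _ in range(N0)]
    b = [rng.gauss(0, 1) for _ in range(N0)]
    if approx_p(a, b) < 0.05:
        false_positives += 1
        continue
    # Not significant? Collect 10 more subjects per group and test again.
    a += [rng.gauss(0, 1) for _ in range(EXTRA)]
    b += [rng.gauss(0, 1) for _ in range(EXTRA)]
    if approx_p(a, b) < 0.05:
        false_positives += 1

rate = false_positives / SIMS
print(f"false-positive rate with one peek: {rate:.3f}")
```

Add a few more such choices (dropping outliers, trying several outcome measures, testing subgroups) and the rate climbs further, which is the paper's point.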
