Friday Links

1. Should Science put up with sloppiness?

It is interesting that papers get retracted for ethical reasons even when their results are correct, but rarely just because they are wrong.

2. Fraud & Science

The proposal, submitted some years earlier to a funding agency on a different continent, was copied by one of the reviewers, a highly recognized scientist, and then submitted to the ERC. It was pure chance that the former applicant detected the fraud.

[…] However, the larger legal framework of the European Commission (EC) under which the ERC operates links “frauds” only to financial aspects. The ERC is then obliged to report any (accomplished or attempted) misbehavior to OLAF, the European antifraud police. Financial fraud, however, causes the least headaches. In the above case, the ERC was unable to take action against the mischievous applicant.

Contrast with how standard scientific practice becomes criminal when money is involved:

“If you applied this rule to scientists, a sizable proportion of them might be in jail today,” said Steven N. Goodman, a pediatrician and biostatistician at Stanford University who submitted a statement supporting Harkonen’s appeal.

3. Another weird science fraud case:

The authors of a paper published in July […] are not only unknown at the institution listed on the paper, but no trace of them as researchers can be found.

The paper […] is not the kind of prank that journals have encountered before, in which hoaxsters have submitted dummy papers to highlight weaknesses in the peer-review process. The paper’s reported findings […] are, in fact, true.

Bruce Spiegelman […] says that he has presented similar findings at about six research meetings, and is preparing to submit them to a journal. He suspects that the [paper by unknown authors] was intended as a spoiler of his own lab’s work.

4. More non-computational thinking, the stupid snob edition

David Salz’s thoroughly researched assault on USB’s sonic handicaps delivers a relaxed, well-defined, dynamically evocative, and rhythmically taut performance. The Silver Starlight projects strings without screechiness, which cannot be said of most USB cables. For those seeking a mid-priced USB cable with obviously high build-quality and performance, the Silver Starlight is a solid choice.

The Silver Starlight USB digital cable costs $275/m!

My point stands: it is easier to identify non-computational thinking than to define what computational thinking is.

/ht Carter T Schonwald (@cartazio) on twitter

Predatory Authors & Predatory Publishers

There are two types of predatory publishers:

1. The truly predatory. These are the guys who build a website with the same name as a real journal, copy its editorial board, and wait for scientists to make a mistake (this is journal hijacking). Or who set up conferences with a name similar to a real one, or with a fake organizing committee.

These are publishers preying on trusting authors. This is, probably, the minority of predatory publishers.

2. The low-quality (or perhaps no-quality) journals. This is what most of Beall’s list is actually about.

In this second case, it is the publishers and the authors preying on the scientific system, and often preying on taxpayers’ money! The authors are accomplices of predatory publishers, not their victims.

The people who submit to these journals generally know what they’re getting. Or at least they should know. [1] An author who submits some mediocre poems as a research article can hardly claim to have been deluded (ht Jeffrey Beall).

If you are at a third-rate institution with third-rate supervision and your work is third-rate, this is win-win-lose: you win because you get a publication (which you can tout to your funders), and the publisher wins because they collect the publishing fee. The taxpayer loses twice: once by paying the publishing fee, and again if this outlet becomes an excuse for you not to do the kind of good work that would get you into a real journal.

This is, naturally, a bigger problem at non-first world institutions [2]. The work is not as strong and the supervision is weaker, thus you can fool the bureaucracy by pointing to all of the shitty not-really-peer-reviewed papers you published (or the bureaucracy understands and helps you fool the politicians; it’s fools all the way up).

However, it can happen in the US as well. The story of Western Illinois University seems to be similar: the professors knew they weren’t really getting peer reviewed, the union knew, the departments knew… Everyone still went through the motions so that they could claim the faculty publish in peer-reviewed journals (the requirement being two peer-reviewed journal publications for tenure) [3]. The fact that it was a private institution does not change the basic point that it is the taxpayer paying: its students get taxpayer-subsidized loans and it may get some public research money as well (which goes to pay for these publishing fees). But of course, the students are getting ripped off too.

Most authors of “predatory journals” are not victims, they are predators themselves. The taxpayer and society are the victims.

[1] There is always the possibility of publishing in a barely-known journal that is starting out because you think it shows promise, and later realizing that it is actually a bad journal with little peer review. You can minimize this risk by looking at who the editors are, &c; but sometimes you might really make a mistake. That is not what we are discussing here, though. I have also heard people argue that 10-15 years ago, some people who were too trusting really were fooled by low-quality conference invitations which then required some payment. That was understandable when you got your first such invitation, but nowadays people really should know better (I, as a lowly postdoc, get plenty of invitations to shitty conferences; I can only imagine how many the real faculty get).
[2] I might get some pushback on this, but seriously: a country that had strong institutions and, on average, good work would be a first world country. That’s what being first world means! (Yes, there are exceptions; but there are more problems in non-first world countries).
[3] Read the second comment by Robert J. Hironimus-Wendt on this page. Even as he tries to defend the school, what becomes obvious is that the standards are shoddy.

How Long Does Plos One Take to Accept A Paper?

How long do papers take to review?

Too long.

No, seriously, how long? I did a little measurement.

I downloaded the 360 most recent papers from Plos One (as of Friday). They are all annotated with submission and acceptance dates, so it was easy to just compute the differences.
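If you want to reproduce this, something along the following lines works. This is a minimal sketch, not the actual code from the repository: it assumes the article XML files (in the JATS format PLOS uses) have already been downloaded into a hypothetical papers/ directory.

    # Sketch: pull the received/accepted dates out of locally downloaded
    # PLOS ONE article XML (JATS format) and compute the number of days
    # from submission to acceptance.
    from datetime import date
    from glob import glob
    import xml.etree.ElementTree as ET

    def history_date(root, date_type):
        # JATS articles record these under <history><date date-type="...">
        el = root.find('.//history/date[@date-type="%s"]' % date_type)
        return date(int(el.findtext('year')),
                    int(el.findtext('month')),
                    int(el.findtext('day')))

    days = []
    for fname in glob('papers/*.xml'):
        root = ET.parse(fname).getroot()
        received = history_date(root, 'received')
        accepted = history_date(root, 'accepted')
        days.append((accepted - received).days)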

The plot below is a histogram (one bin per day) in grey with a kernel density estimate as a solid line.

[Figure: histogram of acceptance times]
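For the record, the plot can be generated with something like the following (again just a sketch, using matplotlib and scipy’s gaussian_kde; days is the list computed above):

    # Histogram with one bin per day (grey) and a Gaussian kernel density
    # estimate drawn on top as a solid line.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.stats import gaussian_kde

    days = np.asarray(days)
    bins = np.arange(days.min(), days.max() + 2)  # one bin per day
    plt.hist(days, bins=bins, color='grey', density=True)

    kde = gaussian_kde(days)
    xs = np.linspace(days.min(), days.max(), 1000)
    plt.plot(xs, kde(xs), 'k-')

    plt.xlabel('Days from submission to acceptance')
    plt.ylabel('Density')
    plt.savefig('acceptance-times.png')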

§

The result is that it takes about 3 to 4 months to get a paper accepted, but with substantial variance.

§

Looking at the figure, I had to ask who the poor people were who published the paper that spent the longest in revision.

Alternative Sigma Factor Over-Expression Enables Heterologous Expression of a Type II Polyketide Biosynthetic Pathway in Escherichia coli by David Cole Stevens, Kyle R. Conway, Nelson Pearce, Luis Roberto Villegas-Peñaranda, Anthony G. Garza, and Christopher N. Boddy. DOI: 10.1371/journal.pone.0064858

Submitted on 29 March 2011 and accepted on 22 April 2013, this paper was 755 days in revision.
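(A quick check with Python’s datetime module confirms the count:)

    from datetime import date
    # days from submission (29 March 2011) to acceptance (22 April 2013)
    print((date(2013, 4, 22) - date(2011, 3, 29)).days)  # prints 755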

The fastest acceptance took only 19 days. However, this being Plos One, it is possible that the paper had been reviewed for another Plos journal, rejected with positive reviews on significance grounds, and had those reviews transferred to Plos One, where it was then accepted without a new round of peer review.

§

This is a gimmick. There is perhaps a paper to be written in which this is extended to see how research area, keywords, &c affect acceptance time. If I had more free time, I might write that paper.

The code for the above is available on github.

Update: Followup with all PLoS Journals.