The Color Coded Tiers of Open Access

Open Access, or OA, in scientific publishing is bringing long-overdue attention to the question of availability. University libraries spend millions of dollars per year on subscriptions, sometimes under the influence of coercive package deals that push libraries to subscribe to a bundle of journals rather than pick and choose the most relevant. Tim Gowers, a fellow at Trinity College, Cambridge, reports that UK university libraries pay anywhere from £234,126 (Exeter) to £1,381,380 (University College London) in subscription costs to Elsevier alone. Excessive price increases for journal subscriptions have prompted substantial actions by some universities, including a cancellation of Elsevier subscriptions by Harvard, MIT’s refusal of a 3-year renewal commitment with Wiley and Elsevier, and selective cancellation of Elsevier journals by Cornell, to name a few.

The debate over the efficacy of the scientific publishing status quo is alive and well. By most counts the rate of retractions has increased, although it is not clear whether more retractions are caused by more misconduct or by better vigilance. eLife editor and Nobel Laureate Randy Schekman, among others, suspects that the pressure to publish in superstar journals and the over-reliance on impact factor lead to misplaced incentives and reward showy failures. For example, the infamous “arsenic life” paper has amassed 287 citations as of this writing, as indexed by Google Scholar, and as a result is unlikely to be retracted by Science; those 287 references could buoy an additional 8 articles, each with little to no citations, and still maintain Science’s impact factor of ~32.
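The back-of-the-envelope arithmetic behind that claim can be checked directly. A quick sketch, using the rough figures quoted above (287 citations, impact factor ~32) rather than exact journal statistics:

```python
citations = 287      # citations to the "arsenic life" paper (Google Scholar, per the text)
impact_factor = 32   # approximate impact factor of Science

# One heavily cited paper plus n uncited papers averages
# citations / (n + 1) citations per paper; with n = 8 extra
# uncited articles, the average still rounds to ~32.
extra_uncited = 8
average = citations / (extra_uncited + 1)
print(round(average, 1))  # 31.9
```

In other words, a single citation magnet can carry the impact factor for a handful of papers that are never cited at all.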

So maybe you’ve become a bit frustrated with paywalls and the relative attention (and citation) monopoly enjoyed by top-brand journals. Perhaps you are tired of your library paying exorbitant fees for bundled subscriptions. In any case, you’re considering pursuing open access for some of your work. It may be as simple as hosting PDFs of your articles yourself, but the options are diverse, as are the costs. OA is typically differentiated into two major types, designated by color: gold and green.

Green OA refers to self-hosting of copies by a person, lab, or university. These can be archived and made available as pre-prints, post-prints, or in the final, formatted version published by the journal. The latter method can be contentious with some publishers (see the recent spat with Nature over Duke University’s open access mandate). SHERPA/RoMEO further differentiates the green-OA friendliness of journals along a range of colors, according to what each journal’s or publisher’s copyright transfer agreement allows:

  • green: pre-print, post-print, and publisher’s version
  • blue: post-print and publisher’s version
  • yellow: pre-print and publisher’s version
  • white: not designated/not allowed

Gold OA is driven by the journal or publisher, rather than the author or university. These are the journals typically thought of as open access, and they usually, but not always, charge a hefty fee to authors. Journals under the PLOS umbrella belong to this category, and big-name publishers have been dipping their toes into gold open access as well.

A hybrid approach to publishing is becoming widespread. This is often implemented by making OA available as an option for a few thousand dollars charged to the author, as in the policies of the Journal of Visualized Experiments or Optical Society publications. Other journals make the headline article of an issue freely available, often in advance of print publication, to draw interest. Many journals have explicit policies that permit green OA after a designated grace period; for example, Science’s policy allows free access to articles 12 months after initial publication.

OA has a role to play in the changing landscape of scientific publishing, but there are still plenty of variations to be tried, and OA is no silver bullet for all that ails publication, funding, and promotion in science careers. Web resources such as figshare expand the role of data and figures, while online lab notebooks like OpenWetWare increase transparency. F1000 Research is experimenting with citeable, viewable, open peer review. OA won’t stop the occasional “arsenic life” paper from stealing headlines, but it is certain to shape the future of access.

Additional OA resources:

The University of California Berkeley Library maintains an index of publishers with gold open access options and their associated publishing fees.

Duke University OA mandates versus Nature Publishing Group:
Duke Libraries take by Kevin Smith, JD: https://blogs.library.duke.edu/scholcomm/2014/03/27/attacking-academic-values/
Nature Publishing Group’s take by Grace Baynes: http://blogs.nature.com/ofschemesandmemes/2014/03/28/clarifying-npgs-views-on-moral-rights-and-institutional-open-access-mandates

SHERPA/RoMEO. Provides shades of green to denote publisher’s OA archiving policies: http://www.sherpa.ac.uk/romeoinfo.html

Directory of Open Access Journals: http://doaj.org/

University of Colorado Denver Librarian Jeffrey Beall’s site: http://scholarlyoa.com/
Beall’s blog includes his list of potentially predatory publishers (http://scholarlyoa.com/publishers/), potentially predatory journals (http://scholarlyoa.com/individual-journals/), and the newer list of exploitative metric indexes (http://scholarlyoa.com/other-pages/misleading-metrics/). These are essential resources, particularly useful when conventional publishers conflate known exploitative publishers with OA as a whole.

Open Access Death Knell. Or Is It?

I told you publication was a fiat currency

Last week, Science published a critical investigation into the state of peer review in open access journals. John Bohannon, the author, generated a faux paper describing a set of fictional experiments testing the effects of secondary metabolites from lichen on cultured cancer cells. Versions of this paper were sent to a selection of open access journals sampled from the Directory of Open Access Journals (DOAJ) and Jeffrey Beall’s infamous list. The lichen species, secondary metabolite, and cancer cell line were varied randomly to generate iterations of the paper with slight differences, but the shoddy results were the same in each. In a few of the iterations I examined, the chemical structures didn’t even match the secondary metabolite described. The researchers didn’t exist (their names were random combinations from a database of first and last names, with random middle initials), and the institutions they worked at were fictional. A “dose-dependent” growth-inhibition effect was reported in an ethanol buffer (with no EtOH in the controls), spanning five orders of magnitude in dose, with growth rates whose 95% confidence intervals all overlap.

Of 304 papers submitted to various open access journals, 157 were accepted, many of them without any real review taking place; 98 were rejected, and 49 were still up in the air at press time. The article seems to make a solid case against the relatively nascent open access model of publishing, and that is certainly the tone of the article and the associated press coverage. However, if I assume that the average reader of Science is scientifically literate, then I would expect most readers to remain unconvinced that open access is dead and dangerous.
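The headline numbers are worth a quick sanity check. Note that the “completed reviews” denominator below is my own derivation from the figures quoted, not a number stated in the article:

```python
accepted, rejected, pending = 157, 98, 49
total = accepted + rejected + pending      # 304 submissions in all
decided = accepted + rejected              # 255 journals reached a decision

print(total)                               # 304
print(round(accepted / total * 100, 1))    # 51.6% of all submissions accepted
print(round(accepted / decided * 100, 1))  # 61.6% of completed reviews accepted
```

Which denominator you pick already shifts the headline acceptance rate by ten percentage points.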

In “Who’s Afraid of Peer Review?”, Bohannon combines language from both scientific and journalistic writing styles, taking advantage of the credibility implied by describing sample selection and procedural decisions in a semi-scientific manner, as well as the journalist’s license to make general claims with a strength that would be questionable in a scientific article.

Examples:

    And the acceptances and rejections of the paper provide the first global snapshot of peer review across the open-access scientific enterprise.

137 of the journals chosen for this investigation were pulled from a blacklist maintained by Jeffrey Beall at the University of Colorado Denver. In places (such as the general description of results) the overlap between Beall’s list and the journals selected from the DOAJ is not clear. In the original sample, 16 of these journals appear in both the DOAJ and Beall’s list, but it is difficult to tell whether they made it into the final analysis, because 49 of the 304 journals selected for submission were thrown out for “appearing derelict” or for failing to complete the article review by press time.

    For the publishers on his [Beall’s] list that completed the review process, 82% accepted the paper. Of course that also means that almost one in five on his list did the right thing—at least with my submission. A bigger surprise is that for DOAJ publishers that completed the review process, 45% accepted the bogus paper.

This is somewhat misleading, as it implies that the 45% and 82% results are mutually exclusive. I could not tell just from reading the paper what proportion of the 16 journals found in both Beall’s list and the DOAJ made it into the final analysis. Furthermore, I know this is misleading based on how Jeffrey Beall, who is quite close to the subject, interpreted it: “Unfortunately, for journals on DOAJ but not on my list, the study found that 45% of them accepted the bogus paper, a poor indicator for scholarly open-access publishing overall.”
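The double-counting problem is easy to see with toy sets. A minimal sketch with invented journal names (nothing below corresponds to real journals or to the study’s actual data):

```python
# Hypothetical illustration: a journal listed in both the DOAJ and
# Beall's list is counted toward both groups' acceptance rates, so
# the two percentages are not mutually exclusive.
doaj = {"J1", "J2", "J3", "J4"}
beall = {"J3", "J4", "J5", "J6"}
accepted = {"J3", "J4", "J5"}           # journals that accepted the fake paper

overlap = doaj & beall
doaj_rate = len(accepted & doaj) / len(doaj)     # 2/4 = 0.5
beall_rate = len(accepted & beall) / len(beall)  # 3/4 = 0.75

print(sorted(overlap))        # ['J3', 'J4'] count toward both rates
print(doaj_rate, beall_rate)  # 0.5 0.75
```

Unless the overlapping journals are reported separately, the same acceptances can inflate both the DOAJ figure and the Beall-list figure at once.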

    Acceptance was the norm, not the exception.

157 of 304 journals (51.64%) accepted the paper. While this is a majority, I would hardly call acceptance the typical result when the split is so nearly even, especially given that 137 of the 304 journals had already been blacklisted. Discrediting open access in general on the basis of these results is not a fair conclusion.

Overall, the article just misses making a strong critical statement about the state of scientific publication, instead focusing only on problems with predatory publishing in open access. By ignoring traditional journals, we are left without a comparison to inform what may be quite necessary reform in scientific publishing. Bohannon’s article is likely to be seen and read by a large number of people in both science and scientific publishing. Editors can be expected to be on alert for the sort of fake paper used by Bohannon and Science, making any comparison to traditional publishing models just about impossible for now. Finally, the overall effect is to damn innovation in publishing, particularly open access models, and it is not surprising that the sting article was published by the “king-of-the-hill” of traditional scientific journals. It is possible that the backlash against open access and publishing innovation in general will actually impede necessary progress in scientific publishing.

As long as an academic career is judged blindly on marketing metrics such as publication frequency and researchers continue to accept excessive publication fees, there will remain an incentive for grey market “paper-mills” to gather up unpublishable papers for profit. Overall, the open access model has thus far incorporated too much from traditional publishing and not enough from the open source movement.

Science magazine warns you that open access is too open; I say that open access is not open enough.

Text in block quotes is from “Who’s Afraid of Peer Review?” by John Bohannon, Science, Oct. 4, 2013.

Original version of image here

EDIT: link to John Bohannon’s article