Through the strange eyes of a cuttlefish

A classic teaching example in black and white film photography courses is the tomato on a bed of leaves. Without a color filter, the resulting image is low-contrast and visually uninteresting. The tomato is likely to look unnaturally dark and lifeless next to similarly dark leaves; although in a color photograph the colors make for a stark contrast, the intensity values of the red of the tomato fruit and the green of the leaves are nearly the same. A red or green filter can attenuate the intensity of one of the colors, making it possible for an eager photographer to undertake the glamorous pursuit of fine-art salad photography.


The always clever cephalopods (smart enough to merit honorary vertebrate status in UK scientific research) somehow manage to pull off a similar trick without the use of a photographer’s color filters. Marine biologists have been flummoxed for years by the ability of squid, cuttlefish, and octopuses* to effect exact color camouflage in complex environments, and by their impressive use of color patterning in hunting and inter-species communication. The paradox is that their eyes (the cephalopods’, not the marine biologists’) contain only a single type of photoreceptor, rather than the two or more different color photoreceptors of humans and other color-sensitive animals.

Berkeley/Harvard duo Stubbs & Son have put forth a plausible explanation for the age-old paradox of color camouflage in color-blind cephalopods. They posit that cephalopods use chromatic aberration and a unique pupil shape to distinguish colors. With a wide, w-shaped pupil, cephalopods potentially retain much of the color blurring of different wavelengths of light. Chromatic aberration is nothing more than color-dependent defocus, and by focusing through the different colors it is theoretically possible for the many-limbed head-foots to use their aberrated eyes as an effective spectrophotometer, using a different eye length to sharply focus each color. A cuttlefish may distinguish tomato and lettuce in a very different way than a black and white film camera or human eyes.


A cuttlefish’s take on salad

A cuttlefish might focus each wavelength sequentially to discern color. In the example above, each image represents preferential focus for red, green, and blue, from top to bottom. By comparing each image to every other image, the cephalopod could learn to distinguish the colorful expressions of its friends, foes, and environment. Just as our own visual system automatically filters and categorizes objects in a field of view before we know it, much of this perception likely occurs at the level of “pre-processing,” before the animal is acutely aware of how it is seeing.


How a cuttlefish might see itself


A view of the reef.

A typical night out through the eyes of a cuttlefish might look something like this:

There are distinct advantages to this type of vision in specialized contexts. Using only one type of photoreceptor, light sensitivity is increased compared to the same eye with multiple types of photoreceptors (ever notice how human color acuity falls off at night?). Mixed colors would look distinctly different, and individual pure wavelengths could potentially be distinguished more accurately. In human vision we can’t tell the difference between an individual wavelength and a mix of colors that happens to excite our color photoreceptors in the same proportions as the pure color, but a cuttlefish might be able to resolve these differences.
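That last point, metamerism, is easy to demonstrate numerically. The sketch below is a toy model, not cuttlefish physiology: two made-up Gaussian "cone" sensitivities stand in for a two-receptor eye, and a mixture of two monochromatic primaries is solved for so that it excites both cones exactly as a single pure wavelength does. A receptor-based eye cannot tell the two spectra apart, while an eye that samples wavelengths one at a time, as in the focus-scanning idea above, could.

```python
import numpy as np

# Toy cone sensitivities (Gaussian in wavelength, nm); parameters are made up.
wl = np.arange(400, 701)  # wavelength grid, 400-700 nm

def cone(center, width=50.0):
    return np.exp(-((wl - center) ** 2) / (2 * width ** 2))

sensitivity = np.stack([cone(560.0), cone(530.0)])  # 2 cones x wavelengths

def response(spectrum):
    return sensitivity @ spectrum  # excitation of each cone type

# A pure 580 nm light...
pure = np.zeros(wl.size)
pure[wl == 580] = 1.0

# ...and a mixture of 550 nm and 610 nm primaries, matched to it by solving
# a 2x2 linear system (one equation per cone).
primaries = np.zeros((2, wl.size))
primaries[0, wl == 550] = 1.0
primaries[1, wl == 610] = 1.0
weights = np.linalg.solve(sensitivity @ primaries.T, response(pure))
mixture = weights @ primaries

# Identical cone responses (a metamer pair), yet different spectra: a
# wavelength-by-wavelength scan would distinguish them.
assert np.allclose(response(mixture), response(pure))
assert not np.allclose(mixture, pure)
```

The two asserts are the whole point: a receptor-summing observer sees the same two numbers for both lights, while the spectra themselves differ at every wavelength involved.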

On the other hand, the odd w-shaped pupil of cephalopods retains more imaging aberrations than a circular pupil (check out the dependence of aberrations on the pupil radius in the corresponding Zernike polynomials to understand why). As a result, cephalopods would have slightly worse vision in some conditions than humans with the same eye size. Mainly, those conditions consist of living on land. Human eyes are not well-suited to the higher refractive index of water as compared to air. We would also probably need to incorporate some sort of lens hood (e.g. something like a brimmed hat) to deal with the strong gradient of light formed by light absorption in the water, another function of the w-shaped cephalopod pupil.
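The pupil-radius dependence mentioned above can be made concrete: in the usual aberration expansions, the wavefront error contributed by each term grows as a power of the aperture radius, roughly r² for defocus and r⁴ for primary spherical aberration, so widening the pupil inflates the higher-order terms fastest. A minimal sketch with made-up coefficients:

```python
# Wavefront error at the pupil edge as a function of aperture radius r
# (arbitrary units). The coefficients c_defocus and c_spherical are
# illustrative values, not measured optical data.
def edge_wavefront_error(r, c_defocus=0.5, c_spherical=0.1):
    # defocus scales as r^2, primary spherical aberration as r^4
    return c_defocus * r**2 + c_spherical * r**4

# Doubling the aperture multiplies the defocus term by 4 and the
# spherical term by 16.
small = edge_wavefront_error(1.0)
large = edge_wavefront_error(2.0)
```

This is why a wide pupil (of any shape) pays a steep price in retained aberrations relative to a stopped-down one.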

Studying the sensory lives of other organisms provides insight into how they might think, illuminating our own vision and nature of thought by contrast. We may still be a long way off from understanding how it feels to instantly change the color and texture of one’s skin, but humans have just opened a small aperture into the minds of cuttlefish to increase our understanding of the nature of thought and experience.

How I did it
Every image is formed by smearing light from a scene according to the point spread function (PSF) of the imaging system. This is a consequence of the wave nature of light and the origin of the diffraction limit. In Fourier optics, the PSF is the squared magnitude of the Fourier transform of the pupil function. To generate the PSF, I thresholded and dilated this image of a common cuttlefish eye (public domain, from Wikipedia user FireFly5), then took the Fourier transform and squared the magnitude of the result. To generate the images and video mentioned above, I added differential defocus (using the Zernike polynomial for defocus) to each color channel and cycled through the resulting three monochromatic images. I used ImageJ and Octave for image processing.
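For readers who want to reproduce something similar without ImageJ or Octave, here is a minimal sketch in Python/NumPy. The w-shaped mask below is a synthetic stand-in for the thresholded photograph, and the per-channel defocus coefficients are arbitrary illustration values, not fitted to cuttlefish optics:

```python
import numpy as np

# Toy w-shaped pupil on a square grid (a stand-in for the thresholded
# cuttlefish-eye photograph described above).
n = 256
y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
rho = np.sqrt(x**2 + y**2)
pupil = (rho <= 1.0) & (np.abs(y - 0.3 * np.cos(3 * np.pi * x)) < 0.2)

def psf(pupil, rho, defocus):
    # Zernike defocus term Z_2^0 = sqrt(3) * (2*rho^2 - 1); the coefficient
    # plays the role of chromatic aberration, differing per color channel.
    phase = defocus * np.sqrt(3) * (2 * rho**2 - 1)
    field = pupil * np.exp(1j * phase)
    # PSF = squared magnitude of the Fourier transform of the pupil function.
    return np.abs(np.fft.fftshift(np.fft.fft2(field)))**2

# One PSF per color channel; each channel of a scene would then be
# convolved with its own PSF.
psfs = {color: psf(pupil, rho, d)
        for color, d in [("red", 0.0), ("green", 2.0), ("blue", 4.0)]}
```

By Parseval's theorem each PSF carries the same total energy, but the more defocused channels have lower, broader peaks, which is exactly the per-channel blur the animation cycles through.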

Sources for original images in order of appearance:

And Movie S2

*The plural of octopus has all the makings of another senseless ghif/gif/zhaif controversy. I have even heard one enthusiast insist on “octopodes.”

Bonus Content:


Primary color disks.

In particular, defocus pseudo-color vision would make for interesting perceptions of mixed wavelengths. Observe the color disks above (especially the edges) in trichromatic and defocus pseudo-color.



The aperture used to calculate chromatic defocus.

Bonus content original image sources:

Swimming cuttlefish in camouflage CC BY-SA Wikipedia user Konyali43, available at:

The aperture I used for computing chromatic defocus is a mask made from the same image as the top image for this post:

2017/05/03 – Fixed broken link to Stubbs & Stubbs PNAS paper:

The structure behind the simplicity of CRISPR/Cas9


The International Summit on Human Gene Editing took place in Washington, D.C. a few weeks ago, underlining the critical attention that continues to follow CRISPR/Cas9 and its applications to genome editing. Recently I compared published protocols for CRISPR/Cas9 and a competing technique based on Zn-finger nucleases. That comparison suggested that editing with CRISPR/Cas9 is somewhat simpler than using Zn-fingers, but it didn’t discuss the biomolecular mechanisms underlying the increased ease of use. Here I’ll illustrate the fundamental difference between genome editing with Cas9 and the alternatives in simple terms, using relevant protein structures from the Protein Data Bank.

Each of the techniques I’ll mention here has the same end goal: break double-stranded DNA in a specific location. Once a DNA strand undergoes this type of damage, a cell’s own repair mechanisms take over to put it back together. It is possible to introduce a replacement strand and encourage the cell to incorporate this DNA into the break instead of the original sequence.

The only fundamental difference among the main techniques used for genome editing is the way they are targeted. Cas9, Zn-finger, and Transcription Activator-Like (TAL) nucleases all aim to make a targeted break in DNA. Other challenges, such as getting the system into cells in the first place, are shared by all three systems.


Zinc Fingers (red) bound to target DNA (orange). A sufficient number of fingers like these could be combined with a nuclease to specifically cut a target DNA sequence.


Transcription Activator Like (TAL) region bound to target DNA. Combined with a nuclease, TAL regions can also effect a break in a specific DNA location.


Cas9 protein (grey) with guide RNA (gRNA, red) and target DNA sequence (orange). The guide RNA is the component of this machine that does the targeting. This makes the guide RNA the only part that needs to be designed to target a specific sequence in an organism. The same Cas9 protein, combined with different gRNA strands, can target different locations on a genome.

Targeting a DNA sequence with an RNA sequence is simple. RNA and DNA are both chains of nucleotides, and the rules for binding are the same as for reading out or copying DNA: A binds with T, U binds with A, C binds with G, and G binds with C [1]. Targeting a DNA sequence with protein motifs is much more complicated. Unlike nucleotide-nucleotide pairing, I can’t fully explain how these protein residues recognise their targets, let alone in a single sentence. This has consequences for the initial design of the gRNA as well as for the efficacy of the system and the overall success rate.

So the comparative ease-of-application stems from the differences in protein engineering vs. sequence design. Protein engineering is hard, but designing a gRNA sequence is easy.

How easy is it really?

Say that New Year’s Eve is coming up, and we want to replace an under-functioning acetaldehyde dehydrogenase [2] with a functional version. First we would need a ~20-nucleotide sequence from the target DNA, like this one from just upstream of the ALDH1B gene:


You can write out the base-pairings by hand or use an online calculator to determine the complementary RNA sequence:


To associate the guide RNA to the Cas9 nuclease, the targeting sequence has to be combined with a scaffold RNA which the protein recognises.

Scaffold RNA:

Target Complement:

Target complement + scaffold = guide RNA:

With that sequence we could target the Cas9 nuclease to the acetaldehyde dehydrogenase (ALDH1B) gene, inducing a break and leaving it open to replacement. The scaffold sequence above turns back on itself at the end, sinking into the proper pocket in Cas9, while the target complement sequence coordinates the DNA target, bringing it close to the cutting parts of Cas9. If we introduce a fully functional version of the acetaldehyde dehydrogenase gene at the same time, then we surely deserve a toast, as the target organism no longer suffers from an abnormal build-up of toxic acetaldehyde. Practical points remain in actually preparing the gRNA, making the Cas9 protein, and introducing the replacement sequence, but from an informatic design point of view that is, indeed, the gist.
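From the informatic point of view, both steps, computing the target complement and appending the scaffold, can be written out in a few lines. This is a minimal sketch, not a validated design tool: the pairing table follows the rules in footnote [1], and both sequences below are hypothetical placeholders rather than the real ALDH1B target or the real Cas9 scaffold quoted in the post.

```python
# DNA base -> complementary RNA base (A-T, U-A, C-G, G-C).
PAIRS = {"A": "U", "T": "A", "C": "G", "G": "C"}

def rna_complement(dna):
    """RNA strand complementary to a 5'->3' DNA strand, written 5'->3'.

    Nucleic acid strands pair antiparallel, so the result is reversed.
    """
    return "".join(PAIRS[base] for base in reversed(dna.upper()))

# Hypothetical 20-nt target -- NOT the ALDH1B sequence from the post.
target = "ATGGCCTTACGGATCCAGTT"
target_complement = rna_complement(target)

# Placeholder scaffold stub -- substitute the scaffold Cas9 recognises.
scaffold = "GUUUUAGAGCUAGAAAUAGCAAG"

# Target complement + scaffold = guide RNA:
guide_rna = target_complement + scaffold
```

For example, `rna_complement("ATGC")` returns `"GCAU"`. Conventions for strand orientation vary between tools, so check which strand and direction a given design pipeline expects before using anything like this.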

That’s the basics of targeting Cas9 in 1,063 words. I invite you to try to explain the intricacies of TAL effector nuclease protein engineering in fewer words.


[1] That’s C for cytosine, G for guanine, U for uracil, and A for adenine. In DNA, uracil is replaced by thymine (T).

[2] Acetaldehyde is an intermediate produced during alcohol metabolism, thought to be largely responsible for hangovers. A mutation in one or both copies of the gene can lead to the so-called “Asian Flush”.

Sources for structures:

I rendered all of the structures using PyMol. The data come from the following publications:

PDB structure: 3VEK (Zn-finger)

Wilkinson-White, L.E., Ripin, N., Jacques, D.A., Guss, J.M., Matthews, J.M. DNA recognition by GATA1 double finger. To be published.

PDB structure: 3ugm (TAL)

Mak, A.N., Bradley, P., Cernadas, R.A., Bogdanove, A.J., Stoddard, B.L. The Crystal Structure of TAL Effector PthXo1 Bound to Its DNA Target. (2012) Science 335: 716-719

PDB structure: 4oo8 (Cas9)
Nishimasu, H., Ran, F.A., Hsu, P.D., Konermann, S., Shehata, S.I., Dohmae, N., Ishitani, R., Zhang, F., Nureki, O. Crystal structure of Cas9 in complex with guide RNA and target DNA. (2014) Cell 156: 935-949

Comic cover original source:
“Amazing Stories Annual 1927” by Frank R. Paul – Scanned cover of pulp magazine. Licensed under Public Domain via Wikimedia Commons –

A Skeptic Over Coffee #1: Starter Kit


It takes effort and sustained vigilance to become an effective skeptic, with the penetrating mental focus to cut through the misleading. Honing one’s questioning acuity means hardening one’s mental defenses against charlatans, fraudsters, and the merely incompetent in all walks of life. With practice it’s possible to be the infamous “Reviewer Number 3” who gradually gets fewer and fewer invitations to provide peer review for “paradigm shifting” articles from editors of high-impact journals. It may seem like a grandiose dream, but you too can in fact be the colleague who corrects the university press office’s outlandish claims about their own paper, causing their tenure review to be shelved for another year (for failure to be interviewed on Science Friday). If this glamorous lifestyle of modest claims and bold negations sounds appealing, read on!

I invite you to join me every once in a while to practice skepticism in these short segments designed to provide about one coffee's worth of skeptical inquiry. My day job pushing things around with lasers both takes a lot of time and requires that I drink a tremendous amount of coffee, so the concise aSOC format should fit right in with my new lab-monkey lifestyle.

Here is your Beginning Skeptic’s reading list:

  • A seminal paper by John Ioannidis runs the numbers on an over-abundance of false positives in the scientific literature.
    John P.A. Ioannidis. Why Most Published Research Findings Are False. PLoS Medicine (2005). DOI: 10.1371/journal.pmed.0020124

  • Retraction Watch is an important resource for any skeptic. If someone consistently publishes retractable articles and no one notices, does anyone lose their scientist licence?
  • Jeffrey Beall runs blacklists of predatory publishers and journals taking advantage of pay-to-publish open access models at Scholarly Open Access. Also consider John Bohannon’s misleading report generalising predatory practices by OA publishers, and the ensuing criticism of his approach.
  • And remember your statistics:
    Why it Always Pays to Think Twice About Your Statistics
    An investigation of the false discovery rate and the misinterpretation of p-values

  • UPDATE: Recent, interesting consideration of widespread inflation of scientific results.
    Megan L. Head, Luke Holman, Rob Lanfear, Andrew T. Kahn, Michael D. Jennions.
    The Extent and Consequences of P-Hacking in Science.
    PLoS Biology (2015). DOI: 10.1371/journal.pbio.1002106
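The core of Ioannidis's argument fits in a few lines of arithmetic: if only a small fraction of tested hypotheses are true, false positives from the many false hypotheses can swamp the true positives. The prior, significance level, and power below are illustrative assumptions, not values measured for any particular field:

```python
def positive_predictive_value(prior, alpha=0.05, power=0.8):
    """Probability that a statistically 'significant' finding is true.

    prior: fraction of tested hypotheses that are actually true
    alpha: false-positive rate; power: true-positive rate.
    """
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

# If only 1 in 10 tested hypotheses is true, then even with decent power
# more than a third of "significant" findings are false.
ppv = positive_predictive_value(0.1)
```

With prior = 0.1, alpha = 0.05, and power = 0.8, the PPV comes out to exactly 0.64: 36% of positive results are false, before any bias or p-hacking is accounted for.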
Philaephilia?

Philaephilia n. Temporary obsession with a logistically important and risky stage of scientific endeavour and cometary rendezvous.

Don’t worry, the condition is entirely transient.

Rivalling the 7 minutes of terror of NASA’s Curiosity rover entering the Martian atmosphere, Philae’s descent onto comet 67P/Churyumov-Gerasimenko on Wednesday as part of the European Space Agency’s Rosetta mission had the world excited about space again.

Comets don’t have the classic appeal of planets like Mars. The high visibility of Mars missions and moon shots has roots in visions of a Mars covered in seasonal vegetation and full of sexy humans dressed in scraps of leather, and little else. But comets may be much better targets in terms of the scientific benefits. Comets are thought to have added water to early Earth, after the young sun had blasted the substance out to the far reaches of the solar system beyond the realm of the rocky planets. Of course, comets are also of interest for pure novelty: until Philae, humans had never gently put a machine down on a comet. Now the feat has been accomplished three times, albeit a bit awkwardly, with all science instruments surviving two slow bounces and an unplanned landing site. It is unfortunate that Philae is limited to only 1.5 hours of sunlight per 12-hour day, but there is some possibility that a last-minute attitude adjustment arranged the solar panels a bit more fortuitously.

So if Rosetta’s Philae lander bounced twice, rather than grappling the surface as intended, and landed in a wayward orientation where its solar panels are limited to only 12.5% of nominal sun exposure, how is the mission considered a success?

Most likely, the full significance of the data relayed from Philae via Rosetta will take several months of analysis to uncover. Perhaps some of the experiments will be wholly inconclusive and observational, neither confirming nor denying hypotheses about the characteristic structure of comets. For example, it seems unlikely that the MUPUS instrument (i.e. cosmic drill) managed to penetrate a meaningful distance into the comet, and we probably won’t gain much insight concerning the top layers of a comet beyond perhaps a centimetre or so. In contrast, CONSERT may yield unprecedented observations about the interior makeup of a comet.

In science, failures and negative findings are certainly more conclusive than, and arguably preferable to, so-called positive results, despite the selective pressure for the latter in science careers and the lay press. An exception disproves the rule, but a finding in agreement with theory merely “fails to negate” said theory. For example, we now know better than to use nitrocellulose as a vacuum propellant. Lesson learned on that front.

In addition to a something-divided-by-nothing fold increase in knowledge about the specific scenario of attempting a soft landing on a comet, I’d suggest we now know a bit more about the value of autonomy in expeditions where the communication lag between mission control and operations obviates real-time feedback. Perhaps if Philae had been optimised for adaptability, it would have been able to maintain orientation to the comet surface after detecting that the touchdown and grapple didn’t go through, and give Rosetta and scientists at home a better idea of its (final) resting place. Space science is necessarily cautious, but adaptive neural networks and other alternative avenues may prove useful in future missions.

I’ll eagerly await the aftermath, when the experimental and telemetry data have been further analysed. The kind of space mission where a landing sequence can omit a major step and still achieve operational success of all scientific instruments on board is the kind of mission that space agencies should focus on. The Rosetta/Philae mission combined key elements of novelty (first soft landing on and persistent orbiting of a comet), low cost (comparable to a few space shuttle missions), and robustness (the grapples didn’t fire, the lander bounced and got lost, science still occurred). Perhaps we’ll see continued ventures from international space agencies into novel, science-driven expeditions. Remember, the first scientist on the moon was on the (so far) final manned mission to Luna. Missions in the style of Rosetta may be more effective and valuable on all three of the above points, and are definitely more fundamental in terms of science achieved, than continuous returns to Mars and pushes for manned missions. In a perfect world where space agencies operate in a non-zero-sum funding situation along with all the other major challenges faced by human society, we would pursue them all. But realistically, Philae has shown that alternative missions not only offer more for us to learn in terms of science and engineering, but can also enrapture the population in a transcendent endeavour. Don’t stop following the clever madness of humans pursuing their fundamental nature of exploring the universe they live in.

Rubbish in, Garbage Out?

Extraordinary claims require extraordinary press releases?

You have probably read a headline in the past few weeks stating that NASA has verified that an infamous, seemingly reactionless propulsion drive does in fact produce force. You also might not have read the technical report that spurred the media frenzy (relative to the amount of press coverage normally allocated to space propulsion research, anyway), relying instead on the media reports and their contracted expert opinions. The twist is that it seems no one else has read it either, excepting perhaps the participants of the conference it was presented at, and this includes myself and likely the authors of almost any other material you find commenting on it. The reason is that the associated entry in the NASA Technical Reports Server consists only of an abstract.

The current upswing of interest and associated speculation on the matter of this strange drive is eerily reminiscent of other recent \begin{sarcasm}groundbreaking discoveries\end{sarcasm}: FTL neutrinos measured by the OPERA experiment and the Arsenic Life bacterium from Mono Lake, California. Both were later refuted; some important people at OPERA ended up resigning, while the Arsenic Life paper continues to boost the impact factors of its authors and publisher as Science Magazine refuses to retract it (current citations, according to Google Scholar, number more than 300).

I would venture that the manner of disclosing the OPERA findings was more responsible than that of the Arsenic Life paper. Although both research teams made use of press releases to gain a broad audience for their findings (note this down in your lab notebook as “do not do” if you are a researcher), the OPERA findings were at the pre-publication stage and disclosed as an invitation to greater scrutiny of their instrumentation, while the Arsenic Life strategy was much less reserved. From the OPERA press release:

    The OPERA measurement is at odds with well-established laws of nature, though science frequently progresses by overthrowing the established paradigms. For this reason, many searches have been made for deviations from Einstein’s theory of relativity, so far not finding any such evidence. The strong constraints arising from these observations makes an interpretation of the OPERA measurement in terms of modification of Einstein’s theory unlikely, and give further strong reason to seek new independent measurements.

Notice the description of the search for exceptions to Einstein’s relativity as “. . . so far not finding any such evidence . . .” That, despite the fact that the data being reported would do exactly that if anomalous instrumentation could be ruled out. This was a plea for help, not a claim of triumph.

On the contrary, the press seminar associated with the release of Felisa Wolfe-Simon et al.’s A bacterium that can grow by using arsenic instead of phosphorus issued no such caveats with its claims. Likewise, it was readily apparent from the methods section of the paper that the Arsenic Life team made no strong efforts to refute their own data (the principal aim of experimentation), and the review process at Science should probably have been more rigorous than standard practice. It is perhaps repeated too often without consideration, but I’ll mention the late, great Carl Sagan’s assertion that “extraordinary claims require extraordinary evidence.” The OPERA team kept this in mind, while the Arsenic Life paper showed a strong preference for sweeping under the carpet any due diligence in considering alternative explanations. Ultimately, the OPERA results were explained as an instrumentation error, and the Arsenic Life discovery has been refuted in several independent follow-up experiments (e.g. [1][2]).

Is propellant-less propulsion on par with Arsenic Life or FTL neutrinos in terms of communicating findings? In this case I would lean toward the latter: more of a search for instrumentation error than a claim of the discovery of Totally New Physics. The title of the tech report, “Anomalous Thrust Production from an RF Test Device Measured on a Low-Thrust Torsion Pendulum,” denotes the minimum requisite dose of skepticism.

Background reading below, but by far the best take on the subject is xkcd number 1404. The alt-text: “I don’t understand the things you do, and you may therefore represent an interaction with the quantum vacuum virtual plasma.”

23/08/2014 several typos corrected
[UPDATE Full version of tech report: via comments from . . . . . . . .

Is the future of scientific publishing in-house open access?

Photo from flickr user Tom Marxchivist, 1952 cover by Basil Wolverton, used under CC attribution license.

Those of you who frequent theScinder know that I am pretty passionate about how science is disseminated, and you have probably noticed that, like their brethren in newsprint and magazines before them, the big-name publishers don’t know exactly how to react to a changing future, and despite what traditional publishers would have you believe, they are not immune to publishing tripe.

Nature may be butting heads with Duke University over requesting waivers for the open access policy in place there. Apparently the waiver request isn’t even necessarily based on the practical implementation of Duke’s open access policy (Nature allows articles to be made freely available in their final version 6 months after publication), but it does raise the question: how much hassle will universities and their faculty put up with before they take matters into their own hands? As MIT’s Samuel Gershman points out, modern publishing doesn’t cost all that much. Even the fairly exorbitant fees charged to authors by the “gold standard” of open access publishers may be a transient relic of the conventional (turning archaic?) publishing business model. This provides an incentive for predatory publishing (as discussed in this article at The Scientist, and the basis for the Bohannon article published in Science last October). But if peer review and editing are largely volunteer labour, performed as an essential component of the role of a researcher and with the bill largely footed as a public expenditure, why keep paying enormous subscription fees for traditional publishing? If the trend catches on, as it almost certainly will, leading institutions will continue to adopt open access policies and libraries will see less and less reason to keep paying for outdated subscriptions.

Relevant links:

Scholarly Publishing: Where is Plan B?

California university system considers boycotting Nature Publishing Group

Samuel Gershman’s ideal publishing model, the Journal of Machine Learning Research

Open Access Death Knell. Or Is It?

I told you publication was a fiat currency

Last week, Science published a critical investigation into the state of peer review in open access journals. John Bohannon, the author, generated a faux paper describing a set of fictional experiments testing the effects of secondary metabolites from lichen on cultured cancer cells. These papers were sent to a selection of open access journals sampled from the Directory of Open Access Journals (DOAJ) and Jeffrey Beall’s infamous list. The lichen species, secondary metabolite, and cancer cell line were varied randomly to generate iterations of the paper with slight differences, but the shoddy results were the same in each. In a few of the iterations I examined, the structures didn’t even match the secondary metabolite described. The researchers didn’t exist, their names random combinations from a database of first and last names with random middle initials, and the institutions they worked at were fictional. A “dose-dependent” growth-inhibition effect in ethanol buffer (with no EtOH in controls) spans five orders of magnitude and shows growth rates all with overlapping 95% confidence intervals.

Of 304 papers submitted to various open access journals, 157 were accepted, many of them without any real review taking place. 98 were rejected, and 49 were still up in the air at press time. The article seems to make a solid case against the relatively nascent open access model of publishing, and that is certainly the tone of the article and the associated press coverage. However, if I assume that the average reader of Science is scientifically literate, then I would expect most readers to remain unconvinced that open access is dead and dangerous.

In Who’s Afraid of Peer Review?, Bohannon combines language from both scientific and journalistic writing styles, taking advantage of the credibility implied by describing sample selection and procedural decisions in a semi-scientific manner, as well as the journalist’s ability to make general claims with a strength that would be questionable in a scientific article.


    And the acceptances and rejections of the paper provide the first global snapshot of peer review across the open-access scientific enterprise.

137 of the journals chosen for this investigation were pulled from a blacklist maintained by Jeffrey Beall at the University of Colorado Denver. In places (such as the general description of results) the overlap between Beall’s list and the journals selected from the DOAJ is not clear. In the original sample, 16 of these journals were in both the DOAJ and Beall’s list, but it is difficult to tell whether they made it into the final analysis, because 49 of the 304 journals selected for submission were thrown out for “appearing derelict” or failing to complete the article review by press time.

    For the publishers on his [Beall’s] list that completed the review process, 82% accepted the paper. Of course that also means that almost one in five on his list did the right thing—at least with my submission. A bigger surprise is that for DOAJ publishers that completed the review process, 45% accepted the bogus paper.

This is somewhat misleading, as it implies that the 45% and 82% results are exclusive of each other. I could not tell just from reading the paper what proportion of the 16 journals found in both Beall’s list and the DOAJ made it to the final analysis. Furthermore, I know this is misleading based on how Jeffrey Beall, who is quite close to the subject, interpreted it: “Unfortunately, for journals on DOAJ but not on my list, the study found that 45% of them accepted the bogus paper, a poor indicator for scholarly open-access publishing overall.”

    Acceptance was the norm, not the exception.

157/304 journals (51.64%) accepted the paper. While this is a majority, I would hardly qualify acceptance as a typical result when the split is so nearly even, especially when 137 of the 304 journals had already been blacklisted. Discrediting open access in general based on the results reported is not a fair conclusion.
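The headline rate also depends on which denominator is chosen, which is part of why the reporting reads as misleading. A quick check of the arithmetic, using only the counts quoted above:

```python
# Outcomes for the 304 fake papers as reported in the sting.
accepted, rejected, pending = 157, 98, 49
total = accepted + rejected + pending  # 304 submissions in all

# Overall rate counts still-pending papers against acceptance...
overall_rate = accepted / total                    # ~51.6%
# ...while restricting to completed reviews pushes the figure higher.
completed_rate = accepted / (accepted + rejected)  # ~61.6%
```

Neither number, on its own, says anything about open access journals that were never sampled, nor about how subscription journals would have fared on the same bait.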

Overall, the article just misses making a strong critical statement about the state of scientific publication, instead focusing only on problems with predatory publishing in open access. By ignoring traditional journals, we are left without a comparison to inform what may be quite necessary reform in scientific publishing. Bohannon’s article is likely to be seen and read by a large number of people in both science and scientific publishing. Editors can be expected to be on alert for the sort of fake paper used by Bohannon and Science, making any comparison to traditional publishing models just about impossible for now. Finally, the overall effect is to damn innovation in publishing, particularly open access models, and it is not surprising that the sting article was published by the “king-of-the-hill” of traditional scientific journals. It is possible that the backlash against open access and publishing innovation in general will actually impede necessary progress in scientific publishing.

As long as academic careers are judged blindly on marketing metrics such as publication frequency, and researchers continue to accept excessive publication fees, there will remain an incentive for grey-market “paper mills” to gather up unpublishable papers for profit. Overall, the open access model has thus far incorporated too much from traditional publishing and not enough from the open source movement.

Science magazine warns you that open access is too open; I say that open access is not too open enough.

Text in block quotes is from Who’s Afraid of Peer Review? by John Bohannon, Science, Oct. 4, 2013.

Original version of image here

EDIT: link to John Bohannon’s article